Sample records for high throughput experimental

  1. High-Throughput Experimental Approach Capabilities | Materials Science |

    Science.gov Websites

    NREL High-Throughput Experimental Approach Capabilities: combinatorial sputtering chambers for chalcogenide (S, Se, Te) and oxysulfide materials; Combi-5 for nitride and oxynitride sputtering; and several non-combinatorial systems.

  2. High Throughput Experimental Materials Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakutayev, Andriy; Perkins, John; Schwarting, Marcus

    The mission of the High Throughput Experimental Materials Database (HTEM DB) is to enable discovery of new materials with useful properties by releasing large amounts of high-quality experimental data to the public. The HTEM DB contains information about materials obtained from high-throughput experiments at the National Renewable Energy Laboratory (NREL).
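
    The HTEM DB can be queried programmatically over its REST interface. The sketch below is illustrative only: the host, endpoint, and parameter names are assumptions, not details taken from this record; consult the HTEM DB documentation for the real API.

```python
from urllib.parse import urlencode, urlunsplit

def htem_query_url(elements, host="htem-api.nrel.gov",
                   endpoint="/api/sample_library"):
    """Assemble a query URL for samples containing a set of elements.

    NOTE: the host, endpoint, and 'element' parameter are hypothetical
    placeholders for whatever the actual HTEM DB API exposes.
    """
    query = urlencode({"element": ",".join(sorted(elements))})
    return urlunsplit(("https", host, endpoint, query, ""))

url = htem_query_url(["Zn", "Sn", "O"])
# The URL could then be fetched with, e.g., urllib.request.urlopen(url).
```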

  3. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  4. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE PAGES

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...

    2017-03-28

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  5. Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols

    DTIC Science & Technology

    2016-06-01

    US Army Research Laboratory report ARL-TR-7696 (June 2016): Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols, by Robert E Jensen, Daniel C DeSchepper, and David P Flanagan. Approved for public release; distribution unlimited.

  6. Experimental Design for Combinatorial and High Throughput Materials Development

    NASA Astrophysics Data System (ADS)

    Cawse, James N.

    2002-12-01

    In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include:

    - High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space
    - Combinatorial Mapping of Polymer Blends Phase Behavior
    - Split-Plot Designs
    - Artificial Neural Networks in Catalyst Development
    - The Monte Carlo Approach to Library Design and Redesign

    This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.
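
    The gradient (composition-spread) arrays mentioned above tile a library across a composition simplex. As a rough sketch of the idea, not an excerpt from the book, a ternary composition grid can be enumerated like this:

```python
from fractions import Fraction

def ternary_grid(steps):
    """All A-B-C compositions (a, b, c) on a simplex grid with the
    given number of steps per edge; each triple of exact fractions
    sums to 1 and maps to one sample in a combinatorial library."""
    points = []
    for i in range(steps + 1):
        for j in range(steps + 1 - i):
            k = steps - i - j
            points.append((Fraction(i, steps),
                           Fraction(j, steps),
                           Fraction(k, steps)))
    return points

grid = ternary_grid(10)
# An 11-point edge gives (11 * 12) / 2 = 66 distinct compositions.
assert len(grid) == 66 and all(sum(p) == 1 for p in grid)
```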

  7. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.

    PubMed

    Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan

    2016-04-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field.

  8. High throughput light absorber discovery, Part 1: An algorithm for automated tauc analysis

    DOE PAGES

    Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.

    2016-09-23

    High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high-throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high-throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe₂O₃, Cu₂V₂O₇, and BiVO₄. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
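
    The record does not spell the algorithm out, so the following is only a bare-bones sketch of Tauc analysis for direct-allowed transitions: form (αhν)² versus hν, fit the linear rise, and extrapolate to zero. The 20-80% region-selection rule here is a simplification, not the paper's expert-mimicking heuristic:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def tauc_gap(energies, alphas, exponent=2):
    """Estimate a band gap from photon energies (eV) and absorption
    coefficients: form the Tauc quantity (alpha*E)^exponent, fit the
    points lying between 20% and 80% of its maximum, and extrapolate
    the line to zero. exponent=2 corresponds to direct-allowed gaps."""
    ys = [(a * e) ** exponent for a, e in zip(alphas, energies)]
    top = max(ys)
    region = [(e, y) for e, y in zip(energies, ys)
              if 0.2 * top <= y <= 0.8 * top]
    m, b = fit_line([e for e, _ in region], [y for _, y in region])
    return -b / m  # energy where the fitted line crosses zero

# Synthetic direct-gap spectrum with Eg = 2.0 eV: (alpha*E)^2 = 5*(E - Eg)
E = [1.0 + 0.1 * i for i in range(21)]
alphas = [(5 * (e - 2.0)) ** 0.5 / e if e > 2.0 else 0.0 for e in E]
```

On this synthetic spectrum, `tauc_gap(E, alphas)` recovers 2.0 eV to within floating-point error; real spectra need the paper's more careful region selection.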

  9. Microscale High-Throughput Experimentation as an Enabling Technology in Drug Discovery: Application in the Discovery of (Piperidinyl)pyridinyl-1H-benzimidazole Diacylglycerol Acyltransferase 1 Inhibitors.

    PubMed

    Cernak, Tim; Gesmundo, Nathan J; Dykstra, Kevin; Yu, Yang; Wu, Zhicai; Shi, Zhi-Cai; Vachal, Petr; Sperbeck, Donald; He, Shuwen; Murphy, Beth Ann; Sonatore, Lisa; Williams, Steven; Madeira, Maria; Verras, Andreas; Reiter, Maud; Lee, Claire Heechoon; Cuff, James; Sherer, Edward C; Kuethe, Jeffrey; Goble, Stephen; Perrotto, Nicholas; Pinto, Shirly; Shen, Dong-Ming; Nargund, Ravi; Balkovec, James; DeVita, Robert J; Dreher, Spencer D

    2017-05-11

    Miniaturization and parallel processing play an important role in the evolution of many technologies. We demonstrate the application of miniaturized high-throughput experimentation methods to resolve synthetic chemistry challenges on the front lines of a lead optimization effort to develop diacylglycerol acyltransferase 1 (DGAT1) inhibitors. Reactions were performed on ∼1 mg scale using glass microvials, providing a miniaturized high-throughput experimentation capability that was used to study a challenging SNAr reaction. The availability of robust synthetic chemistry conditions discovered in these miniaturized investigations enabled the development of structure-activity relationships that ultimately led to the discovery of soluble, selective, and potent inhibitors of DGAT1.

  10. Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation

    NASA Astrophysics Data System (ADS)

    Potyrailo, Radislav A.; Mirsky, Vladimir M.

    New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical targets of sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is an important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex for their performance to be predicted quantitatively at the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate the required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.

  11. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering☆

    PubMed Central

    Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan

    2016-01-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875

  12. Biases in the Experimental Annotations of Protein Function and Their Effect on Our Understanding of Protein Function Space

    PubMed Central

    Schnoes, Alexandra M.; Ream, David C.; Thorman, Alexander W.; Babbitt, Patricia C.; Friedberg, Iddo

    2013-01-01

    The ongoing functional annotation of proteins relies upon the work of curators to capture experimental findings from the scientific literature and apply them to protein sequence and structure data. However, with the increasing use of high-throughput experimental assays, a small number of experimental studies dominate the functional protein annotations collected in databases. Here, we investigate just how prevalent the “few articles - many proteins” phenomenon is. We examine the experimentally validated annotation of proteins provided by several groups in the GO Consortium, and show that the distribution of proteins per published study is exponential, with 0.14% of articles providing the source of annotations for 25% of the proteins in the UniProt-GOA compilation. Since each of the dominant articles describes the use of an assay that can find only one function or a small group of functions, this leads to substantial biases in what we know about the function of many proteins. Mass spectrometry, microscopy, and RNAi dominate high-throughput experiments. Consequently, the functional information derived from these experiments mostly concerns the subcellular location of proteins and the participation of proteins in embryonic developmental pathways. For some organisms, the information provided by different studies overlaps substantially. We also show that the information provided by high-throughput experiments is less specific than that provided by low-throughput experiments. Given the experimental techniques available, certain biases in protein function annotation due to high-throughput experiments are unavoidable. Knowing that these biases exist and understanding their characteristics and extent is important for database curators, developers of function annotation programs, and anyone who uses protein function annotation data to plan experiments. PMID:23737737
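
    The "few articles - many proteins" skew reported above can be quantified directly from per-article annotation counts. A minimal sketch on synthetic counts (not the UniProt-GOA data):

```python
def article_share(counts, protein_fraction=0.25):
    """Smallest fraction of articles (taken largest-first) whose
    annotated proteins account for at least the given fraction of
    all protein annotations in the corpus."""
    ordered = sorted(counts, reverse=True)
    target = protein_fraction * sum(ordered)
    running = 0
    for i, c in enumerate(ordered, start=1):
        running += c
        if running >= target:
            return i / len(ordered)

# One dominant high-throughput study among five articles covers 25%
# of annotations by itself, so one article in five (0.2) suffices:
share = article_share([100, 1, 1, 1, 1])
```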

  13. HIGH-THROUGHPUT IDENTIFICATION OF CATALYTIC REDOX-ACTIVE CYSTEINE RESIDUES

    EPA Science Inventory

    Cysteine (Cys) residues often play critical roles in proteins; however, identification of their specific functions has been limited to case-by-case experimental approaches. We developed a procedure for high-throughput identification of catalytic redox-active Cys in proteins by se...

  14. 20180312 - Uncertainty and Variability in High-Throughput Toxicokinetics for Risk Prioritization (SOT)

    EPA Science Inventory

    Streamlined approaches that use in vitro experimental data to predict chemical toxicokinetics (TK) are increasingly being used to perform risk-based prioritization based upon dosimetric adjustment of high-throughput screening (HTS) data across thousands of chemicals. However, ass...

  15. Combinatorial and high-throughput screening of materials libraries: review of state of the art.

    PubMed

    Potyrailo, Radislav; Rajan, Krishna; Stoewe, Klaus; Takeuchi, Ichiro; Chisholm, Bret; Lam, Hubert

    2011-11-14

    Rational materials design based on prior knowledge is attractive because it promises to avoid time-consuming synthesis and testing of numerous materials candidates. However, as materials grow more complex, the scientific capacity for rational materials design becomes progressively limited. As a result of this complexity, combinatorial and high-throughput (CHT) experimentation in materials science has been recognized as a new scientific approach to generating new knowledge. This review demonstrates the broad applicability of CHT experimentation technologies in the discovery and optimization of new materials. We discuss general principles of CHT materials screening, followed by a detailed discussion of high-throughput materials characterization approaches, advances in data analysis/mining, and new materials developments facilitated by CHT experimentation. We critically analyze results of materials development in the areas most impacted by CHT approaches, such as catalysis, electronic and functional materials, polymer-based industrial coatings, sensing materials, and biomaterials.

  16. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    PubMed

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot life, and surface defects, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  17. The promise and challenge of high-throughput sequencing of the antibody repertoire

    PubMed Central

    Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R

    2014-01-01

    Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474

  18. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments.

    PubMed

    Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua

    2018-01-04

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (containing only a few hundred lncRNAs each) and narrow in focus (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, making it 2.9 times larger than the current largest database of experimentally validated lncRNAs. Seventy-four percent of its lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs.

  19. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    PubMed Central

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (containing only a few hundred lncRNAs each) and narrow in focus (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, making it 2.9 times larger than the current largest database of experimentally validated lncRNAs. Seventy-four percent of its lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  20. A high throughput transformation system allows the regeneration of marker-free plum plants (Prunus domestica L.)

    USDA-ARS?s Scientific Manuscript database

    A high-throughput transformation system previously developed in our laboratory was used for the regeneration of transgenic plum plants without the use of antibiotic selection. The system was first tested with two experimental constructs, pGA482GGi and pCAMBIAgfp94(35S), that contain selective marke...

  21. Mining high-throughput experimental data to link gene and function

    PubMed Central

    Blaby-Haas, Crysten E.; de Crécy-Lagard, Valérie

    2011-01-01

    Nearly 2200 genomes encoding some 6 million proteins have now been sequenced. Around 40% of these proteins are of unknown function even when function is loosely and minimally defined as “belonging to a superfamily”. In addition to in silico methods, the swelling stream of high-throughput experimental data can give valuable clues for linking these “unknowns” with precise biological roles. The goal is to develop integrative data-mining platforms that allow the scientific community at large to access and utilize this rich source of experimental knowledge. To this end, we review recent advances in generating whole-genome experimental datasets, where this data can be accessed, and how it can be used to drive prediction of gene function. PMID:21310501

  22. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    NASA Astrophysics Data System (ADS)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the High Efficiency Video Coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve system throughput and filtering speed. Experimental results show that the proposed architecture achieves up to 48% higher throughput in comparison with prior work. It reaches an operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of in-loop filtering for the 8K × 4K video format at 132 fps.
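
    The record leaves the architecture details to the paper; as background on what a SAO filter computes, the edge-offset classification defined by the HEVC standard (each sample is compared with its two neighbors along the filtering direction) can be sketched in software as:

```python
def sao_edge_category(a, c, b):
    """HEVC SAO edge-offset category for sample c with neighbors a, b
    along the chosen direction: 1 = local minimum, 2 and 3 = concave
    and convex edges, 4 = local maximum, 0 = none (no offset applied).
    Hardware evaluates this for every sample, which is why parallel
    architectures like the one described above matter."""
    if c < a and c < b:
        return 1
    if (c < a and c == b) or (c == a and c < b):
        return 2
    if (c > a and c == b) or (c == a and c > b):
        return 3
    if c > a and c > b:
        return 4
    return 0
```

For example, `sao_edge_category(5, 3, 5)` returns 1 (a local minimum between two higher neighbors), while a flat run `sao_edge_category(5, 5, 5)` returns 0.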

  23. Quality control methodology for high-throughput protein-protein interaction screening.

    PubMed

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, and its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basis of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments, and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherently imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interaction screens.
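
    One widely used quantification scheme of this kind scores a positive reference set (well-documented true interactions) and a random reference set (presumed non-interactions) in the same assay, then reads off sensitivity and false-positive rate. This is a minimal sketch of that idea, not the chapter's exact protocol:

```python
def assay_rates(positive_ref, random_ref):
    """Estimate assay sensitivity and false-positive rate.

    positive_ref / random_ref: lists of booleans, one per reference
    pair, True if the assay scored that pair as interacting."""
    sensitivity = sum(positive_ref) / len(positive_ref)
    false_positive_rate = sum(random_ref) / len(random_ref)
    return sensitivity, false_positive_rate

# Illustrative numbers: 4 of 10 true interactions detected, and
# 1 of 100 random pairs scoring positive.
sens, fpr = assay_rates([True] * 4 + [False] * 6,
                        [True] + [False] * 99)
```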

  24. High throughput optical lithography by scanning a massive array of bowtie aperture antennas at near-field

    PubMed Central

    Wen, X.; Datta, A.; Traverso, L. M.; Pan, L.; Xu, X.; Moon, E. E.

    2015-01-01

    Optical lithography, the enabling process for defining features, has been widely used in the semiconductor industry and many other nanotechnology applications. Advances in nanotechnology require the development of high-throughput optical lithography capabilities that overcome the optical diffraction limit and meet ever-decreasing device dimensions. We report our recent experimental advances in scaling up diffraction-unlimited optical lithography using the near-field nanolithography capabilities of bowtie apertures. A record number of near-field optical elements, an array of 1,024 bowtie antenna apertures, are simultaneously employed to generate a large number of patterns by carefully controlling their working distances over the entire array using an optical gap metrology system. Our experimental results reiterate the ability of massively parallel near-field devices to achieve high-throughput optical nanolithography, which is promising for many important nanotechnology applications such as computation, data storage, communication, and energy. PMID:26525906

  25. Experimental Study of an Advanced Concept of Moderate-resolution Holographic Spectrographs

    NASA Astrophysics Data System (ADS)

    Muslimov, Eduard; Valyavin, Gennady; Fabrika, Sergei; Musaev, Faig; Galazutdinov, Gazinur; Pavlycheva, Nadezhda; Emelianov, Eduard

    2018-07-01

    We present the results of an experimental study of an advanced moderate-resolution spectrograph based on a cascade of narrow-band holographic gratings. The main goal of the project is to achieve a moderately high spectral resolution with R up to 5000 simultaneously in the 4300–6800 Å visible spectral range on a single standard CCD, together with an increased throughput. The experimental study consisted of (1) resolution and image quality tests performed using the solar spectrum, and (2) a total throughput test performed for a number of wavelengths using a calibrated lab monochromator. The measured spectral resolving power reaches values over R > 4000 while the experimental throughput is as high as 55%, which agrees well with the modeling results. Comparing the obtained characteristics of the spectrograph under consideration with the best existing spectrographs, we conclude that the used concept can be considered as a very competitive and cheap alternative to the existing spectrographs of the given class. We propose several astrophysical applications for the instrument and discuss the prospect of creating its full-scale version.

  26. Carbohydrate Microarray Technology Applied to High-Throughput Mapping of Plant Cell Wall Glycans Using Comprehensive Microarray Polymer Profiling (CoMPP).

    PubMed

    Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho

    2017-01-01

    Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.

  7. Transition-metal-free catalysts for the sustainable epoxidation of alkenes: from discovery to optimisation by means of high throughput experimentation.

    PubMed

    Lueangchaichaweng, Warunee; Geukens, Inge; Peeters, Annelies; Jarry, Benjamin; Launay, Franck; Bonardet, Jean-Luc; Jacobs, Pierre A; Pescarmona, Paolo P

    2012-02-01

    Transition-metal-free oxides were studied as heterogeneous catalysts for the sustainable epoxidation of alkenes with aqueous H₂O₂ by means of high throughput experimentation (HTE) techniques. A full-factorial HTE approach was applied in the various stages of the development of the catalysts: the synthesis of the materials, their screening as heterogeneous catalysts in liquid-phase epoxidation and the optimisation of the reaction conditions. Initially, the chemical composition of transition-metal-free oxides was screened, leading to the discovery of gallium oxide as a novel, active and selective epoxidation catalyst. On the basis of these results, the research line was continued with the study of structured porous aluminosilicates, gallosilicates and silica-gallia composites. In general, the gallium-based materials showed the best catalytic performances. This family of materials represents a promising class of heterogeneous catalysts for the sustainable epoxidation of alkenes and offers a valid alternative to the transition-metal heterogeneous catalysts commonly used in epoxidation. High throughput experimentation played an important role in promoting the development of these catalytic systems.
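A full-factorial HTE campaign enumerates every combination of the chosen factor levels. The sketch below illustrates the design-generation step; the factor names and levels are assumptions for illustration, not conditions from the paper:

```python
from itertools import product

# Hypothetical screening factors; every combination is one experiment.
factors = {
    "catalyst":      ["Ga2O3", "gallosilicate", "silica-gallia composite"],
    "temperature_C": [60, 80],
    "H2O2_equiv":    [1.0, 2.0],
}

# full-factorial design: the Cartesian product of all factor levels
experiments = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(experiments))  # 3 * 2 * 2 = 12 runs
```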

  8. Mining high-throughput experimental data to link gene and function.

    PubMed

    Blaby-Haas, Crysten E; de Crécy-Lagard, Valérie

    2011-04-01

    Nearly 2200 genomes that encode around 6 million proteins have now been sequenced. Around 40% of these proteins are of unknown function, even when function is loosely and minimally defined as 'belonging to a superfamily'. In addition to in silico methods, the swelling stream of high-throughput experimental data can give valuable clues for linking these unknowns with precise biological roles. The goal is to develop integrative data-mining platforms that allow the scientific community at large to access and utilize this rich source of experimental knowledge. To this end, we review recent advances in generating whole-genome experimental datasets, where this data can be accessed, and how it can be used to drive prediction of gene function. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. High Resolution Melting (HRM) for High-Throughput Genotyping-Limitations and Caveats in Practical Case Studies.

    PubMed

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik

    2017-11-03

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.
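A common first step in reference-curve-based HRM analysis is normalizing each raw melt curve between its pre-melt and post-melt fluorescence plateaus so curves from different wells can be compared. A minimal sketch; the plateau temperature windows are illustrative assumptions:

```python
import numpy as np

def normalize_melt(temps, fluor, pre=(75.0, 77.0), post=(88.0, 90.0)):
    """Scale fluorescence so the pre-melt plateau maps to 100 and the
    post-melt plateau maps to 0 (percent of initial fluorescence)."""
    temps = np.asarray(temps, dtype=float)
    fluor = np.asarray(fluor, dtype=float)
    f_pre = fluor[(temps >= pre[0]) & (temps <= pre[1])].mean()
    f_post = fluor[(temps >= post[0]) & (temps <= post[1])].mean()
    return 100.0 * (fluor - f_post) / (f_pre - f_post)
```

Genotypes are then typically called by comparing each normalized curve, or its difference plot against a reference genotype curve, across wells.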

  11. Advances in high throughput DNA sequence data compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz

    2016-06-01

    Advances in high throughput sequencing technologies and reduction in cost of sequencing have led to exponential growth in high throughput DNA sequence data. This growth has posed challenges such as storage, retrieval, and transmission of sequencing data. Data compression is used to cope with these challenges. Various methods have been developed to compress genomic and sequencing data. In this article, we present a comprehensive review of compression methods for genomes and sequencing reads. Algorithms are categorized as referential or reference-free. Experimental results and comparative analysis of various methods for data compression are presented. Finally, key challenges and research directions in DNA sequence data compression are highlighted.
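Referential methods exploit a reference genome: a read aligned to the reference can be stored as a position plus its mismatches rather than as raw bases. A toy sketch of the core idea (real referential compressors are far more sophisticated):

```python
# Toy referential compression: store each read as
# (position, length, mismatches) against a reference sequence.
def encode_read(reference: str, read: str, pos: int):
    mismatches = [(i, base) for i, base in enumerate(read)
                  if reference[pos + i] != base]
    return (pos, len(read), mismatches)

def decode_read(reference: str, encoded):
    pos, length, mismatches = encoded
    bases = list(reference[pos:pos + length])
    for i, base in mismatches:
        bases[i] = base
    return "".join(bases)

ref = "ACGTACGTACGTACGT"
read = "ACGTACGAAC"  # one substitution relative to the reference
enc = encode_read(ref, read, 0)
assert decode_read(ref, enc) == read  # lossless round trip
```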

  12. LOCATE: a mouse protein subcellular localization database

    PubMed Central

    Fink, J. Lynn; Aturaliya, Rajith N.; Davis, Melissa J.; Zhang, Fasheng; Hanson, Kelly; Teasdale, Melvena S.; Kai, Chikatoshi; Kawai, Jun; Carninci, Piero; Hayashizaki, Yoshihide; Teasdale, Rohan D.

    2006-01-01

    We present here LOCATE, a curated, web-accessible database that houses data describing the membrane organization and subcellular localization of proteins from the FANTOM3 Isoform Protein Sequence set. Membrane organization is predicted by the high-throughput, computational pipeline MemO. The subcellular locations of selected proteins from this set were determined by a high-throughput, immunofluorescence-based assay and by manually reviewing >1700 peer-reviewed publications. LOCATE represents the first effort to catalogue the experimentally verified subcellular location and membrane organization of mammalian proteins using a high-throughput approach and provides localization data for ∼40% of the mouse proteome. It is available at . PMID:16381849

  13. Achieving High Throughput for Data Transfer over ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.
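Keeping the communications pipe full means keeping roughly one bandwidth-delay product of data in flight at all times. A back-of-the-envelope sketch with illustrative numbers (not figures from the testbed):

```python
def bdp_bytes(bandwidth_bps: float, rtt_s: float) -> float:
    """Bandwidth-delay product: bytes that must be in flight to
    saturate a link of the given bandwidth and round-trip time."""
    return bandwidth_bps * rtt_s / 8.0

# e.g. an OC-3-class 155 Mbit/s ATM link with a 10 ms round-trip time:
print(f"{bdp_bytes(155e6, 0.010):.0f} bytes in flight to fill the pipe")
```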

  14. High-throughput ab-initio dilute solute diffusion database.

    PubMed

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-19

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
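To put the 0.176 eV barrier error in context: diffusivity depends exponentially on the activation barrier through the Arrhenius form D = D0·exp(−Q/kBT), so a barrier error becomes a multiplicative error in D. A quick sketch of that arithmetic (illustrative, not data from the database):

```python
import math

KB_EV_PER_K = 8.617333262e-5  # Boltzmann constant in eV/K

def diffusivity(d0: float, q_ev: float, temp_k: float) -> float:
    """Arrhenius diffusivity D = D0 * exp(-Q / (kB * T))."""
    return d0 * math.exp(-q_ev / (KB_EV_PER_K * temp_k))

# a 0.176 eV barrier error changes D by a factor exp(0.176 / (kB * T)):
factor = math.exp(0.176 / (KB_EV_PER_K * 1000.0))
print(f"~{factor:.1f}x error in D at 1000 K")
```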

  15. Outside-In Systems Pharmacology Combines Innovative Computational Methods With High-Throughput Whole Vertebrate Studies.

    PubMed

    Schulthess, Pascal; van Wijk, Rob C; Krekels, Elke H J; Yates, James W T; Spaink, Herman P; van der Graaf, Piet H

    2018-04-25

    To advance the systems approach in pharmacology, experimental models and computational methods need to be integrated from early drug discovery onward. Here, we propose outside-in model development, a model identification technique to understand and predict the dynamics of a system without requiring prior biological and/or pharmacological knowledge. The advanced data required could be obtained by whole vertebrate, high-throughput, low-resource dose-exposure-effect experimentation with the zebrafish larva. Combinations of these innovative techniques could improve early drug discovery. © 2018 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  16. Industrializing electrophysiology: HT automated patch clamp on SyncroPatch® 96 using instant frozen cells.

    PubMed

    Polonchuk, Liudmila

    2014-01-01

    Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered profiling of large compound series in early drug development. Fortunately, automation has revolutionized experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using planar patch-clamp technology offered rather moderate throughput, several second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is SyncroPatch® 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch® 96 substantially increases throughput without compromising data quality. This chapter describes the features of this innovative automated electrophysiology system and the protocols used to successfully transfer an established hERG assay to the high-throughput automated platform.

  17. Perspective: Composition–structure–property mapping in high-throughput experiments: Turning data into knowledge

    DOE PAGES

    Hattrick-Simpers, Jason R.; Gregoire, John M.; Kusne, A. Gilad

    2016-05-26

    With their ability to rapidly elucidate composition-structure-property relationships, high-throughput experimental studies have revolutionized how materials are discovered, optimized, and commercialized. It is now possible to synthesize and characterize high-throughput libraries that systematically address thousands of individual cuts of fabrication parameter space. An unresolved issue remains transforming structural characterization data into phase mappings. This difficulty is related to the complex information present in diffraction and spectroscopic data and its variation with composition and processing. Here, we review the field of automated phase diagram attribution and discuss the impact that emerging computational approaches will have in the generation of phase diagrams and beyond.

  18. Combinatorial and high-throughput approaches in polymer science

    NASA Astrophysics Data System (ADS)

    Zhang, Huiqi; Hoogenboom, Richard; Meier, Michael A. R.; Schubert, Ulrich S.

    2005-01-01

    Combinatorial and high-throughput approaches have become topics of great interest in the last decade due to their potential ability to significantly increase research productivity. Recent years have witnessed a rapid extension of these approaches in many areas of the discovery of new materials including pharmaceuticals, inorganic materials, catalysts and polymers. This paper mainly highlights our progress in polymer research by using an automated parallel synthesizer, microwave synthesizer and ink-jet printer. The equipment and methodologies in our experiments, the high-throughput experimentation of different polymerizations (such as atom transfer radical polymerization, cationic ring-opening polymerization and emulsion polymerization) and the automated matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) sample preparation are described.

  19. High throughput light absorber discovery, Part 2: Establishing structure–band gap energy relationships

    DOE PAGES

    Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan; ...

    2016-09-23

    Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
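Tauc analysis estimates a direct band gap by extrapolating the linear region of (αhν)² versus photon energy down to zero. A minimal sketch on synthetic data; the fitting window is hand-picked here, whereas the paper's pipeline selects it automatically:

```python
import numpy as np

# Fit a line to the linear region of (alpha * h * nu)^2 vs. photon
# energy; the x-intercept of that line estimates the direct band gap.
def tauc_direct_gap(energy_ev, alpha, fit_window):
    e = np.asarray(energy_ev, dtype=float)
    y = (np.asarray(alpha, dtype=float) * e) ** 2
    mask = (e >= fit_window[0]) & (e <= fit_window[1])
    slope, intercept = np.polyfit(e[mask], y[mask], 1)
    return -intercept / slope  # x-intercept in eV

# synthetic direct-gap absorption spectrum with Eg = 2.7 eV:
e = np.linspace(2.0, 3.5, 200)
alpha = np.sqrt(np.clip(e - 2.7, 0.0, None)) / e
print(round(tauc_direct_gap(e, alpha, (2.8, 3.4)), 2))  # → 2.7
```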

  2. High-Throughput Screening of Therapeutic Neural Stimulation Targets: Toward Principles of Preventing and Treating Post-Traumatic Stress Disorder

    DTIC Science & Technology

    2009-09-01


  3. NREL Opens Large Database of Inorganic Thin-Film Materials | News | NREL

    Science.gov Websites

    April 3, 2018. An extensive experimental database of inorganic thin-film materials developed by the National Renewable Energy Laboratory (NREL) is now publicly available: the High Throughput Experimental Materials (HTEM) database.

  5. From cancer genomes to cancer models: bridging the gaps

    PubMed Central

    Baudot, Anaïs; Real, Francisco X.; Izarzugaza, José M. G.; Valencia, Alfonso

    2009-01-01

    Cancer genome projects are now being expanded in an attempt to provide complete landscapes of the mutations that exist in tumours. Although the importance of cataloguing genome variations is well recognized, there are obvious difficulties in bridging the gaps between high-throughput resequencing information and the molecular mechanisms of cancer evolution. Here, we describe the current status of the high-throughput genomic technologies, and the current limitations of the associated computational analysis and experimental validation of cancer genetic variants. We emphasize how the current cancer-evolution models will be influenced by the high-throughput approaches, in particular through efforts devoted to monitoring tumour progression, and how, in turn, the integration of data and models will be translated into mechanistic knowledge and clinical applications. PMID:19305388

  6. Microfluidics in microbiology: putting a magnifying glass on microbes.

    PubMed

    Siddiqui, Sanya; Tufenkji, Nathalie; Moraes, Christopher

    2016-09-12

    Microfluidic technologies enable unique studies in the field of microbiology to facilitate our understanding of microorganisms. Using miniaturized and high-throughput experimental capabilities in microfluidics, devices with controlled microenvironments can be created for microbial studies in research fields such as healthcare and green energy. In this research highlight, we describe recently developed tools for diagnostic assays, high-throughput mutant screening, and the study of human disease development as well as a future outlook on microbes for renewable energy.

  7. Optimization of High-Throughput Sequencing Kinetics for determining enzymatic rate constants of thousands of RNA substrates

    PubMed Central

    Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.

    2016-01-01

    Quantification of the specificity of RNA binding proteins and RNA processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high throughput sequencing and internal competition kinetics to simultaneously monitor the processing rate constants of thousands of substrates by RNA processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and that these are the main sources of imprecision in the quantified results when the otherwise optimized guidelines are followed. PMID:27296633
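Internal-competition kinetics relates relative rate constants to the fraction of each substrate remaining after partial reaction: k_i/k_ref = ln(remaining_i/initial_i) / ln(remaining_ref/initial_ref). A minimal sketch; the read counts are made-up numbers for illustration:

```python
import math

# Relative rate constant of substrate i versus a reference substrate,
# from sequencing read counts before and after partial reaction.
def relative_k(initial_i, remaining_i, initial_ref, remaining_ref):
    return (math.log(remaining_i / initial_i)
            / math.log(remaining_ref / initial_ref))

# a substrate depleted to 25% while the reference drops to 50% is
# processed twice as fast as the reference:
print(round(relative_k(1000, 250, 1000, 500), 2))  # → 2.0
```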

  8. High-throughput determination of structural phase diagram and constituent phases using GRENDEL

    NASA Astrophysics Data System (ADS)

    Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.

    2015-11-01

    Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. To convert this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically, to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data, as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. The Sunburst radial tree map is also demonstrated as a tool for visualizing material structure-property relationships found through graph-based analysis.
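Endmember extraction of this kind factorizes a matrix of diffraction patterns into non-negative constituent-phase patterns and per-sample weights. The sketch below is a bare non-negative matrix factorization with multiplicative updates, offered only as an illustration of that core idea; GRENDEL itself adds graph- and physics-based constraints on top:

```python
import numpy as np

# Bare NMF: X (samples x pattern bins) ≈ W (sample weights) @ H (phases).
def nmf(X, n_components, n_iter=500, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], n_components)) + 0.1
    H = rng.random((n_components, X.shape[1])) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# five samples mixing two synthetic "phase" patterns:
phases = np.array([[1.0, 0.0, 0.5, 0.2],
                   [0.0, 1.0, 0.3, 0.6]])
weights = np.array([[1.0, 0.0], [0.75, 0.25], [0.5, 0.5],
                    [0.25, 0.75], [0.0, 1.0]])
X = weights @ phases
W, H = nmf(X, 2)
print("reconstruction error:", round(float(np.linalg.norm(W @ H - X)), 4))
```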

  9. Experimental design and statistical methods for improved hit detection in high-throughput screening.

    PubMed

    Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert

    2010-09-01

    Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
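The authors' preprocessing uses a trimmed-mean polish; the closely related median polish is sketched below as an illustration of how additive row and column (plate) biases are stripped from raw measurements before hit calling:

```python
import numpy as np

# Median polish: alternately subtract row and column medians until the
# residuals contain no additive row or column effects.
def median_polish(plate, n_iter=10):
    resid = np.array(plate, dtype=float)
    for _ in range(n_iter):
        resid -= np.median(resid, axis=1, keepdims=True)  # row effects
        resid -= np.median(resid, axis=0, keepdims=True)  # column effects
    return resid

# a plate that is purely row bias + column bias polishes to zero:
plate = np.add.outer([0.0, 1.0, 2.0], [0.0, 10.0, 20.0, 30.0])
print(float(np.abs(median_polish(plate)).max()))  # → 0.0
```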

  10. Non-Gaussian Distribution of DNA Barcode Extension In Nanochannels Using High-throughput Imaging

    NASA Astrophysics Data System (ADS)

    Sheats, Julian; Reinhart, Wesley; Reifenberger, Jeff; Gupta, Damini; Muralidhar, Abhiram; Cao, Han; Dorfman, Kevin

    2015-03-01

    We present experimental data for the extension of internal segments of highly confined DNA using a high-throughput experimental setup. Barcode-labeled E. coli genomic DNA molecules were imaged at a high areal density in square nanochannels with sizes ranging from 40 nm to 51 nm in width. Over 25,000 molecules were used to obtain more than 1,000,000 measurements for genomic distances between 2,500 bp and 100,000 bp. The distribution of extensions has positive excess kurtosis and is skewed left due to weak backfolding in the channel. As a result, the two Odijk theories for the chain extension and variance bracket the experimental data. We compared the data to predictions of a harmonic approximation for the confinement free energy and show that it produces a substantial error in the variance. These results suggest an inherent error associated with any statistical analysis of barcoded DNA that relies on harmonic models for chain extension.
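Sample skewness and excess kurtosis are the two moments used above to characterize how the extension distribution departs from a Gaussian. A minimal sketch of their computation:

```python
import numpy as np

# Third and fourth standardized moments of a sample; excess kurtosis
# subtracts 3 so that a Gaussian scores zero on both statistics.
def skew_and_excess_kurtosis(x):
    z = (np.asarray(x, dtype=float) - np.mean(x)) / np.std(x)
    return float(np.mean(z**3)), float(np.mean(z**4) - 3.0)

# sanity check: a large Gaussian sample has both statistics near zero
rng = np.random.default_rng(1)
s, k = skew_and_excess_kurtosis(rng.normal(size=100_000))
print(round(s, 3), round(k, 3))
```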

  11. Pot binding as a variable confounding plant phenotype: theoretical derivation and experimental observations.

    PubMed

    Sinclair, Thomas R; Manandhar, Anju; Shekoofa, Avat; Rosas-Anderson, Pablo; Bagherzadi, Laleh; Schoppach, Remy; Sadok, Walid; Rufty, Thomas W

    2017-04-01

    Theoretical derivation predicted growth retardation due to pot water limitations, i.e., pot binding. Experimental observations were consistent with these limitations. Combined, these results indicate a need for caution in high-throughput screening and phenotyping. Pot experiments are a mainstay in many plant studies, including the current emphasis on developing high-throughput, phenotyping systems. Pot studies can be vulnerable to decreased physiological activity of the plants particularly when pot volume is small, i.e., "pot binding". It is necessary to understand the conditions under which pot binding may exist to avoid the confounding influence of pot binding in interpreting experimental results. In this paper, a derivation is offered that gives well-defined conditions for the occurrence of pot binding based on restricted water availability. These results showed that not only are pot volume and plant size important variables, but the potting media is critical. Artificial potting mixtures used in many studies, including many high-throughput phenotyping systems, are particularly susceptible to the confounding influences of pot binding. Experimental studies for several crop species are presented that clearly show the existence of thresholds of plant leaf area at which various pot sizes and potting media result in the induction of pot binding even though there may be no immediate, visual plant symptoms. The derivation and experimental results showed that pot binding can readily occur in plant experiments if care is not given to have sufficiently large pots, suitable potting media, and maintenance of pot water status. Clear guidelines are provided for avoiding the confounding effects of water-limited pot binding in studying plant phenotype.

  12. Microreactor Cells for High-Throughput X-ray Absorption Spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beesley, Angela; Tsapatsaris, Nikolaos; Weiher, Norbert

    2007-01-19

    High-throughput experimentation has been applied to X-ray absorption spectroscopy as a novel route for increasing research productivity in the catalysis community. Suitable instrumentation has been developed for the rapid determination of the local structure of the metal component of precursors for supported catalysts. An automated analytical workflow was implemented that is much faster than traditional individual spectrum analysis. It allows the generation of structural data in quasi-real time. We describe initial results obtained from the automated high-throughput (HT) data reduction and analysis of a sample library implemented through the 96 well-plate industrial standard. The results show that a fully automated HT-XAS technology based on existing industry standards is feasible and useful for the rapid elucidation of the geometric and electronic structure of materials.

  13. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy.

    PubMed

    Lai, Yiu Wai; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios; Hofmann, Martin R; Ludwig, Alfred

    2011-10-01

    A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.
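In reflection-mode digital holographic microscopy, a measured phase step across a patterned film edge maps to thickness as t = λ·Δφ/(4π), the factor 4π arising because the light traverses the step twice. A minimal sketch; the wavelength and phase values are illustrative assumptions, not data from the paper:

```python
import math

# Convert a reflection-mode phase step to a film thickness (step height).
def thickness_nm(wavelength_nm: float, delta_phi_rad: float) -> float:
    return wavelength_nm * delta_phi_rad / (4.0 * math.pi)

# a pi/2 phase step at 633 nm corresponds to 633/8 = 79.125 nm of film:
print(round(thickness_nm(633.0, math.pi / 2), 3))
```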

  14. Integrative Systems Biology for Data Driven Knowledge Discovery

    PubMed Central

    Greene, Casey S.; Troyanskaya, Olga G.

    2015-01-01

    Integrative systems biology is an approach that brings together diverse high-throughput experiments and databases to gain new insights into biological processes or systems, from the molecular to the physiological level. These approaches rely on diverse high-throughput experimental techniques that generate heterogeneous data by assaying varying aspects of complex biological processes. Computational approaches are necessary to provide an integrative view of these experimental results and enable data-driven knowledge discovery. Hypotheses generated from these approaches can direct definitive molecular experiments in a cost-effective manner. Using integrative systems biology approaches, we can leverage existing biological knowledge and large-scale data to improve our understanding of as-yet unknown components of a system of interest and how its malfunction leads to disease. PMID:21044756

  15. A catalogue of polymorphisms related to xenobiotic metabolism and cancer susceptibility.

    PubMed

    Gemignani, Federica; Landi, Stefano; Vivant, Franck; Zienolddiny, Shanbeh; Brennan, Paul; Canzian, Federico

    2002-08-01

    High-throughput genotyping of multiple genes in large samples of cases and controls is likely to be important in identifying common genes that have a moderate effect on the development of specific diseases. We present here a comprehensive list of 313 known, experimentally confirmed polymorphisms in 54 genes that are particularly relevant for the metabolism of drugs, alcohol, tobacco, and other potential carcinogens. We have compiled a catalog with a standardized format that summarizes the genetic and biochemical properties of the selected polymorphisms. We have also confirmed or redesigned experimental conditions for simplex or multiplex PCR amplification of a subset of 168 SNPs of particular interest, which will provide the basis for the design of assays compatible with high-throughput genotyping.

  16. PUFKEY: A High-Security and High-Throughput Hardware True Random Number Generator for Sensor Networks

    PubMed Central

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-01-01

    Random number generators (RNGs) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard National Institute of Standards and Technology (NIST) randomness tests and is resilient to a wide range of security attacks. PMID:26501283
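    The NIST randomness suite mentioned above begins with the frequency (monobit) test; here is a stdlib-only sketch of that single test (illustrative, not the paper's implementation):

```python
import math

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: p-value for the
    hypothesis that the 0/1 stream comes from a fair, independent
    source. p = erfc(|S| / sqrt(2n)) with S the +/-1 running sum."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # map 1 -> +1, 0 -> -1 and sum
    s_obs = abs(s) / math.sqrt(n)           # normalized test statistic
    return math.erfc(s_obs / math.sqrt(2))  # two-sided p-value

# A balanced stream should pass; a heavily biased one should fail.
balanced = [0, 1] * 500           # 1000 alternating bits
biased = [1] * 990 + [0] * 10     # 99% ones
```

A generator is rejected at significance level 0.01 when the p-value falls below 0.01; the full suite applies fourteen further tests to the same stream.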

  17. Review of high-throughput techniques for detecting solid phase Transformation from material libraries produced by combinatorial methods

    NASA Technical Reports Server (NTRS)

    Lee, Jonathan A.

    2005-01-01

    High-throughput measurement techniques are reviewed for solid phase transformation from materials produced by combinatorial methods, which are highly efficient concepts for fabricating a large variety of material libraries with different compositional gradients on a single wafer. Combinatorial methods hold high potential for reducing the time and costs associated with the development of new materials, as compared to time-consuming and labor-intensive conventional methods that test large batches of material, one composition at a time. These high-throughput techniques can be automated to rapidly capture and analyze data, using the entire material library on a single wafer, thereby accelerating the pace of materials discovery and knowledge generation for solid phase transformations. The review covers experimental techniques that are applicable to inorganic materials such as shape memory alloys, graded materials, metal hydrides, ferric materials, semiconductors and industrial alloys.

  18. PUFKEY: a high-security and high-throughput hardware true random number generator for sensor networks.

    PubMed

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-10-16

    Random number generators (RNGs) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard National Institute of Standards and Technology (NIST) randomness tests and is resilient to a wide range of security attacks.

  19. Methods for processing high-throughput RNA sequencing data.

    PubMed

    Ares, Manuel

    2014-11-03

    High-throughput sequencing (HTS) methods for analyzing RNA populations (RNA-Seq) are gaining rapid application to many experimental situations. The steps in an RNA-Seq experiment require thought and planning, especially because the expense in time and materials is currently higher and the protocols are far less routine than those used for other high-throughput methods, such as microarrays. As always, good experimental design will make analysis and interpretation easier. Having a clear biological question, an idea about the best way to do the experiment, and an understanding of the number of replicates needed will make the entire process more satisfying. Whether the goal is capturing transcriptome complexity from a tissue or identifying small fragments of RNA cross-linked to a protein of interest, conversion of the RNA to cDNA followed by direct sequencing using the latest methods is a developing practice, with new technical modifications and applications appearing every day. Even more rapid are the development and improvement of methods for analysis of the very large amounts of data that arrive at the end of an RNA-Seq experiment, making considerations regarding reproducibility, validation, visualization, and interpretation increasingly important. This introduction is designed to review and emphasize a pathway of analysis from experimental design through data presentation that is likely to be successful, with the recognition that better methods are right around the corner. © 2014 Cold Spring Harbor Laboratory Press.

  20. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    PubMed

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters in which denaturation and binding experiments are set up and run one at a time. While these systems are highly informative about biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, throughput is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample-volume requirements and reduced user intervention time compared to the manual instruments have improved the turnover of calorimetry experiments in a high-throughput format, in which 25 or more runs can be conducted per day. The cost and effort needed to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to a higher throughput of data, which has been leveraged into grant support, has attracted new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  1. Quantitative monitoring of Arabidopsis thaliana growth and development using high-throughput plant phenotyping

    PubMed Central

    Arend, Daniel; Lange, Matthias; Pape, Jean-Michel; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Mücke, Ingo; Klukas, Christian; Altmann, Thomas; Scholz, Uwe; Junker, Astrid

    2016-01-01

    With the implementation of novel automated high-throughput methods and facilities in recent years, plant phenomics has developed into a highly interdisciplinary research domain integrating biology, engineering and bioinformatics. Here we present a dataset from a non-invasive high-throughput plant phenotyping experiment, which uses image- and image-analysis-based approaches to monitor the growth and development of 484 Arabidopsis thaliana (thale cress) plants. The result is a comprehensive dataset of images and extracted phenotypical features. Such datasets require detailed documentation, standardized description of experimental metadata, and sustainable data storage and publication in order to ensure the reproducibility of experiments, data reuse, and comparability within the scientific community. Therefore, the dataset presented here has been annotated using the standardized ISA-Tab format, following the recently published recommendations for the semantic description of plant phenotyping experiments. PMID:27529152

  2. High-throughput GPU-based LDPC decoding

    NASA Astrophysics Data System (ADS)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) code is a linear block code known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, Wi-Fi and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.
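    Full sum-product decoding is involved; the iterative check-and-flip structure it builds on can be seen in its simpler hard-decision cousin, Gallager bit-flipping, sketched here on a hypothetical toy parity-check matrix (illustrative only; real LDPC matrices are far larger and GPU decoders use soft-decision message passing):

```python
# Toy 4x6 parity-check matrix: each bit sits in two checks, each check
# covers three bits (hypothetical example, not a standardized code).
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
]

def bit_flip_decode(word, max_iters=10):
    """Hard-decision bit-flipping: repeatedly flip the bit that
    participates in the most violated parity checks."""
    word = list(word)
    for _ in range(max_iters):
        # Syndrome: which parity checks are currently violated?
        syndrome = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
        if not any(syndrome):
            return word  # all checks satisfied: valid codeword
        # Per bit, count memberships in violated checks.
        counts = [sum(row[j] for row, s in zip(H, syndrome) if s)
                  for j in range(len(word))]
        word[counts.index(max(counts))] ^= 1  # flip the worst offender
    return word

received = [1, 0, 0, 0, 0, 0]  # all-zero codeword with one bit error
```

On a GPU, the per-check and per-bit loops map naturally onto parallel threads, which is what makes the architecture attractive for high-throughput decoding.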

  3. Quantitative monitoring of Arabidopsis thaliana growth and development using high-throughput plant phenotyping.

    PubMed

    Arend, Daniel; Lange, Matthias; Pape, Jean-Michel; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Mücke, Ingo; Klukas, Christian; Altmann, Thomas; Scholz, Uwe; Junker, Astrid

    2016-08-16

    With the implementation of novel automated, high throughput methods and facilities in the last years, plant phenomics has developed into a highly interdisciplinary research domain integrating biology, engineering and bioinformatics. Here we present a dataset of a non-invasive high throughput plant phenotyping experiment, which uses image- and image analysis- based approaches to monitor the growth and development of 484 Arabidopsis thaliana plants (thale cress). The result is a comprehensive dataset of images and extracted phenotypical features. Such datasets require detailed documentation, standardized description of experimental metadata as well as sustainable data storage and publication in order to ensure the reproducibility of experiments, data reuse and comparability among the scientific community. Therefore the here presented dataset has been annotated using the standardized ISA-Tab format and considering the recently published recommendations for the semantical description of plant phenotyping experiments.

  4. AR-NE3A, a New Macromolecular Crystallography Beamline for Pharmaceutical Applications at the Photon Factory

    NASA Astrophysics Data System (ADS)

    Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi

    2010-06-01

    Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than that at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer; a fast-readout, high-gain CCD detector; and a sample-exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are carried out automatically based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.

  5. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    PubMed

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model that are stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
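    The diversity and neutrality statistics such tools compute follow textbook formulas; here is a stdlib-only sketch of one of them, Tajima's D (standard Tajima 1989 constants, not DnaSAM's own code), on a tiny hypothetical alignment:

```python
import math
from itertools import combinations

def tajimas_d(seqs):
    """Tajima's D from aligned, equal-length sequences: compares mean
    pairwise diversity (pi) with Watterson's estimator (S / a1)."""
    n = len(seqs)
    # S: number of segregating (polymorphic) sites
    S = sum(1 for col in zip(*seqs) if len(set(col)) > 1)
    # pi: mean number of pairwise differences
    pi = sum(sum(a != b for a, b in zip(x, y))
             for x, y in combinations(seqs, 2)) / math.comb(n, 2)
    # Tajima (1989) normalizing constants
    a1 = sum(1 / i for i in range(1, n))
    a2 = sum(1 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n * n + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))

alignment = ["AAT", "AAA", "ATA", "AAA"]  # hypothetical 4-sequence toy data
```

The Monte Carlo step DnaSAM adds is then conceptually simple: recompute the same statistic on many coalescent simulations (via ms) under the null model and read off the tail probability.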

  6. Turbocharged molecular discovery of OLED emitters: from high-throughput quantum simulation to highly efficient TADF devices

    NASA Astrophysics Data System (ADS)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Ha, Dong-Gwang; Einzinger, Markus; Wu, Tony; Baldo, Marc A.; Aspuru-Guzik, Alán

    2016-09-01

    Discovering new OLED emitters requires many experiments to synthesize candidates and test performance in devices. Large scale computer simulation can greatly speed this search process but the problem remains challenging enough that brute force application of massive computing power is not enough to successfully identify novel structures. We report a successful High Throughput Virtual Screening study that leveraged a range of methods to optimize the search process. The generation of candidate structures was constrained to contain combinatorial explosion. Simulations were tuned to the specific problem and calibrated with experimental results. Experimentalists and theorists actively collaborated such that experimental feedback was regularly utilized to update and shape the computational search. Supervised machine learning methods prioritized candidate structures prior to quantum chemistry simulation to prevent wasting compute on likely poor performers. With this combination of techniques, each multiplying the strength of the search, this effort managed to navigate an area of molecular space and identify hundreds of promising OLED candidate structures. An experimentally validated selection of this set shows emitters with external quantum efficiencies as high as 22%.

  7. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  8. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-05

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  9. Semantically enabled and statistically supported biological hypothesis testing with tissue microarray databases

    PubMed Central

    2011-01-01

    Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high throughput genomic hypothesis testing requires both of the capabilities of obtaining semantically relevant experimental data and of performing relevant statistical testing for the retrieved data. Tissue Microarray (TMA) data are semantically rich and contains many biologically important hypotheses waiting for high throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases by semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQLs to reflect the semantic structures of the hypotheses, and (3) performing statistical test with the result sets returned by the SPARQLs. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high throughput biological hypothesis testing. We believe that preliminary investigation before performing highly controlled experiment can be benefited. PMID:21342584
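    Step (3) above, statistical testing of query result sets, can be as simple as an exact test on a 2x2 contingency table assembled from the rows a SPARQL query returns; here is a stdlib-only sketch (illustrative, not Xperanto-RDF's implementation):

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact p-value for a 2x2 table [[a, b], [c, d]]:
    probability of a count of `a` or greater given fixed margins
    (upper hypergeometric tail)."""
    row1, col1, n = a + b, a + c, a + b + c + d
    denom = comb(n, col1)
    hi = min(row1, col1)
    return sum(comb(row1, k) * comb(n - row1, col1 - k)
               for k in range(a, hi + 1)) / denom

# Hypothetical counts, e.g. marker-positive vs -negative cores
# cross-tabulated against two outcome groups.
p = fisher_exact_one_sided(3, 1, 1, 3)
```

A hypothesis is then "supported" when the p-value from the table built out of the semantically retrieved cores falls below the chosen significance threshold.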

  10. High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret

    NASA Astrophysics Data System (ADS)

    Zheng, Guokuo

    In this work, a simple and efficient needleless high-throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of fine, nano-sized fibers of the highest quality, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce scaled-up amounts of nanofibers of the same or better quality than those produced with a single-needle lab setup, while varying solution concentration, voltage, and working distance. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were determined to be a voltage of 24 kV and a collector distance of 15 cm; more dilute solutions resulted in smaller-diameter fibers. Comparing the morphologies of the MoO3 nanofibers produced by the traditional and the high-throughput setups showed that they were very similar. Moreover, the nanofiber production rate is nearly 10 times that of traditional needle electrospinning. Thus, the high-throughput process has the potential to become an industrial nanomanufacturing process, and the materials processed by it may be used in filtration devices, in tissue engineering, and as sensors.

  11. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  12. Theory and implementation of a very high throughput true random number generator in field programmable gate array.

    PubMed

    Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao

    2016-04-01

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  13. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    PubMed

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  14. High-throughput strategies for the discovery and engineering of enzymes for biocatalysis.

    PubMed

    Jacques, Philippe; Béchet, Max; Bigan, Muriel; Caly, Delphine; Chataigné, Gabrielle; Coutte, François; Flahaut, Christophe; Heuson, Egon; Leclère, Valérie; Lecouturier, Didier; Phalip, Vincent; Ravallec, Rozenn; Dhulster, Pascal; Froidevaux, Rénato

    2017-02-01

    Innovations in novel enzyme discovery impact a wide range of industries for which biocatalysis and biotransformations represent a great challenge, i.e., the food industry, polymers and the chemical industry. Key tools and technologies, such as bioinformatics tools to guide mutant library design, molecular biology tools to create mutant libraries, microfluidics/microplates, parallel miniscale bioreactors and mass spectrometry technologies to create high-throughput screening methods, and experimental design tools for screening and optimization, enable the accelerated discovery, development and implementation of enzymes and whole cells in (bio)processes. These technological innovations are also accompanied by the development and implementation of clean and sustainable integrated processes to meet the growing needs of the chemical, pharmaceutical, environmental and biorefinery industries. This review gives an overview of the benefits of the high-throughput screening approach, from the discovery and engineering of biocatalysts to cell culture for optimizing their production in integrated processes and their extraction/purification.

  15. A high-throughput colorimetric assay for glucose detection based on glucose oxidase-catalyzed enlargement of gold nanoparticles

    NASA Astrophysics Data System (ADS)

    Xiong, Yanmei; Zhang, Yuyan; Rong, Pengfei; Yang, Jie; Wang, Wei; Liu, Dingbin

    2015-09-01

    We developed a simple high-throughput colorimetric assay to detect glucose based on the glucose oxidase (GOx)-catalysed enlargement of gold nanoparticles (AuNPs). Compared with the currently available glucose kit method, the AuNP-based assay provides higher clinical sensitivity at lower cost, indicating its great potential to be a powerful tool for clinical screening of glucose. Electronic supplementary information (ESI) available: Experimental section and additional figures. See DOI: 10.1039/c5nr03758a
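    Quantification with a colorimetric assay like this typically runs through a standard curve: fit absorbance against known concentrations, then invert the fit for unknown wells. A generic sketch of that step with hypothetical, exactly linear numbers (not the paper's data or protocol):

```python
# Hypothetical glucose standards (mM) and their absorbance readings,
# fabricated to follow A = 0.2 * c + 0.05 exactly for illustration.
conc = [0.0, 1.0, 2.0, 4.0, 8.0]
absorbance = [0.05, 0.25, 0.45, 0.85, 1.65]

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, intercept = fit_line(conc, absorbance)

def to_concentration(a):
    """Invert the standard curve to read an unknown well."""
    return (a - intercept) / slope
```

In a 96-well format the same inversion is applied to every well, which is where the throughput gain over one-at-a-time cuvette measurements comes from.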

  16. High-throughput investigation of catalysts for JP-8 fuel cracking to liquefied petroleum gas.

    PubMed

    Bedenbaugh, John E; Kim, Sungtak; Sasmaz, Erdem; Lauterbach, Jochen

    2013-09-09

    Portable power technologies for military applications necessitate the production of fuels similar to LPG from existing feedstocks. Catalytic cracking of military jet fuel to form a mixture of C₂-C₄ hydrocarbons was investigated using high-throughput experimentation. Cracking experiments were performed in a gas-phase, 16-sample high-throughput reactor. Zeolite ZSM-5 catalysts with low Si/Al ratios (≤25) demonstrated the highest production of C₂-C₄ hydrocarbons at moderate reaction temperatures (623-823 K). ZSM-5 catalysts were optimized for JP-8 cracking activity to LPG through varying reaction temperature and framework Si/Al ratio. The reducing atmosphere required during catalytic cracking resulted in coking of the catalyst and a commensurate decrease in conversion rate. Rare earth metal promoters for ZSM-5 catalysts were screened to reduce coking deactivation rates, while noble metal promoters reduced onset temperatures for coke burnoff regeneration.

  17. High-throughput countercurrent microextraction in passive mode.

    PubMed

    Xie, Tingliang; Xu, Cong

    2018-05-15

    Although microextraction is much more efficient than conventional macroextraction, its practical application has been limited by low throughputs and difficulties in constructing robust countercurrent microextraction (CCME) systems. In this work, a robust CCME process was established based on a novel passive microextractor with four units without any moving parts. The passive microextractor has internal recirculation and can efficiently mix two immiscible liquids. The hydraulic characteristics as well as the extraction and back-extraction performance of the passive CCME were investigated experimentally. The recovery efficiencies of the passive CCME were 1.43-1.68 times larger than the best values achieved using cocurrent extraction. Furthermore, the total throughput of the passive CCME developed in this work was about one to three orders of magnitude higher than that of other passive CCME systems reported in the literature. Therefore, a robust CCME process with high throughputs has been successfully constructed, which may promote the application of passive CCME in a wide variety of fields.

  18. Using experimental design and spatial analyses to improve the precision of NDVI estimates in upland cotton field trials

    USDA-ARS?s Scientific Manuscript database

    Controlling for spatial variability is important in high-throughput phenotyping studies that enable large numbers of genotypes to be evaluated across time and space. In the current study, we compared the efficacy of different experimental designs and spatial models in the analysis of canopy spectral...

  19. From genes to protein mechanics on a chip.

    PubMed

    Otten, Marcus; Ott, Wolfgang; Jobst, Markus A; Milles, Lukas F; Verdorfer, Tobias; Pippig, Diana A; Nash, Michael A; Gaub, Hermann E

    2014-11-01

    Single-molecule force spectroscopy enables mechanical testing of individual proteins, but low experimental throughput limits the ability to screen constructs in parallel. We describe a microfluidic platform for on-chip expression, covalent surface attachment and measurement of single-molecule protein mechanical properties. A dockerin tag on each protein molecule allowed us to perform thousands of pulling cycles using a single cohesin-modified cantilever. The ability to synthesize and mechanically probe protein libraries enables high-throughput mechanical phenotyping.

  20. Precise, High-throughput Analysis of Bacterial Growth.

    PubMed

    Kurokawa, Masaomi; Ying, Bei-Wen

    2017-09-19

    Bacterial growth is a central concept in the development of modern microbial physiology, as well as in the investigation of cellular dynamics at the systems level. Recent studies have reported correlations between bacterial growth and genome-wide events, such as genome reduction and transcriptome reorganization. Correctly analyzing bacterial growth is crucial for understanding the growth-dependent coordination of gene functions and cellular components. Accordingly, the precise quantitative evaluation of bacterial growth in a high-throughput manner is required. Emerging technological developments offer new experimental tools that allow updates of the methods used for studying bacterial growth. The protocol introduced here employs a microplate reader with a highly optimized experimental procedure for the reproducible and precise evaluation of bacterial growth. This protocol was used to evaluate the growth of several previously described Escherichia coli strains. The main steps of the protocol are as follows: the preparation of a large number of cell stocks in small vials for repeated tests with reproducible results, the use of 96-well plates for high-throughput growth evaluation, and the manual calculation of two major parameters (i.e., maximal growth rate and population density) representing the growth dynamics. In comparison to the traditional colony-forming unit (CFU) assay, which counts the cells that are cultured in glass tubes over time on agar plates, the present method is more efficient and provides more detailed temporal records of growth changes, but has a stricter detection limit at low population densities. In summary, the described method is advantageous for the precise and reproducible high-throughput analysis of bacterial growth, which can be used to draw conceptual conclusions or to make theoretical observations.
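    The two growth parameters named in the abstract can be extracted from optical density (OD) time series with a few lines of code. The sketch below is an illustrative reconstruction (the helper name and the synthetic logistic curve are assumptions, not the authors' protocol, which describes a manual calculation): the maximal growth rate is taken as the steepest sliding-window slope of ln(OD), and the maximal population density as the peak OD.

```python
import numpy as np

def growth_parameters(t_hours, od, window=3):
    """Estimate maximal growth rate (h^-1) and maximal population density
    (OD units) from a growth curve: the growth rate is the steepest slope
    of ln(OD) fitted over a sliding window of consecutive readings."""
    ln_od = np.log(od)
    rates = [
        np.polyfit(t_hours[i:i + window], ln_od[i:i + window], 1)[0]
        for i in range(len(t_hours) - window + 1)
    ]
    return max(rates), max(od)

# Synthetic logistic growth curve (illustration only, not real plate data)
t = np.arange(0, 12, 0.5)                  # readings every 30 min
od = 1.0 / (1 + 99 * np.exp(-0.8 * t))     # early exponential rate ~0.8 h^-1
mu_max, od_max = growth_parameters(t, od)
```

    On this synthetic curve the estimated rate approaches the underlying 0.8 h⁻¹, slightly reduced by the finite window, mirroring how the sliding-window fit smooths real microplate data.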

  1. Two-dimensional materials from high-throughput computational exfoliation of experimentally known compounds.

    PubMed

    Mounet, Nicolas; Gibertini, Marco; Schwaller, Philippe; Campi, Davide; Merkys, Andrius; Marrazzo, Antimo; Sohier, Thibault; Castelli, Ivano Eligio; Cepellotti, Andrea; Pizzi, Giovanni; Marzari, Nicola

    2018-03-01

    Two-dimensional (2D) materials have emerged as promising candidates for next-generation electronic and optoelectronic applications. Yet, only a few dozen 2D materials have been successfully synthesized or exfoliated. Here, we search for 2D materials that can be easily exfoliated from their parent compounds. Starting from 108,423 unique, experimentally known 3D compounds, we identify a subset of 5,619 compounds that appear layered according to robust geometric and bonding criteria. High-throughput calculations using van der Waals density functional theory, validated against experimental structural data and calculated random phase approximation binding energies, further allowed the identification of 1,825 compounds that are either easily or potentially exfoliable. In particular, the subset of 1,036 easily exfoliable cases provides novel structural prototypes and simple ternary compounds as well as a large portfolio of materials to search from for optimal properties. For a subset of 258 compounds, we explore vibrational, electronic, magnetic and topological properties, identifying 56 ferromagnetic and antiferromagnetic systems, including half-metals and half-semiconductors.

  2. Two-dimensional materials from high-throughput computational exfoliation of experimentally known compounds

    NASA Astrophysics Data System (ADS)

    Mounet, Nicolas; Gibertini, Marco; Schwaller, Philippe; Campi, Davide; Merkys, Andrius; Marrazzo, Antimo; Sohier, Thibault; Castelli, Ivano Eligio; Cepellotti, Andrea; Pizzi, Giovanni; Marzari, Nicola

    2018-02-01

    Two-dimensional (2D) materials have emerged as promising candidates for next-generation electronic and optoelectronic applications. Yet, only a few dozen 2D materials have been successfully synthesized or exfoliated. Here, we search for 2D materials that can be easily exfoliated from their parent compounds. Starting from 108,423 unique, experimentally known 3D compounds, we identify a subset of 5,619 compounds that appear layered according to robust geometric and bonding criteria. High-throughput calculations using van der Waals density functional theory, validated against experimental structural data and calculated random phase approximation binding energies, further allowed the identification of 1,825 compounds that are either easily or potentially exfoliable. In particular, the subset of 1,036 easily exfoliable cases provides novel structural prototypes and simple ternary compounds as well as a large portfolio of materials to search from for optimal properties. For a subset of 258 compounds, we explore vibrational, electronic, magnetic and topological properties, identifying 56 ferromagnetic and antiferromagnetic systems, including half-metals and half-semiconductors.

  3. High-throughput, pooled sequencing identifies mutations in NUBPL and FOXRED1 in human complex I deficiency

    PubMed Central

    Calvo, Sarah E; Tucker, Elena J; Compton, Alison G; Kirby, Denise M; Crawford, Gabriel; Burtt, Noel P; Rivas, Manuel A; Guiducci, Candace; Bruno, Damien L; Goldberger, Olga A; Redman, Michelle C; Wiltshire, Esko; Wilson, Callum J; Altshuler, David; Gabriel, Stacey B; Daly, Mark J; Thorburn, David R; Mootha, Vamsi K

    2010-01-01

    Discovering the molecular basis of mitochondrial respiratory chain disease is challenging given the large number of both mitochondrial and nuclear genes involved. We report a strategy of focused candidate gene prediction, high-throughput sequencing, and experimental validation to uncover the molecular basis of mitochondrial complex I (CI) disorders. We created five pools of DNA from a cohort of 103 patients and then performed deep sequencing of 103 candidate genes to spotlight 151 rare variants predicted to impact protein function. We used confirmatory experiments to establish genetic diagnoses in 22% of previously unsolved cases, and discovered that defects in NUBPL and FOXRED1 can cause CI deficiency. Our study illustrates how large-scale sequencing, coupled with functional prediction and experimental validation, can reveal novel disease-causing mutations in individual patients. PMID:20818383

  4. High-throughput search for caloric materials: the CaloriCool approach

    NASA Astrophysics Data System (ADS)

    Zarkevich, N. A.; Johnson, D. D.; Pecharsky, V. K.

    2018-01-01

    The high-throughput search paradigm adopted by the newly established caloric materials consortium, CaloriCool®, with the goal of substantially accelerating the discovery and design of novel caloric materials, is briefly discussed. We begin by describing material selection criteria based on known properties, followed by fast heuristic estimates and ab initio calculations, all of which have been implemented in a set of automated computational tools and measurements. We also demonstrate how theoretical and computational methods serve as a guide for experimental efforts by considering a representative example from the field of magnetocaloric materials.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hattrick-Simpers, Jason R.; Gregoire, John M.; Kusne, A. Gilad

    With their ability to rapidly elucidate composition-structure-property relationships, high-throughput experimental studies have revolutionized how materials are discovered, optimized, and commercialized. It is now possible to synthesize and characterize high-throughput libraries that systematically address thousands of individual cuts of fabrication parameter space. An unresolved issue remains transforming structural characterization data into phase mappings. This difficulty is related to the complex information present in diffraction and spectroscopic data and its variation with composition and processing. Here, we review the field of automated phase diagram attribution and discuss the impact that emerging computational approaches will have in the generation of phase diagrams and beyond.

  6. High-throughput search for caloric materials: the CaloriCool approach

    DOE PAGES

    Zarkevich, Nikolai A.; Johnson, Duane D.; Pecharsky, V. K.

    2017-12-13

    The high-throughput search paradigm adopted by the newly established caloric materials consortium, CaloriCool®, with the goal of substantially accelerating the discovery and design of novel caloric materials, is briefly discussed. Here, we begin by describing material selection criteria based on known properties, followed by fast heuristic estimates and ab initio calculations, all of which have been implemented in a set of automated computational tools and measurements. We also demonstrate how theoretical and computational methods serve as a guide for experimental efforts by considering a representative example from the field of magnetocaloric materials.

  7. Large-scale DNA Barcode Library Generation for Biomolecule Identification in High-throughput Screens.

    PubMed

    Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio

    2017-10-24

    High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general-purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
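    The four constraints listed in the abstract (GC content, homopolymer length, pairwise Hamming distance, blacklisted subsequences) can be made concrete with a naïve greedy generator, essentially the baseline approach the authors' framework is designed to outperform. All thresholds, lengths, and the blacklist site below are illustrative assumptions, not the paper's values:

```python
import itertools
import random

def gc_ok(seq, lo=0.4, hi=0.6):
    # GC fraction must fall in [lo, hi]
    gc = sum(base in "GC" for base in seq) / len(seq)
    return lo <= gc <= hi

def homopolymer_ok(seq, max_run=2):
    # no run of identical bases longer than max_run
    return all(sum(1 for _ in run) <= max_run for _, run in itertools.groupby(seq))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def build_library(n, length=8, min_dist=3, blacklist=("GAATTC",), seed=0):
    """Naive greedy generation: draw random candidates and keep those that
    pass every filter and sit at Hamming distance >= min_dist from all
    barcodes accepted so far (quadratic cost, hence 'naive')."""
    rng = random.Random(seed)
    library = []
    while len(library) < n:
        cand = "".join(rng.choice("ACGT") for _ in range(length))
        if not (gc_ok(cand) and homopolymer_ok(cand)):
            continue
        if any(site in cand for site in blacklist):
            continue
        if all(hamming(cand, kept) >= min_dist for kept in library):
            library.append(cand)
    return library

lib = build_library(50)
```

    The distance check against every accepted barcode is what makes this approach scale poorly toward million-barcode libraries, which motivates the faster framework the paper reports.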

  8. AR-NE3A, a New Macromolecular Crystallography Beamline for Pharmaceutical Applications at the Photon Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko

    2010-06-23

    Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at existing high-throughput beams at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, fast-readout, high-gain CCD detector, and sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has a priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.

  9. High-throughput heterodyne thermoreflectance: Application to thermal conductivity measurements of a Fe-Si-Ge thin film alloy library.

    PubMed

    d'Acremont, Quentin; Pernot, Gilles; Rampnoux, Jean-Michel; Furlan, Andrej; Lacroix, David; Ludwig, Alfred; Dilhaire, Stefan

    2017-07-01

    A High-Throughput Time-Domain ThermoReflectance (HT-TDTR) technique was developed to perform fast thermal conductivity measurements with minimum user actions required. This new setup is based on a heterodyne picosecond thermoreflectance system. The use of two different laser oscillators has been proven to reduce the acquisition time by two orders of magnitude and avoid the experimental artefacts usually induced by moving the elements present in TDTR systems. An amplitude modulation associated with a lock-in detection scheme is included to maintain a high sensitivity to thermal properties. We demonstrate the capabilities of the HT-TDTR setup to perform high-throughput thermal analysis by mapping thermal conductivity and interface resistances of a ternary thin film silicide library FexSiyGe100-x-y (20

  10. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    NASA Astrophysics Data System (ADS)

    Alexander, Kristen; Hampton, Meredith; Lopez, Rene; Desimone, Joseph

    2009-03-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  11. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    NASA Astrophysics Data System (ADS)

    Alexander, Kristen; Lopez, Rene; Hampton, Meredith; Desimone, Joseph

    2008-10-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  12. High-throughput heterodyne thermoreflectance: Application to thermal conductivity measurements of a Fe-Si-Ge thin film alloy library

    NASA Astrophysics Data System (ADS)

    d'Acremont, Quentin; Pernot, Gilles; Rampnoux, Jean-Michel; Furlan, Andrej; Lacroix, David; Ludwig, Alfred; Dilhaire, Stefan

    2017-07-01

    A High-Throughput Time-Domain ThermoReflectance (HT-TDTR) technique was developed to perform fast thermal conductivity measurements with minimum user actions required. This new setup is based on a heterodyne picosecond thermoreflectance system. The use of two different laser oscillators has been proven to reduce the acquisition time by two orders of magnitude and avoid the experimental artefacts usually induced by moving the elements present in TDTR systems. An amplitude modulation associated with a lock-in detection scheme is included to maintain a high sensitivity to thermal properties. We demonstrate the capabilities of the HT-TDTR setup to perform high-throughput thermal analysis by mapping thermal conductivity and interface resistances of a ternary thin film silicide library FexSiyGe100-x-y (20

  13. Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.

    PubMed

    Bühler, Jonas; von Lieres, Eric; Huber, Gregor J

    2018-01-01

    Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase in sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule, depending on the required statistical reliability of the data acquired in future experiments.
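    The rotating scheme described above can be sketched as a simple round-robin scheduler. The function and its parameters are hypothetical illustrations of the idea, not the authors' software or their optimized schedules:

```python
def rotating_schedule(n_plants, slot_min=5, total_min=60):
    """Round-robin measurement schedule: each plant is measured for one
    slot per rotation, so every plant's time series has temporal gaps
    while the instrument stays fully occupied."""
    schedule = {p: [] for p in range(n_plants)}
    t = 0
    while t + slot_min <= total_min:
        plant = (t // slot_min) % n_plants     # next plant in the rotation
        schedule[plant].append((t, t + slot_min))
        t += slot_min
    return schedule

# Four plants sharing one detector over an hour gives 4 samples/h
# instead of the 0.5-1 samples/h of uninterrupted one-hour series
sched = rotating_schedule(4, slot_min=5, total_min=60)
```

    Each plant ends up with several short, evenly spaced measurement windows; the mechanistic transport model then fits these interrupted series and reports the parameter uncertainty incurred by the gaps.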

  14. toxoMine: an integrated omics data warehouse for Toxoplasma gondii systems biology research

    PubMed Central

    Rhee, David B.; Croken, Matthew McKnight; Shieh, Kevin R.; Sullivan, Julie; Micklem, Gos; Kim, Kami; Golden, Aaron

    2015-01-01

    Toxoplasma gondii (T. gondii) is an obligate intracellular parasite that must monitor for changes in the host environment and respond accordingly; however, it is still not fully known which genetic or epigenetic factors are involved in regulating virulence traits of T. gondii. There are on-going efforts to elucidate the mechanisms regulating the stage transition process via the application of high-throughput epigenomics, genomics and proteomics techniques. Given the range of experimental conditions and the typical yield from such high-throughput techniques, a new challenge arises: how to effectively collect, organize and disseminate the generated data for subsequent data analysis. Here, we describe toxoMine, which provides a powerful interface to support sophisticated integrative exploration of high-throughput experimental data and metadata, providing researchers with a more tractable means toward understanding how genetic and/or epigenetic factors play a coordinated role in determining pathogenicity of T. gondii. As a data warehouse, toxoMine allows integration of high-throughput data sets with public T. gondii data. toxoMine is also able to execute complex queries involving multiple data sets with straightforward user interaction. Furthermore, toxoMine allows users to define their own parameters during the search process that gives users near-limitless search and query capabilities. The interoperability feature also allows users to query and examine data available in other InterMine systems, which would effectively augment the search scope beyond what is available to toxoMine. toxoMine complements the major community database ToxoDB by providing a data warehouse that enables more extensive integrative studies for T. gondii. Given all these factors, we believe it will become an indispensable resource to the greater infectious disease research community. Database URL: http://toxomine.org PMID:26130662

  15. ElectroTaxis-on-a-Chip (ETC): an integrated quantitative high-throughput screening platform for electrical field-directed cell migration.

    PubMed

    Zhao, Siwei; Zhu, Kan; Zhang, Yan; Zhu, Zijie; Xu, Zhengping; Zhao, Min; Pan, Tingrui

    2014-11-21

    Both endogenous and externally applied electrical stimulation can affect a wide range of cellular functions, including growth, migration, differentiation and division. Among those effects, electrical field (EF)-directed cell migration, also known as electrotaxis, has received broad attention because it holds great potential for facilitating clinical wound healing. Electrotaxis experiments are conventionally conducted in centimetre-sized flow chambers built in Petri dishes. Despite recent efforts to adapt microfluidics for electrotaxis studies, the current experimental setup remains cumbersome because of the need for an external power supply and EF controlling/monitoring systems. There is also a lack of parallel experimental systems for high-throughput electrotaxis studies. In this paper, we present the first independently operable microfluidic platform for high-throughput electrotaxis studies, integrating all functional components for cell migration under EF stimulation (except microscopy) on a compact footprint (the same as a credit card), referred to as ElectroTaxis-on-a-Chip (ETC). Inspired by the R-2R resistor ladder topology in digital signal processing, we develop a systematic approach to design an infinitely expandable microfluidic generator of EF gradients for high-throughput and quantitative studies of EF-directed cell migration. Furthermore, a vacuum-assisted assembly method is utilized to allow direct and reversible attachment of our device to existing cell culture media on biological surfaces, which separates the cell culture and device preparation/fabrication steps. We have demonstrated that our ETC platform is capable of screening human cornea epithelial cell migration under the stimulation of an EF gradient spanning over three orders of magnitude. The screening results lead to the identification of the EF-sensitive range of that cell type, which can provide valuable guidance for the clinical application of EF-facilitated wound healing.
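    The electrical principle behind the R-2R inspiration can be checked numerically. In an idealized R-2R ladder each successive tap sees half the previous voltage, so eleven taps span a factor of 2¹⁰ ≈ 10³, matching the three orders of magnitude quoted above. The sketch below solves an idealized lumped-resistor ladder by nodal analysis; it is an electrical analogy, not the authors' microfluidic geometry:

```python
import numpy as np

def r2r_node_voltages(n, v_in=1.0, r=1.0):
    """Node voltages of an idealized R-2R ladder: series R between nodes,
    a 2R shunt to ground at every node, and a 2R termination after the
    last node; solved by standard nodal analysis (Kirchhoff's laws)."""
    g, g2 = 1.0 / r, 1.0 / (2.0 * r)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for k in range(n):
        A[k, k] += g2                  # 2R shunt at node k
        A[k, k] += g                   # series R to the left neighbour
        if k == 0:
            b[k] = g * v_in            # the left neighbour is the source
        else:
            A[k, k - 1] -= g
        if k < n - 1:
            A[k, k] += g               # series R to the right neighbour
            A[k, k + 1] -= g
        else:
            A[k, k] += g2              # 2R termination at the last node
    return np.linalg.solve(A, b)

# Eleven taps: each stage halves the voltage, so the first and last taps
# differ by a factor of 2**10, i.e. three orders of magnitude
v = r2r_node_voltages(11)
```

    The halving property is what makes the topology "infinitely expandable": adding one more identical stage extends the gradient by another factor of two without redesigning the rest of the network.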

  16. Applicability of discovery science approach to determine biological effects of mobile phone radiation.

    PubMed

    Leszczynski, Dariusz; Nylund, Reetta; Joenväärä, Sakari; Reivinen, Jukka

    2004-02-01

    We argue that the use of high-throughput screening techniques, although expensive and laborious, is justified and necessary in studies that examine biological effects of mobile phone radiation. The "case of hsp27 protein" presented here suggests that even proteins whose expression and activity are only modestly altered by exposure to mobile phone radiation might have an impact on cell physiology. However, this short communication does not attempt to present the full scientific evidence, which is far too large to be presented in a single article and is being prepared for publication in three separate research articles. The examples of experimental evidence presented here were designed to show the flow of the experimental process, demonstrating that the use of high-throughput screening techniques might help in the rapid identification of responding proteins. This, in turn, can help speed up the process of determining whether these changes might affect human health.

  17. High pressure inertial focusing for separating and concentrating bacteria at high throughput

    NASA Astrophysics Data System (ADS)

    Cruz, J.; Hooshmand Zadeh, S.; Graells, T.; Andersson, M.; Malmström, J.; Wu, Z. G.; Hjort, K.

    2017-08-01

    Inertial focusing is a promising microfluidic technology for the concentration and separation of particles by size. However, there is a strong correlation of increased pressure with decreased particle size. Theory and experimental results for larger particles were used to scale down the phenomenon and find the conditions that focus 1 µm particles. High pressure experiments in robust glass chips were used to demonstrate the alignment. We show how the technique works for 1 µm spherical polystyrene particles and for Escherichia coli, without harming the bacteria at 50 µl min⁻¹. The potential to focus bacteria, simplicity of use and high throughput make this technology interesting for healthcare applications, where concentration and purification of a sample may be required as an initial step.

  18. Model-based high-throughput design of ion exchange protein chromatography.

    PubMed

    Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo

    2016-08-12

    This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction; high-throughput experimentation using (2) experiments in diluted conditions and (3) robotic automated liquid handling workstations (robotic workstations); and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine the operating parameter space that allows for satisfactory purification of the protein of interest at the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work.
We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate it has the ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Tackling the widespread and critical impact of batch effects in high-throughput data.

    PubMed

    Leek, Jeffrey T; Scharpf, Robert B; Bravo, Héctor Corrada; Simcha, David; Langmead, Benjamin; Johnson, W Evan; Geman, Donald; Baggerly, Keith; Irizarry, Rafael A

    2010-10-01

    High-throughput technologies are widely used, for example to assay genetic variants, gene and protein expression, and epigenetic modifications. One often overlooked complication with such studies is batch effects, which occur because measurements are affected by laboratory conditions, reagent lots and personnel differences. This becomes a major problem when batch effects are correlated with an outcome of interest and lead to incorrect conclusions. Using both published studies and our own analyses, we argue that batch effects (as well as other technical and biological artefacts) are widespread and critical to address. We review experimental and computational approaches for doing so.
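    The core hazard the review describes, a batch effect correlated with the outcome of interest, is easy to simulate. The toy example below uses synthetic data (not from any of the reviewed studies) to show a null gene that appears differentially expressed purely because group and batch coincide, and how a balanced design plus per-batch centring removes the artefact:

```python
import numpy as np

rng = np.random.default_rng(42)

def measure(n, batch_shift):
    # Expression of a null gene: no true group effect, only a batch offset
    return rng.normal(0.0, 1.0, n) + batch_shift

# Confounded design: all controls run in batch 1, all cases in batch 2.
# The batch offset masquerades as a biological effect.
controls = measure(100, batch_shift=0.0)
cases = measure(100, batch_shift=1.5)
spurious_diff = cases.mean() - controls.mean()      # ~1.5, yet the gene is null

# Balanced design: both groups split across both batches, so subtracting
# each batch's own mean removes the technical offset without touching
# any (here nonexistent) group difference.
b1 = np.concatenate([measure(50, 0.0), measure(50, 0.0)])   # ctrl + case, batch 1
b2 = np.concatenate([measure(50, 1.5), measure(50, 1.5)])   # ctrl + case, batch 2
corrected = np.concatenate([b1 - b1.mean(), b2 - b2.mean()])
ctrl_c = np.concatenate([corrected[:50], corrected[100:150]])
case_c = np.concatenate([corrected[50:100], corrected[150:200]])
true_diff = case_c.mean() - ctrl_c.mean()           # near zero after centring
```

    The contrast illustrates the review's central point: no correction can rescue a fully confounded design, so batch must be decoupled from outcome at the experimental design stage.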

  20. Experimental demonstration of a real-time high-throughput digital DC blocker for compensating ADC imperfections in optical fast-OFDM receivers.

    PubMed

    Zhang, Lu; Ouyang, Xing; Shao, Xiaopeng; Zhao, Jian

    2016-06-27

    Performance degradation induced by the DC components at the output of a real-time analogue-to-digital converter (ADC) is experimentally investigated for an optical fast-OFDM receiver. To compensate for this degradation, register transfer level (RTL) circuits for a real-time digital DC blocker with 20 GS/s throughput are proposed and implemented in a field-programmable gate array (FPGA). The performance of the proposed real-time digital DC blocker is experimentally investigated in a 15 Gb/s optical fast-OFDM system with intensity modulation and direct detection over 40 km of standard single-mode fibre. The results show that the fixed-point DC blocker has negligible performance penalty compared to the offline floating-point one, and can overcome the error floor of the fast-OFDM receiver caused by the DC components from the real-time ADC output.
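    The abstract does not reproduce the RTL design, but the textbook first-order IIR DC blocker conveys the signal-processing idea. The sketch below is a floating-point reference model under that assumption, not the authors' fixed-point FPGA circuit:

```python
import numpy as np

def dc_blocker(x, r=0.995):
    """First-order IIR DC blocker: y[n] = x[n] - x[n-1] + r*y[n-1].
    The zero at z = 1 notches out DC; the pole at z = r keeps the
    passband response close to unity for 0 < r < 1."""
    y = np.zeros_like(x, dtype=float)
    prev_x = prev_y = 0.0
    for n, xn in enumerate(x):
        y[n] = xn - prev_x + r * prev_y
        prev_x, prev_y = xn, y[n]
    return y

# A tone riding on a DC offset, as at a real ADC output
t = np.arange(4096)
x = 0.3 + np.sin(2 * np.pi * 0.05 * t)
y = dc_blocker(x)
residual_dc = y[2048:].mean()    # offset largely removed after settling
```

    The settling time scales as 1/(1-r), which is the same latency/notch-width trade-off a fixed-point hardware implementation must make when choosing the coefficient word length.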

  1. Drug Discovery in Fish, Flies, and Worms

    PubMed Central

    Strange, Kevin

    2016-01-01

    Nonmammalian model organisms such as the nematode Caenorhabditis elegans, the fruit fly Drosophila melanogaster, and the zebrafish Danio rerio provide numerous experimental advantages for drug discovery, including genetic and molecular tractability, amenability to high-throughput screening methods, reduced experimental costs, and increased experimental throughput compared to traditional mammalian models. An interdisciplinary approach that strategically combines the study of nonmammalian and mammalian animal models with diverse experimental tools has provided, and will continue to provide, deep molecular and genetic understanding of human disease and will significantly enhance the discovery and application of new therapies to treat those diseases. This review will provide an overview of C. elegans, Drosophila, and zebrafish biology and husbandry and will discuss how these models are being used for phenotype-based drug screening and for identification of drug targets and mechanisms of action. The review will also describe how these and other nonmammalian model organisms are uniquely suited for the discovery of drug-based regenerative medicine therapies. PMID:28053067

  2. Applications of high throughput (combinatorial) methodologies to electronic, magnetic, optical, and energy-related materials

    NASA Astrophysics Data System (ADS)

    Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.

    2013-06-01

    High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. 
Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.

  3. Intuitive web-based experimental design for high-throughput biomedical data.

    PubMed

    Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven

    2015-01-01

    Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data are accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions under study as well as information that might be relevant for failure analysis or for further experiments in the future. In addition to the management of this information, means for integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information, we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and meta-information for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.
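At its core, the factor-based design described above expands every combination of factor levels into one sample row, which can then be exported as a sample sheet. A minimal sketch (the identifier scheme and factor names are illustrative, not the system's actual format):

```python
import csv
import io
from itertools import product

def build_sample_sheet(factors):
    """Expand a dict of {factor: [levels]} into one row per combination,
    assigning a simple sequential sample identifier."""
    names = list(factors)
    rows = []
    for i, combo in enumerate(product(*factors.values()), start=1):
        rows.append({"sample_id": f"S{i:03d}", **dict(zip(names, combo))})
    return rows

def to_csv(rows):
    """Serialize the sheet to a spreadsheet-friendly CSV string."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

sheet = build_sample_sheet({"genotype": ["WT", "KO"],
                            "treatment": ["ctrl", "drug"],
                            "timepoint": ["0h", "24h"]})
```

Three two-level factors yield 2 x 2 x 2 = 8 samples; the same expansion scales to the large designs the paper targets.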

  4. Mathematical and Computational Modeling in Complex Biological Systems

    PubMed Central

    Li, Wenyang; Zhu, Xiaoliang

    2017-01-01

    The biological processes and molecular functions involved in cancer progression remain difficult to understand for biologists and clinicians. Recent developments in high-throughput technologies are pushing systems biology toward more precise models of complex diseases. Computational and mathematical models are increasingly being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the development of high-throughput technologies and the systemic modeling of biological processes in cancer research. In this review, we first survey several typical mathematical modeling approaches for biological systems at different scales and analyze their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling are summarized. In conclusion, this review provides an update on important solutions using computational modeling approaches in systems biology. PMID:28386558

  6. Mapping the miRNA interactome by crosslinking ligation and sequencing of hybrids (CLASH)

    PubMed Central

    Helwak, Aleksandra; Tollervey, David

    2014-01-01

    RNA-RNA interactions play critical roles in many cellular processes but studying them is difficult and laborious. Here, we describe an experimental procedure, termed crosslinking ligation and sequencing of hybrids (CLASH), which allows high-throughput identification of sites of RNA-RNA interaction. During CLASH, a tagged bait protein is UV crosslinked in vivo to stabilise RNA interactions and purified under denaturing conditions. RNAs associated with the bait protein are partially truncated, and the ends of RNA-duplexes are ligated together. Following linker addition, cDNA library preparation and high-throughput sequencing, the ligated duplexes give rise to chimeric cDNAs, which unambiguously identify RNA-RNA interaction sites independent of bioinformatic predictions. This protocol is optimized for studying miRNA targets bound by Argonaute proteins, but should be easily adapted for other RNA-binding proteins and classes of RNA. The protocol requires around 5 days to complete, excluding the time required for high-throughput sequencing and bioinformatic analyses. PMID:24577361

  7. High-throughput characterization for solar fuels materials discovery

    NASA Astrophysics Data System (ADS)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

    In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts and light absorbers for the PEC do not yet exist. The mission of HTE is to accelerate their discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). I will present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at the Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  8. An automated high throughput tribometer for adhesion, wear, and friction measurements

    NASA Astrophysics Data System (ADS)

    Kalihari, Vivek; Timpe, Shannon J.; McCarty, Lyle; Ninke, Matthew; Whitehead, Jim

    2013-03-01

    Understanding the origin and correlation of different surface properties under a multitude of operating conditions is critical in tribology. The diversity of tribological properties, and the lack of a single instrument to measure them all, makes it difficult to compare and correlate these properties, particularly in light of the wide range of interfaces commonly investigated. In the current work, a novel automated tribometer has been designed and validated, providing a unique experimental platform capable of high-throughput adhesion, wear, kinetic friction, and static friction measurements. The innovative design aspects that allow for a variety of probes, sample surfaces, and testing conditions are discussed. Critical components of the instrument and their design criteria are described, along with examples of data collection schemes. A case study is presented with multiple surface measurements performed on a set of characteristic substrates. Adhesion, wear, kinetic friction, and static friction are analyzed and compared across surfaces, highlighting the comprehensive nature of the surface data that can be generated using the automated high-throughput tribometer.

  9. Pre-amplification in the context of high-throughput qPCR gene expression experiment.

    PubMed

    Korenková, Vlasta; Scott, Justin; Novosadová, Vendula; Jindřichová, Marie; Langerová, Lucie; Švec, David; Šídová, Monika; Sjöback, Robert

    2015-03-11

    With the introduction of the first high-throughput qPCR instrument on the market, it became possible to perform thousands of reactions in a single run rather than hundreds. In a high-throughput reaction, only limited volumes of highly concentrated cDNA or DNA samples can be added. This constraint can be addressed by pre-amplification, which has become part of the high-throughput experimental workflow. Here, we focused our attention on the limits of the specific target pre-amplification reaction and propose an optimal, general setup for gene expression experiments using the BioMark instrument (Fluidigm). To evaluate different pre-amplification factors, the following conditions were combined: four human blood samples from healthy donors and five transcripts with high to low expression levels; each cDNA sample was pre-amplified for four cycle numbers (15, 18, 21, and 24) and at five concentrations (equivalent to 0.078 ng, 0.32 ng, 1.25 ng, 5 ng, and 20 ng of total RNA). Factors identified as critical for the success of cDNA pre-amplification were the number of pre-amplification cycles, the total RNA concentration, and the type of gene. The selected pre-amplification reactions were further tested for optimal Cq distribution in a BioMark Array. The following concentrations combined with pre-amplification cycles were optimal for good-quality samples: 20 ng of total RNA with 15 cycles of pre-amplification, diluted 20x and 40x; and 5 ng and 20 ng of total RNA with 18 cycles of pre-amplification, both diluted 20x and 40x. We set upper limits for the bulk gene expression experiment using the gene expression Dynamic Array and provide an easy-to-obtain measure of pre-amplification success. We also show that the variability introduced into the reverse transcription-qPCR workflow by pre-amplification is lower than the variability caused by the reverse transcription step.
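Under ideal amplification, the combined effect of pre-amplification cycles and subsequent dilution on Cq follows a simple log2 model: each cycle at efficiency E multiplies the template by (1+E), lowering Cq by log2(1+E), while a d-fold dilution raises Cq by log2(d). This is a back-of-envelope model for intuition, not taken from the paper:

```python
import math

def expected_cq_shift(preamp_cycles, dilution, efficiency=1.0):
    """Ideal net change in Cq relative to no pre-amplification.
    efficiency = 1.0 means perfect doubling each cycle."""
    gain = preamp_cycles * math.log2(1.0 + efficiency)  # Cq decrease
    loss = math.log2(dilution)                          # Cq increase
    return loss - gain

# One of the recommended setups: 15 perfect cycles then a 20x dilution
shift = expected_cq_shift(15, 20)
```

With 15 perfect cycles and a 20x dilution, the net shift is log2(20) - 15, roughly -10.7 cycles, which explains why diluted pre-amplified samples still land comfortably within the instrument's dynamic range.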

  10. Machine learning in computational biology to accelerate high-throughput protein expression.

    PubMed

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for the analysis of further expression and solubility datasets. Supplementary data are available at Bioinformatics online.
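Two of the key properties named above can be computed directly from a protein sequence: aromaticity (fraction of Phe/Trp/Tyr residues) and mean Kyte-Doolittle hydropathy (GRAVY). The sketch below uses the standard published Kyte-Doolittle scale; the paper's full feature set and trained models are not reproduced here:

```python
# Kyte-Doolittle hydropathy scale (standard published values)
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}
AROMATIC = set("FWY")

def sequence_features(seq):
    """Aromaticity (fraction of F/W/Y) and mean Kyte-Doolittle
    hydropathy (GRAVY) for a one-letter amino-acid sequence."""
    seq = seq.upper()
    n = len(seq)
    aromaticity = sum(aa in AROMATIC for aa in seq) / n
    gravy = sum(KD[aa] for aa in seq) / n
    return {"aromaticity": aromaticity, "gravy": gravy}

feats = sequence_features("MFWYAAAK")
```

Features like these, computed per fragment, form the input table on which a classifier for expression or solubility can then be trained.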

  11. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    PubMed

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
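Exporting processed glycan profile data (peak number, peak area, glucose units) to the structured formats mentioned above can be sketched with the standard library; the field names below are illustrative, not GlycoExtractor's actual schema:

```python
import json
import xml.etree.ElementTree as ET

def peaks_to_json(sample, peaks):
    """Serialize one glycan profile to a JSON document."""
    return json.dumps({"sample": sample, "peaks": peaks}, indent=2)

def peaks_to_xml(sample, peaks):
    """Serialize the same profile to XML, one <peak> element per peak."""
    root = ET.Element("profile", sample=sample)
    for p in peaks:
        ET.SubElement(root, "peak", {k: str(v) for k, v in p.items()})
    return ET.tostring(root, encoding="unicode")

peaks = [{"number": 1, "area": 12.4, "gu": 5.2},
         {"number": 2, "area": 3.1, "gu": 6.8}]
doc_json = peaks_to_json("IgG_01", peaks)
doc_xml = peaks_to_xml("IgG_01", peaks)
```

Emitting one connected document per sample set, rather than a collection of disconnected files, is exactly the workflow simplification the abstract describes.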

  12. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    PubMed

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool for studying plant growth, development, and responses to various types of biotic or abiotic stressors. Such facilities mainly apply non-invasive imaging-based methods, which enable continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated, high-throughput measurement of complex physiological parameters such as photosystem II (PSII) efficiency, determined through kinetic chlorophyll fluorescence analysis, remains a challenge. We present the technical installations and established experimental procedures that allow integrated high-throughput imaging of all commonly determined PSII parameters for small and large plants, using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementing protocols amenable to high throughput that record the PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants, despite the corresponding variation in distance between plants and light source, which caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data from a spot-measuring chlorophyll fluorometer. The established operating protocols enable the screening of up to 1080 small or 184 large plants per hour. The application of the implemented high-throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency.
    The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis substantially extends the feature spectrum assessed in the presented high-throughput automated plant phenotyping platforms, enabling the simultaneous assessment of plant architectural and biomass-related traits and their relation to physiological features such as PSII operating efficiency. The implemented high-throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m in height) and architectures. A deeper understanding of the relation between plant architecture, biomass formation, and photosynthetic efficiency has great potential for crop and yield improvement strategies.
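The two PSII parameters discussed above are simple ratios of measured fluorescence levels, so they can be computed directly from the imaging data. The standard (Genty-type) definitions are:

```python
def phi_psii(f_steady, fm_prime):
    """PSII operating efficiency from light-adapted fluorescence:
    Phi_PSII = (Fm' - F) / Fm', where F is steady-state fluorescence
    and Fm' the maximum under a saturating pulse in the light."""
    return (fm_prime - f_steady) / fm_prime

def fv_over_fm(fo, fm):
    """Maximum PSII efficiency from dark-adapted fluorescence:
    Fv/Fm = (Fm - Fo) / Fm."""
    return (fm - fo) / fm

# Illustrative fluorescence counts, not measured values from the paper
phi = phi_psii(f_steady=400.0, fm_prime=1000.0)
fvfm = fv_over_fm(fo=200.0, fm=1000.0)
```

In an imaging setup these ratios are evaluated per pixel and then averaged over the plant mask, which is what makes the measurement robust to plant size and position.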

  13. A device for high-throughput monitoring of degradation in soft tissue samples.

    PubMed

    Tzeranis, D S; Panagiotopoulos, I; Gkouma, S; Kanakaris, G; Georgiou, N; Vaindirlis, N; Vasileiou, G; Neidlin, M; Gkousioudi, A; Spitas, V; Macheras, G A; Alexopoulos, L G

    2018-06-06

    This work describes the design and validation of a novel device, the High-Throughput Degradation Monitoring Device (HDD), for monitoring the degradation of 24 soft tissue samples over incubation periods of several days inside a cell culture incubator. The device quantifies sample degradation by monitoring the deformation induced by a static gravity load. Initial instrument design and experimental protocol development focused on quantifying cartilage degeneration. Characterization of measurement errors, caused mainly by thermal transients and by translating the instrument sensor, demonstrated that the HDD can quantify sample degradation with <6 μm precision and <10 μm temperature-induced error. HDD capabilities were evaluated in a pilot study that monitored the degradation of fresh ex vivo human cartilage samples by collagenase solutions over three days. The HDD could robustly resolve the effects of collagenase concentrations as small as 0.5 mg/ml. Careful sample preparation resulted in measurements that did not suffer from donor-to-donor variation (coefficient of variation <70%). Due to its unique combination of sample throughput, measurement precision, temporal sampling and experimental versatility, the HDD provides a novel biomechanics-based experimental platform for quantifying the effects of proteins (cytokines, growth factors, enzymes, antibodies) or small molecules on the degradation of soft tissues or tissue engineering constructs. Thereby, the HDD can complement established tools and in vitro models in important applications, including drug screening and biomaterial development.

  14. From Genes to Protein Mechanics on a Chip

    PubMed Central

    Milles, Lukas F.; Verdorfer, Tobias; Pippig, Diana A.; Nash, Michael A.; Gaub, Hermann E.

    2014-01-01

    Single-molecule force spectroscopy enables mechanical testing of individual proteins; however, low experimental throughput limits the ability to screen constructs in parallel. We describe a microfluidic platform for on-chip protein expression and measurement of single-molecule mechanical properties. We constructed microarrays of proteins covalently attached to a chip surface, and found that a single cohesin-modified cantilever that bound to the terminal dockerin tag of each protein remained stable over thousands of pulling cycles. The ability to synthesize and mechanically probe protein libraries presents new opportunities for high-throughput mechanical phenotyping. PMID:25194847

  15. High Throughput 600 Watt Hall Effect Thruster for Space Exploration

    NASA Technical Reports Server (NTRS)

    Szabo, James; Pote, Bruce; Tedrake, Rachel; Paintal, Surjeet; Byrne, Lawrence; Hruby, Vlad; Kamhawi, Hani; Smith, Tim

    2016-01-01

    A nominal 600-Watt Hall Effect Thruster was developed to propel unmanned space vehicles. Both xenon- and iodine-compatible versions were demonstrated. With xenon, the peak measured thruster efficiency is 46-48% at 600 W, with specific impulse from 1400 s to 1700 s. Evolution of the thruster channel due to ion erosion was predicted through numerical models and calibrated against experimental measurements. The estimated xenon throughput is greater than 100 kg. The thruster is well sized for satellite station keeping and orbit maneuvering, either by itself or within a cluster.
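The reported figures can be cross-checked with the ideal relations for electric thrusters: efficiency eta = T^2 / (2 * mdot * P) combined with T = mdot * g0 * Isp gives T = 2 * eta * P / (g0 * Isp). This is a back-of-envelope sketch, not the paper's model:

```python
G0 = 9.80665  # standard gravity, m/s^2

def thrust_from_power(power_w, isp_s, efficiency):
    """Ideal thrust for a given jet power, specific impulse and
    thrust efficiency: T = 2 * eta * P / (g0 * Isp), in newtons."""
    return 2.0 * efficiency * power_w / (G0 * isp_s)

# Roughly the reported operating point: 600 W, Isp ~1500 s, eta ~0.47
t_newtons = thrust_from_power(600.0, 1500.0, 0.47)
```

This yields a few tens of millinewtons, consistent with the thrust class expected of a 600 W Hall thruster.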

  16. Zbrowse: An interactive GWAS results browser

    USDA-ARS?s Scientific Manuscript database

    The growing number of genotyped populations, the advent of high-throughput phenotyping techniques and the development of GWAS analysis software has rapidly accelerated the number of GWAS experimental results. Candidate gene discovery from these results files is often tedious, involving many manual s...

  17. Thermal Catalytic Syngas Cleanup for High-Efficiency Waste-to-Energy Converters

    DTIC Science & Technology

    2015-12-01

    characteristics for a full-scale WEC based on the collected experimental data. RESULTS AND DISCUSSION Task 1 – Tar-Cracking Reactor...prepared to show the effect of reaching the target throughput rate of 50 lb/hr on conversion efficiency. In scaling up the experimental results, the...Midmoisture Full Moisture Fuel Feed Rate, kg/hr 22.3 22.3 22.3 Results Using the Experimental Recuperator Effectiveness Fuel Energy In, kWth 160 136 121

  18. A bioinformatics roadmap for the human vaccines project.

    PubMed

    Scheuermann, Richard H; Sinkovits, Robert S; Schenkelberg, Theodore; Koff, Wayne C

    2017-06-01

    Biomedical research has become a data-intensive science in which high-throughput experimentation is producing comprehensive data about biological systems at an ever-increasing pace. The Human Vaccines Project is a new public-private partnership with the goal of accelerating the development of improved vaccines and immunotherapies for global infectious diseases and cancers by decoding the human immune system. To achieve its mission, the Project is developing a Bioinformatics Hub as an open-source, multidisciplinary effort whose overarching goal is to provide an enabling infrastructure supporting the data processing, analysis and knowledge-extraction procedures required to translate high-throughput, high-complexity human immunology research data into biomedical knowledge, and to determine the core principles driving specific and durable protective immune responses.

  19. High-throughput protein analysis integrating bioinformatics and experimental assays

    PubMed Central

    del Val, Coral; Mehrle, Alexander; Falkenhahn, Mechthild; Seiler, Markus; Glatting, Karl-Heinz; Poustka, Annemarie; Suhai, Sandor; Wiemann, Stefan

    2004-01-01

    The wealth of transcript information that has been made publicly available in recent years requires the development of high-throughput functional genomics and proteomics approaches for its analysis. Such approaches need suitable data integration procedures and a high level of automation in order to gain maximum benefit from the results generated. We have designed an automatic pipeline to analyse annotated open reading frames (ORFs) stemming from full-length cDNAs produced mainly by the German cDNA Consortium. The ORFs are cloned into expression vectors for use in large-scale assays such as the determination of subcellular protein localization or kinase reaction specificity. Additionally, all identified ORFs undergo exhaustive bioinformatic analysis such as similarity searches, protein domain architecture determination and prediction of physicochemical characteristics and secondary structure, using a wide variety of bioinformatic methods in combination with the most up-to-date public databases (e.g. PRINTS, BLOCKS, INTERPRO, PROSITE, SWISSPROT). Data from experimental results and from the bioinformatic analysis are integrated and stored in a relational database (MS SQL-Server), which makes it possible for researchers to find answers to biological questions easily, thereby speeding up the selection of targets for further analysis. The designed pipeline constitutes a new automatic approach to obtaining and administering relevant biological data from high-throughput investigations of cDNAs, in order to systematically identify and characterize novel genes and comprehensively describe the function of the encoded proteins. PMID:14762202

  20. Identification and Characterization of Prostate Cancer Associated Protein Biomarkers Using High-Throughput Mass Spectrometry

    DTIC Science & Technology

    2006-09-01

    Specific examples using serum samples from prostate cancer and hepatocellular carcinoma subjects are provided, along with suggested experimental strategies for integration of lectin based...application of different lectins to enrich for serum glycoforms found in sera from prostate cancer and hepatocellular carcinoma subjects. For the

  1. Using iterative cluster merging with improved gap statistics to perform online phenotype discovery in the context of high-throughput RNAi screens

    PubMed Central

    Yin, Zheng; Zhou, Xiaobo; Bakal, Chris; Li, Fuhai; Sun, Youxian; Perrimon, Norbert; Wong, Stephen TC

    2008-01-01

    Background The recent emergence of high-throughput automated image acquisition technologies has forever changed how cell biologists collect and analyze data. Historically, the interpretation of cellular phenotypes in different experimental conditions has depended upon the expert opinions of well-trained biologists. Such qualitative analysis is particularly effective in detecting subtle, but important, deviations in phenotypes. However, while the rapid and continuing development of automated microscope-based technologies now facilitates the acquisition of images of trillions of cells in thousands of diverse experimental conditions, such as in the context of RNA interference (RNAi) or small-molecule screens, the massive size of these datasets precludes human analysis. Thus, the development of automated methods that aim to identify novel and biologically relevant phenotypes online is one of the major challenges in high-throughput image-based screening. Ideally, phenotype discovery methods should be designed to utilize prior/existing information and tackle three challenging tasks: restoring pre-defined, biologically meaningful phenotypes; differentiating novel phenotypes from known ones; and distinguishing novel phenotypes from each other. Arbitrarily extracted information causes biased analysis, while combining the complete existing datasets with each new image is intractable in high-throughput screens. Results Here we present the design and implementation of a novel and robust online phenotype discovery method with broad applicability that can be used in diverse experimental contexts, especially high-throughput RNAi screens. This method features phenotype modelling and iterative cluster merging using improved gap statistics. A Gaussian mixture model (GMM) is employed to estimate the distribution of each existing phenotype, and is then used as the reference distribution in the gap statistics.
    The method is broadly applicable to many types of image-based datasets derived from a wide spectrum of experimental conditions and is suitable for adaptively processing new images that are continuously added to existing datasets. Validations were carried out on different datasets, including a published RNAi screen using Drosophila embryos [Additional files 1, 2], a dataset for cell-cycle phase identification using HeLa cells [Additional files 1, 3, 4] and a synthetic dataset using polygons; our method tackled the three aforementioned tasks effectively, with accuracies in the range of 85%-90%. When implemented in the context of a Drosophila genome-scale RNAi image-based screen of cultured cells aimed at identifying the contribution of individual genes to the regulation of cell shape, the method efficiently discovers meaningful new phenotypes and provides novel biological insight. We also propose a two-step procedure to modify a novelty detection method based on a one-class SVM so that it can be used for online phenotype discovery. Under different conditions, we compared the SVM-based method with our method using various datasets; our method consistently outperformed the SVM-based method in at least two of the three tasks, by 2% to 5%. These results demonstrate that our method can better identify novel phenotypes in image-based datasets from a wide range of conditions and organisms. Conclusion We demonstrate that our method can detect various novel phenotypes effectively in complex datasets. Experimental results also validate that our method performs consistently under different orders of image input, variations in starting conditions (including the number and composition of existing phenotypes), and datasets from different screens. In our findings, the proposed method is suitable for online phenotype discovery in diverse high-throughput image-based genetic and chemical screens. PMID:18534020
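The gap statistic underlying the cluster-merging step compares the within-cluster dispersion of the observed data to that of reference datasets with no structure. The sketch below is a simplified, NumPy-only version of Tibshirani's gap statistic with a uniform reference over the bounding box; the paper refines it with GMM-based reference distributions:

```python
import numpy as np

rng = np.random.default_rng(1)

def within_dispersion(X, labels):
    """W_k: total squared distance of points to their cluster centroid."""
    return sum(((X[labels == c] - X[labels == c].mean(axis=0)) ** 2).sum()
               for c in np.unique(labels))

def gap_statistic(X, labels, n_ref=20):
    """Gap = mean(log W_ref) - log W_obs. Reference datasets are drawn
    uniformly over the bounding box of X and given the same label split."""
    w_obs = within_dispersion(X, labels)
    lo, hi = X.min(axis=0), X.max(axis=0)
    w_refs = [within_dispersion(rng.uniform(lo, hi, size=X.shape), labels)
              for _ in range(n_ref)]
    return np.mean(np.log(w_refs)) - np.log(w_obs)

# Two well-separated blobs: splitting them into two clusters gives a
# clearly positive gap (real structure beats the uniform reference)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2))])
labels = np.array([0] * 30 + [1] * 30)
gap = gap_statistic(X, labels)
```

In the paper's iterative cluster-merging scheme, a candidate merge is evaluated by how it changes this statistic, which is what lets the method decide online whether a new image introduces a genuinely novel phenotype.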

  2. Predicting organ toxicity using in vitro bioactivity data and chemical structure

    EPA Science Inventory

    Animal testing alone cannot practically evaluate the health hazard posed by tens of thousands of environmental chemicals. Computational approaches together with high-throughput experimental data may provide more efficient means to predict chemical toxicity. Here, we use a superv...

  3. Automated Analysis of siRNA Screens of Virus Infected Cells Based on Immunofluorescence Microscopy

    NASA Astrophysics Data System (ADS)

    Matula, Petr; Kumar, Anil; Wörz, Ilka; Harder, Nathalie; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl

    We present an image analysis approach as part of a high-throughput microscopy screening system based on cell arrays for the identification of genes involved in Hepatitis C and Dengue virus replication. Our approach comprises: cell nucleus segmentation, quantification of virus replication level in cells, localization of regions with transfected cells, cell classification by infection status, and quality assessment of an experiment. The approach is fully automatic and has been successfully applied to a large number of cell array images from screening experiments. The experimental results show a good agreement with the expected behavior of positive as well as negative controls and encourage the application to screens from further high-throughput experiments.
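The first pipeline stage, cell nucleus segmentation, can be illustrated with a minimal NumPy-only sketch: Otsu thresholding followed by 4-connected component labeling on a synthetic image. The real system is considerably more sophisticated; this only shows the principle.

```python
import numpy as np

def otsu_threshold(img):
    # Exhaustive search for the threshold maximizing between-class variance.
    hist, edges = np.histogram(img, bins=256)
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    total = hist.sum()
    for i in range(1, 256):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0
        m1 = (hist[i:] * centers[i:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2 / total ** 2
        if var > best_var:
            best_var, best_t = var, centers[i]
    return best_t

def label_components(mask):
    # Stack-based flood fill with 4-connectivity; returns labels and count.
    labels = np.zeros(mask.shape, dtype=int)
    nxt = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        nxt += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                continue
            if not mask[r, c] or labels[r, c]:
                continue
            labels[r, c] = nxt
            stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, nxt

img = np.zeros((64, 64)) + 0.05   # dim background
img[5:15, 5:15] = 1.0             # two bright synthetic "nuclei"
img[30:42, 30:42] = 0.9
mask = img > otsu_threshold(img)
labels, n = label_components(mask)
print(n)  # → 2
```

The same label map then serves the downstream stages: per-nucleus intensities can be summed to quantify virus replication level, and per-cell features fed to the infection-status classifier.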

  4. Microplate-Based Method for High-Throughput Screening (HTS) of Chromatographic Conditions Studies for Recombinant Protein Purification.

    PubMed

    Carvalho, Rimenys J; Cruz, Thayana A

    2018-01-01

    High-throughput screening (HTS) systems have emerged as important tools for fast, low-cost evaluation of many conditions at once, since they require only small quantities of material and small sample volumes. These characteristics are extremely valuable for experiments with large numbers of variables, enabling the application of design of experiments (DoE) strategies or simpler experimental planning approaches. Once the capacity of HTS systems to mimic chromatographic purification steps was established, several studies were performed successfully, including scale-down purification. Here, we propose a method for studying different purification conditions that can be used for any recombinant protein, including complex and glycosylated proteins, using low-binding filter microplates.
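The DoE-style condition screening described above can be sketched as a full-factorial grid mapped onto plate wells. The factor names and levels below are purely illustrative, not the chromatographic conditions of this protocol.

```python
from itertools import product
from string import ascii_uppercase

# Hypothetical screening factors (names and levels are illustrative only).
factors = {
    "pH":      [5.0, 6.0, 7.0, 8.0],
    "NaCl_mM": [0, 150, 500],
    "resin":   ["Q", "SP"],
}

# Full-factorial design: every combination of every factor level.
combos = list(product(*factors.values()))

# Map each condition onto a 96-well plate (rows A-H, columns 1-12).
wells = [f"{ascii_uppercase[i // 12]}{i % 12 + 1}" for i in range(len(combos))]
plan = dict(zip(wells, combos))

print(len(combos))   # 4 * 3 * 2 = 24 conditions
print(wells[0], plan["A1"])
```

Twenty-four conditions fit comfortably in one 96-well filter plate, which is exactly the economy of scale that makes microplate-based chromatography screening attractive.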

  5. Techniques for Mapping Synthetic Aperture Radar Processing Algorithms to Multi-GPU Clusters

    DTIC Science & Technology

    2012-12-01

    Experimental results were generated with 10 nVidia Tesla C2050 GPUs having a maximum throughput of 972 Gflop/s. Our approach scales well for output...

  6. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-08-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. 
This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple bacterial isolates. This will be a powerful experimental tool facilitating the study of bacterial invasion, drug resistance, and the development of new therapeutics. Copyright © 2017 Elsevier B.V. All rights reserved.
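The quantities the method reports, total colony count, size distribution, and average number per host cell, reduce to simple bookkeeping once spots have been detected and assigned to cells. A toy sketch (the spot areas and cell assignments are invented numbers, not data from the paper):

```python
from collections import Counter
from statistics import median

# Hypothetical spot-detection output: (colony_area_px, host_cell_id).
spots = [(120, 0), (45, 0), (300, 2), (80, 2), (55, 2), (210, 5)]
n_cells = 10  # nuclei counted in the field of view

areas = [a for a, _ in spots]
per_cell = Counter(cell for _, cell in spots)  # colonies per infected cell

n_colonies = len(spots)
infected_fraction = len(per_cell) / n_cells
mean_per_host = n_colonies / n_cells

print(n_colonies, round(infected_fraction, 2), round(mean_per_host, 2))
print("area median:", median(areas))
```

With real data, `spots` would come from the semi-automated spot detection step and `n_cells` from the nuclear stain, but the summary statistics are computed the same way.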

  7. A sorting system with automated gates permits individual operant experiments with mice from a social home cage.

    PubMed

    Winter, York; Schaefers, Andrea T U

    2011-03-30

    Behavioral experiments based on operant procedures can be time-consuming for small amounts of data. While individual testing and handling of animals can influence attention, emotion, and behavior, and interfere with experimental outcomes, many operant protocols require individual testing. We developed an RFID- and transponder-based sorting system that removes the human factor from longer-term experiments. Identity detectors and automated gates route mice individually from their social home cage to an adjacent operant compartment with 24/7 operation. CD1 mice quickly learnt to pass through the sorting system individually. At no time did more than a single mouse enter the operant compartment. After 3 days of adjusting to the sorting system, groups of 4 mice completed about 50 experimental trials per day in the operant compartment without experimenter intervention. The automated sorting system eliminates handling, isolation, and disturbance of the animals, eliminates experimenter-induced variability, saves experimenter time, and is financially economical. It makes possible a new approach for high-throughput experimentation and is a viable tool for increasing the quality and efficiency of many behavioral and neurobiological investigations. It can connect a social home cage, through individual sorting automation, to diverse setups including classical operant chambers, mazes, or arenas with video-based behavior classification. Such highly automated systems will permit efficient high-throughput screening even for transgenic animals with only subtle neurological or psychiatric symptoms, where elaborate or longer-term protocols are required for behavioral diagnosis. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Broadband ion mobility deconvolution for rapid analysis of complex mixtures.

    PubMed

    Pettit, Michael E; Brantley, Matthew R; Donnarumma, Fabrizio; Murray, Kermit K; Solouki, Touradj

    2018-05-04

    High resolving power ion mobility (IM) allows for accurate characterization of complex mixtures in high-throughput IM mass spectrometry (IM-MS) experiments. We previously demonstrated that pure component IM-MS data can be extracted from IM unresolved post-IM/collision-induced dissociation (CID) MS data using automated ion mobility deconvolution (AIMD) software [Matthew Brantley, Behrooz Zekavat, Brett Harper, Rachel Mason, and Touradj Solouki, J. Am. Soc. Mass Spectrom., 2014, 25, 1810-1819]. In our previous reports, we utilized a quadrupole ion filter for m/z-isolation of IM unresolved monoisotopic species prior to post-IM/CID MS. Here, we utilize a broadband IM-MS deconvolution strategy to remove the m/z-isolation requirement for successful deconvolution of IM unresolved peaks. Broadband data collection has throughput and multiplexing advantages; hence, elimination of the ion isolation step reduces experimental run times and thus expands the applicability of AIMD to high-throughput bottom-up proteomics. We demonstrate broadband IM-MS deconvolution of two separate and unrelated pairs of IM unresolved isomers (viz., a pair of isomeric hexapeptides and a pair of isomeric trisaccharides) in a simulated complex mixture. Moreover, we show that broadband IM-MS deconvolution improves high-throughput bottom-up characterization of a proteolytic digest of rat brain tissue. To our knowledge, this manuscript is the first to report successful deconvolution of pure component IM and MS data from an IM-assisted data-independent analysis (DIA) or HDMSE dataset.

  9. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hui

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and has found enormous application in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas: heterogeneous catalyst screening and single-cell study. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts, to explore the use of this technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection.

  10. NMRbot: Python scripts enable high-throughput data collection on current Bruker BioSpin NMR spectrometers.

    PubMed

    Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L

    2013-06-01

    To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin ™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.
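NMRbot itself runs inside TopSpin, but the batching logic it automates, expanding a sample list and an experiment list into an ordered acquisition queue, can be sketched independently. The sample names, experiment names, and scan counts below are placeholders, and no real TopSpin API calls are shown.

```python
# Illustrative inputs; none of these names come from NMRbot itself.
samples = ["metabolite_A", "metabolite_B", "ligand_titration_pt1"]
experiments = [
    {"name": "1H",  "ns": 16},    # scan counts are placeholders
    {"name": "13C", "ns": 256},
]

queue = []
for pos, sample in enumerate(samples, start=1):
    for exp in experiments:
        # Each entry records what an automation layer would execute
        # for one sample/experiment pair.
        queue.append({
            "holder": pos,
            "sample": sample,
            "experiment": exp["name"],
            "scans": exp["ns"],
        })

print(len(queue))              # 3 samples x 2 experiments = 6 acquisitions
print(queue[0]["experiment"])
```

In the real system each queue entry would trigger sample changing, parameter setup, and acquisition inside TopSpin; the value of scripting is that the whole cross-product runs unattended with consistent parameters.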

  11. High-throughput Screening of Recalcitrance Variations in Lignocellulosic Biomass: Total Lignin, Lignin Monomers, and Enzymatic Sugar Release

    PubMed Central

    Decker, Stephen R.; Sykes, Robert W.; Turner, Geoffrey B.; Lupoi, Jason S.; Doepkke, Crissa; Tucker, Melvin P.; Schuster, Logan A.; Mazza, Kimberly; Himmel, Michael E.; Davis, Mark F.; Gjersing, Erica

    2015-01-01

    The conversion of lignocellulosic biomass to fuels, chemicals, and other commodities has been explored as one possible pathway toward reductions in the use of non-renewable energy sources. In order to identify which plants, out of a diverse pool, have the desired chemical traits for downstream applications, attributes, such as cellulose and lignin content, or monomeric sugar release following an enzymatic saccharification, must be compared. The experimental and data analysis protocols of the standard methods of analysis can be time-consuming, thereby limiting the number of samples that can be measured. High-throughput (HTP) methods alleviate the shortcomings of the standard methods, and permit the rapid screening of available samples to isolate those possessing the desired traits. This study illustrates the HTP sugar release and pyrolysis-molecular beam mass spectrometry pipelines employed at the National Renewable Energy Lab. These pipelines have enabled the efficient assessment of thousands of plants while decreasing experimental time and costs through reductions in labor and consumables. PMID:26437006

  12. On-the-fly machine-learning for high-throughput experiments: search for rare-earth-free permanent magnets

    PubMed Central

    Kusne, Aaron Gilad; Gao, Tieren; Mehta, Apurva; Ke, Liqin; Nguyen, Manh Cuong; Ho, Kai-Ming; Antropov, Vladimir; Wang, Cai-Zhuang; Kramer, Matthew J.; Long, Christian; Takeuchi, Ichiro

    2014-01-01

    Advanced materials characterization techniques with ever-growing data acquisition speed and storage capabilities represent a challenge in modern materials science, and new procedures to quickly assess and analyze the data are needed. Machine learning approaches are effective in reducing the complexity of data and rapidly homing in on the underlying trend in multi-dimensional data. Here, we show that by applying an algorithm called mean shift to a large amount of diffraction data in high-throughput experimentation, one can streamline the process of delineating the structural evolution across compositional variations mapped on combinatorial libraries with minimal computational cost. Data collected at a synchrotron beamline are analyzed on the fly, and by integrating experimental data with the inorganic crystal structure database (ICSD), we can substantially enhance the accuracy in classifying the structural phases across ternary phase spaces. We have used this approach to identify a novel magnetic phase with enhanced magnetic anisotropy that is a candidate for a rare-earth-free permanent magnet. PMID:25220062
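The mean-shift step used to group diffraction patterns can be shown in miniature: each point iteratively moves to the mean of its neighbors within a bandwidth until it settles on a density mode, and points sharing a mode form a cluster. This NumPy-only sketch uses 1-D toy data with a flat kernel; the actual work clusters high-dimensional diffraction data.

```python
import numpy as np

def mean_shift_modes(x, bandwidth=1.0, iters=100, tol=1e-5):
    # Flat-kernel mean shift: every point climbs to its local density mode.
    modes = x.astype(float).copy()
    for _ in range(iters):
        shifted = np.array([x[np.abs(x - m) <= bandwidth].mean()
                            for m in modes])
        done = np.max(np.abs(shifted - modes)) < tol
        modes = shifted
        if done:
            break
    # Merge modes that converged to (numerically) the same location.
    uniq = []
    for m in np.sort(modes):
        if not uniq or m - uniq[-1] > bandwidth / 2:
            uniq.append(m)
    return np.array(uniq)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 0.2, 60), rng.normal(5.0, 0.2, 40)])
modes = mean_shift_modes(x, bandwidth=1.0)
print(len(modes))  # two groups of "diffraction patterns" → 2 modes
```

A key property for on-the-fly analysis is that mean shift does not require the number of clusters in advance, which suits combinatorial libraries where the number of structural phases is unknown.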

  13. Gene Ontology annotations at SGD: new data sources and annotation methods

    PubMed Central

    Hong, Eurie L.; Balakrishnan, Rama; Dong, Qing; Christie, Karen R.; Park, Julie; Binkley, Gail; Costanzo, Maria C.; Dwight, Selina S.; Engel, Stacia R.; Fisk, Dianna G.; Hirschman, Jodi E.; Hitz, Benjamin C.; Krieger, Cynthia J.; Livstone, Michael S.; Miyasato, Stuart R.; Nash, Robert S.; Oughtred, Rose; Skrzypek, Marek S.; Weng, Shuai; Wong, Edith D.; Zhu, Kathy K.; Dolinski, Kara; Botstein, David; Cherry, J. Michael

    2008-01-01

    The Saccharomyces Genome Database (SGD; http://www.yeastgenome.org/) collects and organizes biological information about the chromosomal features and gene products of the budding yeast Saccharomyces cerevisiae. Although published data from traditional experimental methods are the primary sources of evidence supporting Gene Ontology (GO) annotations for a gene product, high-throughput experiments and computational predictions can also provide valuable insights in the absence of an extensive body of literature. Therefore, GO annotations available at SGD now include high-throughput data as well as computational predictions provided by the GO Annotation Project (GOA UniProt; http://www.ebi.ac.uk/GOA/). Because the annotation method used to assign GO annotations varies by data source, GO resources at SGD have been modified to distinguish data sources and annotation methods. In addition to providing information for genes that have not been experimentally characterized, GO annotations from independent sources can be compared to those made by SGD to help keep the literature-based GO annotations current. PMID:17982175

  14. High-Throughput Screening of Recalcitrance Variations in Lignocellulosic Biomass: Total Lignin, Lignin Monomers, and Enzymatic Sugar Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Decker, Stephen R.; Sykes, Robert W.; Turner, Geoffrey B.

    The conversion of lignocellulosic biomass to fuels, chemicals, and other commodities has been explored as one possible pathway toward reductions in the use of non-renewable energy sources. In order to identify which plants, out of a diverse pool, have the desired chemical traits for downstream applications, attributes, such as cellulose and lignin content, or monomeric sugar release following an enzymatic saccharification, must be compared. The experimental and data analysis protocols of the standard methods of analysis can be time-consuming, thereby limiting the number of samples that can be measured. High-throughput (HTP) methods alleviate the shortcomings of the standard methods, and permit the rapid screening of available samples to isolate those possessing the desired traits. This study illustrates the HTP sugar release and pyrolysis-molecular beam mass spectrometry pipelines employed at the National Renewable Energy Lab. These pipelines have enabled the efficient assessment of thousands of plants while decreasing experimental time and costs through reductions in labor and consumables.

  15. BlackOPs: increasing confidence in variant detection through mappability filtering.

    PubMed

    Cabanski, Christopher R; Wilkerson, Matthew D; Soloway, Matthew; Parker, Joel S; Liu, Jinze; Prins, Jan F; Marron, J S; Perou, Charles M; Hayes, D Neil

    2013-10-01

    Identifying variants using high-throughput sequencing data is currently a challenge because true biological variants can be indistinguishable from technical artifacts. One source of technical artifact results from incorrectly aligning experimentally observed sequences to their true genomic origin ('mismapping') and inferring differences in mismapped sequences to be true variants. We developed BlackOPs, an open-source tool that simulates experimental RNA-seq and DNA whole exome sequences derived from the reference genome, aligns these sequences by custom parameters, detects variants and outputs a blacklist of positions and alleles caused by mismapping. Blacklists contain thousands of artifact variants that are indistinguishable from true variants and, for a given sample, are expected to be almost completely false positives. We show that these blacklist positions are specific to the alignment algorithm and read length used, and BlackOPs allows users to generate a blacklist specific to their experimental setup. We queried the dbSNP and COSMIC variant databases and found numerous variants indistinguishable from mapping errors. We demonstrate how filtering against blacklist positions reduces the number of potential false variants using an RNA-seq glioblastoma cell line data set. In summary, accounting for mapping-caused variants tuned to experimental setups reduces false positives and, therefore, improves genome characterization by high-throughput sequencing.
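The filtering step BlackOPs enables is straightforward once the blacklist exists: drop any called variant whose (chromosome, position, allele) tuple appears in the simulated-artifact list. A schematic sketch with made-up coordinates (not real blacklist entries):

```python
# Hypothetical blacklist from simulated, realigned reference reads:
# positions/alleles where mapping errors masquerade as variants.
blacklist = {
    ("chr1", 10177, "C"),
    ("chr2", 554322, "T"),
}

# Variant calls from a real sample (coordinates are invented).
calls = [
    ("chr1", 10177, "C"),       # artifact: present in blacklist
    ("chr1", 887560, "G"),
    ("chr2", 554322, "T"),      # artifact
    ("chr7", 140453136, "A"),
]

# Keep only calls absent from the blacklist.
filtered = [v for v in calls if v not in blacklist]
print(len(filtered))  # 4 calls - 2 blacklisted = 2 retained
```

Because the blacklist is specific to the aligner, parameters, and read length used to build it, the same filter must be regenerated whenever the alignment pipeline changes, which is the point of making the tool configurable.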

  16. Nanowire-nanopore transistor sensor for DNA detection during translocation

    NASA Astrophysics Data System (ADS)

    Xie, Ping; Xiong, Qihua; Fang, Ying; Qing, Quan; Lieber, Charles

    2011-03-01

    Nanopore sequencing, a promising low-cost, high-throughput sequencing technique, was proposed more than a decade ago. Because of the incompatibility between the small ionic current signal and the fast translocation speed, and the technical difficulty of large-scale integration of nanopores for direct ionic-current sequencing, alternative methods relying on integrated DNA sensors have been proposed, such as capacitive coupling or tunnelling current, but none of them had been experimentally demonstrated. Here we show, for the first time, an amplified sensor signal experimentally recorded from a nanowire-nanopore field-effect transistor sensor during DNA translocation. Independent multi-channel recording was also demonstrated for the first time. Our results suggest that the signal arises from a highly localized potential change caused by DNA translocation under non-balanced buffer conditions. Because this method may produce larger signals for smaller nanopores, we hope our experiment can be a starting point for a new generation of nanopore sequencing devices with larger signals, higher bandwidth, and large-scale multiplexing capability, finally realizing the ultimate goal of low-cost, high-throughput sequencing.

  17. High-throughput annotation of full-length long noncoding RNAs with capture long-read sequencing.

    PubMed

    Lagarde, Julien; Uszczynska-Ratajczak, Barbara; Carbonell, Silvia; Pérez-Lluch, Sílvia; Abad, Amaya; Davis, Carrie; Gingeras, Thomas R; Frankish, Adam; Harrow, Jennifer; Guigo, Roderic; Johnson, Rory

    2017-12-01

    Accurate annotation of genes and their transcripts is a foundation of genomics, but currently no annotation technique combines throughput and accuracy. As a result, reference gene collections remain incomplete: many gene models are fragmentary, and thousands more remain uncataloged, particularly for long noncoding RNAs (lncRNAs). To accelerate lncRNA annotation, the GENCODE consortium has developed RNA Capture Long Seq (CLS), which combines targeted RNA capture with third-generation long-read sequencing. Here we present an experimental reannotation of the GENCODE intergenic lncRNA populations in matched human and mouse tissues that resulted in novel transcript models for 3,574 and 561 gene loci, respectively. CLS approximately doubled the annotated complexity of targeted loci, outperforming existing short-read techniques. Full-length transcript models produced by CLS enabled us to definitively characterize the genomic features of lncRNAs, including promoter and gene structure, and protein-coding potential. Thus, CLS removes a long-standing bottleneck in transcriptome annotation and generates manual-quality full-length transcript models at high-throughput scales.

  18. High-throughput and reliable protocols for animal microRNA library cloning.

    PubMed

    Xiao, Caide

    2011-01-01

    MicroRNAs are short single-stranded RNA molecules (18-25 nucleotides). Because of their ability to silence gene expression, they can be used to diagnose and treat tumors. Experimental construction of microRNA libraries is the most important step in identifying microRNAs from animal tissues. Although there are many commercial kits with special protocols for constructing microRNA libraries, this chapter provides the most reliable, high-throughput, and affordable protocols for microRNA library construction. The high-throughput capability of our protocols comes from a double-concentration (3 and 15%, thickness 1.5 mm) polyacrylamide gel electrophoresis (PAGE), which can directly extract microRNA-sized RNAs from up to 400 μg total RNA (enough for two microRNA libraries). The reliability of our protocols is assured by a third PAGE, which selects PCR products of microRNA-sized RNAs ligated with 5' and 3' linkers by a miRCat™ kit. Also, a MathCAD program is provided to automatically search for short RNAs inserted between the 5' and 3' linkers across thousands of sequencing text files.
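The final search step, pulling the short RNA inserted between the 5' and 3' linkers out of each sequencing read, amounts to locating both linker sequences and keeping what lies between them. A minimal sketch; the linker sequences below are invented placeholders, not the miRCat kit's actual linkers.

```python
FIVE_PRIME = "GTCAG"   # placeholder 5' linker sequence
THREE_PRIME = "CTGAC"  # placeholder 3' linker sequence

def extract_insert(read, lo=18, hi=25):
    # Return the microRNA-sized insert between the linkers, or None.
    i = read.find(FIVE_PRIME)
    if i < 0:
        return None
    j = read.find(THREE_PRIME, i + len(FIVE_PRIME))
    if j < 0:
        return None
    insert = read[i + len(FIVE_PRIME):j]
    return insert if lo <= len(insert) <= hi else None

mirna = "TGAGGTAGTAGGTTGTATAGTT"  # let-7 sequence, 22 nt
read = "AAC" + FIVE_PRIME + mirna + THREE_PRIME + "GGA"
print(extract_insert(read) == mirna)  # → True
```

Applied over thousands of sequencing text files, the same loop yields the candidate microRNA set, with the 18-25 nt length filter rejecting degradation products and concatemers.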

  19. High-throughput imaging of adult fluorescent zebrafish with an LED fluorescence macroscope

    PubMed Central

    Blackburn, Jessica S; Liu, Sali; Raimondi, Aubrey R; Ignatius, Myron S; Salthouse, Christopher D; Langenau, David M

    2011-01-01

    Zebrafish are a useful vertebrate model for the study of development, behavior, disease and cancer. A major advantage of zebrafish is that large numbers of animals can be economically used for experimentation; however, high-throughput methods for imaging live adult zebrafish had not been developed. Here, we describe protocols for building a light-emitting diode (LED) fluorescence macroscope and for using it to simultaneously image up to 30 adult animals that transgenically express a fluorescent protein, are transplanted with fluorescently labeled tumor cells or are tagged with fluorescent elastomers. These protocols show that the LED fluorescence macroscope is capable of distinguishing five fluorescent proteins and can image unanesthetized swimming adult zebrafish in multiple fluorescent channels simultaneously. The macroscope can be built and used for imaging within 1 day, whereas creating fluorescently labeled adult zebrafish requires 1 hour to several months, depending on the method chosen. The LED fluorescence macroscope provides a low-cost, high-throughput method to rapidly screen adult fluorescent zebrafish and it will be useful for imaging transgenic animals, screening for tumor engraftment, and tagging individual fish for long-term analysis. PMID:21293462

  20. Experimental and Study Design Considerations for Uncovering Oncometabolites.

    PubMed

    Haznadar, Majda; Mathé, Ewy A

    2017-01-01

    Metabolomics as a field has gained attention due to its potential for biomarker discovery, notably because it directly reflects disease phenotype and captures the downstream effects of posttranslational modifications. The field provides a "top-down," integrated view of biochemistry in complex organisms, as opposed to the traditional "bottom-up" approach that aims to analyze networks of interactions between genes, proteins, and metabolites. It also allows for the detection of thousands of endogenous metabolites in various clinical biospecimens in a high-throughput manner, including tissue and biofluids such as blood and urine. Of note, because biological fluid samples can be collected relatively easily, the time-dependent fluctuations of metabolites can be readily studied in detail. In this chapter, we aim to provide an overview of (1) analytical methods that are currently employed in the field, and (2) study design concepts that should be considered prior to conducting high-throughput metabolomics studies. While widely applicable, the concepts presented here are particularly relevant to high-throughput untargeted studies that aim to find metabolite biomarkers associated with a particular human disease.

  1. Model-driven analysis of experimentally determined growth phenotypes for 465 yeast gene deletion mutants under 16 different conditions

    PubMed Central

    Snitkin, Evan S; Dudley, Aimée M; Janse, Daniel M; Wong, Kaisheen; Church, George M; Segrè, Daniel

    2008-01-01

    Background Understanding the response of complex biochemical networks to genetic perturbations and environmental variability is a fundamental challenge in biology. Integration of high-throughput experimental assays and genome-scale computational methods is likely to produce insight otherwise unreachable, but specific examples of such integration have only begun to be explored. Results In this study, we measured growth phenotypes of 465 Saccharomyces cerevisiae gene deletion mutants under 16 metabolically relevant conditions and integrated them with the corresponding flux balance model predictions. We first used discordance between experimental results and model predictions to guide a stage of experimental refinement, which resulted in a significant improvement in the quality of the experimental data. Next, we used discordance still present in the refined experimental data to assess the reliability of yeast metabolism models under different conditions. In addition to estimating predictive capacity based on growth phenotypes, we sought to explain these discordances by examining predicted flux distributions visualized through a new, freely available platform. This analysis led to insight into the glycerol utilization pathway and the potential effects of metabolic shortcuts on model results. Finally, we used model predictions and experimental data to discriminate between alternative raffinose catabolism routes. Conclusions Our study demonstrates how a new level of integration between high throughput measurements and flux balance model predictions can improve understanding of both experimental and computational results. The added value of a joint analysis is a more reliable platform for specific testing of biological hypotheses, such as the catabolic routes of different carbon sources. PMID:18808699

  2. High-rate dead-time corrections in a general purpose digital pulse processing system

    PubMed Central

    Abbene, Leonardo; Gerardi, Gaetano

    2015-01-01

    Dead-time losses are well-recognized and well-studied drawbacks of counting and spectroscopic systems. In this work, the dead-time correction capabilities of a real-time digital pulse processing (DPP) system for high-rate, high-resolution radiation measurements are presented. The DPP system, through a fast and a slow analysis of the output waveform from radiation detectors, is able to perform multi-parameter analysis (arrival time, pulse width, pulse height, pulse shape, etc.) at high input counting rates (ICRs), allowing accurate counting-loss corrections even for variable or transient radiation. The fast analysis is used to obtain both the ICR and energy spectra with high throughput, while the slow analysis is used to obtain high-resolution energy spectra. A complete characterization of the counting capabilities, through both theoretical and experimental approaches, was performed. The dead-time modeling, the throughput curves, the experimental time-interval distributions (TIDs), and the counting uncertainty of the recorded events of both the fast and the slow channels, measured with a planar CdTe (cadmium telluride) detector, will be presented. The throughput formula for a series of two types of dead-times is also derived. The results of dead-time corrections, performed through different methods, will be reported and discussed, pointing out the error in ICR estimation and the simplicity of the procedure. Accurate ICR estimations (nonlinearity < 0.5%) were performed by using the time widths and the TIDs (using a 10 ns time bin width) of the detected pulses up to 2.2 Mcps. The digital system allows, after a simple parameter setting, different and sophisticated procedures for dead-time correction, traditionally implemented in complex/dedicated systems and time-consuming set-ups. PMID:26289270
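The dead-time models this kind of characterization rests on have standard closed forms: for true input counting rate n and dead-time τ, a non-paralyzable channel records m = n/(1 + nτ) while a paralyzable one records m = n·exp(−nτ). The non-paralyzable form inverts exactly, which underlies simple ICR corrections. A quick numeric check (the 50 ns dead-time is an assumed value for illustration, not a parameter from the paper):

```python
import math

def nonparalyzable(n, tau):
    # Recorded rate for a non-paralyzable dead-time tau (seconds).
    return n / (1 + n * tau)

def paralyzable(n, tau):
    # Recorded rate for a paralyzable (extending) dead-time.
    return n * math.exp(-n * tau)

def correct_nonparalyzable(m, tau):
    # Exact inversion: recover the input rate from the recorded rate.
    return m / (1 - m * tau)

n_true = 2.2e6   # 2.2 Mcps input counting rate (as in the paper)
tau = 50e-9      # assumed 50 ns dead-time, for illustration only

m = nonparalyzable(n_true, tau)
n_est = correct_nonparalyzable(m, tau)
print(abs(n_est - n_true) / n_true < 1e-12)  # exact round-trip → True
print(paralyzable(n_true, tau) < m)          # paralyzable loses more counts
```

At 2.2 Mcps even a 50 ns dead-time loses about 10% of events in the non-paralyzable case, which is why accurate rate correction matters for the quantitative measurements described.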

  3. A reproducible approach to high-throughput biological data acquisition and integration

    PubMed Central

    Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A.; Miropolsky, Larissa; Sweeney, Christopher

    2015-01-01

    Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa. PMID:26157642

  4. Capturing anharmonicity in a lattice thermal conductivity model for high-throughput predictions

    DOE PAGES

    Miller, Samuel A.; Gorai, Prashun; Ortiz, Brenden R.; ...

    2017-01-06

High-throughput, low-cost, and accurate predictions of thermal properties of new materials would be beneficial in fields ranging from thermal barrier coatings and thermoelectrics to integrated circuits. To date, computational efforts for predicting lattice thermal conductivity (κL) have been hampered by the complexity associated with computing multiple phonon interactions. In this work, we develop and validate a semiempirical model for κL by fitting density functional theory calculations to experimental data. Experimental values for κL come from new measurements on SrIn2O4, Ba2SnO4, Cu2ZnSiTe4, MoTe2, Ba3In2O6, Cu3TaTe4, SnO, and InI, as well as 55 compounds from across the published literature. Here, to capture the anharmonicity in phonon interactions, we incorporate a structural parameter that allows the model to predict κL within a factor of 1.5 of the experimental value across 4 orders of magnitude in κL values and over a diverse chemical and structural phase space, with accuracy similar to or better than that of computationally more expensive models.

  5. MULTI-SENSOR REPORTER CELL TECHNOLOGY TO ASSESS HAZARD INVOLVING ENDOCRINE SIGNALING PATHWAYS

    EPA Science Inventory

    Results will define an experimental approach that can be used in a high-throughput format to evaluate the response of hormone signaling pathways and networks to individual chemicals or mixtures. The assay also will have application across species and would significantly reduce...

  6. A Perspective on the Future of High-Throughput RNAi Screening: Will CRISPR Cut Out the Competition or Can RNAi Help Guide the Way?

    PubMed

    Taylor, Jessica; Woodcock, Simon

    2015-09-01

For more than a decade, RNA interference (RNAi) has brought about an entirely new approach to functional genomics screening. Enabling high-throughput loss-of-function (LOF) screens against the human genome, identifying new drug targets, and significantly advancing experimental biology, RNAi is a fast, flexible technology that is compatible with existing high-throughput systems and processes; however, the recent advent of clustered regularly interspaced short palindromic repeats (CRISPR)-Cas, a powerful new precise genome-editing (PGE) technology, has opened up vast possibilities for functional genomics. CRISPR-Cas is novel in its simplicity: one piece of easily engineered guide RNA (gRNA) is used to target a gene sequence, and Cas9 expression is required in the cells. The targeted double-strand break introduced by the gRNA-Cas9 complex is highly effective at removing gene expression compared to RNAi. Together with the reduced cost and complexity of CRISPR-Cas, there is the realistic opportunity to use PGE to screen for phenotypic effects in a total gene knockout background. This review summarizes the exciting development of CRISPR-Cas as a high-throughput screening tool, comparing its future potential to that of well-established RNAi screening techniques, and highlighting future challenges and opportunities within these disciplines. We conclude that the two technologies actually complement rather than compete with each other, enabling greater understanding of the genome in relation to drug discovery. © 2015 Society for Laboratory Automation and Screening.

  7. Detecting and removing multiplicative spatial bias in high-throughput screening technologies.

    PubMed

    Caraus, Iurie; Mazoure, Bogdan; Nadon, Robert; Makarenkov, Vladimir

    2017-10-15

    Considerable attention has been paid recently to improve data quality in high-throughput screening (HTS) and high-content screening (HCS) technologies widely used in drug development and chemical toxicity research. However, several environmentally- and procedurally-induced spatial biases in experimental HTS and HCS screens decrease measurement accuracy, leading to increased numbers of false positives and false negatives in hit selection. Although effective bias correction methods and software have been developed over the past decades, almost all of these tools have been designed to reduce the effect of additive bias only. Here, we address the case of multiplicative spatial bias. We introduce three new statistical methods meant to reduce multiplicative spatial bias in screening technologies. We assess the performance of the methods with synthetic and real data affected by multiplicative spatial bias, including comparisons with current bias correction methods. We also describe a wider data correction protocol that integrates methods for removing both assay and plate-specific spatial biases, which can be either additive or multiplicative. The methods for removing multiplicative spatial bias and the data correction protocol are effective in detecting and cleaning experimental data generated by screening technologies. As our protocol is of a general nature, it can be used by researchers analyzing current or next-generation high-throughput screens. The AssayCorrector program, implemented in R, is available on CRAN. makarenkov.vladimir@uqam.ca. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
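As a rough illustration of what removing multiplicative spatial bias means, the sketch below rescales each well by row and column factors estimated from medians; this is a toy multiplicative analog of median polish, not the method implemented in AssayCorrector:

```python
from statistics import median

def correct_multiplicative_bias(plate):
    """Divide each well by estimated row and column scale factors.
    plate: list of rows of raw measurements. Assumes bias acts
    multiplicatively as value = signal * row_factor * col_factor."""
    flat = [v for row in plate for v in row]
    plate_med = median(flat)
    row_f = [median(row) / plate_med for row in plate]
    col_f = [median(col) / plate_med for col in zip(*plate)]
    return [[plate[i][j] / (row_f[i] * col_f[j])
             for j in range(len(plate[0]))] for i in range(len(plate))]
```

On a plate whose wells all share one true signal distorted by row/column gains, this recovers a flat surface; real screens also need the assay-specific steps described in the paper's wider protocol.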

  8. High-Throughput Non-Contact Vitrification of Cell-Laden Droplets Based on Cell Printing

    NASA Astrophysics Data System (ADS)

    Shi, Meng; Ling, Kai; Yong, Kar Wey; Li, Yuhui; Feng, Shangsheng; Zhang, Xiaohui; Pingguan-Murphy, Belinda; Lu, Tian Jian; Xu, Feng

    2015-12-01

Cryopreservation is the most promising way for long-term storage of biological samples, e.g., single cells and cellular structures. Among various cryopreservation methods, vitrification is advantageous in employing a high cooling rate to avoid the formation of harmful ice crystals in cells. Most existing vitrification methods rely on direct contact between cells and liquid nitrogen to obtain high cooling rates, which, however, risks contamination and complicates cell collection. To address these limitations, we developed a non-contact vitrification device based on an ultra-thin freezing film to achieve a high cooling/warming rate while avoiding direct contact between cells and liquid nitrogen. A high-throughput cell printer was employed to rapidly generate uniform cell-laden microdroplets in the device, where the microdroplets were hung on one side of the film and then vitrified by pouring liquid nitrogen onto the other side via boiling heat transfer. Through theoretical and experimental studies of the vitrification process, we demonstrated that our device offers a high cooling/warming rate for vitrification of NIH 3T3 cells and human adipose-derived stem cells (hASCs) with maintained cell viability and differentiation potential. This non-contact vitrification device provides a novel and effective way to cryopreserve cells at high throughput while avoiding the contamination and collection problems.

  9. High-Throughput Non-Contact Vitrification of Cell-Laden Droplets Based on Cell Printing

    PubMed Central

    Shi, Meng; Ling, Kai; Yong, Kar Wey; Li, Yuhui; Feng, Shangsheng; Zhang, Xiaohui; Pingguan-Murphy, Belinda; Lu, Tian Jian; Xu, Feng

    2015-01-01

Cryopreservation is the most promising way for long-term storage of biological samples, e.g., single cells and cellular structures. Among various cryopreservation methods, vitrification is advantageous in employing a high cooling rate to avoid the formation of harmful ice crystals in cells. Most existing vitrification methods rely on direct contact between cells and liquid nitrogen to obtain high cooling rates, which, however, risks contamination and complicates cell collection. To address these limitations, we developed a non-contact vitrification device based on an ultra-thin freezing film to achieve a high cooling/warming rate while avoiding direct contact between cells and liquid nitrogen. A high-throughput cell printer was employed to rapidly generate uniform cell-laden microdroplets in the device, where the microdroplets were hung on one side of the film and then vitrified by pouring liquid nitrogen onto the other side via boiling heat transfer. Through theoretical and experimental studies of the vitrification process, we demonstrated that our device offers a high cooling/warming rate for vitrification of NIH 3T3 cells and human adipose-derived stem cells (hASCs) with maintained cell viability and differentiation potential. This non-contact vitrification device provides a novel and effective way to cryopreserve cells at high throughput while avoiding the contamination and collection problems. PMID:26655688

  10. Reconstruction of metabolic networks from high-throughput metabolite profiling data: in silico analysis of red blood cell metabolism.

    PubMed

    Nemenman, Ilya; Escola, G Sean; Hlavacek, William S; Unkefer, Pat J; Unkefer, Clifford J; Wall, Michael E

    2007-12-01

    We investigate the ability of algorithms developed for reverse engineering of transcriptional regulatory networks to reconstruct metabolic networks from high-throughput metabolite profiling data. For benchmarking purposes, we generate synthetic metabolic profiles based on a well-established model for red blood cell metabolism. A variety of data sets are generated, accounting for different properties of real metabolic networks, such as experimental noise, metabolite correlations, and temporal dynamics. These data sets are made available online. We use ARACNE, a mainstream algorithm for reverse engineering of transcriptional regulatory networks from gene expression data, to predict metabolic interactions from these data sets. We find that the performance of ARACNE on metabolic data is comparable to that on gene expression data.
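ARACNE's core idea, pairwise mutual information followed by data-processing-inequality pruning of the weakest edge in each triangle, can be sketched as follows (a minimal illustration for discretized profiles, not the actual ARACNE implementation):

```python
from collections import Counter
from itertools import combinations
from math import log2

def mutual_information(x, y):
    """MI between two discretized profiles (equal-length lists of symbols)."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def aracne_prune(profiles, eps=0.0):
    """Build the pairwise-MI network, then apply the data processing
    inequality: in every triangle, drop the strictly weakest edge."""
    names = list(profiles)
    mi = {frozenset((a, b)): mutual_information(profiles[a], profiles[b])
          for a, b in combinations(names, 2)}
    edges = set(mi)
    for a, b, c in combinations(names, 3):
        tri = {frozenset((a, b)), frozenset((b, c)), frozenset((a, c))}
        weakest = min(tri, key=lambda e: mi[e])
        if mi[weakest] < min(mi[e] for e in tri - {weakest}) - eps:
            edges.discard(weakest)
    return edges
```

For a noisy chain X → Y → Z, the indirect X–Z edge carries the least information and is removed, which is exactly the behavior the benchmark above probes on metabolite profiles.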

  11. The Stanford Automated Mounter: Enabling High-Throughput Protein Crystal Screening at SSRL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, C.A.; Cohen, A.E.

    2009-05-26

The macromolecular crystallography experiment lends itself perfectly to high-throughput technologies. The initial steps, including the expression, purification, and crystallization of protein crystals, along with some of the later steps involving data processing and structure determination, have all been automated to the point where some of the last remaining bottlenecks in the process have been crystal mounting, crystal screening, and data collection. At the Stanford Synchrotron Radiation Laboratory, a National User Facility that provides extremely brilliant X-ray photon beams for use in materials science, environmental science, and structural biology research, the incorporation of advanced robotics has enabled crystals to be screened in a true high-throughput fashion, thus dramatically accelerating the final steps. Up to 288 frozen crystals can be mounted by the beamline robot (the Stanford Auto-Mounting System) and screened for diffraction quality in a matter of hours without intervention. The best quality crystals can then be remounted for the collection of complete X-ray diffraction data sets. Furthermore, the entire screening and data collection experiment can be controlled from the experimenter's home laboratory by means of advanced software tools that enable network-based control of the highly automated beamlines.

  12. Automatic Segmentation of High-Throughput RNAi Fluorescent Cellular Images

    PubMed Central

    Yan, Pingkum; Zhou, Xiaobo; Shah, Mubarak; Wong, Stephen T. C.

    2010-01-01

High-throughput genome-wide RNA interference (RNAi) screening is emerging as an essential tool to assist biologists in understanding complex cellular processes. The large number of images produced in each study makes manual analysis intractable; hence, automatic cellular image analysis becomes an urgent need, where segmentation is the first and one of the most important steps. In this paper, a fully automatic method for segmentation of cells from genome-wide RNAi screening images is proposed. Nuclei are first extracted from the DNA channel by using a modified watershed algorithm. Cells are then extracted by modeling the interaction between them as well as combining both gradient and region information in the Actin and Rac channels. A new energy functional is formulated based on a novel interaction model for segmenting tightly clustered cells with significant intensity variance and specific phenotypes. The energy functional is minimized by using a multiphase level set method, which leads to a highly effective cell segmentation method. Promising experimental results demonstrate that automatic segmentation of high-throughput genome-wide multichannel screening can be achieved by using the proposed method, which may also be extended to other multichannel image segmentation problems. PMID:18270043
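The pipeline starts by extracting nuclei from the DNA channel; the toy sketch below shows only the most basic building block, connected-component labeling of a thresholded binary mask, whereas the paper itself uses a modified watershed followed by multiphase level sets:

```python
from collections import deque

def label_components(mask):
    """4-connected component labeling of a binary mask (list of lists of 0/1).
    Returns a label image and the number of components (candidate nuclei)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not labels[i][j]:
                count += 1
                labels[i][j] = count
                q = deque([(i, j)])          # breadth-first flood fill
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = count
                            q.append((ny, nx))
    return labels, count
```

Watershed improves on this by splitting touching nuclei that plain connected components would merge, which is why the paper needs it for tightly clustered cells.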

  13. HIGH-THROUGHPUT CELLULAR ASSAYS FOR MODELING TOXICITY IN THE FISH REPRODUCTIVE SYSTEM

    EPA Science Inventory

    The most important benefit of this project is the experimental evaluation of all essential steps in the development and testing of adverse outcome pathways (AOP) for a diverse set of reproductive and non-reproductive toxicants. In contrast to human testing and the toxicity pat...

  14. pH measurement and a rational and practical pH control strategy for high throughput cell culture system.

    PubMed

    Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli

    2010-01-01

The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process and to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as the scale-down model for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale-down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(and 6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO2, HCO3-, and the buffer system (i.e., HEPES), we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the required %CO2 from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work sheds light on mini-bioreactor scale-down model construction and paves the way for cell culture process development to improve productivity or product quality using high throughput systems. Copyright 2009 American Institute of Chemical Engineers.
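The bicarbonate part of such a pH model follows the Henderson-Hasselbalch relation; below is a minimal sketch using textbook constants (pKa ≈ 6.1, CO2 solubility ≈ 0.03 mmol/L per mmHg), omitting the HEPES term of the published model:

```python
from math import log10

def culture_ph(bicarbonate_mM, pct_co2, pka=6.1,
               co2_solubility=0.0301, atm_mmHg=760.0):
    """Henderson-Hasselbalch pH of a bicarbonate-buffered medium.
    bicarbonate_mM: [HCO3-] in mmol/L; pct_co2: incubator %CO2 setting.
    Assumes sea-level pressure and ignores HEPES buffering, which the
    published model also accounts for."""
    pco2 = atm_mmHg * pct_co2 / 100.0      # CO2 partial pressure, mmHg
    dissolved_co2 = co2_solubility * pco2  # dissolved CO2, mmol/L
    return pka + log10(bicarbonate_mM / dissolved_co2)
```

For example, 24 mM bicarbonate under 5% CO2 gives a pH near 7.4, which is why that pairing is a common cell-culture set point; the model in the abstract inverts this relation to choose %CO2 and bicarbonate additions for a target pH.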

  15. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
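As a toy example of a "digital trait", the sketch below estimates projected shoot area by counting plant pixels via the excess-green index (2G − R − B); Image Harvest's actual trait extraction is considerably more involved, and the threshold here is an illustrative assumption:

```python
def projected_area(rgb_pixels, threshold=20):
    """Count 'plant' pixels in an iterable of (R, G, B) tuples using the
    excess-green index, a common rough proxy for projected shoot area."""
    return sum(1 for r, g, b in rgb_pixels if 2 * g - r - b > threshold)
```

Traits like this, computed per image across thousands of plants, are what feed the genome-wide association mapping described above.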

  16. A Protocol for Functional Assessment of Whole-Protein Saturation Mutagenesis Libraries Utilizing High-Throughput Sequencing.

    PubMed

    Stiffler, Michael A; Subramanian, Subu K; Salinas, Victor H; Ranganathan, Rama

    2016-07-03

Site-directed mutagenesis has long been used as a method to interrogate protein structure, function and evolution. Recent advances in massively-parallel sequencing technology have opened up the possibility of assessing the functional or fitness effects of large numbers of mutations simultaneously. Here, we present a protocol for experimentally determining the effects of all possible single amino acid mutations in a protein of interest utilizing high-throughput sequencing technology, using the 263 amino acid antibiotic resistance enzyme TEM-1 β-lactamase as an example. In this approach, a whole-protein saturation mutagenesis library is constructed by site-directed mutagenic PCR, randomizing each position individually to all possible amino acids. The library is then transformed into bacteria, and selected for the ability to confer resistance to β-lactam antibiotics. The fitness effect of each mutation is then determined by deep sequencing of the library before and after selection. Importantly, this protocol introduces methods that maximize sequencing read depth and permit simultaneous selection of the entire mutation library, by pooling adjacent positions into groups whose length is accommodated by high-throughput sequencing reads and by utilizing orthogonal primers to barcode each group. Representative results using this protocol are provided by assessing the fitness effects of all single amino acid mutations in TEM-1 at a clinically relevant dosage of ampicillin. The method should be easily extendable to other proteins for which a high-throughput selection assay is in place.
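The fitness readout of such a scan is typically a log enrichment of post- versus pre-selection read counts, normalized to wild type; a generic sketch in which the pseudocount and normalization are illustrative assumptions, not the protocol's exact scheme:

```python
from math import log2

def mutation_fitness(pre_counts, post_counts, wt_pre, wt_post, pseudo=0.5):
    """Per-mutation fitness as log2 enrichment relative to wild type.
    pre_counts/post_counts: dicts of read counts before/after selection;
    wt_pre/wt_post: wild-type read counts. A pseudocount keeps zero-count
    mutations (fully depleted variants) finite."""
    wt_ratio = (wt_post + pseudo) / (wt_pre + pseudo)
    scores = {}
    for mut, pre in pre_counts.items():
        ratio = (post_counts.get(mut, 0) + pseudo) / (pre + pseudo)
        scores[mut] = log2(ratio / wt_ratio)
    return scores
```

A neutral mutation scores near zero, while a variant depleted by ampicillin selection scores strongly negative.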

  17. Multiplexed ChIP-Seq Using Direct Nucleosome Barcoding: A Tool for High-Throughput Chromatin Analysis.

    PubMed

    Chabbert, Christophe D; Adjalley, Sophie H; Steinmetz, Lars M; Pelechano, Vicent

    2018-01-01

Chromatin immunoprecipitation followed by sequencing (ChIP-Seq) or microarray hybridization (ChIP-on-chip) are standard methods for the study of transcription factor binding sites and histone chemical modifications. However, these approaches only allow profiling of a single factor or protein modification at a time. In this chapter, we present Bar-ChIP, a higher-throughput version of ChIP-Seq that relies on the direct ligation of molecular barcodes to chromatin fragments. Bar-ChIP enables the concurrent profiling of multiple DNA-protein interactions and is therefore amenable to experimental scale-up, without the need for any robotic instrumentation.
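Downstream of barcoded ligation, pooled reads must be demultiplexed back to their samples; a generic sketch in which the barcode length and 5' placement are assumptions for illustration, not the Bar-ChIP design:

```python
def demultiplex(reads, barcodes, barcode_len=6):
    """Assign reads to samples by an assumed 5' barcode of fixed length.
    barcodes: dict mapping barcode sequence -> sample name.
    Returns per-sample reads (barcode trimmed) and unassigned reads."""
    by_sample = {sample: [] for sample in barcodes.values()}
    unassigned = []
    for read in reads:
        sample = barcodes.get(read[:barcode_len])
        if sample:
            by_sample[sample].append(read[barcode_len:])
        else:
            unassigned.append(read)
    return by_sample, unassigned
```

A production demultiplexer would additionally tolerate sequencing errors in the barcode (e.g., by allowing one mismatch), which exact-match lookup does not.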

  18. Diffraction efficiency of radially-profiled off-plane reflection gratings

    NASA Astrophysics Data System (ADS)

    Miles, Drew M.; Tutt, James H.; DeRoo, Casey T.; Marlowe, Hannah; Peterson, Thomas J.; McEntaffer, Randall L.; Menz, Benedikt; Burwitz, Vadim; Hartner, Gisela; Laubis, Christian; Scholze, Frank

    2015-09-01

Future X-ray missions will require gratings with high throughput and high spectral resolution. Blazed off-plane reflection gratings are capable of meeting these demands. A blazed grating profile optimizes grating efficiency, providing higher throughput to one side of zero-order on the arc of diffraction. This paper presents efficiency measurements made in the 0.3 - 1.5 keV energy band at the Physikalisch-Technische Bundesanstalt (PTB) BESSY II facility for three holographically-ruled gratings, two of which are blazed. Each blazed grating was tested in both the Littrow and anti-Littrow configurations in order to test the alignment sensitivity of these gratings with regard to throughput. This paper outlines the procedure of the grating experiment performed at BESSY II and discusses the resulting efficiency measurements across various energies. Experimental results are generally consistent with theory and demonstrate that the blaze does increase throughput to one side of zero-order. However, the total efficiency of the non-blazed, sinusoidal grating is greater than that of the blazed gratings, which suggests that the method of manufacturing these blazed profiles fails to produce facets with the desired level of precision. Finally, evidence of a successful blaze implementation from first diffraction results of prototype blazed gratings produced via a new fabrication technique at the University of Iowa is presented.

  19. Combinatorial and High Throughput Discovery of High Temperature Piezoelectric Ceramics

    DTIC Science & Technology

    2011-10-10

the known candidate piezoelectric ferroelectric perovskites. Unlike most computational studies on crystal chemistry, where the starting point is some form of electronic structure calculation, we use a data-driven approach to initiate our...experimental measurements reported in the literature. Given that our models are based solely on crystal and electronic structure data and did not

  20. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships

    PubMed Central

    2010-01-01

Background: The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. Results: In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. Conclusion: High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data. PMID:20122245

  1. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships.

    PubMed

    Seok, Junhee; Kaushal, Amit; Davis, Ronald W; Xiao, Wenzhong

    2010-01-18

    The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data.

  2. SMARTIV: combined sequence and structure de-novo motif discovery for in-vivo RNA binding data.

    PubMed

    Polishchuk, Maya; Paz, Inbal; Yakhini, Zohar; Mandel-Gutfreund, Yael

    2018-05-25

    Gene expression regulation is highly dependent on binding of RNA-binding proteins (RBPs) to their RNA targets. Growing evidence supports the notion that both RNA primary sequence and its local secondary structure play a role in specific Protein-RNA recognition and binding. Despite the great advance in high-throughput experimental methods for identifying sequence targets of RBPs, predicting the specific sequence and structure binding preferences of RBPs remains a major challenge. We present a novel webserver, SMARTIV, designed for discovering and visualizing combined RNA sequence and structure motifs from high-throughput RNA-binding data, generated from in-vivo experiments. The uniqueness of SMARTIV is that it predicts motifs from enriched k-mers that combine information from ranked RNA sequences and their predicted secondary structure, obtained using various folding methods. Consequently, SMARTIV generates Position Weight Matrices (PWMs) in a combined sequence and structure alphabet with assigned P-values. SMARTIV concisely represents the sequence and structure motif content as a single graphical logo, which is informative and easy for visual perception. SMARTIV was examined extensively on a variety of high-throughput binding experiments for RBPs from different families, generated from different technologies, showing consistent and accurate results. Finally, SMARTIV is a user-friendly webserver, highly efficient in run-time and freely accessible via http://smartiv.technion.ac.il/.
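The PWM-construction step can be sketched for a plain RNA alphabet; SMARTIV's combined sequence-structure alphabet would simply use an extended symbol set (e.g., bases annotated as paired/unpaired), and this sketch omits its ranking and P-value machinery:

```python
from collections import Counter

def build_pwm(kmers, alphabet='ACGU'):
    """Position weight matrix (relative symbol frequencies) built from a
    list of equal-length enriched k-mers. Returns one dict per position."""
    n = len(kmers)
    pwm = []
    for pos in range(len(kmers[0])):
        counts = Counter(k[pos] for k in kmers)
        pwm.append({c: counts[c] / n for c in alphabet})
    return pwm
```

Rendering each position's frequency dict as stacked letters scaled by frequency gives the familiar motif logo that SMARTIV draws in its combined alphabet.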

  3. Large-scale protein-protein interactions detection by integrating big biosensing data with computational model.

    PubMed

    You, Zhu-Hong; Li, Shuai; Gao, Xin; Luo, Xin; Ji, Zhen

    2014-01-01

    Protein-protein interactions are the basis of biological functions, and studying these interactions on a molecular level is of crucial importance for understanding the functionality of a living cell. During the past decade, biosensors have emerged as an important tool for the high-throughput identification of proteins and their interactions. However, the high-throughput experimental methods for identifying PPIs are both time-consuming and expensive. On the other hand, high-throughput PPI data are often associated with high false-positive and high false-negative rates. Targeting at these problems, we propose a method for PPI detection by integrating biosensor-based PPI data with a novel computational model. This method was developed based on the algorithm of extreme learning machine combined with a novel representation of protein sequence descriptor. When performed on the large-scale human protein interaction dataset, the proposed method achieved 84.8% prediction accuracy with 84.08% sensitivity at the specificity of 85.53%. We conducted more extensive experiments to compare the proposed method with the state-of-the-art techniques, support vector machine. The achieved results demonstrate that our approach is very promising for detecting new PPIs, and it can be a helpful supplement for biosensor-based PPI data detection.
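The extreme learning machine at the heart of this method is conceptually simple: a fixed random hidden layer plus output weights fit by least squares. Below is a minimal sketch on toy data, illustrating only the algorithm class, not the paper's protein-sequence descriptor pipeline:

```python
import numpy as np

def elm_train(X, y, hidden=20, seed=0):
    """Extreme learning machine: random input weights/biases are fixed,
    only the output weights beta are solved by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))   # random, never trained
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                      # hidden-layer features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because only a linear solve is needed, training is fast even on large PPI feature sets, which is the practical appeal the abstract alludes to.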

  4. DockoMatic: automated peptide analog creation for high throughput virtual screening.

    PubMed

    Jacob, Reed B; Bullock, Casey W; Andersen, Tim; McDougal, Owen M

    2011-10-01

The purpose of this manuscript is threefold: (1) to describe an update to DockoMatic that allows the user to generate cyclic peptide analog structure files based on protein database (pdb) files, (2) to test the accuracy of the peptide analog structure generation utility, and (3) to evaluate the high throughput capacity of DockoMatic. The DockoMatic graphical user interface interfaces with the software program Treepack to create user-defined peptide analogs. To validate this approach, DockoMatic-produced cyclic peptide analogs were tested for three-dimensional structure consistency and binding affinity against four experimentally determined peptide structure files available in the Research Collaboratory for Structural Bioinformatics database. The peptides used to evaluate this new functionality were alpha-conotoxins ImI, PnIA, and their published analogs. Peptide analogs were generated by DockoMatic and tested for their ability to bind to X-ray crystal structure models of the acetylcholine binding protein originating from Aplysia californica. The results, consisting of more than 300 simulations, demonstrate that DockoMatic predicts the binding energy of peptide structures to within 3.5 kcal mol(-1), and the orientation of the bound ligand compares to within 1.8 Å root mean square deviation against experimental data. Evaluation of high throughput virtual screening capacity demonstrated that DockoMatic can collect, evaluate, and summarize the output of 10,000 AutoDock jobs in less than 2 hours of computational time, while 100,000 jobs require approximately 15 hours and 1,000,000 jobs are estimated to take up to a week. Copyright © 2011 Wiley Periodicals, Inc.

  5. High-throughput determination of RNA structure by proximity ligation.

    PubMed

    Ramani, Vijay; Qiu, Ruolan; Shendure, Jay

    2015-09-01

    We present an unbiased method to globally resolve RNA structures through pairwise contact measurements between interacting regions. RNA proximity ligation (RPL) uses proximity ligation of native RNA followed by deep sequencing to yield chimeric reads with ligation junctions in the vicinity of structurally proximate bases. We apply RPL in both baker's yeast (Saccharomyces cerevisiae) and human cells and generate contact probability maps for ribosomal and other abundant RNAs, including yeast snoRNAs, the RNA subunit of the signal recognition particle and the yeast U2 spliceosomal RNA homolog. RPL measurements correlate with established secondary structures for these RNA molecules, including stem-loop structures and long-range pseudoknots. We anticipate that RPL will complement the current repertoire of computational and experimental approaches in enabling the high-throughput determination of secondary and tertiary RNA structures.
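The contact probability maps described above are built by binning the ligation-junction coordinates of chimeric reads; a minimal sketch, where the bin size and 0-based single-molecule coordinate convention are assumptions for illustration:

```python
def contact_map(junctions, length, bin_size=10):
    """Symmetric binned contact-count matrix for one RNA.
    junctions: (i, j) ligation-junction coordinate pairs, 0-based;
    length: RNA length in nucleotides."""
    nbins = (length + bin_size - 1) // bin_size
    m = [[0] * nbins for _ in range(nbins)]
    for i, j in junctions:
        a, b = i // bin_size, j // bin_size
        m[a][b] += 1
        if a != b:
            m[b][a] += 1   # keep the matrix symmetric
    return m
```

High off-diagonal counts in such a matrix flag spatially proximate regions, e.g., the two strands of a stem-loop or the partners of a long-range pseudoknot.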

  6. Peroxisystem: harnessing systems cell biology to study peroxisomes.

    PubMed

    Schuldiner, Maya; Zalckvar, Einat

    2015-04-01

    In recent years, high-throughput experimentation with quantitative analysis and modelling of cells, recently dubbed systems cell biology, has been harnessed to study the organisation and dynamics of simple biological systems. Here, we suggest that the peroxisome, a fascinating dynamic organelle, is a good candidate for studying a complete biological system. We discuss several aspects of peroxisomes that can be studied using high-throughput systematic approaches and be integrated into a predictive model. Such approaches can be used in the future to study and understand how a more complex biological system, like a cell and maybe even ultimately a whole organism, works. © 2015 Société Française des Microscopies and Société de Biologie Cellulaire de France. Published by John Wiley & Sons Ltd.

  7. High-Throughput Quantification of SH2 Domain-Phosphopeptide Interactions with Cellulose-Peptide Conjugate Microarrays.

    PubMed

    Engelmann, Brett W

    2017-01-01

    The Src Homology 2 (SH2) domain family primarily recognizes phosphorylated tyrosine (pY) containing peptide motifs. The relative affinity preferences among competing SH2 domains for phosphopeptide ligands define "specificity space," and underpin many functional pY-mediated interactions within signaling networks. The degree of promiscuity exhibited and the dynamic range of affinities supported by individual domains or phosphopeptides are best resolved by a carefully executed and controlled quantitative high-throughput experiment. Here, I describe the fabrication and application of a cellulose-peptide conjugate microarray (CPCMA) platform for the quantitative analysis of SH2 domain specificity space. Included herein are instructions for optimal experimental design, with special attention paid to common sources of systematic error, phosphopeptide SPOT synthesis, microarray fabrication, analyte titrations, data capture, and analysis.

  8. A predicted protein interactome identifies conserved global networks and disease resistance subnetworks in maize

    USDA-ARS?s Scientific Manuscript database

    An interactome is the genome-wide roadmap of protein-protein interactions that occur within an organism. Interactomes for humans, the fruit fly, and now plants such as Arabidopsis thaliana and Oryza sativa have been generated using high throughput experimental methods. It is possible to use these ...

  9. A new approach to the rationale discovery of polymeric biomaterials

    PubMed Central

    Kohn, Joachim; Welsh, William J.; Knight, Doyle

    2007-01-01

    This paper attempts to illustrate both the need for new approaches to biomaterials discovery and the significant promise inherent in the use of combinatorial and computational design strategies. The key observation of this Leading Opinion Paper is that the biomaterials community has been slow to embrace advanced biomaterials discovery tools such as combinatorial methods, high throughput experimentation, and computational modeling, in spite of the significant promise shown by these discovery tools in materials science, medicinal chemistry and the pharmaceutical industry. It seems that the complexity of living cells and their interactions with biomaterials has been a conceptual as well as a practical barrier to the use of advanced discovery tools in biomaterials science. However, with the continued increase in computer power, the goal of predicting the biological response of cells in contact with biomaterials surfaces is within reach. Once combinatorial synthesis, high throughput experimentation, and computational modeling are integrated into the biomaterials discovery process, a significant acceleration is possible in the pace of development of improved medical implants, tissue regeneration scaffolds, and gene/drug delivery systems. PMID:17644176

  10. Computational approaches to protein inference in shotgun proteomics

    PubMed Central

    2012-01-01

    Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high-throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem, with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area. PMID:23176300
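The combinatorial-optimization flavor of step (2) can be illustrated with a greedy set-cover heuristic for parsimonious protein inference (a minimal sketch of the general idea, not any specific published tool; the peptide-to-protein map below is invented):

```python
# Parsimony-style protein inference: find a small set of proteins that
# together explain all confidently identified peptides, by greedily
# picking the protein that covers the most still-unexplained peptides.
def parsimony_infer(protein_to_peptides):
    uncovered = set().union(*protein_to_peptides.values())
    selected = []
    while uncovered:
        best = max(protein_to_peptides,
                   key=lambda p: len(protein_to_peptides[p] & uncovered))
        if not protein_to_peptides[best] & uncovered:
            break  # remaining peptides cannot be explained
        selected.append(best)
        uncovered -= protein_to_peptides[best]
    return selected

# Hypothetical assignments: P1 and P2 together explain every peptide,
# so the subset protein P3 is never selected.
mapping = {
    "P1": {"pep1", "pep2"},
    "P2": {"pep2", "pep3"},
    "P3": {"pep1"},
}
print(parsimony_infer(mapping))
```

Greedy set cover is only a heuristic; the review's integer-programming and Bayesian framings treat the same mapping problem exactly or probabilistically.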

  11. Trade-Offs in Thin Film Solar Cells with Layered Chalcostibite Photovoltaic Absorbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, Adam W.; Baranowski, Lauryn L.; Peng, Haowei

    Discovery of novel semiconducting materials is needed for solar energy conversion and other optoelectronic applications. However, emerging low-dimensional solar absorbers often have unconventional crystal structures and unusual combinations of optical absorption and electrical transport properties, which considerably slows down research and development progress. Here, the effects of stronger absorption and weaker carrier collection in 2D-like absorber materials are studied using a high-throughput combinatorial experimental approach, complemented by advanced characterization and computations. It is found that photoexcited charge carrier collection in CuSbSe2 solar cells is enhanced by drift in an electric field, addressing a different absorption/collection balance. The resulting drift solar cell efficiency is <5% due to an inherent JSC/VOC trade-off, suggesting that improved carrier diffusion and better contacts are needed to further increase CuSbSe2 performance. Furthermore, this study also illustrates the advantages of high-throughput experimental methods for fast optimization of optoelectronic devices based on emerging low-dimensional semiconductor materials.

  12. Trade-Offs in Thin Film Solar Cells with Layered Chalcostibite Photovoltaic Absorbers

    DOE PAGES

    Welch, Adam W.; Baranowski, Lauryn L.; Peng, Haowei; ...

    2017-01-25

    Discovery of novel semiconducting materials is needed for solar energy conversion and other optoelectronic applications. However, emerging low-dimensional solar absorbers often have unconventional crystal structures and unusual combinations of optical absorption and electrical transport properties, which considerably slows down research and development progress. Here, the effects of stronger absorption and weaker carrier collection in 2D-like absorber materials are studied using a high-throughput combinatorial experimental approach, complemented by advanced characterization and computations. It is found that photoexcited charge carrier collection in CuSbSe2 solar cells is enhanced by drift in an electric field, addressing a different absorption/collection balance. The resulting drift solar cell efficiency is <5% due to an inherent JSC/VOC trade-off, suggesting that improved carrier diffusion and better contacts are needed to further increase CuSbSe2 performance. Furthermore, this study also illustrates the advantages of high-throughput experimental methods for fast optimization of optoelectronic devices based on emerging low-dimensional semiconductor materials.

  13. Advances in single-cell experimental design made possible by automated imaging platforms with feedback through segmentation.

    PubMed

    Crick, Alex J; Cammarota, Eugenia; Moulang, Katie; Kotar, Jurij; Cicuta, Pietro

    2015-01-01

    Live optical microscopy has become an essential tool for studying the dynamical behaviors and variability of single cells, and cell-cell interactions. However, experiments and data analysis in this area are often extremely labor intensive, and it has often not been achievable or practical to perform properly standardized experiments on a statistically viable scale. We have addressed this challenge by developing automated live imaging platforms to help standardize experiments, increase throughput, and unlock previously impossible ones. Our real-time cell tracking programs communicate in feedback with microscope and camera control software, and they are highly customizable, flexible, and efficient. As examples of our current research utilizing these automated platforms, we describe two quite different applications: egress-invasion interactions of malaria parasites and red blood cells, and imaging of immune cells which possess high motility and internal dynamics. The automated imaging platforms are able to track a large number of motile cells simultaneously, over hours or even days at a time, greatly increasing data throughput and opening up new experimental possibilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. The impact of the condenser on cytogenetic image quality in digital microscope system.

    PubMed

    Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong

    2013-01-01

    Optimizing the operational parameters of a digital microscope system is an important technique for acquiring high quality cytogenetic images and facilitating the process of karyotyping, so that the efficiency and accuracy of diagnosis can be improved. This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Both theoretical analysis and experimental validations, through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens, were conducted. The results show that optimal image quality and a large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%-70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and fewer restrictions on the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high throughput continuous scanning microscopes in clinical practice.
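The 60%-70% rule above reduces to a one-line calculation (the helper function is our illustration, not from the paper):

```python
def recommended_condenser_na(objective_na):
    """Condenser NA range set to 60%-70% of the objective's numerical
    aperture, per the study's finding for optimal quality and DOF."""
    return (0.6 * objective_na, 0.7 * objective_na)

# Example: a 40x/0.75 objective.
low, high = recommended_condenser_na(0.75)
print(f"condenser NA: {low:.3f}-{high:.3f}")
```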

  15. A Fully Automated Drosophila Olfactory Classical Conditioning and Testing System for Behavioral Learning and Memory Assessment

    PubMed Central

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal

    2016-01-01

    Background Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. PMID:26703418

  16. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    PubMed

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Identifying kinase dependency in cancer cells by integrating high-throughput drug screening and kinase inhibition data.

    PubMed

    Ryall, Karen A; Shin, Jimin; Yoo, Minjae; Hinz, Trista K; Kim, Jihye; Kang, Jaewoo; Heasley, Lynn E; Tan, Aik Choon

    2015-12-01

    Targeted kinase inhibitors have dramatically improved cancer treatment, but kinase dependency for an individual patient or cancer cell can be challenging to predict. Kinase dependency does not always correspond with gene expression and mutation status. High-throughput drug screens are powerful tools for determining kinase dependency, but drug polypharmacology can make results difficult to interpret. We developed Kinase Addiction Ranker (KAR), an algorithm that integrates high-throughput drug screening data, comprehensive kinase inhibition data and gene expression profiles to identify kinase dependency in cancer cells. We applied KAR to predict kinase dependency of 21 lung cancer cell lines and 151 leukemia patient samples using published datasets. We experimentally validated KAR predictions of FGFR and MTOR dependence in lung cancer cell line H1581, showing synergistic reduction in proliferation after combining ponatinib and AZD8055. KAR can be downloaded as a Python function or a MATLAB script, along with example inputs and outputs, at http://tanlab.ucdenver.edu/KAR/. Contact: aikchoon.tan@ucdenver.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
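The integration step can be pictured with a toy dependency score (our illustration only, not the published KAR algorithm: each kinase is scored by summing every drug's screen efficacy weighted by how strongly that drug inhibits the kinase; all numbers below are invented):

```python
# Toy kinase-dependency ranking: score(kinase) = sum over drugs of
# (drug efficacy in the cell-line screen) x (fraction of kinase inhibited).
def rank_kinases(drug_efficacy, inhibition):
    scores = {}
    for drug, eff in drug_efficacy.items():
        for kinase, frac in inhibition.get(drug, {}).items():
            scores[kinase] = scores.get(kinase, 0.0) + eff * frac
    # Higher score = stronger evidence the cell depends on that kinase.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

drug_efficacy = {"ponatinib": 0.9, "AZD8055": 0.7}        # invented screen readouts
inhibition = {"ponatinib": {"FGFR1": 0.95, "ABL1": 0.90},  # invented inhibition profiles
              "AZD8055": {"MTOR": 0.98}}
ranking = rank_kinases(drug_efficacy, inhibition)
print(ranking)
```

Weighting by inhibition profiles is what lets such a score disentangle polypharmacology: a drug that hits many kinases spreads its evidence across all of them.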

  18. Detecting and overcoming systematic bias in high-throughput screening technologies: a comprehensive review of practical issues and methodological solutions.

    PubMed

    Caraus, Iurie; Alsuwailem, Abdulaziz A; Nadon, Robert; Makarenkov, Vladimir

    2015-11-01

    Significant efforts have been made recently to improve data throughput and data quality in screening technologies related to drug design. The modern pharmaceutical industry relies heavily on high-throughput screening (HTS) and high-content screening (HCS) technologies, which include small molecule, complementary DNA (cDNA) and RNA interference (RNAi) types of screening. Data generated by these screening technologies are subject to several environmental and procedural systematic biases, which introduce errors into the hit identification process. We first review systematic biases typical of HTS and HCS screens. We highlight that study design issues and the way in which data are generated are crucial for providing unbiased screening results. Considering various data sets, including the publicly available ChemBank data, we assess the rates of systematic bias in experimental HTS by using plate-specific and assay-specific error detection tests. We describe main data normalization and correction techniques and introduce a general data preprocessing protocol. This protocol can be recommended for academic and industrial researchers involved in the analysis of current or next-generation HTS data. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
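One of the standard correction techniques such reviews cover, plate-wise Z-score normalization, can be sketched as follows (a minimal illustration; robust variants like the B-score use medians and median absolute deviations instead of means and standard deviations):

```python
import statistics

def z_score_plate(raw_values):
    """Normalize one plate's raw readouts to Z-scores, so hit thresholds
    can be compared across plates despite plate-level systematic shifts."""
    mean = statistics.fmean(raw_values)
    sd = statistics.stdev(raw_values)
    return [(x - mean) / sd for x in raw_values]

plate = [100, 102, 98, 101, 140]  # the 140 well stands out as a candidate hit
z = z_score_plate(plate)
print([round(v, 2) for v in z])
```

Because each plate is normalized against its own distribution, an additive or multiplicative bias affecting a whole plate cancels out, which is exactly the class of systematic error the review describes.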

  19. Fast log P determination by ultra-high-pressure liquid chromatography coupled with UV and mass spectrometry detections.

    PubMed

    Henchoz, Yveline; Guillarme, Davy; Martel, Sophie; Rudaz, Serge; Veuthey, Jean-Luc; Carrupt, Pierre-Alain

    2009-08-01

    Ultra-high-pressure liquid chromatography (UHPLC) systems able to work with columns packed with sub-2 microm particles offer very fast methods to determine the lipophilicity of new chemical entities. The careful development of the most suitable experimental conditions presented here will help medicinal chemists with high-throughput screening (HTS) log P(oct) measurements. The approach was optimized using a well-balanced set of 38 model compounds and a series of 28 basic compounds such as beta-blockers, local anesthetics, piperazines, clonidine, and derivatives. Different organic modifiers and hybrid stationary phases packed with 1.7-microm particles were evaluated in isocratic as well as gradient modes, and the advantages and limitations of the tested conditions are pointed out. The UHPLC approach offered a significant enhancement over classical HPLC methods, improving lipophilicity determination throughput by a factor of 50. The hyphenation of UHPLC with MS detection allowed a further increase in throughput. Data and results reported herein prove that the UHPLC-MS method represents progress in the HTS measurement of lipophilicity due to its speed (at least a factor of 500 with respect to HPLC approaches) and to an extended field of application.
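Chromatographic lipophilicity estimation of this kind generally rests on a linear calibration between the retention factor (log k) and log P(oct) built from standards of known lipophilicity; a minimal sketch with invented calibration data:

```python
# Fit log P = a * log k + b by ordinary least squares on standards with
# literature log P values, then predict log P for a new compound.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx  # slope, intercept

log_k_standards = [-0.5, 0.0, 0.5, 1.0]  # measured retention (invented)
log_p_standards = [0.4, 1.5, 2.6, 3.7]   # literature log P (invented)
a, b = fit_line(log_k_standards, log_p_standards)

def predict(log_k):
    return a * log_k + b

print(round(predict(0.25), 2))  # log P estimate for a new compound
```

The throughput gain the abstract quotes comes entirely from the speed of measuring log k; the calibration arithmetic is the same regardless of the chromatographic platform.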

  20. SIPSim: A Modeling Toolkit to Predict Accuracy and Aid Design of DNA-SIP Experiments.

    PubMed

    Youngblut, Nicholas D; Barnett, Samuel E; Buckley, Daniel H

    2018-01-01

    DNA Stable isotope probing (DNA-SIP) is a powerful method that links identity to function within microbial communities. The combination of DNA-SIP with multiplexed high throughput DNA sequencing enables simultaneous mapping of in situ assimilation dynamics for thousands of microbial taxonomic units. Hence, high throughput sequencing enabled SIP has enormous potential to reveal patterns of carbon and nitrogen exchange within microbial food webs. There are several different methods for analyzing DNA-SIP data and despite the power of SIP experiments, it remains difficult to comprehensively evaluate method accuracy across a wide range of experimental parameters. We have developed a toolset (SIPSim) that simulates DNA-SIP data, and we use this toolset to systematically evaluate different methods for analyzing DNA-SIP data. Specifically, we employ SIPSim to evaluate the effects that key experimental parameters (e.g., level of isotopic enrichment, number of labeled taxa, relative abundance of labeled taxa, community richness, community evenness, and beta-diversity) have on the specificity, sensitivity, and balanced accuracy (defined as the product of specificity and sensitivity) of DNA-SIP analyses. Furthermore, SIPSim can predict analytical accuracy and power as a function of experimental design and community characteristics, and thus should be of great use in the design and interpretation of DNA-SIP experiments.
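The accuracy metrics named above reduce to simple confusion-matrix arithmetic (note that this abstract defines balanced accuracy as the product, not the mean, of sensitivity and specificity; the counts below are invented):

```python
def sip_accuracy(tp, fn, tn, fp):
    """Sensitivity, specificity, and balanced accuracy as defined in the
    SIPSim abstract (balanced accuracy = sensitivity * specificity)."""
    sensitivity = tp / (tp + fn)  # labeled taxa correctly called incorporators
    specificity = tn / (tn + fp)  # unlabeled taxa correctly called non-incorporators
    return sensitivity, specificity, sensitivity * specificity

# Invented outcome of one simulated DNA-SIP analysis.
sens, spec, bal = sip_accuracy(tp=8, fn=2, tn=9, fp=1)
print(sens, spec, round(bal, 2))
```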

  1. SIPSim: A Modeling Toolkit to Predict Accuracy and Aid Design of DNA-SIP Experiments

    PubMed Central

    Youngblut, Nicholas D.; Barnett, Samuel E.; Buckley, Daniel H.

    2018-01-01

    DNA Stable isotope probing (DNA-SIP) is a powerful method that links identity to function within microbial communities. The combination of DNA-SIP with multiplexed high throughput DNA sequencing enables simultaneous mapping of in situ assimilation dynamics for thousands of microbial taxonomic units. Hence, high throughput sequencing enabled SIP has enormous potential to reveal patterns of carbon and nitrogen exchange within microbial food webs. There are several different methods for analyzing DNA-SIP data and despite the power of SIP experiments, it remains difficult to comprehensively evaluate method accuracy across a wide range of experimental parameters. We have developed a toolset (SIPSim) that simulates DNA-SIP data, and we use this toolset to systematically evaluate different methods for analyzing DNA-SIP data. Specifically, we employ SIPSim to evaluate the effects that key experimental parameters (e.g., level of isotopic enrichment, number of labeled taxa, relative abundance of labeled taxa, community richness, community evenness, and beta-diversity) have on the specificity, sensitivity, and balanced accuracy (defined as the product of specificity and sensitivity) of DNA-SIP analyses. Furthermore, SIPSim can predict analytical accuracy and power as a function of experimental design and community characteristics, and thus should be of great use in the design and interpretation of DNA-SIP experiments. PMID:29643843

  2. ISA software suite: supporting standards-compliant experimental annotation and enabling curation at the community level

    PubMed Central

    Rocca-Serra, Philippe; Brandizi, Marco; Maguire, Eamonn; Sklyar, Nataliya; Taylor, Chris; Begley, Kimberly; Field, Dawn; Harris, Stephen; Hide, Winston; Hofmann, Oliver; Neumann, Steffen; Sterk, Peter; Tong, Weida; Sansone, Susanna-Assunta

    2010-01-01

    Summary: The first open source software suite for experimentalists and curators that (i) assists in the annotation and local management of experimental metadata from high-throughput studies employing one or a combination of omics and other technologies; (ii) empowers users to uptake community-defined checklists and ontologies; and (iii) facilitates submission to international public repositories. Availability and Implementation: Software, documentation, case studies and implementations at http://www.isa-tools.org. Contact: isatools@googlegroups.com. PMID:20679334

  3. Efficient high-throughput biological process characterization: Definitive screening design with the ambr250 bioreactor system.

    PubMed

    Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam

    2015-01-01

    The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.

  4. Detection of protein-small molecule binding using a self-referencing external cavity laser biosensor.

    PubMed

    Zhang, Meng; Peh, Jessie; Hergenrother, Paul J; Cunningham, Brian T

    2014-01-01

    High throughput screening of protein-small molecule binding interactions using label-free optical biosensors is challenging, as the detected signals are often similar in magnitude to experimental noise. Here, we describe a novel self-referencing external cavity laser (ECL) biosensor approach that achieves high resolution and high sensitivity, while eliminating thermal noise with sub-picometer wavelength accuracy. Using the self-referencing ECL biosensor, we demonstrate detection of binding between small molecules and a variety of immobilized protein targets with binding affinities or inhibition constants in the sub-nanomolar to low micromolar range. The demonstrated ability to perform detection in the presence of several interfering compounds opens the potential for increasing the throughput of the approach. As an example application, we performed a "needle-in-the-haystack" screen for inhibitors against carbonic anhydrase isozyme II (CA II), in which known inhibitors are clearly differentiated from inactive molecules within a compound library.

  5. Use of prior knowledge for the analysis of high-throughput transcriptomics and metabolomics data

    PubMed Central

    2014-01-01

    Background High-throughput omics technologies have enabled the measurement of many genes or metabolites simultaneously. The resulting high dimensional experimental data poses significant challenges to transcriptomics and metabolomics data analysis methods, which may lead to spurious instead of biologically relevant results. One strategy to improve the results is the incorporation of prior biological knowledge in the analysis. This strategy is used to reduce the solution space and/or to focus the analysis on biological meaningful regions. In this article, we review a selection of these methods used in transcriptomics and metabolomics. We combine the reviewed methods in three groups based on the underlying mathematical model: exploratory methods, supervised methods and estimation of the covariance matrix. We discuss which prior knowledge has been used, how it is incorporated and how it modifies the mathematical properties of the underlying methods. PMID:25033193

  6. Numerical techniques for high-throughput reflectance interference biosensing

    NASA Astrophysics Data System (ADS)

    Sevenler, Derin; Ünlü, M. Selim

    2016-06-01

    We have developed a robust and rapid computational method for processing the raw spectral data collected from thin film optical interference biosensors. We have applied this method to Interference Reflectance Imaging Sensor (IRIS) measurements and observed a 10,000-fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. We also discuss an additional benefit: the LUT method can be used with a wider range of interference layer thicknesses and experimental configurations that are incompatible with methods requiring fits of the spectral response.
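The LUT idea can be illustrated with a toy thin-film model (our simplification, not the IRIS processing code: precompute the spectral response for a grid of candidate film thicknesses once, then match each measured spectrum to the nearest table entry instead of running an iterative fit per pixel):

```python
import math

# Toy two-beam interference model for a film of thickness d (nm) and
# refractive index N_FILM, probed at wavelength lam (nm). A stand-in
# for the sensor's real optical response.
N_FILM = 1.46  # roughly silicon dioxide

def reflectance(d, lam):
    return 0.5 + 0.5 * math.cos(4 * math.pi * N_FILM * d / lam)

wavelengths = range(450, 701, 10)

# Build the lookup table once, over a grid of candidate thicknesses.
lut = {d: [reflectance(d, lam) for lam in wavelengths] for d in range(0, 501)}

def lut_thickness(spectrum):
    """Return the table thickness whose spectrum best matches (least squares)."""
    return min(lut, key=lambda d: sum((m - r) ** 2
                                      for m, r in zip(spectrum, lut[d])))

measured = [reflectance(123, lam) for lam in wavelengths]  # synthetic "measurement"
print(lut_thickness(measured))
```

The table lookup replaces a per-spectrum nonlinear fit with a precomputed nearest-match search, which is where this style of method gets its large speedup.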

  7. HiCTMap: Detection and analysis of chromosome territory structure and position by high-throughput imaging.

    PubMed

    Jowhar, Ziad; Gudla, Prabhakar R; Shachar, Sigal; Wangsa, Darawalee; Russ, Jill L; Pegoraro, Gianluca; Ried, Thomas; Raznahan, Armin; Misteli, Tom

    2018-06-01

    The spatial organization of chromosomes in the nuclear space is an extensively studied field that relies on measurements of structural features and 3D positions of chromosomes with high precision and robustness. However, no tools are currently available to image and analyze chromosome territories in a high-throughput format. Here, we have developed High-throughput Chromosome Territory Mapping (HiCTMap), a method for the robust and rapid analysis of 2D and 3D chromosome territory positioning in mammalian cells. HiCTMap is a high-throughput imaging-based chromosome detection method which enables routine analysis of chromosome structure and nuclear position. Using an optimized FISH staining protocol in a 384-well plate format in conjunction with a bespoke automated image analysis workflow, HiCTMap faithfully detects chromosome territories and their position in 2D and 3D in a large population of cells per experimental condition. We apply this novel technique to visualize chromosomes 18, X, and Y in male and female primary human skin fibroblasts, and show accurate detection of the correct number of chromosomes in the respective genotypes. Given the ability to visualize and quantitatively analyze large numbers of nuclei, we use HiCTMap to measure chromosome territory area and volume with high precision and determine the radial position of chromosome territories using either centroid or equidistant-shell analysis. The HiCTMap protocol is also compatible with RNA FISH as demonstrated by simultaneous labeling of X chromosomes and Xist RNA in female cells. We suggest HiCTMap will be a useful tool for routine precision mapping of chromosome territories in a wide range of cell types and tissues. Published by Elsevier Inc.

  8. High-throughput Molecular Simulations of MOFs for CO2 Separation: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Erucar, Ilknur; Keskin, Seda

    2018-02-01

    Metal organic frameworks (MOFs) have emerged as great alternatives to traditional nanoporous materials for CO2 separation applications. MOFs are porous materials that are formed by self-assembly of transition metals and organic ligands. The most important advantage of MOFs over well-known porous materials is the possibility to generate multiple materials with varying structural properties and chemical functionalities by changing the combination of metal centers and organic linkers during synthesis. This leads to a large diversity of materials with various pore sizes and shapes that can be efficiently used for CO2 separations. Since the number of synthesized MOFs has already reached several thousand, experimental investigation of each MOF at the lab scale is not practical. High-throughput computational screening of MOFs is a great opportunity to identify the best materials for CO2 separation and to gain molecular-level insights into structure-performance relationships. This type of knowledge can be used to design new materials with the desired structural features that can lead to extraordinarily high CO2 selectivities. In this mini-review, we focus on developments in high-throughput molecular simulations of MOFs for CO2 separations. After reviewing the current studies on this topic, we discuss the opportunities and challenges in the field and address potential future developments.

  9. Heat*seq: an interactive web tool for high-throughput sequencing experiment comparison with public data.

    PubMed

    Devailly, Guillaume; Mantsoki, Anna; Joshi, Anagha

    2016-11-01

    Better protocols and decreasing costs have made high-throughput sequencing experiments accessible even to small experimental laboratories. However, comparing one or a few experiments generated by an individual lab to the vast amount of relevant data freely available in the public domain may be limited by a lack of bioinformatics expertise. Though several tools, including genome browsers, allow such comparison at the single-gene level, they do not provide a genome-wide view. We developed Heat*seq, a web tool that allows genome-scale comparison of high-throughput experiments (chromatin immuno-precipitation followed by sequencing, RNA-sequencing and Cap Analysis of Gene Expression) provided by a user to the data in the public domain. Heat*seq currently contains over 12 000 experiments across diverse tissues and cell types in human, mouse and Drosophila. Heat*seq displays interactive correlation heatmaps, with an ability to dynamically subset datasets to contextualize user experiments. High-quality figures and tables are produced and can be downloaded in multiple formats. Web application: http://www.heatstarseq.roslin.ed.ac.uk/ Source code: https://github.com/gdevailly Contact: Guillaume.Devailly@roslin.ed.ac.uk or Anagha.Joshi@roslin.ed.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
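    The core of a Heat*seq-style comparison is a sample-by-sample correlation matrix over shared genomic features, which the heatmap then visualizes. A minimal sketch of that computation, using made-up signal vectors (the tool's actual normalization and feature sets are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical signal vectors: rows = experiments, columns = shared genomic bins.
public = rng.poisson(5.0, size=(3, 1000)).astype(float)  # stand-ins for public datasets
user = public[0] + rng.normal(0.0, 1.0, size=1000)       # user experiment resembling dataset 0

experiments = np.vstack([user, public])
labels = ["user", "public_A", "public_B", "public_C"]

# Pearson correlation matrix -- the values behind an interactive heatmap.
corr = np.corrcoef(experiments)

for name, row in zip(labels, corr):
    print(name, np.round(row, 2))

# The user experiment should correlate best with the dataset it was derived from.
best_match = labels[1:][int(np.argmax(corr[0, 1:]))]
```

    Ranking public datasets by their correlation to the user row is what lets the heatmap contextualize a new experiment at a glance.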

  10. Link and Network Layers Design for Ultra-High-Speed Terahertz-Band Communications Networks

    DTIC Science & Technology

    2017-01-01

    throughput, and identify the optimal parameter values for their design (Sec. 6.2.3). Moreover, we validate and test the scheme with experimental data obtained... (Final technical report, "Link and Network Layers Design for Ultra-High-Speed Terahertz-Band Communications Networks," State University of New York (SUNY) at Buffalo; reporting period February 2015 – September 2016.)

  11. Fishing on chips: up-and-coming technological advances in analysis of zebrafish and Xenopus embryos.

    PubMed

    Zhu, Feng; Skommer, Joanna; Huang, Yushi; Akagi, Jin; Adams, Dany; Levin, Michael; Hall, Chris J; Crosier, Philip S; Wlodkowic, Donald

    2014-11-01

    Biotests performed on small vertebrate model organisms provide significant investigative advantages as compared with bioassays that employ cell lines, isolated primary cells, or tissue samples. The main advantage offered by whole-organism approaches is that the effects under study occur in the context of an intact physiological milieu, with all its intercellular and multisystem interactions. The gap between high-throughput cell-based in vitro assays and low-throughput, disproportionately expensive and ethically controversial mammalian in vivo tests can be closed by small model organisms such as zebrafish or Xenopus. The optical transparency of their tissues, the ease of genetic manipulation and straightforward husbandry explain the growing popularity of these model organisms. Nevertheless, despite the potential for miniaturization, automation and subsequent increase in throughput of experimental setups, the manipulation, dispensing and analysis of living fish and frog embryos remain labor-intensive. Recently, a new generation of miniaturized chip-based devices has been developed for zebrafish and Xenopus embryo on-chip culture and experimentation. In this work, we review the critical developments in the field of Lab-on-a-Chip devices designed to alleviate the limits of traditional platforms for studies on zebrafish and clawed frog embryos and larvae. © 2014 International Society for Advancement of Cytometry.

  12. Mapping specificity landscapes of RNA-protein interactions by high throughput sequencing.

    PubMed

    Jankowsky, Eckhard; Harris, Michael E

    2017-04-15

    To function in a biological setting, RNA binding proteins (RBPs) have to discriminate between alternative binding sites in RNAs. This discrimination can occur in the ground state of an RNA-protein binding reaction, in its transition state, or in both. The extent to which RBPs discriminate at these reaction states defines RBP specificity landscapes. Here, we describe the HiTS-Kin and HiTS-EQ techniques, which combine kinetic and equilibrium binding experiments with high throughput sequencing to quantitatively assess substrate discrimination for large numbers of substrate variants at ground and transition states of RNA-protein binding reactions. We discuss experimental design, practical considerations, and data analysis, and outline how a combination of HiTS-Kin and HiTS-EQ allows the mapping of RBP specificity landscapes. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Identification of antigen-specific human monoclonal antibodies using high-throughput sequencing of the antibody repertoire.

    PubMed

    Liu, Ju; Li, Ruihua; Liu, Kun; Li, Liangliang; Zai, Xiaodong; Chi, Xiangyang; Fu, Ling; Xu, Junjie; Chen, Wei

    2016-04-22

    High-throughput sequencing of the antibody repertoire provides a large number of antibody variable region sequences that can be used to generate human monoclonal antibodies. However, current screening methods for identifying antigen-specific antibodies are inefficient. In the present study, we developed an antibody clone screening strategy based on clone dynamics and relative frequency, and used it to identify antigen-specific human monoclonal antibodies. Enzyme-linked immunosorbent assay showed that at least 52% of putative positive immunoglobulin heavy chains yielded antigen-specific antibodies. Combining information on clone dynamics and relative frequency improved the identification of positive clones, the elimination of negative clones, and the credibility of putative positive clones. The screening strategy could therefore simplify the subsequent experimental screening and may facilitate the generation of antigen-specific antibodies. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Screening small-molecule compound microarrays for protein ligands without fluorescence labeling with a high-throughput scanning microscope.

    PubMed

    Fei, Yiyan; Landry, James P; Sun, Yungshin; Zhu, Xiangdong; Wang, Xiaobing; Luo, Juntao; Wu, Chun-Yi; Lam, Kit S

    2010-01-01

    We describe a high-throughput scanning optical microscope for detecting small-molecule compound microarrays on functionalized glass slides. It is based on measurements of oblique-incidence reflectivity difference and employs a combination of a y-scan galvanometer mirror and an x-scan translation stage with an effective field of view of 2 cm × 4 cm. Such a field of view can accommodate a printed small-molecule compound microarray with as many as 10,000 to 20,000 targets. The scanning microscope is capable of measuring kinetics as well as endpoints of protein-ligand reactions simultaneously. We present the experimental results on solution-phase protein reactions with small-molecule compound microarrays synthesized from one-bead, one-compound combinatorial chemistry and immobilized on a streptavidin-functionalized glass slide.

  15. Screening small-molecule compound microarrays for protein ligands without fluorescence labeling with a high-throughput scanning microscope

    PubMed Central

    Fei, Yiyan; Landry, James P.; Sun, Yungshin; Zhu, Xiangdong; Wang, Xiaobing; Luo, Juntao; Wu, Chun-Yi; Lam, Kit S.

    2010-01-01

    We describe a high-throughput scanning optical microscope for detecting small-molecule compound microarrays on functionalized glass slides. It is based on measurements of oblique-incidence reflectivity difference and employs a combination of a y-scan galvanometer mirror and an x-scan translation stage with an effective field of view of 2 cm × 4 cm. Such a field of view can accommodate a printed small-molecule compound microarray with as many as 10,000 to 20,000 targets. The scanning microscope is capable of measuring kinetics as well as endpoints of protein-ligand reactions simultaneously. We present the experimental results on solution-phase protein reactions with small-molecule compound microarrays synthesized from one-bead, one-compound combinatorial chemistry and immobilized on a streptavidin-functionalized glass slide. PMID:20210464

  16. Automated recycling of chemistry for virtual screening and library design.

    PubMed

    Vainio, Mikko J; Kogej, Thierry; Raubacher, Florian

    2012-07-23

    An early stage drug discovery project needs to identify a number of chemically diverse and attractive compounds. These hit compounds are typically found through high-throughput screening campaigns. The diversity of the chemical libraries used in screening is therefore important. In this study, we describe a virtual high-throughput screening system called Virtual Library. The system automatically "recycles" validated synthetic protocols and available starting materials to generate a large number of virtual compound libraries, and allows for fast searches in the generated libraries using a 2D fingerprint based screening method. Virtual Library links the returned virtual hit compounds back to experimental protocols to quickly assess the synthetic accessibility of the hits. The system can be used as an idea generator for library design to enrich the screening collection and to explore the structure-activity landscape around a specific active compound.
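    The fast search step described above, ranking virtual compounds against a query by 2D fingerprint similarity, typically reduces to a Tanimoto comparison of bit sets. A pure-Python sketch with toy fingerprints (real systems derive the bits from molecular substructures; the compound names and bit indices here are illustrative):

```python
def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) coefficient of two fingerprint bit sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical fingerprints: each set holds the indices of the "on" bits.
library = {
    "cmpd_1": {1, 2, 3, 5, 8},
    "cmpd_2": {1, 2, 4, 6, 7},
    "cmpd_3": {10, 11, 12},
}
query = {1, 2, 3, 5, 9}

# Rank the virtual library by similarity to the query compound.
scores = {name: tanimoto(query, bits) for name, bits in library.items()}
hits = sorted(library, key=scores.get, reverse=True)
```

    The top-ranked hits would then be traced back to their generating synthetic protocols to judge synthetic accessibility.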

  17. A Microfluidics and Agent-Based Modeling Framework for Investigating Spatial Organization in Bacterial Colonies: The Case of Pseudomonas Aeruginosa and H1-Type VI Secretion Interactions.

    PubMed

    Wilmoth, Jared L; Doak, Peter W; Timm, Andrea; Halsted, Michelle; Anderson, John D; Ginovart, Marta; Prats, Clara; Portell, Xavier; Retterer, Scott T; Fuentes-Cabrera, Miguel

    2018-01-01

    The factors leading to changes in the organization of microbial assemblages at fine spatial scales are not well characterized or understood. However, they are expected to guide the succession of community development and function toward specific outcomes that could impact human health and the environment. In this study, we put forward a combined experimental and agent-based modeling framework and use it to interpret unique spatial organization patterns of H1-Type VI secretion system (T6SS) mutants of P. aeruginosa under spatial confinement. We find that key parameters, such as T6SS-mediated cell contact and lysis, spatial localization, relative species abundance, cell density and local concentrations of growth substrates and metabolites, are influenced by spatial confinement. The model, written in the accessible programming language NetLogo, can be adapted to a variety of biological systems of interest and used to simulate experiments across a broad parameter space. It was implemented and run in a high-throughput mode by deploying it across multiple CPUs, with each simulation representing an individual well within a high-throughput microwell array experimental platform. The microfluidics and agent-based modeling framework we present in this paper provides an effective means by which to connect experimental studies in microbiology to model development. The work demonstrates progress in coupling experimental results to simulation while also highlighting potential sources of discrepancies between real-world experiments and idealized models.

  18. A Microfluidics and Agent-Based Modeling Framework for Investigating Spatial Organization in Bacterial Colonies: The Case of Pseudomonas Aeruginosa and H1-Type VI Secretion Interactions

    PubMed Central

    Wilmoth, Jared L.; Doak, Peter W.; Timm, Andrea; Halsted, Michelle; Anderson, John D.; Ginovart, Marta; Prats, Clara; Portell, Xavier; Retterer, Scott T.; Fuentes-Cabrera, Miguel

    2018-01-01

    The factors leading to changes in the organization of microbial assemblages at fine spatial scales are not well characterized or understood. However, they are expected to guide the succession of community development and function toward specific outcomes that could impact human health and the environment. In this study, we put forward a combined experimental and agent-based modeling framework and use it to interpret unique spatial organization patterns of H1-Type VI secretion system (T6SS) mutants of P. aeruginosa under spatial confinement. We find that key parameters, such as T6SS-mediated cell contact and lysis, spatial localization, relative species abundance, cell density and local concentrations of growth substrates and metabolites are influenced by spatial confinement. The model, written in the accessible programming language NetLogo, can be adapted to a variety of biological systems of interest and used to simulate experiments across a broad parameter space. It was implemented and run in a high-throughput mode by deploying it across multiple CPUs, with each simulation representing an individual well within a high-throughput microwell array experimental platform. The microfluidics and agent-based modeling framework we present in this paper provides an effective means by which to connect experimental studies in microbiology to model development. The work demonstrates progress in coupling experimental results to simulation while also highlighting potential sources of discrepancies between real-world experiments and idealized models. PMID:29467721

  19. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present the cellular imaging markup language, an XML-based language for modeling biological images and representing spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
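    To make the idea of an XML-based spatiotemporal representation concrete, here is a hypothetical fragment in the spirit of such a markup language (the paper's actual schema is not reproduced; element and attribute names are invented) together with a simple spatiotemporal query over it:

```python
import xml.etree.ElementTree as ET

# Hypothetical markup: cellular events annotated with time and position, the
# kind of structure a knowledge-extraction service could compose and match on.
doc = """
<imaging_study>
  <cell id="c1">
    <event type="division" t="12.5" x="104" y="88"/>
    <event type="apoptosis" t="30.0" x="110" y="90"/>
  </cell>
  <cell id="c2">
    <event type="division" t="14.0" x="40" y="200"/>
  </cell>
</imaging_study>
"""

root = ET.fromstring(doc)

# Spatiotemporal query: all division events across cells, ordered by time.
divisions = sorted(
    (
        (cell.get("id"), float(ev.get("t")))
        for cell in root.findall("cell")
        for ev in cell.findall("event")
        if ev.get("type") == "division"
    ),
    key=lambda item: item[1],
)
```

    Event composition (e.g. "division followed by apoptosis in the same cell") would build on the same element-and-attribute structure.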

  20. Strategic and Operational Plan for Integrating Transcriptomics ...

    EPA Pesticide Factsheets

    Plans for incorporating high-throughput transcriptomics into the current high-throughput screening activities at NCCT; the details are in the attached slide presentation, given at the OECD meeting on June 23, 2016.

  1. A high-throughput approach to profile RNA structure.

    PubMed

    Delli Ponti, Riccardo; Marti, Stefanie; Armaos, Alexandros; Tartaglia, Gian Gaetano

    2017-03-17

    Here we introduce the Computational Recognition of Secondary Structure (CROSS) method to calculate the structural profile of an RNA sequence (single- or double-stranded state) at single-nucleotide resolution and without sequence length restrictions. We trained CROSS using data from high-throughput experiments such as Selective 2′-Hydroxyl Acylation analyzed by Primer Extension (SHAPE; mouse and HIV transcriptomes) and Parallel Analysis of RNA Structure (PARS; human and yeast transcriptomes) as well as high-quality NMR/X-ray structures (PDB database). The algorithm uses primary structure information alone to predict experimental structural profiles with >80% accuracy, showing high performance on large RNAs such as Xist (17 900 nucleotides; Area Under the ROC Curve (AUC) of 0.75 on dimethyl sulfate (DMS) experiments). We integrated CROSS in thermodynamics-based methods to predict secondary structure and observed an increase in their predictive power by up to 30%. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Living-Cell Microarrays

    PubMed Central

    Yarmush, Martin L.; King, Kevin R.

    2011-01-01

    Living cells are remarkably complex. To unravel this complexity, living-cell assays have been developed that allow delivery of experimental stimuli and measurement of the resulting cellular responses. High-throughput adaptations of these assays, known as living-cell microarrays, which are based on microtiter plates, high-density spotting, microfabrication, and microfluidics technologies, are being developed for two general applications: (a) to screen large-scale chemical and genomic libraries and (b) to systematically investigate the local cellular microenvironment. These emerging experimental platforms offer exciting opportunities to rapidly identify genetic determinants of disease, to discover modulators of cellular function, and to probe the complex and dynamic relationships between cells and their local environment. PMID:19413510

  3. Development of a phenotyping platform for high throughput screening of nodal root angle in sorghum.

    PubMed

    Joshi, Dinesh C; Singh, Vijaya; Hunt, Colleen; Mace, Emma; van Oosterom, Erik; Sulman, Richard; Jordan, David; Hammer, Graeme

    2017-01-01

    In sorghum, the growth angle of nodal roots is a major component of root system architecture. It strongly influences the spatial distribution of roots of mature plants in the soil profile, which can impact drought adaptation. However, selection for nodal root angle in sorghum breeding programs has been restricted by the absence of a suitable high throughput phenotyping platform. The aim of this study was to develop a phenotyping platform for the rapid, non-destructive and digital measurement of nodal root angle of sorghum at the seedling stage. The phenotyping platform comprises 500 soil-filled root chambers (50 × 45 × 0.3 cm in size), made of transparent Perspex sheets that were placed in metal tubs and covered with polycarbonate sheets. Around 3 weeks after sowing, once the first flush of nodal roots was visible, roots were imaged in situ using an imaging box that included two digital cameras that were remotely controlled by two Android tablets. Free software (openGelPhoto.tcl) allowed precise measurement of nodal root angle from the digital images. The reliability and efficiency of the platform was evaluated by screening a large nested association mapping population of sorghum and a set of hybrids in six independent experimental runs that included up to 500 plants each. The platform revealed extensive genetic variation and high heritability (repeatability) for nodal root angle. High genetic correlations and consistent ranking of genotypes across experimental runs confirmed the reproducibility of the platform. This low cost, high throughput root phenotyping platform requires no sophisticated equipment, is adaptable to most glasshouse environments and is well suited to dissect the genetic control of nodal root angle of sorghum. The platform is suitable for use in sorghum breeding programs aiming to improve drought adaptation through root system architecture manipulation.

  4. Validation of high-throughput single cell analysis methodology.

    PubMed

    Devonshire, Alison S; Baradez, Marc-Olivier; Morley, Gary; Marshall, Damian; Foy, Carole A

    2014-05-01

    High-throughput quantitative polymerase chain reaction (qPCR) approaches enable profiling of multiple genes in single cells, bringing new insights to complex biological processes and offering opportunities for single cell-based monitoring of cancer cells and stem cell-based therapies. However, workflows with well-defined sources of variation are required for clinical diagnostics and testing of tissue-engineered products. In a study of neural stem cell lines, we investigated the performance of lysis, reverse transcription (RT), preamplification (PA), and nanofluidic qPCR steps at the single cell level in terms of efficiency, precision, and limit of detection. We compared protocols using a separate lysis buffer with cell capture directly in RT-PA reagent. The two methods were found to have similar lysis efficiencies, whereas the direct RT-PA approach showed improved precision. Digital PCR was used to relate preamplified template copy numbers to Cq values and reveal where low-quality signals may affect the analysis. We investigated the impact of calibration and data normalization strategies as a means of minimizing the impact of inter-experimental variation on gene expression values and found that both approaches can improve data comparability. This study provides validation and guidance for the application of high-throughput qPCR workflows for gene expression profiling of single cells. Copyright © 2014 Elsevier Inc. All rights reserved.
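    The calibration step described above, relating preamplified template copy numbers to Cq values, rests on the standard log-linear qPCR relationship, from which amplification efficiency follows via the slope. A sketch using made-up calibration points (not the study's data):

```python
import math

# Hypothetical standard curve: (template copies, measured Cq) pairs.
standards = [(10, 33.1), (100, 29.8), (1000, 26.4), (10000, 23.1)]

# Least-squares fit of Cq = slope * log10(copies) + intercept.
xs = [math.log10(n) for n, _ in standards]
ys = [cq for _, cq in standards]
k = len(standards)
mean_x, mean_y = sum(xs) / k, sum(ys) / k
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
intercept = mean_y - slope * mean_x

# Amplification efficiency: E = 10^(-1/slope) - 1.
# A slope of about -3.32 corresponds to ~100% efficiency (perfect doubling).
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_cq(cq: float) -> float:
    """Estimate template copies in an unknown well from its Cq value."""
    return 10 ** ((cq - intercept) / slope)
```

    Digital PCR, as used in the study, provides the absolute copy numbers that anchor such a curve without assuming an efficiency.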

  5. Supplemental treatment of air in airborne infection isolation rooms using high-throughput in-room air decontamination units.

    PubMed

    Bergeron, Vance; Chalfine, Annie; Misset, Benoît; Moules, Vincent; Laudinet, Nicolas; Carlet, Jean; Lina, Bruno

    2011-05-01

    Evidence has recently emerged indicating that in addition to large airborne droplets, fine aerosol particles can be an important mode of influenza transmission that may have been hitherto underestimated. Furthermore, recent performance studies evaluating airborne infection isolation (AII) rooms designed to house infectious patients have revealed major discrepancies between what is prescribed and what is actually measured. We conducted an experimental study to investigate the use of high-throughput in-room air decontamination units for supplemental protection against airborne contamination in areas that host infectious patients. The study included both intrinsic performance tests of the air-decontamination unit against biological aerosols of particular epidemiologic interest and field tests in a hospital AII room under different ventilation scenarios. The unit tested efficiently eradicated airborne H5N2 influenza and Mycobacterium bovis (a 4- to 5-log single-pass reduction) and, when implemented with a room extractor, reduced the peak contamination levels by a factor of 5, with decontamination rates at least 33% faster than those achieved with the extractor alone. High-throughput in-room air treatment units can provide supplemental control of airborne pathogen levels in patient isolation rooms. Copyright © 2011 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
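    The "faster decontamination" comparison above can be reasoned about with a well-mixed room model, in which airborne concentration decays exponentially at a rate summing all removal mechanisms. A sketch with illustrative numbers (room volume, flow rates and clean-air delivery rate are assumptions, not the study's measurements):

```python
import math

# Well-mixed room model: C(t) = C0 * exp(-k * t), with k in air changes per hour.
room_volume_m3 = 50.0
extractor_flow_m3h = 300.0   # room extractor alone
unit_cadr_m3h = 600.0        # clean-air delivery rate of an in-room unit

k_extractor = extractor_flow_m3h / room_volume_m3                   # 6 ACH
k_combined = (extractor_flow_m3h + unit_cadr_m3h) / room_volume_m3  # 18 ACH

def time_to_reduction(k_per_hour: float, fold: float) -> float:
    """Hours to reduce airborne concentration by the given fold (10 = 1 log)."""
    return math.log(fold) / k_per_hour

t_extractor = time_to_reduction(k_extractor, 10.0)
t_combined = time_to_reduction(k_combined, 10.0)

# Fractional reduction in clearance time from adding the unit.
speedup = 1.0 - t_combined / t_extractor
```

    With these illustrative numbers the combined setup clears a 1-log reduction three times as fast; the study's measured gain (at least 33% faster) corresponds to a smaller effective added removal rate.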

  6. Validation of a high-throughput real-time polymerase chain reaction assay for the detection of capripoxviral DNA.

    PubMed

    Stubbs, Samuel; Oura, Chris A L; Henstock, Mark; Bowden, Timothy R; King, Donald P; Tuppurainen, Eeva S M

    2012-02-01

    Capripoxviruses, which are endemic in much of Africa and Asia, are the aetiological agents of economically devastating poxviral diseases in cattle, sheep and goats. The aim of this study was to validate a high-throughput real-time PCR assay for routine diagnostic use in a capripoxvirus reference laboratory. The performance of two previously published real-time PCR methods were compared using commercially available reagents including the amplification kits recommended in the original publication. Furthermore, both manual and robotic extraction methods used to prepare template nucleic acid were evaluated using samples collected from experimentally infected animals. The optimised assay had an analytical sensitivity of at least 63 target DNA copies per reaction, displayed a greater diagnostic sensitivity compared to conventional gel-based PCR, detected capripoxviruses isolated from outbreaks around the world and did not amplify DNA from related viruses in the genera Orthopoxvirus or Parapoxvirus. The high-throughput robotic DNA extraction procedure did not adversely affect the sensitivity of the assay compared to manual preparation of PCR templates. This laboratory-based assay provides a rapid and robust method to detect capripoxviruses following suspicion of disease in endemic or disease-free countries. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  7. Microfabrication of a platform to measure and manipulate the mechanics of engineered microtissues.

    PubMed

    Ramade, Alexandre; Legant, Wesley R; Picart, Catherine; Chen, Christopher S; Boudou, Thomas

    2014-01-01

    Engineered tissues can be used to understand fundamental features of biology, develop organotypic in vitro model systems, and as engineered tissue constructs for replacing damaged tissue in vivo. However, a key limitation is an inability to test the wide range of parameters that might impact the engineered tissue in a high-throughput manner and in an environment that mimics the three-dimensional (3D) native architecture. We developed a microfabricated platform to generate arrays of microtissues embedded within 3D micropatterned matrices. Microcantilevers simultaneously constrain microtissue formation and report forces generated by the microtissues in real time, opening the possibility to use high-throughput, low-volume screening for studies on engineered tissues. Thanks to the micrometer scale of the microtissues, this platform is also suitable for high-throughput monitoring of drug-induced effect on architecture and contractility in engineered tissues. Moreover, independent variations of the mechanical stiffness of the cantilevers and collagen matrix allow the measurement and manipulation of the mechanics of the microtissues. Thus, our approach will likely provide valuable opportunities to elucidate how biomechanical, electrical, biochemical, and genetic/epigenetic cues modulate the formation and maturation of 3D engineered tissues. In this chapter, we describe the microfabrication, preparation, and experimental use of such microfabricated tissue gauges. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Sources of PCR-induced distortions in high-throughput sequencing data sets

    PubMed Central

    Kebschull, Justus M.; Zador, Anthony M.

    2015-01-01

    PCR permits the exponential and sequence-specific amplification of DNA, even from minute starting quantities. PCR is a fundamental step in preparing DNA samples for high-throughput sequencing. However, there are errors associated with PCR-mediated amplification. Here we examine the effects of four important sources of error—bias, stochasticity, template switches and polymerase errors—on sequence representation in low-input next-generation sequencing libraries. We designed a pool of diverse PCR amplicons with a defined structure, and then used Illumina sequencing to search for signatures of each process. We further developed quantitative models for each process, and compared predictions of these models to our experimental data. We find that PCR stochasticity is the major force skewing sequence representation after amplification of a pool of unique DNA amplicons. Polymerase errors become very common in later cycles of PCR but have little impact on the overall sequence distribution as they are confined to small copy numbers. PCR template switches are rare and confined to low copy numbers. Our results provide a theoretical basis for removing distortions from high-throughput sequencing data. In addition, our findings on PCR stochasticity will have particular relevance to quantification of results from single cell sequencing, in which sequences are represented by only one or a few molecules. PMID:26187991
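    The stochasticity the authors identify as the dominant distortion can be illustrated with a branching-process toy model: each molecule is copied with some probability per cycle, so identical single-molecule inputs diverge sharply in final representation. A sketch (cycle count, efficiency and pool size are illustrative):

```python
import random
import statistics

random.seed(7)

def pcr(counts, cycles, p_duplicate):
    """Branching-process model of PCR: in each cycle every molecule is copied
    with probability p_duplicate (the per-molecule amplification efficiency)."""
    for _ in range(cycles):
        counts = [
            n + sum(1 for _ in range(n) if random.random() < p_duplicate)
            for n in counts
        ]
    return counts

# 100 unique amplicons, one starting molecule each -- a low-input library.
final = pcr([1] * 100, cycles=12, p_duplicate=0.8)

# Chance events in the earliest cycles dominate: identical inputs end up with
# very unequal representation (a high coefficient of variation across amplicons).
cv = statistics.stdev(final) / statistics.mean(final)
```

    The same simulation run with larger starting copy numbers shows the skew shrinking, which is why stochastic distortion matters most for single-cell and other low-input libraries.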

  9. The Impact of the Condenser on Cytogenetic Image Quality in Digital Microscope System

    PubMed Central

    Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong

    2013-01-01

    Background: Optimizing operational parameters of the digital microscope system is an important technique to acquire high-quality cytogenetic images and facilitate the process of karyotyping so that the efficiency and accuracy of diagnosis can be improved. Objective: This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Methods: Both theoretical analysis and experimental validations through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens were conducted. Results: The results show that the optimal image quality and large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set at 60%–70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and less restriction for the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Conclusions: Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high throughput continuous scanning microscopes in clinical practice. PMID:23676284

  10. The Introduction of High-Throughput Experimentation Methods for Suzuki-Miyaura Coupling Reactions in University Education

    ERIC Educational Resources Information Center

    Hoogenboom, Richard; Meier, Michael A. R.; Schubert, Ulrich S.

    2005-01-01

    A laboratory project permits discussion of the reaction mechanism of the Suzuki-Miyaura coupling reaction. The practical part of the project makes the students familiar with working under an inert atmosphere, but if the appropriate equipment for working under an inert atmosphere is not available in a laboratory, novel catalysts that do not…

  11. Fly-scan ptychography

    DOE PAGES

    Huang, Xiaojing; Lauer, Kenneth; Clark, Jesse N.; ...

    2015-03-13

    We report an experimental ptychography measurement performed in fly-scan mode. With a visible-light laser source, we demonstrate a 5-fold reduction of data acquisition time. By including multiple mutually incoherent modes into the incident illumination, high quality images were successfully reconstructed from blurry diffraction patterns. Thus, this approach significantly increases the throughput of ptychography, especially for three-dimensional applications and the visualization of dynamic systems.

  12. Deriving Signatures of In Vivo Toxicity Using Both Efficacy and Potency Information from In Vitro Assays: Evaluating Model Performance as a Function of Increasing Variability in Experimental Data

    EPA Science Inventory

    The US EPA ToxCast program aims to develop methods for mechanistically-based chemical prioritization using a suite of high throughput, in vitro assays that probe relevant biological pathways, and coupling them with statistical and machine learning methods that produce predictive ...

  13. Meta-analysis of RNA-Seq data across cohorts in a multi-season feed efficiency study of crossbred beef steers accounts for biological and technical variability within season

    USDA-ARS?s Scientific Manuscript database

    High-throughput sequencing is often used for studies of the transcriptome, particularly for comparisons between experimental conditions. Due to sequencing costs, a limited number of biological replicates are typically considered in such experiments, leading to low detection power for differential ex...

  14. Quantification of locomotor activity in larval zebrafish: considerations for the design of high-throughput behavioral studies.

    PubMed

    Ingebretson, Justin J; Masino, Mark A

    2013-01-01

    High-throughput behavioral studies using larval zebrafish often assess locomotor activity to determine the effects of experimental perturbations. However, the results reported by different groups are difficult to compare because there is not a standardized experimental paradigm or measure of locomotor activity. To address this, we investigated the effects that several factors, including the stage of larval development and the physical dimensions (depth and diameter) of the behavioral arena, have on the locomotor activity produced by larval zebrafish. We provide evidence for differences in locomotor activity between larvae at different stages and when recorded in wells of different depths, but not in wells of different diameters. We also show that the variability for most properties of locomotor activity is less for older than younger larvae, which is consistent with previous reports. Finally, we show that conflicting interpretations of activity level can occur when activity is assessed with a single measure of locomotor activity. Thus, we conclude that although a combination of factors should be considered when designing behavioral experiments, the use of older larvae in deep wells will reduce the variability of locomotor activity, and that multiple properties of locomotor activity should be measured to determine activity level.

  15. New fluorescence techniques for high-throughput drug discovery.

    PubMed

    Jäger, S; Brand, L; Eggeling, C

    2003-12-01

    The rapid increase of compound libraries, as well as new targets emerging from the Human Genome Project, requires constant progress in pharmaceutical research. An important tool is High-Throughput Screening (HTS), which has evolved into an indispensable instrument in the pre-clinical target-to-IND (Investigational New Drug) discovery process. HTS requires machinery able to test more than 100,000 potential drug candidates per day for a specific biological activity. This imposes particular experimental demands, especially with respect to sensitivity, speed, and statistical accuracy, which are met by fluorescence-based instrumentation. In particular, the recently developed family of fluorescence techniques, FIDA (Fluorescence Intensity Distribution Analysis), which is based on confocal single-molecule detection, has opened up a new field of HTS applications. This report describes the application of these new techniques, as well as of common fluorescence techniques such as confocal fluorescence lifetime and anisotropy, to HTS. It gives experimental examples and presents the advantages and disadvantages of each method. In addition, the most common artifacts arising from fluorescence detection (auto-fluorescence or quenching by the drug candidates) are highlighted, and correction methods for confocal fluorescence read-outs that circumvent this deficiency are presented.

  16. Combinatorial approach toward high-throughput analysis of direct methanol fuel cells.

    PubMed

    Jiang, Rongzhong; Rong, Charles; Chu, Deryn

    2005-01-01

    A 40-member array of direct methanol fuel cells (with stationary fuel and convective air supplies) was generated by electrically connecting the fuel cells in series. High-throughput analysis of these fuel cells was realized by fast screening of the voltage between the two terminals of each fuel cell at constant-current discharge. A large number of voltage-current curves (200) were obtained by screening the voltages through multiple small current steps. A Gaussian distribution was used for statistical analysis of this large body of experimental data. The standard deviation (sigma) of the voltages of these fuel cells increased linearly with discharge current. The voltage-current curves at various fuel concentrations were simulated with an empirical equation of voltage versus current and a linear equation of sigma versus current. The simulated voltage-current curves fitted the experimental data well. With increasing methanol concentration from 0.5 to 4.0 M, the Tafel slope of the voltage-current curves (at sigma = 0.0) changed from 28 to 91 mV/decade, the cell resistance from 2.91 to 0.18 Ω, and the power output from 3 to 18 mW/cm².
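The empirical fit described above (voltage as a function of current, with a Tafel slope and an ohmic resistance term) can be sketched as a linear least-squares problem, since V = E0 − b·log10(i) − R·i is linear in the parameters E0, b and R. The data below are synthetic and the parameter values invented; the functional form is the standard semi-empirical polarization equation, not necessarily the authors' exact model:

```python
import numpy as np

# Synthetic polarization data generated from known parameters, then the
# parameters are recovered by linear least squares: V = E0 - b*log10(i) - R*i
rng = np.random.default_rng(0)
E0_true, b_true, R_true = 0.60, 0.060, 0.25    # V, V/decade, ohm (invented)
i = np.linspace(1e-3, 0.1, 50)                 # current, A (illustrative range)
v = E0_true - b_true * np.log10(i) - R_true * i
v += rng.normal(0.0, 1e-4, i.size)             # small measurement noise

# Design matrix: columns multiply E0, the Tafel slope b, and the resistance R
A = np.column_stack([np.ones_like(i), -np.log10(i), -i])
E0_fit, b_fit, R_fit = np.linalg.lstsq(A, v, rcond=None)[0]
print(f"E0={E0_fit:.3f} V, Tafel slope={1000 * b_fit:.1f} mV/decade, "
      f"R={R_fit:.3f} ohm")
```

Because the model is linear in its coefficients, no iterative nonlinear solver is needed; a single `lstsq` call recovers the Tafel slope and cell resistance directly.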

  17. ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii.

    PubMed

    May, Patrick; Christian, Jan-Ole; Kempa, Stefan; Walther, Dirk

    2009-05-04

    The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.

  18. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way, and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data and to facilitate data sharing. Software that enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open-source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  19. Future technologies for monitoring HIV drug resistance and cure.

    PubMed

    Parikh, Urvi M; McCormick, Kevin; van Zyl, Gert; Mellors, John W

    2017-03-01

    Sensitive, scalable and affordable assays are critically needed for monitoring the success of interventions for preventing, treating and attempting to cure HIV infection. This review evaluates current and emerging technologies that are applicable for both surveillance of HIV drug resistance (HIVDR) and characterization of HIV reservoirs that persist despite antiretroviral therapy and are obstacles to curing HIV infection. Next-generation sequencing (NGS) has the potential to be adapted into high-throughput, cost-efficient approaches for HIVDR surveillance and monitoring during continued scale-up of antiretroviral therapy and rollout of preexposure prophylaxis. Similarly, improvements in PCR and NGS are resulting in higher throughput single genome sequencing to detect intact proviruses and to characterize HIV integration sites and clonal expansions of infected cells. Current population genotyping methods for resistance monitoring are high cost and low throughput. NGS, combined with simpler sample collection and storage matrices (e.g. dried blood spots), has considerable potential to broaden global surveillance and patient monitoring for HIVDR. Recent adaptions of NGS to identify integration sites of HIV in the human genome and to characterize the integrated HIV proviruses are likely to facilitate investigations of the impact of experimental 'curative' interventions on HIV reservoirs.

  20. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  1. Robo-Lector - a novel platform for automated high-throughput cultivations in microtiter plates with high information content.

    PubMed

    Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen

    2009-08-01

    In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms have not been able to generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only +/- 7%. The third method, 'biomass-specific replication', made it possible to generate equal initial biomass concentrations in main cultures from precultures with different growth kinetics. 
This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where similar growth kinetics were subsequently obtained. The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescent protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities for studying biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel and versatile platform can accelerate and intensify research and development in systems biology as well as in modelling and bioprocess optimization.
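The 'biomass-specific replication' step above amounts to a simple dilution calculation: transfer just enough preculture so that every main-culture well starts at the same biomass. A minimal sketch of that mass balance, assuming biomass is tracked as an optical-density-like signal proportional to cell density; the well volumes, OD readings, and target value below are invented for illustration:

```python
def inoculum_volume_ul(od_preculture: float, od_target: float,
                       main_volume_ul: float) -> float:
    """Preculture volume to transfer so the main culture starts at od_target.

    Mass balance: od_preculture * V_inoc = od_target * V_main, assuming the
    transferred volume is small relative to the main-culture volume.
    """
    if od_preculture <= od_target:
        raise ValueError("preculture must be denser than the target")
    return od_target * main_volume_ul / od_preculture

# Precultures with different growth kinetics reach different densities;
# each well gets its own transfer volume so all main cultures start equal.
precultures = {"A1": 2.0, "B1": 3.5, "C1": 1.4}   # hypothetical OD readings
for well, od in precultures.items():
    vol = inoculum_volume_ul(od, od_target=0.1, main_volume_ul=800.0)
    print(f"{well}: transfer {vol:.1f} uL")
```

The denser the preculture, the smaller the transferred volume, which is exactly how the robot equalizes initial biomass across wells.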

  2. A high throughput mechanical screening device for cartilage tissue engineering.

    PubMed

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  3. 'PACLIMS': a component LIM system for high-throughput functional genomic analysis.

    PubMed

    Donofrio, Nicole; Rajagopalon, Ravi; Brown, Douglas; Diener, Stephen; Windham, Donald; Nolin, Shelly; Floyd, Anna; Mitchell, Thomas; Galadima, Natalia; Tucker, Sara; Orbach, Marc J; Patel, Gayatri; Farman, Mark; Pampanwar, Vishal; Soderlund, Cari; Lee, Yong-Hwan; Dean, Ralph A

    2005-04-12

    Recent advances in sequencing techniques, and the resulting cost reductions, have led to the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotating an entire organism (in this application, the approximately 11,000 predicted genes in the rice blast fungus Magnaporthe grisea), an effective platform was required for tracking and storing both the biological materials created and the data produced across several participating institutions. The platform designed, named PACLIMS, was built to support our high throughput pipeline for generating 50,000 random insertion mutants of Magnaporthe grisea. To be a useful tool for materials and data tracking and storage, PACLIMS was designed to be simple to use, modifiable to accommodate refinement of research protocols, and cost-efficient. Data entry into PACLIMS was simplified through the use of barcodes and scanners, reducing potential human error, time constraints, and labor. This platform was designed in concert with our experimental protocol so that it leads researchers through each step of the process from mutant generation through phenotypic assays, ensuring that every mutant produced is handled in an identical manner and all necessary data are captured. Many sequenced eukaryotes have reached the point where computational analyses are no longer sufficient and biological support for their predicted genes is required. Consequently, there is an increasing need for platforms that support high throughput genome-wide mutational analyses. While PACLIMS was designed specifically for this project, the source code and ideas present in its implementation can be used as a model for other high throughput mutational endeavors.

  4. Life in the fast lane for protein crystallization and X-ray crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Liu, Zhi-Jie; Tempel, Wolfram; Praissman, Jeremy; Lin, Dawei; Wang, Bi-Cheng; Gavira, Jose A.; Ng, Joseph D.

    2005-01-01

    The common goal of structural genomics centers and consortia is to decipher, as quickly as possible, the three-dimensional structures of a multitude of recombinant proteins derived from known genomic sequences. Since X-ray crystallography is the foremost method for acquiring atomic-resolution structures of macromolecules, the limiting step is obtaining protein crystals that are useful for structure determination. High-throughput methods have been developed in recent years to clone, express, purify, crystallize and determine the three-dimensional structure of a protein gene product rapidly using automated devices, commercialized kits and consolidated protocols. However, the average number of protein structures obtained by most structural genomics groups has been very low compared to the total number of proteins purified. As entire genomic sequences are obtained for more organisms from the three kingdoms of life, only the proteins that can be crystallized, and whose structures can be obtained easily, are studied. Consequently, an astonishing number of genomic proteins remain unexamined. In the era of high-throughput processes, traditional methods in molecular biology, protein chemistry and crystallization are eclipsed by automation and pipeline practices. The necessity for high-rate production of protein crystals and structures has discouraged more deliberate strategies and creative approaches in experimental execution. Fundamental principles and personal experience in protein chemistry and crystallization are exploited only minimally, to obtain "low-hanging fruit" protein structures. We review the practical aspects of today's high-throughput manipulations and discuss the challenges of fast-paced protein crystallization and tools for crystallography. Structural genomics pipelines can be improved with information gained from low-throughput tactics that may help us reach the higher-hanging fruit. 
Examples of recent developments in this area are reported from the efforts of the Southeast Collaboratory for Structural Genomics (SECSG).

  5. Life in the Fast Lane for Protein Crystallization and X-Ray Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Liu, Zhi-Jie; Tempel, Wolfram; Praissman, Jeremy; Lin, Dawei; Wang, Bi-Cheng; Gavira, Jose A.; Ng, Joseph D.

    2004-01-01

    The common goal of structural genomics centers and consortia is to decipher, as quickly as possible, the three-dimensional structures of a multitude of recombinant proteins derived from known genomic sequences. Since X-ray crystallography is the foremost method for acquiring atomic-resolution structures of macromolecules, the limiting step is obtaining protein crystals that are useful for structure determination. High-throughput methods have been developed in recent years to clone, express, purify, crystallize and determine the three-dimensional structure of a protein gene product rapidly using automated devices, commercialized kits and consolidated protocols. However, the average number of protein structures obtained by most structural genomics groups has been very low compared to the total number of proteins purified. As entire genomic sequences are obtained for more organisms from the three kingdoms of life, only the proteins that can be crystallized, and whose structures can be obtained easily, are studied. Consequently, an astonishing number of genomic proteins remain unexamined. In the era of high-throughput processes, traditional methods in molecular biology, protein chemistry and crystallization are eclipsed by automation and pipeline practices. The necessity for high-rate production of protein crystals and structures has discouraged more deliberate strategies and creative approaches in experimental execution. Fundamental principles and personal experience in protein chemistry and crystallization are exploited only minimally, to obtain "low-hanging fruit" protein structures. We review the practical aspects of today's high-throughput manipulations and discuss the challenges of fast-paced protein crystallization and tools for crystallography. Structural genomics pipelines can be improved with information gained from low-throughput tactics that may help us reach the higher-hanging fruit. 
Examples of recent developments in this area are reported from the efforts of the Southeast Collaboratory for Structural Genomics (SECSG).

  6. 'PACLIMS': A component LIM system for high-throughput functional genomic analysis

    PubMed Central

    Donofrio, Nicole; Rajagopalon, Ravi; Brown, Douglas; Diener, Stephen; Windham, Donald; Nolin, Shelly; Floyd, Anna; Mitchell, Thomas; Galadima, Natalia; Tucker, Sara; Orbach, Marc J; Patel, Gayatri; Farman, Mark; Pampanwar, Vishal; Soderlund, Cari; Lee, Yong-Hwan; Dean, Ralph A

    2005-01-01

    Background Recent advances in sequencing techniques, and the resulting cost reductions, have led to the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotating an entire organism (in this application, the ~11,000 predicted genes in the rice blast fungus Magnaporthe grisea), an effective platform was required for tracking and storing both the biological materials created and the data produced across several participating institutions. Results The platform designed, named PACLIMS, was built to support our high throughput pipeline for generating 50,000 random insertion mutants of Magnaporthe grisea. To be a useful tool for materials and data tracking and storage, PACLIMS was designed to be simple to use, modifiable to accommodate refinement of research protocols, and cost-efficient. Data entry into PACLIMS was simplified through the use of barcodes and scanners, reducing potential human error, time constraints, and labor. This platform was designed in concert with our experimental protocol so that it leads researchers through each step of the process from mutant generation through phenotypic assays, ensuring that every mutant produced is handled in an identical manner and all necessary data are captured. Conclusion Many sequenced eukaryotes have reached the point where computational analyses are no longer sufficient and biological support for their predicted genes is required. Consequently, there is an increasing need for platforms that support high throughput genome-wide mutational analyses. While PACLIMS was designed specifically for this project, the source code and ideas present in its implementation can be used as a model for other high throughput mutational endeavors. 
PMID:15826298

  7. WE-E-BRE-03: Biological Validation of a Novel High-Throughput Irradiator for Predictive Radiation Sensitivity Bioassays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, TL; Martin, JA; Shepard, AJ

    2014-06-15

    Purpose: The large dose-response variation in both tumor and normal cells between individual patients has led to the recent implementation of predictive bioassays of patient-specific radiation sensitivity in order to personalize radiation therapy. This exciting new clinical paradigm has led us to develop a novel high-throughput, variable dose-rate irradiator to accompany these efforts. Here we present the biological validation of this irradiator through the use of human cells as a relative dosimeter, assessed by two metrics: DNA double-strand break repair pathway modulation and intracellular reactive oxygen species production. Methods: Immortalized human tonsillar epithelial cells were cultured in 96-well microtiter plates and irradiated in groups of eight wells to absorbed doses of 0, 0.5, 1, 2, 4, and 8 Gy. High-throughput immunofluorescent microscopy was used to detect γH2AX, a recruiter of the DNA double-strand break repair machinery. The same analysis was performed with the cells stained with CM-H2DCFDA, which produces a fluorescent adduct when exposed to reactive oxygen species during the irradiation cycle. Results: Irradiations of the immortalized human tonsillar epithelial cells at absorbed doses of 0, 0.5, 1, 2, 4, and 8 Gy produced excellent linearity in the γH2AX and CM-H2DCFDA signals, with R² values of 0.9939 and 0.9595, respectively. Single-cell gel electrophoresis experimentation for the detection of physical DNA double-strand breaks is ongoing. Conclusions: This work indicates significant potential for our high-throughput, variable dose-rate irradiator in patient-specific predictive radiation sensitivity bioassays. This irradiator provides a powerful tool by increasing the efficiency and number of assay techniques available to help personalize radiation therapy.
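The dose-response linearity reported above (R² values of 0.9939 and 0.9595) is an ordinary least-squares statistic: fit signal versus absorbed dose, then compare residual to total variance. A sketch of that computation; the dose points match the abstract, but the fluorescence readings are invented for illustration:

```python
import numpy as np

doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])    # Gy, as in the assay
signal = np.array([1.0, 1.6, 2.1, 3.4, 5.9, 10.8])  # hypothetical mean intensities

# Linear fit and coefficient of determination (R^2)
slope, intercept = np.polyfit(doses, signal, 1)
predicted = slope * doses + intercept
ss_res = np.sum((signal - predicted) ** 2)     # residual sum of squares
ss_tot = np.sum((signal - signal.mean()) ** 2) # total sum of squares
r_squared = 1.0 - ss_res / ss_tot
print(f"slope={slope:.3f} per Gy, R^2={r_squared:.4f}")
```

An R² near 1 indicates that the dose-response is well described by a straight line, which is what qualifies the cell-based readout as a relative dosimeter.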

  8. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hui

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques, and it has found applications in a wide range of areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence in two different areas: heterogeneous catalyst screening and single-cell study. First, the author introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts, to explore its use in the discovery and study of various heterogeneous catalyst systems. The scheme is based on the fact that the creation or destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, the author demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single-cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, individual peaks in the fluorescence electropherograms were identified as serotonin released from the granular core on contact with the surrounding fluid.

  9. A rapid and high-throughput microplate spectrophotometric method for field measurement of nitrate in seawater and freshwater.

    PubMed

    Wu, Jiapeng; Hong, Yiguo; Guan, Fengjie; Wang, Yan; Tan, Yehui; Yue, Weizhong; Wu, Meilin; Bin, Liying; Wang, Jiaping; Wen, Jiali

    2016-02-01

    The well-known zinc-cadmium reduction method is frequently used for the determination of nitrate. However, it is seldom applied in field research because it is time-consuming and requires large sample volumes. Here, we report a modified zinc-cadmium reduction method (MZCRM) for measuring nitrate at natural-abundance levels in both seawater and freshwater. The main improvements of MZCRM include small-volume disposable reaction tubes, a vortex apparatus for shaking to increase the reduction rate, and a microplate reader for high-throughput spectrophotometric measurements. To account for the salt effect, two salinity bands (5~10 psu and 20~35 psu) were set up for more accurate determination of nitrate under low- and high-salinity conditions, respectively. Under optimized experimental conditions, the reduction rates stabilized at 72% and 63% at salinities of 5 and 20 psu, respectively. The detection limit for nitrate was 0.5 μM, and the response was linear up to 100 μM (RSD 4.8%). Assays of environmental samples demonstrated that MZCRM agrees well with the conventional zinc-cadmium reduction method. Overall, this modified method greatly improves accuracy and operational efficiency, enabling rapid, high-throughput, low-cost determination of nitrate in the field.
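The salinity-specific reduction rates above translate directly into a calibration correction: the microplate absorbance is first converted to a nitrite-equivalent concentration, then divided by the reduction efficiency for the relevant salinity band. A minimal sketch; only the 72%/63% efficiencies come from the abstract, while the calibration slope and absorbance reading are invented:

```python
# Hypothetical microplate calibration: absorbance -> nitrate (uM),
# corrected for the salinity-dependent Zn-Cd reduction efficiency.
REDUCTION_EFFICIENCY = {"low (5-10 psu)": 0.72, "high (20-35 psu)": 0.63}
CAL_SLOPE = 0.012  # invented calibration: absorbance units per uM nitrite

def nitrate_um(absorbance: float, salinity_band: str) -> float:
    """Convert Griess-type absorbance to nitrate concentration in uM."""
    nitrite_equiv = absorbance / CAL_SLOPE              # uM detected as nitrite
    return nitrite_equiv / REDUCTION_EFFICIENCY[salinity_band]

# Example: 0.216 AU -> 18 uM nitrite-equivalent; at 72% reduction
# efficiency this corresponds to 25 uM nitrate in the original sample.
print(f"{nitrate_um(0.216, 'low (5-10 psu)'):.1f} uM nitrate")
```

Dividing by the efficiency rather than ignoring it is what keeps the two salinity bands on a common concentration scale.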

  10. A Review of Recent Advancement in Integrating Omics Data with Literature Mining towards Biomedical Discoveries

    PubMed Central

    Raja, Kalpana; Patrick, Matthew; Gao, Yilin; Madu, Desmond; Yang, Yuyang

    2017-01-01

    In the past decade, the volume of “omics” data generated by different high-throughput technologies has expanded exponentially. Managing, storing, and analyzing this big data has been a great challenge for researchers, especially when moving towards the goal of generating testable data-driven hypotheses, which has been the promise of high-throughput experimental techniques. Different bioinformatics approaches have been developed to streamline downstream analyses by providing independent information with which to interpret the data and draw biological inferences. Text mining (also known as literature mining) is one of the approaches commonly used for automated generation of biological knowledge from the vast number of published articles. In this review paper, we discuss recent advancements in approaches that integrate results from omics data with information generated by text mining to uncover novel biomedical information. PMID:28331849

  11. Malthusian Parameters as Estimators of the Fitness of Microbes: A Cautionary Tale about the Low Side of High Throughput.

    PubMed

    Concepción-Acevedo, Jeniffer; Weiss, Howard N; Chaudhry, Waqas Nasir; Levin, Bruce R

    2015-01-01

    The maximum exponential growth rate, the Malthusian parameter (MP), is commonly used as a measure of fitness in experimental studies of adaptive evolution and of the effects of antibiotic resistance and other genes on the fitness of planktonic microbes. Thanks to automated, multi-well optical density plate readers and computers, investigators can readily obtain hundreds of estimates of MPs in less than a day with little hands-on effort. Here we compare estimates of the relative fitness of antibiotic-susceptible and -resistant strains of E. coli, Pseudomonas aeruginosa, and Staphylococcus aureus based on MP data obtained with automated multi-well plate readers against the results of pairwise competition experiments. This comparison leads us to question the reliability of MP estimates obtained with these high-throughput devices and the utility of such maximum-growth-rate estimates for detecting fitness differences.
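    The MP itself is typically estimated as the steepest slope of log(OD) versus time. A minimal sketch of such an estimator, applied here to a synthetic saturating growth curve (all numbers hypothetical, not from this study):

```python
import numpy as np

def malthusian_parameter(times_h, od, window=4):
    """Estimate the maximum exponential growth rate (per hour) from an
    optical-density time series: fit log(OD) vs. time over a sliding
    window and keep the steepest slope."""
    log_od = np.log(od)
    best = -np.inf
    for i in range(len(times_h) - window + 1):
        slope = np.polyfit(times_h[i:i + window], log_od[i:i + window], 1)[0]
        best = max(best, slope)
    return best

# Synthetic culture: exponential growth at ~0.7/h with saturation.
t = np.arange(0, 10, 0.5)
raw = 0.01 * np.exp(0.7 * t)
od = raw / (1 + raw / 1.2)  # logistic-style plateau
mp = malthusian_parameter(t, od)
```

The window size and the OD range over which the fit is taken are exactly the kinds of choices the paper cautions can bias high-throughput MP estimates.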

  12. High-throughput real-time quantitative reverse transcription PCR.

    PubMed

    Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F

    2006-02-01

    Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected DeltaCt method, and the comparative cycle time, or DeltaDeltaCt method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
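    The comparative cycle time (DeltaDeltaCt) method described above reduces to a short calculation. A sketch assuming a perfectly efficient reaction (efficiency 2.0, i.e., doubling per cycle); the Ct values are hypothetical:

```python
def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control,
                     efficiency=2.0):
    """Comparative Ct (DeltaDeltaCt) relative quantitation: fold change of
    a target gene in a treated sample vs. a control (calibrator) sample,
    each normalized to a reference (housekeeping) gene."""
    delta_ct_treated = ct_target_treated - ct_ref_treated
    delta_ct_control = ct_target_control - ct_ref_control
    ddct = delta_ct_treated - delta_ct_control
    return efficiency ** (-ddct)

# Target amplifies 2 cycles earlier (relative to the reference) in the
# treated sample: a 4-fold induction.
print(ddct_fold_change(24.0, 18.0, 26.0, 18.0))  # → 4.0
```

The efficiency-corrected DeltaCt method mentioned in the unit replaces the fixed 2.0 with per-assay amplification efficiencies determined from the validation step.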

  13. Automated Phase Segmentation for Large-Scale X-ray Diffraction Data Using a Graph-Based Phase Segmentation (GPhase) Algorithm.

    PubMed

    Xiong, Zheng; He, Yinyan; Hattrick-Simpers, Jason R; Hu, Jianjun

    2017-03-13

    The creation of composition-processing-structure relationships currently represents a key bottleneck in data analysis for high-throughput experimental (HTE) materials studies. Here we propose an automated phase-diagram attribution algorithm for HTE data analysis that uses a graph-based segmentation algorithm and Delaunay tessellation to create a crystal phase diagram from high-throughput libraries of X-ray diffraction (XRD) patterns. We also propose sample-pair-based objective evaluation measures for the phase-diagram prediction problem. Our approach was validated using 278 diffraction patterns from a Fe-Ga-Pd composition spread sample with a prediction precision of 0.934 and a Matthews correlation coefficient of 0.823. The algorithm was then applied to the open Ni-Mn-Al thin-film composition spread sample to obtain the first predicted phase-diagram mapping for that sample.
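    The Matthews correlation coefficient reported above can be evaluated over sample pairs: a pair of samples counts as a "positive" when both are assigned to the same phase region. A sketch of one plausible reading of such a sample-pair measure (the labels below are hypothetical, and the paper's exact definition may differ):

```python
from itertools import combinations
from math import sqrt

def pairwise_mcc(true_labels, pred_labels):
    """Sample-pair evaluation of a phase-diagram partition: for every pair
    of samples, a prediction is a 'positive' if both samples fall in the
    same predicted phase region. MCC is computed over all pairs."""
    tp = tn = fp = fn = 0
    for i, j in combinations(range(len(true_labels)), 2):
        same_true = true_labels[i] == true_labels[j]
        same_pred = pred_labels[i] == pred_labels[j]
        if same_pred and same_true:
            tp += 1
        elif same_pred:
            fp += 1
        elif same_true:
            fn += 1
        else:
            tn += 1
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(pairwise_mcc([0, 0, 1, 1], [0, 0, 1, 1]))  # perfect agreement → 1.0
```

A pairwise formulation sidesteps the label-permutation problem: the predicted phase regions need not carry the same names as the ground-truth phases.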

  14. BIOREL: the benchmark resource to estimate the relevance of the gene networks.

    PubMed

    Antonov, Alexey V; Mewes, Hans W

    2006-02-06

    The progress of high-throughput methodologies in functional genomics has led to the development of statistical procedures to infer gene networks from various types of high-throughput data. However, due to the lack of common standards, the biological significance of the results of different studies is hard to compare. To overcome this problem we propose a benchmark procedure and have developed a web resource (BIOREL), which is useful for estimating the biological relevance of any genetic network by integrating different sources of biological information. The associations of each gene from the network are classified as biologically relevant or not. The proportion of genes in the network classified as "relevant" is used as the overall network relevance score. Employing synthetic data, we demonstrated that such a score ranks networks fairly with respect to relevance level. Using BIOREL as the benchmark resource, we compared the quality of experimental and theoretically predicted protein interaction data.

  15. Separation and Concentration without Clogging Using a High-Throughput Tunable Filter

    NASA Astrophysics Data System (ADS)

    Mossige, E. J.; Jensen, A.; Mielnik, M. M.

    2018-05-01

    We present a detailed experimental study of a hydrodynamic filtration microchip and show how chip performance can be tuned and clogging avoided by adjusting the flow rates. We demonstrate concentration and separation of microspheres at throughputs as high as 29 ml/min and with 96% purity. Streakline visualizations show that the thickness of a tunable filtration layer dictates the cutoff size and that two different concentration mechanisms exist. Particles larger than the pores are concentrated by low-velocity rolling over the filtration pillars, while particles smaller than the pores are concentrated by lateral drift across the filtration layer. Microscopic particle image velocimetry and particle-tracking velocimetry show that the degree of lateral migration can be quantified by the slip velocity between the particle and the surrounding fluid. Finally, by utilizing differences in inertia and separation mode, we demonstrate size-based separation of particles in a mixture.

  16. High-Throughput Particle Uptake Analysis by Imaging Flow Cytometry

    PubMed Central

    Smirnov, Asya; Solga, Michael D.; Lannigan, Joanne; Criss, Alison K.

    2017-01-01

    Quantifying the efficiency of particle uptake by host cells is important in fields including infectious diseases, autoimmunity, cancer, developmental biology, and drug delivery. Here we present a protocol for high-throughput analysis of particle uptake using imaging flow cytometry, using the bacterium Neisseria gonorrhoeae attached to and internalized by neutrophils as an example. Cells are exposed to fluorescently labeled bacteria, fixed, and stained with a bacteria-specific antibody carrying a different fluorophore. Thus, in the absence of a permeabilizing agent, extracellular bacteria are double-labeled with two fluorophores while intracellular bacteria remain single-labeled. A spot-count algorithm is used to determine the number of single- and double-labeled bacteria in individual cells, and to calculate the percentage of cells associated with bacteria, the percentage of cells with internalized bacteria, and the percentage of cell-associated bacteria that are internalized. These analyses quantify bacterial association and internalization across thousands of cells and can be applied to diverse experimental systems. PMID:28369762
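    Given per-cell spot counts from the spot-count algorithm, the three reported percentages follow directly. A minimal sketch (the tuple layout and example counts are hypothetical, not the paper's data format):

```python
def uptake_metrics(cells):
    """Per-cell spot counts from imaging flow cytometry. Each cell is a
    (single_label_spots, double_label_spots) tuple: without
    permeabilization, double-labeled spots are extracellular bacteria and
    single-labeled spots are internalized bacteria."""
    n_cells = len(cells)
    assoc = sum(1 for s, d in cells if s + d > 0)
    internal = sum(1 for s, d in cells if s > 0)
    total_bacteria = sum(s + d for s, d in cells)
    total_internal = sum(s for s, d in cells)
    return {
        "pct_cells_associated": 100.0 * assoc / n_cells,
        "pct_cells_with_internalized": 100.0 * internal / n_cells,
        "pct_assoc_bacteria_internalized": 100.0 * total_internal / total_bacteria,
    }

# Four cells: (internalized, extracellular) spot counts per cell.
m = uptake_metrics([(2, 1), (0, 3), (0, 0), (1, 0)])
```

Because the counting is per cell, the same tallies also yield per-cell distributions, not just population averages.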

  17. Support and Development of Workflow Protocols for High Throughput Single-Lap-Joint Testing-Experimental

    DTIC Science & Technology

    2013-04-01

    This study examined the effects of surface preparation and the presence of an overflow fillet for a high-strength epoxy and a ductile methacrylate adhesive, as part of expanding adhesive joint test configurations under the GEMS (Gains in the Education of Mathematics and Science) program. A unique feature of this study was the use of untrained GEMS … Subject terms: single lap joint, adhesion, aluminum, epoxy.

  18. A generalized method for high throughput in-situ experiment data analysis: An example of battery materials exploration

    NASA Astrophysics Data System (ADS)

    Aoun, Bachir; Yu, Cun; Fan, Longlong; Chen, Zonghai; Amine, Khalil; Ren, Yang

    2015-04-01

    A generalized method is introduced to extract critical information from series of ranked correlated data. The method is generally applicable to all types of spectra evolving as a function of any arbitrary parameter. The approach is based on correlation functions and a statistical scedasticity formalism. Numerous challenges in analyzing high-throughput experimental data can be tackled using the proposed method. We applied this method to understand the reactivity pathway and formation mechanism of a Li-ion battery cathode material during high-temperature synthesis using in-situ high-energy X-ray diffraction. We demonstrate that Pearson's correlation function can easily unravel all major phase transitions and, more importantly, the minor structural changes that cannot be revealed by conventional inspection of the series of diffraction patterns. Furthermore, a two-dimensional (2D) reactivity pattern, calculated as the scedasticity along all measured reciprocal space for all successive pairs of diffraction patterns, clearly unveils the structural evolution path and the active areas of interest during synthesis. The methods described here can be readily used for on-the-fly data analysis during various in-situ and operando experiments, in order to quickly evaluate and optimize experimental conditions, as well as for post-experiment data analysis and large-scale data mining, where the sheer amount of data makes point-by-point inspection infeasible.
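    The core idea of correlating successive patterns can be sketched in a few lines: a dip in the Pearson correlation series flags a structural change between consecutive frames. All data below are synthetic, standing in for a real diffraction series:

```python
import numpy as np

def successive_correlations(patterns):
    """Pearson correlation between each pair of successive diffraction
    patterns (rows). Dips in this series flag structural changes that
    are hard to spot by eye in the raw patterns."""
    return np.array([np.corrcoef(patterns[i], patterns[i + 1])[0, 1]
                     for i in range(len(patterns) - 1)])

# Synthetic series: a Gaussian 'peak' that abruptly shifts position,
# mimicking a phase transition between frames 2 and 3.
q = np.linspace(0.0, 10.0, 200)
peak = lambda center: np.exp(-(q - center) ** 2 / 0.1)
series = np.array([peak(3), peak(3), peak(3), peak(6), peak(6)])
r = successive_correlations(series)
```

The 2D reactivity pattern described in the abstract extends this from one number per frame pair to a per-q-bin variance measure, localizing *where* in reciprocal space the change occurs.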

  19. Learning from Heterogeneous Data Sources: An Application in Spatial Proteomics

    PubMed Central

    Breckels, Lisa M.; Holden, Sean B.; Wojnar, David; Mulvey, Claire M.; Christoforou, Andy; Groen, Arnoud; Trotter, Matthew W. B.; Kohlbacher, Oliver; Lilley, Kathryn S.; Gatto, Laurent

    2016-01-01

    Sub-cellular localisation of proteins is an essential post-translational regulatory mechanism that can be assayed using high-throughput mass spectrometry (MS). These MS-based spatial proteomics experiments enable us to pinpoint the sub-cellular distribution of thousands of proteins in a specific system under controlled conditions. Recent advances in high-throughput MS methods have yielded a plethora of experimental spatial proteomics data for the cell biology community. Yet there are many third-party data sources, such as immunofluorescence microscopy or protein annotations and sequences, which represent a rich and vast source of complementary information. We present a unique transfer learning classification framework that utilises a nearest-neighbour or support vector machine system to integrate heterogeneous data sources and thereby considerably improve the quantity and quality of sub-cellular protein assignment. We demonstrate the utility of our algorithms through evaluation of five experimental datasets from four different species, in conjunction with four different auxiliary data sources, to classify proteins to tens of sub-cellular compartments with high generalisation accuracy. We further apply the method to an experiment on pluripotent mouse embryonic stem cells to classify a set of previously unknown proteins, and validate our findings against a recent high-resolution map of the mouse stem cell proteome. The methodology is distributed as part of the open-source Bioconductor pRoloc suite for spatial proteomics data analysis. PMID:27175778
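    One naive way to picture combining a primary and an auxiliary data source is to blend their distances before nearest-neighbour voting. This is only a toy sketch with a hand-set weight theta and invented data, not the pRoloc algorithm, which learns such weights per class:

```python
import numpy as np

def knn_transfer_predict(x_primary, x_aux, y, query_primary, query_aux,
                         theta=0.5, k=3):
    """Blend distances from a primary source (e.g. MS profiles) and an
    auxiliary source (e.g. annotation features) with weight theta, then
    classify the query by majority vote among the k nearest neighbours."""
    d = (theta * np.linalg.norm(x_primary - query_primary, axis=1)
         + (1.0 - theta) * np.linalg.norm(x_aux - query_aux, axis=1))
    votes = y[np.argsort(d)[:k]]
    values, counts = np.unique(votes, return_counts=True)
    return values[np.argmax(counts)]

# Two hypothetical compartments, well separated in the primary space.
x_primary = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.0, 5.1]])
x_aux = np.array([[1.0], [1.0], [0.0], [0.0]])
y = np.array([0, 0, 1, 1])
pred = knn_transfer_predict(x_primary, x_aux, y,
                            np.array([0.05, 0.0]), np.array([1.0]))
```

Setting theta near 1 recovers a primary-only classifier, which is the baseline the transfer learning framework is shown to improve upon.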

  20. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite for conducting reproducible experiments with precisely defined treatments. Setting up appropriate and well-defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high-throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundred plants at a time. Compared to small-scale plant cultivation, HT systems place much higher demands, both conceptually and logistically, on experimental design, the actual plant cultivation conditions, and the image analysis and statistical methods used for data evaluation. Furthermore, cultivation conditions need to be designed so that they elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested and optimized with regard to growth substrate, soil coverage, watering regime, and experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation in maize vegetative growth in the HT phenotyping system matched well with that observed in the field. The presented results outline important issues to consider in the design of HT phenotyping experiments for model and crop plants, thereby providing guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data on phenotypic variation for a broad range of applications. PMID:25653655

  1. Dynamic Environmental Photosynthetic Imaging Reveals Emergent Phenotypes

    DOE PAGES

    Cruz, Jeffrey A.; Savage, Linda J.; Zegarac, Robert; ...

    2016-06-22

    Understanding and improving the productivity and robustness of plant photosynthesis requires high-throughput phenotyping under environmental conditions that are relevant to the field. Here we demonstrate the dynamic environmental photosynthesis imager (DEPI), an experimental platform for integrated, continuous, and high-throughput measurements of photosynthetic parameters during plant growth under reproducible yet dynamic environmental conditions. Using parallel imagers obviates the need to move plants or sensors, reducing artifacts and allowing simultaneous measurement on large numbers of plants. As a result, DEPI can reveal phenotypes that are not evident under standard laboratory conditions but emerge under progressively more dynamic illumination. We show examples in mutants of Arabidopsis of such “emergent phenotypes” that are highly transient and heterogeneous, appearing in different leaves under different conditions and depending in complex ways on both environmental conditions and plant developmental age. Finally, these emergent phenotypes appear to be caused by a range of phenomena, suggesting that such previously unseen processes are critical for plant responses to dynamic environments.

  2. High Throughput Plasma Water Treatment

    NASA Astrophysics Data System (ADS)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation, in addition to other chemical, physical, and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale-up. In this work, we investigate a potentially scalable, high-throughput plasma water reactor that utilizes a packed-bed, dielectric-barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  3. High-Throughput Nanoindentation for Statistical and Spatial Property Determination

    NASA Astrophysics Data System (ADS)

    Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.

    2018-04-01

    Standard nanoindentation tests are "high throughput" compared to nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be significantly improved. These higher testing rates enable otherwise impractical studies requiring several thousands of indents, such as high-resolution property mapping and detailed statistical studies. However, care must be taken to avoid systematic errors in the measurement, including the choice of indentation depth/spacing to avoid overlap of plastic zones, pileup, and the influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain rate sensitivity must also be considered. A review of these effects is given, with the emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique are presented, including mapping of welds, microstructures, and composites with varying length scales, along with studies of the effect of surface roughness on nominally homogeneous specimens.

  4. An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.

    PubMed

    Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao

    2016-09-01

    The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
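    A common first step in HTS quality control, related in spirit to (but much simpler than) the pipeline described above, is a per-plate robust z-score: center each well on the plate median and scale by the median absolute deviation. A sketch with simulated plate data, not the paper's algorithm:

```python
import numpy as np

def robust_zscores(plate):
    """Per-plate robust z-scores. Using the median and MAD rather than the
    mean and standard deviation keeps strong hits from distorting the
    normalization, damping interplate offsets before hit calling."""
    flat = plate.ravel()
    med = np.median(flat)
    mad = np.median(np.abs(flat - med)) * 1.4826  # ~sigma for normal data
    return (plate - med) / mad

rng = np.random.default_rng(0)
plate = rng.normal(100.0, 5.0, size=(16, 24))  # simulated 384-well plate
plate[0, 0] = 160.0                            # one strong hit
z = robust_zscores(plate)
```

Intraplate spatial artifacts (edge effects, dispenser streaks) need position-aware corrections on top of this, which is where the automated pipeline goes further.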

  5. ACTS High-Speed VSAT Demonstrated

    NASA Technical Reports Server (NTRS)

    Tran, Quang K.

    1999-01-01

    The Advanced Communication Technology Satellite (ACTS) developed by NASA has demonstrated the breakthrough technologies of Ka-band transmission, spot-beam antennas, and onboard processing. These technologies have enabled the development of very small and ultrasmall aperture terminals (VSATs and USATs), which have capabilities greater than have been possible with conventional satellite technologies. The ACTS High Speed VSAT (HS VSAT) is an effort at the NASA Glenn Research Center at Lewis Field to experimentally demonstrate the maximum user throughput data rate that can be achieved using the technologies developed and implemented on ACTS. This was done by operating the system uplinks as frequency division multiple access (FDMA), essentially assigning all available time division multiple access (TDMA) time slots to a single user on each of two uplink frequencies. Preliminary results show that, using a 1.2-m antenna in this mode, the High Speed VSAT can achieve between 22 and 24 Mbps of the 27.5-Mbps burst rate, for a throughput efficiency of 80 to 88 percent.
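    The reported throughput efficiency is simply the ratio of achieved user data rate to the 27.5-Mbps burst rate:

```python
def throughput_efficiency(user_mbps, burst_mbps=27.5):
    """Percent of the TDMA burst rate realized as user throughput."""
    return 100.0 * user_mbps / burst_mbps

# The 22-24 Mbps range reported for the 1.2-m antenna:
print(round(throughput_efficiency(22.0), 1))  # → 80.0
print(round(throughput_efficiency(24.0), 1))  # → 87.3
```

The remaining 12-20 percent is overhead, consistent with the 80-to-88-percent figure quoted in the abstract.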

  6. Experimental demonstration of topologically protected efficient sound propagation in an acoustic waveguide network

    NASA Astrophysics Data System (ADS)

    Wei, Qi; Tian, Ye; Zuo, Shu-Yu; Cheng, Ying; Liu, Xiao-Jun

    2017-03-01

    Acoustic topological states support sound propagation along the boundary in a one-way direction with inherent robustness against defects and disorder, revolutionizing the manipulation of acoustic waves. A variety of acoustic topological states relying on circulating fluids, chiral coupling, or temporal modulation have been proposed theoretically. However, experimental demonstration has so far remained a significant challenge due to critical limitations such as structural complexity and high losses. Here, we experimentally demonstrate an acoustic anomalous Floquet topological insulator in a waveguide network. The acoustic gapless edge states can be found in the band gap when the waveguides are strongly coupled. The scheme features a simple structure and high energy throughput, enabling the experimental demonstration of efficient and robust topologically protected sound propagation along the boundary. The proposal may offer unique, promising applications in the design of acoustic devices for guiding, switching, isolating, and filtering sound.

  7. Simultaneous virtual prediction of anti-Escherichia coli activities and ADMET profiles: A chemoinformatic complementary approach for high-throughput screening.

    PubMed

    Speck-Planche, Alejandro; Cordeiro, M N D S

    2014-02-10

    Escherichia coli remains one of the principal pathogens causing nosocomial infections, medical conditions that are increasingly common in healthcare facilities. E. coli is intrinsically resistant to many antibiotics, and multidrug-resistant strains have emerged recently. Chemoinformatics has been a great ally of experimental methodologies such as high-throughput screening, playing an important role in the discovery of effective antibacterial agents. However, no existing approach can design safer anti-E. coli agents, because of the multifactorial nature and complexity of bacterial diseases and because the lack of desirable ADMET (absorption, distribution, metabolism, elimination, and toxicity) profiles is a major cause of drug disapproval. In this work, we introduce the first multitasking model based on quantitative-structure biological effect relationships (mtk-QSBER) for simultaneous virtual prediction of anti-E. coli activities and ADMET properties of drugs and/or chemicals under many experimental conditions. The mtk-QSBER model was developed from a large and heterogeneous data set of more than 37,800 cases, exhibiting overall accuracies of >95% in both the training and prediction (validation) sets. The utility of our mtk-QSBER model was demonstrated by performing virtual prediction of properties for the investigational drug avarofloxacin (AVX) under 260 different experimental conditions. Results converged with the experimental evidence, confirming the remarkable anti-E. coli activity and safety of AVX. Predictions also showed that our mtk-QSBER model can be a promising computational tool for virtual screening of desirable anti-E. coli agents, and this chemoinformatic approach could be extended to the search for safer drugs with defined pharmacological activities.

  8. A Microfluidics and Agent-Based Modeling Framework for Investigating Spatial Organization in Bacterial Colonies: The Case of Pseudomonas Aeruginosa and H1-Type VI Secretion Interactions

    DOE PAGES

    Wilmoth, Jared L.; Doak, Peter W.; Timm, Andrea; ...

    2018-02-06

    The factors leading to changes in the organization of microbial assemblages at fine spatial scales are not well characterized or understood. However, they are expected to guide the succession of community development and function toward specific outcomes that could impact human health and the environment. In this study, we put forward a combined experimental and agent-based modeling framework and use it to interpret unique spatial organization patterns of H1-Type VI secretion system (T6SS) mutants of P. aeruginosa under spatial confinement. We find that key parameters, such as T6SS-mediated cell contact and lysis, spatial localization, relative species abundance, cell density and local concentrations of growth substrates and metabolites are influenced by spatial confinement. The model, written in the accessible programming language NetLogo, can be adapted to a variety of biological systems of interest and used to simulate experiments across a broad parameter space. It was implemented and run in a high-throughput mode by deploying it across multiple CPUs, with each simulation representing an individual well within a high-throughput microwell array experimental platform. The microfluidics and agent-based modeling framework we present in this paper provides an effective means by which to connect experimental studies in microbiology to model development. The work demonstrates progress in coupling experimental results to simulation while also highlighting potential sources of discrepancies between real-world experiments and idealized models.

  10. Evaluating Computational Gene Ontology Annotations.

    PubMed

    Škunca, Nives; Roberts, Richard J; Steffen, Martin

    2017-01-01

    Two avenues to understanding gene function are complementary and often overlapping: experimental work and computational prediction. While experimental annotation generally produces high-quality annotations, it is low throughput. Conversely, computational annotations have broad coverage, but the quality of annotations may be variable, and therefore evaluating the quality of computational annotations is a critical concern.In this chapter, we provide an overview of strategies to evaluate the quality of computational annotations. First, we discuss why evaluating quality in this setting is not trivial. We highlight the various issues that threaten to bias the evaluation of computational annotations, most of which stem from the incompleteness of biological databases. Second, we discuss solutions that address these issues, for example, targeted selection of new experimental annotations and leveraging the existing experimental annotations.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center propose a joint project. The goals are to enable scientific workflows of stakeholders to run on multiple cloud resources by use of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) High-Throughput Fabric Virtualization. This is a matching-fund project in which Fermilab and KISTI will contribute equal resources.

  12. WE-EF-BRA-05: Experimental Design for High-Throughput In-Vitro RBE Measurements Using Protons, Helium and Carbon Ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, F; Titt, U; Patel, D

    2015-06-15

    Purpose: To design and validate experimental setups for investigating dose and LET effects on cell kill for protons, helium, and carbon ions in high-throughput, high-accuracy cell experiments. Methods: Using the Geant4 Monte Carlo toolkit, we designed 3 custom range compensators to simultaneously expose cancer cells to different doses and LETs from selected portions of pristine ion beams, from the entrance to points just beyond the Bragg peak. To minimize the spread of LET, we utilized mono-energetic uniformly scanned beams at the HIT facility with support from the DKFZ. Using different entrance doses and LETs, a matrix of cell survival data was acquired, leading to a specific RBE matrix. We utilized the standard clonogenic assay for H460 and H1437 lung-cancer cell lines grown in 96-well plates. Using these plates, the data could be acquired in a small number of exposures. The ion-specific compensators were located in a horizontal beam and designed to hold two 96-well plates (12 columns by 8 rows) at an angle of 30° with respect to the beam direction. Results: Using about 20 hours of beam time, a total of about 11,000 wells containing cancer cells could be irradiated. The H460 and H1437 cell lines exhibited a significant dependence on LET when exposed to comparable doses. The results were similar for each of the investigated ion species and indicate the need to incorporate RBE into the ion therapy planning process. Conclusion: The experimental design developed is a viable approach to rapidly acquiring large amounts of accurate in-vitro RBE data. We plan to further improve the design to achieve higher accuracy and throughput, thereby facilitating the irradiation of multiple cell types. The results are indicative of the possibility of developing a new degree of freedom (variable RBE) for future clinical ion therapy optimization. Work supported by the Sister Institute Network Fund (SINF), University of Texas MD Anderson Cancer Center.
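    Clonogenic survival data from such dose/LET matrices are commonly fit to the linear-quadratic model S = exp(-(αD + βD²)), and RBE at a chosen survival level is the ratio of reference dose to ion dose producing that survival. A sketch with hypothetical α and β values, not parameters from this abstract:

```python
import math

def lq_dose_for_survival(s, alpha, beta):
    """Invert S = exp(-(alpha*D + beta*D^2)) for dose D (Gy): solve the
    quadratic beta*D^2 + alpha*D - (-ln S) = 0, positive root."""
    c = -math.log(s)
    return (-alpha + math.sqrt(alpha ** 2 + 4.0 * beta * c)) / (2.0 * beta)

def rbe(s, alpha_ref, beta_ref, alpha_ion, beta_ion):
    """RBE at survival level s: reference dose / ion dose."""
    return (lq_dose_for_survival(s, alpha_ref, beta_ref)
            / lq_dose_for_survival(s, alpha_ion, beta_ion))

# Hypothetical LQ parameters (Gy^-1, Gy^-2): a reference photon beam vs.
# a higher-LET ion beam with a steeper linear component.
print(round(rbe(0.1, 0.2, 0.05, 0.5, 0.05), 2))  # → 1.48
```

Because α and β themselves vary with LET, a matrix of survival curves across dose and LET is exactly what is needed to map the "variable RBE" degree of freedom mentioned in the conclusion.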

  13. RhizoTubes as a new tool for high throughput imaging of plant root development and architecture: test, comparison with pot grown plants and validation.

    PubMed

    Jeudy, Christian; Adrian, Marielle; Baussard, Christophe; Bernard, Céline; Bernaud, Eric; Bourion, Virginie; Busset, Hughes; Cabrera-Bosquet, Llorenç; Cointault, Frédéric; Han, Simeng; Lamboeuf, Mickael; Moreau, Delphine; Pivato, Barbara; Prudent, Marion; Trouvelot, Sophie; Truong, Hoai Nam; Vernoud, Vanessa; Voisin, Anne-Sophie; Wipf, Daniel; Salon, Christophe

    2016-01-01

    In order to maintain high yields while saving water and preserving non-renewable resources, thus limiting the use of chemical fertilizer, it is crucial to select plants with more efficient root systems. This could be achieved through optimization of both root architecture and root uptake ability, and/or through improvement of positive plant interactions with microorganisms in the rhizosphere. However, the development of devices suitable for high-throughput phenotyping of root structures remains a major bottleneck. Here, rhizotrons suitable for plant growth in controlled conditions and for non-invasive image acquisition of plant shoot and root systems (RhizoTubes) are described. These RhizoTubes allow one to six plants, up to 1.1 m in height, to be grown simultaneously for up to 8 weeks, depending on the plant species. Both the shoot and root compartments can be imaged automatically and non-destructively throughout the experiment thanks to an imaging cabin (RhizoCab), which contains robots and imaging equipment for obtaining high-resolution pictures of plant roots. Using this versatile experimental setup, we illustrate how morphometric root traits can be determined for various species, including a model species (Medicago truncatula), crops (Pisum sativum, Brassica napus, Vitis vinifera, Triticum aestivum), and a weed (Vulpia myuros), grown under non-limiting conditions or submitted to various abiotic and biotic constraints. The measurement of root phenotypic traits using this system was compared to that obtained under "classic" growth conditions in pots. The integrated system, to include 1200 RhizoTubes, will allow high-throughput phenotyping of plant shoots and roots under various abiotic and biotic environmental conditions. Our system allows easy visualization or extraction of roots and measurement of root traits for high-throughput or kinetic analyses. The utility of this system for studying root system architecture will greatly facilitate the identification of genetic and environmental determinants of key root traits involved in crop responses to stresses, including interactions with soil microorganisms.

  14. High Throughput PBTK: Open-Source Data and Tools for ...

    EPA Pesticide Factsheets

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  15. Development of Low-cost, High Energy-per-unit-area Solar Cell Modules

    NASA Technical Reports Server (NTRS)

    Jones, G. T.; Chitre, S.; Rhee, S. S.

    1978-01-01

    The development of two hexagonal solar cell process sequences, a laser-scribing technique for scribing hexagonal and modified hexagonal solar cells, a high-throughput diffusion process, and two surface macrostructure processes suitable for large-scale production is reported. Experimental analysis was made on automated spin-on anti-reflective coating equipment and high-pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of optimum silicon utilization by modified hexagonal solar cells.

  16. XUNET experimental high-speed network testbed CRADA 1136, DOE TTI No. 92-MULT-020-B2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, R.E.

    1996-04-01

    XUNET is a research program with AT&T and other partners to study high-speed wide area communication between local area networks over a backbone using Asynchronous Transfer Mode (ATM) switches. Important goals of the project are to develop software techniques for network control and management, and applications for high-speed networks. The project entails building a testbed between member sites to explore performance issues for mixed network traffic such as congestion control, multimedia communications protocols, segmentation and reassembly of ATM cells, and overall data throughput rates.

  17. MAPPI-DAT: data management and analysis for protein-protein interaction data from the high-throughput MAPPIT cell microarray platform.

    PubMed

    Gupta, Surya; De Puysseleyr, Veronic; Van der Heyden, José; Maddelein, Davy; Lemmens, Irma; Lievens, Sam; Degroeve, Sven; Tavernier, Jan; Martens, Lennart

    2017-05-01

    Protein-protein interaction (PPI) studies have dramatically expanded our knowledge about cellular behaviour and development in different conditions. A multitude of high-throughput PPI techniques have been developed to achieve proteome-scale coverage for PPI studies, including the microarray-based Mammalian Protein-Protein Interaction Trap (MAPPIT) system. Because such high-throughput techniques typically report thousands of interactions, managing and analysing the large amounts of acquired data is a challenge. We have therefore built the MAPPIT cell microArray Protein Protein Interaction-Data management & Analysis Tool (MAPPI-DAT) as an automated data management and analysis tool for MAPPIT cell microarray experiments. MAPPI-DAT stores the experimental data and metadata in a systematic and structured way, automates data analysis and interpretation, and enables the meta-analysis of MAPPIT cell microarray data across all stored experiments. MAPPI-DAT is developed in Python, using R for data analysis and MySQL as the data management system. MAPPI-DAT is cross-platform and can be run on Microsoft Windows, Linux, and OS X/macOS. The source code and a Microsoft Windows executable are freely available under the permissive Apache2 open source license at https://github.com/compomics/MAPPI-DAT. jan.tavernier@vib-ugent.be or lennart.martens@vib-ugent.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  18. Genome-wide association study of rice (Oryza sativa L.) leaf traits with a high-throughput leaf scorer.

    PubMed

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Wang, Ke; Jiang, Ni; Feng, Hui; Chen, Guoxing; Liu, Qian; Xiong, Lizhong

    2015-09-01

    Leaves are the plant's solar panels and food factories, and leaf traits are key targets of investigation in plant research. Traditional methods for leaf trait measurement are time-consuming. In this work, an engineering prototype was established for high-throughput leaf scoring (HLS) of a large number of Oryza sativa accessions. The mean absolute percentage errors of HLS versus traditional measurements were below 5% for leaf number, area, shape, and colour. Moreover, HLS can measure up to 30 leaves per minute. To demonstrate the usefulness of HLS in dissecting the genetic bases of leaf traits, a genome-wide association study (GWAS) was performed for 29 leaf traits related to leaf size, shape, and colour at three growth stages, using HLS on a panel of 533 rice accessions. Nine associated loci contained known leaf-related genes, such as Nal1, which controls leaf width. In addition, a total of 73, 123, and 177 new loci were detected for traits associated with leaf size, colour, and shape, respectively. In summary, after evaluating the performance with a large number of rice accessions, the combination of GWAS and high-throughput leaf phenotyping (HLS) has proven a valuable strategy to identify the genetic loci controlling rice leaf traits. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.
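    The accuracy figure above is a mean absolute percentage error of the automated scorer against manual reference measurements. A minimal illustration of that comparison (the function and data are hypothetical, not taken from the HLS software):

```python
def mape(manual, automated):
    """Mean absolute percentage error of automated measurements against
    manual reference values (hypothetical illustration)."""
    if len(manual) != len(automated) or not manual:
        raise ValueError("paired, non-empty measurements required")
    return 100.0 * sum(abs(a - m) / abs(m)
                       for m, a in zip(manual, automated)) / len(manual)
```

    For example, manual leaf areas of [100, 200, 150] cm² against automated estimates of [95, 210, 150] cm² give a MAPE of about 3.3%, comfortably under the 5% threshold the abstract reports.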

  19. A High Throughput In Vivo Assay for Taste Quality and Palatability

    PubMed Central

    Palmer, R. Kyle; Long, Daniel; Brennan, Francis; Buber, Tulu; Bryant, Robert; Salemme, F. Raymond

    2013-01-01

    Taste quality and palatability are two of the most important properties measured in the evaluation of taste stimuli. Human panels can report both aspects, but offer limited experimental flexibility and throughput capacity. Relatively efficient animal models for taste evaluation have been developed, but each of them is designed to measure either taste quality or palatability as independent experimental endpoints. We present here a new apparatus and method for high-throughput quantification of both taste quality and palatability using rats in an operant taste discrimination paradigm. Cohorts of four rats were trained in a modified operant chamber to sample taste stimuli by licking solutions from a 96-well plate that moved in a randomized pattern beneath the chamber floor. As a rat's tongue entered a well, it disrupted a laser beam projecting across the top of the 96-well plate, consequently triggering the presentation of two retractable levers that operated a pellet dispenser. The taste of sucrose was associated with food reinforcement by presses on a sucrose-designated lever, whereas the tastes of water and other basic tastants were associated with the alternative lever. Each disruption of the laser was counted as a lick. Using this procedure, rats were trained to discriminate 100 mM sucrose from water, quinine, citric acid, and NaCl with 90-100% accuracy. Palatability was determined by the number of licks per trial and, owing to intermediate rates of licking for water, was quantifiable along the entire spectrum from appetitiveness to aversiveness. All 96 samples were evaluated within 90-minute test sessions with no evidence of desensitization or fatigue. The technology is capable of generating multiple concentration–response functions within a single session, is suitable for in vivo primary screening of tastant libraries, and potentially can be used to evaluate stimuli for any taste system. PMID:23951319

  20. Application of ToxCast High-Throughput Screening and ...

    EPA Pesticide Factsheets

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors

  1. Developing science gateways for drug discovery in a grid environment.

    PubMed

    Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra

    2016-01-01

    Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, providing the life-sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources becomes an increasing challenge. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol, and provides an easy-to-use graphical user interface for constructing complex workflows that can be executed on distributed computing resources, thus accelerating throughput by several orders of magnitude.

  2. A Prospective Virtual Screening Study: Enriching Hit Rates and Designing Focus Libraries To Find Inhibitors of PI3Kδ and PI3Kγ.

    PubMed

    Damm-Ganamet, Kelly L; Bembenek, Scott D; Venable, Jennifer W; Castro, Glenda G; Mangelschots, Lieve; Peeters, Daniëlle C G; Mcallister, Heather M; Edwards, James P; Disepio, Daniel; Mirzadegan, Taraneh

    2016-05-12

    Here, we report a high-throughput virtual screening (HTVS) study targeting phosphoinositide 3-kinase (both PI3Kγ and PI3Kδ). Our initial HTVS of the Janssen corporate database identified small focused libraries with hit rates at 50% inhibition showing a 50-fold increase over those from an HTS (high-throughput screen). Further, applying constraints based on "chemically intuitive" hydrogen bonds and/or positional requirements resulted in a substantial improvement in the hit rates (versus no constraints) and reduced docking time. While we find that docking scoring functions are not capable of providing a reliable relative ranking of a set of compounds, a prioritization of groups of compounds (e.g., low, medium, and high) does emerge, which allows the chemistry efforts to be quickly focused on the most viable candidates. Thus, this illustrates that it is not always necessary to have a high correlation between a computational score and the experimental data to impact the drug discovery process.

  3. Modeling congenital disease and inborn errors of development in Drosophila melanogaster

    PubMed Central

    Moulton, Matthew J.; Letsou, Anthea

    2016-01-01

    ABSTRACT Fly models that faithfully recapitulate various aspects of human disease and human health-related biology are being used for research into disease diagnosis and prevention. Established and new genetic strategies in Drosophila have yielded numerous substantial successes in modeling congenital disorders or inborn errors of human development, as well as neurodegenerative disease and cancer. Moreover, although our ability to generate sequence datasets continues to outpace our ability to analyze these datasets, the development of high-throughput analysis platforms in Drosophila has provided access through the bottleneck in the identification of disease gene candidates. In this Review, we describe both the traditional and newer methods that are facilitating the incorporation of Drosophila into the human disease discovery process, with a focus on the models that have enhanced our understanding of human developmental disorders and congenital disease. Enviable features of the Drosophila experimental system, which make it particularly useful in facilitating the much anticipated move from genotype to phenotype (understanding and predicting phenotypes directly from the primary DNA sequence), include its genetic tractability, the low cost for high-throughput discovery, and a genome and underlying biology that are highly evolutionarily conserved. In embracing the fly in the human disease-gene discovery process, we can expect to speed up and reduce the cost of this process, allowing experimental scales that are not feasible and/or would be too costly in higher eukaryotes. PMID:26935104

  4. A generalized method for high throughput in-situ experiment data analysis: An example of battery materials exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aoun, Bachir; Yu, Cun; Fan, Longlong

    A generalized method is introduced to extract critical information from a series of ranked, correlated data. The method is generally applicable to all types of spectra evolving as a function of any arbitrary parameter. The approach is based on correlation functions and a statistical scedasticity formalism, and it can tackle numerous challenges in analyzing high-throughput experimental data. We applied this method to understand the reactivity pathway and formation mechanism of a Li-ion battery cathode material during high-temperature synthesis, using in-situ high-energy X-ray diffraction. We demonstrate that Pearson's correlation function can easily unravel all major phase transitions and, more importantly, the minor structural changes which cannot be revealed by conventionally inspecting the series of diffraction patterns. Furthermore, a two-dimensional (2D) reactivity pattern, calculated as the scedasticity along all measured reciprocal space of all successive diffraction-pattern pairs, clearly unveils the structural evolution path and the active areas of interest during the synthesis. The methods described here can be readily used for on-the-fly data analysis during various in-situ and operando experiments, in order to quickly evaluate and optimize experimental conditions, as well as for post-experiment data analysis and large-scale data mining, where the considerable amount of data makes investigation through point-by-point inspection infeasible.
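    The core of the approach can be sketched in a few lines: Pearson's correlation between successive patterns flags the frames where the structure changes, and a per-point variance of successive pairs gives a simple scedasticity-like 2D map. This is an illustrative reconstruction from the abstract, not the authors' code:

```python
import numpy as np

def successive_pearson(patterns):
    """Pearson correlation between each successive pair of 1D patterns.
    A dip in the correlation flags a structural change between frames."""
    return np.array([np.corrcoef(a, b)[0, 1]
                     for a, b in zip(patterns[:-1], patterns[1:])])

def reactivity_map(patterns):
    """Per-point variance of each successive pair of patterns: a simple
    scedasticity-like 2D map whose bright regions mark the parts of the
    measured axis that change between frames."""
    p = np.asarray(patterns, dtype=float)
    return np.stack([p[:-1], p[1:]]).var(axis=0)

# Toy series: frames 1-2 and 3-4 are nearly identical, but the "structure"
# transforms between frames 2 and 3, so only r[1] drops.
x = np.linspace(0.0, 6.0, 200)
frames = [np.sin(x), np.sin(x) + 0.01, np.cos(x), np.cos(x) + 0.01]
r = successive_pearson(frames)   # low value at index 1 marks the change
rmap = reactivity_map(frames)    # shape (n_frames - 1, n_points)
```

    In a real in-situ run the same two functions would be applied to the stream of diffraction patterns on the fly, so a drop in the successive correlation can trigger closer inspection of that temperature window.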

  5. Tempest: GPU-CPU computing for high-throughput database spectral matching.

    PubMed

    Milloy, Jeffrey A; Faherty, Brendan K; Gerber, Scott A

    2012-07-06

    Modern mass spectrometers are now capable of producing hundreds of thousands of tandem (MS/MS) spectra per experiment, making the translation of these fragmentation spectra into peptide matches a common bottleneck in proteomics research. When shotgun sequencing is coupled with experimental designs that enrich for post-translational modifications such as phosphorylation and/or include isotopically labeled amino acids for quantification, additional burdens are placed on this computational infrastructure. To address this issue, we have developed a new database searching program that utilizes the massively parallel compute capabilities of a graphics processing unit (GPU) to produce peptide spectral matches in a very high-throughput fashion. Our program, named Tempest, combines efficient database digestion and MS/MS spectral indexing on a CPU with fast similarity scoring on a GPU. In our implementation, the entire similarity score, including the generation of full theoretical peptide candidate fragmentation spectra and their comparison to experimental spectra, is conducted on the GPU. Although Tempest uses the classical SEQUEST XCorr score as a primary metric for evaluating similarity for spectra collected at unit resolution, we have developed a new "Accelerated Score" for MS/MS spectra collected at high resolution that is based on a computationally inexpensive dot product but exhibits scoring accuracy similar to that of the classical XCorr. In our experience, Tempest provides compute-cluster-level performance in an affordable desktop computer.
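    The dot-product style of scoring mentioned above is easy to illustrate on a CPU. A minimal sketch (the bin width, m/z range, and function names are illustrative assumptions, not Tempest's actual implementation):

```python
import numpy as np

def bin_spectrum(mz, intensity, bin_width=0.02, mz_max=2000.0):
    """Bin a peak list onto a fixed m/z grid and scale to unit length,
    so the dot product of two binned spectra falls in [0, 1]."""
    vec = np.zeros(int(mz_max / bin_width))
    idx = (np.asarray(mz) / bin_width).astype(int)
    keep = idx < len(vec)
    np.add.at(vec, idx[keep], np.asarray(intensity)[keep])  # sum co-binned peaks
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def dot_score(experimental, theoretical):
    """Normalized dot product: 1.0 for identical spectra, 0.0 when no
    fragment peaks fall into common bins."""
    return float(experimental @ theoretical)
```

    On a GPU the same score becomes one batched matrix-vector product over all candidate spectra at once, which is what makes this style of scoring so fast.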

  6. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.

    2004-05-12

    An integrated controlling system and a unified database for high-throughput protein crystallography experiments have been developed. The main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored in a MySQL relational database, except for the raw X-ray data, which are stored on a central data server. The database contains four mutually linked hierarchical trees describing protein crystals, the data collection of protein crystals, and experimental data processing. A database editor was designed and developed; it supports basic database functions to view, create, modify, and delete user records in the database. Two search engines were implemented: direct search of the necessary information in the database and object-oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with a secure connection) and an indirect method (using a secure SSL connection with secure X11 support, from any operating system with X-terminal and SSH support). Part of the system has been implemented on a new MAD beamline, NW12, at the Photon Factory Advanced Ring for general user experiments.

  7. Probabilistic cross-link analysis and experiment planning for high-throughput elucidation of protein structure.

    PubMed

    Ye, Xiaoduan; O'Neil, Patrick K; Foster, Adrienne N; Gajda, Michal J; Kosinski, Jan; Kurowski, Michal A; Bujnicki, Janusz M; Friedman, Alan M; Bailey-Kellogg, Chris

    2004-12-01

    Emerging high-throughput techniques for the characterization of protein and protein-complex structures yield noisy data with sparse information content, placing a significant burden on computation to properly interpret the experimental data. One such technique uses cross-linking (chemical or by cysteine oxidation) to confirm or select among proposed structural models (e.g., from fold recognition, ab initio prediction, or docking) by testing the consistency between cross-linking data and model geometry. This paper develops a probabilistic framework for analyzing the information content in cross-linking experiments, accounting for anticipated experimental error. The framework supports a mechanism for planning experiments to optimize the information gained. We evaluate potential experiment plans using explicit trade-offs among key properties of practical importance: discriminability, coverage, balance, ambiguity, and cost. We devise a greedy algorithm that considers those properties and, from a large number of combinatorial possibilities, rapidly selects sets of experiments expected to discriminate pairs of models efficiently. In an application to residue-specific chemical cross-linking, we demonstrate the ability of our approach to plan experiments effectively involving combinations of cross-linkers and introduced mutations. We also describe an experiment plan for the bacteriophage lambda Tfa chaperone protein in which we plan dicysteine mutants for discriminating threading models by disulfide formation. Preliminary results from a subset of the planned experiments are consistent and demonstrate the practicality of planning. Our methods provide the experimenter with a valuable tool (available from the authors) for understanding and optimizing cross-linking experiments.
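    The greedy selection step can be sketched as a cost-weighted set-cover heuristic. The data structures and names below are hypothetical, and the authors' tool balances several further criteria (coverage, balance, ambiguity) that this sketch collapses into a single gain-per-cost score:

```python
def greedy_plan(experiments, model_pairs, budget):
    """Pick experiments that discriminate the most still-ambiguous model
    pairs per unit cost, until the budget or the pair set is exhausted.

    experiments: {name: (cost, set of model pairs the experiment separates)}
    model_pairs: set of all model pairs that must be discriminated
    """
    remaining = set(model_pairs)
    chosen, spent = [], 0.0
    while remaining:
        best, best_gain = None, 0.0
        for name, (cost, covers) in experiments.items():
            if name in chosen or spent + cost > budget:
                continue
            gain = len(covers & remaining) / cost  # pairs resolved per unit cost
            if gain > best_gain:
                best, best_gain = name, gain
        if best is None:  # nothing affordable still helps
            break
        chosen.append(best)
        spent += experiments[best][0]
        remaining -= experiments[best][1]
    return chosen, remaining
```

    Returning the still-undiscriminated pairs alongside the chosen set lets the experimenter see immediately which model ambiguities the budget cannot resolve.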

  8. Identification of widespread adenosine nucleotide binding in Mycobacterium tuberculosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansong, Charles; Ortega, Corrie; Payne, Samuel H.

    The annotation of protein function is almost completely performed by in silico approaches. However, computational prediction of protein function is frequently incomplete and error prone. In Mycobacterium tuberculosis (Mtb), ~25% of all genes have no predicted function and are annotated as hypothetical proteins. This lack of functional information severely limits our understanding of Mtb pathogenicity. Current tools for experimental functional annotation are limited and often do not scale to entire protein families. Here, we report a generally applicable chemical biology platform to functionally annotate bacterial proteins by combining activity-based protein profiling (ABPP) and quantitative LC-MS-based proteomics. As an example of this approach for high-throughput protein functional validation and discovery, we experimentally annotate the families of ATP-binding proteins in Mtb. Our data experimentally validate prior in silico predictions of >250 ATPases and adenosine nucleotide-binding proteins, and reveal 73 hypothetical proteins as novel ATP-binding proteins. We identify adenosine cofactor interactions with many hypothetical proteins containing a diversity of unrelated sequences, providing a new and expanded view of adenosine nucleotide binding in Mtb. Furthermore, many of these hypothetical proteins are both unique to Mycobacteria and essential for infection, suggesting specialized functions in mycobacterial physiology and pathogenicity. Thus, we provide a generally applicable approach for high-throughput protein function discovery and validation, and highlight several ways in which the application of activity-based proteomics data can improve the quality of functional annotations to facilitate novel biological insights.

  9. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    PubMed

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland (FIMM) was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high-throughput screening. The initial focus of the unit was multiwell-plate-based chemical screening and high-content microarray-based siRNA screening. Over the first four years of operation, however, the unit has moved to a more flexible service platform where both chemical and siRNA screens are performed at different scales, primarily in multiwell-plate-based assays with a wide range of readout possibilities and a focus on ultraminiaturization to allow affordable screening for academic users. In addition to high-throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed, and high-throughput applications for other types of research, such as genomics, sequencing, and biobanking operations. Importantly, given the translational research goals at FIMM, an increasing part of the operations at the HTB unit is focused on high-throughput systems-biology platforms for functional profiling of patient cells in personalized and precision medicine projects.

  10. High Throughput Screening For Hazard and Risk of Environmental Contaminants

    EPA Science Inventory

    High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...

  11. Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.

    PubMed

    Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani

    2007-03-01

    Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may, for example, point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that, if understood, could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer whether an interesting collection of genes is 'enriched' for a particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine if conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to those of the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
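    The conventional per-class test that the abstract questions is typically a hypergeometric (one-sided Fisher) test applied to each GO class as if the classes were independent. A minimal sketch of that conventional test:

```python
from math import comb

def enrichment_pvalue(k, n, K, N):
    """Hypergeometric upper-tail p-value for a single GO class: the
    probability of seeing >= k annotated genes when n genes are drawn
    from a universe of N genes, K of which carry the annotation.

    Testing each class this way ignores the between-class dependence
    induced by genes annotated to several overlapping classes, which is
    exactly the assumption the paper examines.
    """
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)
```

    For example, finding all 5 genes of a 5-gene hit list inside a 10-gene class drawn from a 100-gene universe gives a p-value of roughly 3e-6, i.e. strong nominal enrichment under the independence assumption.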

  12. A high volume cost efficient production macrostructuring process. [for silicon solar cell surface treatment

    NASA Technical Reports Server (NTRS)

    Chitre, S. R.

    1978-01-01

    The paper presents an experimentally developed surface macrostructuring process suitable for high-volume production of silicon solar cells. The process lends itself easily to automation for high throughput to meet low-cost solar array goals. The tetrahedron structures observed are 0.5-12 microns high. The surface has minimal pitting, with virtually no or very few undeveloped areas across the surface. This process has been developed for (100)-oriented as-cut silicon. Chemi-etched, hydrophobic, and lapped surfaces were successfully texturized. A cost analysis per SAMICS is presented.

  13. Predicting the Oxygen-Binding Properties of Platinum Nanoparticle Ensembles by Combining High-Precision Electron Microscopy and Density Functional Theory.

    PubMed

    Aarons, Jolyon; Jones, Lewys; Varambhia, Aakash; MacArthur, Katherine E; Ozkaya, Dogan; Sarwar, Misbah; Skylaris, Chris-Kriton; Nellist, Peter D

    2017-07-12

    Many studies of heterogeneous catalysis, both experimental and computational, make use of idealized structures such as extended surfaces or regular polyhedral nanoparticles. This simplification neglects the morphological diversity in real commercial oxygen reduction reaction (ORR) catalysts used in fuel-cell cathodes. Here we introduce an approach that combines 3D nanoparticle structures obtained from high-throughput high-precision electron microscopy with density functional theory. Discrepancies between experimental observations and cuboctahedral/truncated-octahedral particles are revealed and discussed using a range of widely used descriptors, such as electron-density, d-band centers, and generalized coordination numbers. We use this new approach to determine the optimum particle size for which both detrimental surface roughness and particle shape effects are minimized.
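    One of the descriptors named above, the generalized coordination number, is straightforward to compute from a particle's atomic coordinates. A sketch following the usual convention (fcc bulk maximum of 12; the cutoff and function names are illustrative assumptions, not the authors' code):

```python
import numpy as np

def coordination_numbers(coords, cutoff):
    """Count neighbors within a distance cutoff for every atom."""
    coords = np.asarray(coords, dtype=float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    neighbors = (d < cutoff) & (d > 0.0)  # exclude self-distances
    return neighbors.sum(axis=1), neighbors

def generalized_cn(coords, cutoff, cn_bulk=12):
    """Generalized coordination number: each neighbor contributes its own
    coordination number divided by the bulk maximum (12 for fcc)."""
    cn, neighbors = coordination_numbers(coords, cutoff)
    return np.array([cn[neighbors[i]].sum() / cn_bulk
                     for i in range(len(cn))])
```

    Applied to an experimentally imaged particle, atoms with low generalized coordination sit at edges and rough patches, which is how the descriptor links the 3D microscopy reconstructions to site-resolved oxygen binding.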

  14. Adaptation to high throughput batch chromatography enhances multivariate screening.

    PubMed

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
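    Treating every well as its own multivariate experiment amounts to mapping a full-factorial condition grid onto plate positions. A hypothetical sketch (the IBC method itself also staggers the load steps in time, which is not modeled here):

```python
from itertools import product

def plate_layout(loads, washes, elutions):
    """Map a full-factorial grid of (load, wash, elution) conditions onto
    96-well positions A1..H12, one independent experiment per well."""
    wells = [f"{row}{col}" for row in "ABCDEFGH" for col in range(1, 13)]
    conditions = list(product(loads, washes, elutions))
    if len(conditions) > len(wells):
        raise ValueError("design exceeds a single 96-well plate")
    return dict(zip(wells, conditions))
```

    A 4 x 2 x 3 design, for instance, fills 24 wells and leaves the rest of the plate free for replicates, so interactions between load and buffer factors are characterized in one run instead of follow-up experiments.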

  15. Extruder system for high-throughput/steady-state hydrogen ice supply and application for pellet fueling of reactor-scale fusion experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combs, S.K.; Foust, C.R.; Qualls, A.L.

    Pellet injection systems for the next-generation fusion devices, such as the proposed International Thermonuclear Experimental Reactor (ITER), will require feed systems capable of providing a continuous supply of hydrogen ice at high throughputs. A straightforward concept in which multiple extruder units operate in tandem has been under development at the Oak Ridge National Laboratory. A prototype with three large-volume extruder units has been fabricated and tested in the laboratory. In experiments, it was found that each extruder could provide volumetric ice flow rates of up to ~1.3 cm³/s (for ~10 s), which is sufficient for fueling fusion reactors at the gigawatt power level. With the three extruders of the prototype operating in sequence, a steady rate of ~0.33 cm³/s was maintained for a duration of 1 h. Even steady-state rates approaching the full ITER design value (~1 cm³/s) may be feasible with the prototype. However, additional extruder units (1–3) would facilitate operations at the higher throughputs and reduce the duty cycle of each unit. The prototype can easily accommodate steady-state pellet fueling of present large tokamaks or other near-term plasma experiments.
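
    The reported figures can be sanity-checked with a simple duty-cycle estimate, assuming the steady throughput is shared equally among identical units, which is an idealization of the tandem operation described above:

```python
def duty_cycle(steady_rate_cm3_s, n_units, burst_rate_cm3_s=1.3):
    """Average fraction of time each extruder must run to sustain the
    steady throughput, assuming n identical units share the load equally
    and each extrudes at its burst rate while active (an idealization)."""
    return steady_rate_cm3_s / (n_units * burst_rate_cm3_s)

# demonstrated: ~0.33 cm^3/s sustained by 3 units at ~1.3 cm^3/s bursts
print(round(duty_cycle(0.33, 3), 3))   # 0.085
# ITER design value ~1 cm^3/s: extra units (here 6 total) lower per-unit load
print(round(duty_cycle(1.0, 6), 3))    # 0.128
```

    This is consistent with the abstract's note that adding 1–3 extruder units would reduce the duty cycle of each unit at higher throughputs.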

  16. PhenStat | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    PhenStat is a freely available R package that provides a variety of statistical methods for the identification of phenotypic associations in model organisms, developed for the International Mouse Phenotyping Consortium (IMPC at www.mousephenotype.org). The methods were developed for high throughput phenotyping pipelines implemented across various experimental designs, with an emphasis on managing temporal variation, and the package is being adapted for analyses with PDX mouse strains.

  17. Content Is King: Databases Preserve the Collective Information of Science.

    PubMed

    Yates, John R

    2018-04-01

    Databases store sequence information experimentally gathered to create resources that further science. In the last 20 years, databases have become critical components of fields like proteomics, where they provide the basis for large-scale and high-throughput proteomic informatics. Amos Bairoch, winner of the Association of Biomolecular Resource Facilities Frederick Sanger Award, has created some of the important databases that proteomic research depends upon for accurate interpretation of data.

  18. Autonomy in Materials Research: A Case Study in Carbon Nanotube Growth (Postprint)

    DTIC Science & Technology

    2016-10-21

    built an Autonomous Research System (ARES)—an autonomous research robot capable of first-of-its-kind closed-loop iterative materials experimentation...ARES exploits advances in autonomous robotics, artificial intelligence, data sciences, and high-throughput and in situ techniques, and is able to...roles of humans and autonomous research robots, and for human-machine partnering. We believe autonomous research robots like ARES constitute a

  19. Controlled Electrospray Generation of Nonspherical Alginate Microparticles.

    PubMed

    Jeyhani, Morteza; Mak, Sze Yi; Sammut, Stephen; Shum, Ho Cheung; Hwang, Dae Kun; Tsai, Scott S H

    2017-12-11

    Electrospraying is a technique used to generate microparticles in a high-throughput manner. For biomedical applications, a biocompatible electrosprayed material is often desirable. Using polymers such as alginate hydrogels makes it possible to create biocompatible and biodegradable microparticles that can be used for cell encapsulation, employed as drug carriers, and used in 3D cell culturing. Evidence in the literature suggests that the morphology of biocompatible microparticles is relevant in controlling the dynamics of the microparticles in drug delivery and 3D cell culturing applications. Yet most electrospray-based techniques only form spherical microparticles, and there is currently no widely adopted technique for producing nonspherical microparticles at high throughput. Here, we demonstrate the generation of nonspherical biocompatible alginate microparticles by electrospraying, and control the shape of the microparticles by varying experimental parameters such as chemical concentration and the distance between the electrospray tip and the particle-solidification bath. Importantly, we show that these changes to the experimental setup enable the synthesis of different particle shapes, and that systematic changes in parameters, such as chemical concentration, result in monotonic changes to the particle aspect ratio. We expect that these results will find utility in many biomedical applications that require biocompatible microparticles of specific shapes. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Quantification of locomotor activity in larval zebrafish: considerations for the design of high-throughput behavioral studies

    PubMed Central

    Ingebretson, Justin J.; Masino, Mark A.

    2013-01-01

    High-throughput behavioral studies using larval zebrafish often assess locomotor activity to determine the effects of experimental perturbations. However, the results reported by different groups are difficult to compare because there is not a standardized experimental paradigm or measure of locomotor activity. To address this, we investigated the effects that several factors, including the stage of larval development and the physical dimensions (depth and diameter) of the behavioral arena, have on the locomotor activity produced by larval zebrafish. We provide evidence for differences in locomotor activity between larvae at different stages and when recorded in wells of different depths, but not in wells of different diameters. We also show that the variability for most properties of locomotor activity is less for older than younger larvae, which is consistent with previous reports. Finally, we show that conflicting interpretations of activity level can occur when activity is assessed with a single measure of locomotor activity. Thus, we conclude that although a combination of factors should be considered when designing behavioral experiments, the use of older larvae in deep wells will reduce the variability of locomotor activity, and that multiple properties of locomotor activity should be measured to determine activity level. PMID:23772207

  1. Sensitivity and accuracy of high-throughput metabarcoding methods for early detection of invasive fish species

    NASA Astrophysics Data System (ADS)

    Hatzenbuhler, Chelsea; Kelly, John R.; Martinson, John; Okum, Sara; Pilgrim, Erik

    2017-04-01

    High-throughput DNA metabarcoding has gained recognition as a potentially powerful tool for biomonitoring, including early detection of aquatic invasive species (AIS). DNA-based techniques are advancing, but our understanding of the limits to detection for metabarcoding complex samples is inadequate. For detecting AIS at an early stage of invasion, when the species is rare, accuracy at low detection limits is key. To evaluate the utility of metabarcoding in future fish community monitoring programs, we conducted several experiments to determine the sensitivity and accuracy of routine metabarcoding methods. Experimental mixes used larval fish tissue from multiple “common” species spiked with varying proportions of tissue from an additional “rare” species. Pyrosequencing of the genetic marker COI (cytochrome c oxidase subunit I) and subsequent sequence data analysis provided experimental evidence of low-level detection of the target “rare” species at biomass percentages as low as 0.02% of total sample biomass. Limits to detection varied interspecifically and were susceptible to amplification bias. Moreover, results showed that some data processing methods can skew sequence-based biodiversity measurements away from the corresponding relative biomass abundances and increase false absences. We suggest caution in interpreting presence/absence and relative abundance in larval fish assemblages until metabarcoding methods are optimized for accuracy and precision.
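
    Why 0.02% is a demanding target can be illustrated with an idealized read-count expectation, assuming reads are proportional to biomass with a per-species bias factor; the study shows real data often deviate from the no-bias case:

```python
def expected_reads(total_reads, biomass_fraction, amplification_bias=1.0):
    """Idealized expected read count for a rare taxon: proportional to its
    biomass fraction, scaled by a per-species amplification bias factor
    (1.0 = no bias). Real metabarcoding data deviate from this model."""
    return total_reads * biomass_fraction * amplification_bias

# a taxon at 0.02% of sample biomass in a run yielding 100,000 reads
print(expected_reads(100_000, 0.0002))                          # 20.0
# the same taxon under a 0.5x amplification bias
print(expected_reads(100_000, 0.0002, amplification_bias=0.5))  # 10.0
```

    A handful of expected reads leaves little margin for filtering thresholds or bias, which is why detection limits vary interspecifically.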

  2. 04-ERD-052-Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loots, G G; Ovcharenko, I; Collette, N

    2007-02-26

    Generating the sequence of the human genome represents a colossal achievement for science and mankind. The information from the human genome project holds great promise to cure disease, prevent bioterror threats, and teach us about human origins. Yet converting the sequence data into biologically meaningful information has not been immediately obvious, and we are still in the preliminary stages of understanding how the genome is organized, what its functional building blocks are, and how these sequences mediate complex biological processes. The overarching goal of this program was to develop novel methods and high throughput strategies for determining the functions of ''anonymous'' human genes that are evolutionarily deeply conserved in other vertebrates. We coupled analytical tool development and computational predictions regarding gene function with novel high throughput experimental strategies and tested biological predictions in the laboratory. The tools required for comparative genomic data-mining are fundamentally the same whether they are applied to scientific studies of related microbes or the search for functions of novel human genes. For this reason the tools, conceptual framework, and coupled informatics-experimental biology paradigm developed in this LDRD have many potential scientific applications relevant to LLNL multidisciplinary research in bio-defense, bioengineering, bionanosciences, and microbial and environmental genomics.

  3. Robo-Lector – a novel platform for automated high-throughput cultivations in microtiter plates with high information content

    PubMed Central

    Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen

    2009-01-01

    Background In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only ± 7%. The third method, 'biomass-specific replication', enabled the generation of equal initial biomass concentrations in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. Conclusion The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization. PMID:19646274
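
    The 'induction profiling' layout (96 wells = 8 induction times × 12 IPTG levels spanning 0 to 1.5 mM) can be sketched as below; the even level spacing and hourly induction times are assumptions for illustration, since the abstract states only the range and the number of times:

```python
# 96 induction experiments: 8 induction times x 12 IPTG concentrations.
n_times, n_levels = 8, 12

# Evenly spaced IPTG levels from 0 to 1.5 mM (spacing is an assumption).
iptg_mM = [round(1.5 * i / (n_levels - 1), 3) for i in range(n_levels)]
# Hypothetical hourly induction time points.
induction_times_h = list(range(n_times))

design = [(t, c) for t in induction_times_h for c in iptg_mM]
assert len(design) == 96  # one condition per microtiter well
print(design[0], design[-1])  # (0, 0.0) (7, 1.5)
```

    Running all 96 conditions in one automated plate is what lets the platform characterize inducer-time interactions in a single experiment.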

  4. High Throughput Transcriptomics: From screening to pathways

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  5. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    NASA Astrophysics Data System (ADS)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, in portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy for discovering new battery materials and optimizing the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only bulk capacity, voltage, and volume change were considered. It is important to include more structure-property relationships, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening workflow is proposed to obtain high-performance battery materials.
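
    For the capacity criterion mentioned above, bulk screens typically evaluate the standard theoretical gravimetric capacity, Q = nF/(3.6·M) in mAh/g. A minimal sketch for the well-known cathode LiFePO4 (the specific material is our example, not the review's):

```python
F = 96485.33  # Faraday constant, C/mol

def theoretical_capacity_mAh_g(n_electrons, molar_mass_g_mol):
    """Theoretical gravimetric capacity used in bulk screening:
    Q = n * F / (3.6 * M), in mAh per gram of active material."""
    return n_electrons * F / (3.6 * molar_mass_g_mol)

# LiFePO4: one-electron Fe2+/Fe3+ redox couple, M ~ 157.76 g/mol
print(round(theoretical_capacity_mAh_g(1, 157.76), 1))  # 169.9 (~170 mAh/g)
```

    The review's argument is that such bulk quantities alone are insufficient, and that defect, surface/interface, doping, and nanosize descriptors must enter the screen as well.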

  6. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations.

    PubMed

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, in portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy for discovering new battery materials and optimizing the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only bulk capacity, voltage, and volume change were considered. It is important to include more structure-property relationships, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening workflow is proposed to obtain high-performance battery materials.

  7. Adaptive Packet Combining Scheme in Three State Channel Model

    NASA Astrophysics Data System (ADS)

    Saring, Yang; Bulo, Yaka; Bhunia, Chandan Tilak

    2018-01-01

    The two popular packet-combining-based error correction schemes are the Packet Combining (PC) scheme and the Aggressive Packet Combining (APC) scheme. Each has its own merits and demerits: the PC scheme has better throughput than the APC scheme but suffers from a higher packet error rate. The state of a wireless channel changes continually; because of this random, time-varying nature, applying the SR ARQ, PC, or APC scheme individually cannot deliver the desired throughput. Better throughput can be achieved if the transmission scheme is chosen to match the channel condition. Based on this approach, an adaptive packet combining scheme is proposed that switches among the PC, APC, and SR ARQ schemes according to the channel condition. Experimentally, the error correction capability and throughput of the proposed scheme were observed to be significantly better than those of the SR ARQ, PC, and APC schemes applied individually.
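
    The adaptive idea reduces to a dispatcher that maps an estimated channel state to a transmission scheme. The three state labels and the mapping below are an illustrative sketch of that idea, not the paper's exact decision rule:

```python
def select_scheme(channel_state):
    """Toy dispatcher for an adaptive scheme in a three-state channel model.
    The states and mapping are illustrative assumptions."""
    mapping = {
        "good": "SR ARQ",   # low error rate: plain selective-repeat ARQ suffices
        "medium": "PC",     # moderate errors: packet combining
        "bad": "APC",       # high error rate: aggressive packet combining
    }
    return mapping[channel_state]

print(select_scheme("good"), select_scheme("bad"))  # SR ARQ APC
```

    The payoff claimed in the abstract comes from avoiding the per-scheme weaknesses: PC's higher error rate in bad channels and APC's throughput cost in good ones.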

  8. Chapter 15: Disease Gene Prioritization

    PubMed Central

    Bromberg, Yana

    2013-01-01

    Disease-causing aberrations in the normal function of a gene define that gene as a disease gene. Proving a causal link between a gene and a disease experimentally is expensive and time-consuming. Comprehensive prioritization of candidate genes prior to experimental testing drastically reduces the associated costs. Computational gene prioritization is based on various pieces of correlative evidence that associate each gene with the given disease and suggest possible causal links. A fair amount of this evidence comes from high-throughput experimentation. Thus, well-developed methods are necessary to reliably deal with the quantity of information at hand. Existing gene prioritization techniques already significantly improve the outcomes of targeted experimental studies. Faster and more reliable techniques that account for novel data types are necessary for the development of new diagnostics, treatments, and cures for many diseases. PMID:23633938

  9. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  10. Evaluation of Sequencing Approaches for High-Throughput Transcriptomics - (BOSC)

    EPA Science Inventory

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. The generation of high-throughput global gene expression...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siol, Sebastian; Dhakal, Tara P.; Gudavalli, Ganesh S.

    High-throughput computational and experimental techniques have been used in the past to accelerate the discovery of promising new solar cell materials. An important part of the development of novel thin film solar cell technologies, still considered a bottleneck for both theory and experiment, is the search for alternative interfacial contact (buffer) layers. The research and development of contact materials is difficult due to the inherent complexity that arises from interactions at the interface with the absorber. A promising alternative to the commonly used CdS buffer layer in thin film solar cells whose absorbers have lower electron affinity can be found in β-In2S3. However, the synthesis conditions for the sputter deposition of this material are not well established. Here, In2S3 is investigated as a solar cell contact material utilizing a high-throughput combinatorial screening of the temperature-flux parameter space, followed by a number of spatially resolved characterization techniques. It is demonstrated that, by tuning the sulfur partial pressure, phase-pure β-In2S3 could be deposited over a broad range of substrate temperatures between 500 degrees C and ambient temperature. Combinatorial photovoltaic device libraries with Al/ZnO/In2S3/Cu2ZnSnS4/Mo/SiO2 structure were built at optimal processing conditions to investigate the feasibility of the sputtered In2S3 buffer layers and of an accelerated optimization of the device structure. The performance of the resulting In2S3/Cu2ZnSnS4 photovoltaic devices is on par with CdS/Cu2ZnSnS4 reference solar cells, with similar short-circuit currents and open-circuit voltages, despite the overall quite low efficiency of the devices (~2%). Overall, these results demonstrate how a high-throughput experimental approach can be used to accelerate the development of contact materials and facilitate the optimization of thin film solar cell devices.

  12. Secure UNIX socket-based controlling system for high-throughput protein crystallography experiments.

    PubMed

    Gaponov, Yurii; Igarashi, Noriyuki; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Suzuki, Mamoru; Kosuge, Takashi; Wakatsuki, Soichi

    2004-01-01

    A control system for high-throughput protein crystallography experiments has been developed based on multilevel secure (SSL v2/v3) UNIX sockets under the Linux operating system. The main stages of protein crystallography experiments (purification, crystallization, loop preparation, data collection, data processing) are handled by the software. All information necessary to perform protein crystallography experiments is stored in a relational database (MySQL), except raw X-ray data, which are stored on a network file server. The system consists of several servers and clients. TCP/IP secure UNIX sockets with four predefined behaviors [(a) listening for a request followed by a reply, (b) sending a request and waiting for a reply, (c) listening for a broadcast message, and (d) sending a broadcast message] support communication between all servers and clients, allowing one to control experiments, view data, edit experimental conditions, and perform data processing remotely. The interface software is well suited to developing well-organized control software with a hierarchical structure of different software units (Gaponov et al., 1998) that pass and receive different types of information. All communication is divided into two parts: low and top levels. Large and complicated control tasks are split into several smaller ones, which can be processed by control clients independently. For communicating with experimental equipment (beamline optical elements, robots, specialized experimental equipment, etc.), the STARS server, developed at the Photon Factory, is used (Kosuge et al., 2002). The STARS server allows any application with an open socket to be connected with any other clients that control experimental equipment. The majority of the source code is written in C/C++. GUI modules of the system were built mainly using the Glade user interface builder for GTK+ and GNOME under the Red Hat Linux 7.1 operating system.
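
    Behaviors (a) and (b) amount to a blocking request/reply exchange over a connected UNIX-domain socket. A minimal sketch, without the SSL layer or database, using a socketpair in place of a real client/server connection (the message payload is invented for illustration):

```python
import socket
import threading

def serve_once(conn):
    """Behavior (a): listen for one request, then send a reply."""
    request = conn.recv(1024)
    conn.sendall(b"ACK:" + request)

def request_reply(conn, payload):
    """Behavior (b): send a request and block waiting for the reply."""
    conn.sendall(payload)
    return conn.recv(1024)

# A socketpair stands in for a connected UNIX-domain client/server pair;
# the real system layers SSL and a MySQL-backed experiment store on top.
server_sock, client_sock = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)
t = threading.Thread(target=serve_once, args=(server_sock,))
t.start()
reply = request_reply(client_sock, b"GET crystal_conditions")
t.join()
print(reply)  # b'ACK:GET crystal_conditions'
```

    Behaviors (c) and (d) generalize this to one-to-many broadcast messages, which is how status updates reach multiple clients at once.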

  13. High Throughput Determination of Critical Human Dosing Parameters (SOT)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...

  14. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  15. Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos

    EPA Science Inventory

    Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...

  16. Empirical analysis of RNA robustness and evolution using high-throughput sequencing of ribozyme reactions.

    PubMed

    Hayden, Eric J

    2016-08-15

    RNA molecules provide a realistic but tractable model of a genotype to phenotype relationship. This relationship has been extensively investigated computationally using secondary structure prediction algorithms. Enzymatic RNA molecules, or ribozymes, offer access to genotypic and phenotypic information in the laboratory. Advancements in high-throughput sequencing technologies have enabled the analysis of sequences in the lab that now rivals what can be accomplished computationally. This has motivated a resurgence of in vitro selection experiments and opened new doors for the analysis of the distribution of RNA functions in genotype space. A body of computational experiments has investigated the persistence of specific RNA structures despite changes in the primary sequence, and how this mutational robustness can promote adaptations. This article summarizes recent approaches that were designed to investigate the role of mutational robustness during the evolution of RNA molecules in the laboratory, and presents theoretical motivations, experimental methods and approaches to data analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
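
    Mutational robustness in these studies is typically quantified as the fraction of one-point mutants that preserve the phenotype. A minimal sketch with the genotype-to-phenotype map left abstract; the toy map below is purely illustrative, standing in for a secondary-structure predictor:

```python
def neighbors(seq, alphabet="ACGU"):
    """All one-point mutants of an RNA sequence."""
    for i, base in enumerate(seq):
        for b in alphabet:
            if b != base:
                yield seq[:i] + b + seq[i + 1:]

def mutational_robustness(seq, phenotype):
    """Fraction of one-mutant neighbors sharing the sequence's phenotype.
    `phenotype` is any genotype->phenotype map; in the RNA studies it would
    be a secondary-structure prediction, here it is left abstract."""
    target = phenotype(seq)
    muts = list(neighbors(seq))
    return sum(phenotype(m) == target for m in muts) / len(muts)

# Toy phenotype: GC content rounded to one decimal (illustration only).
toy = lambda s: round((s.count("G") + s.count("C")) / len(s), 1)
print(mutational_robustness("GGCCAAUU", toy))  # 8 of 24 neighbors neutral -> 1/3
```

    High-throughput sequencing of ribozyme reaction products lets this fraction be measured empirically rather than predicted, which is the advance the article reviews.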

  17. RootGraph: a graphic optimization tool for automated image analysis of plant roots

    PubMed Central

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.

    2015-01-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root-system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process, is presented. The scheme first distinguishes primary roots from lateral roots; second, it quantifies a broad spectrum of root traits for each identified primary and lateral root; third, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880

  18. Physico-chemical foundations underpinning microarray and next-generation sequencing experiments

    PubMed Central

    Harrison, Andrew; Binder, Hans; Buhot, Arnaud; Burden, Conrad J.; Carlon, Enrico; Gibas, Cynthia; Gamble, Lara J.; Halperin, Avraham; Hooyberghs, Jef; Kreil, David P.; Levicky, Rastislav; Noble, Peter A.; Ott, Albrecht; Pettitt, B. Montgomery; Tautz, Diethard; Pozhitkov, Alexander E.

    2013-01-01

    Hybridization of nucleic acids on solid surfaces is a key process involved in high-throughput technologies such as microarrays and, in some cases, next-generation sequencing (NGS). A physical understanding of the hybridization process helps to determine the accuracy of these technologies. The goal of a widespread research program is to develop reliable transformations between the raw signals reported by the technologies and individual molecular concentrations from an ensemble of nucleic acids. This research has inputs from many areas, from bioinformatics and biostatistics, to theoretical and experimental biochemistry and biophysics, to computer simulations. A group of leading researchers met in Ploen, Germany, in 2011 to discuss present knowledge and limitations of our physico-chemical understanding of high-throughput nucleic acid technologies. That meeting inspired us to write this summary, which provides an overview of state-of-the-art approaches, based on physico-chemical foundations, to modeling the nucleic acid hybridization process on solid surfaces. In addition, practical application of current knowledge is emphasized. PMID:23307556

  19. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    NASA Astrophysics Data System (ADS)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

    We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10nm-node SRAM via array patterned with a DUV immersion LELE process, where we see variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold, despite the observed differences in specific pattern characteristics. We expect that this analysis methodology can be applied to defect probability modeling as well as general process qualification in the future.
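
    Estimating a defect probability from the tail of a CD distribution can be sketched under a Gaussian assumption; the paper's point is precisely to test where such assumptions break down, and the numbers below are hypothetical:

```python
import math

def defect_probability(mean_cd_nm, sigma_cd_nm, threshold_nm):
    """Probability that a via prints below a failure threshold, assuming
    the local CD distribution is Gaussian: P = Phi((t - mu) / sigma)."""
    z = (threshold_nm - mean_cd_nm) / sigma_cd_nm
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# e.g. mean via CD 20 nm, local sigma 1.5 nm, failure below 14 nm (z = -4)
p = defect_probability(20.0, 1.5, 14.0)
print(f"{p:.2e}")  # 3.17e-05 per via
```

    At ~3e-5 per via, defects are far out in the tail, which is why >100 million measurements are needed to separate a common defect mechanism from pattern-specific local systematics and stochastics.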

  20. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. The effect has been especially noticeable in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
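
    As a concrete example of the multivariate starting point, many ordination methods of the kind reviewed operate on a pairwise dissimilarity matrix; Bray-Curtis dissimilarity between two community abundance profiles is a common choice (the counts below are hypothetical):

```python
def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two community abundance profiles:
    sum of absolute differences divided by the total abundance. 0 means
    identical communities, 1 means no shared abundance."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den

sample_a = [10, 0, 5, 3]  # hypothetical per-taxon counts for sample A
sample_b = [8, 2, 5, 1]   # hypothetical per-taxon counts for sample B
print(bray_curtis(sample_a, sample_b))  # 6/34, about 0.176
```

    A matrix of such pairwise values is then the input to exploratory procedures like ordination or clustering, which is where the method-selection questions the review addresses begin.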

  1. Making Waves: New Developments in Toxicology With the Zebrafish.

    PubMed

    Horzmann, Katharine A; Freeman, Jennifer L

    2018-05-01

    The laboratory zebrafish (Danio rerio) is now an accepted model in toxicologic research. The zebrafish model fills a niche between in vitro models and mammalian biomedical models. The developmental characteristics of the small fish are strategically being used by scientists to study topics ranging from high-throughput toxicity screens to toxicity in multi- and transgenerational studies. High-throughput technology has increased the utility of zebrafish embryonic toxicity assays in screening of chemicals and drugs for toxicity or effect. Additionally, advances in behavioral characterization and experimental methodology allow for observation of recognizable phenotypic changes after xenobiotic exposure. Future directions in zebrafish research are predicted to take advantage of CRISPR-Cas9 genome editing methods in creating models of disease and interrogating mechanisms of action with fluorescent reporters or tagged proteins. Zebrafish can also model developmental origins of health and disease and multi- and transgenerational toxicity. The zebrafish has many advantages as a toxicologic model, and new methodologies and areas of study continue to expand its usefulness and application.

  2. Advances in the Study of Heart Development and Disease Using Zebrafish

    PubMed Central

    Brown, Daniel R.; Samsa, Leigh Ann; Qian, Li; Liu, Jiandong

    2016-01-01

    Animal models of cardiovascular disease are key players in the translational medicine pipeline used to define the conserved genetic and molecular basis of disease. Congenital heart diseases (CHDs) are the most common type of human birth defect and feature structural abnormalities that arise during cardiac development and maturation. The zebrafish, Danio rerio, is a valuable vertebrate model organism, offering advantages over traditional mammalian models. These advantages include the rapid, stereotyped and external development of transparent embryos produced in large numbers from inexpensively housed adults, vast capacity for genetic manipulation, and amenability to high-throughput screening. With the help of modern genetics and a sequenced genome, zebrafish have led to insights in cardiovascular diseases ranging from CHDs to arrhythmia and cardiomyopathy. Here, we discuss the utility of zebrafish as a model system and summarize zebrafish cardiac morphogenesis with emphasis on parallels to human heart diseases. Additionally, we discuss the specific tools and experimental platforms utilized in the zebrafish model including forward screens, functional characterization of candidate genes, and high throughput applications. PMID:27335817

  3. A new pooling strategy for high-throughput screening: the Shifted Transversal Design

    PubMed Central

    Thierry-Mieg, Nicolas

    2006-01-01

    Background In binary high-throughput screening projects where the goal is the identification of low-frequency events, beyond the obvious issue of efficiency, false positives and false negatives are a major concern. Pooling constitutes a natural solution: it reduces the number of tests, while providing critical duplication of the individual experiments, thereby correcting for experimental noise. The main difficulty consists in designing the pools in a manner that is both efficient and robust: few pools should be necessary to correct the errors and identify the positives, yet the experiment should not be too vulnerable to biological variability. For example, some information should still be obtained even if there are slightly more positives or errors than expected. This is known as the group testing problem, or pooling problem. Results In this paper, we present a new non-adaptive combinatorial pooling design: the "shifted transversal design" (STD). It relies on arithmetic, and rests on two intuitive ideas: minimizing the co-occurrence of objects, and constructing pools of constant-sized intersections. We prove that it allows unambiguous decoding of noisy experimental observations. This design is highly flexible, and can be tailored to function robustly in a wide range of experimental settings (i.e., numbers of objects, fractions of positives, and expected error rates). Furthermore, we show that our design compares favorably, in terms of efficiency, to previously described non-adaptive combinatorial pooling designs. Conclusion This method is currently being validated by field-testing in the context of yeast two-hybrid interactome mapping, in collaboration with Marc Vidal's lab at the Dana-Farber Cancer Institute. Many similar projects could benefit from using the Shifted Transversal Design. PMID:16423300
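    The core of the construction can be sketched in a few lines of Python. This is an illustrative reconstruction from the description above (prime q, base-q digit encoding of objects, polynomial pool assignment per layer), not the authors' code; the noise-free decoder is likewise a simplification of the paper's full decoding scheme.

```python
def std_pools(n_objects, q, n_layers):
    """Shifted-transversal-style pooling: layer j puts object x
    (base-q digits x_0..x_{d-1}) into pool sum_i(x_i * j**i) mod q,
    so two distinct objects share a pool in fewer than d layers."""
    d = 1
    while q ** d < n_objects:  # smallest d with q^d >= n_objects
        d += 1
    pools = [[set() for _ in range(q)] for _ in range(n_layers)]
    for x in range(n_objects):
        digits = [(x // q ** i) % q for i in range(d)]
        for j in range(n_layers):
            idx = sum(c * j ** i for i, c in enumerate(digits)) % q
            pools[j][idx].add(x)
    return pools

def decode_single_positive(pools, positive):
    """Noise-free decoding: intersect the positive pools of each layer."""
    candidates = None
    for layer in pools:
        hits = set().union(*(p for p in layer if positive in p))
        candidates = hits if candidates is None else candidates & hits
    return candidates
```

    For example, 49 objects with q = 7 and 3 layers need only 21 tests, and the redundant layers pin down a single positive unambiguously.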

  4. Spatial tuning of acoustofluidic pressure nodes by altering net sonic velocity enables high-throughput, efficient cell sorting

    DOE PAGES

    Jung, Seung-Yong; Notton, Timothy; Fong, Erika; ...

    2015-01-07

    Particle sorting using acoustofluidics has enormous potential, but widespread adoption has been limited by complex device designs and low throughput. Here, we report high-throughput separation of particles and T lymphocytes (600 μL min⁻¹) by altering the net sonic velocity to reposition acoustic pressure nodes in a simple two-channel device. The approach is also generalizable to other microfluidic platforms for rapid, high-throughput analysis.

  5. Experimental Evolution as a High-Throughput Screen for Genetic Adaptations.

    PubMed

    Cooper, Vaughn S

    2018-06-27

    Experimental evolution is a method in which populations of organisms, often microbes, are founded by one or more ancestors of known genotype and then propagated under controlled conditions to study the evolutionary process. These evolving populations are influenced by all population genetic forces, including selection, mutation, drift, and recombination, and the relative contributions of these forces can be difficult to discern. Here, I describe why the outcomes of experimental evolution should be viewed with greater certainty, because the force of selection typically dominates. Importantly, any mutant rising rapidly to high frequency in a large population must have acquired adaptive traits in the selective environment. Sequencing the genomes of these mutants can identify genes or pathways that contribute to an adaptation. I review the logic and simple mathematics of why this evolve-and-resequence approach is a powerful way to find the mutations or mutation combinations that best increase fitness in a new environment.
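    The claim that selection dominates in large populations can be illustrated with the textbook deterministic haploid selection model; this sketch is my own illustration of the point, not material from the review.

```python
def mutant_frequency(p0, s, generations):
    """Deterministic haploid selection: a mutant with selection
    coefficient s updates frequency as p' = p(1+s) / (1 + p*s)
    (numerator: mutant fitness share; denominator: mean fitness)."""
    p = p0
    traj = [p]
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
        traj.append(p)
    return traj
```

    Starting from a single mutant in a population of one million (p0 = 1e-6) with a 10% fitness advantage, the mutant sweeps past 99% frequency within about 200 generations, which is why mutants that rise rapidly to high frequency can be confidently read as adaptive.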

  6. High Throughput, Real-time, Dual-readout Testing of Intracellular Antimicrobial Activity and Eukaryotic Cell Cytotoxicity

    PubMed Central

    Chiaraviglio, Lucius; Kang, Yoon-Suk; Kirby, James E.

    2016-01-01

    Traditional measures of intracellular antimicrobial activity and eukaryotic cell cytotoxicity rely on endpoint assays. Such endpoint assays require several additional experimental steps prior to readout, such as cell lysis, colony-forming unit determination, or reagent addition. When performing thousands of assays, for example during high-throughput screening, the downstream effort required for these types of assays is considerable. Therefore, to facilitate high-throughput antimicrobial discovery, we developed a real-time assay to simultaneously identify inhibitors of intracellular bacterial growth and assess eukaryotic cell cytotoxicity. Specifically, real-time detection of intracellular bacterial growth was enabled by marking bacterial screening strains with either a bacterial lux operon (1st-generation assay) or fluorescent protein reporters (2nd-generation, orthogonal assay). A non-toxic, cell-membrane-impermeant, nucleic acid-binding dye was also added during initial infection of macrophages. These dyes are excluded from viable cells. However, non-viable host cells lose membrane integrity, permitting entry and fluorescent labeling of nuclear DNA (deoxyribonucleic acid). Notably, DNA binding is associated with a large increase in fluorescent quantum yield that provides a solution-based readout of host cell death. We have used this combined assay to perform a high-throughput screen in microplate format, and to assess intracellular growth and cytotoxicity by microscopy. Notably, antimicrobials may demonstrate synergy, in which the combined effect of two or more antimicrobials applied together is greater than when they are applied separately. Testing for in vitro synergy against intracellular pathogens is normally a prodigious task, as combinatorial permutations of antibiotics at different concentrations must be assessed. However, we found that our real-time assay combined with automated, digital dispensing technology permitted facile synergy testing. Using these approaches, we were able to systematically survey the action of a large number of antimicrobials, alone and in combination, against the intracellular pathogen Legionella pneumophila. PMID:27911388
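    Checkerboard-style combination results of the kind described above are conventionally summarized with the fractional inhibitory concentration index (FICI). The helper below is a generic sketch of that standard readout with its usual interpretation thresholds, not code from the assay itself.

```python
def fici(mic_a_alone, mic_b_alone, conc_a_combo, conc_b_combo):
    """FICI = Ca/MIC_A + Cb/MIC_B, where Ca and Cb are the lowest
    concentrations of drugs A and B that are inhibitory in combination."""
    return conc_a_combo / mic_a_alone + conc_b_combo / mic_b_alone

def interpret(index):
    """Conventional cutoffs: <= 0.5 synergy, > 4.0 antagonism,
    anything in between reported as no interaction."""
    if index <= 0.5:
        return "synergy"
    if index <= 4.0:
        return "no interaction"
    return "antagonism"
```

    For instance, if each drug's MIC drops four-fold in combination (e.g. MICs of 8 and 4 alone, inhibitory at 1 and 1 together), the FICI is 0.375 and the pair is scored as synergistic.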

  7. Quantitative description on structure–property relationships of Li-ion battery materials for high-throughput computations

    PubMed Central

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, in portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of known ones. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage, and volume change of the bulk were considered. It is important to include more structure–property relationships, such as point defects, surface and interface effects, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure–property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure–property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials. PMID:28458737

  8. Neutral desorption extractive electrospray ionization mass spectrometry for fast screening sunscreen agents in cream cosmetic products.

    PubMed

    Zhang, Xinglei; Liu, Yan; Zhang, Jinghua; Hu, Zhong; Hu, Bin; Ding, Liying; Jia, Li; Chen, Huanwen

    2011-09-15

    High-throughput analysis of sunscreen agents in cream cosmetics has been demonstrated, at a rate of typically 2 samples per minute, using neutral desorption extractive electrospray ionization mass spectrometry (ND-EESI-MS) without sample pretreatment. For targeted compounds such as 4-aminobenzoic acid and oxybenzone, the ND-EESI-MS method provided linear signal responses in the range of 1-100 ppb. Limits of detection (LOD) were estimated at sub-ppb levels for the analytes tested. Reasonable relative standard deviations (RSD = 8.4-16.0%) were obtained from 10 independent measurements of commercial cosmetic samples spiked with each individual sunscreen agent at 1-10 ppb. Acceptable recoveries in the range of 87-116% were achieved for direct analysis of commercial cream cosmetic samples. The experimental data demonstrate that ND-EESI-MS is a useful tool for high-throughput screening of sunscreen agents in highly viscous cream cosmetic products, with the capability to obtain quantitative information on the analytes.

  9. Six-flow operations for catalyst development in Fischer-Tropsch synthesis: Bridging the gap between high-throughput experimentation and extensive product evaluation

    NASA Astrophysics Data System (ADS)

    Sartipi, Sina; Jansma, Harrie; Bosma, Duco; Boshuizen, Bart; Makkee, Michiel; Gascon, Jorge; Kapteijn, Freek

    2013-12-01

    Design and operation of a "six-flow fixed-bed microreactor" setup for Fischer-Tropsch synthesis (FTS) are described. The unit consists of feed and mixing, flow division, reaction, separation, and analysis sections. The reactor system is made of five heating blocks with individual temperature controllers, assuring an identical isothermal zone of at least 10 cm along six fixed-bed microreactor inserts (4 mm inner diameter). Such a lab-scale setup allows running six experiments in parallel under the same feed composition, reaction temperature, and conditions of the separation and analysis equipment. It permits separate collection of wax and liquid samples from each flow line, allowing operation at high productivities of C5+ hydrocarbons. The latter is crucial for a complete understanding of FTS product compositions and represents an advantage over high-throughput setups with more than ten flows, where such instrumental considerations lead to elevated equipment volume, cost, and operational complexity. Identical performance of the six flows under the same reaction conditions was verified by testing the same catalyst batch loaded in all microreactors.

  10. High-Throughput Computing on High-Performance Platforms: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oleynik, D; Panitkin, S; Matteo, Turilli

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan, a DOE leadership computing facility, in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next-generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  11. Microfluidic based high throughput synthesis of lipid-polymer hybrid nanoparticles with tunable diameters

    PubMed Central

    Feng, Qiang; Zhang, Lu; Liu, Chao; Li, Xuanyu; Hu, Guoqing; Sun, Jiashu; Jiang, Xingyu

    2015-01-01

    Core-shell hybrid nanoparticles (NPs) for drug delivery have attracted considerable attention due to their enhanced therapeutic efficacy and good biocompatibility. In this work, we fabricate a two-stage microfluidic chip to implement high-throughput, one-step, size-tunable synthesis of monodisperse lipid-poly(lactic-co-glycolic acid) NPs. The size of the hybrid NPs is tunable by varying the flow rates inside the two-stage microfluidic chip. To elucidate the mechanism of size-controllable generation of hybrid NPs, we observe the flow field in the microchannel with a confocal microscope and perform simulations with a numerical model. Both the experimental and numerical results indicate an enhanced mixing effect at high flow rates, resulting in the assembly of small, monodisperse hybrid NPs. In vitro experiments show that large hybrid NPs are more likely to aggregate in serum and exhibit lower cellular uptake efficacy than small ones. This microfluidic chip shows great promise as a robust platform for the optimization of nano drug delivery systems. PMID:26180574

  12. High Throughput Screening for Anti–Trypanosoma cruzi Drug Discovery

    PubMed Central

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-01-01

    The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti–T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti–T. cruzi drug entities in the near future, are reviewed here. PMID:25474364

  13. SELMAP - SELEX affinity landscape MAPping of transcription factor binding sites using integrated microfluidics

    PubMed Central

    Chen, Dana; Orenstein, Yaron; Golodnitsky, Rada; Pellach, Michal; Avrahami, Dorit; Wachtel, Chaim; Ovadia-Shochat, Avital; Shir-Shapira, Hila; Kedmi, Adi; Juven-Gershon, Tamar; Shamir, Ron; Gerber, Doron

    2016-01-01

    Transcription factors (TFs) alter gene expression in response to changes in the environment through sequence-specific interactions with the DNA. These interactions are best portrayed as a landscape of TF binding affinities. Current methods to study sequence-specific binding preferences suffer from limited dynamic range, sequence bias, lack of specificity and limited throughput. We have developed a microfluidic-based device for SELEX Affinity Landscape MAPping (SELMAP) of TF binding, which allows high-throughput measurement of 16 proteins in parallel. We used it to measure the relative affinities of Pho4, AtERF2 and Btd full-length proteins to millions of different DNA binding sites, and detected both high- and low-affinity interactions in equilibrium conditions, generating a comprehensive landscape of the relative TF affinities to all possible DNA 6-mers, and even DNA 10-mers with increased sequencing depth. Low quantities of both the TFs and DNA oligomers were sufficient for obtaining high-quality results, significantly reducing experimental costs. SELMAP allows in-depth screening of hundreds of TFs, and provides a means for better understanding of the regulatory processes that govern gene expression. PMID:27628341

  14. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    PubMed

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-12-01

    The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti-T. cruzi drug entities in the near future, are reviewed here.

  15. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    PubMed

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuel and bulk chemical applications, specialized higher-throughput devices for small-scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that can reliably measure large sets of samples within a short time and can deal with the reduced sample volumes typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid-handling robot to provide reproducible high-throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red, with dimethyl sulfoxide as a solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96-well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8 to ±2 % on average. Implementation on an automated liquid-handling platform allows measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure of biomass concentration, so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids, avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability as well as experimental throughput, while reducing the required hands-on time to a third. Thereby, the presented protocol meets the demands of analyzing the samples generated by the upcoming generation of devices for higher-throughput phototrophic cultivation and contributes to boosting the time efficiency of setting up algal lipid production processes.
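    The gravimetric calibration step described above amounts to a linear regression of fluorescence readings against extractively determined lipid content, after which readings convert directly to absolute lipid amounts. The sketch below is illustrative only (the calibration values are invented, not from the published protocol).

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict_lipid(signal, slope, intercept):
    """Convert a fluorescence reading into calibrated lipid content."""
    return slope * signal + intercept
```

    In practice the fit would be built from standards quantified by the extractive reference method, and `predict_lipid` applied to each screening-scale sample.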

  16. AELAS: Automatic ELAStic property derivations via high-throughput first-principles computation

    NASA Astrophysics Data System (ADS)

    Zhang, S. H.; Zhang, R. F.

    2017-11-01

    The elastic properties are fundamental and important for crystalline materials, as they relate to other mechanical properties, various thermodynamic qualities, and some critical physical properties. However, a complete set of experimentally determined elastic properties is available for only a small subset of known materials, and an automatic scheme for deriving elastic properties that is adapted to high-throughput computation is in high demand. In this paper, we present the AELAS code, an automated program for calculating second-order elastic constants of both two-dimensional and three-dimensional single-crystal materials with any symmetry, designed mainly for high-throughput first-principles computation. Derivations of other general elastic properties, such as the Young's, bulk, and shear moduli as well as the Poisson's ratio of polycrystalline materials, Pugh ratio, Cauchy pressure, elastic anisotropy, and elastic stability criterion, are also implemented in this code. The implementation has been critically validated by extensive evaluations and tests on a broad class of two-dimensional and three-dimensional materials, proving its efficiency and capability for high-throughput screening of materials with targeted mechanical properties. Program Files doi: http://dx.doi.org/10.17632/f8fwg4j9tw.1 Licensing provisions: BSD 3-Clause Programming language: Fortran Nature of problem: To automate the calculation of second-order elastic constants and the derivation of other elastic properties for two-dimensional and three-dimensional materials with any symmetry via high-throughput first-principles computation. Solution method: The space-group number is first determined by the SPGLIB code [1] and the structure is then redefined to a unit cell in the IEEE format [2]. Based on the determined space-group number, a set of distortion modes is automatically specified and the distorted structure files are generated. Afterwards, the total energy for each distorted structure is calculated by a first-principles code, e.g. VASP [3]. Finally, the second-order elastic constants are determined from the quadratic coefficients of the polynomial fitting of the energy vs. strain relationships, and the other elastic properties are derived accordingly. References [1] http://atztogo.github.io/spglib/. [2] A. Meitzler, H.F. Tiersten, A.W. Warner, D. Berlincourt, G.A. Couqin, F.S. Welsh III, IEEE standard on piezoelectricity, Society, 1988. [3] G. Kresse, J. Furthmüller, Phys. Rev. B 54 (1996) 11169.
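    The final step of this workflow, extracting an elastic constant from the quadratic coefficient of an energy-strain fit, can be sketched with a minimal three-point version of the polynomial fit. This is a generic illustration with synthetic energies standing in for first-principles results; the energy expansion E(ε) ≈ E0 + (V/2)·C·ε² and the numbers are assumptions for the example, not values from AELAS.

```python
def quadratic_coefficient(h, e_minus, e_zero, e_plus):
    """Quadratic coefficient a2 of the parabola through the three
    points (-h, e_minus), (0, e_zero), (+h, e_plus)."""
    return (e_minus - 2.0 * e_zero + e_plus) / (2.0 * h * h)

def elastic_constant(h, e_minus, e_zero, e_plus, volume):
    """From E(eps) ~ E0 + (V/2)*C*eps**2, the constant is C = 2*a2/V."""
    return 2.0 * quadratic_coefficient(h, e_minus, e_zero, e_plus) / volume

# Synthetic energies for C = 100 (arbitrary units), V = 2.0, E0 = -10.0
h = 0.005
E = lambda eps: -10.0 + 0.5 * 2.0 * 100.0 * eps ** 2
```

    A production code fits a higher-order polynomial over many strain amplitudes per distortion mode, but the quadratic coefficient it extracts plays exactly this role.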

  17. High-throughput screening (HTS) and modeling of the retinoid ...

    EPA Pesticide Factsheets

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  18. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    EPA Science Inventory

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  19. High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes

    USDA-ARS?s Scientific Manuscript database

    High-throughput SNP genotyping provides a rapid way of developing resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...

  20. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    EPA Science Inventory

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons, National Center for Computational Toxicology, US EPA, Research Triangle Park, NC USA. The EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  1. A quantitative literature-curated gold standard for kinase-substrate pairs

    PubMed Central

    2011-01-01

    We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431

  2. Outlook for Development of High-throughput Cryopreservation for Small-bodied Biomedical Model Fishes★

    PubMed Central

    Tiersch, Terrence R.; Yang, Huiping; Hu, E.

    2011-01-01

    With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace in human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, its application in biomedical model fishes is still at the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm of aquatic species. This review discusses the concepts and needs of high-throughput technology for model fishes, proposes approaches for technical development, and surveys future directions of this approach. PMID:21440666

  3. High-throughput measurements of biochemical responses using the plate::vision multimode 96 minilens array reader.

    PubMed

    Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich

    2006-01-01

    The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be comparable with that of other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have greatly sped up our drug discovery process.

  4. A high-throughput screening system for barley/powdery mildew interactions based on automated analysis of light micrographs.

    PubMed

    Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo

    2008-01-23

    To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.

  5. Evaluating High-Throughput Ab Initio Gene Finders to Discover Proteins Encoded in Eukaryotic Pathogen Genomes Missed by Laboratory Techniques

    PubMed Central

    Goodswen, Stephen J.; Kennedy, Paul J.; Ellis, John T.

    2012-01-01

    Next generation sequencing technology is advancing genome sequencing at an unprecedented level. By unravelling the code within a pathogen’s genome, every possible protein (prior to post-translational modifications) can theoretically be discovered, irrespective of life cycle stages and environmental stimuli. Now more than ever there is a great need for high-throughput ab initio gene finding. Ab initio gene finders use statistical models to predict genes and their exon-intron structures from the genome sequence alone. This paper evaluates whether existing ab initio gene finders can effectively predict genes to deduce proteins that have presently missed capture by laboratory techniques. An aim here is to identify possible patterns of prediction inaccuracies for gene finders as a whole irrespective of the target pathogen. All currently available ab initio gene finders are considered in the evaluation but only four fulfil high-throughput capability: AUGUSTUS, GeneMark_hmm, GlimmerHMM, and SNAP. These gene finders require training data specific to a target pathogen and consequently the evaluation results are inextricably linked to the availability and quality of the data. The pathogen, Toxoplasma gondii, is used to illustrate the evaluation methods. The results support current opinion that predicted exons by ab initio gene finders are inaccurate in the absence of experimental evidence. However, the results reveal some patterns of inaccuracy that are common to all gene finders and these inaccuracies may provide a focus area for future gene finder developers. PMID:23226328
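
    The standard way to score such predictions is at the exon level: a predicted exon counts as correct only if both boundaries match the annotation exactly. A minimal sketch of these metrics (the interval representation and coordinates are illustrative assumptions, not tied to any particular gene finder's output format):

```python
def exon_level_metrics(predicted, annotated):
    # Exact-match exon sensitivity and specificity, the usual exon-level
    # scores in ab initio gene finder evaluations.
    # Exons are (start, end) pairs on one strand of one contig (illustrative).
    pred, ann = set(predicted), set(annotated)
    true_pos = len(pred & ann)
    sensitivity = true_pos / len(ann) if ann else 0.0    # annotated exons found
    specificity = true_pos / len(pred) if pred else 0.0  # predictions that are right
    return sensitivity, specificity

# A predicted exon whose 3' boundary is off (400 vs 450) scores as wrong,
# which is exactly the kind of boundary inaccuracy such evaluations expose.
sens, spec = exon_level_metrics(
    predicted=[(100, 200), (300, 400), (500, 600)],
    annotated=[(100, 200), (300, 450)])
```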

  6. Representing high throughput expression profiles via perturbation barcodes reveals compound targets.

    PubMed

    Filzen, Tracey M; Kutchukian, Peter S; Hermes, Jeffrey D; Li, Jing; Tudor, Matthew

    2017-02-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound's high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data.
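
    The paper's encoder is a learned deep network; as a minimal stand-in, a random-projection sign hash illustrates the core idea of compressing a landmark-gene expression change into a compact binary barcode whose Hamming distance supports guilt-by-association lookups (all sizes and weights here are illustrative, not the authors' model):

```python
import random

def make_barcode_encoder(n_genes=978, n_bits=64, seed=0):
    # Random projection as a stand-in for the learned network weights.
    rng = random.Random(seed)
    proj = [[rng.gauss(0, 1) for _ in range(n_genes)] for _ in range(n_bits)]
    def encode(profile):
        # The sign of each projection contributes one bit of the barcode.
        return tuple(1 if sum(w * x for w, x in zip(row, profile)) > 0 else 0
                     for row in proj)
    return encode

def hamming(a, b):
    # Barcode distance used for guilt-by-association neighbour lookups.
    return sum(x != y for x, y in zip(a, b))

encode = make_barcode_encoder(n_genes=10, n_bits=16, seed=1)
profile = [0.3, -1.2, 0.7, 2.0, -0.5, 0.1, -0.9, 1.5, -2.0, 0.4]
barcode = encode(profile)
```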

  7. Representing high throughput expression profiles via perturbation barcodes reveals compound targets

    PubMed Central

    Kutchukian, Peter S.; Li, Jing; Tudor, Matthew

    2017-01-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound’s high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data. PMID:28182661

  8. TCP Throughput Profiles Using Measurements over Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata

    Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincaré map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
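
    The qualitative shape of such profiles can be reproduced with a toy two-phase model, a sketch under idealized assumptions (exponential slow-start up to a window ceiling, then a steady rate limited by either link capacity or the window/RTT bound; all parameter values are illustrative, not the paper's model):

```python
import math

def effective_throughput(rtt_s, capacity_gbps=10.0, buffer_mb=256.0,
                         duration_s=10.0, mss_bytes=1500):
    # Sustained rate: limited by link capacity or the window/RTT bound,
    # with the buffer acting as the window ceiling.
    window_bits = buffer_mb * 8e6
    sustained_gbps = min(capacity_gbps, window_bits / rtt_s / 1e9)
    # Ramp-up: cwnd doubles each RTT until it reaches the window ceiling.
    target_segments = window_bits / 8 / mss_bytes
    rampup_s = math.log2(max(target_segments, 2)) * rtt_s
    # Average over the run, treating data moved during ramp-up as negligible.
    sustain_s = max(duration_s - rampup_s, 0.0)
    return sustained_gbps * sustain_s / duration_s
```

    Short RTTs sit in a flat, concave-looking region (ramp-up cost is small and the window bound has not yet bitten), while long RTTs pay on both counts, so effective throughput falls off.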

  9. Enhancing high throughput toxicology - development of putative adverse outcome pathways linking US EPA ToxCast screening targets to relevant apical hazards.

    EPA Science Inventory

    High throughput toxicology programs, such as ToxCast and Tox21, have provided biological effects data for thousands of chemicals at multiple concentrations. Compared to traditional, whole-organism approaches, high throughput assays are rapid and cost-effective, yet they generall...

  10. Evaluation of High-Throughput Chemical Exposure Models via Analysis of Matched Environmental and Biological Media Measurements

    EPA Science Inventory

    The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...

  11. The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD

    NASA Astrophysics Data System (ADS)

    Cox, M. A.; Reed, R.; Mellado, B.

    2015-01-01

    After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.

  12. High throughput screening of CO2 solubility in aqueous monoamine solutions.

    PubMed

    Porcheron, Fabien; Gibert, Alexandre; Mougin, Pascal; Wender, Aurélie

    2011-03-15

    Post-combustion Carbon Capture and Storage technology (CCS) is viewed as an efficient solution to reduce CO2 emissions of coal-fired power stations. In CCS, an aqueous amine solution is commonly used as a solvent to selectively capture CO2 from the flue gas. However, this process generates additional costs, mostly from the reboiler heat duty required to release the carbon dioxide from the loaded solvent solution. In this work, we present thermodynamic results of CO2 solubility in aqueous amine solutions from a 6-reactor High Throughput Screening (HTS) experimental device. This device is fully automated and designed to perform sequential injections of CO2 within stirred-cell reactors containing the solvent solutions. The gas pressure within each reactor is monitored as a function of time, and the resulting transient pressure curves are transformed into CO2 absorption isotherms. Solubility measurements are first performed on monoethanolamine, diethanolamine, and methyldiethanolamine aqueous solutions at T = 313.15 K. Experimental results are compared with existing data in the literature to validate the HTS device. In addition, a comprehensive thermodynamic model is used to represent CO2 solubility variations in different classes of amine structures over a wide range of thermodynamic conditions. This model is used to fit the experimental data and to calculate the cyclic capacity, which is a key parameter for CO2 process design. Solubility measurements are then performed on a set of 50 monoamines and cyclic capacities are extracted using the thermodynamic model, to assess the potential of these molecules for CO2 capture.
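
    In the simplest ideal-gas picture, transforming a transient pressure curve into an absorption isotherm reduces to converting each injection's pressure drop into moles of absorbed CO2 (reactor volume, temperature, and amine amount below are illustrative; the real device accounts for gas-phase non-idealities and headspace corrections):

```python
R = 8.314  # gas constant, J/(mol K)

def loading_curve(injections, v_gas_m3, temp_k, mol_amine):
    # injections: (initial pressure, equilibrium pressure) in Pa per CO2 shot.
    # Returns (P_CO2, loading) points, loading in mol CO2 per mol amine.
    points, absorbed = [], 0.0
    for p0, peq in injections:
        absorbed += (p0 - peq) * v_gas_m3 / (R * temp_k)  # ideal-gas moles
        points.append((peq, absorbed / mol_amine))
    return points

isotherm = loading_curve([(2.0e5, 1.0e5), (1.8e5, 1.2e5)],
                         v_gas_m3=1.0e-4, temp_k=313.15, mol_amine=0.1)
```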

  13. [Current applications of high-throughput DNA sequencing technology in antibody drug research].

    PubMed

    Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong

    2012-03-01

    Since the 2005 publication of a high-throughput DNA sequencing technology based on PCR carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.

  14. In silico modelling of directed evolution: Implications for experimental design and stepwise evolution.

    PubMed

    Wedge, David C; Rowe, William; Kell, Douglas B; Knowles, Joshua

    2009-03-07

    We model the process of directed evolution (DE) in silico using genetic algorithms. Making use of the NK fitness landscape model, we analyse the effects of mutation rate, crossover and selection pressure on the performance of DE. A range of values of K, the epistatic interaction of the landscape, are considered, and high- and low-throughput modes of evolution are compared. Our findings suggest that for runs of around ten generations' duration, as is typical in DE, there is little difference in how DE needs to be configured between the high- and low-throughput regimes, or across different degrees of landscape epistasis. In all cases, a high selection pressure (but not an extreme one) combined with a moderately high mutation rate works best, while crossover provides some benefit, but only on the less rugged landscapes. These genetic algorithms were also compared with a "model-based approach" from the literature, which uses sequential fixing of the problem parameters based on fitting a linear model. Overall, we find that purely evolutionary techniques fare better than model-based approaches across all but the smoothest landscapes.
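
    The ingredients of such a simulation are compact. Below is a sketch of an NK landscape plus a mutation-and-truncation-selection loop standing in for one directed-evolution campaign (parameter values are illustrative; the paper's GA configurations differ in detail, e.g. in selection scheme and crossover):

```python
import itertools
import random

def make_nk_landscape(n, k, seed=0):
    # NK model: locus i's fitness contribution depends on its own allele
    # and the alleles of its K cyclic neighbours, via a random lookup table.
    rng = random.Random(seed)
    tables = [{bits: rng.random()
               for bits in itertools.product((0, 1), repeat=k + 1)}
              for _ in range(n)]
    def fitness(genome):
        return sum(t[tuple(genome[(i + j) % n] for j in range(k + 1))]
                   for i, t in enumerate(tables)) / n
    return fitness

def evolve(fitness, n, pop_size=20, generations=10, mu=0.05, seed=1):
    # Keep the best half each generation, refill with mutated copies.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = [[1 - g if rng.random() < mu else g
                     for g in rng.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

fit = make_nk_landscape(n=10, k=2, seed=0)
best = evolve(fit, n=10, generations=10, seed=1)
```

    Because the best half is always retained, the best fitness found never decreases across generations, mirroring the elitist pressure typical of short DE campaigns.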

  15. Holographic femtosecond laser processing and its application to biological materials (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Hayasaki, Yoshio

    2017-02-01

    Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points is required to fabricate actual structures at the millimeter scale, and therefore the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method offers advantages such as high throughput, high light-use efficiency, and variable, instantaneous, 3D patterning. The use of an SLM also provides the ability to correct unknown imperfections of the optical system and inhomogeneity in a sample through in-system optimization of the CGH. Furthermore, the CGH can adaptively compensate for dynamic, unpredictable mechanical movements, air and liquid disturbances, and shape variation and deformation of the target sample, as well as provide adaptive wavefront control for environmental changes. It is therefore a powerful tool for processing biological cells and tissues, which have free-form, variable, and deformable structures. In this paper, we present the principle and experimental setup of holographic femtosecond laser processing and effective ways of processing biological samples. We demonstrate femtosecond laser processing of biological materials and its processing properties.

  16. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    PubMed Central

    Luft, J. R.; Snell, E. H.; DeTitta, G. T.

    2011-01-01

    Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  17. High-throughput screening based on label-free detection of small molecule microarrays

    NASA Astrophysics Data System (ADS)

    Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong

    2017-02-01

    Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput platform for preliminary drug screening based on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this platform, the OI-RD scanner serves as a label-free detection instrument able to monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening through the immobilization of tens of thousands of different compounds on a single phenyl-isocyanate-functionalized glass slide. Using this platform, we sequentially screened five target proteins (purified target proteins or cell lysate containing the target protein) in high-throughput, label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by follow-up functional assays. Compared to traditional high-throughput screening assays, this platform has many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening analysis, giving it great potential as a complementary screening platform in the field of drug discovery.

  18. High-throughput analysis of yeast replicative aging using a microfluidic system

    PubMed Central

    Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong

    2015-01-01

    Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cells, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317

  19. Systems cell biology

    PubMed Central

    Mast, Fred D.; Ratushny, Alexander V.

    2014-01-01

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336

  20. Current status and future prospects for enabling chemistry technology in the drug discovery process.

    PubMed

    Djuric, Stevan W; Hutchins, Charles W; Talaty, Nari N

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of "dangerous" reagents. Also featured are advances in the "computer-assisted drug design" area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.

  1. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.

  2. SubCellProt: predicting protein subcellular localization using machine learning approaches.

    PubMed

    Garg, Prabha; Sharma, Virag; Chaudhari, Pradeep; Roy, Nilanjan

    2009-01-01

    High-throughput genome sequencing projects continue to churn out enormous amounts of raw sequence data. However, most of this raw sequence data is unannotated and, hence, not very useful. Among the various approaches to decipher the function of a protein, one is to determine its localization. Experimental approaches for proteome annotation including determination of a protein's subcellular localizations are very costly and labor intensive. Besides the available experimental methods, in silico methods present alternative approaches to accomplish this task. Here, we present two machine learning approaches for prediction of the subcellular localization of a protein from the primary sequence information. Two machine learning algorithms, k-Nearest Neighbor (k-NN) and Probabilistic Neural Network (PNN) were used to classify an unknown protein into one of the 11 subcellular localizations. The final prediction is made on the basis of a consensus of the predictions made by two algorithms and a probability is assigned to it. The results indicate that the primary sequence derived features like amino acid composition, sequence order and physicochemical properties can be used to assign subcellular localization with a fair degree of accuracy. Moreover, with the enhanced accuracy of our approach and the definition of a prediction domain, this method can be used for proteome annotation in a high throughput manner. SubCellProt is available at www.databases.niper.ac.in/SubCellProt.
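
    The k-NN half of such a consensus is straightforward to sketch: represent each protein by its 20-dimensional amino acid composition vector and vote among the nearest training sequences. The toy sequences and labels below are illustrative only; SubCellProt also uses sequence-order and physicochemical features and a PNN, which this sketch omits:

```python
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq):
    # Fraction of each of the 20 standard amino acids in the sequence.
    counts = Counter(seq)
    return [counts.get(a, 0) / len(seq) for a in AMINO_ACIDS]

def knn_predict(query, training, k=3):
    # training: list of (sequence, label) pairs.
    # Euclidean distance between composition vectors, majority vote over k.
    q = aa_composition(query)
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(q, aa_composition(seq))) ** 0.5, label)
        for seq, label in training)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

training = [("KKKKRRRR", "nuclear"), ("KKKRRRKK", "nuclear"),
            ("LLLLVVVV", "membrane"), ("LLVVLLVV", "membrane")]
label = knn_predict("KKKKKRRR", training, k=3)
```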

  3. AQuA: An Automated Quantification Algorithm for High-Throughput NMR-Based Metabolomics and Its Application in Human Plasma.

    PubMed

    Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A

    2018-02-06

    A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex ¹H NMR spectra. AQuA operates based on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r²) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely, occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large data set comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.
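
    Under the assumption that each metabolite contributes linearly to the intensity at every preselected signal, correcting for interference amounts to solving one small linear system against the calibration library. The 2x2 example below is illustrative; AQuA's actual bookkeeping and quality indicators are more involved:

```python
def solve_linear(A, b):
    # Gaussian elimination with partial pivoting (no NumPy needed).
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# A[i][j]: intensity that a unit concentration of metabolite j contributes
# at metabolite i's preselected signal (hypothetical calibration values).
A = [[1.00, 0.20],
     [0.05, 1.00]]
observed = [2.1, 0.6]          # measured intensities at the two signals
conc = solve_linear(A, observed)  # recovers the underlying concentrations
```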

  4. Detection of co-eluted peptides using database search methods

    PubMed Central

    Alves, Gelio; Ogurtsov, Aleksey Y; Kwok, Siwei; Wu, Wells W; Wang, Guanghui; Shen, Rong-Fong; Yu, Yi-Kuo

    2008-01-01

    Background Current experimental techniques, especially those applying liquid chromatography mass spectrometry, have made high-throughput proteomic studies possible. The increase in throughput however also raises concerns on the accuracy of identification or quantification. Most experimental procedures select in a given MS scan only a few of the most intense parent ions, each to be fragmented (MS2) separately, and most other minor co-eluted peptides that have similar chromatographic retention times are ignored and their information lost. Results We have computationally investigated the possibility of enhancing the information retrieval during a given LC/MS experiment by selecting the two or three most intense parent ions for simultaneous fragmentation. A set of spectra is created by superimposing a number of MS2 spectra, each of which can be identified with high confidence by all search methods tested, to mimic the spectra of co-eluted peptides. The generated convoluted spectra were used to evaluate the capability of several database search methods – SEQUEST, Mascot, X!Tandem, OMSSA, and RAId_DbS – in identifying true peptides from superimposed spectra of co-eluted peptides. We show that, using these simulated spectra, all the database search methods eventually gain in the number of true peptides identified from the compound spectra of co-eluted peptides. Open peer review Reviewed by Vlad Petyuk (nominated by Arcady Mushegian), King Jordan and Shamil Sunyaev. For the full reviews, please go to the Reviewers' comments section. PMID:18597684
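
    Superimposing MS2 spectra can be done by summing peak lists and merging peaks that fall within an m/z tolerance. A minimal sketch (tolerance and peak values are illustrative, not the study's exact procedure):

```python
def superimpose(spec_a, spec_b, tol=0.01):
    # Each spectrum is a list of (mz, intensity) peaks.
    # Peaks closer than tol in m/z are treated as coincident and summed.
    merged = []
    for mz, inten in sorted(spec_a + spec_b):
        if merged and mz - merged[-1][0] <= tol:
            merged[-1] = (merged[-1][0], merged[-1][1] + inten)
        else:
            merged.append((mz, inten))
    return merged

# Two peptides' fragment peaks at 100.0 and 100.005 m/z collapse into one
# compound peak; the rest stay distinct.
merged = superimpose([(100.0, 1.0), (200.0, 2.0)],
                     [(100.005, 3.0), (300.0, 1.0)])
```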

  5. Measuring molecular biomarkers in epidemiologic studies: laboratory techniques and biospecimen considerations.

    PubMed

    Erickson, Heidi S

    2012-09-28

    The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.

  6. 40 CFR Table 9 to Subpart Eeee of... - Continuous Compliance With Operating Limits-High Throughput Transfer Racks

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Continuous Compliance With Operating Limits-High Throughput Transfer Racks 9 Table 9 to Subpart EEEE of Part 63 Protection of Environment...—Continuous Compliance With Operating Limits—High Throughput Transfer Racks As stated in §§ 63.2378(a) and (b...

  7. Accelerating the design of solar thermal fuel materials through high throughput simulations.

    PubMed

    Liu, Yun; Grossman, Jeffrey C

    2014-12-10

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
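
    The screening step itself reduces to computing each candidate's isomerization enthalpy from its ground- and metastable-state energies and filtering on enthalpy and gravimetric energy density. All names, energies, and thresholds below are invented for illustration, not values from the study:

```python
def screen_stf_candidates(candidates, min_enthalpy_kj_mol=80.0,
                          min_density_kj_kg=300.0):
    # candidates: (name, ground-state energy, metastable energy, molar mass),
    # energies in kJ/mol and mass in g/mol (hypothetical inputs, e.g. from DFT).
    hits = []
    for name, e_ground, e_meta, molar_mass in candidates:
        dh = e_meta - e_ground               # isomerization enthalpy, kJ/mol
        density = dh / molar_mass * 1000.0   # gravimetric density, kJ/kg
        if dh >= min_enthalpy_kj_mol and density >= min_density_kj_kg:
            hits.append((name, dh, density))
    return hits

hits = screen_stf_candidates([
    ("azobenzene-like", 0.0, 90.0, 182.0),  # passes both thresholds
    ("weak-storer", 0.0, 30.0, 100.0),      # enthalpy too low
])
```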

  8. Linking Proteomic and Transcriptional Data through the Interactome and Epigenome Reveals a Map of Oncogene-induced Signaling

    PubMed Central

    Huang, Shao-shan Carol; Clarke, David C.; Gosline, Sara J. C.; Labadorf, Adam; Chouinard, Candace R.; Gordon, William; Lauffenburger, Douglas A.; Fraenkel, Ernest

    2013-01-01

    Cellular signal transduction generally involves cascades of post-translational protein modifications that rapidly catalyze changes in protein-DNA interactions and gene expression. High-throughput measurements are improving our ability to study each of these stages individually, but do not capture the connections between them. Here we present an approach for building a network of physical links among these data that can be used to prioritize targets for pharmacological intervention. Our method recovers the critical missing links between proteomic and transcriptional data by relating changes in chromatin accessibility to changes in expression and then uses these links to connect proteomic and transcriptome data. We applied our approach to integrate epigenomic, phosphoproteomic and transcriptome changes induced by the variant III mutation of the epidermal growth factor receptor (EGFRvIII) in a cell line model of glioblastoma multiforme (GBM). To test the relevance of the network, we used small molecules to target highly connected nodes implicated by the network model that were not detected by the experimental data in isolation and we found that a large fraction of these agents alter cell viability. Among these are two compounds, ICG-001, targeting CREB binding protein (CREBBP), and PKF118–310, targeting β-catenin (CTNNB1), which have not been tested previously for effectiveness against GBM. At the level of transcriptional regulation, we used chromatin immunoprecipitation sequencing (ChIP-Seq) to experimentally determine the genome-wide binding locations of p300, a transcriptional co-regulator highly connected in the network. Analysis of p300 target genes suggested its role in tumorigenesis. We propose that this general method, in which experimental measurements are used as constraints for building regulatory networks from the interactome while taking into account noise and missing data, should be applicable to a wide range of high-throughput datasets. PMID:23408876

  9. The combination of gas-phase fluorophore technology and automation to enable high-throughput analysis of plant respiration.

    PubMed

    Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K

    2017-01-01

    Mitochondrial respiration in the dark (R_dark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of R_dark is essential for agronomic and ecological studies. However, the methods currently used to measure R_dark in plant tissues are typically low-throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA) and membrane inlet mass spectrometry to determine the accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of R_dark in detached leaf and root tissues over many hours. Its high-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than the other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of R_dark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant R_dark through dissection and simultaneous measurement of above- and below-ground organs. Variation in absolute R_dark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided values of R_dark similar to those of the IRGA instrument most commonly employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of R_dark on multiple samples simultaneously, irrespective of plant or tissue type.

  10. Simultaneous Measurements of Auto-Immune and Infectious Disease Specific Antibodies Using a High Throughput Multiplexing Tool

    PubMed Central

    Asati, Atul; Kachurina, Olga; Kachurin, Anatoly

    2012-01-01

    Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high-throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bio-Plex/Luminex platform. In this report, we demonstrate that the ganglioside high-throughput multiplexing tool is robust and highly specific, and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high-throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from the H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. The influenza beads provide the added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for assaying both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high-throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders. PMID:22952605

  11. Modelling Human Regulatory Variation in Mouse: Finding the Function in Genome-Wide Association Studies and Whole-Genome Sequencing

    PubMed Central

    Schmouth, Jean-François; Bonaguro, Russell J.; Corso-Diaz, Ximena; Simpson, Elizabeth M.

    2012-01-01

    An increasing body of literature from genome-wide association studies and human whole-genome sequencing highlights the identification of large numbers of candidate regulatory variants of potential therapeutic interest in numerous diseases. Our relatively poor understanding of the functions of non-coding genomic sequence, and the slow and laborious process of experimental validation of the functional significance of human regulatory variants, limits our ability to fully benefit from this information in our efforts to comprehend human disease. Humanized mouse models (HuMMs), in which human genes are introduced into the mouse, suggest an approach to this problem. In the past, HuMMs have been used successfully to study human disease variants; e.g., the complex genetic condition arising from Down syndrome, common monogenic disorders such as Huntington disease and β-thalassemia, and cancer susceptibility genes such as BRCA1. In this commentary, we highlight a novel method for high-throughput single-copy site-specific generation of HuMMs entitled High-throughput Human Genes on the X Chromosome (HuGX). This method can be applied to most human genes for which a bacterial artificial chromosome (BAC) construct can be derived and a mouse-null allele exists. This strategy comprises (1) the use of recombineering technology to create a human variant–harbouring BAC, (2) knock-in of this BAC into the mouse genome using Hprt docking technology, and (3) allele comparison by interspecies complementation. We demonstrate the throughput of the HuGX method by generating a series of seven different alleles for the human NR2E1 gene at Hprt. In future challenges, we consider the current limitations of experimental approaches and call for a concerted effort by the genetics community, for both human and mouse, to solve the challenge of the functional analysis of human regulatory variation. PMID:22396661

  12. Development, implementation, and test results on integrated optics switching matrix

    NASA Technical Reports Server (NTRS)

    Rutz, E.

    1982-01-01

    A small integrated optics switching matrix, which was developed, implemented, and tested, indicates high performance. The matrix serves as a model for the design of larger switching matrices. The larger integrated optics switching matrix should form the integral part of a switching center with high data rate throughput of up to 300 megabits per second. The switching matrix technique can accomplish the design goals of low crosstalk and low distortion. About 50 illustrations help explain and depict the many phases of the integrated optics switching matrix. Many equations used to explain and calculate the experimental data are also included.

  13. HTS techniques for patch clamp-based ion channel screening - advances and economy.

    PubMed

    Farre, Cecilia; Fertig, Niels

    2012-06-01

    Ten years ago, the first publication appeared showing patch clamp recordings performed on a planar glass chip instead of a conventional patch clamp pipette. "Going planar" proved to revolutionize ion channel drug screening as we know it, by allowing high-quality measurements of ion channels and their effectors at higher throughput while at the same time de-skilling a highly laborious technique. Over the years, platforms evolved in response to user requirements regarding experimental features, data handling and storage, and suitable target diversity. This article gives a snapshot of patch clamp-based ion channel screening with a focus on platforms developed to meet the requirements of high-throughput screening environments. The commercially available platforms are described, along with their benefits and drawbacks in ion channel drug screening. Automated patch clamp (APC) platforms allow faster investigation of a larger number of ion channel active compounds or cell clones than previously possible. Since patch clamp is the only method allowing direct, real-time measurements of ion channel activity, APC holds the promise of picking up high-quality leads that would otherwise have been overlooked using indirect methods. In addition, drug candidate safety profiling can be performed earlier in the drug discovery process, avoiding late-phase compound withdrawal due to safety liability issues, which is highly costly and inefficient.

  14. Equipment for neutron measurements at VR-1 Sparrow training reactor.

    PubMed

    Kolros, Antonin; Huml, Ondrej; Kríz, Martin; Kos, Josef

    2010-01-01

    The VR-1 Sparrow reactor is an experimental nuclear facility for training, student education and teaching purposes. The Sparrow reactor serves as an educational platform for basic experiments in reactor physics and dosimetry. The aim of this article is to describe the features and capabilities of the new EMK310 experimental equipment for neutron detection with different gas-filled detectors at the VR-1 reactor. Typical attributes of the EMK310 equipment include precise set-up, simple control, resistance to electromagnetic interference, high throughput (counting rate), versatility and remote controllability. Methods for non-linearity correction of the pulse neutron detection system and a reactimeter application are presented. Copyright 2009. Published by Elsevier Ltd.
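
    In standard pulse-counting practice, non-linearity at high counting rates is dominated by detector dead time. A minimal sketch of the textbook non-paralyzable correction follows; the abstract does not specify the EMK310's actual algorithm, so this is a generic illustration:

```python
def deadtime_correct(measured_rate, tau):
    """Non-paralyzable dead-time correction n = m / (1 - m*tau): the
    textbook fix for counting-rate non-linearity in pulse detection
    systems. (The EMK310's actual correction method is not given in
    the abstract; this is a generic sketch.)"""
    loss = measured_rate * tau  # fraction of time the detector is dead
    if loss >= 1.0:
        raise ValueError("measured rate is at or beyond model saturation")
    return measured_rate / (1.0 - loss)

# 1e5 counts/s measured with a 2 microsecond dead time: the detector is
# dead 20% of the time, so the true rate is about 1.25e5 counts/s.
print(round(deadtime_correct(1e5, 2e-6)))  # -> 125000
```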

  15. Systems cell biology.

    PubMed

    Mast, Fred D; Ratushny, Alexander V; Aitchison, John D

    2014-09-15

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.

  16. Identification of Novel Myelin-Associated CD4+ T cell Autoantigens Targeted in MS Using a High-Throughput Gene Synthesis Technology

    DTIC Science & Technology

    2013-10-01

    epitopes from Epstein-Barr virus (EBV), Cytomegalovirus, influenza and tetanus toxoid linked to the LC3 tag were constructed and in vitro transcribed...of these proteins in the CNS, their ability to elicit MS-like disease in the mouse experimental autoimmune encephalitis model, and the presence of T...Goverman, J. 2009. Autoimmune T cell responses in the central nervous system. Nat. Rev. Immunol. 9: 393-407. 3. Jahn, O., S. Tenzer, and H. B

  17. Current status and future prospects for enabling chemistry technology in the drug discovery process

    PubMed Central

    Djuric, Stevan W.; Hutchins, Charles W.; Talaty, Nari N.

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities. PMID:27781094

  18. MIPS plant genome information resources.

    PubMed

    Spannagl, Manuel; Haberer, Georg; Ernst, Rebecca; Schoof, Heiko; Mayer, Klaus F X

    2007-01-01

    The Munich Information Center for Protein Sequences (MIPS) has been involved in maintaining plant genome databases since the Arabidopsis thaliana genome project. Genome databases and analysis resources have focused on individual genomes and aim to provide flexible and maintainable data sets for model plant genomes as a backbone against which experimental data, for example from high-throughput functional genomics, can be organized and evaluated. In addition, model genomes also form a scaffold for comparative genomics, and much can be learned from genome-wide evolutionary studies.

  19. Ultra-High Throughput Synthesis of Nanoparticles with Homogeneous Size Distribution Using a Coaxial Turbulent Jet Mixer

    PubMed Central

    2015-01-01

    High-throughput production of nanoparticles (NPs) with controlled quality is critical for their clinical translation into effective nanomedicines for diagnostics and therapeutics. Here we report a simple and versatile coaxial turbulent jet mixer that can synthesize a variety of NPs at high throughput up to 3 kg/d, while maintaining the advantages of homogeneity, reproducibility, and tunability that are normally accessible only in specialized microscale mixing devices. The device fabrication does not require specialized machining and is easy to operate. As one example, we show reproducible, high-throughput formulation of siRNA-polyelectrolyte polyplex NPs that exhibit effective gene knockdown but exhibit significant dependence on batch size when formulated using conventional methods. The coaxial turbulent jet mixer can accelerate the development of nanomedicines by providing a robust and versatile platform for preparation of NPs at throughputs suitable for in vivo studies, clinical trials, and industrial-scale production. PMID:24824296

  20. PTMScout, a Web Resource for Analysis of High Throughput Post-translational Proteomics Studies*

    PubMed Central

    Naegle, Kristen M.; Gymrek, Melissa; Joughin, Brian A.; Wagner, Joel P.; Welsch, Roy E.; Yaffe, Michael B.; Lauffenburger, Douglas A.; White, Forest M.

    2010-01-01

    The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu. PMID:20631208
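
    The "statistically significant enrichment of other annotations" that drives PTMScout's automated subset labeling is typically assessed with a one-sided hypergeometric test. A generic sketch of such a test (not PTMScout's actual implementation; the example counts are invented):

```python
from math import comb  # exact binomial coefficients, Python 3.8+

def enrichment_pvalue(k, n, K, N):
    """One-sided hypergeometric test P(X >= k): the probability of seeing
    at least k annotated peptides in a subset of size n when K of the N
    peptides in the whole experiment carry the annotation."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Illustrative numbers: 10 of 20 subset peptides carry a Pfam annotation
# found on 50 of 500 peptides overall -- far above the ~2 expected by
# chance, so the p-value is tiny and the label is assigned to the subset.
p = enrichment_pvalue(10, 20, 50, 500)
print(p < 1e-4)  # -> True
```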

  1. Building predictive in vitro pulmonary toxicity assays using high-throughput imaging and artificial intelligence.

    PubMed

    Lee, Jia-Ying Joey; Miller, James Alastair; Basu, Sreetama; Kee, Ting-Zhen Vanessa; Loo, Lit-Hsin

    2018-06-01

    Human lungs are susceptible to the toxicity induced by soluble xenobiotics. However, the direct cellular effects of many pulmonotoxic chemicals are not always clear, and thus, a general in vitro assay for testing pulmonotoxicity applicable to a wide variety of chemicals is not currently available. Here, we report a study that uses high-throughput imaging and artificial intelligence to build an in vitro pulmonotoxicity assay by automatically comparing and selecting human lung-cell lines and their associated quantitative phenotypic features most predictive of in vivo pulmonotoxicity. This approach is called "High-throughput In vitro Phenotypic Profiling for Toxicity Prediction" (HIPPTox). We found that the resulting assay, based on two phenotypic features of a human bronchial epithelial cell line, BEAS-2B, can accurately classify 33 reference chemicals with human pulmonotoxicity information (88.8% balanced accuracy, 84.6% sensitivity, and 93.0% specificity). In comparison, the predictivity of a standard cell-viability assay on the same set of chemicals is much lower (77.1% balanced accuracy, 84.6% sensitivity, and 69.5% specificity). We also used the assay to evaluate 17 additional test chemicals with unknown/unclear human pulmonotoxicity, and experimentally confirmed that many of the pulmonotoxic reference and predicted-positive test chemicals induce DNA strand breaks and/or activation of the DNA-damage response (DDR) pathway. Therefore, HIPPTox helps us to uncover these common modes of action of pulmonotoxic chemicals. HIPPTox may also be applied to other cell types or models, and accelerate the development of predictive in vitro assays for other cell-type- or organ-specific toxicities.
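
    The reported metrics are internally consistent: balanced accuracy is the arithmetic mean of sensitivity and specificity, which can be checked directly:

```python
def balanced_accuracy(sensitivity, specificity):
    """Balanced accuracy: the arithmetic mean of sensitivity and
    specificity, robust to class imbalance in the reference set."""
    return (sensitivity + specificity) / 2

# HIPPTox assay on the 33 reference chemicals (percentages from the abstract)
print(round(balanced_accuracy(84.6, 93.0), 1))  # -> 88.8
# Standard cell-viability assay on the same chemicals
print(round(balanced_accuracy(84.6, 69.5), 2))  # close to the reported 77.1
```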

  2. Generation and characterization of West Nile pseudo-infectious reporter virus for antiviral screening.

    PubMed

    Zhang, Hong-Lei; Ye, Han-Qing; Deng, Cheng-Lin; Liu, Si-Qing; Shi, Pei-Yong; Qin, Cheng-Feng; Yuan, Zhi-Ming; Zhang, Bo

    2017-05-01

    West Nile virus (WNV), a mosquito-borne flavivirus, is an important neurotropic human pathogen. As a biosafety level-3 (BSL-3) agent, WNV is strictly confined to BSL-3 laboratories for experimentation, greatly hindering the development of vaccines and antiviral drugs. Here, we developed a novel pseudo-infectious WNV reporter virus expressing the Gaussia luciferase (Gluc). A stable 293T NS1 cell line expressing NS1 was selected to supply NS1 protein in trans and support the replication of the WNV-ΔNS1 virus and the WNV-Gluc-ΔNS1 reporter virus, which carry a large-fragment deletion of NS1. The WNV-ΔNS1 virus and WNV-Gluc-ΔNS1 reporter virus were confined to completing their replication cycle in this 293T NS1 cell line, displaying nearly identical growth kinetics to wild-type (WT) WNV, although the viral titers were lower than those of WT WNV. The reporter gene was stably maintained in the virus genome for at least three rounds of passage in the 293T NS1 cell line. Using a known flavivirus inhibitor, NITD008, we demonstrated that the pseudo-infectious WNV-Gluc-ΔNS1 could be used for antiviral screening. Furthermore, a high-throughput screening (HTS) assay in a 96-well format was optimized and validated using several known WNV inhibitors, indicating that the optimized HTS assay is suitable for high-throughput screening of WNV inhibitors. Our work provides a stable and safe tool for handling WNV outside of a BSL-3 facility and facilitates high-throughput screening for anti-WNV drugs. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. The light spot test: Measuring anxiety in mice in an automated home-cage environment.

    PubMed

    Aarts, Emmeke; Maroteaux, Gregoire; Loos, Maarten; Koopmans, Bastijn; Kovačević, Jovana; Smit, August B; Verhage, Matthijs; Sluis, Sophie van der

    2015-11-01

    Behavioral tests of animals in a controlled experimental setting provide a valuable tool to advance understanding of genotype-phenotype relations, and to study the effects of genetic and environmental manipulations. To optimally benefit from the increasing numbers of genetically engineered mice, reliable high-throughput methods for comprehensive behavioral phenotyping of mice lines have become a necessity. Here, we describe the development and validation of an anxiety test, the light spot test, that allows for unsupervised, automated, high-throughput testing of mice in a home-cage system. This automated behavioral test circumvents bias introduced by pretest handling, and enables recording both baseline behavior and the behavioral test response over a prolonged period of time. We demonstrate that the light spot test induces a behavioral response in C57BL/6J mice. This behavior reverts to baseline when the aversive stimulus is switched off, and is blunted by treatment with the anxiolytic drug Diazepam, demonstrating predictive validity of the assay, and indicating that the observed behavioral response has a significant anxiety component. Also, we investigated the effectiveness of the light spot test as part of sequential testing for different behavioral aspects in the home-cage. Two learning tests, administered prior to the light spot test, affected the light spot test parameters. The light spot test is a novel, automated assay for anxiety-related high-throughput testing of mice in an automated home-cage environment, allowing for both comprehensive behavioral phenotyping of mice, and rapid screening of pharmacological compounds. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Rediscovery rate estimation for assessing the validation of significant findings in high-throughput studies.

    PubMed

    Ganna, Andrea; Lee, Donghwan; Ingelsson, Erik; Pawitan, Yudi

    2015-07-01

    It is common and advised practice in biomedical research to validate experimental or observational findings in a population different from the one where the findings were initially assessed. This practice increases the generalizability of the results and decreases the likelihood of reporting false-positive findings. Validation becomes critical when dealing with high-throughput experiments, where the large number of tests increases the chance to observe false-positive results. In this article, we review common approaches to determine statistical thresholds for validation and describe the factors influencing the proportion of significant findings from a 'training' sample that are replicated in a 'validation' sample. We refer to this proportion as rediscovery rate (RDR). In high-throughput studies, the RDR is a function of false-positive rate and power in both the training and validation samples. We illustrate the application of the RDR using simulated data and real data examples from metabolomics experiments. We further describe an online tool to calculate the RDR using t-statistics. We foresee two main applications. First, if the validation study has not yet been collected, the RDR can be used to decide the optimal combination between the proportion of findings taken to validation and the size of the validation study. Secondly, if a validation study has already been done, the RDR estimated using the training data can be compared with the observed RDR from the validation data; hence, the success of the validation study can be assessed. © The Author 2014. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
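
    As an illustration of how the RDR depends on the false-positive rate and power in both samples, here is a simple two-group sketch; this is an assumed model for illustration, not the authors' t-statistic-based estimator:

```python
def expected_rdr(pi1, alpha, power_train, power_valid):
    """Expected rediscovery rate under a simple two-group model: a fraction
    pi1 of tested hypotheses are true effects, false positives are declared
    at rate alpha in both samples, and true effects are detected with the
    stated power in each sample. (An illustrative model, not the authors'
    estimator.)"""
    sig_train = pi1 * power_train + (1 - pi1) * alpha  # significant in training
    replicated = pi1 * power_train * power_valid + (1 - pi1) * alpha * alpha
    return replicated / sig_train

# With 1% true effects, alpha = 0.05 and 80% power in both samples, only
# about 15% of training hits are expected to replicate.
print(round(expected_rdr(0.01, 0.05, 0.8, 0.8), 3))  # -> 0.154
```

    The sketch makes the two design applications concrete: before collecting the validation sample, one can vary `power_valid` (i.e. validation sample size) to hit a target RDR; after validation, the observed replication fraction can be compared against this expectation.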

  5. FOREWORD: Focus on Combinatorial Materials Science Focus on Combinatorial Materials Science

    NASA Astrophysics Data System (ADS)

    Chikyo, Toyohiro

    2011-10-01

    About 15 years have passed since the introduction of modern combinatorial synthesis and high-throughput techniques for the development of novel inorganic materials; however, similar methods existed before. The most famous was reported in 1970 by Hanak, who prepared composition-spread films of metal alloys by sputtering mixed-material targets. Although this method was innovative, it was rarely used because of the large amount of data to be processed. This problem is solved in modern combinatorial materials research, which is strongly tied to computer data analysis and robotics. The field is still developing and may be enriched by new methods. Nevertheless, given the progress in measurement equipment and procedures, we believe the combinatorial approach will become a major and standard tool of materials screening and development. The first article of this journal, published in 2000, was titled 'Combinatorial solid state materials science and technology', and this focus issue aims to reintroduce this topic to the Science and Technology of Advanced Materials audience. It covers recent progress in combinatorial materials research, describing new results in catalysis, phosphors, polymers and metal alloys for shape memory materials. Sophisticated high-throughput characterization schemes and innovative synthesis tools are also presented, such as spray deposition using nanoparticles or ion plating. On a technical note, data handling systems are introduced to familiarize researchers with the combinatorial methodology. We hope that through this focus issue a wide audience of materials scientists can learn about recent and future trends in combinatorial materials science and high-throughput experimentation.

  6. Application of Chemical Genomics to Plant-Bacteria Communication: A High-Throughput System to Identify Novel Molecules Modulating the Induction of Bacterial Virulence Genes by Plant Signals.

    PubMed

    Vandelle, Elodie; Puttilli, Maria Rita; Chini, Andrea; Devescovi, Giulia; Venturi, Vittorio; Polverari, Annalisa

    2017-01-01

    The life cycle of bacterial phytopathogens consists of a benign epiphytic phase, during which the bacteria grow in the soil or on the plant surface, and a virulent endophytic phase involving the penetration of host defenses and the colonization of plant tissues. Innovative strategies are urgently required to integrate copper treatments that control the epiphytic phase with complementary tools that control the virulent endophytic phase, thus reducing the quantity of chemicals applied to economically and ecologically acceptable levels. Such strategies include targeted treatments that weaken bacterial pathogens, particularly those inhibiting early infection steps rather than tackling established infections. This chapter describes a reporter gene-based chemical genomic high-throughput screen for the induction of bacterial virulence by plant molecules. Specifically, we describe a chemical genomic screening method to identify agonist and antagonist molecules for the induction of targeted bacterial virulence genes by plant extracts, focusing on the experimental controls required to avoid false positives and thus ensure the results are reliable and reproducible.

  7. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices, and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS covering its formats (on-line and off-line), sorbents and experimental protocols, the factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  8. Discovery of J Chain in African Lungfish (Protopterus dolloi, Sarcopterygii) Using High Throughput Transcriptome Sequencing: Implications in Mucosal Immunity

    PubMed Central

    Tacchi, Luca; Larragoite, Erin; Salinas, Irene

    2013-01-01

    J chain is a small polypeptide responsible for immunoglobulin (Ig) polymerization and transport of Igs across mucosal surfaces in higher vertebrates. We identified a J chain in dipnoid fish, the African lungfish (Protopterus dolloi) by high throughput sequencing of the transcriptome. P. dolloi J chain is 161 aa long and contains six of the eight Cys residues present in mammalian J chain. Phylogenetic studies place the lungfish J chain closer to tetrapod J chain than to the coelacanth or nurse shark sequences. J chain expression occurs in all P. dolloi immune tissues examined and it increases in the gut and kidney in response to an experimental bacterial infection. Double fluorescent in-situ hybridization shows that 88.5% of IgM+ cells in the gut co-express J chain, a significantly higher percentage than in the pre-pyloric spleen. Importantly, J chain expression is not restricted to the B-cell compartment since gut epithelial cells also express J chain. These results improve our current view of J chain from a phylogenetic perspective. PMID:23967082

  9. Nanomechanical recognition of prognostic biomarker suPAR with DVD-ROM optical technology.

    PubMed

    Bache, Michael; Bosco, Filippo G; Brøgger, Anna L; Frøhling, Kasper B; Alstrøm, Tommy Sonne; Hwu, En-Te; Chen, Ching-Hsiu; Eugen-Olsen, Jesper; Hwang, Ing-Shouh; Boisen, Anja

    2013-11-08

    In this work the use of a high-throughput nanomechanical detection system based on a DVD-ROM optical drive and cantilever sensors is presented for the detection of the inflammatory biomarker urokinase plasminogen activator receptor (uPAR). Several large scale studies have linked elevated levels of soluble uPAR (suPAR) to infectious diseases, such as HIV, and certain types of cancer. Using hundreds of cantilevers and a DVD-based platform, cantilever deflection response from antibody-antigen recognition is investigated as a function of suPAR concentration. The goal is to provide a cheap and portable detection platform which can carry valuable prognostic information. In order to optimize the cantilever response the antibody immobilization and nonspecific binding are initially characterized using quartz crystal microbalance technology. Also, the choice of antibody is explored in order to generate the largest surface stress on the cantilevers, thus increasing the signal. Using optimized experimental conditions the lowest detectable suPAR concentration is currently around 5 nM. The results reveal promising research strategies for the implementation of specific biochemical assays in a portable and high-throughput microsensor-based detection platform.

  10. An overview of bioinformatics methods for modeling biological pathways in yeast

    PubMed Central

    Hou, Jie; Acharya, Lipi; Zhu, Dongxiao

    2016-01-01

    The advent of high-throughput genomics techniques, along with the completion of genome sequencing projects, identification of protein–protein interactions and reconstruction of genome-scale pathways, has accelerated the development of systems biology research in the yeast organism Saccharomyces cerevisiae. In particular, discovery of biological pathways in yeast has become an important forefront in systems biology, which aims to understand the interactions among molecules within a cell leading to certain cellular processes in response to a specific environment. While the existing theoretical and experimental approaches enable the investigation of well-known pathways involved in metabolism, gene regulation and signal transduction, bioinformatics methods offer new insights into computational modeling of biological pathways. A wide range of computational approaches has been proposed in the past for reconstructing biological pathways from high-throughput datasets. Here we review selected bioinformatics approaches for modeling biological pathways in S. cerevisiae, including metabolic pathways, gene-regulatory pathways and signaling pathways. We start with reviewing the research on biological pathways followed by discussing key biological databases. In addition, several representative computational approaches for modeling biological pathways in yeast are discussed. PMID:26476430

  11. The JCSG high-throughput structural biology pipeline.

    PubMed

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  12. Intellicount: High-Throughput Quantification of Fluorescent Synaptic Protein Puncta by Machine Learning

    PubMed Central

    Fantuzzo, J. A.; Mirabella, V. R.; Zahn, J. D.

    2017-01-01

    Abstract Synapse formation analyses can be performed by imaging and quantifying fluorescent signals of synaptic markers. Traditionally, these analyses are done using simple or multiple thresholding and segmentation approaches or by labor-intensive manual analysis by a human observer. Here, we describe Intellicount, a high-throughput, fully-automated synapse quantification program which applies a novel machine learning (ML)-based image processing algorithm to systematically improve region of interest (ROI) identification over simple thresholding techniques. Through processing large datasets from both human and mouse neurons, we demonstrate that this approach allows image processing to proceed independently of carefully set thresholds, thus reducing the need for human intervention. As a result, this method can efficiently and accurately process large image datasets with minimal interaction by the experimenter, making it less prone to bias and less liable to human error. Furthermore, Intellicount is integrated into an intuitive graphical user interface (GUI) that provides a set of valuable features, including automated and multifunctional figure generation, routine statistical analyses, and the ability to run full datasets through nested folders, greatly expediting the data analysis process. PMID:29218324
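
    Intellicount's point of comparison is simple global thresholding of the fluorescence image. As a rough, hypothetical illustration of that baseline (not the paper's ML algorithm, and with invented pixel values), the classic Otsu method picks an intensity cutoff by maximizing the between-class variance of the histogram:

```python
def otsu_threshold(pixels, n_bins=64):
    """Classic Otsu global threshold: choose the histogram cut that
    maximizes between-class variance. This is the kind of simple
    thresholding baseline that ML-based ROI detection improves upon."""
    lo, hi = min(pixels), max(pixels)
    width = (hi - lo) / n_bins or 1.0
    hist = [0] * n_bins
    for v in pixels:
        hist[min(int((v - lo) / width), n_bins - 1)] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(n_bins):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        mu0 = sum0 / w0                       # mean of "background" class
        mu1 = (sum_all - sum0) / (total - w0)  # mean of "puncta" class
        var = w0 * (total - w0) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return lo + (best_t + 1) * width          # cutoff in intensity units

# Hypothetical image: dim background (~10) and a few bright puncta (~200)
pixels = [10] * 90 + [200] * 10
t = otsu_threshold(pixels)
mask = [v > t for v in pixels]                # candidate puncta pixels
```

In practice such a single global cutoff is exactly what fails on uneven illumination, which motivates the learned ROI identification described in the abstract.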

  13. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
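
    Among the exploratory procedures such reviews cover, principal component analysis is the workhorse. A minimal numpy sketch on a hypothetical community abundance table (samples x taxa; the counts are invented for illustration):

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the column-centered data matrix.
    Returns sample scores, feature loadings, and explained-variance ratios."""
    Xc = X - X.mean(axis=0)                 # center each taxon (feature)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # sample ordination
    loadings = Vt[:n_components].T                    # taxon contributions
    evr = (S ** 2) / np.sum(S ** 2)
    return scores, loadings, evr[:n_components]

# Toy community table: 4 samples x 3 taxa, forming two clear clusters
X = np.array([[10.0, 2.0, 1.0],
              [ 9.0, 3.0, 1.5],
              [ 1.0, 8.0, 9.0],
              [ 2.0, 9.0, 8.0]])
scores, loadings, evr = pca(X, n_components=2)
```

Here the first axis captures nearly all the variance because the two sample groups differ along a single compositional gradient, the kind of structure ordination plots are meant to reveal.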

  14. STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation

    PubMed Central

    2013-01-01

    Background Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results As a consequence we have developed the method Statistical Tracking of Ontological Phrases (STOP) that expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion Multiple ontologies have been developed for gene and protein annotation; by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text, we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
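
    Enrichment tools of this kind typically score a term by asking how surprising its frequency among the hits would be under random sampling, via a one-sided hypergeometric test. A stdlib-only sketch of that generic test (the gene counts are hypothetical, and this is the textbook statistic, not necessarily STOP's exact implementation):

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k): probability that at least k of n sampled genes carry an
    annotation term, given a background of N genes of which K carry it
    (one-sided hypergeometric test, as in term-enrichment analysis)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical screen: 10,000-gene background, 100 genes annotated with
# the term, 50 hits of which 8 carry the term.
p = hypergeom_enrichment_p(N=10_000, K=100, n=50, k=8)
```

With an expected overlap of only 0.5 genes, observing 8 yields a very small p-value, which is what flags the term as enriched (before multiple-testing correction, which real tools must also apply).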

  15. Automated analysis of siRNA screens of cells infected by hepatitis C and dengue viruses based on immunofluorescence microscopy images

    NASA Astrophysics Data System (ADS)

    Matula, Petr; Kumar, Anil; Wörz, Ilka; Harder, Nathalie; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl

    2008-03-01

    We present an image analysis approach as part of a high-throughput microscopy siRNA-based screening system using cell arrays for the identification of cellular genes involved in hepatitis C and dengue virus replication. Our approach comprises: cell nucleus segmentation, quantification of virus replication level in the neighborhood of segmented cell nuclei, localization of regions with transfected cells, cell classification by infection status, and quality assessment of an experiment and single images. In particular, we propose a novel approach for the localization of regions of transfected cells within cell array images, which combines model-based circle fitting and grid fitting. By this scheme we integrate information from single cell array images and knowledge from the complete cell arrays. The approach is fully automatic and has been successfully applied to a large number of cell array images from screening experiments. The experimental results show a good agreement with the expected behaviour of positive as well as negative controls and encourage the application to screens from further high-throughput experiments.
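
    The model-based circle fitting step mentioned above can be illustrated with the classic algebraic (Kåsa) least-squares fit; the spot coordinates below are synthetic, and the authors' exact formulation may differ:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares (Kasa) circle fit.
    Solves x^2 + y^2 = 2a*x + 2b*y + c in the least-squares sense;
    the center is (a, b) and the radius is sqrt(c + a^2 + b^2)."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a ** 2 + b ** 2)
    return a, b, r

# Hypothetical spot centers lying on a circle of radius 5 centered at (10, -3)
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
x = 10 + 5 * np.cos(theta)
y = -3 + 5 * np.sin(theta)
cx, cy, r = fit_circle(x, y)
```

Because the model is linear in (a, b, c), the fit is a single least-squares solve, which makes it fast enough to run on every candidate region of a large screen.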

  16. High-throughput prediction of tablet weight and trimethoprim content of compound sulfamethoxazole tablets for controlling the uniformity of dosage units by NIR.

    PubMed

    Dong, Yanhong; Li, Juan; Zhong, Xiaoxiao; Cao, Liya; Luo, Yang; Fan, Qi

    2016-04-15

    This paper establishes a novel method to simultaneously predict the tablet weight (TW) and trimethoprim (TMP) content of compound sulfamethoxazole tablets (SMZCO) by near infrared (NIR) spectroscopy with partial least squares (PLS) regression for controlling the uniformity of dosage units (UODU). The NIR spectra for 257 samples were measured using the optimized parameter values and pretreated using the optimized chemometric techniques. After the outliers were excluded, two PLS models for predicting TW and TMP content were established using the selected spectral sub-ranges and the reference values. The TW model achieves a correlation coefficient of calibration (Rc) of 0.9543 and the TMP content model an Rc of 0.9205. The experimental results indicate that this strategy expands the NIR application in controlling UODU, especially in the high-throughput and rapid analysis of TWs and contents of the compound pharmaceutical tablets, and may be an important complement to the common NIR on-line analytical method for pharmaceutical tablets. Copyright © 2016 Elsevier B.V. All rights reserved.
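
    The PLS calibration step can be sketched with the standard NIPALS algorithm for a single response; the "spectra" below are synthetic stand-ins with an exact linear relation to the property, not NIR data from the paper:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """PLS1 regression via NIPALS (single response variable).
    Returns (coef, x_mean, y_mean) so that
    y_hat = (X_new - x_mean) @ coef + y_mean."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w = w / np.linalg.norm(w)        # weight vector
        t = Xr @ w                       # score vector
        tt = t @ t
        p = Xr.T @ t / tt                # X loading
        qk = (yr @ t) / tt               # y loading
        Xr = Xr - np.outer(t, p)         # deflate X
        yr = yr - qk * t                 # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)   # regression vector
    return coef, x_mean, y_mean

# Synthetic "spectra": 20 samples x 4 wavelengths, property linear in X
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))
y = X @ np.array([1.0, 2.0, 3.0, 4.0]) + 5.0
coef, x_mean, y_mean = pls1_fit(X, y, n_components=4)
y_hat = (X - x_mean) @ coef + y_mean
```

With as many components as wavelengths and an exactly linear response, PLS reproduces the ordinary least-squares fit; real NIR work uses far fewer components than wavelengths and selects the count by cross-validation.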

  17. Combinatorial Methods for Exploring Complex Materials

    NASA Astrophysics Data System (ADS)

    Amis, Eric J.

    2004-03-01

    Combinatorial and high-throughput methods have changed the paradigm of pharmaceutical synthesis and have begun to have a similar impact on materials science research. Already there are examples of combinatorial methods used for inorganic materials, catalysts, and polymer synthesis. For many investigations the primary goal has been discovery of new material compositions that optimize properties such as phosphorescence or catalytic activity. In the midst of the excitement generated to "make things", another opportunity arises for materials science to "understand things" by using the efficiency of combinatorial methods. We have shown that combinatorial methods hold potential for rapid and systematic generation of experimental data over the multi-parameter space typical of investigations in polymer physics. We have applied the combinatorial approach to studies of polymer thin films, biomaterials, polymer blends, filled polymers, and semicrystalline polymers. By combining library fabrication, high-throughput measurements, informatics, and modeling we can demonstrate validation of the methodology, new observations, and developments toward predictive models. This talk will present some of our latest work with applications to coating stability, multi-component formulations, and nanostructure assembly.

  18. Exploration of the molecular basis of blast injury in a biofidelic model of traumatic brain injury

    NASA Astrophysics Data System (ADS)

    Thielen, P.; Mehoke, T.; Gleason, J.; Iwaskiw, A.; Paulson, J.; Merkle, A.; Wester, B.; Dymond, J.

    2018-01-01

    Biological response to blast overpressure is complex and results in various and potentially non-concomitant acute and long-term deficits to exposed individuals. Clinical links between blast severity and injury outcomes remain elusive and have yet to be fully described, resulting in a critical inability to develop associated protection and mitigation strategies. Further, experimental models frequently fail to reproduce observed physiological phenomena and/or introduce artifacts that confound analysis and reproducibility. New models are required that employ consistent mechanical inputs, scale with biological analogs and known clinical data, and permit high-throughput examination of biological responses for a range of environmental and battlefield-relevant exposures. Here we describe a novel, biofidelic headform capable of integrating complex biological samples for blast exposure studies. We additionally demonstrate its utility in detecting acute transcriptional responses in the model organism Caenorhabditis elegans after exposure to blast overpressure. This approach enables correlation between mechanical exposure and biological outcome, permitting both the enhancement of existing surrogate and computational models and the high-throughput biofidelic testing of current and future protection systems.

  19. Characterization of matrix effects in developing rugged high-throughput LC-MS/MS methods for bioanalysis.

    PubMed

    Li, Fumin; Wang, Jun; Jenkins, Rand

    2016-05-01

    There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.

  20. Identification and Correction of Additive and Multiplicative Spatial Biases in Experimental High-Throughput Screening.

    PubMed

    Mazoure, Bogdan; Caraus, Iurie; Nadon, Robert; Makarenkov, Vladimir

    2018-06-01

    Data generated by high-throughput screening (HTS) technologies are prone to spatial bias. Traditionally, bias correction methods used in HTS assume either a simple additive or, more recently, a simple multiplicative spatial bias model. These models do not, however, always provide an accurate correction of measurements in wells located at the intersection of rows and columns affected by spatial bias. The measurements in these wells depend on the nature of interaction between the involved biases. Here, we propose two novel additive and two novel multiplicative spatial bias models accounting for different types of bias interactions. We describe a statistical procedure that allows for detecting and removing different types of additive and multiplicative spatial biases from multiwell plates. We show how this procedure can be applied by analyzing data generated by the four HTS technologies (homogeneous, microorganism, cell-based, and gene expression HTS), the three high-content screening (HCS) technologies (area, intensity, and cell-count HCS), and the only small-molecule microarray technology available in the ChemBank small-molecule screening database. The proposed methods are included in the AssayCorrector program, implemented in R, and available on CRAN.
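
    For context, the classical additive model that the authors generalize is commonly corrected by two-way median polish (the basis of the B-score), which sweeps row and column effects out of the plate. A minimal sketch on a hypothetical plate with one biased row and one biased column (this illustrates the traditional additive correction, not the paper's new interaction-aware models):

```python
def median(values):
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def median_polish(plate, n_iter=10):
    """Two-way median polish: alternately subtract row and column medians,
    removing additive row/column spatial bias from a multiwell plate.
    Returns the residual (bias-corrected) matrix."""
    R = [row[:] for row in plate]
    for _ in range(n_iter):
        for i, row in enumerate(R):                 # sweep row medians
            m = median(row)
            R[i] = [v - m for v in row]
        for j in range(len(R[0])):                  # sweep column medians
            m = median([row[j] for row in R])
            for row in R:
                row[j] -= m
    return R

# Hypothetical 4x4 plate: flat true signal 100, plus a +10 bias on row 2
# and a +5 bias on column 3
plate = [[100 + r + c for c in (0.0, 0.0, 5.0, 0.0)]
         for r in (0.0, 10.0, 0.0, 0.0)]
resid = median_polish(plate)
```

On purely additive bias the residuals vanish; the abstract's point is that wells at the intersection of biased rows and columns do not follow this simple model, which is what the proposed interaction models address.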

  1. Transfer, imaging, and analysis plate for facile handling of 384 hanging drop 3D tissue spheroids.

    PubMed

    Cavnar, Stephen P; Salomonsson, Emma; Luker, Kathryn E; Luker, Gary D; Takayama, Shuichi

    2014-04-01

    Three-dimensional culture systems bridge the experimental gap between in vivo and in vitro physiology. However, nonstandardized formation and limited downstream adaptability of 3D cultures have hindered mainstream adoption of these systems for biological applications, especially for low- and moderate-throughput assays commonly used in biomedical research. Here we build on our recent development of a 384-well hanging drop plate for spheroid culture to design a complementary spheroid transfer and imaging (TRIM) plate. The low-aspect ratio wells of the TRIM plate facilitated high-fidelity, user-independent, contact-based collection of hanging drop spheroids. Using the TRIM plate, we demonstrated several downstream analyses, including bulk tissue collection for flow cytometry, high-resolution low working-distance immersion imaging, and timely reagent delivery for enzymatic studies. Low working-distance multiphoton imaging revealed a cell type-dependent, macroscopic spheroid structure. Unlike ovarian cancer spheroids, which formed loose, disk-shaped spheroids, human mammary fibroblasts formed tight, spherical, and nutrient-limited spheroids. Beyond the applications we describe here, we expect the hanging drop spheroid plate and complementary TRIM plate to facilitate analyses of spheroids across the spectrum of throughput, particularly for bulk collection of spheroids and high-content imaging.

  2. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  3. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  4. Large-scale high-throughput computer-aided discovery of advanced materials using cloud computing

    NASA Astrophysics Data System (ADS)

    Bazhirov, Timur; Mohammadi, Mohammad; Ding, Kevin; Barabash, Sergey

    Recent advances in cloud computing have made it possible to access large-scale computational resources completely on-demand in a rapid and efficient manner. When combined with high fidelity simulations, they serve as an alternative pathway to enable computational discovery and design of new materials through large-scale high-throughput screening. Here, we present a case study for a cloud platform implemented at Exabyte Inc. We perform calculations to screen lightweight ternary alloys for thermodynamic stability. Due to the lack of experimental data for most such systems, we rely on theoretical approaches based on first-principles pseudopotential density functional theory. We calculate the formation energies for a set of ternary compounds approximated by special quasirandom structures. During an example run we were able to scale to 10,656 CPUs within 7 minutes from the start, and obtain results for 296 compounds within 38 hours. The results indicate that the ultimate formation enthalpy of ternary systems can be negative for some of the lightweight alloys, including Li and Mg compounds. We conclude that, compared to the traditional capital-intensive approach that requires investment in on-premises hardware resources, cloud computing is agile and cost-effective, yet scalable and delivers similar performance.
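
    The stability screen rests on formation energies of the kind sketched below: the compound's total energy relative to its decomposition into the pure elements, normalized per atom. All energies here are made-up placeholders, not values from the study:

```python
def formation_energy_per_atom(e_compound, composition, e_ref):
    """Formation energy per atom:
    E_f = [E_total(compound) - sum_i n_i * E_ref_i] / N_atoms,
    where composition maps element -> atoms per cell and e_ref maps
    element -> reference energy per atom of the pure element."""
    n_atoms = sum(composition.values())
    e_elements = sum(n * e_ref[el] for el, n in composition.items())
    return (e_compound - e_elements) / n_atoms

# Hypothetical DFT total energies in eV; a negative E_f means the compound
# is stable against decomposition into the pure elements
e_ref = {"Li": -1.90, "Mg": -1.50, "Al": -3.75}
e_f = formation_energy_per_atom(-7.40, {"Li": 1, "Mg": 1, "Al": 1}, e_ref)
```

A full convex-hull analysis would also compare against competing binary phases, but the per-compound quantity above is the basic screening signal.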

  5. Tissue vascularization through 3D printing: Will technology bring us flow?

    PubMed

    Paulsen, S J; Miller, J S

    2015-05-01

    Though in vivo models provide the most physiologically relevant environment for studying tissue function, in vitro studies provide researchers with explicit control over experimental conditions and the potential to develop high throughput testing methods. In recent years, advancements in developmental biology research and imaging techniques have significantly improved our understanding of the processes involved in vascular development. However, the task of recreating the complex, multi-scale vasculature seen in in vivo systems remains elusive. 3D bioprinting offers a potential method to generate controlled vascular networks with hierarchical structure approaching that of in vivo networks. Bioprinting is an interdisciplinary field that relies on advances in 3D printing technology along with advances in imaging and computational modeling, which allow researchers to monitor cellular function and to better understand cellular environment within the printed tissue. As bioprinting technologies improve with regards to resolution, printing speed, available materials, and automation, 3D printing could be used to generate highly controlled vascularized tissues in a high throughput manner for use in regenerative medicine and the development of in vitro tissue models for research in developmental biology and vascular diseases. © 2015 Wiley Periodicals, Inc.

  6. Machine vision for digital microfluidics

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun; Lee, Jeong-Bong

    2010-01-01

    Machine vision is widely used in an industrial environment today. It can perform various tasks, such as inspecting and controlling production processes, that may require humanlike intelligence. The importance of imaging technology for biological research or medical diagnosis is greater than ever. For example, fluorescent reporter imaging enables scientists to study the dynamics of gene networks with high spatial and temporal resolution. Such high-throughput imaging is increasingly demanding the use of machine vision for real-time analysis and control. Digital microfluidics is a relatively new technology with expectations of becoming a true lab-on-a-chip platform. Utilizing digital microfluidics, only small amounts of biological samples are required and the experimental procedures can be automatically controlled. There is a strong need for the development of a digital microfluidics system integrated with machine vision for innovative biological research today. In this paper, we show how machine vision can be applied to digital microfluidics by demonstrating two applications: machine vision-based measurement of the kinetics of biomolecular interactions and machine vision-based droplet motion control. It is expected that digital microfluidics-based machine vision system will add intelligence and automation to high-throughput biological imaging in the future.

  7. A Dual-Mode Large-Arrayed CMOS ISFET Sensor for Accurate and High-Throughput pH Sensing in Biomedical Diagnosis.

    PubMed

    Huang, Xiwei; Yu, Hao; Liu, Xu; Jiang, Yu; Yan, Mei; Wu, Dongping

    2015-09-01

    The existing ISFET-based DNA sequencing detects hydrogen ions released during the polymerization of DNA strands on microbeads, which are scattered into microwell array above the ISFET sensor with unknown distribution. However, false pH detection happens at empty microwells due to crosstalk from neighboring microbeads. In this paper, a dual-mode CMOS ISFET sensor is proposed to have accurate pH detection toward DNA sequencing. Dual-mode sensing, optical and chemical modes, is realized by integrating a CMOS image sensor (CIS) with ISFET pH sensor, and is fabricated in a standard 0.18-μm CIS process. With accurate determination of microbead physical locations with CIS pixel by contact imaging, the dual-mode sensor can correlate local pH for one DNA slice at one location-determined microbead, which can result in improved pH detection accuracy. Moreover, toward a high-throughput DNA sequencing, a correlated-double-sampling readout that supports large array for both modes is deployed to reduce pixel-to-pixel nonuniformity such as threshold voltage mismatch. The proposed CMOS dual-mode sensor is experimentally examined to show a well correlated pH map and optical image for microbeads with a pH sensitivity of 26.2 mV/pH, a fixed pattern noise (FPN) reduction from 4% to 0.3%, and a readout speed of 1200 frames/s. A dual-mode CMOS ISFET sensor with suppressed FPN for accurate large-arrayed pH sensing is proposed and demonstrated with state-of-the-art measured results toward accurate and high-throughput DNA sequencing. The developed dual-mode CMOS ISFET sensor has great potential for future personal genome diagnostics with high accuracy and low cost.
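
    The correlated-double-sampling readout mentioned above cancels per-pixel offsets (the source of fixed pattern noise) by differencing each pixel's reset and signal levels. A minimal numerical sketch with hypothetical offset values:

```python
def cds(reset, signal):
    """Correlated double sampling: subtract each pixel's reset level from
    its signal level, cancelling the per-pixel offset caused by, e.g.,
    threshold-voltage mismatch (fixed pattern noise)."""
    return [[s - r for r, s in zip(rrow, srow)]
            for rrow, srow in zip(reset, signal)]

# Hypothetical 2x2 array: true response is 30 mV everywhere, but each
# pixel carries a different offset
offset = [[5.0, -3.0], [1.5, 0.0]]
reset = offset
signal = [[o + 30.0 for o in row] for row in offset]
out = cds(reset, signal)
```

After the subtraction every pixel reports the same 30 mV response, which is why CDS suppresses pixel-to-pixel nonuniformity in large arrays.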

  8. Biased ligand quantification in drug discovery: from theory to high throughput screening to identify new biased μ opioid receptor agonists

    PubMed Central

    Winpenny, David; Clark, Mellissa

    2016-01-01

    Background and Purpose Biased GPCR ligands are able to engage with their target receptor in a manner that preferentially activates distinct downstream signalling and offers potential for next generation therapeutics. However, accurate quantification of ligand bias in vitro is complex, and current best practice is not amenable to testing large numbers of compounds. We have therefore sought to apply ligand bias theory to an industrial scale screening campaign for the identification of new biased μ receptor agonists. Experimental Approach μ receptor assays with appropriate dynamic range were developed for both Gαi‐dependent signalling and β‐arrestin2 recruitment. Δlog(Emax/EC50) analysis was validated as an alternative to the operational model of agonism in calculating pathway bias towards Gαi‐dependent signalling. The analysis was applied to a high throughput screen to characterize the prevalence and nature of pathway bias among a diverse set of compounds with μ receptor agonist activity. Key Results A high throughput screening campaign yielded 440 hits with greater than 10‐fold bias relative to DAMGO. To validate these results, we quantified pathway bias of a subset of hits using the operational model of agonism. The high degree of correlation across these biased hits confirmed that Δlog(Emax/EC50) was a suitable method for identifying genuine biased ligands within a large collection of diverse compounds. Conclusions and Implications This work demonstrates that using Δlog(Emax/EC50), drug discovery can apply the concept of biased ligand quantification on a large scale and accelerate the deliberate discovery of novel therapeutics acting via this complex pharmacology. PMID:26791140
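
    The Δlog(Emax/EC50) metric can be made concrete with a small calculation: the bias factor is 10 to the power of the double difference of log(Emax/EC50), test versus reference agonist, pathway one versus pathway two. The potency and efficacy values below are invented for illustration, following the generic published formula rather than the screen's exact implementation:

```python
from math import log10

def delta_log(emax, ec50_molar):
    """log(Emax/EC50) transduction-coefficient proxy for one pathway."""
    return log10(emax / ec50_molar)

def bias_factor(test_g, test_b, ref_g, ref_b):
    """10**DeltaDeltaLog(Emax/EC50) of a test agonist versus a reference
    agonist, pathway G (e.g. Galpha-i signalling) over pathway B
    (e.g. beta-arrestin2 recruitment). Each argument is an (Emax, EC50)
    tuple, with Emax as % of the reference response and EC50 in molar."""
    dd = ((delta_log(*test_g) - delta_log(*ref_g))
          - (delta_log(*test_b) - delta_log(*ref_b)))
    return 10 ** dd

# Hypothetical data: the reference (e.g. DAMGO) is equally active in both
# assays; the test compound is a full G-pathway agonist but a weak,
# low-potency beta-arrestin2 recruiter.
ref = (100, 1e-8)
factor = bias_factor(test_g=(100, 1e-8), test_b=(40, 2.5e-7),
                     ref_g=ref, ref_b=ref)
```

A factor above 10 would clear the screen's "greater than 10-fold bias relative to DAMGO" threshold; referencing everything to the same control compound is what cancels assay-specific amplification.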

  9. Systematic Analysis of Zn2Cys6 Transcription Factors Required for Development and Pathogenicity by High-Throughput Gene Knockout in the Rice Blast Fungus

    PubMed Central

    Huang, Pengyun; Lin, Fucheng

    2014-01-01

    Because of great challenges and workload in deleting genes on a large scale, the functions of most genes in pathogenic fungi are still unclear. In this study, we developed a high-throughput gene knockout system using a novel yeast-Escherichia-Agrobacterium shuttle vector, pKO1B, in the rice blast fungus Magnaporthe oryzae. Using this method, we deleted 104 fungal-specific Zn2Cys6 transcription factor (TF) genes in M. oryzae. We then analyzed the phenotypes of these mutants with regard to growth, asexual and infection-related development, pathogenesis, and 9 abiotic stresses. The resulting data provide new insights into how this rice pathogen of global significance regulates important traits in the infection cycle through Zn2Cys6TF genes. A large variation in biological functions of Zn2Cys6TF genes was observed under the conditions tested. Sixty-one of 104 Zn2Cys6 TF genes were found to be required for fungal development. In-depth analysis of TF genes revealed that TF genes involved in pathogenicity frequently tend to function in multiple development stages, and disclosed many highly conserved but unidentified functional TF genes of importance in the fungal kingdom. We further found that the virulence-required TF genes GPF1 and CNF2 have similar regulation mechanisms in the gene expression involved in pathogenicity. These experimental validations clearly demonstrated the value of a high-throughput gene knockout system in understanding the biological functions of genes on a genome scale in fungi, and provided a solid foundation for elucidating the gene expression network that regulates the development and pathogenicity of M. oryzae. PMID:25299517

  10. Investigation of FPGA-Based Real-Time Adaptive Digital Pulse Shaping for High-Count-Rate Applications

    NASA Astrophysics Data System (ADS)

    Saxena, Shefali; Hawari, Ayman I.

    2017-07-01

    Digital signal processing techniques have been widely used in radiation spectrometry to provide improved stability and performance with compact physical size over the traditional analog signal processing. In this paper, field-programmable gate array (FPGA)-based adaptive digital pulse shaping techniques are investigated for real-time signal processing. National Instruments (NI) NI 5761 14-bit, 250-MS/s adaptor module is used for digitizing high-purity germanium (HPGe) detector's preamplifier pulses. Digital pulse processing algorithms are implemented on the NI PXIe-7975R reconfigurable FPGA (Kintex-7) using the LabVIEW FPGA module. Based on the time separation between successive input pulses, the adaptive shaping algorithm selects the optimum shaping parameters (rise time and flattop time of trapezoid-shaping filter) for each incoming signal. A digital Sallen-Key low-pass filter is implemented to enhance signal-to-noise ratio and reduce baseline drifting in trapezoid shaping. A recursive trapezoid-shaping filter algorithm is employed for pole-zero compensation of exponentially decayed (with two-decay constants) preamplifier pulses of an HPGe detector. It allows extraction of pulse height information at the beginning of each pulse, thereby reducing the pulse pileup and increasing throughput. The algorithms for RC-CR2 timing filter, baseline restoration, pile-up rejection, and pulse height determination are digitally implemented for radiation spectroscopy. Traditionally, at high-count-rate conditions, a shorter shaping time is preferred to achieve high throughput, which deteriorates energy resolution. In this paper, experimental results are presented for varying count-rate and pulse shaping conditions. Using adaptive shaping, increased throughput is achieved while preserving the energy resolution observed using the longer shaping times.
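
    The recursive trapezoid shaper at the heart of such pipelines is usually credited to Jordanov. A single-decay-constant sketch follows (the abstract's two-decay-constant pole-zero compensation is more involved); the pulse is synthetic and the parameter values are arbitrary:

```python
from math import exp

def trapezoid_shaper(v, k, m, tau):
    """Recursive trapezoidal shaping of exponentially decaying pulses
    (Jordanov-style, single decay constant). k = rise time in samples,
    m = flat-top time in samples, tau = pulse decay constant in samples.
    Output is normalized so the flat top equals the pulse amplitude."""
    l = k + m
    r = exp(-1.0 / tau)
    M = r / (1.0 - r)           # pole-zero correction gain for decay r
    gain = k * (M + 1.0)        # flat-top normalization factor

    def vv(n):                  # zero-padded access outside the record
        return v[n] if 0 <= n < len(v) else 0.0

    p = s = 0.0
    out = []
    for n in range(len(v)):
        d = vv(n) - vv(n - k) - vv(n - l) + vv(n - k - l)
        p += d                  # first accumulator
        s += p + M * d          # second accumulator with pole-zero term
        out.append(s / gain)
    return out

# Synthetic preamplifier pulse: unit amplitude, exponential tail, tau = 50
tau = 50.0
pulse = [exp(-n / tau) for n in range(200)]
shaped = trapezoid_shaper(pulse, k=10, m=5, tau=tau)
```

For a matched tau the output is an ideal trapezoid: it rises over k samples, holds the pulse amplitude for the flat top, and returns to zero, which is what allows pulse-height readout shortly after each pulse begins.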

  11. High-throughput materials discovery and development: breakthroughs and challenges in the mapping of the materials genome

    NASA Astrophysics Data System (ADS)

    Buongiorno Nardelli, Marco

High-throughput quantum-mechanical computation of materials properties by ab initio methods has become the foundation of an effective approach to materials design, discovery and characterization. This data-driven approach to materials science currently presents the most promising path to the development of advanced technological materials that could solve or mitigate important social and economic challenges of the 21st century. In particular, the rapid proliferation of computational data on materials properties presents the possibility to complement and extend materials property databases where the experimental data are lacking or difficult to obtain. Enhanced repositories such as AFLOWLIB open novel opportunities for structure discovery and optimization, including the uncovering of unsuspected compounds, metastable structures and correlations between various properties. The practical realization of these opportunities depends almost exclusively on the design of efficient algorithms for electronic structure simulations of realistic material systems beyond the limitations of the current standard theories. In this talk, I will review recent progress in theoretical and computational tools, and in particular, discuss the development and validation of novel functionals within Density Functional Theory and of local basis representations for effective ab initio tight-binding schemes. Marco Buongiorno Nardelli is a pioneer in the development of computational platforms for theory/data/applications integration, rooted in his profound and extensive expertise in the design of electronic structure codes and in his vision for sustainable and innovative software development for high-performance materials simulations.
His research activities range from the design and discovery of novel materials for 21st-century applications in renewable energy, the environment, nano-electronics and devices, and the development of advanced electronic structure theories and high-throughput techniques in materials genomics and computational materials design, to an active role as a community scientific software developer (QUANTUM ESPRESSO, WanT, AFLOWpi).

  12. High throughput system for magnetic manipulation of cells, polymers, and biomaterials

    PubMed Central

    Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.

    2008-01-01

In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and a power-law dependence F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of SR2+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357

  13. Identification of functional modules using network topology and high-throughput data.

    PubMed

    Ulitsky, Igor; Shamir, Ron

    2007-01-26

With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in the analysis of high-throughput data.
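The authors' actual algorithms are more involved, but the core idea of growing connected sub-networks that manifest high pairwise similarity can be conveyed with a deliberately simplified greedy seed-and-extend sketch; the toy graph, similarity values, and acceptance threshold below are invented for illustration:

```python
def greedy_module(edges, threshold=0.7):
    """Grow a connected module from the highest-similarity edge.

    edges: dict mapping a (u, v) tuple to a similarity in [0, 1].
    A neighbor joins the module only while the average similarity of
    all intra-module edges stays at or above `threshold`.
    """
    sims = {frozenset(e): s for e, s in edges.items()}
    adj = {}
    for (u, v) in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    def avg_internal(nodes):
        vals = [s for e, s in sims.items() if e <= nodes]
        return sum(vals) / len(vals)

    module = set(max(edges, key=edges.get))        # seed: best edge
    while True:
        frontier = {n for m in module for n in adj[m]} - module
        best = max(
            ((avg_internal(module | {c}), c) for c in frontier),
            default=None,
        )
        if best is None or best[0] < threshold:
            return module
        module.add(best[1])

# Invented similarity network: a tight triangle A-B-C weakly tied to D-E
toy = {("A", "B"): 0.95, ("B", "C"): 0.85, ("A", "C"): 0.8,
       ("C", "D"): 0.1, ("D", "E"): 0.9}
module = greedy_module(toy)   # grows {A, B}, accepts C, rejects D
```

Real module-finding methods add statistical scoring, multiple seeds, and overlap handling; this sketch only illustrates the "connected sub-network with high internal similarity" objective described above.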

  14. Statistical Modeling of Single Target Cell Encapsulation

    PubMed Central

    Moon, SangJun; Ceyhan, Elvan; Gurkan, Umut Atakan; Demirci, Utkan

    2011-01-01

The use of high-throughput drop-on-demand systems for the separation and encapsulation of individual target cells from heterogeneous mixtures of multiple cell types is an emerging approach in biotechnology with broad applications in tissue engineering and regenerative medicine, genomics, and cryobiology. However, cell encapsulation in droplets is a random process that is hard to control. Statistical models can provide an understanding of the underlying processes and estimation of the relevant parameters, and enable reliable and repeatable control over the encapsulation of cells in droplets during the isolation process with a high confidence level. We have modeled and experimentally verified a microdroplet-based cell encapsulation process for various combinations of cell loading and target cell concentrations. Here, we explain theoretically and validate experimentally a model to isolate and pattern single target cells from heterogeneous mixtures without using complex peripheral systems. PMID:21814548
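The abstract does not reproduce the model's equations; the standard starting point for droplet encapsulation statistics, however, is Poisson loading, and a sketch of that baseline (with illustrative numbers, not the paper's fitted parameters) is:

```python
from math import exp, factorial

def p_k_cells(lam, k):
    """Poisson probability of exactly k cells in a droplet,
    given a mean loading of lam cells per droplet."""
    return exp(-lam) * lam ** k / factorial(k)

def p_single_target(lam, target_fraction):
    """Probability a droplet holds exactly one target cell and no
    other cells, for a heterogeneous mixture: the target count is
    Poisson(f*lam), the non-target count Poisson((1-f)*lam), so the
    joint probability reduces to f*lam*exp(-lam)."""
    f = target_fraction
    return (f * lam) * exp(-f * lam) * exp(-(1 - f) * lam)

# Mean loading lam = 1 maximizes the single-cell fraction at 1/e
p1 = p_k_cells(1.0, 1)
# With 10% target cells at the same loading, only f/e of droplets
# carry exactly one target cell and nothing else
pt = p_single_target(1.0, 0.10)
```

This illustrates why single-target isolation from dilute heterogeneous mixtures is statistically expensive and why a verified model of the loading process is needed to pick operating parameters with confidence.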

  15. Combinatorial Reactive Sputtering of In2S3 as an Alternative Contact Layer for Thin Film Solar Cells.

    PubMed

    Siol, Sebastian; Dhakal, Tara P; Gudavalli, Ganesh S; Rajbhandari, Pravakar P; DeHart, Clay; Baranowski, Lauryn L; Zakutayev, Andriy

    2016-06-08

High-throughput computational and experimental techniques have been used in the past to accelerate the discovery of promising new solar cell materials. An important part of the development of novel thin film solar cell technologies, one that is still considered a bottleneck for both theory and experiment, is the search for alternative interfacial contact (buffer) layers. The research and development of contact materials is difficult due to the inherent complexity that arises from their interactions at the interface with the absorber. A promising alternative to the commonly used CdS buffer layer in thin film solar cells whose absorbers have lower electron affinity can be found in β-In2S3. However, the synthesis conditions for the sputter deposition of this material are not well established. Here, In2S3 is investigated as a solar cell contact material through high-throughput combinatorial screening of the temperature-flux parameter space, followed by a number of spatially resolved characterization techniques. It is demonstrated that, by tuning the sulfur partial pressure, phase-pure β-In2S3 can be deposited over a broad range of substrate temperatures, from ambient temperature up to 500 °C. Combinatorial photovoltaic device libraries with the Al/ZnO/In2S3/Cu2ZnSnS4/Mo/SiO2 structure were built at optimal processing conditions to investigate the feasibility of the sputtered In2S3 buffer layers and of an accelerated optimization of the device structure. The performance of the resulting In2S3/Cu2ZnSnS4 photovoltaic devices is on par with CdS/Cu2ZnSnS4 reference solar cells, with similar short-circuit currents and open-circuit voltages, despite the overall quite low efficiency of the devices (∼2%). Overall, these results demonstrate how a high-throughput experimental approach can be used to accelerate the development of contact materials and facilitate the optimization of thin film solar cell devices.

  16. 'Enzyme Test Bench': A biochemical application of the multi-rate modeling

    NASA Astrophysics Data System (ADS)

    Rachinskiy, K.; Schultze, H.; Boy, M.; Büchs, J.

    2008-11-01

In the expanding field of 'white biotechnology', enzymes are frequently applied to catalyze biochemical reactions from a resource material to a valuable product. Evolved to catalyze metabolism in all life forms, they selectively accelerate complex reactions under physiological conditions. Modern techniques, such as directed evolution, have been developed to satisfy the increasing demand for enzymes. Applying these techniques together with rational protein design, we aim at improving enzyme activity, selectivity and stability. To tap the full potential of these techniques, it is essential to combine them with adequate screening methods. Nowadays a great number of colorimetric and fluorescent enzyme assays are available to measure initial enzyme activity with high throughput. However, predicting enzyme long-term stability from short experiments is still a challenge. A new high-throughput technique for enzyme characterization with specific attention to long-term stability, called the 'Enzyme Test Bench', is presented. The concept of the Enzyme Test Bench consists of short-term enzyme tests conducted under partly extreme conditions to predict enzyme long-term stability under moderate conditions. The technique is based on mathematical modeling of temperature-dependent enzyme activation and deactivation. By adapting the temperature profiles in sequential experiments through optimum non-linear experimental design, the long-term deactivation effects can be purposefully accelerated and detected within hours. During the experiment, the enzyme activity is measured online to estimate the model parameters from the obtained data. Thus, the enzyme activity and long-term stability can be calculated as a function of temperature. The results of the characterization, based on microliter-scale experiments lasting hours, are in good agreement with the results of long-term experiments at the 1-L scale.
Thus, the new technique allows for both enzyme screening with regard to long-term stability and the choice of the optimal process temperature. This article gives a successful example of the application of multi-rate modeling, experimental design and parameter estimation in biochemical engineering. At the same time, it shows the limitations of these methods at the state of the art and addresses the current problems to the applied mathematics community.
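The deactivation model is described above only in words; a minimal sketch of the underlying model class (first-order thermal deactivation with an Arrhenius-type rate constant, using invented parameter values rather than anything fitted by the Enzyme Test Bench) is:

```python
from math import exp, log

R = 8.314  # J/(mol*K), universal gas constant

def deactivation_rate(T, A_d, Ea_d):
    """Arrhenius-type first-order deactivation rate constant (1/h)
    at absolute temperature T; A_d and Ea_d are hypothetical."""
    return A_d * exp(-Ea_d / (R * T))

def residual_activity(t, T, A_d=1.0e31, Ea_d=2.0e5):
    """Fraction of activity left after t hours at T, assuming
    first-order decay a(t) = exp(-kd(T) * t)."""
    return exp(-deactivation_rate(T, A_d, Ea_d) * t)

def half_life(T, A_d=1.0e31, Ea_d=2.0e5):
    """Time (hours) at which activity falls to 50%."""
    return log(2.0) / deactivation_rate(T, A_d, Ea_d)

# Short, hot experiments constrain (A_d, Ea_d); the fitted model is
# then extrapolated to a moderate process temperature:
fast = half_life(333.15)   # 60 °C: deactivation observable within hours
slow = half_life(303.15)   # 30 °C: extrapolated long-term stability
```

The paper's optimum non-linear experimental design chooses the temperature profile so that these parameters are estimated with maximal information in minimal time; the sketch only shows why a model calibrated at high temperature can predict long-term stability at moderate temperature.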

  17. A high-throughput AO/PI-based cell concentration and viability detection method using the Celigo image cytometry.

    PubMed

    Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean

    2016-10-01

To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measurement of multiple samples and conditions, so current automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, the cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, greatly reducing the time required compared with single-sample automated cell counters. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.

  18. Towards High Throughput Cell Growth Screening: A New CMOS 8 × 8 Biosensor Array for Life Science Applications.

    PubMed

    Nabovati, Ghazal; Ghafar-Zadeh, Ebrahim; Letourneau, Antoine; Sawan, Mohamad

    2017-04-01

In this paper we present a CMOS capacitive sensor array as a compact and low-cost platform for high-throughput cell growth monitoring. The proposed biosensor consists of an array of 8 × 8 CMOS fully differential charge-based capacitive measurement sensors. A DC-input Σ∆ modulator converts the sensors' signals to digital values for reading out the biological/chemical data and further signal processing. To compensate for mismatch variations between the current-mirror transistors, a calibration circuit is proposed that removes the output voltage offset with less than 8.2% error. We validate the chip's functionality using various organic solvents with different dielectric constants. Moreover, we show the response of the chip to different concentrations of polystyrene beads, which have the same electrical properties as living cells. The experimental results show that the chip allows detection of a wide range of polystyrene bead concentrations, from as low as 10 beads/ml to 100 k beads/ml. In addition, we present experimental results from the H1299 (human lung carcinoma) cell line, where we show that the chip successfully detects cell attachment and growth over the capacitive electrodes in a 30 h measurement, with results consistent with standard cell-based assays. The capability of the proposed device for label-free, real-time detection of cell growth with very high sensitivity opens up the opportunity to use the device for rapid screening of living cells.

  19. NetGen: a novel network-based probabilistic generative model for gene set functional enrichment analysis.

    PubMed

    Sun, Duanchen; Liu, Yinliang; Zhang, Xiang-Sun; Wu, Ling-Yun

    2017-09-21

High-throughput experimental techniques have improved dramatically and have been widely applied in the past decades. However, biological interpretation of high-throughput experimental results, such as differentially expressed gene sets derived from microarray or RNA-seq experiments, is still a challenging task. Gene Ontology (GO) is commonly used in functional enrichment studies. The GO terms identified by current functional enrichment analysis tools often contain direct parent or descendant terms in the GO hierarchical structure. Such highly redundant terms make it difficult for users to analyze the underlying biological processes. In this paper, a novel network-based probabilistic generative model, NetGen, is proposed to perform functional enrichment analysis. An additional protein-protein interaction (PPI) network is explicitly used to assist the identification of significantly enriched GO terms. NetGen achieved superior performance over existing methods in simulation studies. The effectiveness of NetGen was further explored on four real datasets. Notably, several GO terms that were not directly linked with the active gene list for each disease were identified. These terms were closely related to the corresponding diseases when checked against the curated literature. NetGen has been implemented in the R package CopTea, publicly available at GitHub ( http://github.com/wulingyun/CopTea/ ). Our procedure leads to a more reasonable and interpretable result of functional enrichment analysis. As a novel term-combination-based functional enrichment analysis method, NetGen is complementary to current individual-term-based methods and can help to explore the underlying pathogenesis of complex diseases.
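NetGen itself is a network-based generative model, but the individual-term baseline such methods are compared against is the classical hypergeometric enrichment test, which can be sketched as follows (the gene counts below are invented for illustration):

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the chance of
    observing at least k annotated genes when an n-gene list is
    drawn from a universe of N genes, K of which carry the GO term.
    P(X = i) = C(K, i) * C(N - K, n - i) / C(N, n)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# 20-gene universe, 5 genes annotated with the term, and a 5-gene
# differential-expression list that contains 3 of them:
p = hypergeom_enrichment_p(20, 5, 5, 3)
```

Testing each GO term independently this way is what produces the redundant parent/descendant hits mentioned above, since a gene annotated to a term is also annotated to all its ancestors; term-combination methods like NetGen are designed to avoid exactly that.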

20. An automated maze task for assessing hippocampus-sensitive memory in mice

    PubMed Central

    Pioli, Elsa Y.; Gaskill, Brianna N.; Gilmour, Gary; Tricklebank, Mark D.; Dix, Sophie L.; Bannerman, David; Garner, Joseph P.

    2014-01-01

    Memory deficits associated with hippocampal dysfunction are a key feature of a number of neurodegenerative and psychiatric disorders. The discrete-trial rewarded alternation T-maze task is highly sensitive to hippocampal dysfunction. Normal mice have spontaneously high levels of alternation, whereas hippocampal-lesioned mice are dramatically impaired. However, this is a hand-run task and handling has been shown to impact crucially on behavioural responses, as well as being labour-intensive and therefore unsuitable for high-throughput studies. To overcome this, a fully automated maze was designed. The maze was attached to the mouse's home cage and the subject earned all of its food by running through the maze. In this study the hippocampal dependence of rewarded alternation in the automated maze was assessed. Bilateral hippocampal-lesioned mice were assessed in the standard, hand-run, discrete-trial rewarded alternation paradigm and in the automated paradigm, according to a cross-over design. A similarly robust lesion effect on alternation performance was found in both mazes, confirming the sensitivity of the automated maze to hippocampal lesions. Moreover, the performance of the animals in the automated maze was not affected by their handling history whereas performance in the hand-run maze was affected by prior testing history. By having more stable performance and by decreasing human contact the automated maze may offer opportunities to reduce extraneous experimental variation and therefore increase the reproducibility within and/or between laboratories. Furthermore, automation potentially allows for greater experimental throughput and hence suitability for use in assessment of cognitive function in drug discovery. PMID:24333574

  1. Efficient Modeling and Active Learning Discovery of Biological Responses

    PubMed Central

    Naik, Armaghan W.; Kangas, Joshua D.; Langmead, Christopher J.; Murphy, Robert F.

    2013-01-01

High-throughput and high-content screening involve determining the effect of many compounds on a given target. As currently practiced, screening for each new target typically makes little use of information from screens of prior targets. Further, choices of compounds to advance to drug development are made without significant screening against off-target effects. The overall drug development process could be made more effective, as well as less expensive and time consuming, if the potential effects of all compounds on all possible targets could be considered, yet the cost of such full experimentation would be prohibitive. In this paper, we describe a potential solution: probabilistic models that can be used to predict results for unmeasured combinations, together with active learning algorithms for efficiently selecting which experiments to perform in order to build those models and for determining when to stop. Using simulated and experimental data, we show that our approaches can produce powerful predictive models without exhaustive experimentation and can learn them much faster than by selecting experiments at random. PMID:24358322
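The paper's models and acquisition rules are far more sophisticated, but the flavor of active experiment selection over a compound-by-target response matrix can be conveyed with a toy sketch, choosing the next experiment where the model is least constrained; all names and data below are fabricated for illustration:

```python
def next_experiment(observed, compounds, targets):
    """Pick the unmeasured (compound, target) pair whose row and
    column have the fewest measurements -- a simple proxy for
    predictive uncertainty in a matrix-completion setting.

    observed: dict mapping (compound, target) -> measured response.
    """
    row_n = {c: 0 for c in compounds}
    col_n = {t: 0 for t in targets}
    for (c, t) in observed:
        row_n[c] += 1
        col_n[t] += 1
    unmeasured = [(c, t) for c in compounds for t in targets
                  if (c, t) not in observed]
    # fewest supporting measurements = most uncertain prediction;
    # ties broken lexicographically for determinism
    return min(unmeasured, key=lambda ct: (row_n[ct[0]] + col_n[ct[1]], ct))

compounds = ["c1", "c2", "c3"]
targets = ["t1", "t2"]
observed = {("c1", "t1"): 0.8, ("c1", "t2"): 0.1, ("c2", "t1"): 0.5}
choice = next_experiment(observed, compounds, targets)  # least-sampled pair
```

A real active learner would rank candidates by model-based uncertainty or expected information gain rather than raw measurement counts, but the loop structure (fit model, pick most informative unmeasured experiment, measure, repeat, stop when predictions stabilize) is the one the abstract describes.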

  2. Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y; Grossman, JC

    2014-12-01

Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy-density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.

  3. 40 CFR Table 3 to Subpart Eeee of... - Operating Limits-High Throughput Transfer Racks

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Operating Limits-High Throughput Transfer Racks 3 Table 3 to Subpart EEEE of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION... Throughput Transfer Racks As stated in § 63.2346(e), you must comply with the operating limits for existing...

  4. Experimental dynamic characterizations and modelling of disk vibrations for HDDs.

    PubMed

    Pang, Chee Khiang; Ong, Eng Hong; Guo, Guoxiao; Qian, Hua

    2008-01-01

Currently, the rotational speed of spindle motors in HDDs (Hard-Disk Drives) is increasing to improve data throughput and decrease rotational latency for ultra-high data transfer rates. However, the disk platters are excited to vibrate at their natural frequencies by higher air-flow excitation as well as by eccentricities and imbalances in the disk-spindle assembly. These factors contribute directly to TMR (Track Mis-Registration), which limits the achievable recording density essential for future mobile HDDs. In this paper, the natural mode shapes of an annular disk mounted on a spindle motor used in current HDDs are characterized using FEM (Finite Element Method) analysis and verified with SLDV (Scanning Laser Doppler Vibrometer) measurements. The identified vibration frequencies and amplitudes of the disk ODS (Operating Deflection Shapes) at the corresponding disk mode shapes are modelled as repeatable disturbance components for servo compensation in HDDs. Our experimental results show that the SLDV measurements are accurate in capturing static disk mode shapes without the need for intricate air-flow aero-elastic models, and the proposed disk ODS vibration model correlates well with experimental measurements from an LDV.

  5. Physical descriptions of the bacterial nucleoid at large scales, and their biological implications

    NASA Astrophysics Data System (ADS)

    Benza, Vincenzo G.; Bassetti, Bruno; Dorfman, Kevin D.; Scolari, Vittore F.; Bromek, Krystyna; Cicuta, Pietro; Cosentino Lagomarsino, Marco

    2012-07-01

Recent experimental and theoretical approaches have attempted to quantify the physical organization (compaction and geometry) of the bacterial chromosome with its complement of proteins (the nucleoid). The genomic DNA exists in a complex and dynamic protein-rich state, which is highly organized at various length scales. This has implications for modulating (when not directly enabling) the core biological processes of replication, transcription and segregation. We review the progress in this area, driven in the last few years by new scientific ideas and new interdisciplinary experimental techniques, ranging from high space- and time-resolution microscopy to high-throughput genomics employing sequencing to map different aspects of the nucleoid-related interactome. The aim of this review is to present the wide spectrum of experimental and theoretical findings coherently, from a physics viewpoint. In particular, we highlight the role that statistical and soft condensed matter physics play in describing this system of fundamental biological importance, specifically reviewing classic and more modern tools from the theory of polymers. We also discuss some attempts toward unifying interpretations of the current results, pointing to possible directions for future investigation.

  6. Standing wave design and experimental validation of a tandem simulated moving bed process for insulin purification.

    PubMed

    Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda

    2002-01-01

    A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.

  7. High-speed laser microsurgery of alert fruit flies for fluorescence imaging of neural activity

    PubMed Central

    Sinha, Supriyo; Liang, Liang; Ho, Eric T. W.; Urbanek, Karel E.; Luo, Liqun; Baer, Thomas M.; Schnitzer, Mark J.

    2013-01-01

    Intravital microscopy is a key means of monitoring cellular function in live organisms, but surgical preparation of a live animal for microscopy often is time-consuming, requires considerable skill, and limits experimental throughput. Here we introduce a spatially precise (<1-µm edge precision), high-speed (<1 s), largely automated, and economical protocol for microsurgical preparation of live animals for optical imaging. Using a 193-nm pulsed excimer laser and the fruit fly as a model, we created observation windows (12- to 350-µm diameters) in the exoskeleton. Through these windows we used two-photon microscopy to image odor-evoked Ca2+ signaling in projection neuron dendrites of the antennal lobe and Kenyon cells of the mushroom body. The impact of a laser-cut window on fly health appears to be substantially less than that of conventional manual dissection, for our imaging durations of up to 18 h were ∼5–20 times longer than prior in vivo microscopy studies of hand-dissected flies. This improvement will facilitate studies of numerous questions in neuroscience, such as those regarding neuronal plasticity or learning and memory. As a control, we used phototaxis as an exemplary complex behavior in flies and found that laser microsurgery is sufficiently gentle to leave it intact. To demonstrate that our techniques are applicable to other species, we created microsurgical openings in nematodes, ants, and the mouse cranium. In conjunction with emerging robotic methods for handling and mounting flies or other small organisms, our rapid, precisely controllable, and highly repeatable microsurgical techniques should enable automated, high-throughput preparation of live animals for optical experimentation. PMID:24167298

  8. Fully-automated, high-throughput micro-computed tomography analysis of body composition enables therapeutic efficacy monitoring in preclinical models.

    PubMed

    Wyatt, S K; Barck, K H; Kates, L; Zavala-Solorio, J; Ross, J; Kolumam, G; Sonoda, J; Carano, R A D

    2015-11-01

The ability to non-invasively measure body composition in mouse models of obesity and obesity-related disorders is essential for elucidating mechanisms of metabolic regulation and monitoring the effects of novel treatments. These studies aimed to develop a fully automated, high-throughput micro-computed tomography (micro-CT)-based image analysis technique for longitudinal quantitation of adipose, non-adipose and lean tissue as well as bone, and to demonstrate its utility for assessing the effects of two distinct treatments. An initial validation study was performed in diet-induced obesity (DIO) and control mice on a vivaCT 75 micro-CT system. Subsequently, four groups of DIO mice were imaged pre- and post-treatment with an experimental agonistic antibody against fibroblast growth factor receptor 1 (anti-FGFR1; R1MAb1), a control immunoglobulin G antibody, a known anorectic antiobesity drug (rimonabant, SR141716), or solvent control. The body composition analysis technique was then ported to a faster micro-CT system (CT120) to markedly increase throughput and to evaluate the use of micro-CT image intensity as a measure of hepatic lipid content in DIO and control mice. Ex vivo chemical analysis and colorimetric analysis of liver triglycerides served as the standard metrics for correlation with body composition and hepatic lipid status, respectively. Micro-CT-based body composition measures correlate with ex vivo chemical analysis metrics and enable distinction between DIO and control mice. R1MAb1 and rimonabant have differing effects on body composition as assessed by micro-CT. High-throughput body composition imaging is possible using a modified CT120 system. Micro-CT also provides a non-invasive assessment of hepatic lipid content. This work describes, validates and demonstrates the utility of a fully automated image analysis technique to quantify in vivo micro-CT-derived measures of adipose, non-adipose and lean tissue, as well as bone. These body composition metrics correlate highly with standard ex vivo chemical analysis and enable longitudinal evaluation of body composition and therapeutic efficacy monitoring.

  9. From Lab to Fab: Developing a Nanoscale Delivery Tool for Scalable Nanomanufacturing

    NASA Astrophysics Data System (ADS)

    Safi, Asmahan A.

The emergence of nanomaterials with unique properties at the nanoscale over the past two decades carries the capacity to impact society and to transform or create new industries ranging from nanoelectronics to nanomedicine. However, a gap in nanomanufacturing technologies has prevented the translation of nanomaterials into real-world commercialized products. Bridging this gap requires a paradigm shift in methods for fabricating structured devices with nanoscale resolution in a repeatable fashion. This thesis explores new paradigms for fabricating nanoscale structures, devices, and systems for high-throughput, high-registration applications. We present a robust and scalable nanoscale delivery platform, the Nanofountain Probe (NFP), for parallel direct-write of functional materials. The design and microfabrication of the NFP are presented. The new generation addresses the challenges of throughput, resolution, and ink replenishment that characterize tip-based nanomanufacturing. To achieve these goals, an optimized probe geometry is integrated into the process along with channel sealing and cantilever bending. The capabilities of the newly fabricated probes are demonstrated through two types of delivery: protein nanopatterning and single-cell nanoinjection. The broad applications of the NFP for single-cell delivery are investigated. An external microfluidic packaging is developed to enable delivery in a liquid environment. The system is integrated with a combined atomic force microscope and inverted fluorescence microscope. Intracellular delivery is demonstrated by injecting a fluorescent dextran into HeLa cells in vitro while monitoring the injection forces. Such developments enable in vitro cellular delivery for single-cell studies and high-throughput gene expression. The nanomanufacturing capabilities of NFPs are then explored. Nanofabrication of carbon nanotube-based electronics presents all the manufacturing challenges characteristic of assembling nanomaterials precisely onto devices.
The presented study combines top-down and bottom-up approaches by integrating catalyst patterning and carbon nanotube growth directly on structures. Large arrays of iron-rich catalyst are patterned on a substrate for subsequent carbon nanotube synthesis. The dependence of patterning on probe geometry and substrate wetting is assessed by modeling and experimental studies. Finally, preliminary results on the synthesis of carbon nanotubes by catalyst-assisted chemical vapor deposition suggest that increasing the catalyst yield is critical. This work will enable high-throughput nanomanufacturing of carbon nanotube-based devices.

  10. Model for High-Throughput Screening of Multitarget Drugs in Chemical Neurosciences: Synthesis, Assay, and Theoretic Study of Rasagiline Carbamates

    PubMed Central

    2013-01-01

    The disappointing results obtained in recent clinical trials renew the interest in experimental/computational techniques for the discovery of neuroprotective drugs. In this context, multitarget or multiplexing QSAR models (mt-QSAR/mx-QSAR) may help to predict neurotoxicity/neuroprotective effects of drugs in multiple assays, on drug targets, and in model organisms. In this work, we study a data set downloaded from CHEMBL; each data point (>8000) contains the values of one out of 37 possible measures of activity, 493 assays, 169 molecular or cellular targets, and 11 different organisms (including human) for a given compound. In this work, we introduce the first mx-QSAR model for neurotoxicity/neuroprotective effects of drugs based on the MARCH-INSIDE (MI) method. First, we used MI to calculate the stochastic spectral moments (structural descriptors) of all compounds. Next, we found a model that classified correctly 2955 out of 3548 total cases in the training and validation series with Accuracy, Sensitivity, and Specificity values > 80%. The model also showed excellent results in Computational-Chemistry simulations of High-Throughput Screening (CCHTS) experiments, with accuracy = 90.6% for 4671 positive cases. Next, we reported the synthesis, characterization, and experimental assays of new rasagiline derivatives. We carried out three different experimental tests: assay (1) in the absence of neurotoxic agents, assay (2) in the presence of glutamate, and assay (3) in the presence of H2O2. Compounds 11 with 27.4%, 8 with 11.6%, and 9 with 15.4% showed the highest neuroprotective effects in assays (1), (2), and (3), respectively. After that, we used the mx-QSAR model to carry out a CCHTS of the new compounds in >400 unique pharmacological tests not carried out experimentally. Consequently, this model may become a promising auxiliary tool for the discovery of new drugs for the treatment of neurodegenerative diseases. PMID:23855599
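    The classification figures reported above follow directly from confusion-matrix counts. As a minimal sketch (the TP/TN/FP/FN split below is hypothetical, chosen only so the totals match the 2955-of-3548 correct cases stated in the abstract), accuracy, sensitivity, and specificity can be computed as:

```python
def classification_metrics(tp, tn, fp, fn):
    """Standard confusion-matrix metrics used to report QSAR model quality."""
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total
    sensitivity = tp / (tp + fn)   # fraction of actives correctly predicted
    specificity = tn / (tn + fp)   # fraction of inactives correctly predicted
    return accuracy, sensitivity, specificity

# Hypothetical split consistent with 2955 correct out of 3548 total cases.
acc, sen, spe = classification_metrics(tp=1450, tn=1505, fp=300, fn=293)
print(f"accuracy={acc:.3f} sensitivity={sen:.3f} specificity={spe:.3f}")
```

With this split all three metrics exceed 0.80, matching the ">80%" claim in the abstract.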

  11. Human in vitro 3D co-culture model to engineer vascularized bone-mimicking tissues combining computational tools and statistical experimental approach.

    PubMed

    Bersini, Simone; Gilardi, Mara; Arrigoni, Chiara; Talò, Giuseppe; Zamai, Moreno; Zagra, Luigi; Caiolfa, Valeria; Moretti, Matteo

    2016-01-01

    The generation of functional, vascularized tissues is a key challenge for both tissue engineering applications and the development of advanced in vitro models analyzing interactions among circulating cells, endothelium and organ-specific microenvironments. Since vascularization is a complex process guided by multiple synergic factors, it is critical to analyze the specific role that different experimental parameters play in the generation of physiological tissues. Our goals were to design a novel meso-scale model bridging the gap between microfluidic and macro-scale studies, and to screen in high throughput the effects of multiple variables on the vascularization of bone-mimicking tissues. We investigated the influence of endothelial cell (EC) density (3-5 Mcells/ml), cell ratio among ECs, mesenchymal stem cells (MSCs) and osteo-differentiated MSCs (1:1:0, 10:1:0, 10:1:1), culture medium (endothelial, endothelial + angiopoietin-1, 1:1 endothelial/osteo), hydrogel type (100% fibrin, 60% fibrin + 40% collagen), and tissue geometry (2 × 2 × 2 and 2 × 2 × 5 mm³). We optimized the geometry and oxygen gradient inside hydrogels through computational simulations and we analyzed microvascular network features including total network length/area and vascular branch number/length. In particular, we employed the "Design of Experiment" statistical approach to identify key differences among experimental conditions. We combined the generation of 3D functional tissue units with fine control over the local microenvironment (e.g. oxygen gradients), and developed an effective strategy to enable the high-throughput screening of multiple experimental parameters. Our approach allowed us to identify synergic correlations among critical parameters driving microvascular network development within a bone-mimicking environment and could be translated to any vascularized tissue. Copyright © 2015 Elsevier Ltd. All rights reserved.
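    The factor levels listed in the abstract define a full-factorial design space, which Design-of-Experiment methods then reduce to a statistically efficient subset. A minimal sketch of the enumeration step (factor names abbreviated; the level lists are taken from the abstract, the dictionary layout is our own):

```python
from itertools import product

# Factor levels as listed in the study; enumerating the Cartesian product
# gives every candidate experimental condition before DoE reduction.
factors = {
    "EC_density_Mcells_per_ml": [3, 4, 5],
    "EC_MSC_osteoMSC_ratio": ["1:1:0", "10:1:0", "10:1:1"],
    "medium": ["endothelial", "endothelial+Ang-1", "1:1 endo/osteo"],
    "hydrogel": ["100% fibrin", "60% fibrin + 40% collagen"],
    "geometry_mm3": ["2x2x2", "2x2x5"],
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # 3 * 3 * 3 * 2 * 2 = 108 candidate conditions
```

The 108-run full factorial is exactly what makes a screening strategy (rather than exhaustive testing) necessary.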

  12. Compound Transfer by Acoustic Droplet Ejection Promotes Quality and Efficiency in Ultra-High-Throughput Screening Campaigns.

    PubMed

    Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H

    2016-02-01

    Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.

  13. An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery

    PubMed Central

    Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing

    2010-01-01

    The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897

  14. Wireless Coexistence and EMC of Bluetooth and 802.11b Devices in Controlled Laboratory Settings

    PubMed Central

    Seidman, Seth; Kainz, Wolfgang; Ruggera, Paul; Mendoza, Gonzalo

    2011-01-01

    This paper presents experimental testing that has been performed on wireless communication devices as victims of electromagnetic interference (EMI). Wireless victims included universal serial bus (USB) network adapters and personal digital assistants (PDAs) equipped with IEEE 802.11b and Bluetooth technologies. The experimental data in this paper was gathered in an anechoic chamber and a gigahertz transverse electromagnetic (GTEM) cell to ensure reliable and repeatable results. This testing includes: Electromagnetic compatibility (EMC) testing performed in accordance with IEC 60601-1-2, an in-band sweep of EMC testing, and coexistence testing. The tests in this study show that a Bluetooth communication was able to coexist with other Bluetooth devices with no decrease in throughput and no communication breakdowns. However, testing revealed a significant decrease in throughput and increase in communication breakdowns when an 802.11b source is near an 802.11b victim. In a hospital setting decreased throughput and communication breakdowns can cause wireless medical devices to fail. It is therefore vital to have an understanding of the effect EMI can have on wireless communication devices. PMID:22043254

  16. All-passive pixel super-resolution of time-stretch imaging

    PubMed Central

    Chan, Antony C. S.; Ng, Ho-Cheung; Bogaraju, Sharat C. V.; So, Hayden K. H.; Lam, Edmund Y.; Tsia, Kevin K.

    2017-01-01

    Based on image encoding in a serial-temporal format, optical time-stretch imaging entails a stringent requirement for a state-of-the-art fast data acquisition unit in order to preserve high image resolution at an ultrahigh frame rate, hampering the widespread utility of the technology. Here, we propose a pixel super-resolution (pixel-SR) technique tailored for time-stretch imaging that preserves pixel resolution at a relaxed sampling rate. It harnesses the subpixel shifts between image frames inherently introduced by asynchronous digital sampling of the continuous time-stretch imaging process. Precise pixel registration is thus accomplished without any active opto-mechanical subpixel-shift control or other additional hardware. We present an experimental pixel-SR image reconstruction pipeline that restores high-resolution time-stretch images of microparticles and biological cells (phytoplankton) at a relaxed sampling rate (≈2–5 GSa/s), more than four times lower than the originally required readout rate (20 GSa/s), and is thus effective for high-throughput, label-free, morphology-based cellular classification down to single-cell precision. Upon integration with high-throughput image processing technology, this pixel-SR time-stretch imaging technique represents a cost-effective and practical solution for large-scale cell-based phenotypic screening in biomedical diagnosis and machine vision for quality control in manufacturing. PMID:28303936
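    The core idea, interleaving several low-rate acquisitions whose sub-pixel offsets are known, can be sketched in one dimension. This is a deliberately minimal sketch with hand-picked integer offsets and made-up sample values; the actual pipeline estimates fractional offsets from the asynchronous sampling clock and interpolates:

```python
# Two low-rate samplings of the same line, offset by a known sub-pixel
# phase, interleave into one grid with finer effective pixel spacing.
def interleave(frames, offsets, upsample):
    n = len(frames[0]) * upsample
    fine = [None] * n
    for frame, off in zip(frames, offsets):
        for i, v in enumerate(frame):
            fine[i * upsample + off] = v  # place sample on the fine grid
    return fine

frames = [[10, 20, 30], [12, 22, 32]]   # two low-resolution acquisitions
fine = interleave(frames, offsets=[0, 1], upsample=2)
print(fine)  # [10, 12, 20, 22, 30, 32]
```

Each frame alone has 3 pixels; combined, the effective resolution doubles without any extra acquisition hardware.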

  17. High-throughput screening for thermoelectric sulphides by using crystal structure features as descriptors

    NASA Astrophysics Data System (ADS)

    Zhang, Ruizhi; Du, Baoli; Chen, Kan; Reece, Mike; Materials Research Institute Team

    With increasing computational power and reliable databases, high-throughput screening is playing an ever more important role in the search for new thermoelectric materials. Rather than the well-established density functional theory (DFT) calculation based methods, we propose an alternative approach to screen for new TE materials: using crystal structure features as 'descriptors'. We show that a non-distorted transition metal sulphide polyhedral network can be a good descriptor for a high power factor according to crystal field theory. Using Cu/S-containing compounds as an example, more than 1600 Cu/S-containing entries in the Inorganic Crystal Structure Database (ICSD) were screened, and of those, 84 phases were identified as promising thermoelectric materials. The screening results are validated by both electronic structure calculations and experimental results from the literature. We also fabricated some new compounds to test our screening results. Another advantage of using crystal structure features as descriptors is that structural relationships between the identified phases can easily be established. Based on this, two material design approaches are discussed: 1) high-pressure synthesis of metastable phases; 2) in-situ two-phase composites with coherent interfaces. This work was supported by a Marie Curie International Incoming Fellowship of the European Community Human Potential Program.
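    Descriptor-based screening of a structure database reduces, computationally, to filtering records on a structural feature. The sketch below is hypothetical: the entry records, the "distortion index" field, and the cutoff are all illustrative stand-ins for the paper's actual non-distorted-polyhedral-network criterion:

```python
# Illustrative descriptor filter: keep entries whose polyhedral distortion
# index is below a cutoff. Formulas are real Cu/S compounds, but the
# distortion values and cutoff are made up for demonstration.
entries = [
    {"formula": "Cu12Sb4S13", "distortion_index": 0.02},
    {"formula": "CuFeS2",     "distortion_index": 0.01},
    {"formula": "Cu3SbS4",    "distortion_index": 0.15},
]

CUTOFF = 0.05
hits = [e["formula"] for e in entries if e["distortion_index"] < CUTOFF]
print(hits)  # ['Cu12Sb4S13', 'CuFeS2']
```

Applied to 1600+ ICSD entries, a filter of this shape is what narrows the field to the 84 candidate phases mentioned above.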

  18. CyTOF workflow: differential discovery in high-throughput high-dimensional cytometry datasets

    PubMed Central

    Nowicka, Malgorzata; Krieg, Carsten; Weber, Lukas M.; Hartmann, Felix J.; Guglietta, Silvia; Becher, Burkhard; Levesque, Mitchell P.; Robinson, Mark D.

    2017-01-01

    High-dimensional mass and flow cytometry (HDCyto) experiments have become a method of choice for high-throughput interrogation and characterization of cell populations. Here, we present an R-based pipeline for differential analyses of HDCyto data, largely based on Bioconductor packages. We computationally define cell populations using FlowSOM clustering, and facilitate an optional but reproducible strategy for manual merging of algorithm-generated clusters. Our workflow offers different analysis paths, including association of cell type abundance with a phenotype or changes in signaling markers within specific subpopulations, or differential analyses of aggregated signals. Importantly, the differential analyses we show are based on regression frameworks where the HDCyto data is the response; thus, we are able to model arbitrary experimental designs, such as those with batch effects, paired designs and so on. In particular, we apply generalized linear mixed models to analyses of cell population abundance or cell-population-specific analyses of signaling markers, allowing overdispersion in cell count or aggregated signals across samples to be appropriately modeled. To support the formal statistical analyses, we encourage exploratory data analysis at every step, including quality control (e.g. multi-dimensional scaling plots), reporting of clustering results (dimensionality reduction, heatmaps with dendrograms) and differential analyses (e.g. plots of aggregated signals). PMID:28663787

  19. An extensible framework for capturing solvent effects in computer generated kinetic models.

    PubMed

    Jalan, Amrit; West, Richard H; Green, William H

    2013-03-14

    Detailed kinetic models provide useful mechanistic insight into a chemical system. Manual construction of such models is laborious and error-prone, which has led to the development of automated methods for exploring chemical pathways. These methods rely on fast, high-throughput estimation of species thermochemistry and kinetic parameters. In this paper, we present a methodology for extending automatic mechanism generation to solution phase systems which requires estimation of solvent effects on reaction rates and equilibria. The linear solvation energy relationship (LSER) method of Abraham and co-workers is combined with Mintz correlations to estimate ΔG(solv)°(T) in over 30 solvents using solute descriptors estimated from group additivity. Simple corrections are found to be adequate for the treatment of radical sites, as suggested by comparison with known experimental data. The performance of scaled particle theory expressions for enthalpic-entropic decomposition of ΔG(solv)°(T) is also presented along with the associated computational issues. Similar high-throughput methods for solvent effects on free-radical kinetics are only available for a handful of reactions due to lack of reliable experimental data, and continuum dielectric calculations offer an alternative method for their estimation. For illustration, we model liquid phase oxidation of tetralin in different solvents computing the solvent dependence for ROO• + ROO• and ROO• + solvent reactions using polarizable continuum quantum chemistry methods. The resulting kinetic models show an increase in oxidation rate with solvent polarity, consistent with experiment. Further work needed to make this approach more generally useful is outlined.
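    The Abraham LSER mentioned above expresses a solvation property as a dot product of solute descriptors with solvent-specific coefficients plus an intercept. A minimal sketch of the functional form (the coefficient and descriptor values below are illustrative placeholders, not fitted values from the paper or from Abraham's compilations):

```python
# Abraham-type linear solvation energy relationship (LSER):
#   log K = c + e*E + s*S + a*A + b*B + v*V
# where (E, S, A, B, V) are solute descriptors and (c, e, s, a, b, v)
# are solvent-specific regression coefficients.
def lser(coeffs, solute):
    c, e, s, a, b, v = coeffs
    E, S, A, B, V = solute
    return c + e * E + s * S + a * A + b * B + v * V

solvent_coeffs = (-0.99, 0.58, 2.55, 3.81, 4.84, -0.87)  # illustrative
solute_desc = (0.60, 0.52, 0.0, 0.14, 0.86)              # illustrative
print(round(lser(solvent_coeffs, solute_desc), 3))
```

In the paper's workflow, the solute descriptors themselves are estimated by group additivity, so the whole chain stays fast enough for automated mechanism generation.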

  20. Analysis of sequencing data for probing RNA secondary structures and protein-RNA binding in studying posttranscriptional regulations.

    PubMed

    Hu, Xihao; Wu, Yang; Lu, Zhi John; Yip, Kevin Y

    2016-11-01

    High-throughput sequencing has been used to study posttranscriptional regulations, where the identification of protein-RNA binding is a major and fast-developing sub-area, which in turn benefits from sequencing methods for whole-transcriptome probing of RNA secondary structures. In the study of RNA secondary structures using high-throughput sequencing, bases are modified or cleaved according to their structural features, which alters the resulting composition of sequencing reads. In the study of protein-RNA binding, methods have been proposed to immunoprecipitate (IP) protein-bound RNA transcripts in vitro or in vivo. By sequencing these transcripts, the protein-RNA interactions and the binding locations can be identified. For both types of data, read counts are affected by a combination of confounding factors, including expression levels of transcripts, sequence biases, mapping errors and the probing or IP efficiency of the experimental protocols. Careful processing of the sequencing data and proper extraction of important features are fundamentally important to a successful analysis. Here we review and compare different experimental methods for probing RNA secondary structures and binding sites of RNA-binding proteins (RBPs), and the computational methods proposed for analyzing the corresponding sequencing data. We suggest how these two types of data should be integrated to study the structural properties of RBP binding sites as a systematic way to better understand posttranscriptional regulations. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
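    A first-pass treatment of the confounders mentioned above is depth normalization followed by a treated-vs-control comparison per base. This is a deliberately simplified sketch with made-up counts and pseudocount; real pipelines add further corrections for sequence bias and mapping errors, as the text notes:

```python
from math import log2

# Per-base reactivity sketch for structure-probing data: scale each sample
# to counts-per-million to remove depth differences, then take the log
# ratio of treated over control with a pseudocount for stability.
def reactivity(treated, control, pseudo=1.0):
    t_depth, c_depth = sum(treated), sum(control)
    scores = []
    for t, c in zip(treated, control):
        t_cpm = t / t_depth * 1e6
        c_cpm = c / c_depth * 1e6
        scores.append(log2((t_cpm + pseudo) / (c_cpm + pseudo)))
    return scores

scores = reactivity(treated=[50, 5, 200, 10], control=[40, 40, 40, 40])
# Index 2, the base with the largest treated/control enrichment, scores
# highest, i.e. it looks most accessible to the probing reagent.
```

The same log-ratio skeleton, with IP versus input libraries, also underlies peak calling for protein-RNA binding data.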

  1. High-throughput metabolic profiling of diverse green Coffea arabica beans identified tryptophan as a universal discrimination factor for immature beans.

    PubMed

    Setoyama, Daiki; Iwasa, Keiko; Seta, Harumichi; Shimizu, Hiroaki; Fujimura, Yoshinori; Miura, Daisuke; Wariishi, Hiroyuki; Nagai, Chifumi; Nakahara, Koichi

    2013-01-01

    The maturity of green coffee beans is the most influential determinant of the quality and flavor of the resultant coffee beverage. However, the chemical compounds that can be used to discriminate the maturity of the beans remain uncharacterized. We herein analyzed four distinct stages of maturity (immature, semi-mature, mature and overripe) of nine different varieties of green Coffea arabica beans hand-harvested from a single experimental field in Hawaii. After developing a high-throughput experimental system for sample preparation and liquid chromatography-mass spectrometry (LC-MS) measurement, we applied metabolic profiling, integrated with chemometric techniques, to explore the relationship between the metabolome and maturity of the sample in a non-biased way. For the multivariate statistical analyses, a partial least squares (PLS) regression model was successfully created, which allowed us to accurately predict the maturity of the beans based on the metabolomic information. As a result, tryptophan was identified as the best contributor to the regression model; the relative MS intensity of tryptophan was higher in immature beans than in those after the semi-mature stages in all arabica varieties investigated, demonstrating a universal discrimination factor for diverse arabica beans. Therefore, tryptophan, either alone or together with other metabolites, may be used by traders as an assessment standard when purchasing green arabica bean products. Furthermore, our results suggest that tryptophan metabolism may be tightly linked to the development of coffee cherries and/or beans.

  2. Design of a high-throughput human neural crest cell migration assay to indicate potential developmental toxicants.

    PubMed

    Nyffeler, Johanna; Karreman, Christiaan; Leisner, Heidrun; Kim, Yong Jun; Lee, Gabsang; Waldmann, Tanja; Leist, Marcel

    2017-01-01

    Migration of neural crest cells (NCCs) is one of the pivotal processes of human fetal development. Malformations arise if NCC migration and differentiation are impaired genetically or by toxicants. In the currently available test systems for migration inhibition of NCC (MINC), the manual generation of a cell-free space results in extreme operator dependencies, and limits throughput. Here a new test format was established. The assay avoids scratching by plating cells around a commercially available circular stopper. Removal of the stopper barrier after cell attachment initiates migration. This microwell-based circular migration zone NCC function assay (cMINC) was further optimized for toxicological testing of human pluripotent stem cell (hPSC)-derived NCCs. The challenge of obtaining data on viability and migration by automated image processing was addressed by developing a freeware. Data on cell proliferation were obtained by labelling replicating cells, and by careful assessment of cell viability for each experimental sample. The role of cell proliferation as an experimental confounder was tested experimentally by performing the cMINC in the presence of the proliferation-inhibiting drug cytosine arabinoside (AraC), and by a careful evaluation of mitotic events over time. Data from these studies led to an adaptation of the test protocol, so that toxicant exposure was limited to 24 h. Under these conditions, a prediction model was developed that allows classification of toxicants as either inactive, leading to unspecific cytotoxicity, or specifically inhibiting NC migration at non-cytotoxic concentrations.
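    The automated image-processing step of such a circular migration zone assay reduces, at its core, to counting detected cell centroids inside the initially cell-free circle. A toy sketch (coordinates, centre, and radius are illustrative; the actual freeware also handles segmentation and viability):

```python
from math import hypot

# Count detected cell centroids that have entered the circular
# migration zone left behind by the removed stopper.
def cells_in_zone(centroids, centre, radius):
    cx, cy = centre
    return sum(1 for (x, y) in centroids if hypot(x - cx, y - cy) <= radius)

cells = [(100, 100), (210, 205), (190, 195), (350, 60)]  # example centroids
migrated = cells_in_zone(cells, centre=(200, 200), radius=50)
print(migrated)  # 2 cells inside the zone
```

Comparing this count between treated and control wells, after the viability check described above, is what yields the migration-inhibition readout.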

  3. High-throughput measurements of the optical redox ratio using a commercial microplate reader.

    PubMed

    Cannon, Taylor M; Shah, Amy T; Walsh, Alex J; Skala, Melissa C

    2015-01-01

    There is a need for accurate, high-throughput, functional measures to gauge the efficacy of potential drugs in living cells. As an early marker of drug response in cells, cellular metabolism provides an attractive platform for high-throughput drug testing. Optical techniques can noninvasively monitor NADH and FAD, two autofluorescent metabolic coenzymes. The autofluorescent redox ratio, defined as the autofluorescence intensity of NADH divided by that of FAD, quantifies relative rates of cellular glycolysis and oxidative phosphorylation. However, current microscopy methods for redox ratio quantification are time-intensive and low-throughput, limiting their practicality in drug screening. Alternatively, high-throughput commercial microplate readers quickly measure fluorescence intensities for hundreds of wells. This study found that a commercial microplate reader can differentiate the receptor status of breast cancer cell lines (p < 0.05) based on redox ratio measurements without extrinsic contrast agents. Furthermore, microplate reader redox ratio measurements resolve response (p < 0.05) and lack of response (p > 0.05) in cell lines that are responsive and nonresponsive, respectively, to the breast cancer drug trastuzumab. These studies indicate that the microplate readers can be used to measure the redox ratio in a high-throughput manner and are sensitive enough to detect differences in cellular metabolism that are consistent with microscopy results.
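    The redox ratio as defined above is a per-well division of the two autofluorescence intensities. A minimal sketch of the computation on a plate-reader export (well names and intensity values are illustrative):

```python
# Per-well optical redox ratio: NADH autofluorescence intensity divided
# by FAD autofluorescence intensity, as defined in the abstract.
def redox_ratios(nadh, fad):
    return {well: nadh[well] / fad[well] for well in nadh}

nadh = {"A1": 1200.0, "A2": 900.0, "A3": 1500.0}  # illustrative intensities
fad = {"A1": 600.0, "A2": 900.0, "A3": 500.0}
ratios = redox_ratios(nadh, fad)
print(ratios)  # {'A1': 2.0, 'A2': 1.0, 'A3': 3.0}
```

A higher ratio indicates relatively more glycolysis; comparing well groups (e.g. treated vs. untreated) with a t-test then gives the drug-response readout.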

  4. Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.

    PubMed

    Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil

    2012-07-01

    Recent advances in high-throughput biotechnologies have led to the rapid growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, there exist several challenges in this fast growing field of reverse engineering biomolecular systems: (i) to integrate heterogeneous biochemical data for data mining, (ii) to combine top-down and bottom-up approaches for systems modeling and (iii) to validate system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems (REBMS) using integrated workflow of data mining, systems modeling and synthetic biology.

  5. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    PubMed Central

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  6. An image analysis toolbox for high-throughput C. elegans assays

    PubMed Central

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.

    2012-01-01

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656

  7. High-throughput, image-based screening of pooled genetic variant libraries

    PubMed Central

    Emanuel, George; Moffitt, Jeffrey R.; Zhuang, Xiaowei

    2018-01-01

    Image-based, high-throughput screening of genetic perturbations will advance both biology and biotechnology. We report a high-throughput screening method that allows diverse genotypes and corresponding phenotypes to be imaged in numerous individual cells. We achieve genotyping by introducing barcoded genetic variants into cells and using massively multiplexed FISH to measure the barcodes. We demonstrated this method by screening mutants of the fluorescent protein YFAST, yielding brighter and more photostable YFAST variants. PMID:29083401

  8. Novel strategy for protein exploration: high-throughput screening assisted with fuzzy neural network.

    PubMed

    Kato, Ryuji; Nakano, Hideo; Konishi, Hiroyuki; Kato, Katsuya; Koga, Yuchi; Yamane, Tsuneo; Kobayashi, Takeshi; Honda, Hiroyuki

    2005-08-19

    To engineer proteins with desirable characteristics from a naturally occurring protein, high-throughput screening (HTS) combined with a directed evolution approach is the essential technology. However, most HTS techniques are simple positive screenings. The information obtained from the positive candidates is used only as results but rarely as clues for understanding the structural rules that may explain the protein activity. Here, we have attempted to establish a novel strategy for exploring functional proteins assisted by computational analysis. As a model case, we explored lipases with inverted enantioselectivity for the substrate p-nitrophenyl 3-phenylbutyrate from the wild-type lipase of Burkholderia cepacia KWI-56, which is originally selective for the (S)-configuration of the substrate. Data from our previous work on (R)-enantioselective lipase screening were applied to a fuzzy neural network (FNN), a bioinformatic algorithm, to extract guidelines for the screening and engineering processes to be followed. FNN has the advantageous feature of extracting hidden rules that lie between the sequences of variants and their enzyme activity to achieve high prediction accuracy. Without any prior knowledge, FNN predicted a rule indicating that "size at position L167," among four positions (L17, F119, L167, and L266) in the substrate-binding core region, is the most influential factor for obtaining lipase with inverted (R)-enantioselectivity. Based on the guidelines obtained, newly engineered variants, which were not found in the actual screening, were experimentally proven to gain high (R)-enantioselectivity by engineering the size at position L167. We also designed and assayed two novel variants, namely FIGV (L17F, F119I, L167G, and L266V) and FFGI (L17F, L167G, and L266I), which were compatible with the guideline obtained from FNN analysis, and confirmed that these designed lipases could acquire high inverted enantioselectivity.
The results have shown that with the aid of bioinformatic analysis, high-throughput screening can expand its potential for exploring vast combinatorial sequence spaces of proteins.
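The screening-plus-rule-extraction idea above can be illustrated with a toy sketch (not the authors' fuzzy neural network): rank the four mutated positions by how cleanly residue side-chain volume at each position separates (R)- from (S)-selective variants. All volumes and variant outcomes below are illustrative assumptions, not data from the study.

```python
# Approximate side-chain volumes (cubic angstroms); values are illustrative.
VOLUME = {"G": 60, "A": 89, "V": 140, "L": 167, "I": 167, "F": 190}

# Hypothetical screening results: residues at positions (17, 119, 167, 266)
# and whether the variant showed inverted (R)-enantioselectivity.
variants = [
    (("L", "F", "G", "V"), True),
    (("F", "I", "G", "V"), True),
    (("F", "F", "A", "I"), True),
    (("L", "F", "L", "L"), False),
    (("F", "I", "L", "V"), False),
    (("L", "I", "F", "I"), False),
]

def separation(pos):
    """Gap between mean side-chain volumes of R- and S-selective variants."""
    r = [VOLUME[v[pos]] for v, inverted in variants if inverted]
    s = [VOLUME[v[pos]] for v, inverted in variants if not inverted]
    return abs(sum(r) / len(r) - sum(s) / len(s))

positions = ["L17", "F119", "L167", "L266"]
ranked = sorted(range(4), key=separation, reverse=True)
print([positions[i] for i in ranked])  # position with the largest gap first
```

With these made-up data, position L167 dominates the ranking, mirroring the kind of "size at position L167" rule the FNN extracted.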

  9. Deciphering the genomic targets of alkylating polyamide conjugates using high-throughput sequencing

    PubMed Central

    Chandran, Anandhakumar; Syed, Junetha; Taylor, Rhys D.; Kashiwazaki, Gengo; Sato, Shinsuke; Hashiya, Kaori; Bando, Toshikazu; Sugiyama, Hiroshi

    2016-01-01

    Chemically engineered small molecules targeting specific genomic sequences play an important role in drug development research. Pyrrole-imidazole polyamides (PIPs) are a group of molecules that can bind to the DNA minor-groove and can be engineered to target specific sequences. Their biological effects rely primarily on their selective DNA binding. However, the binding mechanism of PIPs at the chromatinized genome level is poorly understood. Herein, we report a method using high-throughput sequencing to identify the DNA-alkylating sites of PIP-indole-seco-CBI conjugates. High-throughput sequencing analysis of conjugate 2 showed highly similar DNA-alkylating sites on synthetic oligos (histone-free DNA) and on human genomes (chromatinized DNA context). To our knowledge, this is the first report identifying alkylation sites across genomic DNA by alkylating PIP conjugates using high-throughput sequencing. PMID:27098039

  10. Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.

    PubMed

    Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S

    1994-01-01

    The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.

  11. Experimental station for ultrafast extreme ultraviolet spectroscopy for non-equilibrium dynamics in warm dense matter

    NASA Astrophysics Data System (ADS)

    Lee, Jong-won; Geng, Xiaotao; Jung, Jae Hyung; Cho, Min Sang; Yang, Seong Hyeok; Jo, Jawon; Lee, Chang-lyoul; Cho, Byoung Ick; Kim, Dong-Eon

    2018-07-01

Recent interest in highly excited matter generated by intense femtosecond laser pulses has led to experimental methods that directly investigate ultrafast non-equilibrium electronic and structural dynamics. We present a tabletop experimental station for extreme ultraviolet (EUV) spectroscopy used to trace L-edge dynamics in warm dense aluminum with a temporal resolution of about a hundred femtoseconds. The system consists of an EUV probe generated by high-order harmonic generation of femtosecond laser pulses in atomic clusters; a beamline with high-throughput optics and a nano-foil sample-refreshment system that utilizes the full repetition rate of the probe; and a flat-field EUV spectrograph. By accumulating on the order of a hundred shots, a clear observation of the change in the aluminum L-shell absorption was achieved with a temporal resolution of 90 fs in a 600-fs window. The signature of a non-equilibrium electron distribution over a 10-eV range and its evolution toward a 1-eV Fermi distribution are observed. This demonstrates the capability of the apparatus to capture non-equilibrium electron-hole dynamics in highly excited warm dense matter.

  12. High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  13. High-Throughput Fabrication of Quality Nanofibers Using a Modified Free Surface Electrospinning.

    PubMed

    Shao, Zhongbiao; Yu, Liang; Xu, Lan; Wang, Mingdi

    2017-12-01

Based on bubble electrospinning (BE), a modified free surface electrospinning (MFSE) setup using a cone-shaped air nozzle combined with a solution reservoir made of copper tubes was presented to increase the production of quality nanofibers. In the MFSE process, sodium dodecyl benzene sulfonate (SDBS) was added to the spinning solution to generate bubbles on the liquid surface. The effects of applied voltage and generated bubbles on the morphology and production of nanofibers were investigated experimentally and theoretically. The theoretical analysis of the electric field was in good agreement with the experimental data and showed that the quality and production of nanofibers improved with increasing applied voltage, whereas the generated bubbles decreased the quality and production of nanofibers.

  14. High-throughput density-functional perturbation theory phonons for inorganic materials

    NASA Astrophysics Data System (ADS)

    Petretto, Guido; Dwaraknath, Shyam; P. C. Miranda, Henrique; Winston, Donald; Giantomassi, Matteo; van Setten, Michiel J.; Gonze, Xavier; Persson, Kristin A.; Hautier, Geoffroy; Rignanese, Gian-Marco

    2018-05-01

    The knowledge of the vibrational properties of a material is of key importance to understand physical phenomena such as thermal conductivity, superconductivity, and ferroelectricity among others. However, detailed experimental phonon spectra are available only for a limited number of materials, which hinders the large-scale analysis of vibrational properties and their derived quantities. In this work, we perform ab initio calculations of the full phonon dispersion and vibrational density of states for 1521 semiconductor compounds in the harmonic approximation based on density functional perturbation theory. The data is collected along with derived dielectric and thermodynamic properties. We present the procedure used to obtain the results, the details of the provided database and a validation based on the comparison with experimental data.

  15. High-Throughput Fabrication of Quality Nanofibers Using a Modified Free Surface Electrospinning

    NASA Astrophysics Data System (ADS)

    Shao, Zhongbiao; Yu, Liang; Xu, Lan; Wang, Mingdi

    2017-07-01

Based on bubble electrospinning (BE), a modified free surface electrospinning (MFSE) setup using a cone-shaped air nozzle combined with a solution reservoir made of copper tubes was presented to increase the production of quality nanofibers. In the MFSE process, sodium dodecyl benzene sulfonate (SDBS) was added to the spinning solution to generate bubbles on the liquid surface. The effects of applied voltage and generated bubbles on the morphology and production of nanofibers were investigated experimentally and theoretically. The theoretical analysis of the electric field was in good agreement with the experimental data and showed that the quality and production of nanofibers improved with increasing applied voltage, whereas the generated bubbles decreased the quality and production of nanofibers.

  16. Parallelization of Catalytic Packed-Bed Microchannels with Pressure-Drop Microstructures for Gas-Liquid Multiphase Reactions

    NASA Astrophysics Data System (ADS)

    Murakami, Sunao; Ohtaki, Kenichiro; Matsumoto, Sohei; Inoue, Tomoya

    2012-06-01

High-throughput and stable operation is required to achieve the practical production of chemicals with microreactors. However, flow maldistribution among parallelized microchannels has been a critical problem in achieving productive use of multichannel microreactors under multiphase flow conditions. In this study, we designed and fabricated a glass four-channel catalytic packed-bed microreactor for the scale-up of gas-liquid multiphase chemical reactions. We embedded microstructures that generate high pressure losses at the upstream side of each packed bed, and experimentally confirmed the efficacy of these microstructures in decreasing the maldistribution of the gas-liquid flow to the parallel microchannels.

  17. High-throughput protein concentration and buffer exchange: comparison of ultrafiltration and ammonium sulfate precipitation.

    PubMed

    Moore, Priscilla A; Kery, Vladimir

    2009-01-01

High-throughput protein purification is a complex, multi-step process that poses technical challenges not encountered when purifying a single protein. Among the most challenging steps are high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified protein. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, ammonium sulfate precipitation is much less labor-intensive and time-consuming than ultrafiltration.

  18. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

Biophysical properties of cells can complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass, and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell size and subcellular dry-mass density distribution of single cells at high spatial resolution. However, limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system retains subcellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes from both amplitude and phase images. These phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy: >90% in the G1 and G2 phases, and >80% in the S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes, including disease pathogenesis.
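As a rough illustration of predicting cell-cycle phase from biophysical phenotypes, the sketch below uses a stdlib nearest-centroid rule as a stand-in for the paper's MANOVA discriminant analysis; the two toy features (cell area and a dry-mass proxy) and all numbers are invented.

```python
import math

train = {  # phase -> list of (area, dry_mass) feature vectors (toy data)
    "G1": [(1.0, 1.0), (1.1, 0.9), (0.9, 1.1)],
    "S":  [(1.4, 1.5), (1.5, 1.4), (1.6, 1.6)],
    "G2": [(2.0, 2.0), (1.9, 2.1), (2.1, 1.9)],
}

# Mean feature vector per phase.
centroids = {
    phase: tuple(sum(x[i] for x in pts) / len(pts) for i in range(2))
    for phase, pts in train.items()
}

def predict(cell):
    """Assign the phase whose centroid is nearest in feature space."""
    return min(centroids, key=lambda p: math.dist(cell, centroids[p]))

print(predict((1.05, 0.95)))  # prints "G1"
```

A real pipeline would operate on dozens of phenotypes and validate against fluorescence ground truth, but the classify-by-phenotype-vector structure is the same.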

  19. Sample flow switching techniques on microfluidic chips.

    PubMed

    Pan, Yu-Jen; Lin, Jin-Jie; Luo, Win-Jet; Yang, Ruey-Jen

    2006-02-15

    This paper presents an experimental investigation into electrokinetically focused flow injection for bio-analytical applications. A novel microfluidic device for microfluidic sample handling is presented. The microfluidic chip is fabricated on glass substrates using conventional photolithographic and chemical etching processes and is bonded using a high-temperature fusion method. The proposed valve-less device is capable not only of directing a single sample flow to a specified output port, but also of driving multiple samples to separate outlet channels or even to a single outlet to facilitate sample mixing. The experimental results confirm that the sample flow can be electrokinetically pre-focused into a narrow stream and guided to the desired outlet port by means of a simple control voltage model. The microchip presented within this paper has considerable potential for use in a variety of applications, including high-throughput chemical analysis, cell fusion, fraction collection, sample mixing, and many other applications within the micro-total-analysis systems field.

  20. Identification of Extracellular Segments by Mass Spectrometry Improves Topology Prediction of Transmembrane Proteins.

    PubMed

    Langó, Tamás; Róna, Gergely; Hunyadi-Gulyás, Éva; Turiák, Lilla; Varga, Julia; Dobson, László; Várady, György; Drahos, László; Vértessy, Beáta G; Medzihradszky, Katalin F; Szakács, Gergely; Tusnády, Gábor E

    2017-02-13

Transmembrane proteins play a crucial role in signaling, ion transport, and nutrient uptake, as well as in maintaining the dynamic equilibrium between the internal and external environment of cells. Despite their important biological functions and abundance, less than 2% of all determined structures are of transmembrane proteins. Given the persisting technical difficulties associated with high-resolution structure determination of transmembrane proteins, additional methods, including computational and experimental techniques, remain vital in promoting our understanding of their topologies, 3D structures, functions, and interactions. Here we report a method for the high-throughput determination of the extracellular segments of transmembrane proteins, based on the identification of surface-labeled and biotin-captured peptide fragments by LC/MS/MS. We show that reliable identification of extracellular protein segments increases the accuracy and reliability of existing topology prediction algorithms. Using the experimental topology data as constraints, our improved prediction tool provides accurate and reliable topology models for hundreds of human transmembrane proteins.

  1. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    PubMed

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.

    PubMed

    Yang, Darren; Wong, Wesley P

    2018-01-01

    We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.

  3. High throughput single cell counting in droplet-based microfluidics.

    PubMed

    Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie

    2017-05-02

Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count large numbers of adherent and non-adherent human cells, as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at high throughput, which we used to characterize cell encapsulation and cell viability during incubation in droplets.
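Cell loading into droplets is commonly modeled by Poisson statistics; the minimal sketch below (our illustration, not the authors' analysis) estimates the fraction of droplets containing exactly one cell for an assumed mean occupancy.

```python
import math

def p_cells(k, lam):
    """Poisson probability of a droplet containing exactly k cells,
    given a mean of lam cells per droplet."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 0.1  # assumed average cells per droplet (dilute loading)
single = p_cells(1, lam)  # fraction of droplets with exactly one cell
print(round(single, 3))
```

At dilute loading most droplets are empty, which is why accurate counting of the cell suspension matters for single-cell encapsulation studies.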

  4. High-Throughput Sequencing of Germline and Tumor From Men with Early-Onset Metastatic Prostate Cancer

    DTIC Science & Technology

    2016-12-01

AWARD NUMBER: W81XWH-13-1-0371. TITLE: High-Throughput Sequencing of Germline and Tumor From Men with Early-Onset Metastatic Prostate Cancer. DATES COVERED: 30 Sep 2013 - 29 Sep 2016. ...presenting with metastatic prostate cancer at a young age (before age 60 years). Whole exome sequencing identified a panel of germline variants that have

  5. High throughput determination of cleaning solutions to prevent the fouling of an anion exchange resin.

    PubMed

    Elich, Thomas; Iskra, Timothy; Daniels, William; Morrison, Christopher J

    2016-06-01

Effective cleaning of chromatography resin is required to prevent fouling and maximize the number of processing cycles that can be achieved. Optimization of resin cleaning procedures, however, can lead to prohibitive material, labor, and time requirements, even when using milliliter-scale chromatography columns. In this work, high-throughput (HT) techniques were used to evaluate cleaning agents for a monoclonal antibody (mAb) polishing step utilizing Fractogel® EMD TMAE HiCap (M) anion exchange (AEX) resin. For this particular mAb feed stream, the AEX resin could not be fully restored with traditional NaCl and NaOH cleaning solutions, resulting in a loss of impurity capacity with resin cycling. Miniaturized microliter-scale chromatography columns and an automated liquid handling system (LHS) were employed to evaluate various experimental cleaning conditions. Cleaning agents were monitored for their ability to maintain resin impurity capacity over multiple processing cycles by analyzing the flowthrough material for turbidity and high-molecular-weight (HMW) content. HT experiments indicated that a 167 mM acetic acid strip solution followed by a 0.5 M NaOH, 2 M NaCl sanitization provided approximately 90% cleaning improvement over solutions containing solely NaCl and/or NaOH. Results from the microliter-scale HT experiments were confirmed in subsequent evaluations at the milliliter scale. These results identify cleaning agents that may restore resin performance for applications involving fouling species in ion exchange systems. In addition, this work demonstrates the use of miniaturized columns operated with an automated LHS for HT evaluation of chromatographic cleaning procedures, effectively decreasing material requirements while simultaneously increasing throughput. Biotechnol. Bioeng. 2016;113:1251-1259. © 2015 Wiley Periodicals, Inc.

  6. Utilization of a high-throughput shoot imaging system to examine the dynamic phenotypic responses of a C4 cereal crop plant to nitrogen and water deficiency over time

    PubMed Central

    Neilson, E. H.; Edwards, A. M.; Blomstedt, C. K.; Berger, B.; Møller, B. Lindberg; Gleadow, R. M.

    2015-01-01

The use of high-throughput phenotyping systems and non-destructive imaging is widely regarded as a key technology allowing scientists and breeders to develop crops with the ability to perform well under diverse environmental conditions. However, many of these phenotyping studies have been optimized using the model plant Arabidopsis thaliana. In this study, The Plant Accelerator® at The University of Adelaide, Australia, was used to investigate the growth and phenotypic response of the important cereal crop Sorghum bicolor L. Moench and related hybrids to water-limited conditions and different levels of fertilizer. Imaging in different spectral ranges was used to monitor plant composition, chlorophyll, and moisture content. Phenotypic image analysis accurately measured plant biomass. The data set obtained enabled the responses of the different sorghum varieties to the experimental treatments to be differentiated and modelled. Plant architectural elements, for example diurnal leaf curling and leaf area index, were determined using imaging and found to correlate with improved stress tolerance. Analysis of colour images revealed that leaf 'greenness' correlated with foliar nitrogen and chlorophyll, while near-infrared reflectance (NIR) analysis was a good predictor of water content and leaf thickness, and correlated with plant moisture content. It is shown that imaging sorghum using a high-throughput system can accurately identify and differentiate between growth and specific phenotypic traits. R scripts for robust, parsimonious models are provided to allow other users of phenomic imaging systems to extract useful data readily, and thus relieve a bottleneck in the phenotypic screening of multiple genotypes of key crop plants. PMID:25697789

  7. Identification of microRNAs in PCV2 subclinically infected pigs by high throughput sequencing.

    PubMed

    Núñez-Hernández, Fernando; Pérez, Lester J; Muñoz, Marta; Vera, Gonzalo; Tomás, Anna; Egea, Raquel; Córdoba, Sarai; Segalés, Joaquim; Sánchez, Armand; Núñez, José I

    2015-03-03

Porcine circovirus type 2 (PCV2) is the essential etiological infectious agent of PCV2-systemic disease and has been associated with other swine diseases, collectively known as porcine circovirus diseases. MicroRNAs (miRNAs) are a class of small non-coding RNAs that regulate gene expression post-transcriptionally and play a role in a growing number of biological processes. The study of miRNA-mediated host-pathogen interactions has emerged in the last decade due to the important role that miRNAs play in antiviral defense. The objective of this study was to identify the miRNA expression patterns of PCV2 subclinically infected and non-infected pigs. For this purpose, an experimental PCV2 infection was carried out, and small-RNA libraries were constructed from the tonsil and mediastinal lymph node (MLN) of infected and non-infected pigs. High-throughput sequencing revealed differences in miRNA expression in the MLN between infected and non-infected pigs, whereas a very conserved pattern was observed in the tonsil. In the MLN, miR-126-3p, miR-126-5p, let-7d-3p, miR-129a, and let-7b-3p were up-regulated, whereas miR-193a-5p, miR-574-5p, and miR-34a were down-regulated. Functional prediction analysis showed that these miRNAs may be involved in pathways related to the immune system and in processes related to the pathogenesis of PCV2, although functional assays are needed to support these predictions. This is the first study of miRNA gene expression in pigs infected with PCV2 using a high-throughput sequencing approach, in which several host miRNAs were differentially expressed in response to PCV2 infection.
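A minimal sketch of calling miRNAs up- or down-regulated from normalized read counts via log2 fold change with a pseudocount; the counts and the |log2FC| >= 1 cutoff are illustrative assumptions, not the study's statistical pipeline.

```python
import math

counts = {  # miRNA -> (control, infected) normalized read counts (toy data)
    "miR-126-3p": (100, 420),
    "let-7d-3p": (80, 200),
    "miR-193a-5p": (300, 90),
    "miR-574-5p": (150, 140),
}

def call(ctrl, inf, cutoff=1.0):
    """Classify a miRNA by log2 fold change with a +1 pseudocount."""
    lfc = math.log2((inf + 1) / (ctrl + 1))
    if lfc >= cutoff:
        return "up"
    if lfc <= -cutoff:
        return "down"
    return "unchanged"

for mirna, (c, i) in counts.items():
    print(mirna, call(c, i))
```

Real differential-expression tools add replicate-aware dispersion modeling and multiple-testing correction on top of this fold-change idea.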

  8. Using high throughput sequencing to explore the biodiversity in oral bacterial communities.

    PubMed

    Diaz, P I; Dupuy, A K; Abusleme, L; Reese, B; Obergfell, C; Choquette, L; Dongari-Bagtzoglou, A; Peterson, D E; Terzi, E; Strausbaugh, L D

    2012-06-01

    High throughput sequencing of 16S ribosomal RNA gene amplicons is a cost-effective method for characterization of oral bacterial communities. However, before undertaking large-scale studies, it is necessary to understand the technique-associated limitations and intrinsic variability of the oral ecosystem. In this work we evaluated bias in species representation using an in vitro-assembled mock community of oral bacteria. We then characterized the bacterial communities in saliva and buccal mucosa of five healthy subjects to investigate the power of high throughput sequencing in revealing their diversity and biogeography patterns. Mock community analysis showed primer and DNA isolation biases and an overestimation of diversity that was reduced after eliminating singleton operational taxonomic units (OTUs). Sequencing of salivary and mucosal communities found a total of 455 OTUs (0.3% dissimilarity) with only 78 of these present in all subjects. We demonstrate that this variability was partly the result of incomplete richness coverage even at great sequencing depths, and so comparing communities by their structure was more effective than comparisons based solely on membership. With respect to oral biogeography, we found inter-subject variability in community structure was lower than site differences between salivary and mucosal communities within subjects. These differences were evident at very low sequencing depths and were mostly caused by the abundance of Streptococcus mitis and Gemella haemolysans in mucosa. In summary, we present an experimental and data analysis framework that will facilitate design and interpretation of pyrosequencing-based studies. Despite challenges associated with this technique, we demonstrate its power for evaluation of oral diversity and biogeography patterns. © 2012 John Wiley & Sons A/S.
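The singleton-filtering step described above, which reduced the overestimation of diversity, can be sketched as follows (toy OTU table, stdlib only):

```python
from collections import Counter

# Reads assigned to OTUs (made-up example); otu4 and otu5 are singletons.
reads = ["otu1"] * 50 + ["otu2"] * 12 + ["otu3"] * 3 + ["otu4"] + ["otu5"]

table = Counter(reads)
# Drop OTUs observed exactly once before estimating richness.
filtered = {otu: n for otu, n in table.items() if n > 1}

print(len(table), "OTUs before filtering;", len(filtered), "after")
```

Singletons often stem from sequencing error, so removing them trades a little sensitivity for a large reduction in spurious richness.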

  9. Accelerated Discovery of High-Refractive-Index Polymers Using First-Principles Modeling, Virtual High-Throughput Screening, and Data Mining

    NASA Astrophysics Data System (ADS)

    Afzal, Mohammad Atif Faiz; Cheng, Chong; Hachmann, Johannes

Organic materials with refractive index (RI) values higher than 1.7 have attracted considerable interest in recent years due to their tremendous potential for application in optical, optometric, and optoelectronic devices, and thus for shaping technological innovation in numerous related areas. Our work is concerned with creating predictive models for the optical properties of organic polymers, which will guide our experimentalist partners and allow them to target the most promising candidates. The RI model is developed based on a synergistic combination of first-principles electronic structure theory and machine learning techniques. The RI values predicted for common polymers using this model are in very good agreement with the experimental values. We also benchmark different DFT approximations, along with various basis sets, for their predictive performance in this model. We demonstrate that this combination of first-principles and data modeling is both successful and highly economical in determining the RI values of a wide range of organic polymers. To accelerate the development process, we cast this modeling approach into the high-throughput screening, materials informatics, and rational design framework developed in our group. This framework is a powerful tool and has been shown to be highly promising for rapidly identifying polymer candidates with exceptional RI values, as well as for discovering design rules for advanced materials.
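One standard route from computed molecular polarizabilities to a bulk refractive index is the Lorentz-Lorenz relation; the sketch below is a generic illustration under assumed values, not necessarily the exact model this work uses.

```python
import math

def refractive_index(alpha_cm3, n_per_cm3):
    """Solve the Lorentz-Lorenz relation
    (n^2 - 1)/(n^2 + 2) = (4*pi/3) * N * alpha for the refractive index n,
    given polarizability alpha (cm^3) and number density N (cm^-3)."""
    x = (4.0 * math.pi / 3.0) * n_per_cm3 * alpha_cm3
    return math.sqrt((1.0 + 2.0 * x) / (1.0 - x))

# Illustrative numbers: alpha ~ 1.2e-23 cm^3 per repeat unit,
# N ~ 5e21 repeat units per cm^3.
print(round(refractive_index(1.2e-23, 5.0e21), 2))  # ≈ 1.42
```

Because the RI grows with the polarizability-density product, high-RI design pushes toward polarizable units packed densely, which is the kind of design rule such screening can surface.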

  10. Analytic expressions for Atomic Layer Deposition: coverage, throughput, and materials utilization in cross-flow, particle coating, and spatial ALD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yanguas-Gil, Angel; Elam, Jeffrey W.

    2014-05-01

In this work, the authors present analytic models for atomic layer deposition (ALD) in three common experimental configurations: cross-flow, particle coating, and spatial ALD. These models, based on the plug-flow and well-mixed approximations, allow us to determine the minimum dose times and materials utilization for all three configurations. A comparison between the three models shows that throughput and precursor utilization can each be expressed by universal equations, in which the particularity of the experimental system is contained in a single parameter related to the residence time of the precursor in the reactor. For the case of cross-flow reactors, the authors show how simple analytic expressions for the reactor saturation profiles agree well with experimental results. Consequently, the analytic model can be used to extract information about the ALD surface chemistry (e.g., the reaction probability) by comparing the analytic and experimental saturation profiles, providing a useful tool for characterizing new and existing ALD processes. © 2014 American Vacuum Society.
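As a generic point of reference (not the paper's plug-flow or well-mixed expressions), a first-order Langmuir model gives surface coverage that saturates exponentially with precursor exposure; the sticking probability and flux below are illustrative assumptions.

```python
import math

def coverage(t, sticking=0.01, flux=1e4):
    """Fraction of surface sites reacted after dose time t (arbitrary units),
    assuming first-order Langmuir kinetics: theta = 1 - exp(-s * phi * t)."""
    return 1.0 - math.exp(-sticking * flux * t)

# Dose time needed to reach 95% saturation for these assumed parameters.
t_sat = -math.log(0.05) / (0.01 * 1e4)
print(round(coverage(t_sat), 2))  # prints 0.95
```

Minimum dose times in analytic ALD models play the same role as t_sat here: the exposure at which coverage is close enough to saturation for the target film quality.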

  11. PomBase: The Scientific Resource for Fission Yeast.

    PubMed

    Lock, Antonia; Rutherford, Kim; Harris, Midori A; Wood, Valerie

    2018-01-01

    The fission yeast Schizosaccharomyces pombe has become well established as a model species for studying conserved cell-level biological processes, especially the mechanics and regulation of cell division. PomBase integrates the S. pombe genome sequence with traditional genetic, molecular, and cell biological experimental data as well as the growing body of large datasets generated by emerging high-throughput methods. This chapter provides insight into the curation philosophy and data organization at PomBase, and provides a guide to using PomBase for infrequent visitors and anyone considering exploring S. pombe in their research.

  12. Translational Biomedical Informatics in the Cloud: Present and Future

    PubMed Central

    Chen, Jiajia; Qian, Fuliang; Yan, Wenying; Shen, Bairong

    2013-01-01

Next-generation sequencing and other high-throughput experimental techniques of recent decades have driven the exponential growth in publicly available molecular and clinical data. This information explosion has prepared the ground for the development of translational bioinformatics. The scale and dimensionality of the data, however, pose obvious challenges in data mining, storage, and integration. In this paper we demonstrate the utility and promise of cloud computing for tackling big data problems. We also outline our vision that cloud computing could be an enabling tool to facilitate translational bioinformatics research. PMID:23586054

  13. Dissecting the human immunologic memory for pathogens.

    PubMed

    Zielinski, Christina E; Corti, Davide; Mele, Federico; Pinto, Dora; Lanzavecchia, Antonio; Sallusto, Federica

    2011-03-01

    Studies on immunologic memory in animal models and especially in the human system are instrumental to identify mechanisms and correlates of protection necessary for vaccine development. In this article, we provide an overview of the cellular basis of immunologic memory. We also describe experimental approaches based on high throughput cell cultures, which we have developed to interrogate human memory T cells, B cells, and plasma cells. We discuss how these approaches can provide new tools and information for vaccine design, in a process that we define as 'analytic vaccinology'. © 2011 John Wiley & Sons A/S.

  14. The receptive field is dead. Long live the receptive field?

    PubMed Central

    Fairhall, Adrienne

    2014-01-01

    Advances in experimental techniques, including behavioral paradigms using rich stimuli under closed loop conditions and the interfacing of neural systems with external inputs and outputs, reveal complex dynamics in the neural code and require a revisiting of standard concepts of representation. High-throughput recording and imaging methods along with the ability to observe and control neuronal subpopulations allow increasingly detailed access to the neural circuitry that subserves these representations and the computations they support. How do we harness theory to build biologically grounded models of complex neural function? PMID:24618227

  15. Profiling optimization for big data transfer over dedicated channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yun, D.; Wu, Qishi; Rao, Nageswara S

    The transfer of big data is increasingly supported by dedicated channels in high-performance networks, where transport protocols play an important role in maximizing application-level throughput and link utilization. The performance of transport protocols largely depends on their control parameter settings, but it is prohibitively time consuming to conduct an exhaustive search in a large parameter space to find the best set of parameter values. We propose FastProf, a stochastic approximation-based transport profiler, to quickly determine the optimal operational zone of a given data transfer protocol/method over dedicated channels. We implement and test the proposed method using both emulations based on real-life performance measurements and experiments over physical connections with short (2 ms) and long (380 ms) delays. Both the emulation and experimental results show that FastProf significantly reduces the profiling overhead while achieving a comparable level of end-to-end throughput performance with the exhaustive search-based approach.
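
    FastProf's actual update rules are not given in the abstract; as a hedged illustration of the underlying idea — a stochastic-approximation search needs only two throughput measurements per step, rather than an exhaustive sweep of the parameter grid — the sketch below applies simultaneous-perturbation (SPSA) updates to a synthetic throughput surface. The `throughput` function, parameter names, and gain settings are all invented for the example:

    ```python
    import random

    def throughput(buffer_mb, streams):
        # Synthetic concave throughput surface standing in for real end-to-end
        # measurements; invented peak at buffer_mb=64, streams=8.
        return 10_000 - 2.0 * (buffer_mb - 64) ** 2 - 2.0 * (streams - 8) ** 2

    def spsa_profile(x, iterations=300, a=0.2, c=1.0, seed=1):
        """Simultaneous-perturbation stochastic approximation: estimate the
        gradient from just two measurements per step, whatever the dimension."""
        rng = random.Random(seed)
        x = list(x)
        for k in range(1, iterations + 1):
            ak, ck = a / k ** 0.602, c / k ** 0.101   # standard decaying gains
            delta = [rng.choice((-1.0, 1.0)) for _ in x]
            y_plus = throughput(*(xi + ck * d for xi, d in zip(x, delta)))
            y_minus = throughput(*(xi - ck * d for xi, d in zip(x, delta)))
            g = (y_plus - y_minus) / (2.0 * ck)
            x = [xi + ak * g * d for xi, d in zip(x, delta)]  # ascend
        return x

    start = [16.0, 2.0]            # deliberately poor initial settings
    best = spsa_profile(start)
    ```

    Each iteration costs two throughput measurements regardless of how many parameters are tuned, which is the property that makes stochastic approximation attractive against an exhaustive grid sweep.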

  16. High-throughput sequencing methods to study neuronal RNA-protein interactions.

    PubMed

    Ule, Jernej

    2009-12-01

    UV-cross-linking and RNase protection, combined with high-throughput sequencing, have provided global maps of RNA sites bound by individual proteins or ribosomes. Using a stringent purification protocol, UV-CLIP (UV-cross-linking and immunoprecipitation) was able to identify intronic and exonic sites bound by splicing regulators in mouse brain tissue. Ribosome profiling has been used to quantify ribosome density on budding yeast mRNAs under different environmental conditions. Post-transcriptional regulation in neurons requires high spatial and temporal precision, as is evident from the role of localized translational control in synaptic plasticity. It remains to be seen if the high-throughput methods can be applied quantitatively to study the dynamics of RNP (ribonucleoprotein) remodelling in specific neuronal populations during the neurodegenerative process. It is certain, however, that applications of new biochemical techniques followed by high-throughput sequencing will continue to provide important insights into the mechanisms of neuronal post-transcriptional regulation.

  17. Evaluation of Compatibility of ToxCast High-Throughput/High-Content Screening Assays with Engineered Nanomaterials

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  18. Active machine learning-driven experimentation to determine compound effects on protein patterns.

    PubMed

    Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F

    2016-02-03

    High throughput screening determines the effects of many conditions on a given biological target. Currently, estimating the effects of those conditions on other targets requires either strong modeling assumptions (e.g., similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance.
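
    The paper's learner drives real robotics and imaging; purely as a hedged illustration of the loop structure — pick the most informative unrun experiment, run it, refit, repeat — here is a toy sketch in which the ground truth, acquisition score, and additive model are all invented for the example:

    ```python
    # Toy pool-based active experiment selection over a conditions x targets
    # effect matrix. Hypothetical setup: the true effect is additive, so a
    # model fit on a subset of experiments can predict the ones never run.
    COND, TARG = 8, 8
    cond_eff = [0.5 * i for i in range(COND)]
    targ_eff = [0.3 * j for j in range(TARG)]
    truth = {(i, j): cond_eff[i] + targ_eff[j]
             for i in range(COND) for j in range(TARG)}

    observed = {(0, 0): truth[0, 0]}  # seed experiment

    def row_vals(i): return [v for (a, _), v in observed.items() if a == i]
    def col_vals(j): return [v for (_, b), v in observed.items() if b == j]

    def priority(ij):
        """Crude acquisition score: unmeasured rows/columns first, then cells
        where the row and column averages disagree most."""
        i, j = ij
        r, c = row_vals(i), col_vals(j)
        if not r or not c:
            return float("inf")
        return abs(sum(r) / len(r) - sum(c) / len(c))

    budget = COND * TARG // 2            # run only half of all experiments
    while len(observed) < budget:
        pool = [ij for ij in truth if ij not in observed]
        nxt = max(pool, key=priority)
        observed[nxt] = truth[nxt]       # "run" the chosen experiment

    # Fit the additive model to the experiments actually run (alternating means).
    ec, et = [0.0] * COND, [0.0] * TARG
    for _ in range(300):
        for i in range(COND):
            ec[i] = (sum(v - et[j] for (a, j), v in observed.items() if a == i)
                     / len(row_vals(i)))
        for j in range(TARG):
            et[j] = (sum(v - ec[i] for (i, b), v in observed.items() if b == j)
                     / len(col_vals(j)))

    unrun = [ij for ij in truth if ij not in observed]
    mae = sum(abs(ec[i] + et[j] - truth[i, j]) for i, j in unrun) / len(unrun)
    ```

    On this toy additive problem the learner runs half of the possible experiments yet predicts the remainder almost exactly; the real system replaces the additive model and the crude acquisition score with richer learned models.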

  19. Ab initio structure prediction of silicon and germanium sulfides for lithium-ion battery materials

    NASA Astrophysics Data System (ADS)

    Hsueh, Connie; Mayo, Martin; Morris, Andrew J.

    Conventional experiment-based approaches to materials discovery, which can rely heavily on trial and error, are time-intensive and costly. We discuss approaches to coupling experimental and computational techniques in order to systematize, automate, and accelerate the process of materials discovery, which is of particular relevance to developing new battery materials. We use the ab initio random structure searching (AIRSS) method to conduct a systematic investigation of Si-S and Ge-S binary compounds in order to search for novel materials for lithium-ion battery (LIB) anodes. AIRSS is a high-throughput, density functional theory-based approach to structure prediction which has been successful at predicting the structures of LIB materials containing sulfur, silicon, and germanium. We propose a lithiation mechanism for Li-GeS2 anodes as well as report new, theoretically stable, layered and porous structures in the Si-S and Ge-S systems that pique experimental interest.

  20. Predicting Essential Genes and Proteins Based on Machine Learning and Network Topological Features: A Comprehensive Review

    PubMed Central

    Zhang, Xue; Acencio, Marcio Luis; Lemke, Ney

    2016-01-01

    Essential proteins/genes are indispensable to the survival or reproduction of an organism, and the deletion of such essential proteins will result in lethality or infertility. The identification of essential genes is very important not only for understanding the minimal requirements for survival of an organism, but also for finding human disease genes and new drug targets. Experimental methods for identifying essential genes are costly, time-consuming, and laborious. With the accumulation of sequenced genome data and high-throughput experimental data, many computational methods for identifying essential proteins have been proposed; these are useful complements to experimental methods. In this review, we survey state-of-the-art methods for identifying essential genes and proteins based on machine learning and network topological features, point out the progress and limitations of current methods, and discuss the challenges and directions for further research. PMID:27014079
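
    None of the reviewed predictors is reproduced here, but the simplest network-topological feature they build on — degree centrality, via the centrality-lethality observation that hubs are enriched for essential proteins — can be sketched in a few lines. The toy interaction network and essentiality labels below are invented for illustration:

    ```python
    from collections import defaultdict

    # Invented protein-protein interaction network and essentiality labels.
    edges = [("A", "B"), ("A", "C"), ("A", "D"), ("A", "E"),  # A is a hub
             ("B", "C"), ("F", "G"), ("G", "H"), ("D", "E")]
    essential = {"A", "B"}  # hypothetical "known essential" proteins

    # Degree centrality: the centrality-lethality rule observes that highly
    # connected proteins tend to be enriched for essential ones.
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1

    # Simplest possible predictor: rank by degree and call the top-k essential.
    k = 2
    ranked = sorted(degree, key=degree.get, reverse=True)
    predicted = set(ranked[:k])
    precision = len(predicted & essential) / k
    ```

    Real predictors in the review combine many such topological features (betweenness, closeness, clustering) with sequence-derived features in trained classifiers; degree alone is only the historical starting point.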

  1. SwellGel: an affinity chromatography technology for high-capacity and high-throughput purification of recombinant-tagged proteins.

    PubMed

    Draveling, C; Ren, L; Haney, P; Zeisse, D; Qoronfleh, M W

    2001-07-01

    The revolution in genomics and proteomics is having a profound impact on drug discovery. Today's protein scientist demands a faster, easier, more reliable way to purify proteins. A new high-capacity, high-throughput technology has been developed at Perbio Sciences for affinity protein purification. This technology utilizes selected chromatography media that are dehydrated to form uniform aggregates. The SwellGel aggregates will instantly rehydrate upon addition of the protein sample, allowing purification and direct performance of multiple assays in a variety of formats. SwellGel technology has greater stability and is easier to handle than standard wet chromatography resins. The microplate format of this technology provides high-capacity, high-throughput features, recovering milligram quantities of protein suitable for high-throughput screening or biophysical/structural studies. Data will be presented applying SwellGel technology to recombinant 6x His-tagged protein and glutathione-S-transferase (GST) fusion protein purification. Copyright 2001 Academic Press.

  2. Large-scale microfluidics providing high-resolution and high-throughput screening of Caenorhabditis elegans poly-glutamine aggregation model

    NASA Astrophysics Data System (ADS)

    Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela

    2016-10-01

    Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip with a micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.

  3. ToxCast Dashboard

    EPA Pesticide Factsheets

    The ToxCast Dashboard helps users examine high-throughput assay data to inform chemical safety decisions. To date, it has data on over 9,000 chemicals and information from more than 1,000 high-throughput assay endpoint components.

  4. RapidTox Dashboard

    EPA Pesticide Factsheets


  5. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    PubMed Central

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980

  6. Reconstructing the regulatory circuit of cell fate determination in yeast mating response.

    PubMed

    Shao, Bin; Yuan, Haiyu; Zhang, Rongfei; Wang, Xuan; Zhang, Shuwen; Ouyang, Qi; Hao, Nan; Luo, Chunxiong

    2017-07-01

    Massive technological advances have enabled high-throughput measurements of proteomic changes in biological processes. However, retrieving biological insights from large-scale protein dynamics data remains a challenging task. Here we used the mating differentiation in yeast Saccharomyces cerevisiae as a model and developed integrated experimental and computational approaches to analyze the proteomic dynamics during the process of cell fate determination. When exposed to a high dose of mating pheromone, the yeast cell undergoes growth arrest and forms a shmoo-like morphology; however, at intermediate doses, chemotropic elongated growth is initiated. To understand the gene regulatory networks that control this differentiation switch, we employed a high-throughput microfluidic imaging system that allows real-time and simultaneous measurements of cell growth and protein expression. Using kinetic modeling of protein dynamics, we classified the stimulus-dependent changes in protein abundance into two sources: global changes due to physiological alterations and gene-specific changes. A quantitative framework was proposed to decouple gene-specific regulatory modes from the growth-dependent global modulation of protein abundance. Based on the temporal patterns of gene-specific regulation, we established the network architectures underlying distinct cell fates using a reverse engineering method and uncovered the dose-dependent rewiring of the gene regulatory network during mating differentiation. Furthermore, our results suggested a potential crosstalk between the pheromone response pathway and the target of rapamycin (TOR)-regulated ribosomal biogenesis pathway, which might underlie a cell differentiation switch in yeast mating response. In summary, our modeling approach addresses the distinct impacts of the global and gene-specific regulation on the control of protein dynamics and provides new insights into the mechanisms of cell fate determination. We anticipate that our integrated experimental and modeling strategies could be widely applicable to other biological systems.

  7. A high-quality annotated transcriptome of swine peripheral blood

    USDA-ARS?s Scientific Manuscript database

    Background: High throughput gene expression profiling assays of peripheral blood are widely used in biomedicine, as well as in animal genetics and physiology research. Accurate, comprehensive, and precise interpretation of such high throughput assays relies on well-characterized reference genomes an...

  8. Adenylylation of small RNA sequencing adapters using the TS2126 RNA ligase I.

    PubMed

    Lama, Lodoe; Ryan, Kevin

    2016-01-01

    Many high-throughput small RNA next-generation sequencing protocols use 5' preadenylylated DNA oligonucleotide adapters during cDNA library preparation. Preadenylylation of the DNA adapter's 5' end makes ligation of the adapter to RNA collections independent of added ATP, thereby avoiding ATP-dependent side reactions. However, preadenylylation of the DNA adapters can be costly and difficult. The currently available method for chemical adenylylation of DNA adapters is inefficient and uses techniques not typically practiced in laboratories profiling cellular RNA expression. An alternative enzymatic method using a commercial RNA ligase was recently introduced, but this enzyme works best as a stoichiometric adenylylating reagent rather than a catalyst and can therefore prove costly when several variant adapters are needed or during scale-up or high-throughput adenylylation procedures. Here, we describe a simple, scalable, and highly efficient method for the 5' adenylylation of DNA oligonucleotides using the thermostable RNA ligase 1 from bacteriophage TS2126. Adapters with 3' blocking groups are adenylylated at >95% yield at catalytic enzyme-to-adapter ratios and need not be gel purified before ligation to RNA acceptors. Experimental conditions are also reported that enable DNA adapters with free 3' ends to be 5' adenylylated at >90% efficiency. © 2015 Lama and Ryan; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  9. Rosette Assay: Highly Customizable Dot-Blot for SH2 Domain Screening.

    PubMed

    Ng, Khong Y; Machida, Kazuya

    2017-01-01

    With a growing number of high-throughput studies, structural analyses, and availability of protein-protein interaction databases, it is now possible to apply web-based prediction tools to SH2 domain-interactions. However, in silico prediction is not always reliable and requires experimental validation. Rosette assay is a dot blot-based reverse-phase assay developed for the assessment of binding between SH2 domains and their ligands. It is conveniently customizable, allowing for low- to high-throughput analysis of interactions between various numbers of SH2 domains and their ligands, e.g., short peptides, purified proteins, and cell lysates. The binding assay is performed in a 96-well plate (MBA or MWA apparatus) in which a sample spotted membrane is incubated with up to 96 labeled SH2 domains. Bound domains are detected and quantified using a chemiluminescence or near-infrared fluorescence (IR) imaging system. In this chapter, we describe a practical protocol for rosette assay to assess interactions between synthesized tyrosine phosphorylated peptides and a library of GST-tagged SH2 domains. Since the methodology is not confined to assessment of SH2-pTyr interactions, rosette assay can be broadly utilized for ligand and drug screening using different protein interaction domains or antibodies.

  10. Establishment and antitumor effects of dasatinib and PKI-587 in BD-138T, a patient-derived muscle invasive bladder cancer preclinical platform with concomitant EGFR amplification and PTEN deletion

    PubMed Central

    Lim, Joung Eun; Jeong, Da Eun; Song, Hye Jin; Kim, Sudong; Nam, Do-Hyun; Sung, Hyun Hwan; Jeong, Byong Chang; Seo, Seong Il; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han-Yong; Jeon, Hwang Gyun

    2016-01-01

    Muscle-invasive bladder cancer (MIBC) consists of a heterogeneous group of tumors with a high rate of metastasis and mortality. To facilitate the in-depth investigation and validation of tailored strategies for MIBC treatment, we have developed an integrated approach using advanced high-throughput drug screening and a clinically relevant patient-derived preclinical platform. We isolated patient-derived tumor cells (PDCs) from a rare MIBC case (BD-138T) that harbors concomitant epidermal growth factor receptor (EGFR) amplification and phosphatase and tensin homolog (PTEN) deletion. High-throughput in vitro drug screening demonstrated that dasatinib, a SRC inhibitor, and PKI-587, a dual PI3K/mTOR inhibitor, exhibited targeted anti-proliferative and pro-apoptotic effects against BD-138T PDCs. Using established patient-derived xenograft models that successfully retain the genomic and molecular characteristics of the parental tumor, we confirmed that these anti-tumor responses occurred through the inhibition of SRC and PI3K/AKT/mTOR signaling pathways. Taken together, these experimental results demonstrate that dasatinib and PKI-587 might serve as promising anticancer drug candidates for treating MIBC with combined EGFR gene amplification and PTEN deletion. PMID:27438149

  11. Establishment and antitumor effects of dasatinib and PKI-587 in BD-138T, a patient-derived muscle invasive bladder cancer preclinical platform with concomitant EGFR amplification and PTEN deletion.

    PubMed

    Chang, Nakho; Lee, Hye Won; Lim, Joung Eun; Jeong, Da Eun; Song, Hye Jin; Kim, Sudong; Nam, Do-Hyun; Sung, Hyun Hwan; Jeong, Byong Chang; Seo, Seong Il; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han-Yong; Jeon, Hwang Gyun

    2016-08-09

    Muscle-invasive bladder cancer (MIBC) consists of a heterogeneous group of tumors with a high rate of metastasis and mortality. To facilitate the in-depth investigation and validation of tailored strategies for MIBC treatment, we have developed an integrated approach using advanced high-throughput drug screening and a clinically relevant patient-derived preclinical platform. We isolated patient-derived tumor cells (PDCs) from a rare MIBC case (BD-138T) that harbors concomitant epidermal growth factor receptor (EGFR) amplification and phosphatase and tensin homolog (PTEN) deletion. High-throughput in vitro drug screening demonstrated that dasatinib, a SRC inhibitor, and PKI-587, a dual PI3K/mTOR inhibitor, exhibited targeted anti-proliferative and pro-apoptotic effects against BD-138T PDCs. Using established patient-derived xenograft models that successfully retain the genomic and molecular characteristics of the parental tumor, we confirmed that these anti-tumor responses occurred through the inhibition of SRC and PI3K/AKT/mTOR signaling pathways. Taken together, these experimental results demonstrate that dasatinib and PKI-587 might serve as promising anticancer drug candidates for treating MIBC with combined EGFR gene amplification and PTEN deletion.

  12. Strategic assay deployment as a method for countering analytical bottlenecks in high throughput process development: case studies in ion exchange chromatography.

    PubMed

    Konstantinidis, Spyridon; Heldin, Eva; Chhatre, Sunil; Velayudhan, Ajoy; Titchener-Hooker, Nigel

    2012-01-01

    High throughput approaches to facilitate the development of chromatographic separations have now been adopted widely in the biopharmaceutical industry, but issues of how to reduce the associated analytical burden remain. For example, acquiring experimental data by high level factorial designs in 96 well plates can place a considerable strain upon assay capabilities, generating a bottleneck that significantly limits the speed of process characterization. This article proposes an approach designed to counter this challenge: Strategic Assay Deployment (SAD). In SAD, a set of available analytical methods is investigated to determine which set of techniques is the most appropriate to use and how best to deploy these to reduce the consumption of analytical resources while still enabling accurate and complete process characterization. The approach is demonstrated by investigating how salt concentration and pH affect the binding of green fluorescent protein from Escherichia coli homogenate to an anion exchange resin presented in a 96-well filter plate format. Compared with the deployment of routinely used analytical methods alone, the application of SAD reduced both the total assay time and total assay material consumption by at least 40% and 5%, respectively. SAD has significant utility in accelerating bioprocess development activities. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
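
    The article's scoring of assay combinations is more involved; one hedged way to frame the core decision — cover every required measurement while spending as little assay effort as possible — is as a greedy weighted set cover, sketched below with invented assay names, coverages, and costs:

    ```python
    # Invented assays: each covers a set of required measurements at some cost
    # (e.g. hands-on time per plate, in arbitrary units).
    assays = {
        "A280":         ({"total_protein"}, 1.0),
        "fluorescence": ({"gfp_titer"}, 2.0),
        "HPLC":         ({"gfp_titer", "purity", "total_protein"}, 8.0),
        "SDS-PAGE":     ({"purity"}, 3.0),
    }
    needed = {"total_protein", "gfp_titer", "purity"}

    # Greedy weighted set cover: repeatedly take the assay with the best ratio
    # of newly covered measurements to cost until everything needed is covered.
    chosen, covered = [], set()
    while covered < needed:
        name = max(assays, key=lambda a: len((assays[a][0] - covered) & needed)
                   / assays[a][1])
        chosen.append(name)
        covered |= assays[name][0] & needed
    ```

    On this toy set the greedy choice (total cost 6.0) beats deploying HPLC alone (cost 8.0); the article's SAD procedure additionally weighs assay accuracy and plate-format throughput, which this sketch ignores.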

  13. Proteome data to explore the impact of pBClin15 on Bacillus cereus ATCC 14579.

    PubMed

    Madeira, Jean-Paul; Alpha-Bazin, Béatrice; Armengaud, Jean; Omer, Hélène; Duport, Catherine

    2016-09-01

    This data article reports changes in the cellular proteome and exoproteome of B. cereus cured of pBClin15. Time-course changes of proteins were assessed by high-throughput nanoLC-MS/MS. We report all the peptides and proteins identified and quantified in B. cereus with and without pBClin15. Proteins were classified into functional groups using the information available in the KEGG classification, and we report their abundance in terms of normalized spectral abundance factor. The repertoire of experimentally confirmed proteins of B. cereus presented here is the largest ever reported, and provides new insights into the interplay between pBClin15 and its host B. cereus ATCC 14579. The data reported here are related to a published shotgun proteomics analysis regarding the role of pBClin15, "Deciphering the interactions between the Bacillus cereus linear plasmid, pBClin15, and its host by high-throughput comparative proteomics" Madeira et al. [1]. All the associated mass spectrometry data have been deposited in the ProteomeXchange Consortium (http://proteomecentral.proteomexchange.org) via the PRIDE partner repository (http://www.ebi.ac.uk/pride/), with the dataset identifiers PRIDE: PXD001568, PRIDE: PXD002788 and PRIDE: PXD002789.

  14. High Throughput T Epitope Mapping and Vaccine Development

    PubMed Central

    Li Pira, Giuseppina; Ivaldi, Federico; Moretti, Paolo; Manca, Fabrizio

    2010-01-01

    Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th) and by cytolytic T lymphocytes (CTL) is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high throughput (HTP) approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost. PMID:20617148

  15. Foliar fungi of Betula pendula: impact of tree species mixtures and assessment methods

    PubMed Central

    Nguyen, Diem; Boberg, Johanna; Cleary, Michelle; Bruelheide, Helge; Hönig, Lydia; Koricheva, Julia; Stenlid, Jan

    2017-01-01

    Foliar fungi of silver birch (Betula pendula) in an experimental Finnish forest were investigated across a gradient of tree species richness using molecular high-throughput sequencing and visual macroscopic assessment. We hypothesized that the molecular approach detects more fungal taxa than visual assessment, and that there is a relationship among the most common fungal taxa detected by both techniques. Furthermore, we hypothesized that the fungal community composition, diversity, and distribution patterns are affected by changes in tree diversity. Sequencing revealed greater diversity of fungi on birch leaves than the visual assessment method. One species showed a linear relationship between the methods. Species-specific variation in fungal community composition could be partially explained by tree diversity, though overall fungal diversity was not affected by tree diversity. Analysis of specific fungal taxa indicated tree diversity effects at the local neighbourhood scale, where the proportion of birch among neighbouring trees varied, but not at the plot scale. In conclusion, both methods may be used to determine tree diversity effects on the foliar fungal community. However, high-throughput sequencing provided higher resolution of the fungal community, while the visual macroscopic assessment detected functionally active fungal species. PMID:28150710

  16. Peroxisome Mini-Libraries: Systematic Approaches to Study Peroxisomes Made Easy.

    PubMed

    Dahan, Noa; Schuldiner, Maya; Zalckvar, Einat

    2017-01-01

    High-throughput methodologies have been extensively used in the budding yeast, Saccharomyces cerevisiae, to uncover fundamental principles of cell biology. Over the years, several collections of yeast strains (libraries) were built to enable systematic exploration of cellular functions. However, using these libraries experimentally is often labor intensive and restricted to laboratories that house high-throughput platforms. Utilizing the available full-genome libraries, we handpicked a subset of strains that represent all known and predicted peroxisomal proteins as well as proteins that have central roles in peroxisome biology. These smaller collections of strains, mini-libraries, can be rapidly and easily used for complicated screens by any lab. Since one of the libraries is built such that it can be easily modified in its tag, promoter, and selection, we also discuss how these collections form the basis for creating a diversity of new peroxisomal libraries for future studies. Using manual tools, available in any yeast lab, coupled with a few simple genetic approaches, we will show how these libraries can be "mixed and matched" to create tailor-made libraries for screening. These yeast collections may now be exploited to study uncharted territories in the biology of peroxisomes by anyone, anywhere.

  17. Six-flow operations for catalyst development in Fischer-Tropsch synthesis: Bridging the gap between high-throughput experimentation and extensive product evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sartipi, Sina, E-mail: S.Sartipi@tudelft.nl, E-mail: J.Gascon@tudelft.nl; Jansma, Harrie; Bosma, Duco

    2013-12-15

    Design and operation of a “six-flow fixed-bed microreactor” setup for Fischer-Tropsch synthesis (FTS) is described. The unit consists of feed and mixing, flow division, reaction, separation, and analysis sections. The reactor system is made of five heating blocks with individual temperature controllers, assuring an identical isothermal zone of at least 10 cm along six fixed-bed microreactor inserts (4 mm inner diameter). Such a lab-scale setup allows running six experiments in parallel, under equal feed composition, reaction temperature, and conditions of separation and analysis equipment. It permits separate collection of wax and liquid samples (from each flow line), allowing operation with high productivities of C5+ hydrocarbons. The latter is crucial for a complete understanding of FTS product compositions and will represent an advantage over high-throughput setups with more than ten flows where such instrumental considerations lead to elevated equipment volume, cost, and operation complexity. The identical performance (of the six flows) under similar reaction conditions was assured by testing the same catalyst batch, loaded in all microreactors.

  18. A novel approach to the simultaneous extraction and non-targeted analysis of the small molecules metabolome and lipidome using 96-well solid phase extraction plates with column-switching technology.

    PubMed

    Li, Yubo; Zhang, Zhenzhu; Liu, Xinyu; Li, Aizhu; Hou, Zhiguo; Wang, Yuming; Zhang, Yanjun

    2015-08-28

    This study combines solid phase extraction (SPE) using 96-well plates with column-switching technology to construct a rapid and high-throughput method for the simultaneous extraction and non-targeted analysis of the small-molecule metabolome and lipidome based on ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry. The study first investigated the columns and analytical conditions for the small-molecule metabolome and lipidome, separated on HSS T3 and BEH C18 columns, respectively. Next, the loading capacity and actuation duration of SPE were further optimized. Subsequently, SPE and column switching were used together to rapidly and comprehensively analyze the biological samples. The experimental results showed that the new analytical procedure had good precision and maintained sample stability (RSD<15%). The method was then applied more widely to the small-molecule metabolome and lipidome to test its throughput. The resulting method represents a new analytical approach for biological samples, and a highly useful tool for research in metabolomics and lipidomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Theoretical Investigation of oxides for batteries and fuel cell applications

    NASA Astrophysics Data System (ADS)

    Ganesh, Panchapakesan; Lubimtsev, Andrew A.; Balachandran, Janakiraman

    I will present theoretical studies of Li-ion and proton-conducting oxides using a combination of theory and computation, involving Density Functional Theory based atomistic modeling, cluster-expansion studies, global optimization, high-throughput computations, and machine learning based investigation of ionic transport in oxide materials. In Li-ion intercalated oxides, we explain the experimentally observed (Nature Materials 12, 518-522 (2013)) 'intercalation pseudocapacitance' phenomenon and explain why Nb2O5 is special in showing this behavior when Li-ions are intercalated (J. Mater. Chem. A, 2013, 1, 14951-14956) but not when Na-ions are used. In addition, we explore Li-ion intercalation theoretically in the VO2 (B) phase, which is somewhat structurally similar to Nb2O5, and predict an interesting role of site trapping in the voltage and capacity of the material, validated by ongoing experiments. Computations on proton-conducting oxides explain why Y-doped BaZrO3, one of the fastest proton-conducting oxides, shows a decrease in conductivity above 20% Y-doping. Further, using high-throughput computations and machine learning tools, we discover general principles to improve proton conductivity. Acknowledgements: LDRD at ORNL and CNMS at ORNL.

  20. ProbCD: enrichment analysis accounting for categorization uncertainty.

    PubMed

    Vêncio, Ricardo Z N; Shmulevich, Ilya

    2007-10-12

    As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, or to deal with uncertainty in high-throughput datasets, current enrichment methods largely ignore this probabilistic information, since they are mainly based on variants of the Fisher Exact Test. We developed ProbCD, an open-source R-based software package for probabilistic categorical data analysis that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, for enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of high-throughput experimental techniques and (ii) probabilistic gene annotation.
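
    The central idea, replacing hard 0/1 category counts with their expectations under annotation probabilities, can be sketched as follows. This is a minimal illustration of the concept, not ProbCD's actual R implementation; the gene names and probabilities are hypothetical.

```python
# Sketch: an "expected" 2x2 enrichment table built from probabilistic
# annotations rather than hard 0/1 memberships. Illustrative only; the
# gene names and probabilities below are hypothetical.
genes      = ["g1", "g2", "g3", "g4", "g5"]
p_category = [0.9, 0.7, 0.2, 0.1, 0.6]   # P(gene belongs to the GO category)
selected   = [1,   1,   1,   0,   0]     # hard membership in the gene list

# Expected cell counts, treating category membership as Bernoulli:
n11 = sum(p * s for p, s in zip(p_category, selected))        # selected, in category
n10 = sum((1 - p) * s for p, s in zip(p_category, selected))  # selected, not in category
n01 = sum(p * (1 - s) for p, s in zip(p_category, selected))  # not selected, in category
n00 = sum((1 - p) * (1 - s) for p, s in zip(p_category, selected))

print([[n11, n10], [n01, n00]])  # cells are expectations, generally non-integer
```

    A classical Fisher test would force each gene into or out of the category; the expected table above retains the annotation uncertainty instead.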

  1. High-throughput fabrication of anti-counterfeiting colloid-based photoluminescent microtags using electrical nanoimprint lithography

    NASA Astrophysics Data System (ADS)

    Diaz, R.; Palleau, E.; Poirot, D.; Sangeetha, N. M.; Ressier, L.

    2014-08-01

    This work demonstrates the excellent capability of the recently developed electrical nanoimprint lithography (e-NIL) technique for quick, high-throughput production of well-defined colloid assemblies on surfaces. This is shown by fabricating micron-sized photoluminescent quick response (QR) codes based on the electrostatically directed trapping (the so-called nanoxerography process) of 28 nm colloidal lanthanide-doped upconverting NaYF4 nanocrystals. Key experimental parameters were optimized, and the contribution of triboelectrification in e-NIL was demonstrated. Under the chosen conditions, more than 300,000 nanocrystal-based QR codes were fabricated on a 4 inch silicon wafer in less than 15 min. These microtags were then transferred to transparent flexible films, allowing easy integration onto desired products. Invisible to the naked eye, they can be decoded and authenticated using an optical microscopy image of their specific photoluminescence mapping. Beyond this very promising application for product tracking and anti-counterfeiting strategies, e-NIL nanoxerography, potentially applicable to any type of charged and/or polarizable colloid and pattern geometry, opens up tremendous opportunities for industrial-scale production of various other kinds of colloid-based devices and sensors.

  2. Identification of novel drug scaffolds for inhibition of SARS-CoV 3-Chymotrypsin-like protease using virtual and high-throughput screenings.

    PubMed

    Lee, Hyun; Mittal, Anuradha; Patel, Kavankumar; Gatuz, Joseph L; Truong, Lena; Torres, Jaime; Mulhearn, Debbie C; Johnson, Michael E

    2014-01-01

    We have used a combination of virtual screening (VS) and high-throughput screening (HTS) techniques to identify novel, non-peptidic small molecule inhibitors against human SARS-CoV 3CLpro. A structure-based VS approach integrating docking and pharmacophore based methods was employed to computationally screen 621,000 compounds from the ZINC library. The screening protocol was validated using known 3CLpro inhibitors and was optimized for speed, improved selectivity, and for accommodating receptor flexibility. Subsequently, a fluorescence-based enzymatic HTS assay was developed and optimized to experimentally screen approximately 41,000 compounds from four structurally diverse libraries chosen mainly based on the VS results. False positives from initial HTS hits were eliminated by a secondary orthogonal binding analysis using surface plasmon resonance (SPR). The campaign identified a reversible small molecule inhibitor exhibiting mixed-type inhibition with a K(i) value of 11.1 μM. Together, these results validate our protocols as suitable approaches to screen virtual and chemical libraries, and the newly identified compound reported in our study represents a promising structural scaffold to pursue for further SARS-CoV 3CLpro inhibitor development. Copyright © 2013. Published by Elsevier Ltd.

  3. Transfer, Imaging, and Analysis Plate for Facile Handling of 384 Hanging Drop 3D Tissue Spheroids

    PubMed Central

    Cavnar, Stephen P.; Salomonsson, Emma; Luker, Kathryn E.; Luker, Gary D.; Takayama, Shuichi

    2014-01-01

    Three-dimensional culture systems bridge the experimental gap between in vivo and in vitro physiology. However, nonstandardized formation and limited downstream adaptability of 3D cultures have hindered mainstream adoption of these systems for biological applications, especially for the low- and moderate-throughput assays commonly used in biomedical research. Here we build on our recent development of a 384-well hanging drop plate for spheroid culture to design a complementary spheroid transfer and imaging (TRIM) plate. The low-aspect-ratio wells of the TRIM plate facilitated high-fidelity, user-independent, contact-based collection of hanging drop spheroids. Using the TRIM plate, we demonstrated several downstream analyses, including bulk tissue collection for flow cytometry, high-resolution low-working-distance immersion imaging, and timely reagent delivery for enzymatic studies. Low-working-distance multiphoton imaging revealed a cell type-dependent, macroscopic spheroid structure. Unlike ovarian cancer spheroids, which formed loose, disk-shaped spheroids, human mammary fibroblasts formed tight, spherical, nutrient-limited spheroids. Beyond the applications we describe here, we expect the hanging drop spheroid plate and the complementary TRIM plate to facilitate analyses of spheroids across the spectrum of throughput, particularly bulk collection of spheroids and high-content imaging. PMID:24051516

  4. MotifMark: Finding regulatory motifs in DNA sequences.

    PubMed

    Hassanzadeh, Hamid Reza; Kolhe, Pushkar; Isbell, Charles L; Wang, May D

    2017-07-01

    The interaction between proteins and DNA is a key driving force in a significant number of biological processes, such as transcriptional regulation, repair, recombination, splicing, and DNA modification. The identification of DNA-binding sites and the specificity of target proteins in binding to these regions are two important steps in understanding the mechanisms of these biological activities. A number of high-throughput technologies have recently emerged that try to quantify the affinity between proteins and DNA motifs. Despite their success, these technologies have their own limitations, fall short of precise characterization of motifs, and as a result require further downstream analysis to extract useful and interpretable information from a haystack of noisy and inaccurate data. Here we propose MotifMark, a new algorithm based on graph theory and machine learning that can find binding sites on candidate probes and rank their specificity with regard to the underlying transcription factor. We developed a pipeline to analyze experimental data derived from compact universal protein-binding microarrays and benchmarked it against two of the most accurate motif search methods. Our results indicate that MotifMark can be a viable alternative technique for motif prediction from protein-binding microarrays and possibly other related high-throughput techniques.

  5. Progress on the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha

    2015-12-01

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments with varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high-throughput computing, data management, database access, and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data-transfer client, and access to opportunistic resources on the Open Science Grid. Progress with current experiments and plans for expansion to additional projects are discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high-throughput computing for future physics experiments worldwide.

  6. Biologically Relevant Heterogeneity: Metrics and Practical Insights.

    PubMed

    Gough, Albert; Stern, Andrew M; Maier, John; Lezon, Timothy; Shun, Tong-Ying; Chennubhotla, Chakra; Schurdak, Mark E; Haney, Steven A; Taylor, D Lansing

    2017-03-01

    Heterogeneity is a fundamental property of biological systems at all scales that must be addressed in a wide range of biomedical applications, including basic biomedical research, drug discovery, diagnostics, and the implementation of precision medicine. There are a number of published approaches to characterizing heterogeneity in cells in vitro and in tissue sections. However, there are no generally accepted approaches for the detection and quantitation of heterogeneity that can be applied in a relatively high-throughput workflow. This review and perspective emphasizes the experimental methods that capture multiplexed cell-level data, as well as the need for standard metrics of the spatial, temporal, and population components of heterogeneity. A recommendation is made for the adoption of a set of three heterogeneity indices that can be implemented in any high-throughput workflow to optimize the decision-making process. In addition, a pairwise mutual information method is suggested as an approach to characterizing the spatial features of heterogeneity, especially in tissue-based imaging. Furthermore, metrics for temporal heterogeneity are in the early stages of development. Example studies indicate that the analysis of functional phenotypic heterogeneity can be exploited to guide decisions in the interpretation of biomedical experiments, drug discovery, diagnostics, and the design of optimal therapeutic strategies for individual patients.

  7. Functional approach to high-throughput plant growth analysis

    PubMed Central

    2013-01-01

    Method: Taking advantage of the current rapid development in imaging systems and computer vision algorithms, we present HPGA, a high-throughput phenotyping platform for plant growth modeling and functional analysis, which produces a better understanding of energy distribution with regard to the balance between growth and defense. HPGA has two components, PAE (Plant Area Estimation) and GMA (Growth Modeling and Analysis). In PAE, by taking the complex leaf-overlap problem into consideration, the area of every plant is measured from top-view images in four steps. Given the abundant measurements obtained with PAE, in the second module, GMA, a nonlinear growth model is applied to generate growth curves, followed by functional data analysis. Results: Experimental results on the model plant Arabidopsis thaliana show that, compared to an existing approach, HPGA reduces the error rate of measuring plant area by half. The application of HPGA to cfq mutant plants under fluctuating light reveals a correlation between low photosynthetic rates and small plant area (compared to wild type), raising the hypothesis that knocking out cfq changes the sensitivity of energy distribution under fluctuating light conditions to repress leaf growth. Availability: HPGA is available at http://www.msu.edu/~jinchen/HPGA. PMID:24565437
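
    The GMA step, fitting a nonlinear growth model to per-plant area measurements, can be sketched as below. This is an illustrative logistic-curve fit on synthetic data, assuming the plateau (carrying capacity) is known from the observed data; HPGA's actual growth model and fitting procedure may differ.

```python
import numpy as np

# Sketch: fit a logistic growth curve A(t) = K / (1 + exp(-r*(t - t0)))
# to plant-area-vs-time data. Synthetic, noise-free data; illustrative
# only, not HPGA's actual implementation.
K, r, t0 = 50.0, 0.4, 12.0                 # carrying capacity, rate, midpoint
t = np.arange(0, 25, dtype=float)
area = K / (1.0 + np.exp(-r * (t - t0)))   # synthetic area measurements

# With K fixed (e.g. the observed plateau), the logistic linearizes:
#   ln(K/A - 1) = -r*t + r*t0
# so an ordinary least-squares line recovers r and t0.
y = np.log(K / area - 1.0)
slope, intercept = np.polyfit(t, y, 1)
r_hat, t0_hat = -slope, intercept / -slope
print(round(r_hat, 3), round(t0_hat, 3))  # recovers 0.4 and 12.0
```

    On real, noisy measurements one would fit all three parameters with a nonlinear least-squares routine; the linearization above just makes the idea transparent.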

  8. Introducing Bayesian thinking to high-throughput screening for false-negative rate estimation.

    PubMed

    Wei, Xin; Gao, Lin; Zhang, Xiaolei; Qian, Hong; Rowan, Karen; Mark, David; Peng, Zhengwei; Huang, Kuo-Sen

    2013-10-01

    High-throughput screening (HTS) has been widely used to identify active compounds (hits) that bind to biological targets. Because of cost concerns, the comprehensive screening of millions of compounds is typically conducted without replication. Real hits that fail to exhibit measurable activity in the primary screen due to random experimental errors will be lost as false-negatives. Conceivably, the projected false-negative rate is a parameter that reflects screening quality. Furthermore, it can be used to guide the selection of optimal numbers of compounds for hit confirmation. Therefore, a method that predicts false-negative rates from the primary screening data is extremely valuable. In this article, we describe the implementation of a pilot screen on a representative fraction (1%) of the screening library in order to obtain information about assay variability as well as a preliminary hit activity distribution profile. Using this training data set, we then developed an algorithm based on Bayesian logic and Monte Carlo simulation to estimate the number of true active compounds and potential missed hits from the full library screen. We have applied this strategy to five screening projects. The results demonstrate that this method produces useful predictions on the numbers of false negatives.
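
    The core notion, that assay noise pushes some true actives below the hit threshold, can be illustrated with a small Monte Carlo simulation. This is a sketch of the concept only; the distributions, threshold, and noise level below are made up, whereas the paper's Bayesian algorithm estimates these quantities from the 1% pilot screen.

```python
import numpy as np

# Monte Carlo sketch: true actives whose measured activity falls below
# the hit threshold (due to assay noise) are lost as false negatives.
# All numbers are illustrative assumptions, not from the paper.
rng = np.random.default_rng(42)
n_compounds, active_frac = 100_000, 0.01
noise_sd = 10.0            # assay variability (e.g. from a pilot screen)
threshold = 30.0           # % inhibition cutoff for calling a hit

is_active = rng.random(n_compounds) < active_frac
true_activity = np.where(is_active,
                         rng.uniform(30, 90, n_compounds),   # actives
                         rng.normal(0, 5, n_compounds))      # inactives
measured = true_activity + rng.normal(0, noise_sd, n_compounds)

# Fraction of true actives that the single-pass primary screen misses:
false_neg_rate = np.mean(measured[is_active] < threshold)
print(f"estimated false-negative rate: {false_neg_rate:.2%}")
```

    In the paper's setting the true activity distribution is of course unknown; the pilot-screen data stand in for it, and Bayesian updating plus simulation of this kind yields the projected number of missed hits.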

  9. An overview of bioinformatics methods for modeling biological pathways in yeast.

    PubMed

    Hou, Jie; Acharya, Lipi; Zhu, Dongxiao; Cheng, Jianlin

    2016-03-01

    The advent of high-throughput genomics techniques, along with the completion of genome sequencing projects, identification of protein-protein interactions and reconstruction of genome-scale pathways, has accelerated the development of systems biology research in the yeast Saccharomyces cerevisiae. In particular, discovery of biological pathways in yeast has become an important forefront in systems biology, which aims to understand the interactions among molecules within a cell that lead to certain cellular processes in response to a specific environment. While the existing theoretical and experimental approaches enable the investigation of well-known pathways involved in metabolism, gene regulation and signal transduction, bioinformatics methods offer new insights into computational modeling of biological pathways. A wide range of computational approaches has been proposed in the past for reconstructing biological pathways from high-throughput datasets. Here we review selected bioinformatics approaches for modeling biological pathways in S. cerevisiae, including metabolic pathways, gene-regulatory pathways and signaling pathways. We start by reviewing the research on biological pathways, followed by a discussion of key biological databases. In addition, several representative computational approaches for modeling biological pathways in yeast are discussed. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  10. Raman microspectrometer combined with scattering microscopy and lensless imaging for bacteria identification

    NASA Astrophysics Data System (ADS)

    Strola, S. A.; Schultz, E.; Allier, C. P.; DesRoches, B.; Lemmonier, J.; Dinten, J.-M.

    2013-03-01

    In this paper, we report on a compact prototype capable of lensfree imaging, Raman spectrometry, and scattering microscopy of bacteria samples. This instrument allows high-throughput real-time characterization without the need for markers, making it potentially suitable for field label-free biomedical and environmental applications. Samples are illuminated from above with a focused, collimated 532 nm laser beam and can be scanned in x, y, and z. Bacteria detection is based on emerging lensfree imaging technology able to localize cells of interest over a large field of view of 24 mm2. Raman signal and scattered light are then collected simultaneously by separate measurement arms. In the first arm, the emitted light is fed by a fiber into a prototype spectrometer developed by Tornado Spectral System, based on Tornado's novel High Throughput Virtual Slit (HTVS) technology. The enhanced light throughput in the spectral region of interest (500-1800 cm-1) reduces Raman acquisition time to a few seconds, facilitating experimental protocols and avoiding the bacteria deterioration induced by laser heating. Scattered light impinging on the second arm is collected by a charge-coupled device. The reconstructed image allows studying the diffraction patterns of single bacteria and their specific structural features. Characterization and identification of different bacteria were performed to validate and optimize the acquisition system and the component setup. The results demonstrate the benefits of combining these three techniques, providing precise bacteria localization, chemical composition, and a description of morphology. The procedure for rapid identification of particular pathogenic bacteria in a sample is illustrated.

  11. Lensless on-chip imaging of cells provides a new tool for high-throughput cell-biology and medical diagnostics.

    PubMed

    Mudanyali, Onur; Erlinger, Anthony; Seo, Sungkyu; Su, Ting-Wei; Tseng, Derek; Ozcan, Aydogan

    2009-12-14

    Conventional optical microscopes image cells by use of objective lenses that work together with other lenses and optical components. While quite effective, this classical approach has certain limitations for miniaturization of the imaging platform to make it compatible with the advanced state of the art in microfluidics. In this report, we introduce experimental details of a lensless on-chip imaging concept termed LUCAS (Lensless Ultra-wide field-of-view Cell monitoring Array platform based on Shadow imaging) that does not require any microscope objectives or other bulky optical components to image a heterogeneous cell solution over an ultra-wide field of view that can span as large as approximately 18 cm(2). Moreover, unlike conventional microscopes, LUCAS can image a heterogeneous cell solution of interest over a depth of field of approximately 5 mm without the need for refocusing, which corresponds to a sample volume of up to approximately 9 mL. This imaging platform records the shadows (i.e., lensless digital holograms) of each cell of interest within its field of view, and automated digital processing of these cell shadows can determine the type, the count and the relative positions of cells within the solution. Because it does not require any bulky optical components or mechanical scanning stages, it offers a significantly miniaturized platform that at the same time reduces cost, which is especially important for point-of-care diagnostic tools. Furthermore, the imaging throughput of this platform is orders of magnitude better than conventional optical microscopes, which could be exceedingly valuable for high-throughput cell-biology experiments.

  12. Lensless On-chip Imaging of Cells Provides a New Tool for High-throughput Cell-Biology and Medical Diagnostics

    PubMed Central

    Mudanyali, Onur; Erlinger, Anthony; Seo, Sungkyu; Su, Ting-Wei; Tseng, Derek; Ozcan, Aydogan

    2009-01-01

    Conventional optical microscopes image cells by use of objective lenses that work together with other lenses and optical components. While quite effective, this classical approach has certain limitations for miniaturization of the imaging platform to make it compatible with the advanced state of the art in microfluidics. In this report, we introduce experimental details of a lensless on-chip imaging concept termed LUCAS (Lensless Ultra-wide field-of-view Cell monitoring Array platform based on Shadow imaging) that does not require any microscope objectives or other bulky optical components to image a heterogeneous cell solution over an ultra-wide field of view that can span as large as ~18 cm2. Moreover, unlike conventional microscopes, LUCAS can image a heterogeneous cell solution of interest over a depth of field of ~5 mm without the need for refocusing, which corresponds to a sample volume of up to ~9 mL. This imaging platform records the shadows (i.e., lensless digital holograms) of each cell of interest within its field of view, and automated digital processing of these cell shadows can determine the type, the count and the relative positions of cells within the solution. Because it does not require any bulky optical components or mechanical scanning stages, it offers a significantly miniaturized platform that at the same time reduces cost, which is especially important for point-of-care diagnostic tools. Furthermore, the imaging throughput of this platform is orders of magnitude better than conventional optical microscopes, which could be exceedingly valuable for high-throughput cell-biology experiments. PMID:20010542

  13. Development of a high-throughput brain slice method for studying drug distribution in the central nervous system.

    PubMed

    Fridén, Markus; Ducrozet, Frederic; Middleton, Brian; Antonsson, Madeleine; Bredberg, Ulf; Hammarlund-Udenaes, Margareta

    2009-06-01

    New, more efficient methods of estimating unbound drug concentrations in the central nervous system (CNS) combine the amount of drug in whole brain tissue samples measured by conventional methods with in vitro estimates of the unbound brain volume of distribution (V(u,brain)). Although the brain slice method is the most reliable in vitro method for measuring V(u,brain), it has not previously been adapted for the needs of drug discovery research. The aim of this study was to increase the throughput and optimize the experimental conditions of this method. Equilibrium of drug between the buffer and the brain slice within the 4 to 5 h of incubation is a fundamental requirement. However, it is difficult to meet this requirement for many of the extensively binding, lipophilic compounds in drug discovery programs. In this study, the dimensions of the incubation vessel and mode of stirring influenced the equilibration time, as did the amount of brain tissue per unit of buffer volume. The use of cassette experiments for investigating V(u,brain) in a linear drug concentration range increased the throughput of the method. The V(u,brain) for the model compounds ranged from 4 to 3000 ml·g brain(-1), and the sources of variability are discussed. The optimized setup of the brain slice method allows precise, robust estimation of V(u,brain) for drugs with diverse properties, including highly lipophilic compounds. This is a critical step forward for the implementation of relevant measurements of CNS exposure in the drug discovery setting.
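
    The way V(u,brain) links the slice assay to unbound brain concentrations can be written out directly: at slice equilibrium, V(u,brain) = A(slice)/C(buffer), and for a whole-brain sample, C(u,brain) = A(brain)/V(u,brain). A tiny numerical sketch, with hypothetical values chosen only for illustration:

```python
# Sketch of the arithmetic behind the combined approach described above.
# All numerical values are hypothetical, for illustration only.
a_slice = 120.0    # drug amount in slice at equilibrium (ng/g brain)
c_buffer = 0.4     # unbound buffer concentration (ng/ml)
v_u_brain = a_slice / c_buffer          # ml/g brain, from the slice assay

a_brain_vivo = 600.0                    # total amount in a brain sample (ng/g brain)
c_u_brain = a_brain_vivo / v_u_brain    # estimated unbound concentration (ng/ml)
print(v_u_brain, c_u_brain)  # 300.0 2.0
```

    A large V(u,brain), as for the extensively binding lipophilic compounds discussed in the abstract, thus translates a given whole-brain amount into a correspondingly small unbound concentration.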

  14. HIGH THROUGHPUT ASSESSMENTS OF CONVENTIONAL AND ALTERNATIVE COMPOUNDS

    EPA Science Inventory

    High throughput approaches for quantifying chemical hazard, exposure, and sustainability have the potential to dramatically impact the pace and nature of risk assessments. Integrated evaluation strategies developed at the US EPA incorporate inherency, bioactivity, bioavailability, ...

  15. CoMiniGut-a small volume in vitro colon model for the screening of gut microbial fermentation processes.

    PubMed

    Wiese, Maria; Khakimov, Bekzod; Nielsen, Sebastian; Sørensen, Helena; van den Berg, Frans; Nielsen, Dennis Sandris

    2018-01-01

    Driven by the growing recognition of the influence of the gut microbiota (GM) on human health and disease, there is a rapidly increasing interest in understanding how dietary components, pharmaceuticals and pre- and probiotics influence the GM. In vitro colon models represent an attractive tool for this purpose. With the dual objective of facilitating the investigation of rare and expensive compounds and of increasing throughput, we have developed a prototype in vitro parallel gut microbial fermentation screening tool with a working volume of only 5 ml, consisting of five parallel reactor units that can be expanded in multiples of five to increase throughput. This allows, for example, the investigation of interpersonal variations in gut microbial dynamics and the acquisition of larger data sets with enhanced statistical inference. The functionality of the in vitro colon model, Copenhagen MiniGut (CoMiniGut), was first demonstrated in experiments with two common prebiotics, the oligosaccharide inulin and the disaccharide lactulose, at 1% (w/v). We then investigated fermentation of the scarce and expensive human milk oligosaccharides (HMOs) 3-Fucosyllactose (3'FL), 3-Sialyllactose (3'SL) and 6-Sialyllactose (6'SL), as well as the more common Fructooligosaccharide, in fermentations with infant gut microbial communities. Investigations of microbial community composition dynamics in the CoMiniGut reactors by MiSeq-based 16S rRNA gene amplicon high-throughput sequencing showed excellent experimental reproducibility and allowed us to extract significant differences in gut microbial composition after 24 h of fermentation for all investigated substrates and fecal donors. Furthermore, short-chain fatty acids (SCFAs) were quantified for all treatments and donors. Fermentations with inulin and lactulose showed that inulin leads to a microbiota dominated by obligate anaerobes, with high relative abundance of Bacteroidetes, while the more easily fermented lactulose leads to higher relative abundance of Proteobacteria. The subsequent study of the influence of HMOs on two infant GM communities revealed the strongest bifidogenic effect of 3'SL for both infants. Inter-individual differences in infant GM, especially with regard to the occurrence of Bacteroidetes and differences in bifidobacterial species composition, correlated with varying degrees of HMO utilization, most notably of 6'SL and 3'FL, indicating species- and strain-related differences in HMO utilization. This was also reflected in SCFA concentrations, with 3'SL and 6'SL resulting in significantly higher butyrate production than 3'FL. In conclusion, the increased throughput of CoMiniGut strengthens experimental conclusions by reducing the statistical limitations that arise from a low number of replicates. Its small working volume moreover allows the investigation of rare and expensive bioactives.

  16. Computer Simulation and Field Experiment for Downlink Multiuser MIMO in Mobile WiMAX System.

    PubMed

    Yamaguchi, Kazuhiro; Nagahashi, Takaharu; Akiyama, Takuya; Matsue, Hideaki; Uekado, Kunio; Namera, Takakazu; Fukui, Hiroshi; Nanamatsu, Satoshi

    2015-01-01

    The transmission performance of a downlink mobile WiMAX system with multiuser multiple-input multiple-output (MU-MIMO) is described through computer simulation and a field experiment. In the computer simulation, the MU-MIMO transmission system is realized using the block diagonalization (BD) algorithm, so that each user receives its signal without interference from the other users. The bit error rate (BER) performance and channel capacity, as functions of modulation scheme and number of streams, were simulated in a spatially correlated multipath fading environment. Furthermore, we propose a simulation-based method for evaluating the transmission performance of this downlink mobile WiMAX system in such an environment. In the field experiment, the received power and downlink UDP-layer throughput were measured on an experimental mobile WiMAX system deployed in Azumino City, Japan. The measured maximum downlink throughput closely matched the simulated throughput, confirming that the experimental mobile WiMAX system with MU-MIMO transmission successfully increased the total channel capacity of the system.
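
    The block diagonalization step can be sketched with the standard SVD-based construction: each user's precoder is taken from the null space of the other users' stacked channel matrices, so inter-user interference vanishes. The antenna counts and random channels below are illustrative assumptions, not the paper's system parameters.

```python
import numpy as np

# Sketch of block diagonalization (BD) precoding for MU-MIMO downlink.
# Antenna counts and random complex channels are illustrative only.
rng = np.random.default_rng(0)
n_tx, n_rx, n_users = 4, 2, 2
H = [rng.standard_normal((n_rx, n_tx)) + 1j * rng.standard_normal((n_rx, n_tx))
     for _ in range(n_users)]

precoders = []
for k in range(n_users):
    # Stack the other users' channels and find their null space via SVD.
    H_others = np.vstack([H[j] for j in range(n_users) if j != k])
    _, s, Vh = np.linalg.svd(H_others)
    rank = int(np.sum(s > 1e-10))
    V0 = Vh[rank:].conj().T      # columns span the null space of H_others
    precoders.append(V0)

# User k's precoder causes (numerically) zero interference at every other user.
leakage = max(np.linalg.norm(H[j] @ precoders[k])
              for k in range(n_users) for j in range(n_users) if j != k)
print(leakage < 1e-9)  # True
```

    With the interference nulled, each user's effective channel H[k] @ precoders[k] can be treated as an independent single-user MIMO link, which is what makes per-user BER and capacity simulation tractable.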

  17. Computer Simulation and Field Experiment for Downlink Multiuser MIMO in Mobile WiMAX System

    PubMed Central

    Yamaguchi, Kazuhiro; Nagahashi, Takaharu; Akiyama, Takuya; Matsue, Hideaki; Uekado, Kunio; Namera, Takakazu; Fukui, Hiroshi; Nanamatsu, Satoshi

    2015-01-01

    The transmission performance of a downlink mobile WiMAX system with multiuser multiple-input multiple-output (MU-MIMO) is described through computer simulation and a field experiment. In the computer simulation, the MU-MIMO transmission system is realized using the block diagonalization (BD) algorithm, so that each user receives its signal without interference from the other users. The bit error rate (BER) performance and channel capacity, as functions of modulation scheme and number of streams, were simulated in a spatially correlated multipath fading environment. Furthermore, we propose a simulation-based method for evaluating the transmission performance of this downlink mobile WiMAX system in such an environment. In the field experiment, the received power and downlink UDP-layer throughput were measured on an experimental mobile WiMAX system deployed in Azumino City, Japan. The measured maximum downlink throughput closely matched the simulated throughput, confirming that the experimental mobile WiMAX system with MU-MIMO transmission successfully increased the total channel capacity of the system. PMID:26421311

  18. GiNA, an efficient and high-throughput software for horticultural phenotyping

    USDA-ARS's Scientific Manuscript database

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed,...

  19. High-throughput quantification of hydroxyproline for determination of collagen.

    PubMed

    Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan

    2011-10-15

    An accurate, high-throughput assay for collagen is essential for collagen research and the development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The time required for sample preparation (acid hydrolysis and neutralization prior to assay) is what limits the current method for determining hydroxyproline. This work describes conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline. Copyright © 2011 Elsevier Inc. All rights reserved.
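
    The quantification step common to such colorimetric assays, a linear standard curve relating absorbance to hydroxyproline concentration, can be sketched as follows. The standard values and the hydroxyproline-to-collagen conversion factor (about 7.46, assuming collagen is roughly 13.4% hydroxyproline) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Sketch: back-calculate hydroxyproline from absorbance via a linear
# standard curve, then convert to collagen. Standards and the 7.46
# conversion factor are illustrative assumptions.
std_conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])    # hydroxyproline standards (ug/ml)
std_abs  = np.array([0.02, 0.13, 0.24, 0.46, 0.90])  # measured absorbances

slope, intercept = np.polyfit(std_conc, std_abs, 1)  # least-squares line

def hyp_conc(absorbance):
    """Hydroxyproline concentration (ug/ml) from sample absorbance."""
    return (absorbance - intercept) / slope

sample_hyp = hyp_conc(0.35)        # an unknown sample's absorbance
collagen = sample_hyp * 7.46       # assumed hydroxyproline-to-collagen factor
print(round(sample_hyp, 2), round(collagen, 2))
```

    The throughput gain described in the abstract comes from the hydrolysis step, not this calculation, which is the same regardless of how the hydrolysate was prepared.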

  20. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for handling and reducing Sr isotope data from the Thermo Scientific Neptune software to assist in data quality assurance, helping to avoid the data glut associated with high-throughput rapid analysis.
