Sample records for high throughput large

  1. High Throughput Experimental Materials Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakutayev, Andriy; Perkins, John; Schwarting, Marcus

    The mission of the High Throughput Experimental Materials Database (HTEM DB) is to enable discovery of new materials with useful properties by releasing large amounts of high-quality experimental data to the public. The HTEM DB contains information about materials obtained from high-throughput experiments at the National Renewable Energy Laboratory (NREL).

  2. Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos

    EPA Science Inventory

    Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...

  3. High throughput single cell counting in droplet-based microfluidics.

    PubMed

    Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie

    2017-05-02

    Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at high throughput, and was used to characterize cell encapsulation and cell viability during incubation in droplets.
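
    Cell encapsulation in droplets of this kind is commonly modeled with Poisson statistics: the average loading rate determines what fraction of droplets hold exactly one cell. A minimal sketch of that standard model (not code from the paper):

```python
import math

def poisson_fraction(lam, k):
    """Fraction of droplets expected to contain exactly k cells
    when cells are loaded at an average of lam cells per droplet."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# At a dilute loading of lam = 0.1, most droplets are empty, and
# nearly all occupied droplets contain a single cell.
empty = poisson_fraction(0.1, 0)
single = poisson_fraction(0.1, 1)
```

This is why single-cell droplet assays accept many empty droplets: pushing the loading rate higher raises the fraction of droplets with two or more cells.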

  4. High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  5. Large-scale microfluidics providing high-resolution and high-throughput screening of Caenorhabditis elegans poly-glutamine aggregation model

    NASA Astrophysics Data System (ADS)

    Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela

    2016-10-01

    Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip at micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.

  6. Large-scale DNA Barcode Library Generation for Biomolecule Identification in High-throughput Screens.

    PubMed

    Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio

    2017-10-24

    High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing for the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
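
    The constraints listed (GC content, homopolymer runs, minimum Hamming distance, blacklisted subsequences) can be illustrated with a naïve generate-and-filter loop — the slow baseline the authors' framework improves on, not their algorithm. All thresholds and the example blacklist below are illustrative assumptions:

```python
import random

def gc_ok(seq, lo=0.4, hi=0.6):
    """GC fraction within an acceptable band."""
    gc = sum(base in "GC" for base in seq) / len(seq)
    return lo <= gc <= hi

def homopolymer_ok(seq, max_run=2):
    """No run of identical bases longer than max_run."""
    run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if prev == cur else 1
        if run > max_run:
            return False
    return True

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def build_library(length=8, min_dist=3, target=50,
                  blacklist=("GAATTC",), seed=0):
    """Greedily accept random barcodes satisfying all constraints."""
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < target:
        cand = "".join(rng.choice("ACGT") for _ in range(length))
        if not (gc_ok(cand) and homopolymer_ok(cand)):
            continue
        if any(bad in cand for bad in blacklist):
            continue
        # Pairwise Hamming check against every accepted barcode --
        # this quadratic step is what makes the naive approach slow
        # at the million-to-billion scale the paper targets.
        if all(hamming(cand, prev) >= min_dist for prev in accepted):
            accepted.append(cand)
    return accepted

barcodes = build_library()
```

The quadratic distance check is tolerable for 50 barcodes but infeasible for a billion, which motivates the paper's faster construction.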

  7. Evaluation of Compatibility of ToxCast High-Throughput/High-Content Screening Assays with Engineered Nanomaterials

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  8. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  9. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE PAGES

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...

    2017-03-28

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  10. MIPHENO: Data normalization for high throughput metabolic analysis.

    EPA Science Inventory

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  11. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  12. High Throughput Exposure Estimation Using NHANES Data (SOT)

    EPA Science Inventory

    In the ExpoCast project, high throughput (HT) exposure models enable rapid screening of large numbers of chemicals for exposure potential. Evaluation of these models requires empirical exposure data and due to the paucity of human metabolism/exposure data such evaluations includ...

  13. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    NASA Astrophysics Data System (ADS)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established a quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
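
    The screening flow path the review proposes can be pictured as a chain of property filters applied to a table of candidates. The sketch below uses hypothetical materials and cutoff values, not numbers from the paper, and covers only the bulk criteria (capacity, voltage, volume change) the review says earlier screens were limited to:

```python
# Illustrative candidate table: gravimetric capacity (mAh/g),
# average voltage (V), and volume change on cycling (%).
candidates = {
    "A": {"capacity": 170, "voltage": 3.4, "vol_change": 2.1},
    "B": {"capacity": 120, "voltage": 3.9, "vol_change": 6.5},
    "C": {"capacity": 200, "voltage": 2.5, "vol_change": 1.0},
}

def passes(props, min_cap=140, min_v=3.0, max_dv=5.0):
    """Keep a candidate only if every bulk criterion is met."""
    return (props["capacity"] >= min_cap
            and props["voltage"] >= min_v
            and props["vol_change"] <= max_dv)

hits = [name for name, p in candidates.items() if passes(p)]
```

The review's point is that further filters (defect formation energies, interface stability, dopability, nanosize effects) would be added as extra predicates in the same pipeline.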

  14. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations.

    PubMed

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established a quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.

  15. AOPs and Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making

    EPA Science Inventory

    As high throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will b...

  16. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    EPA Science Inventory

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...

  17. Next-generation sequencing coupled with a cell-free display technology for high-throughput production of reliable interactome data

    PubMed Central

    Fujimori, Shigeo; Hirai, Naoya; Ohashi, Hiroyuki; Masuoka, Kazuyo; Nishikimi, Akihiko; Fukui, Yoshinori; Washio, Takanori; Oshikubo, Tomohiro; Yamashita, Tatsuhiro; Miyamoto-Sato, Etsuko

    2012-01-01

    Next-generation sequencing (NGS) has been applied to various kinds of omics studies, resulting in many biological and medical discoveries. However, high-throughput protein-protein interactome datasets derived from detection by sequencing are scarce, because protein-protein interaction analysis requires many cell manipulations to examine the interactions. The low reliability of the high-throughput data is also a problem. Here, we describe a cell-free display technology combined with NGS that can improve both the coverage and reliability of interactome datasets. The completely cell-free method gives a high-throughput and a large detection space, testing the interactions without using clones. The quantitative information provided by NGS reduces the number of false positives. The method is suitable for the in vitro detection of proteins that interact not only with the bait protein, but also with DNA, RNA and chemical compounds. Thus, it could become a universal approach for exploring the large space of protein sequences and interactome networks. PMID:23056904

  18. High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics

    NASA Astrophysics Data System (ADS)

    Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine

    2016-06-01

    Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10⁴-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.
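
    Fold enrichment after a sort round is conventionally the ratio of the active-clone fraction after sorting to the fraction before. A small sketch with made-up counts that happen to reproduce a 196-fold value (the paper does not report these counts):

```python
def fold_enrichment(active_before, total_before, active_after, total_after):
    """Ratio of the active-clone fraction after sorting to before."""
    before = active_before / total_before
    after = active_after / total_after
    return after / before

# Hypothetical counts, chosen only so the ratio comes out at 196-fold:
# 1 active clone in 200 before sorting, 49 in 50 after sorting.
result = fold_enrichment(1, 200, 49, 50)
```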

  19. Predictive Model of Rat Reproductive Toxicity from ToxCast High Throughput Screening

    EPA Science Inventory

    The EPA ToxCast research program uses high throughput screening for bioactivity profiling and predicting the toxicity of large numbers of chemicals. ToxCast Phase‐I tested 309 well‐characterized chemicals in over 500 assays for a wide range of molecular targets and cellular respo...

  20. High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT)

    USDA-ARS?s Scientific Manuscript database

    Implementation of molecular methods in hop breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. Diversity Arrays Technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of...

  1. Sensitivity of neuroprogenitor cells to chemical-induced apoptosis using a multiplexed assay suitable for high-throughput screening*

    EPA Science Inventory

    High-throughput methods are useful for rapidly screening large numbers of chemicals for biological activity, including the perturbation of pathways that may lead to adverse cellular effects. In vitro assays for the key events of neurodevelopment, including apoptosis, may ...

  2. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    PubMed

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive imaging-based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high throughput amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency.
The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis leads to a substantial extension of the feature spectrum to be assessed in the presented high throughput automated plant phenotyping platforms, thus enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m height) and architectures. The deeper understanding of the relation of plant architecture, biomass formation and photosynthetic efficiency has a great potential with respect to crop and yield improvement strategies.
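
    The two fluorescence parameters discussed here are standard ratios. Assuming the usual definitions (Fv/Fm from dark-adapted minimal and maximal fluorescence, ΦPSII from the light-adapted steady-state and maximal signals), a minimal sketch:

```python
def fv_over_fm(f0, fm):
    """Maximum PSII efficiency from dark-adapted minimal (F0)
    and maximal (Fm) fluorescence: Fv/Fm = (Fm - F0) / Fm."""
    return (fm - f0) / fm

def phi_psii(f_prime, fm_prime):
    """PSII operating efficiency in the light-adapted state:
    PhiPSII = (Fm' - F') / Fm'."""
    return (fm_prime - f_prime) / fm_prime

# A healthy dark-adapted leaf typically shows Fv/Fm near 0.8;
# the fluorescence values here are illustrative instrument counts.
example_fvfm = fv_over_fm(f0=400, fm=2000)
```

Both ratios are dimensionless, which is why values measured at slightly different incident light intensities (as with differently sized plants in the setup above) remain comparable.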

  3. A high-throughput microRNA expression profiling system.

    PubMed

    Guo, Yanwen; Mastriano, Stephen; Lu, Jun

    2014-01-01

    As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low-cost, high-throughput, high detection specificity, and flexibility to profile both small and large numbers of samples, this protocol can be adapted in a wide range of laboratory settings.

  4. tcpl: the ToxCast pipeline for high-throughput screening data.

    PubMed

    Filer, Dayne L; Kothiya, Parth; Setzer, R Woodrow; Judson, Richard S; Martin, Matthew T

    2017-02-15

    Large high-throughput screening (HTS) efforts are widely used in drug development and chemical toxicity screening. Wide use and integration of these data can benefit from an efficient, transparent and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform for efficiently storing, normalizing and dose-response modeling of large high-throughput and high-content chemical screening data. The novel dose-response modeling algorithm has been tested against millions of diverse dose-response series, and robustly fits data with outliers and cytotoxicity-related signal loss. tcpl is freely available on the Comprehensive R Archive Network under the GPL-2 license. martin.matt@epa.gov. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
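
    tcpl's dose-response modeling includes a Hill model; the sketch below uses the common three-parameter Hill form with a crude grid search standing in for the package's actual fitting machinery. Parameter names `ga` (AC50) and `gw` (Hill coefficient) echo tcpl's conventions, but the data are synthetic:

```python
def hill(conc, top, ga, gw):
    """Three-parameter Hill curve:
    response = top / (1 + (ga / conc) ** gw)."""
    return top / (1.0 + (ga / conc) ** gw)

# Noiseless synthetic dose-response series with a known AC50 of 3.0.
concs = [0.1, 0.3, 1, 3, 10, 30]
obs = [hill(c, top=100, ga=3.0, gw=1.2) for c in concs]

def sse(ga):
    """Sum of squared errors for a candidate AC50, other params fixed."""
    return sum((hill(c, 100, ga, 1.2) - o) ** 2 for c, o in zip(concs, obs))

# Toy grid search over AC50 values 0.5 .. 10.4 in steps of 0.1.
best_ga = min((round(0.5 + 0.1 * i, 1) for i in range(100)), key=sse)
```

A defining property of the Hill form is that the response at `conc == ga` is exactly half of `top`, which is what makes `ga` interpretable as the AC50.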

  5. Quantitative description on structure–property relationships of Li-ion battery materials for high-throughput computations

    PubMed Central

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure–property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established a quantitative description of structure–property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure–property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials. PMID:28458737

  6. Review of high-throughput techniques for detecting solid phase Transformation from material libraries produced by combinatorial methods

    NASA Technical Reports Server (NTRS)

    Lee, Jonathan A.

    2005-01-01

    High-throughput measurement techniques are reviewed for solid phase transformation from materials produced by combinatorial methods, which are highly efficient concepts to fabricate a large variety of material libraries with different compositional gradients on a single wafer. Combinatorial methods hold high potential for reducing the time and costs associated with the development of new materials, as compared to time-consuming and labor-intensive conventional methods that test large batches of material, one composition at a time. These high-throughput techniques can be automated to rapidly capture and analyze data, using the entire material library on a single wafer, thereby accelerating the pace of materials discovery and knowledge generation for solid phase transformations. The review covers experimental techniques that are applicable to inorganic materials such as shape memory alloys, graded materials, metal hydrides, ferric materials, semiconductors and industrial alloys.

  7. A Framework for Linking High-Throughput Modeling and Measurement Efforts to Advance 21st Century Exposure Science

    EPA Science Inventory

    The past five years have witnessed a rapid shift in the exposure science and toxicology communities towards high-throughput (HT) analyses of chemicals as potential stressors of human and ecological health. Modeling efforts have largely led the charge in the exposure science field...

  8. Uncertainty Quantification in High Throughput Screening: Applications to Models of Endocrine Disruption, Cytotoxicity, and Zebrafish Development (GRC Drug Safety)

    EPA Science Inventory

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...

  9. Differentiating pathway-based toxicity from non-specific effects in high throughput data: A foundation for prioritizing targets for AOP development.

    EPA Science Inventory

    The Environmental Protection Agency has implemented a high throughput screening program, ToxCast, to quickly evaluate large numbers of chemicals for their effects on hundreds of different biological targets. To understand how these measurements relate to adverse effects in an or...

  10. Development of a high-throughput SNP resource to advance genomic, genetic and breeding research in carrot (Daucus carota L.)

    USDA-ARS?s Scientific Manuscript database

    The rapid advancement in high-throughput SNP genotyping technologies along with next generation sequencing (NGS) platforms has decreased the cost, improved the quality of large-scale genome surveys, and allowed specialty crops with limited genomic resources such as carrot (Daucus carota) to access t...

  11. Integrated automation for continuous high-throughput synthetic chromosome assembly and transformation to identify improved yeast strains for industrial production of peptide sweetener brazzein

    USDA-ARS?s Scientific Manuscript database

    Production and recycling of recombinant sweetener peptides in industrial biorefineries involves the evaluation of large numbers of genes and proteins. High-throughput integrated robotic molecular biology platforms that have the capacity to rapidly synthesize, clone, and express heterologous gene ope...

  12. Life in the fast lane: high-throughput chemistry for lead generation and optimisation.

    PubMed

    Hunter, D

    2001-01-01

    The pharmaceutical industry has come under increasing pressure due to regulatory restrictions on the marketing and pricing of drugs, competition, and the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.

  13. High-throughput analysis of yeast replicative aging using a microfluidic system

    PubMed Central

    Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong

    2015-01-01

    Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cells, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317

  14. Accelerating the design of solar thermal fuel materials through high throughput simulations.

    PubMed

    Liu, Yun; Grossman, Jeffrey C

    2014-12-10

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
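
    The screening quantity here is gravimetric energy density: the isomerization enthalpy of the metastable structure divided by molar mass. A toy filter with round-number values loosely inspired by well-known photoswitch families, not data from the Letter:

```python
def energy_density_kj_per_g(delta_h_kj_per_mol, molar_mass_g_per_mol):
    """Gravimetric storage density of a photoswitch candidate."""
    return delta_h_kj_per_mol / molar_mass_g_per_mol

# Hypothetical (isomerization enthalpy kJ/mol, molar mass g/mol) pairs.
candidates = {
    "azobenzene-like": (50.0, 182.0),
    "norbornadiene-like": (90.0, 92.0),
}

# Keep candidates above an illustrative storage-density cutoff.
hits = [name for name, (dh, m) in candidates.items()
        if energy_density_kj_per_g(dh, m) >= 0.3]
```

A high-throughput screen of this kind would evaluate the enthalpies ab initio for large numbers of candidate molecules and apply the same filter, together with stability criteria for the metastable isomer.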

  15. Structuring intuition with theory: The high-throughput way

    NASA Astrophysics Data System (ADS)

    Fornari, Marco

    2015-03-01

    First principles methodologies have grown in accuracy and applicability to the point where large databases can be built, shared, and analyzed with the goal of predicting novel compositions, optimizing functional properties, and discovering unexpected relationships between the data. In order to be useful to a large community of users, data should be standardized, validated, and distributed. In addition, tools to easily manage large datasets should be made available to effectively lead to materials development. Within the AFLOW consortium we have developed a simple framework to expand, validate, and mine data repositories: the MTFrame. Our minimalistic approach complements AFLOW and other existing high-throughput infrastructures and aims to integrate data generation with data analysis. We present a few examples from our work on materials for energy conversion. Our intent is to pinpoint the usefulness of high-throughput methodologies to guide the discovery process by quantitatively structuring the scientific intuition. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.

  16. Strain Library Imaging Protocol for high-throughput, automated single-cell microscopy of large bacterial collections arrayed on multiwell plates.

    PubMed

    Shi, Handuo; Colavin, Alexandre; Lee, Timothy K; Huang, Kerwyn Casey

    2017-02-01

    Single-cell microscopy is a powerful tool for studying gene functions using strain libraries, but it suffers from throughput limitations. Here we describe the Strain Library Imaging Protocol (SLIP), which is a high-throughput, automated microscopy workflow for large strain collections that requires minimal user involvement. SLIP involves transferring arrayed bacterial cultures from multiwell plates onto large agar pads using inexpensive replicator pins and automatically imaging the resulting single cells. The acquired images are subsequently reviewed and analyzed by custom MATLAB scripts that segment single-cell contours and extract quantitative metrics. SLIP yields rich data sets on cell morphology and gene expression that illustrate the function of certain genes and the connections among strains in a library. For a library arrayed on 96-well plates, image acquisition can be completed within 4 min per plate.

  17. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    PubMed

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high-throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed for AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  18. Improvement of High-throughput Genotype Analysis After Implementation of a Dual-curve Sybr Green I-based Quantification and Normalization Procedure

    USDA-ARS?s Scientific Manuscript database

    The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...

  19. TCP Throughput Profiles Using Measurements over Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata

    Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
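    The ramp-up/sustainment abstraction described in this record can be sketched as a toy model (the function name, doubling rule, and initial-window default below are illustrative assumptions, not the authors' parameterization): slow-start doubles the effective window each RTT until the path rate is reached, after which the remaining data moves at the sustained rate, so longer RTTs mostly lengthen the ramp-up phase.

    ```python
    def mean_throughput(size_bytes, rtt_s, sustained_rate_bps, init_cwnd_bits=10 * 1500 * 8):
        """Two-phase throughput sketch: exponential slow-start ramp-up,
        then transfer of the remainder at the sustained rate."""
        total_bits = size_bytes * 8
        sent_bits, elapsed, window = 0.0, 0.0, float(init_cwnd_bits)
        # Ramp-up: the window doubles each RTT until it saturates the path.
        while window / rtt_s < sustained_rate_bps and sent_bits < total_bits:
            sent_bits += window
            elapsed += rtt_s
            window *= 2
        # Sustainment: remaining data moves at the sustained rate.
        elapsed += max(total_bits - sent_bits, 0.0) / sustained_rate_bps
        return total_bits / elapsed  # mean throughput in bits per second
    ```

    For a large transfer, throughput under this sketch decays only slowly as RTT grows, because the fixed-cost ramp-up is amortized over a long sustainment phase, which is consistent with the concave regions the measurements report.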

  20. Operational evaluation of high-throughput community-based mass prophylaxis using Just-in-time training.

    PubMed

    Spitzer, James D; Hupert, Nathaniel; Duckart, Jonathan; Xiong, Wei

    2007-01-01

    Community-based mass prophylaxis is a core public health operational competency, but staffing needs may overwhelm the local trained health workforce. Just-in-time (JIT) training of emergency staff and computer modeling of workforce requirements represent two complementary approaches to this logistical problem. Multnomah County, Oregon, conducted a high-throughput point-of-dispensing (POD) exercise to test JIT training and computer modeling to validate POD staffing estimates. Non-health-care workers made up 84% of POD staff, and the POD processed 500 patients per hour. Post-exercise modeling replicated observed staff utilization levels and queue formation, including the development and amelioration of a large medical evaluation queue caused by lengthy processing times and understaffing in the first half-hour of the exercise. The exercise confirmed the feasibility of using JIT training for high-throughput antibiotic dispensing clinics staffed largely by nonmedical professionals. Patient processing times varied over the course of the exercise, with important implications for both staff reallocation and future POD modeling efforts. Overall underutilization of staff revealed the opportunity for greater efficiencies and even higher future throughputs.
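    A first-order version of the staffing arithmetic behind POD models can be sketched as follows (a deliberate simplification, not the model used in the exercise; the function name, utilization target, and deterministic offered-load approach are assumptions for illustration):

    ```python
    import math

    def required_staff(arrivals_per_hour, service_minutes, target_utilization=0.85):
        """Deterministic staffing estimate for one POD station:
        offered load (Erlangs) = arrival rate x mean service time;
        provide enough servers to keep load below the target utilization."""
        load_erlangs = arrivals_per_hour * (service_minutes / 60.0)
        return math.ceil(load_erlangs / target_utilization)
    ```

    At the exercise's 500 patients per hour, a station with a 1.2-minute mean service time would need roughly a dozen staff under this sketch; lengthier processing times in the first half-hour, as reported, would drive the requirement up and queues would form until staffing caught up.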

  1. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use, free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; and then outputs the results in two different forms in spreadsheets for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer flexibility in dealing with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software will help make 3D tumor spheroids a routine in vitro model for drug screens in industry and academia.
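    The volume step described above (major and minor axial lengths in, volume out) is commonly done with the prolate-spheroid formula; whether SpheroidSizer uses exactly this formula is an assumption, but it is the standard approach for two-axis spheroid measurements:

    ```python
    import math

    def spheroid_volume(major_axis, minor_axis):
        """Prolate-spheroid volume from the two measured axial lengths
        (full axes, not radii): V = (pi / 6) * major * minor^2."""
        return math.pi / 6.0 * major_axis * minor_axis ** 2
    ```

    For a perfectly round spheroid the formula reduces to the volume of a sphere of the same diameter.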

  2. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    PubMed

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate a considerable amount of data, which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering, and converting data produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next-generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction, and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in terms of input file handling provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp).
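    The core operation HTDP exposes through its GUI, filtering character-delimited column data against a set of criteria, can be sketched in a few lines (this is an independent illustration of the operation, not HTDP's code; the function name and the sample BED text are assumptions):

    ```python
    import csv
    import io

    def filter_rows(delimited_text, column, allowed, delimiter="\t"):
        """Keep rows whose value in `column` (0-based) appears in `allowed`,
        mirroring a criteria-file filter over BED/GFF-style column data."""
        reader = csv.reader(io.StringIO(delimited_text), delimiter=delimiter)
        return [row for row in reader
                if len(row) > column and row[column] in allowed]

    # A minimal BED-like input: chromosome, start, end.
    bed_text = "chr1\t100\t200\nchr2\t300\t400\nchr1\t500\t600\n"
    ```

    Running the filter with `allowed={"chr1"}` keeps only the two chr1 intervals; an external criteria file in HTDP plays the role of the `allowed` set here.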

  3. High-Throughput Tabular Data Processor – Platform independent graphical tool for processing large data sets

    PubMed Central

    Bałut, Magdalena; Buckley, Patrick G.; Ochocka, J. Renata; Bartoszewski, Rafał; Crossman, David K.; Messiaen, Ludwine M.; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate a considerable amount of data, which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering, and converting data produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next-generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction, and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in terms of input file handling provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp). PMID:29432475

  4. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  5. High Throughput, High Content Screening for Novel Pigmentation Regulators Using a Keratinocyte/Melanocyte Co-culture System

    PubMed Central

    Lee, Ju Hee; Chen, Hongxiang; Kolev, Vihren; Aull, Katherine H.; Jung, Inhee; Wang, Jun; Miyamoto, Shoko; Hosoi, Junichi; Mandinova, Anna; Fisher, David E.

    2014-01-01

    Skin pigmentation is a complex process including melanogenesis within melanocytes and melanin transfer to the keratinocytes. To develop a comprehensive screening method for novel pigmentation regulators, we used immortalized melanocytes and keratinocytes in co-culture to screen large numbers of compounds. High-throughput screening plates were subjected to digital automated microscopy to quantify the pigmentation via brightfield microscopy. Compounds with pigment suppression were secondarily tested for their effects on expression of MITF and several pigment regulatory genes, and further validated in terms of non-toxicity to keratinocytes/melanocytes and dose dependent activity. The results demonstrate a high-throughput, high-content screening approach, which is applicable to the analysis of large chemical libraries using a co-culture system. We identified candidate pigmentation inhibitors from 4,000 screened compounds including zoxazolamine, 3-methoxycatechol, and alpha-mangostin, which were also shown to modulate expression of MITF and several key pigmentation factors, and are worthy of further evaluation for potential translation to clinical use. PMID:24438532

  6. A high-throughput pipeline for the production of synthetic antibodies for analysis of ribonucleoprotein complexes

    PubMed Central

    Na, Hong; Laver, John D.; Jeon, Jouhyun; Singh, Fateh; Ancevicius, Kristin; Fan, Yujie; Cao, Wen Xi; Nie, Kun; Yang, Zhenglin; Luo, Hua; Wang, Miranda; Rissland, Olivia; Westwood, J. Timothy; Kim, Philip M.; Smibert, Craig A.; Lipshitz, Howard D.; Sidhu, Sachdev S.

    2016-01-01

    Post-transcriptional regulation of mRNAs plays an essential role in the control of gene expression. mRNAs are regulated in ribonucleoprotein (RNP) complexes by RNA-binding proteins (RBPs) along with associated protein and noncoding RNA (ncRNA) cofactors. A global understanding of post-transcriptional control in any cell type requires identification of the components of all of its RNP complexes. We have previously shown that these complexes can be purified by immunoprecipitation using anti-RBP synthetic antibodies produced by phage display. To develop the large number of synthetic antibodies required for a global analysis of RNP complex composition, we have established a pipeline that combines (i) a computationally aided strategy for design of antigens located outside of annotated domains, (ii) high-throughput antigen expression and purification in Escherichia coli, and (iii) high-throughput antibody selection and screening. Using this pipeline, we have produced 279 antibodies against 61 different protein components of Drosophila melanogaster RNPs. Together with those produced in our low-throughput efforts, we have a panel of 311 antibodies for 67 RNP complex proteins. Tests of a subset of our antibodies demonstrated that 89% immunoprecipitate their endogenous target from embryo lysate. This panel of antibodies will serve as a resource for global studies of RNP complexes in Drosophila. Furthermore, our high-throughput pipeline permits efficient production of synthetic antibodies against any large set of proteins. PMID:26847261

  7. From drug to protein: using yeast genetics for high-throughput target discovery.

    PubMed

    Armour, Christopher D; Lum, Pek Yee

    2005-02-01

    The budding yeast Saccharomyces cerevisiae has long been an effective eukaryotic model system for understanding basic cellular processes. The genetic tractability and ease of manipulation in the laboratory make yeast well suited for large-scale chemical and genetic screens. Several recent studies describing the use of yeast genetics for high-throughput drug target identification are discussed in this review.

  8. Mobile element biology – new possibilities with high-throughput sequencing

    PubMed Central

    Xing, Jinchuan; Witherspoon, David J.; Jorde, Lynn B.

    2014-01-01

    Mobile elements compose more than half of the human genome, but until recently their large-scale detection was time-consuming and challenging. With the development of new high-throughput sequencing technologies, the complete spectrum of mobile element variation in humans can now be identified and analyzed. Thousands of new mobile element insertions have been discovered, yielding new insights into mobile element biology, evolution, and genomic variation. We review several high-throughput methods, with an emphasis on techniques that specifically target mobile element insertions in humans, and we highlight recent applications of these methods in evolutionary studies and in the analysis of somatic alterations in human cancers. PMID:23312846

  9. Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y; Grossman, JC

    2014-12-01

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.

  10. HTP-OligoDesigner: An Online Primer Design Tool for High-Throughput Gene Cloning and Site-Directed Mutagenesis.

    PubMed

    Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor

    2016-01-01

    Following the burgeoning growth of genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain large numbers of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-prone step, particularly when working with hundreds of targets, automation of the primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis, along with a Tm calculator for quick queries.
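    A Tm calculator of the kind this record mentions typically starts from textbook approximations; the sketch below uses the Wallace rule for short primers and the GC-content formula for longer ones (these are generic formulas and the length cutoff is a common convention, not necessarily what HTP-OligoDesigner implements):

    ```python
    def primer_tm(seq):
        """Quick melting-temperature estimate for a DNA primer:
        Wallace rule 2*(A+T) + 4*(G+C) for primers under 14 nt,
        otherwise 64.9 + 41*(GC - 16.4)/N."""
        seq = seq.upper()
        a_t = seq.count("A") + seq.count("T")
        g_c = seq.count("G") + seq.count("C")
        if len(seq) < 14:
            return 2 * a_t + 4 * g_c
        return 64.9 + 41.0 * (g_c - 16.4) / len(seq)
    ```

    Production tools refine this with salt and primer-concentration corrections (e.g. nearest-neighbor thermodynamics), but the simple formulas are often adequate for quick queries.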

  11. Complementing in vitro hazard assessment with exposure and pharmacokinetics considerations for chemical prioritization

    EPA Science Inventory

    Traditional toxicity testing involves a large investment in resources, often using low-throughput in vivo animal studies for limited numbers of chemicals. An alternative strategy is the emergence of high-throughput (HT) in vitro assays as a rapid, cost-efficient means to screen t...

  12. A high-throughput label-free nanoparticle analyser.

    PubMed

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.
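    The power-law size distribution reported for the plasma background can be characterized with the standard continuous maximum-likelihood estimator for the exponent (this is the widely used Clauset-Shalizi-Newman estimator, shown here as general context; the record does not state which fitting method the authors used):

    ```python
    import math

    def powerlaw_alpha(sizes, xmin):
        """MLE exponent for a continuous power law p(x) ~ x^(-alpha):
        alpha = 1 + n / sum(ln(x / xmin)) over observations x >= xmin."""
        tail = [x for x in sizes if x >= xmin]
        return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)
    ```

    At 500,000 particles per second, even a brief run yields enough tail observations for a stable exponent estimate.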

  13. Industrializing electrophysiology: HT automated patch clamp on SyncroPatch® 96 using instant frozen cells.

    PubMed

    Polonchuk, Liudmila

    2014-01-01

    Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using planar patch-clamp technology demonstrated rather moderate throughput, a few second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch(®) 96 makes it possible to substantially increase throughput without compromising data quality. This chapter describes features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.

  14. The autism sequencing consortium: large-scale, high-throughput sequencing in autism spectrum disorders.

    PubMed

    Buxbaum, Joseph D; Daly, Mark J; Devlin, Bernie; Lehner, Thomas; Roeder, Kathryn; State, Matthew W

    2012-12-20

    Research during the past decade has seen significant progress in the understanding of the genetic architecture of autism spectrum disorders (ASDs), with gene discovery accelerating as the characterization of genomic variation has become increasingly comprehensive. At the same time, this research has highlighted ongoing challenges. Here we address the enormous impact of high-throughput sequencing (HTS) on ASD gene discovery, outline a consensus view for leveraging this technology, and describe a large multisite collaboration developed to accomplish these goals. Similar approaches could prove effective for severe neurodevelopmental disorders more broadly. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Identification of functional modules using network topology and high-throughput data.

    PubMed

    Ulitsky, Igor; Shamir, Ron

    2007-01-26

    With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in analysis of high-throughput data.
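    The problem statement above (given a network plus pairwise similarity values, find connected sub-networks of high similarity) is often attacked with a greedy seed-and-extend heuristic; the sketch below is such a heuristic for illustration, not the paper's exact algorithm, and the function name and stopping rule are assumptions:

    ```python
    from collections import defaultdict

    def greedy_module(edges, similarity, seed, min_gain=0.0):
        """Grow a connected module from `seed`: repeatedly add the neighbor
        with the highest average similarity to the current module, stopping
        when no candidate exceeds `min_gain`. `similarity` maps frozenset
        pairs of nodes to scores; missing pairs count as 0."""
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        module = {seed}
        while True:
            frontier = {n for m in module for n in adj[m]} - module
            best, best_score = None, min_gain
            for n in frontier:
                score = sum(similarity.get(frozenset((n, m)), 0.0)
                            for m in module) / len(module)
                if score > best_score:
                    best, best_score = n, score
            if best is None:
                return module
            module.add(best)
    ```

    Restricting candidates to the frontier is what keeps the module connected; the average-similarity gain criterion is one simple way to trade module size against internal coherence.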

  16. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  17. Protein Sequence Annotation Tool (PSAT): A centralized web-based meta-server for high-throughput sequence annotations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Elo; Huang, Amy; Cadag, Eithon

    In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein fasta data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA. PSAT is most appropriately applied in annotation of large protein FASTA sets that may or may not be associated with a single genome.

  18. Protein Sequence Annotation Tool (PSAT): A centralized web-based meta-server for high-throughput sequence annotations

    DOE PAGES

    Leung, Elo; Huang, Amy; Cadag, Eithon; ...

    2016-01-20

    In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein FASTA data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well-annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA format. PSAT is most appropriately applied in annotation of large protein FASTA sets that may or may not be associated with a single genome.

  19. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  20. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  1. High-Throughput Assay Optimization and Statistical Interpolation of Rubella-Specific Neutralizing Antibody Titers

    PubMed Central

    Lambert, Nathaniel D.; Pankratz, V. Shane; Larrabee, Beth R.; Ogee-Nwankwo, Adaeze; Chen, Min-hsin; Icenogle, Joseph P.

    2014-01-01

    Rubella remains a social and economic burden due to the high incidence of congenital rubella syndrome (CRS) in some countries. For this reason, an accurate and efficient high-throughput measure of antibody response to vaccination is an important tool. In order to measure rubella-specific neutralizing antibodies in a large cohort of vaccinated individuals, a high-throughput immunocolorimetric system was developed. Statistical interpolation models were applied to the resulting titers to refine quantitative estimates of neutralizing antibody titers relative to the assayed neutralizing antibody dilutions. This assay, including the statistical methods developed, can be used to assess the neutralizing humoral immune response to rubella virus and may be adaptable for assessing the response to other viral vaccines and infectious agents. PMID:24391140
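
    The abstract describes interpolating neutralizing antibody titers relative to the assayed dilutions. A common way to do this is linear interpolation of the 50% neutralization endpoint between two adjacent dilutions; the sketch below illustrates that idea with invented dilution and response values, not the paper's actual statistical models.

```python
# Linear interpolation of a 50% neutralization endpoint between assayed
# serum dilutions; all numbers here are illustrative, not study data.
dilutions = [1/40, 1/80, 1/160, 1/320]      # assayed serum dilutions
neutralization = [0.95, 0.80, 0.45, 0.20]   # fraction of virus neutralized

def endpoint_titer(xs, ys, cutoff=0.5):
    """Reciprocal dilution at which neutralization crosses the cutoff."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if y0 >= cutoff >= y1:
            x = x0 + (x1 - x0) * (y0 - cutoff) / (y0 - y1)
            return 1 / x
    return None

print(round(endpoint_titer(dilutions, neutralization)))  # 140
```

    The interpolated titer (here 1:140) falls between the assayed 1:80 and 1:160 dilutions, refining the estimate beyond the discrete dilution series.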

  2. Emory University: High-Throughput Protein-Protein Interaction Dataset for Lung Cancer-Associated Genes | Office of Cancer Genomics

    Cancer.gov

    To discover novel PPI signaling hubs for lung cancer, the CTD2 Center at Emory utilized large-scale genomics datasets and literature to compile a set of lung cancer-associated genes. A library of expression vectors was generated for these genes and utilized for detecting pairwise PPIs with cell lysate-based TR-FRET assays in a high-throughput screening format.

  3. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    PubMed

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
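
    GlycoExtractor's core job is exporting processed glycan profile data (peak number, peak area, glucose-unit value) into XML, JSON, and CSV rather than disconnected files. The sketch below shows the JSON and CSV halves of that idea with a hypothetical record layout; GlycoExtractor's actual schema and field names may differ.

```python
import csv
import io
import json

# Hypothetical glycan peak records: peak number, peak area, glucose-unit
# (GU) value. Field names are illustrative, not GlycoExtractor's schema.
peaks = [
    {"peak": 1, "area": 12.4, "gu": 5.92},
    {"peak": 2, "area": 33.1, "gu": 6.85},
]

def to_json(records):
    """Serialize one sample's peak list as a single JSON document."""
    return json.dumps({"sample": "demo", "peaks": records}, indent=2)

def to_csv(records):
    """Serialize the same peak list as CSV with an explicit header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["peak", "area", "gu"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

print(to_json(peaks))
print(to_csv(peaks))
```

    Emitting whole sample sets as structured documents, rather than one file per profile, is what makes downstream batch analysis and storage tractable.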

  4. web cellHTS2: a web-application for the analysis of high-throughput screening data.

    PubMed

    Pelz, Oliver; Gilsdorf, Moritz; Boutros, Michael

    2010-04-12

    The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
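
    The normalization and ranked-hit-list steps that web cellHTS2 wraps in a GUI can be illustrated with a toy z-score normalization over one plate. This is a schematic stand-in, not cellHTS2's actual method set (cellHTS2 offers several normalization options), and the well values are invented.

```python
import statistics

# Toy 1 x 8 "plate" of raw screening intensities; real cellHTS2 runs use
# 96- or 384-well plates. Values are invented for illustration.
raw = {"A1": 100.0, "A2": 98.0, "A3": 103.0, "A4": 250.0,
       "A5": 101.0, "A6": 97.0, "A7": 99.0, "A8": 102.0}

def z_scores(plate):
    """Standardize each well against the plate mean and sample stdev."""
    mu = statistics.mean(plate.values())
    sd = statistics.stdev(plate.values())
    return {well: (value - mu) / sd for well, value in plate.items()}

def ranked_hits(plate, threshold=2.0):
    """Wells whose |z| exceeds the threshold, strongest first."""
    z = z_scores(plate)
    return sorted((w for w in z if abs(z[w]) > threshold),
                  key=lambda w: -abs(z[w]))

print(ranked_hits(raw))  # ['A4'] — the obvious outlier well
```

    In practice robust statistics (median/MAD) are often preferred over mean/stdev for screening data, since strong hits inflate the plate variance exactly as A4 does here.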

  5. Controlling high-throughput manufacturing at the nano-scale

    NASA Astrophysics Data System (ADS)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.
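
    The closed-loop control idea described above (in-situ metrology feeding back into actuation) can be sketched as a simple proportional controller. The process variable, setpoint, and gain below are invented for illustration; a real nanomanufacturing loop would use physics-based models and far richer sensing.

```python
# Schematic proportional controller for one closed-loop process variable
# (e.g., imprint feature depth in nm); setpoint and gain are invented.
def control_step(measured, setpoint, gain=0.5):
    """Return the actuator correction for one metrology/actuation cycle."""
    return gain * (setpoint - measured)

depth = 80.0          # initial measured feature depth, nm
for _ in range(10):   # ten metrology/actuation cycles
    depth += control_step(depth, setpoint=100.0)
print(round(depth, 1))  # converges toward the 100.0 nm setpoint
```

    With a gain of 0.5 the error halves each cycle, so after ten cycles the 20 nm initial error has shrunk below 0.02 nm; choosing the gain is the usual stability/speed trade-off of feedback control.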

  6. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

    The last several years have seen the rapid adoption of high-resolution MS (HRMS) for bioanalytical support of high-throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high-throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.

  7. Large-Scale Biomonitoring of Remote and Threatened Ecosystems via High-Throughput Sequencing

    PubMed Central

    Gibson, Joel F.; Shokralla, Shadi; Curry, Colin; Baird, Donald J.; Monk, Wendy A.; King, Ian; Hajibabaei, Mehrdad

    2015-01-01

    Biodiversity metrics are critical for assessment and monitoring of ecosystems threatened by anthropogenic stressors. Existing sorting and identification methods are too expensive and labour-intensive to be scaled up to meet management needs. Alternately, a high-throughput DNA sequencing approach could be used to determine biodiversity metrics from bulk environmental samples collected as part of a large-scale biomonitoring program. Here we show that both morphological and DNA sequence-based analyses are suitable for recovery of individual taxonomic richness, estimation of proportional abundance, and calculation of biodiversity metrics using a set of 24 benthic samples collected in the Peace-Athabasca Delta region of Canada. The high-throughput sequencing approach was able to recover all metrics with a higher degree of taxonomic resolution than morphological analysis. The reduced cost and increased capacity of DNA sequence-based approaches will finally allow environmental monitoring programs to operate at the geographical and temporal scale required by industrial and regulatory end-users. PMID:26488407

  8. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.

    PubMed

    Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli

    2018-01-23

    So far, existing sub-GHz wireless communication technologies have focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (e.g., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand.
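
    The reported per-station rates and station counts imply the aggregate load the 802.11ah link carried in each scenario. A quick back-of-the-envelope check, using only the numbers stated in the abstract:

```python
# Aggregate offered load for the streaming scenarios reported in the
# abstract (station counts and per-station rates taken from the text).
def aggregate_kbps(stations, per_station_kbps):
    return stations * per_station_kbps

print(aggregate_kbps(20, 160))  # 3200 kbps total: 20 cameras at 160 kbps
print(aggregate_kbps(10, 255))  # 2550 kbps total: 10 cameras at 255 kbps
```

    Note the two scenarios carry similar aggregate throughput (~2.5–3.2 Mbps); fewer stations each get a larger per-station share, which is consistent with contention-limited channel access.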

  9. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic

    PubMed Central

    Kerkhove, Dwight; Tian, Le; Munteanu, Adrian; De Poorter, Eli

    2018-01-01

    So far, existing sub-GHz wireless communication technologies have focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (e.g., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand. PMID:29360798

  10. High-throughput and automated SAXS/USAXS experiment for industrial use at BL19B2 in SPring-8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osaka, Keiichi, E-mail: k-osaka@spring8.or.jp; Inoue, Daisuke; Sato, Masugu

    A highly automated system combining a sample transfer robot with a focused SR beam has been established for small-angle and ultra-small-angle X-ray scattering (SAXS/USAXS) measurements at BL19B2, for industrial use of SPring-8. A high-throughput data collection system is realized by means of an X-ray beam of high photon flux density, concentrated by a cylindrical mirror, and a two-dimensional pixel detector, PILATUS-2M. For SAXS measurements, we can obtain high-quality data within 1 minute for one exposure using this system. The sample transfer robot has a capacity of 90 samples with a large variety of shapes. The fusion of high-throughput and robotic systems has enhanced the usability of the SAXS/USAXS capability for industrial applications.

  11. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  12. Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis

    PubMed Central

    2012-01-01

    Background Multiplexing has become a major limitation of next-generation sequencing (NGS) in application to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Results Here we introduce pair-barcode sequencing (PBS), an economical and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously analyzed by the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated, and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation between different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. Conclusions By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing economically meets the requirements of samples with low sequencing-throughput demands. PMID:22276739

  13. Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis.

    PubMed

    Tu, Jing; Ge, Qinyu; Wang, Shengqin; Wang, Lei; Sun, Beili; Yang, Qi; Bai, Yunfei; Lu, Zuhong

    2012-01-25

    Multiplexing has become a major limitation of next-generation sequencing (NGS) in application to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Here we introduce pair-barcode sequencing (PBS), an economical and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously analyzed by the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated, and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation between different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing economically meets the requirements of samples with low sequencing-throughput demands.
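
    The combinatorial idea behind pair-barcoding (4 forward x 8 reverse barcodes addressing 32 libraries) can be sketched as follows. The barcode sequences are made up for illustration; the actual PBS barcode sets are not given in the abstract.

```python
from itertools import product

# Illustrative 4 forward and 8 reverse barcodes (sequences invented;
# the real PBS barcode sets are not specified in the abstract).
forward = ["ACGT", "TGCA", "GATC", "CTAG"]
reverse = ["AAGG", "CCTT", "GGAA", "TTCC", "AGAG", "TCTC", "GAGA", "CTCT"]

# Every forward/reverse combination addresses one library: 4 x 8 = 32.
pair_to_library = {fr: i for i, fr in enumerate(product(forward, reverse))}

def demultiplex(read):
    """Assign a read to a library by its leading forward and trailing
    reverse barcode; return None if the pair is unrecognized."""
    f, r = read[:4], read[-4:]
    return pair_to_library.get((f, r))

print(len(pair_to_library))                       # 32 libraries
print(demultiplex("ACGT" + "NNNNNNNN" + "CCTT"))  # library index 1
```

    The economy comes from the combinatorics: n forward and m reverse barcodes distinguish n x m libraries while only n + m oligos need to be synthesized.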

  14. High-Throughput Lectin Microarray-Based Analysis of Live Cell Surface Glycosylation

    PubMed Central

    Li, Yu; Tao, Sheng-ce; Zhu, Heng; Schneck, Jonathan P.

    2011-01-01

    Lectins, plant-derived glycan-binding proteins, have long been used to detect glycans on cell surfaces. However, the techniques used to characterize serum or cells have largely been limited to mass spectrometry, blots, flow cytometry, and immunohistochemistry. While these lectin-based approaches are well established and they can discriminate a limited number of sugar isomers by concurrently using a limited number of lectins, they are not amenable for adaptation to a high-throughput platform. Fortunately, given the commercial availability of lectins with a variety of glycan specificities, lectins can be printed on a glass substrate in a microarray format to profile accessible cell-surface glycans. This method is an inviting alternative for analysis of a broad range of glycans in a high-throughput fashion and has been demonstrated to be a feasible method of identifying binding-accessible cell surface glycosylation on living cells. The current unit presents a lectin-based microarray approach for analyzing cell surface glycosylation in a high-throughput fashion. PMID:21400689

  15. Near-common-path interferometer for imaging Fourier-transform spectroscopy in wide-field microscopy

    PubMed Central

    Wadduwage, Dushan N.; Singh, Vijay Raj; Choi, Heejin; Yaqoob, Zahid; Heemskerk, Hans; Matsudaira, Paul; So, Peter T. C.

    2017-01-01

    Imaging Fourier-transform spectroscopy (IFTS) is a powerful method for biological hyperspectral analysis based on various imaging modalities, such as fluorescence or Raman. Since the measurements are taken in the Fourier space of the spectrum, it can also take advantage of compressed sensing strategies. IFTS has been readily implemented in high-throughput, high-content microscope systems based on wide-field imaging modalities. However, there are limitations in existing wide-field IFTS designs. Non-common-path approaches are less phase-stable. Alternatively, designs based on the common-path Sagnac interferometer are stable, but incompatible with high-throughput imaging. They require exhaustive sequential scanning over large interferometric path delays, making compressive strategic data acquisition impossible. In this paper, we present a novel phase-stable, near-common-path interferometer enabling high-throughput hyperspectral imaging based on strategic data acquisition. Our results suggest that this approach can improve throughput over those of many other wide-field spectral techniques by more than an order of magnitude without compromising phase stability. PMID:29392168

  16. High-throughput method for the quantitation of metabolites and co-factors from homocysteine-methionine cycle for nutritional status assessment.

    PubMed

    Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping

    2016-09-01

    There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.

  17. The next generation CdTe technology- Substrate foil based solar cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferekides, Chris

    The main objective of this project was the development of one of the most promising Photovoltaic (PV) materials, CdTe, into a versatile, cost-effective, and high-throughput technology, by demonstrating substrate devices on foil substrates using high-throughput fabrication conditions. The typical CdTe cell is of the superstrate configuration, where the solar cell is fabricated on a glass superstrate by the sequential deposition of a TCO, n-type heterojunction partner, p-CdTe absorber, and back contact. Large glass modules are heavy and present significant challenges during manufacturing (uniform heating, etc.). If a substrate CdTe cell could be developed (the main goal of this project), a roll-to-roll high-throughput technology could be developed.

  18. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput make QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high-throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
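
    The phase-prediction step (assigning G1/S/G2 from biophysical phenotypes) can be caricatured with a nearest-centroid classifier over two features. This is only a schematic stand-in for the paper's MANOVA discriminant analysis over 24 phenotypes; the centroids and feature values are invented.

```python
import math

# Toy nearest-centroid classifier over two biophysical features
# (cell size, dry-mass density). Centroid positions are invented;
# the paper uses 24 phenotypes and MANOVA discriminant analysis.
centroids = {"G1": (10.0, 1.0), "S": (12.0, 1.2), "G2": (14.0, 1.4)}

def classify(size, density):
    """Assign the cell-cycle phase whose centroid is nearest."""
    return min(centroids,
               key=lambda phase: math.dist((size, density), centroids[phase]))

print(classify(13.8, 1.38))  # G2 — closest to the G2 centroid
```

    The intuition carries over: cells grow in size and dry mass through the cycle, so even simple distance-based rules separate G1 from G2 reasonably well, with S phase (intermediate values) the hardest, matching the lower S-phase accuracy reported above.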

  19. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    PubMed Central

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver a good performance and could tolerate the addition of inorganic salts and many nutritional components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provides rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
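
    Colorimetric assays like this one are typically quantified against a standard curve: a least-squares line through absorbance readings of known adenosine concentrations, then inverse interpolation for unknowns. The sketch below illustrates that with invented standard-curve values, not the paper's data.

```python
# Least-squares line through a hypothetical indophenol standard curve;
# (concentration mM, absorbance) pairs are invented for illustration.
standards = [(0.0, 0.02), (0.25, 0.27), (0.5, 0.52), (1.0, 1.02)]

def fit_line(points):
    """Ordinary least-squares slope and intercept."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

slope, intercept = fit_line(standards)

def adenosine_mM(absorbance):
    """Interpolate an unknown sample's concentration from the curve."""
    return (absorbance - intercept) / slope

print(round(adenosine_mM(0.52), 3))  # 0.5 mM for an absorbance of 0.52
```

    A 96-well plate reader produces exactly this kind of absorbance matrix, so the same two functions scale directly to screening large numbers of candidate strains.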

  20. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as to communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, numerous challenges arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  1. A new fungal large subunit ribosomal RNA primer for high throughput sequencing surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Rebecca C.; Gallegos-Graves, La Verne; Kuske, Cheryl R.

    The inclusion of phylogenetic metrics in community ecology has provided insights into important ecological processes, particularly when combined with high-throughput sequencing methods; however, these approaches have not been widely used in studies of fungal communities relative to other microbial groups. Two obstacles have been considered: (1) the internal transcribed spacer (ITS) region has limited utility for constructing phylogenies, and (2) most PCR primers that target the large subunit (LSU) ribosomal unit generate amplicons that exceed current limits of high-throughput sequencing platforms. We designed and tested a PCR primer (LR22R) to target an approximately 300–400 bp region of the D2 hypervariable region of the fungal LSU for use with the Illumina MiSeq platform. Both in silico and empirical analyses showed that the LR22R–LR3 pair captured a broad range of fungal taxonomic groups with a small fraction of non-fungal groups. Phylogenetic placement of publicly available LSU D2 sequences showed broad agreement with taxonomic classification. Comparisons of the LSU D2 and ITS2 ribosomal regions from environmental samples and known communities showed similar discriminatory abilities of the two primer sets. Altogether, these findings show that the LR22R–LR3 primer pair has utility for phylogenetic analyses of fungal communities using high-throughput sequencing methods.
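An in silico primer check of the kind described amounts to matching a (possibly degenerate) primer against candidate template sites within a mismatch budget. A minimal sketch; the primer and template sequences below are illustrative, not the actual LR22R or LR3 sequences:

```python
# IUPAC degeneracy codes: which template bases each primer base allows.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "AG", "Y": "CT",
         "S": "GC", "W": "AT", "K": "GT", "M": "AC", "N": "ACGT"}

def mismatches(primer, site):
    """Count positions where the template base is not allowed by the primer."""
    return sum(b not in IUPAC[p] for p, b in zip(primer, site))

def hits(primer, templates, max_mm=1):
    """Templates whose 5' binding site matches within the mismatch budget."""
    return [t for t in templates if mismatches(primer, t[:len(primer)]) <= max_mm]

primer = "ACRGT"  # R matches A or G (toy primer, not LR22R)
templates = ["ACAGTTT", "ACGGTAA", "ACTTTAA", "TTTTTTT"]
matched = hits(primer, templates)
```

A full in silico evaluation would slide the primer along reference alignments and tabulate matched taxa per taxonomic group rather than checking a fixed offset.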

  2. Performance comparison of SNP detection tools with illumina exome sequencing data—an assessment using both family pedigree information and sample-matched SNP array data

    PubMed Central

    Yi, Ming; Zhao, Yongmei; Jia, Li; He, Mei; Kebebew, Electron; Stephens, Robert M.

    2014-01-01

    To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We used an Illumina exome-seq dataset as a benchmark, with two validation scenarios (family pedigree information and SNP array data for the same samples) permitting global high-throughput cross-validation, to evaluate the quality of SNP calls derived from several popular variant discovery tools from both the open-source and commercial communities using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data. This approach allows us to gain insight into the accuracy of SNP calling in an unprecedented way, whereas previously reported comparison studies have only assessed concordance between tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools, with a focus on exome-seq, and that can be used to compare any forthcoming tool(s) of interest. PMID:24831545
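Mendelian inheritance checking of the kind used for validation here reduces to asking whether a child's diploid genotype can be assembled from one maternal and one paternal allele. A minimal sketch (a toy check, not the authors' pipeline):

```python
def mendelian_consistent(child, mother, father):
    """Check whether a child's diploid genotype can be formed from one
    maternal and one paternal allele. Genotypes are 2-tuples of alleles."""
    return any(
        sorted((m, f)) == sorted(child)
        for m in mother
        for f in father
    )

# Child A/G is consistent with A/A x G/G parents...
ok = mendelian_consistent(("A", "G"), ("A", "A"), ("G", "G"))
# ...but child G/G cannot arise from A/A x A/G parents:
bad = mendelian_consistent(("G", "G"), ("A", "A"), ("A", "G"))
```

Applied genome-wide across a pedigree, the fraction of trio-consistent calls per tool gives one of the quality metrics this kind of comparison relies on.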

  3. A new fungal large subunit ribosomal RNA primer for high throughput sequencing surveys

    DOE PAGES

    Mueller, Rebecca C.; Gallegos-Graves, La Verne; Kuske, Cheryl R.

    2015-12-09

    The inclusion of phylogenetic metrics in community ecology has provided insights into important ecological processes, particularly when combined with high-throughput sequencing methods; however, these approaches have not been widely used in studies of fungal communities relative to other microbial groups. Two obstacles have been considered: (1) the internal transcribed spacer (ITS) region has limited utility for constructing phylogenies, and (2) most PCR primers that target the large subunit (LSU) ribosomal unit generate amplicons that exceed current limits of high-throughput sequencing platforms. We designed and tested a PCR primer (LR22R) to target an approximately 300–400 bp region of the D2 hypervariable region of the fungal LSU for use with the Illumina MiSeq platform. Both in silico and empirical analyses showed that the LR22R–LR3 pair captured a broad range of fungal taxonomic groups with a small fraction of non-fungal groups. Phylogenetic placement of publicly available LSU D2 sequences showed broad agreement with taxonomic classification. Comparisons of the LSU D2 and ITS2 ribosomal regions from environmental samples and known communities showed similar discriminatory abilities of the two primer sets. Altogether, these findings show that the LR22R–LR3 primer pair has utility for phylogenetic analyses of fungal communities using high-throughput sequencing methods.

  4. Enabling systematic interrogation of protein-protein interactions in live cells with a versatile ultra-high-throughput biosensor platform | Office of Cancer Genomics

    Cancer.gov

    The vast datasets generated by next-generation gene sequencing and expression profiling have transformed biological and translational research. However, technologies to produce large-scale functional genomics datasets, such as high-throughput detection of protein-protein interactions (PPIs), are still in early development. While a number of powerful technologies have been employed to detect PPIs, a single PPI biosensor platform featuring both high sensitivity and robustness in a mammalian cell environment remains to be established.

  5. Polymer-Based Dense Fluidic Networks for High Throughput Screening with Ultrasensitive Fluorescence Detection

    PubMed Central

    Okagbare, Paul I.; Soper, Steven A.

    2011-01-01

    Microfluidics represents a viable platform for performing High Throughput Screening (HTS) due to its ability to automate fluid handling and generate fluidic networks with high number densities over small footprints appropriate for the simultaneous optical interrogation of many screening assays. While most HTS campaigns depend on fluorescence, readers typically use point detection and serially address the assay results, significantly lowering throughput or detection sensitivity due to a low duty cycle. To address this challenge, we present here the fabrication of a high-density microfluidic network packed into the imaging area of a large field-of-view (FoV) ultrasensitive fluorescence detection system. The fluidic channels were 1, 5 or 10 μm wide and 1 μm deep with a pitch of 1–10 μm, and each fluidic processor was individually addressable. The fluidic chip was produced from a molding tool using hot embossing, with thermal fusion bonding to enclose the fluidic channels. A 40X microscope objective (numerical aperture = 0.75) created a FoV of 200 μm, providing the ability to interrogate ~25 channels in the current fluidic configuration. An ultrasensitive fluorescence detection system with a large FoV was used to transduce fluorescence signals simultaneously from each fluidic processor onto the active area of an electron-multiplying charge-coupled device (EMCCD). The utility of these multichannel networks for HTS was demonstrated by high-throughput monitoring of the activity of the enzyme APE1 as a model screening assay. PMID:20872611

  6. Target Discovery for Precision Medicine Using High-Throughput Genome Engineering.

    PubMed

    Guo, Xinyi; Chitale, Poonam; Sanjana, Neville E

    2017-01-01

    Over the past few years, programmable RNA-guided nucleases such as the CRISPR/Cas9 system have ushered in a new era of precision genome editing in diverse model systems and in human cells. Functional screens using large libraries of RNA guides can interrogate a large hypothesis space to pinpoint particular genes and genetic elements involved in fundamental biological processes and disease-relevant phenotypes. Here, we review recent high-throughput CRISPR screens (e.g., loss-of-function, gain-of-function, and screens targeting noncoding elements) and highlight their potential for uncovering novel therapeutic targets, such as those involved in cancer resistance to small-molecule drugs and immunotherapies, tumor evolution, infectious disease, inborn genetic disorders, and other therapeutic challenges.

  7. The USC Epigenome Center.

    PubMed

    Laird, Peter W

    2009-10-01

    The University of Southern California (USC, CA, USA) has a long tradition of excellence in epigenetics. With the recent explosive growth and technological maturation of the field, it became clear that a dedicated high-throughput epigenomic data production facility would be needed to remain at the forefront of epigenetic research. To address this need, USC launched the USC Epigenome Center as the first large-scale academic center dedicated to epigenomic research. The Center provides high-throughput data production for large-scale genomic and epigenomic studies and develops novel analysis tools for epigenomic research. This unique facility promises to be a valuable resource for multidisciplinary research, education and training in genomics, epigenomics, bioinformatics, and translational medicine.

  8. Comprehensive analysis of RNA-protein interactions by high-throughput sequencing-RNA affinity profiling.

    PubMed

    Tome, Jacob M; Ozer, Abdullah; Pagano, John M; Gheba, Dan; Schroth, Gary P; Lis, John T

    2014-06-01

    RNA-protein interactions play critical roles in gene regulation, but methods to quantitatively analyze these interactions at a large scale are lacking. We have developed a high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay by adapting a high-throughput DNA sequencer to quantify the binding of fluorescently labeled protein to millions of RNAs anchored to sequenced cDNA templates. Using HiTS-RAP, we measured the affinity of mutagenized libraries of GFP-binding and NELF-E-binding aptamers to their respective targets and identified critical regions of interaction. Mutations additively affected the affinity of the NELF-E-binding aptamer, whose interaction depended mainly on a single-stranded RNA motif, but not that of the GFP aptamer, whose interaction depended primarily on secondary structure.

  9. High throughput screening of CO2-tolerating microalgae using GasPak bags

    PubMed Central

    2013-01-01

    Background Microalgae are diverse in terms of their speciation and function. More than 35,000 algal strains have been described, and thousands of algal cultures are maintained in different culture collection centers. The CO2 uptake ability of microalgae varies dramatically among algal species, making it challenging to select suitable candidates that can proliferate under high CO2 concentrations from a large collection of algal cultures. Results Here, we describe a high-throughput screening method to rapidly identify microalgae with high CO2 affinity. The system integrates a CO2 mixer, GasPak bags and microplates: microalgae on the microplates are cultivated in GasPak bags charged with different CO2 concentrations. Using this method, we identified 17 algal strains whose growth rates were not affected when the concentration of CO2 was increased from 2 to 20% (v/v). Most CO2-tolerant strains identified in this study were closely related to the genera Scenedesmus and Chlorococcum. One Scenedesmus strain (E7A) has been successfully tested in scaled-up photobioreactors (500 L) bubbled with flue gas containing 10–12% CO2. Conclusion Our high-throughput CO2 testing system provides a rapid and reliable way to identify microalgal candidate strains that can grow under high CO2 conditions from a large pool of culture collection species. This system can also be modified to select algal strains that tolerate other gases, such as NOx, SOx, or flue gas. PMID:24341988

  10. Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines

    NASA Astrophysics Data System (ADS)

    Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.

    2017-01-01

    Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU ’circular memory’ data buffers that enable ready introduction of arbitrary functions into the processing path for ’streams’ of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
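The block-and-buffer pattern Bifrost packages can be sketched in miniature: each processing stage runs independently and passes data spans downstream through a bounded buffer. A toy illustration using Python threads and queues in place of Bifrost's CPU/GPU circular memory buffers (the stage functions are arbitrary placeholders):

```python
import queue
import threading

def block(fn, inq, outq):
    """A 'black box' pipeline stage: read spans, transform, write downstream."""
    while True:
        item = inq.get()
        if item is None:      # end-of-stream sentinel
            outq.put(None)
            return
        outq.put(fn(item))

# Wire up a two-stage directed path: source -> double -> increment -> sink.
q0, q1, q2 = (queue.Queue(maxsize=4) for _ in range(3))
threading.Thread(target=block, args=(lambda x: x * 2, q0, q1), daemon=True).start()
threading.Thread(target=block, args=(lambda x: x + 1, q1, q2), daemon=True).start()

for x in [1, 2, 3]:
    q0.put(x)
q0.put(None)

out = []
while (item := q2.get()) is not None:
    out.append(item)
```

The bounded `maxsize` gives the backpressure that a ring buffer provides in the real framework; because each stage is an independent worker, parallelism falls out of the wiring rather than the stage code.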

  11. High-throughput screening and small animal models, where are we?

    PubMed Central

    Giacomotto, Jean; Ségalat, Laurent

    2010-01-01

    Current high-throughput screening methods for drug discovery rely on the existence of targets. Moreover, most of the hits generated during screenings turn out to be invalid after further testing in animal models. To bypass these limitations, efforts are now being made to screen chemical libraries on whole animals. One of the most commonly used animal models in biology is the murine model Mus musculus. However, its cost limits its use in large-scale therapeutic screening. In contrast, the nematode Caenorhabditis elegans, the fruit fly Drosophila melanogaster, and the fish Danio rerio are gaining momentum as screening tools. These organisms combine genetic amenability, low cost, and culture conditions that are compatible with large-scale screens. Their main advantage is to allow high-throughput screening in a whole-animal context. Moreover, their use does not depend on the prior identification of a target and permits the selection of compounds with an improved safety profile. This review surveys the versatility of these animal models for drug discovery and discusses the options available to date. PMID:20423335

  12. Essential attributes identified in the design of a Laboratory Information Management System for a high throughput siRNA screening laboratory.

    PubMed

    Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey

    2011-11-01

    In recent years, high-throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning an effective Laboratory Information Management System (LIMS) can be developed and implemented to streamline all phases of a workflow. Just as important as the data mining and analysis procedures at the end of complex processes is the tracking of the individual steps that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example is provided to highlight the important aspects of designing a LIMS to effectively manage all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling, and interaction with investigators, statisticians and administrators. All of these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers

  13. A high throughput spectral image microscopy system

    NASA Astrophysics Data System (ADS)

    Gesley, M.; Puri, R.

    2018-01-01

    A high-throughput spectral image microscopy system is configured for rapid detection of rare cells in large populations. To overcome the rate limits of flow cytometry and the need for fluorophore tags, the system architecture integrates sample mechanical handling, signal processors, and optics in a non-confocal version of light absorption and scattering spectroscopic microscopy. Spectral images with native contrast do not require exogenous stains to render cells at submicron resolution. Structure may be characterized without restriction to cell clusters of differentiation.

  14. A simple cell-based high throughput screening (HTS) assay for inhibitors of Salmonella enterica RNA polymerase containing the general stress response regulator RpoS (σS).

    PubMed

    Campos-Gomez, Javier; Benitez, Jorge A

    2018-07-01

    RNA polymerase containing the stress response regulator σS subunit (RpoS) plays a key role in bacterial survival in hostile environments in nature and during infection. Here we devise and validate a simple cell-based high-throughput luminescence assay for this holoenzyme suitable for screening large chemical libraries on a robotic platform. Copyright © 2018 Elsevier B.V. All rights reserved.
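Validation of an HTS assay like this is commonly summarized with the Z'-factor, computed from positive- and negative-control wells; the abstract does not name the statistic, so the following is an illustrative sketch with made-up luminescence readings rather than the authors' validation data:

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are conventionally taken to indicate an assay
    robust enough for high-throughput screening."""
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

# Illustrative luminescence readings for control wells:
positive = [980, 1005, 1010, 1002, 998]
negative = [102, 98, 101, 99, 100]
z = z_prime(positive, negative)
```

A large, well-separated dynamic range between controls (as in these toy numbers) drives Z' toward 1, which is what makes a simple luminescence readout attractive for robotic screening.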

  15. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  16. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.

    PubMed

    Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan

    2016-04-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.

  17. Large-scale Topographical Screen for Investigation of Physical Neural-Guidance Cues

    NASA Astrophysics Data System (ADS)

    Li, Wei; Tang, Qing Yuan; Jadhav, Amol D.; Narang, Ankit; Qian, Wei Xian; Shi, Peng; Pang, Stella W.

    2015-03-01

    A combinatorial approach was used to present primary neurons with a large library of topographical features in the form of micropatterned substrates for high-throughput screening of physical neural-guidance cues that can effectively promote different aspects of neuronal development, including axonal and dendritic outgrowth. Notably, the neuronal-guidance capability of specific features was automatically identified using customized image-processing software, significantly increasing the screening throughput with minimal subjective bias. Our results indicate that anisotropic topographies promote axonal and, in some cases, dendritic extension relative to isotropic topographies, while dendritic branching showed a preference for plain substrates over the microscale features. The results of this work can be readily applied to engineering novel biomaterials with precise surface topography to serve as guidance conduits for neuro-regenerative applications. This topographical screening strategy, combined with the automated processing capability, can also be used for high-throughput screening of chemical or genetic regulatory factors in primary neurons.

  18. High throughput computing: a solution for scientific analysis

    USGS Publications Warehouse

    O'Donnell, M.

    2011-01-01

    handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).
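The core of such a job-management system, automatic resubmission of failed jobs across pooled workers, can be sketched in a few lines (a toy stand-in using Python's standard library, not HTCondor or the FORT system itself):

```python
from concurrent.futures import ThreadPoolExecutor

def run_with_retry(fn, jobs, workers=4, attempts=3):
    """Run fn over jobs on a worker pool; resubmit any job that raises,
    up to a fixed number of attempts (standing in for the automatic
    resubmission an HTC scheduler provides after interruptions)."""
    results = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pending = list(jobs)
        for _ in range(attempts):
            futures = {job: pool.submit(fn, job) for job in pending}
            pending = []
            for job, fut in futures.items():
                try:
                    results[job] = fut.result()
                except Exception:
                    pending.append(job)   # resubmit on the next round
            if not pending:
                break
    return results

# A deliberately flaky job: fails on its first run, succeeds on resubmission.
flaky_calls = {}
def flaky_square(x):
    if flaky_calls.setdefault(x, 0) == 0:
        flaky_calls[x] += 1
        raise RuntimeError("transient failure")
    return x * x

results = run_with_retry(flaky_square, [1, 2, 3])
```

Real HTC systems add what this sketch omits: matching jobs to idle machines across a LAN, checkpointing, and surviving scheduler restarts.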

  19. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering☆

    PubMed Central

    Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan

    2016-01-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875

  20. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    PubMed

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, throughput is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample-volume requirements and reduced user-intervention time compared to manual instruments have improved the turnover of calorimetry experiments in a high-throughput format, in which 25 or more runs can be conducted per day. The cost and effort required to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. The facility has produced a higher throughput of data, which has been leveraged into grant support, attracted a new faculty hire, and led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  1. Large-scale human skin lipidomics by quantitative, high-throughput shotgun mass spectrometry.

    PubMed

    Sadowski, Tomasz; Klose, Christian; Gerl, Mathias J; Wójcik-Maciejewicz, Anna; Herzog, Ronny; Simons, Kai; Reich, Adam; Surma, Michal A

    2017-03-07

    The lipid composition of human skin is essential for its function; however, the simultaneous quantification of a wide range of stratum corneum (SC) and sebaceous lipids is not trivial. We developed and validated a quantitative, high-throughput, shotgun mass spectrometry-based platform for lipid analysis of tape-stripped SC skin samples. It features coverage of 16 lipid classes; quantification down to the level of individual lipid molecules; high reproducibility; and high-throughput capabilities. With this method we conducted a large lipidomic survey of 268 human SC samples, in which we investigated the relationship between sampling depth and lipid composition, lipidome variability in samples from 14 different sampling sites on the human body, and, finally, the impact of age and sex on lipidome variability in 104 healthy subjects. We found sebaceous lipids to constitute an abundant component of the SC lipidome, as they diffuse into the topmost SC layers, forming a gradient. Lipidomic variability with respect to sampling depth, site, and subject is considerable, and is mainly attributable to sebaceous lipids, while stratum corneum lipids vary less. This stresses the importance of sampling design and the role of sebaceous lipids in skin studies.

  2. MetaUniDec: High-Throughput Deconvolution of Native Mass Spectra

    NASA Astrophysics Data System (ADS)

    Reid, Deseree J.; Diesing, Jessica M.; Miller, Matthew A.; Perry, Scott M.; Wales, Jessica A.; Montfort, William R.; Marty, Michael T.

    2018-04-01

    The expansion of native mass spectrometry (MS) methods for both academic and industrial applications has created a substantial need for analysis of large native MS datasets. Existing software tools are poorly suited for high-throughput deconvolution of native electrospray mass spectra from intact proteins and protein complexes. The UniDec Bayesian deconvolution algorithm is uniquely well suited for high-throughput analysis due to its speed and robustness, but it was previously tailored towards individual spectra. Here, we optimized UniDec for deconvolution, analysis, and visualization of large datasets. This new module, MetaUniDec, centers on a hierarchical data format 5 (HDF5) file format for storing datasets that significantly improves speed, portability, and file size. It also includes code optimizations to improve speed and a new graphical user interface for visualization, interaction, and analysis of data. To demonstrate the utility of MetaUniDec, we applied the software to analyze automated collision voltage ramps with a small bacterial heme protein and large lipoprotein nanodiscs. Upon increasing collisional activation, bacterial heme-nitric oxide/oxygen binding (H-NOX) protein shows a discrete loss of bound heme, and nanodiscs show a continuous loss of lipids and charge. By using MetaUniDec to track changes in peak area or mass as a function of collision voltage, we explore the energetic profile of collisional activation in an ultra-high mass range Orbitrap mass spectrometer.
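Tracking peak area as a function of collision voltage, as described, reduces to integrating a fixed m/z window of each spectrum across the ramp. A toy sketch with a simulated decaying peak (not MetaUniDec's deconvolution algorithm or real instrument data):

```python
import numpy as np

# m/z axis for a narrow window around a species of interest.
mz = np.linspace(990, 1010, 401)

def spectrum(voltage):
    """Simulated Gaussian peak at m/z 1000 whose intensity decays as
    collisional activation strips the species away (illustrative model)."""
    amplitude = np.exp(-voltage / 50.0)
    return amplitude * np.exp(-((mz - 1000.0) ** 2) / (2 * 1.5 ** 2))

# Integrate the window at each step of the collision-voltage ramp.
voltages = [0, 25, 50, 75, 100]
areas = [spectrum(v).sum() * (mz[1] - mz[0]) for v in voltages]
```

Plotting `areas` against `voltages` gives the kind of dissociation profile used to compare the energetics of heme loss versus gradual lipid loss.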

  3. Automation of Technology for Cancer Research.

    PubMed

    van der Ent, Wietske; Veneman, Wouter J; Groenewoud, Arwin; Chen, Lanpeng; Tulotta, Claudia; Hogendoorn, Pancras C W; Spaink, Herman P; Snaar-Jagalska, B Ewa

    2016-01-01

    Zebrafish embryos can be obtained for research purposes in large numbers at low cost, and embryos develop externally in limited space, making them highly suitable for high-throughput cancer studies and drug screens. Non-invasive live imaging of various processes within the larvae is possible due to their transparency during development and a multitude of available fluorescent transgenic reporter lines. To perform high-throughput studies, handling large numbers of embryos and larvae is required. With such high numbers of individuals, even minute tasks may become time-consuming and arduous. In this chapter, an overview is given of developments in the automation of various steps of large-scale zebrafish cancer research for discovering important cancer pathways and drugs for the treatment of human disease. The focus lies on tools developed for cancer cell implantation, embryo handling and sorting, microfluidic systems for imaging and drug treatment, and image acquisition and analysis. Examples are given of the employment of these technologies within the fields of toxicology research and cancer research.

  4. MGIS: Managing banana (Musa spp.) genetic resources information and high-throughput genotyping data

    USDA-ARS?s Scientific Manuscript database

    Unraveling genetic diversity held in genebanks on a large scale is underway, due to the advances in Next-generation sequence-based technologies that produce high-density genetic markers for a large number of samples at low cost. Genebank users should be in a position to identify and select germplasm...

  5. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    PubMed

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, picking relevant hits from such screens and generating testable hypotheses often requires training in bioinformatics and the skills to perform database mining efficiently. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
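    At its core, the comparison CrossCheck performs is an intersection of a user gene list with each published dataset. A minimal sketch of that operation (the dataset names below are hypothetical, not from CrossCheck's actual database):

```python
def cross_check(user_genes, datasets):
    """Intersect a user-entered gene list with each named dataset.

    Gene symbols are upper-cased first so that 'tp53' and 'TP53' match.
    Returns {dataset_name: sorted list of shared gene symbols}.
    """
    user = {g.upper() for g in user_genes}
    return {name: sorted(user & {g.upper() for g in genes})
            for name, genes in datasets.items()}
```

    A real service would also score each overlap (e.g. with a hypergeometric test) rather than just listing shared symbols, but the set intersection is the first step either way.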

  6. Ion channel drug discovery and research: the automated Nano-Patch-Clamp technology.

    PubMed

    Brueggemann, A; George, M; Klau, M; Beckler, M; Steindl, J; Behrends, J C; Fertig, N

    2004-01-01

    Unlike the genomics revolution, which was largely enabled by a single technological advance (high-throughput sequencing), rapid advancement in proteomics will require a broader effort to increase the throughput of a number of key tools for the functional analysis of different types of proteins. In the case of ion channels, a class of membrane proteins of great physiological importance and potential as drug targets, the lack of adequate assay technologies is felt particularly strongly. The available indirect high-throughput screening methods for ion channels clearly generate insufficient information. The best technology to study ion channel function and screen for compound interaction is the patch clamp technique, but patch clamping suffers from low throughput, which is not acceptable for drug screening. A first step towards a solution is presented here. The nano-patch-clamp technology, which is based on a planar, microstructured glass chip, enables automatic whole-cell patch clamp measurements. The Port-a-Patch is an automated electrophysiology workstation that uses planar patch clamp chips. This approach enables high-quality and high-content ion channel and compound evaluation on a one-cell-at-a-time basis. The presented automation of the patch process and its scalability to an array format are the prerequisites for higher-throughput electrophysiology instruments.

  7. Advancements in zebrafish applications for 21st century toxicology.

    PubMed

    Garcia, Gloria R; Noyes, Pamela D; Tanguay, Robert L

    2016-05-01

    The zebrafish model is the only available high-throughput vertebrate assessment system, and it is uniquely suited for studies of in vivo cell biology. A sequenced and annotated genome has revealed a large degree of evolutionary conservation in comparison to the human genome. Due to our shared evolutionary history, the anatomical and physiological features of fish are highly homologous to humans, which facilitates studies relevant to human health. In addition, zebrafish provide a unique vertebrate data stream that allows researchers to anchor hypotheses at the biochemical, genetic, and cellular levels to observations at the structural, functional, and behavioral level in a high-throughput format. In this review, we draw heavily from toxicological studies to highlight advances in zebrafish high-throughput systems. Breakthroughs in transgenic/reporter lines and methods for genetic manipulation, such as the CRISPR-Cas9 system, are highlighted, drawing on reports across diverse disciplines. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Advancements in zebrafish applications for 21st century toxicology

    PubMed Central

    Garcia, Gloria R.; Noyes, Pamela D.; Tanguay, Robert L.

    2016-01-01

    The zebrafish model is the only available high-throughput vertebrate assessment system, and it is uniquely suited for studies of in vivo cell biology. A sequenced and annotated genome has revealed a large degree of evolutionary conservation in comparison to the human genome. Due to our shared evolutionary history, the anatomical and physiological features of fish are highly homologous to humans, which facilitates studies relevant to human health. In addition, zebrafish provide a unique vertebrate data stream that allows researchers to anchor hypotheses at the biochemical, genetic, and cellular levels to observations at the structural, functional, and behavioral level in a high-throughput format. In this review, we draw heavily from toxicological studies to highlight advances in zebrafish high-throughput systems. Breakthroughs in transgenic/reporter lines and methods for genetic manipulation, such as the CRISPR-Cas9 system, are highlighted, drawing on reports across diverse disciplines. PMID:27016469

  9. Design of the Extreme Ultraviolet Explorer long-wavelength grazing incidence telescope optics

    NASA Technical Reports Server (NTRS)

    Finley, David S.; Jelinsky, Patrick; Bowyer, Stuart; Malina, Roger F.

    1988-01-01

    Designing optics for photometry in the long-wavelength portion of the EUV spectrum (400-900 A) poses different problems from those arising for optics operating shortward of 400 A. The available filter materials which transmit radiation longward of 400 A are also highly transparent at wavelengths shortward of 100 A. Conventional EUV optics, with grazing angles of less than about 10 deg, have very high throughput in the EUV, which persists to wavelengths shortward of 100 A. Use of such optics with the longer-wavelength EUV filters thus results in an unacceptably large soft X-ray leak. This problem is overcome by developing a mirror design with larger graze angles of not less than 20 deg, which has high throughput at wavelengths longer than 400 A but at the same time very little throughput shortward of 100 A.

  10. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    PubMed

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach, called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes.

  11. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    PubMed Central

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach, called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes. PMID:29250096

  12. Statistical Approach To Estimate Vaccinia-Specific Neutralizing Antibody Titers Using a High-Throughput Assay

    PubMed Central

    Kennedy, Richard; Pankratz, V. Shane; Swanson, Eric; Watson, David; Golding, Hana; Poland, Gregory A.

    2009-01-01

    Because of the bioterrorism threat posed by agents such as variola virus, considerable time, resources, and effort have been devoted to biodefense preparation. One avenue of this research has been the development of rapid, sensitive, high-throughput assays to validate immune responses to poxviruses. Here we describe the adaptation of a β-galactosidase reporter-based vaccinia virus neutralization assay to large-scale use in a study that included over 1,000 subjects. We also describe the statistical methods involved in analyzing the large quantity of data generated. The assay and its associated methods should prove useful tools in monitoring immune responses to next-generation smallpox vaccines, studying poxvirus immunity, and evaluating therapeutic agents such as vaccinia virus immune globulin. PMID:19535540

  13. Improving bed turnover time with a bed management system.

    PubMed

    Tortorella, Frank; Ukanowicz, Donna; Douglas-Ntagha, Pamela; Ray, Robert; Triller, Maureen

    2013-01-01

    Efficient patient throughput requires a high degree of coordination and communication. Opportunities abound to improve the patient experience by eliminating waste from the process and improving communication among the multiple disciplines involved in facilitating patient flow. In this article, we demonstrate how an interdisciplinary team at a large tertiary cancer center implemented an electronic bed management system to improve the bed turnover component of the patient throughput process.

  14. Custom Super-Resolution Microscope for the Structural Analysis of Nanostructures

    DTIC Science & Technology

    2018-05-29

    research community. As part of our validation of the new design approach, we performed two-color imaging of pairs of adjacent oligo probes hybridized...nanostructures and biological targets. Our microscope features a large field of view and custom optics that facilitate 3D imaging and enhanced contrast in...our imaging throughput by creating two microscopy platforms for high-throughput, super-resolution materials characterization, with the AO set-up being

  15. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping

    PubMed Central

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-01-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input data and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy (up to 0.98 and 0.95, respectively). In summary, IAP provides a comprehensive set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818

  16. Large-scale protein-protein interactions detection by integrating big biosensing data with computational model.

    PubMed

    You, Zhu-Hong; Li, Shuai; Gao, Xin; Luo, Xin; Ji, Zhen

    2014-01-01

    Protein-protein interactions (PPIs) are the basis of biological functions, and studying these interactions at the molecular level is of crucial importance for understanding the functionality of a living cell. During the past decade, biosensors have emerged as an important tool for the high-throughput identification of proteins and their interactions. However, high-throughput experimental methods for identifying PPIs are both time-consuming and expensive, and the resulting data are often associated with high false-positive and high false-negative rates. To address these problems, we propose a method for PPI detection that integrates biosensor-based PPI data with a novel computational model. This method was developed based on the extreme learning machine algorithm combined with a novel representation of the protein sequence descriptor. When performed on a large-scale human protein interaction dataset, the proposed method achieved 84.8% prediction accuracy with 84.08% sensitivity at a specificity of 85.53%. We conducted more extensive experiments to compare the proposed method with a state-of-the-art technique, the support vector machine. The achieved results demonstrate that our approach is very promising for detecting new PPIs, and it can be a helpful supplement for biosensor-based PPI data detection.

  17. Correction of Microplate Data from High-Throughput Screening.

    PubMed

    Wang, Yuhong; Huang, Ruili

    2016-01-01

    High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noise in the microplate-formatted data from HTS have unique characteristics and can generally be grouped into three categories: run-wise (temporal, across multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noise, based mainly on pattern recognition and digital signal processing technologies.

  18. A Low Collision and High Throughput Data Collection Mechanism for Large-Scale Super Dense Wireless Sensor Networks.

    PubMed

    Lei, Chunyang; Bie, Hongxia; Fang, Gengfa; Gaura, Elena; Brusey, James; Zhang, Xuekun; Dutkiewicz, Eryk

    2016-07-18

    Super-dense wireless sensor networks (WSNs) have become popular with the development of the Internet of Things (IoT), Machine-to-Machine (M2M) communications and Vehicle-to-Vehicle (V2V) networks. While highly dense wireless networks provide efficient and sustainable solutions to collect precise environmental information, a new channel access scheme is needed to solve the channel collision problem caused by the large number of competing nodes accessing the channel simultaneously. In this paper, we propose a space-time random access method based on a directional data transmission strategy, by which collisions in the wireless channel are significantly decreased and channel utilization efficiency is greatly enhanced. Simulation results show that our proposed method can decrease the packet loss rate to less than 2% in large-scale WSNs; in comparison with other channel access schemes for WSNs, the average network throughput can be doubled.
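    For context, the collision problem the authors attack can be quantified with the classic slotted-ALOHA baseline (this is the textbook reference model, not the paper's directional space-time scheme): with n contending nodes each transmitting in a given slot with probability p, a slot carries a packet successfully only when exactly one node transmits.

```python
def slot_success_probability(n, p):
    """P(exactly one of n nodes transmits in a slot), slotted ALOHA."""
    return n * p * (1 - p) ** (n - 1)
```

    The success probability peaks at p = 1/n and tends to 1/e ≈ 0.368 as n grows, so even an optimally tuned random-access channel wastes most slots in dense networks, which is the motivation for smarter (e.g. directional) access schemes.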

  19. High-Resolution X-Ray Telescopes

    NASA Technical Reports Server (NTRS)

    ODell, Stephen L.; Brissenden, Roger J.; Davis, William; Elsner, Ronald F.; Elvis, Martin; Freeman, Mark; Gaetz, Terry; Gorenstein, Paul; Gubarev, Mikhail V.

    2010-01-01

    Fundamental needs for future x-ray telescopes: (a) sharp images, requiring excellent angular resolution; (b) high throughput, requiring large aperture areas. Generation-X optics technical challenges: (a) high resolution demands precision mirrors and alignment; (b) large apertures demand many lightweight mirrors. Innovation is needed for technical readiness: four top-level error terms contribute to image size, and there are approaches to controlling those errors. Innovation is also needed for manufacturing readiness, and programmatic issues are comparably challenging.

  20. A high-throughput core sampling device for the evaluation of maize stalk composition

    PubMed Central

    2012-01-01

    Background A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large-scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with the high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen, maintaining the composition of the biomass sample for relatively long periods of time. Conclusions The CSD has been demonstrated to produce homogeneous stalk core samples in a repeatable manner, with a throughput substantially superior to that of currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameters evaluated, it is expected that the CSD will have utility for other bioenergy crops as well.
PMID:22548834

  1. Quality control methodology for high-throughput protein-protein interaction screening.

    PubMed

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, and its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help us better understand the molecular basis of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments, and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherently imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interaction screens.

  2. High-throughput combinatorial screening identifies drugs that cooperate with ibrutinib to kill activated B-cell–like diffuse large B-cell lymphoma cells

    PubMed Central

    Mathews Griner, Lesley A.; Guha, Rajarshi; Shinn, Paul; Young, Ryan M.; Keller, Jonathan M.; Liu, Dongbo; Goldlust, Ian S.; Yasgar, Adam; McKnight, Crystal; Boxer, Matthew B.; Duveau, Damien Y.; Jiang, Jian-Kang; Michael, Sam; Mierzwa, Tim; Huang, Wenwei; Walsh, Martin J.; Mott, Bryan T.; Patel, Paresma; Leister, William; Maloney, David J.; Leclair, Christopher A.; Rai, Ganesha; Jadhav, Ajit; Peyser, Brian D.; Austin, Christopher P.; Martin, Scott E.; Simeonov, Anton; Ferrer, Marc; Staudt, Louis M.; Thomas, Craig J.

    2014-01-01

    The clinical development of drug combinations is typically achieved through trial-and-error or via insight gained through a detailed molecular understanding of dysregulated signaling pathways in a specific cancer type. Unbiased small-molecule combination (matrix) screening represents a high-throughput means to explore hundreds and even thousands of drug–drug pairs for potential investigation and translation. Here, we describe a high-throughput screening platform capable of testing compounds in pairwise matrix blocks for the rapid and systematic identification of synergistic, additive, and antagonistic drug combinations. We use this platform to define potential therapeutic combinations for the activated B-cell–like subtype (ABC) of diffuse large B-cell lymphoma (DLBCL). We identify drugs with synergy, additivity, and antagonism with the Bruton’s tyrosine kinase inhibitor ibrutinib, which targets the chronic active B-cell receptor signaling that characterizes ABC DLBCL. Ibrutinib interacted favorably with a wide range of compounds, including inhibitors of the PI3K-AKT-mammalian target of rapamycin signaling cascade, other B-cell receptor pathway inhibitors, Bcl-2 family inhibitors, and several components of chemotherapy that is the standard of care for DLBCL. PMID:24469833
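    Pairwise matrix blocks like these are scored against a reference model of non-interaction; one common choice (the abstract does not specify which model the platform uses) is Bliss independence, where two independently acting drugs with fractional inhibitions f_a and f_b are expected to combine to f_a + f_b − f_a·f_b. A minimal sketch:

```python
def bliss_excess(fa, fb, fab):
    """Observed minus Bliss-expected fractional inhibition for a drug pair.

    fa, fb: single-agent fractional inhibitions (0-1) at the tested doses;
    fab: observed fractional inhibition of the combination.
    Positive excess suggests synergy, negative suggests antagonism,
    and values near zero indicate Bliss additivity.
    """
    expected = fa + fb - fa * fb
    return fab - expected
```

    In a matrix screen this score is computed for every dose pair in the block, and the pattern of excesses over the whole matrix classifies the combination as synergistic, additive, or antagonistic.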

  3. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    PubMed

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphism can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing only a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows high-throughput processing of multiple sequence alignments along with the flexibility to simulate data under complex demographic scenarios has been lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data, along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics, with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
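    Neutrality tests such as Tajima's D contrast two estimators of per-site diversity computed from a multiple sequence alignment. A minimal sketch of both estimators (omitting the variance normalization of the full test and DnaSAM's simulation machinery; the toy alignment in the test is illustrative only):

```python
def nucleotide_diversity(seqs):
    """Pi: average number of pairwise differences per site in an alignment."""
    n, length = len(seqs), len(seqs[0])
    diffs = sum(sum(a != b for a, b in zip(s1, s2))
                for i, s1 in enumerate(seqs) for s2 in seqs[i + 1:])
    n_pairs = n * (n - 1) / 2
    return diffs / n_pairs / length

def watterson_theta(seqs):
    """Theta_W: segregating sites scaled by the harmonic number, per site."""
    n, length = len(seqs), len(seqs[0])
    seg_sites = sum(len(set(column)) > 1 for column in zip(*seqs))
    harmonic = sum(1 / i for i in range(1, n))
    return seg_sites / harmonic / length
```

    Tajima's D is proportional to pi − theta_W: an excess of rare variants (as after a selective sweep or population expansion) drives it negative, while balancing selection or a bottleneck drives it positive, which is what coalescent simulations under a null model let one calibrate.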

  4. High-Throughput Light Sheet Microscopy for the Automated Live Imaging of Larval Zebrafish

    NASA Astrophysics Data System (ADS)

    Baker, Ryan; Logan, Savannah; Dudley, Christopher; Parthasarathy, Raghuveer

    The zebrafish is a model organism with a variety of useful properties; it is small and optically transparent, it reproduces quickly, it is a vertebrate, and there are a large variety of transgenic animals available. Because of these properties, the zebrafish is well suited to study using a variety of optical technologies including light sheet fluorescence microscopy (LSFM), which provides high-resolution three-dimensional imaging over large fields of view. Research progress, however, is often not limited by optical techniques but instead by the number of samples one can examine over the course of an experiment, which in the case of light sheet imaging has so far been severely limited. Here we present an integrated fluidic circuit and microscope which provides rapid, automated imaging of zebrafish using several imaging modes, including LSFM, Hyperspectral Imaging, and Differential Interference Contrast Microscopy. Using this system, we show that we can increase our imaging throughput by a factor of 10 compared to previous techniques. We also show preliminary results visualizing zebrafish immune response, which is sensitive to gut microbiota composition, and which shows a strong variability between individuals that highlights the utility of high throughput imaging. National Science Foundation, Award No. DBI-1427957.

  5. GOTree Machine (GOTM): a web-based platform for interpreting sets of interesting genes using Gene Ontology hierarchies

    PubMed Central

    Zhang, Bing; Schmoyer, Denise; Kirov, Stefan; Snoddy, Jay

    2004-01-01

    Background Microarray and other high-throughput technologies are producing large sets of interesting genes that are difficult to analyze directly. Bioinformatics tools are needed to interpret the functional information in these gene sets. Results We have created a web-based tool for data analysis and data visualization for sets of genes called GOTree Machine (GOTM). This tool was originally intended to analyze sets of co-regulated genes identified from microarray analysis but is adaptable for use with other gene sets from other high-throughput analyses. GOTree Machine generates a GOTree, a tree-like structure to navigate the Gene Ontology Directed Acyclic Graph for input gene sets. This system provides user-friendly data navigation and visualization. Statistical analysis helps users to identify the most important Gene Ontology categories for the input gene sets and suggests biological areas that warrant further study. GOTree Machine is available online at . Conclusion GOTree Machine has broad application in functional genomics, proteomics, and other high-throughput methods that generate large sets of interesting genes; its primary purpose is to help users sort for interesting patterns in gene sets. PMID:14975175
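    The statistical analysis used to rank Gene Ontology categories in tools of this kind is typically a hypergeometric over-representation test (the abstract does not state the exact test GOTM uses). The sketch below computes the upper-tail probability of seeing at least `hits` category members in a gene list drawn from a genome-wide background:

```python
from math import comb

def enrichment_p(total_genes, category_genes, list_size, hits):
    """Hypergeometric upper tail: P(X >= hits) when list_size genes are
    drawn without replacement from total_genes, of which category_genes
    belong to the GO category. Small p indicates over-representation."""
    denom = comb(total_genes, list_size)
    upper = min(category_genes, list_size)
    return sum(comb(category_genes, k) *
               comb(total_genes - category_genes, list_size - k)
               for k in range(hits, upper + 1)) / denom
```

    In practice the test is run once per GO category and the p-values are corrected for multiple testing (e.g. Benjamini-Hochberg) before categories are ranked.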

  6. HiCTMap: Detection and analysis of chromosome territory structure and position by high-throughput imaging.

    PubMed

    Jowhar, Ziad; Gudla, Prabhakar R; Shachar, Sigal; Wangsa, Darawalee; Russ, Jill L; Pegoraro, Gianluca; Ried, Thomas; Raznahan, Armin; Misteli, Tom

    2018-06-01

    The spatial organization of chromosomes in the nuclear space is an extensively studied field that relies on measurements of structural features and 3D positions of chromosomes with high precision and robustness. However, no tools are currently available to image and analyze chromosome territories in a high-throughput format. Here, we have developed High-throughput Chromosome Territory Mapping (HiCTMap), a method for the robust and rapid analysis of 2D and 3D chromosome territory positioning in mammalian cells. HiCTMap is a high-throughput imaging-based chromosome detection method which enables routine analysis of chromosome structure and nuclear position. Using an optimized FISH staining protocol in a 384-well plate format in conjunction with a bespoke automated image analysis workflow, HiCTMap faithfully detects chromosome territories and their position in 2D and 3D in a large population of cells per experimental condition. We apply this novel technique to visualize chromosomes 18, X, and Y in male and female primary human skin fibroblasts, and show accurate detection of the correct number of chromosomes in the respective genotypes. Given the ability to visualize and quantitatively analyze large numbers of nuclei, we use HiCTMap to measure chromosome territory area and volume with high precision and determine the radial position of chromosome territories using either centroid or equidistant-shell analysis. The HiCTMap protocol is also compatible with RNA FISH as demonstrated by simultaneous labeling of X chromosomes and Xist RNA in female cells. We suggest HiCTMap will be a useful tool for routine precision mapping of chromosome territories in a wide range of cell types and tissues. Published by Elsevier Inc.

  7. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    PubMed

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing are very important for accurately determining these characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-image-analysis pipeline built on machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly while minimizing plant stress, and the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed with a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations of three robust learning algorithms to identify the one most appropriate for our proposed system. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning-data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to easily analyze plant growth from large-scale plant image data.

  8. Accelerating evaluation of converged lattice thermal conductivity

    NASA Astrophysics Data System (ADS)

    Qin, Guangzhao; Hu, Ming

    2018-01-01

    High-throughput computational materials design is an emerging area in materials science, which is based on the fast evaluation of physics-related properties. The lattice thermal conductivity (κ) is a key property of materials with enormous implications. However, the high-throughput evaluation of κ remains a challenge due to the large resource costs and time-consuming procedures. In this paper, we propose a concise strategy to efficiently accelerate the process of obtaining accurate and converged κ. The strategy is in the framework of the phonon Boltzmann transport equation (BTE) coupled with first-principles calculations. Based on an analysis of the harmonic interatomic force constants (IFCs), a sufficiently large cutoff radius (rcutoff), a critical parameter involved in calculating the anharmonic IFCs, can be directly determined to get satisfactory results. Moreover, we find a simple way to largely (~10 times) accelerate the computations by quickly reconstructing the anharmonic IFCs in the convergence test of κ with respect to rcutoff, which finally confirms that the chosen rcutoff is appropriate. Two-dimensional graphene and phosphorene along with bulk SnSe are presented to validate our approach, and the long-debated divergence problem of thermal conductivity in low-dimensional systems is studied. The quantitative strategy proposed herein can be a good candidate for fast evaluation of reliable κ and thus provides a useful tool for high-throughput materials screening and design with targeted thermal transport properties.
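    The convergence test described above amounts to increasing rcutoff until κ stops changing. A minimal sketch of that loop follows, with a hypothetical saturating stand-in for the expensive BTE calculation (all numbers illustrative, not from the paper):

```python
from math import exp

def converged_kappa(kappa_of_rcutoff, radii, rel_tol=0.02):
    """Evaluate kappa at increasing cutoff radii and return the first
    (radius, kappa) pair whose relative change from the previous radius
    falls below rel_tol."""
    previous = None
    for r in radii:
        k = kappa_of_rcutoff(r)
        if previous is not None and abs(k - previous) / previous < rel_tol:
            return r, k
        previous = k
    raise ValueError("kappa not converged over the supplied radii")

# Hypothetical stand-in: kappa saturates as the cutoff radius grows.
# A real workflow would call the first-principles BTE solver here instead.
model = lambda r: 3000.0 * (1.0 - exp(-r / 3.0))
r_ok, k_ok = converged_kappa(model, radii=range(2, 11))
```

    The paper's acceleration comes from making each `kappa_of_rcutoff(r)` call cheap by reconstructing the anharmonic IFCs rather than recomputing them from scratch; the loop structure itself is unchanged.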

  9. Conceptual dissonance: evaluating the efficacy of natural language processing techniques for validating translational knowledge constructs.

    PubMed

    Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B

    2009-03-01

    The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.

  10. High throughput optical lithography by scanning a massive array of bowtie aperture antennas at near-field

    PubMed Central

    Wen, X.; Datta, A.; Traverso, L. M.; Pan, L.; Xu, X.; Moon, E. E.

    2015-01-01

    Optical lithography, the enabling process for defining features, has been widely used in the semiconductor industry and many other nanotechnology applications. Advances in nanotechnology require the development of high-throughput optical lithography capabilities that overcome the optical diffraction limit and meet ever-decreasing device dimensions. We report our recent experimental advances in scaling up diffraction-unlimited optical lithography on a massive scale using the near-field nanolithography capabilities of bowtie apertures. A record number of near-field optical elements, an array of 1,024 bowtie antenna apertures, are simultaneously employed to generate a large number of patterns by carefully controlling their working distances over the entire array using an optical gap metrology system. Our experimental results demonstrate the ability of massively parallel near-field devices to achieve high-throughput optical nanolithography, which is promising for many important nanotechnology applications such as computation, data storage, communication, and energy. PMID:26525906

  11. A high-throughput exploration of magnetic materials by using structure predicting methods

    NASA Astrophysics Data System (ADS)

    Arapan, S.; Nieves, P.; Cuesta-López, S.

    2018-02-01

    We study the capability of a structure-prediction method based on a genetic/evolutionary algorithm for high-throughput exploration of magnetic materials. We use the USPEX and VASP codes to predict stable structures and generate low-energy metastable structures for a set of representative magnetic systems comprising intermetallic alloys, oxides, interstitial compounds, and systems containing rare-earth elements, with both ferromagnetic and antiferromagnetic ordering. We have modified the interface between the USPEX and VASP codes to improve the performance of structural optimization and to perform calculations in a high-throughput manner. We show that exploring the structure phase space with a structure-prediction technique reveals large sets of low-energy metastable structures, which not only enrich currently existing databases but may also provide understanding and solutions for stabilizing and synthesizing magnetic materials suitable for permanent-magnet applications.

  12. From Classical to High Throughput Screening Methods for Feruloyl Esterases: A Review.

    PubMed

    Ramírez-Velasco, Lorena; Armendáriz-Ruiz, Mariana; Rodríguez-González, Jorge Alberto; Müller-Santos, Marcelo; Asaff-Torres, Ali; Mateos-Díaz, Juan Carlos

    2016-01-01

    Feruloyl esterases (FAEs) are a diverse group of hydrolases, widely distributed in plants and microorganisms, which catalyze the cleavage and formation of ester bonds between plant cell wall polysaccharides and phenolic acids. FAEs have gained importance in the biofuel, medicine and food industries due to their capability of acting on a large range of substrates, cleaving ester bonds and synthesizing high added-value molecules through esterification and transesterification reactions. Although extensive studies on the production, characterization and classification of FAEs have been carried out over the past two decades, only a few suitable High Throughput Screening assays for this kind of enzyme have been reported. This review offers a concise but complete survey of classical to High Throughput Screening methods for FAEs, highlighting their advantages and disadvantages, and finally suggests future perspectives for this important research field.

  13. Image-based computational quantification and visualization of genetic alterations and tumour heterogeneity

    PubMed Central

    Zhong, Qing; Rüschoff, Jan H.; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J.; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J.; Rupp, Niels J.; Fankhauser, Christian; Buhmann, Joachim M.; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A.; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C.; Jochum, Wolfram; Wild, Peter J.

    2016-01-01

    Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (ISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility. PMID:27052161

  14. Image-based computational quantification and visualization of genetic alterations and tumour heterogeneity.

    PubMed

    Zhong, Qing; Rüschoff, Jan H; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J; Rupp, Niels J; Fankhauser, Christian; Buhmann, Joachim M; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C; Jochum, Wolfram; Wild, Peter J

    2016-04-07

    Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (ISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility.

  15. High-throughput bioinformatics with the Cyrille2 pipeline system

    PubMed Central

    Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ

    2008-01-01

    Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web-based, graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system, tracks what data enters the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes them on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines. PMID:18269742

  16. A high throughput array microscope for the mechanical characterization of biomaterials

    NASA Astrophysics Data System (ADS)

    Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard

    2015-02-01

    In the last decade, the emergence of high-throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties, and of the biochemical mechanisms controlling those properties, contributes significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high-throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated high-throughput array microscope system that utilizes passive microbead diffusion to characterize the mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue whose expression has been implicated in cancer progression.

  17. Continuous inertial microparticle and blood cell separation in straight channels with local microstructures.

    PubMed

    Wu, Zhenlong; Chen, Yu; Wang, Moran; Chung, Aram J

    2016-02-07

    Fluid inertia, which has conventionally been neglected in microfluidics, has been gaining much attention for particle and cell manipulation because inertia-based methods inherently provide simple, passive, precise and high-throughput characteristics. Particularly, the inertial approach has been applied to blood separation for various biomedical research studies, mainly using spiral microchannels. For higher throughput, parallelization is essential; however, it is difficult to realize using spiral channels because of their large two-dimensional layouts. In this work, we present a novel inertial platform for continuous sheathless particle and blood cell separation in straight microchannels containing microstructures. Microstructures within straight channels exert secondary flows to manipulate particle positions similar to Dean flow in curved channels but with higher controllability. Through a balance between inertial lift force and microstructure-induced secondary flow, we deterministically position microspheres and cells based on their sizes to be separated downstream. Using our inertial platform, we successfully sorted microparticles and fractionized blood cells with high separation efficiencies, high purities and high throughputs. The inertial separation platform developed here can be operated to process diluted blood with a throughput of 10.8 mL min(-1) via radially arrayed single channels with one inlet and two rings of outlets.

  18. Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.

    PubMed

    Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin

    2016-02-01

    High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.

  19. THE RABIT: A RAPID AUTOMATED BIODOSIMETRY TOOL FOR RADIOLOGICAL TRIAGE

    PubMed Central

    Garty, Guy; Chen, Youhua; Salerno, Alessio; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Amundson, Sally A.; Brenner, David J.

    2010-01-01

    In response to the recognized need for high-throughput biodosimetry methods for use after large-scale radiological events, a logical approach is complete automation of standard biodosimetric assays that are currently performed manually. We describe progress to date on the RABIT (Rapid Automated BIodosimetry Tool), designed to score micronuclei or γ-H2AX fluorescence in lymphocytes derived from a single drop of blood from a fingerstick. The RABIT system is designed to be completely automated, from the input of the capillary blood sample into the machine to the output of a dose estimate. Improvements in throughput are achieved through use of a single drop of blood, optimization of the biological protocols for in-situ analysis in multi-well plates, implementation of robotic plate and liquid handling, and new developments in high-speed imaging. Automating well-established bioassays represents a promising approach to high-throughput radiation biodosimetry, both because high throughputs can be achieved and because the time to deployment is potentially much shorter than for a new biological assay. Here we describe the development of each of the individual modules of the RABIT system and show preliminary data from key modules. System integration is ongoing, to be followed by calibration and validation. PMID:20065685

  20. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy

    PubMed Central

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-01-01

    Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers seeking to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS) technology. In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high-sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then put forward a new design strategy for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with a comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for the high-resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925

  1. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    PubMed Central

    Gaber, Rok; Majerle, Andreja; Jerala, Roman; Benčina, Mojca

    2013-01-01

    To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs a ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity. PMID:24287545
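    As an illustration of the ratiometric readout described above (all intensities hypothetical; the actual sensor pairs mCerulean with mCitrine), each cell's FRET state can be summarized as an acceptor/donor emission ratio, and a population scored by the fraction of cells falling below a cleavage threshold:

```python
def fret_ratio(donor, acceptor):
    """Per-cell ratiometric readout: acceptor (FRET) emission over donor
    emission. Protease cleavage of the sensor abolishes FRET, so the
    ratio drops toward the cleaved baseline."""
    return acceptor / donor

def fraction_cleaved(cells, threshold=1.0):
    """Fraction of cells whose FRET ratio falls below `threshold`."""
    ratios = [fret_ratio(d, a) for d, a in cells]
    return sum(r < threshold for r in ratios) / len(ratios)

# Hypothetical (donor, acceptor) intensities for four cells: two with an
# intact sensor (high ratio), two cleaved by HIV protease (low ratio).
cells = [(400.0, 900.0), (380.0, 870.0), (850.0, 300.0), (820.0, 310.0)]
print(fraction_cleaved(cells))  # → 0.5
```

    In a real flow-cytometry analysis the threshold would be calibrated against protease-free and fully cleaved control populations rather than fixed a priori.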

  2. High-throughput and high-yield fabrication of uniaxially-aligned chitosan-based nanofibers by centrifugal electrospinning

    PubMed Central

    Erickson, Ariane E.; Edmondson, Dennis; Chang, Fei-Chien; Wood, Dave; Gong, Alex; Levengood, Sheeny Lan; Zhang, Miqin

    2016-01-01

    The inability to produce large quantities of nanofibers has been a primary obstacle to the advancement and commercialization of electrospinning technologies, especially when aligned nanofibers are desired. Here, we present a high-throughput centrifugal electrospinning (HTP-CES) system capable of producing a large number of highly aligned nanofiber samples with high yield and tunable diameters. The versatility of the design was revealed when bead-less nanofibers were produced from copolymer chitosan/polycaprolactone (C-PCL) solutions despite variations in polymer blend composition or spinneret needle gauge. Compared to conventional electrospinning techniques, fibers spun with the HTP-CES exhibited not only superior alignment but also better diameter uniformity. Nanofiber alignment was quantified using Fast Fourier Transform (FFT) analysis. In addition, a concave correlation between the needle diameter and resultant fiber diameter was identified. This system can be easily scaled up for industrial production of highly aligned nanofibers with tunable diameters that can potentially meet the requirements of various engineering and biomedical applications. PMID:26428148

  3. Optima MDxt: A high throughput 335 keV mid-dose implanter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eisner, Edward; David, Jonathan; Justesen, Perry

    2012-11-06

    The continuing demand for energy purity and implant-angle control, along with high wafer throughput, drove the development of the Axcelis Optima MDxt mid-dose ion implanter. The system utilizes electrostatic scanning, an electrostatic parallelizing lens and an electrostatic energy filter to produce energetically pure beams with high angular integrity. Based on field-proven components, the Optima MDxt beamline architecture offers the high beam currents possible with singly charged species, including arsenic at energies up to 335 keV, as well as large currents from multiply charged species at energies extending over 1 MeV. Conversely, the excellent energy filtering capability allows high currents at low beam energies, since it is safe to utilize large deceleration ratios. This beamline is coupled with the >500 WPH capable endstation technology used on the Axcelis Optima XEx high-energy ion implanter. The endstation includes in-situ angle measurements of the beam in order to maintain excellent beam-to-wafer implant-angle control in both the horizontal and vertical directions. The Optima platform control system provides a new-generation dose control system that assures excellent dosimetry and charge control. This paper describes the features and technologies that allow the Optima MDxt to provide superior process performance at the highest wafer throughput, and provides examples of the achievable process performance.

  4. Gold nanoparticles for high-throughput genotyping of long-range haplotypes

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Pan, Dun; Fan, Chunhai; Chen, Jianhua; Huang, Ke; Wang, Dongfang; Zhang, Honglu; Li, You; Feng, Guoyin; Liang, Peiji; He, Lin; Shi, Yongyong

    2011-10-01

    Completion of the Human Genome Project and the HapMap Project has led to increasing demands for mapping complex traits in humans to understand the aetiology of diseases. Identifying variations in the DNA sequence, which affect how we develop disease and respond to pathogens and drugs, is important for this purpose, but it is difficult to identify these variations in large sample sets. Here we show that through a combination of capillary sequencing and polymerase chain reaction assisted by gold nanoparticles, it is possible to identify several DNA variations that are associated with age-related macular degeneration and psoriasis on significant regions of human genomic DNA. Our method is accurate and promising for large-scale and high-throughput genetic analysis of susceptibility towards disease and drug resistance.

  5. Population Studies of Intact Vitamin D Binding Protein by Affinity Capture ESI-TOF-MS

    PubMed Central

    Borges, Chad R.; Jarvis, Jason W.; Oran, Paul E.; Rogers, Stephen P.; Nelson, Randall W.

    2008-01-01

    Blood plasma proteins with molecular weights greater than approximately 30 kDa are refractory to comprehensive, high-throughput qualitative characterization of microheterogeneity across human populations. Analytical techniques for obtaining high mass resolution for targeted, intact protein characterization and, separately, high sample throughput exist, but efficient means of coupling these assay characteristics remain rather limited. This article discusses the impetus for analyzing intact proteins in a targeted manner across populations and describes the methodology required to couple mass spectrometric immunoassay with electrospray ionization mass spectrometry for the purpose of qualitatively characterizing a prototypical large plasma protein, vitamin D binding protein, across populations. PMID:19137103

  6. A holistic high-throughput screening framework for biofuel feedstock assessment that characterises variations in soluble sugars and cell wall composition in Sorghum bicolor

    PubMed Central

    2013-01-01

    Background A major hindrance to the development of high yielding biofuel feedstocks is the ability to rapidly assess large populations for fermentable sugar yields. Whilst recent advances have outlined methods for the rapid assessment of biomass saccharification efficiency, none take into account the total biomass, or the soluble sugar fraction of the plant. Here we present a holistic high-throughput methodology for assessing sweet Sorghum bicolor feedstocks at 10 days post-anthesis for total fermentable sugar yields including stalk biomass, soluble sugar concentrations, and cell wall saccharification efficiency. Results A mathematical method for assessing whole S. bicolor stalks using the fourth internode from the base of the plant proved to be an effective high-throughput strategy for assessing stalk biomass, soluble sugar concentrations, and cell wall composition and allowed calculation of total stalk fermentable sugars. A high-throughput method for measuring soluble sucrose, glucose, and fructose using partial least squares (PLS) modelling of juice Fourier transform infrared (FTIR) spectra was developed. The PLS prediction was shown to be highly accurate with each sugar attaining a coefficient of determination (R 2 ) of 0.99 with a root mean squared error of prediction (RMSEP) of 11.93, 5.52, and 3.23 mM for sucrose, glucose, and fructose, respectively, which constitutes an error of <4% in each case. The sugar PLS model correlated well with gas chromatography–mass spectrometry (GC-MS) and brix measures. Similarly, a high-throughput method for predicting enzymatic cell wall digestibility using PLS modelling of FTIR spectra obtained from S. bicolor bagasse was developed. The PLS prediction was shown to be accurate with an R 2 of 0.94 and RMSEP of 0.64 μg.mgDW-1.h-1. 
Conclusions This methodology has been demonstrated as an efficient and effective way to screen large biofuel feedstock populations for biomass, soluble sugar concentrations, and cell wall digestibility simultaneously, allowing a total fermentable yield calculation. It unifies and simplifies previous screening methodologies to produce a holistic assessment of biofuel feedstock potential. PMID:24365407
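The PLS calibration workflow described above (spectra in, sugar concentrations out, accuracy reported as R² and RMSEP) can be sketched in a few lines. This is a minimal PLS1 (NIPALS) implementation on invented spectra; the band shape, noise level, and train/test split are illustrative assumptions, not the authors' data or pipeline.

```python
import numpy as np

# Hypothetical FTIR-like spectra: 120 samples x 400 channels, where
# absorbance of one band scales linearly with sucrose concentration (mM).
rng = np.random.default_rng(0)
conc = rng.uniform(5.0, 150.0, 120)                     # sucrose, mM
band = np.exp(-((np.arange(400) - 180) / 30.0) ** 2)    # one absorbance band
X = np.outer(conc, band) + rng.normal(0.0, 0.5, (120, 400))

def pls1_fit(X, y, n_comp):
    """PLS1 via the NIPALS algorithm; returns regression vector and intercept."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc                 # weight vector from X-y covariance
        w /= np.linalg.norm(w)
        t = Xc @ w                    # scores
        p = Xc.T @ t / (t @ t)        # X loadings
        W.append(w); P.append(p); q.append(yc @ t / (t @ t))
        Xc -= np.outer(t, p)          # deflate
        yc = yc - q[-1] * t
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)
    return b, y.mean() - X.mean(0) @ b

b, b0 = pls1_fit(X[:90], conc[:90], n_comp=3)
pred = X[90:] @ b + b0
rmsep = np.sqrt(np.mean((pred - conc[90:]) ** 2))
r2 = 1.0 - np.sum((pred - conc[90:]) ** 2) / np.sum((conc[90:] - conc[90:].mean()) ** 2)
```

On clean synthetic data like this the held-out R² approaches 1; real juice spectra would need preprocessing (baseline correction, scatter correction) before a comparable fit.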

  7. Blood group genotyping: from patient to high-throughput donor screening.

    PubMed

    Veldhuisen, B; van der Schoot, C E; de Haas, M

    2009-10-01

Blood group antigens, present on the cell membranes of red blood cells and platelets, can be defined either serologically or predicted based on the genotypes of the genes encoding them. At present, the molecular basis of many antigens of the 30 blood group systems and 17 human platelet antigens is known. In many laboratories, blood group genotyping assays are routinely used for diagnostics in cases where patient red cells cannot be used for serological typing due to the presence of auto-antibodies or after recent transfusions. In addition, DNA genotyping is used to support (un)expected serological findings. Fetal genotyping is routinely performed when there is a risk of alloimmune-mediated red cell or platelet destruction. In the case of patient blood group antigen typing, it is important that a genotyping result is quickly available to support the selection of donor blood, and high throughput of the genotyping method is not a prerequisite. In addition, genotyping of blood donors is extremely useful for obtaining donor blood with rare phenotypes, for example lacking a high-frequency antigen, and for building a fully typed donor database to be used for better matching between recipient and donor to prevent adverse transfusion reactions. Serological typing of large cohorts of donors is a labour-intensive and expensive exercise, hampered by the lack of sufficient amounts of approved typing reagents for all blood group systems of interest. Currently, high-throughput genotyping based on DNA micro-arrays is a very feasible method for obtaining a large pool of well-typed blood donors. Several systems for high-throughput blood group genotyping have been developed and are discussed in this review.

  8. An efficient and high-throughput protocol for Agrobacterium-mediated transformation based on phosphomannose isomerase positive selection in Japonica rice (Oryza sativa L.).

    PubMed

    Duan, Yongbo; Zhai, Chenguang; Li, Hao; Li, Juan; Mei, Wenqian; Gui, Huaping; Ni, Dahu; Song, Fengshun; Li, Li; Zhang, Wanggen; Yang, Jianbo

    2012-09-01

A number of Agrobacterium-mediated rice transformation systems have been developed and are widely used in numerous laboratories and research institutes. However, those systems generally employ antibiotics such as kanamycin and hygromycin, or herbicides, as selectable agents, and are used for small-scale experiments. To address high-throughput production of transgenic rice plants via Agrobacterium-mediated transformation, and to eliminate public concern over antibiotic markers, we developed a comprehensive and efficient protocol, covering explant preparation through the acquisition of low-copy events by real-time PCR analysis before transplanting to the field, for high-throughput production of transgenic plants of the Japonica rice varieties Wanjing97 and Nipponbare using the Escherichia coli phosphomannose isomerase gene (pmi) as a selectable marker. Transformation frequencies (TF) of Wanjing97 and Nipponbare reached 54.8 and 47.5%, respectively, in one round of selection on 7.5 or 12.5 g/L mannose supplemented with 5 g/L sucrose. High-throughput transformation from inoculation to transplanting of low-copy events was accomplished within 55-60 days. Moreover, TaqMan assay data from a large number of transformants showed low-copy rates of 45.2% in Wanjing97 and 31.5% in Nipponbare, and the transformants are fertile and follow Mendelian segregation ratios. This protocol enables genome-wide functional annotation of open reading frames and utilization of agronomically important genes in rice with reduced public concern over selectable markers. We describe a comprehensive protocol for large-scale production of transgenic Japonica rice plants using a non-antibiotic selectable agent in a simplified, cost- and labour-saving manner.

  9. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    NASA Astrophysics Data System (ADS)

    Alexander, Kristen; Hampton, Meredith; Lopez, Rene; Desimone, Joseph

    2009-03-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  10. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    NASA Astrophysics Data System (ADS)

    Alexander, Kristen; Lopez, Rene; Hampton, Meredith; Desimone, Joseph

    2008-10-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  11. Classification of large circulating tumor cells isolated with ultra-high throughput microfluidic Vortex technology.

    PubMed

    Che, James; Yu, Victor; Dhar, Manjima; Renier, Corinne; Matsumoto, Melissa; Heirich, Kyra; Garon, Edward B; Goldman, Jonathan; Rao, Jianyu; Sledge, George W; Pegram, Mark D; Sheth, Shruti; Jeffrey, Stefanie S; Kulkarni, Rajan P; Sollier, Elodie; Di Carlo, Dino

    2016-03-15

    Circulating tumor cells (CTCs) are emerging as rare but clinically significant non-invasive cellular biomarkers for cancer patient prognosis, treatment selection, and treatment monitoring. Current CTC isolation approaches, such as immunoaffinity, filtration, or size-based techniques, are often limited by throughput, purity, large output volumes, or inability to obtain viable cells for downstream analysis. For all technologies, traditional immunofluorescent staining alone has been employed to distinguish and confirm the presence of isolated CTCs among contaminating blood cells, although cells isolated by size may express vastly different phenotypes. Consequently, CTC definitions have been non-trivial, researcher-dependent, and evolving. Here we describe a complete set of objective criteria, leveraging well-established cytomorphological features of malignancy, by which we identify large CTCs. We apply the criteria to CTCs enriched from stage IV lung and breast cancer patient blood samples using the High Throughput Vortex Chip (Vortex HT), an improved microfluidic technology for the label-free, size-based enrichment and concentration of rare cells. We achieve improved capture efficiency (up to 83%), high speed of processing (8 mL/min of 10x diluted blood, or 800 μL/min of whole blood), and high purity (avg. background of 28.8±23.6 white blood cells per mL of whole blood). We show markedly improved performance of CTC capture (84% positive test rate) in comparison to previous Vortex designs and the current FDA-approved gold standard CellSearch assay. The results demonstrate the ability to quickly collect viable and pure populations of abnormal large circulating cells unbiased by molecular characteristics, which helps uncover further heterogeneity in these cells.
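The quoted flow rates imply a simple dilution-adjusted throughput, which can be checked with a few lines of arithmetic (the 10 mL sample volume below is a hypothetical example, not from the paper):

```python
# Consistency check: 8 mL/min of 10x-diluted blood should correspond
# to 800 uL/min of whole blood, as stated above.
diluted_ml_min = 8.0        # mL/min of 10x-diluted blood
dilution = 10               # dilution factor
whole_blood_ul_min = diluted_ml_min / dilution * 1000.0   # uL/min whole blood

# Hypothetical 10 mL whole-blood sample: processing time in minutes.
minutes_per_10ml_sample = 10.0 * 1000.0 / whole_blood_ul_min
```

This reproduces the paper's stated equivalence (800 μL/min of whole blood) and shows a 10 mL draw would take on the order of 12-13 minutes at that rate.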

  12. Classification of large circulating tumor cells isolated with ultra-high throughput microfluidic Vortex technology

    PubMed Central

    Che, James; Yu, Victor; Dhar, Manjima; Renier, Corinne; Matsumoto, Melissa; Heirich, Kyra; Garon, Edward B.; Goldman, Jonathan; Rao, Jianyu; Sledge, George W.; Pegram, Mark D.; Sheth, Shruti; Jeffrey, Stefanie S.; Kulkarni, Rajan P.; Sollier, Elodie; Di Carlo, Dino

    2016-01-01

    Circulating tumor cells (CTCs) are emerging as rare but clinically significant non-invasive cellular biomarkers for cancer patient prognosis, treatment selection, and treatment monitoring. Current CTC isolation approaches, such as immunoaffinity, filtration, or size-based techniques, are often limited by throughput, purity, large output volumes, or inability to obtain viable cells for downstream analysis. For all technologies, traditional immunofluorescent staining alone has been employed to distinguish and confirm the presence of isolated CTCs among contaminating blood cells, although cells isolated by size may express vastly different phenotypes. Consequently, CTC definitions have been non-trivial, researcher-dependent, and evolving. Here we describe a complete set of objective criteria, leveraging well-established cytomorphological features of malignancy, by which we identify large CTCs. We apply the criteria to CTCs enriched from stage IV lung and breast cancer patient blood samples using the High Throughput Vortex Chip (Vortex HT), an improved microfluidic technology for the label-free, size-based enrichment and concentration of rare cells. We achieve improved capture efficiency (up to 83%), high speed of processing (8 mL/min of 10x diluted blood, or 800 μL/min of whole blood), and high purity (avg. background of 28.8±23.6 white blood cells per mL of whole blood). We show markedly improved performance of CTC capture (84% positive test rate) in comparison to previous Vortex designs and the current FDA-approved gold standard CellSearch assay. The results demonstrate the ability to quickly collect viable and pure populations of abnormal large circulating cells unbiased by molecular characteristics, which helps uncover further heterogeneity in these cells. PMID:26863573

  13. High-throughput analysis of peptide binding modules

    PubMed Central

    Liu, Bernard A.; Engelmann, Brett; Nash, Piers D.

    2014-01-01

    Modular protein interaction domains that recognize linear peptide motifs are found in hundreds of proteins within the human genome. Some protein interaction domains such as SH2, 14-3-3, Chromo and Bromo domains serve to recognize post-translational modification of amino acids (such as phosphorylation, acetylation, methylation etc.) and translate these into discrete cellular responses. Other modules such as SH3 and PDZ domains recognize linear peptide epitopes and serve to organize protein complexes based on localization and regions of elevated concentration. In both cases, the ability to nucleate specific signaling complexes is in large part dependent on the selectivity of a given protein module for its cognate peptide ligand. High throughput analysis of peptide-binding domains by peptide or protein arrays, phage display, mass spectrometry or other HTP techniques provides new insight into the potential protein-protein interactions prescribed by individual or even whole families of modules. Systems level analyses have also promoted a deeper understanding of the underlying principles that govern selective protein-protein interactions and how selectivity evolves. Lastly, there is a growing appreciation for the limitations and potential pitfalls of high-throughput analysis of protein-peptide interactomes. This review will examine some of the common approaches utilized for large-scale studies of protein interaction domains and suggest a set of standards for the analysis and validation of datasets from large-scale studies of peptide-binding modules. We will also highlight how data from large-scale studies of modular interaction domain families can provide insight into systems level properties such as the linguistics of selective interactions. PMID:22610655

  14. Matrix-assisted laser desorption/ionisation, time-of-flight mass spectrometry-based blood group genotyping--the alternative approach.

    PubMed

    Gassner, Christoph; Meyer, Stefan; Frey, Beat M; Vollmert, Caren

    2013-01-01

Although matrix-assisted laser desorption/ionisation, time-of-flight mass spectrometry (MALDI-TOF MS) has previously been reported for high throughput blood group genotyping, those reports are limited to only a few blood group systems. This review describes the development of a large cooperative Swiss-German project, aiming to employ MALDI-TOF MS for the molecular detection of the blood groups Rh, Kell, Kidd, Duffy, MNSs, a comprehensive collection of low incidence antigens, as well as the platelet and granulocyte antigens HPA and HNA, representing a total of 101 blood group antigens encoded by 170 alleles. Recent reports describe MALDI-TOF MS as a technology with short time-to-resolution, ability for high throughput, and cost-efficiency when used in genetic analysis, including forensics, pharmacogenetics, oncology and hematology. Furthermore, Kell and RhD genotyping have been performed on fetal DNA from maternal plasma with excellent results. In summary, this article introduces a new technological approach for high throughput blood group genotyping by means of MALDI-TOF MS. Although all data presented are preliminary, the observed success rates, data quality and concordance with known blood group types are highly impressive, underlining the accuracy and reliability of this cost-efficient high throughput method. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Multiplexed single-molecule force spectroscopy using a centrifuge.

    PubMed

    Yang, Darren; Ward, Andrew; Halvorsen, Ken; Wong, Wesley P

    2016-03-17

    We present a miniature centrifuge force microscope (CFM) that repurposes a benchtop centrifuge for high-throughput single-molecule experiments with high-resolution particle tracking, a large force range, temperature control and simple push-button operation. Incorporating DNA nanoswitches to enable repeated interrogation by force of single molecular pairs, we demonstrate increased throughput, reliability and the ability to characterize population heterogeneity. We perform spatiotemporally multiplexed experiments to collect 1,863 bond rupture statistics from 538 traceable molecular pairs in a single experiment, and show that 2 populations of DNA zippers can be distinguished using per-molecule statistics to reduce noise.
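The idea of using per-molecule statistics to reduce noise can be illustrated with a toy model: averaging n rupture measurements from the same molecular pair shrinks the spread of that molecule's estimate by roughly sqrt(n), separating populations whose single-rupture distributions overlap. All numbers below are invented for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)
# Two hypothetical zipper populations with overlapping single-rupture forces.
n_mol, n_ruptures = 100, 20
pop_a = rng.normal(12.0, 3.0, (n_mol, n_ruptures))   # rupture forces, pN
pop_b = rng.normal(15.0, 3.0, (n_mol, n_ruptures))

# Per-molecule means: spread shrinks from ~3 pN to ~3/sqrt(20) pN.
means_a, means_b = pop_a.mean(axis=1), pop_b.mean(axis=1)

# Fraction of population-A observations misclassified by a midpoint
# threshold, for single ruptures vs per-molecule means.
overlap_single = np.mean(pop_a.ravel() > 13.5)
overlap_mean = np.mean(means_a > 13.5)
```

With single ruptures a large fraction of population A crosses the midpoint; with per-molecule means the two peaks separate cleanly, which is the effect repeated interrogation by force exploits.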

  16. Multiplexed single-molecule force spectroscopy using a centrifuge

    PubMed Central

    Yang, Darren; Ward, Andrew; Halvorsen, Ken; Wong, Wesley P.

    2016-01-01

    We present a miniature centrifuge force microscope (CFM) that repurposes a benchtop centrifuge for high-throughput single-molecule experiments with high-resolution particle tracking, a large force range, temperature control and simple push-button operation. Incorporating DNA nanoswitches to enable repeated interrogation by force of single molecular pairs, we demonstrate increased throughput, reliability and the ability to characterize population heterogeneity. We perform spatiotemporally multiplexed experiments to collect 1,863 bond rupture statistics from 538 traceable molecular pairs in a single experiment, and show that 2 populations of DNA zippers can be distinguished using per-molecule statistics to reduce noise. PMID:26984516

  17. Molecular characterization of a novel rhabdovirus infecting blackcurrant identified by high-throughput sequencing.

    PubMed

    Wu, L-P; Yang, T; Liu, H-W; Postman, J; Li, R

    2018-05-01

A large contig with sequence similarities to several nucleorhabdoviruses was identified by high-throughput sequencing analysis of a black currant (Ribes nigrum L.) cultivar. The complete genome sequence of this new nucleorhabdovirus is 14,432 nucleotides long. Its genomic organization is very similar to those of unsegmented plant rhabdoviruses, containing six open reading frames in the order 3'-N-P-P3-M-G-L-5'. The virus, which is provisionally named "black currant-associated rhabdovirus", is 41-52% identical in its genome nucleotide sequence to other nucleorhabdoviruses and may represent a new species in the genus Nucleorhabdovirus.

  18. High throughput platforms for structural genomics of integral membrane proteins.

    PubMed

    Mancia, Filippo; Love, James

    2011-08-01

Structural genomics approaches for integral membrane proteins have been postulated for over a decade, yet specific efforts lag years behind their soluble counterparts. Indeed, high throughput methodologies for production and characterization of prokaryotic integral membrane proteins are only now emerging, while large-scale efforts for eukaryotic ones are still in their infancy. Presented here is a review of recent literature on actively ongoing structural genomics initiatives for membrane proteins, with a focus on those implementing techniques intended to increase the rate of success for this class of macromolecules. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. High-throughput telomere length quantification by FISH and its application to human population studies.

    PubMed

    Canela, Andrés; Vera, Elsa; Klatt, Peter; Blasco, María A

    2007-03-27

    A major limitation of studies of the relevance of telomere length to cancer and age-related diseases in human populations and to the development of telomere-based therapies has been the lack of suitable high-throughput (HT) assays to measure telomere length. We have developed an automated HT quantitative telomere FISH platform, HT quantitative FISH (Q-FISH), which allows the quantification of telomere length as well as percentage of short telomeres in large human sample sets. We show here that this technique provides the accuracy and sensitivity to uncover associations between telomere length and human disease.

  20. Targeted Capture and High-Throughput Sequencing Using Molecular Inversion Probes (MIPs).

    PubMed

    Cantsilieris, Stuart; Stessman, Holly A; Shendure, Jay; Eichler, Evan E

    2017-01-01

    Molecular inversion probes (MIPs) in combination with massively parallel DNA sequencing represent a versatile, yet economical tool for targeted sequencing of genomic DNA. Several thousand genomic targets can be selectively captured using long oligonucleotides containing unique targeting arms and universal linkers. The ability to append sequencing adaptors and sample-specific barcodes allows large-scale pooling and subsequent high-throughput sequencing at relatively low cost per sample. Here, we describe a "wet bench" protocol detailing the capture and subsequent sequencing of >2000 genomic targets from 192 samples, representative of a single lane on the Illumina HiSeq 2000 platform.

  1. Real-time Full-spectral Imaging and Affinity Measurements from 50 Microfluidic Channels using Nanohole Surface Plasmon Resonance

    PubMed Central

    Lee, Si Hoon; Lindquist, Nathan C.; Wittenberg, Nathan J.; Jordan, Luke R.; Oh, Sang-Hyun

    2012-01-01

With recent advances in high-throughput proteomics and systems biology, there is a growing demand for new instruments that can precisely quantify a wide range of receptor-ligand binding kinetics in a high-throughput fashion. Here we demonstrate a surface plasmon resonance (SPR) imaging spectroscopy instrument capable of extracting binding kinetics and affinities from 50 parallel microfluidic channels simultaneously. The instrument utilizes large-area (~cm²) metallic nanohole arrays as SPR sensing substrates and combines a broadband light source, a high-resolution imaging spectrometer and a low-noise CCD camera to extract spectral information from every channel in real time with a refractive index resolution of 7.7 × 10−6. To demonstrate the utility of our instrument for quantifying a wide range of biomolecular interactions, each parallel microfluidic channel is coated with a biomimetic supported lipid membrane containing ganglioside (GM1) receptors. The binding kinetics of cholera toxin b (CTX-b) to GM1 are then measured in a single experiment from 50 channels. By combining the highly parallel microfluidic device with large-area periodic nanohole array chips, our SPR imaging spectrometer system enables high-throughput, label-free, real-time SPR biosensing, and its full-spectral imaging capability combined with nanohole arrays could enable integration of SPR imaging with concurrent surface-enhanced Raman spectroscopy. PMID:22895607
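Extracting kinetics and affinity from SPR sensorgrams commonly relies on the 1:1 Langmuir relation k_obs = k_a·C + k_d: a linear fit of observed association rates against analyte concentration yields k_a (slope), k_d (intercept), and K_D = k_d/k_a. A minimal sketch with hypothetical rate constants (not the measured CTX-b/GM1 values):

```python
import numpy as np

# Hypothetical 1:1 binding parameters for illustration only.
ka_true, kd_true = 1.0e5, 1.0e-3          # association 1/(M*s), dissociation 1/s
concs = np.array([5, 10, 20, 40, 80]) * 1e-9   # analyte concentrations, M

# Observed association rates from each sensorgram (Langmuir 1:1 model).
kobs = ka_true * concs + kd_true

# Linear fit of k_obs vs C recovers the rate constants.
ka_fit, kd_fit = np.polyfit(concs, kobs, 1)
KD = kd_fit / ka_fit                       # equilibrium dissociation constant, M
```

In practice k_obs would first be estimated by fitting each channel's association curve to an exponential; the linear step above is the same regardless of how k_obs is obtained.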

  2. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an XML-based language for modeling biological images and representing spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
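As a purely hypothetical illustration of the kind of record an XML-based imaging language might encode, the snippet below builds a tiny image/object/centroid hierarchy; the tag and attribute names are invented here and are not the paper's actual schema.

```python
import xml.etree.ElementTree as ET

# Invented tags: one image frame containing one segmented cell with a centroid.
img = ET.Element("image", {"id": "im001", "t": "0", "channel": "GFP"})
obj = ET.SubElement(img, "object", {"type": "cell", "label": "3"})
ET.SubElement(obj, "centroid", {"x": "120.5", "y": "88.2"})

xml_str = ET.tostring(img, encoding="unicode")
```

Records of this shape are what spatiotemporal event matching would operate on: queries over object types, positions, and time stamps across large image sets.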

  3. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    USDA-ARS?s Scientific Manuscript database

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  4. Applications of high throughput (combinatorial) methodologies to electronic, magnetic, optical, and energy-related materials

    NASA Astrophysics Data System (ADS)

    Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.

    2013-06-01

High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.

  5. New generation pharmacogenomic tools: a SNP linkage disequilibrium Map, validated SNP assay resource, and high-throughput instrumentation system for large-scale genetic studies.

    PubMed

    De La Vega, Francisco M; Dailey, David; Ziegle, Janet; Williams, Julie; Madden, Dawn; Gilbert, Dennis A

    2002-06-01

Since public and private efforts announced the first draft of the human genome last year, researchers have reported great numbers of single nucleotide polymorphisms (SNPs). We believe that the availability of well-mapped, quality SNP markers constitutes the gateway to a revolution in genetics and personalized medicine that will lead to better diagnosis and treatment of common complex disorders. A new generation of tools and public SNP resources for pharmacogenomic and genetic studies--specifically for candidate-gene, candidate-region, and whole-genome association studies--will form part of the new scientific landscape. This will only be possible through the greater accessibility of SNP resources and superior high-throughput instrumentation-assay systems that enable affordable, highly productive large-scale genetic studies. We are contributing to this effort by developing a high-quality linkage disequilibrium SNP marker map and an accompanying set of ready-to-use, validated SNP assays across every gene in the human genome. This effort incorporates both the public sequence and SNP data sources, and Celera Genomics' human genome assembly and enormous resource of physically mapped SNPs (approximately 4,000,000 unique records). This article discusses our approach and methodology for designing the map, choosing quality SNPs, designing and validating these assays, and obtaining population frequency of the polymorphisms. We also discuss an advanced, high-performance SNP assay chemistry--a new generation of the TaqMan probe-based, 5' nuclease assay--and a high-throughput instrumentation-software system for large-scale genotyping. We provide the new SNP map and validation information, validated SNP assays and reagents, and instrumentation systems as a novel resource for genetic discoveries.

  6. Complexity Optimization and High-Throughput Low-Latency Hardware Implementation of a Multi-Electrode Spike-Sorting Algorithm

    PubMed Central

    Dragas, Jelena; Jäckel, David; Hierlemann, Andreas; Franke, Felix

    2017-01-01

    Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction. PMID:25415989
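The data-reduction argument above (transmit spike time stamps rather than raw traces) can be made concrete with a toy detector. The sampling rate, spike amplitudes, threshold rule, and bit widths below are illustrative assumptions, not the paper's hardware parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 20000                         # Hz, per-electrode sampling rate (assumed)
trace = rng.normal(0.0, 1.0, fs)   # 1 s of noise, unit variance
spike_times = np.array([2000, 9000, 15500])
trace[spike_times] -= 10.0         # inject three negative-going spikes

# Robust threshold from the median absolute deviation of the trace.
thresh = -5.0 * np.median(np.abs(trace)) / 0.6745
detected = np.flatnonzero(trace < thresh)   # spike time stamps to transmit

# Bandwidth: raw 16-bit samples vs 32-bit time stamps per detected spike.
raw_bits = trace.size * 16
stamp_bits = detected.size * 32
reduction = raw_bits / stamp_bits
```

For sparse firing the stamp stream is orders of magnitude smaller than the raw trace, which is the bandwidth saving that makes thousand-electrode recordings tractable.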

  7. Complexity optimization and high-throughput low-latency hardware implementation of a multi-electrode spike-sorting algorithm.

    PubMed

    Dragas, Jelena; Jackel, David; Hierlemann, Andreas; Franke, Felix

    2015-03-01

    Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction.

  8. Projection Reduction Exposure with Variable Axis Immersion Lenses (PREVAIL)-A High Throughput E-Beam Projection Approach for Next Generation Lithography

    NASA Astrophysics Data System (ADS)

    Pfeiffer, Hans

    1999-12-01

    Projection reduction exposure with variable axis immersion lenses (PREVAIL) represents the high throughput e-beam projection approach to next generation lithography (NGL), which IBM is pursuing in cooperation with Nikon Corporation as an alliance partner. This paper discusses the challenges and accomplishments of the PREVAIL project. The supreme challenge facing all e-beam lithography approaches has been and still is throughput. Since the throughput of e-beam projection systems is severely limited by the available optical field size, the key to success is the ability to overcome this limitation. The PREVAIL technique overcomes field-limiting off-axis aberrations through the use of variable axis lenses, which electronically shift the optical axis simultaneously with the deflected beam, so that the beam effectively remains on axis. The resist images obtained with the proof-of-concept (POC) system demonstrate that PREVAIL effectively eliminates off-axis aberrations affecting both the resolution and placement accuracy of pixels. As part of the POC system, a high-emittance gun has been developed to provide uniform illumination of the patterned subfield and to fill the large-numerical-aperture projection optics designed to significantly reduce beam blur caused by Coulomb interaction.

  9. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    PubMed

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  10. Luminex and other multiplex high throughput technologies for the identification of, and host response to, environmental triggers of type 1 diabetes.

    PubMed

    Purohit, Sharad; Sharma, Ashok; She, Jin-Xiong

    2015-01-01

    Complex interactions between a series of environmental factors and genes result in progression to clinical type 1 diabetes in genetically susceptible individuals. Despite several decades of research in the area, these interactions remain poorly understood. Several studies have yielded associations of certain foods, infections, and immunizations with the onset and progression of diabetes autoimmunity, but most findings are still inconclusive. Environmental triggers are difficult to identify mainly due to (i) the large number and complex nature of environmental exposures, including bacteria, viruses, dietary factors, and environmental pollutants, (ii) reliance on low-throughput technology, (iii) limited effort in quantifying the host response, (iv) the long silent period between the exposure and clinical onset of T1D, which may lead to loss of the exposure fingerprints, and (v) limited sample sets. Recent developments in multiplex technologies have enabled systematic evaluation of different classes of molecules or macroparticles in a high-throughput manner. However, the use of multiplex assays in type 1 diabetes research is limited to cytokine assays. In this review, we discuss the potential use of multiplex high-throughput technologies in the identification of environmental triggers and the host response in type 1 diabetes.

  11. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
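
    The cost comparison underlying this study can be illustrated with simple amortization arithmetic. The sketch below is a generic model with invented figures, not the paper's actual benchmark numbers.

```python
# Hypothetical cost comparison: on-demand cloud nodes vs. an amortized local
# cluster for a single large analysis. All dollar figures are invented.

def cloud_cost(node_hours, price_per_node_hour):
    """Pay-as-you-go cost for one analysis."""
    return node_hours * price_per_node_hour

def cluster_cost_per_run(capex, runs_over_lifetime, power_per_run):
    """Amortized hardware cost plus power for one analysis."""
    return capex / runs_over_lifetime + power_per_run

cloud = cloud_cost(node_hours=200, price_per_node_hour=0.50)
local = cluster_cost_per_run(capex=50_000, runs_over_lifetime=100,
                             power_per_run=40)
print(f"cloud: ${cloud:.0f}, local: ${local:.0f} per run")
```

    The crossover depends entirely on utilization: a cluster that runs analyses continuously amortizes its capital cost over far more runs than one bought for occasional large jobs.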

  12. High-throughput, pooled sequencing identifies mutations in NUBPL and FOXRED1 in human complex I deficiency

    PubMed Central

    Calvo, Sarah E; Tucker, Elena J; Compton, Alison G; Kirby, Denise M; Crawford, Gabriel; Burtt, Noel P; Rivas, Manuel A; Guiducci, Candace; Bruno, Damien L; Goldberger, Olga A; Redman, Michelle C; Wiltshire, Esko; Wilson, Callum J; Altshuler, David; Gabriel, Stacey B; Daly, Mark J; Thorburn, David R; Mootha, Vamsi K

    2010-01-01

    Discovering the molecular basis of mitochondrial respiratory chain disease is challenging given the large number of both mitochondrial and nuclear genes involved. We report a strategy of focused candidate gene prediction, high-throughput sequencing, and experimental validation to uncover the molecular basis of mitochondrial complex I (CI) disorders. We created five pools of DNA from a cohort of 103 patients and then performed deep sequencing of 103 candidate genes to spotlight 151 rare variants predicted to impact protein function. We used confirmatory experiments to establish genetic diagnoses in 22% of previously unsolved cases, and discovered that defects in NUBPL and FOXRED1 can cause CI deficiency. Our study illustrates how large-scale sequencing, coupled with functional prediction and experimental validation, can reveal novel disease-causing mutations in individual patients. PMID:20818383
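
    The pooled design works because each pool dilutes a variant to a predictable allele fraction, which in turn sets the sequencing depth needed to detect it. A minimal sketch of that arithmetic, assuming roughly 21 patients per pool (103 patients over five pools) and an assumed read-count threshold:

```python
# Pool dilution arithmetic: a heterozygous variant carried by one patient
# contributes 1 of 2*pool_size alleles, which sets the depth needed to see
# it. Pool size ~21 follows from 103 patients split across five pools; the
# variant-read threshold is an assumption.

def expected_allele_fraction(carriers, pool_size, heterozygous=True):
    """Fraction of reads expected to show the variant within one pool."""
    variant_alleles = carriers * (1 if heterozygous else 2)
    return variant_alleles / (2 * pool_size)

def min_depth(allele_fraction, min_variant_reads=10):
    """Coverage needed to expect min_variant_reads variant-supporting reads."""
    return min_variant_reads / allele_fraction

af = expected_allele_fraction(carriers=1, pool_size=21)
print(f"expected allele fraction: {af:.3f}")                # 0.024
print(f"depth for 10 variant reads: {min_depth(af):.0f}x")  # 420x
```

    This is why pooled candidate-gene sequencing requires deep coverage: a single heterozygous carrier shows up in only ~2% of reads over that position.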

  13. [Methods of high-throughput plant phenotyping for large-scale breeding and genetic experiments].

    PubMed

    Afonnikov, D A; Genaev, M A; Doroshkov, A V; Komyshev, E G; Pshenichnikova, T A

    2016-07-01

    Phenomics is a field at the junction of biology and informatics that addresses the rapid, accurate estimation of plant phenotypes; it has developed quickly in response to the need to analyze phenotypic characteristics in large-scale genetic and breeding experiments in plants. It is based on computer image analysis and the integration of biological data. Owing to automation, new approaches make it possible to considerably accelerate the estimation of phenotypic characteristics, to increase its accuracy, and to remove the subjectivity inherent to human assessment. The review presents the main technologies of high-throughput plant phenotyping under both controlled and field conditions, their advantages and disadvantages, and the prospects of their use for the efficient solution of problems in plant genetics and breeding.

  14. Single-cell genome sequencing at ultra-high-throughput with microfluidic droplet barcoding.

    PubMed

    Lan, Freeman; Demaree, Benjamin; Ahmed, Noorsher; Abate, Adam R

    2017-07-01

    The application of single-cell genome sequencing to large cell populations has been hindered by technical challenges in isolating single cells during genome preparation. Here we present single-cell genomic sequencing (SiC-seq), which uses droplet microfluidics to isolate, fragment, and barcode the genomes of single cells, followed by Illumina sequencing of pooled DNA. We demonstrate ultra-high-throughput sequencing of >50,000 cells per run in a synthetic community of Gram-negative and Gram-positive bacteria and fungi. The sequenced genomes can be sorted in silico based on characteristic sequences. We use this approach to analyze the distributions of antibiotic-resistance genes, virulence factors, and phage sequences in microbial communities from an environmental sample. The ability to routinely sequence large populations of single cells will enable the de-convolution of genetic heterogeneity in diverse cell populations.
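
    The in silico sorting step can be pictured as grouping reads by their droplet barcode to reconstruct per-cell read sets, then binning cells by a characteristic marker sequence. The sketch below uses toy barcodes, reads, and a toy marker; real SiC-seq processing involves alignment and error-tolerant barcode matching.

```python
from collections import defaultdict

# Toy version of in silico sorting: group reads by droplet barcode to
# reconstruct per-cell read sets, then bin cells by a marker sequence.
# Barcodes, reads, and the marker are invented.

def sort_by_barcode(reads):
    """reads: iterable of (barcode, sequence) -> {barcode: [sequences]}."""
    cells = defaultdict(list)
    for barcode, seq in reads:
        cells[barcode].append(seq)
    return cells

def cells_with_marker(cells, marker):
    """Barcodes whose read set contains the characteristic sequence."""
    return {bc for bc, seqs in cells.items() if any(marker in s for s in seqs)}

reads = [("AAAC", "TTGACGGT"), ("AAAC", "CCGTACGT"),
         ("GGTA", "TTTTACGT"), ("GGTA", "GACGGTAA")]
print(cells_with_marker(sort_by_barcode(reads), "CCGTA"))  # {'AAAC'}
```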

  15. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    PubMed

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next-generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, hence the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrate the superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
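
    The over-correction problem described here can be seen in the simplest normal-normal shrinkage estimator: the posterior mean pulls every feature toward the grand mean by the same factor, so genuinely extreme effects are attenuated as hard as noisy ones. This generic illustration is not the adaptiveHM method itself.

```python
# Normal-normal shrinkage: the posterior mean is a precision-weighted blend
# of the observation and the prior (grand) mean. A fixed weight attenuates
# extreme-but-real effects as strongly as noisy ones -- the over-correction.
# Generic illustration only; not the adaptiveHM method.

def shrink(obs, prior_mean, prior_var, noise_var):
    """Posterior mean under a normal prior and normal observation noise."""
    w = prior_var / (prior_var + noise_var)  # weight on the observation
    return w * obs + (1 - w) * prior_mean

# Typical feature: mild pull toward the grand mean.
print(shrink(obs=1.0, prior_mean=0.0, prior_var=1.0, noise_var=1.0))  # 0.5
# Extreme but real effect: shrinkage halves it too.
print(shrink(obs=8.0, prior_mean=0.0, prior_var=1.0, noise_var=1.0))  # 4.0
```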

  16. ChemHTPS - A virtual high-throughput screening program suite for the chemical and materials sciences

    NASA Astrophysics Data System (ADS)

    Afzal, Mohammad Atif Faiz; Evangelista, William; Hachmann, Johannes

    The discovery of new compounds, materials, and chemical reactions with exceptional properties is key to the grand challenges in innovation, energy, and sustainability. This process can be dramatically accelerated by means of virtual high-throughput screening (HTPS) of large-scale candidate libraries. The resulting data can further be used to study the underlying structure-property relationships and thus facilitate rational design capability. This approach has been used extensively for many years in the drug discovery community. However, the lack of openly available virtual HTPS tools has limited the use of these techniques in various other applications such as photovoltaics, optoelectronics, and catalysis. We have therefore developed ChemHTPS, a general-purpose, comprehensive, and user-friendly suite that allows users to efficiently perform large in silico modeling studies and high-throughput analyses in these applications. ChemHTPS also includes a massively parallel molecular library generator, which offers a multitude of options to customize and restrict the scope of the enumerated chemical space and thus tailor it to the demands of specific applications. To streamline the non-combinatorial exploration of chemical space, we incorporate genetic algorithms into the framework. In addition to implementing smarter algorithms, we also focus on ease of use, workflow, and code integration to make this technology more accessible to the community.
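
    The core of a combinatorial library generator can be sketched as enumerating substitution sites on a scaffold over building-block lists. The fragment below uses toy placeholder strings rather than real chemical representations, and is not ChemHTPS code.

```python
from itertools import product

# Toy combinatorial enumerator: fill each substitution site on a scaffold
# from its building-block list. Placeholder strings stand in for real
# chemical representations; this is not ChemHTPS code.

def enumerate_library(scaffold, sites):
    """scaffold: format string with {r1}, {r2}, ...; sites: {name: blocks}."""
    names = list(sites)
    for combo in product(*(sites[n] for n in names)):
        yield scaffold.format(**dict(zip(names, combo)))

sites = {"r1": ["F", "Cl", "OMe"], "r2": ["H", "Me"]}
library = list(enumerate_library("core({r1})({r2})", sites))
print(len(library), library[0])  # 6 core(F)(H)
```

    The combinatorial size is the product of the site list lengths, which is why restricting the enumerated space (as the abstract describes) matters long before any quantum-chemistry calculation runs.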

  17. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis

    PubMed Central

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S.

    2016-01-01

    Modern high-throughput biotechnologies such as microarray and next-generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, hence the classical ‘large p, small n’ problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrate the superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package “adaptiveHM”, which is freely available from https://github.com/benliemory/adaptiveHM. PMID:28919931

  18. Strategies for high-throughput focused-beam ptychography

    DOE PAGES

    Jacobsen, Chris; Deng, Junjing; Nashed, Youssef

    2017-08-08

    X-ray ptychography is being utilized for a wide range of imaging experiments with a resolution beyond the limit of the X-ray optics used. Introducing a parameter for the ptychographic resolution gain G_p (the ratio of the beam size to the achieved pixel size in the reconstructed image), strategies for data sampling and for increasing imaging throughput when the specimen is at the focus of an X-ray beam are considered. Finally, the tradeoffs between large and small illumination spots are examined.
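
    The resolution gain defined in this abstract is a simple ratio, and its tradeoff with scan time follows directly: a larger beam covers the field of view in fewer scan positions but must achieve a higher G_p to reach the same pixel size. A sketch with illustrative numbers only:

```python
# G_p = beam size / reconstructed pixel size, per the abstract's definition.
# The scan-count helper shows the throughput side of the tradeoff; all
# numerical values are illustrative.

def resolution_gain(beam_size_nm, pixel_size_nm):
    return beam_size_nm / pixel_size_nm

def scan_positions_per_axis(fov_um, step_nm):
    """Raster positions along one axis for a given field of view and step."""
    return int(fov_um * 1000 / step_nm)

print(resolution_gain(beam_size_nm=50.0, pixel_size_nm=5.0))  # 10.0
n = scan_positions_per_axis(fov_um=10, step_nm=25)
print(n * n, "scan positions")                                # 160000
```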

  19. Strategies for high-throughput focused-beam ptychography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, Chris; Deng, Junjing; Nashed, Youssef

    X-ray ptychography is being utilized for a wide range of imaging experiments with a resolution beyond the limit of the X-ray optics used. Introducing a parameter for the ptychographic resolution gain G_p (the ratio of the beam size to the achieved pixel size in the reconstructed image), strategies for data sampling and for increasing imaging throughput when the specimen is at the focus of an X-ray beam are considered. Finally, the tradeoffs between large and small illumination spots are examined.

  20. Simple fluorescence-based high throughput cell viability assay for filamentous fungi.

    PubMed

    Chadha, S; Kale, S P

    2015-09-01

    Filamentous fungi are important model organisms for understanding eukaryotic processes and have been frequently exploited in research and industry. These fungi are also causative agents of serious diseases in plants and humans. Disease management strategies include in vitro susceptibility testing of fungal pathogens to environmental conditions and antifungal agents. Conventional methods used for antifungal susceptibilities are cumbersome, time-consuming and not suitable for large-scale analysis. Here, we report a rapid, high-throughput, microplate-based fluorescence method for investigating the toxicity of antifungal and stress (osmotic, salt and oxidative) agents on Magnaporthe oryzae and compare it with the agar dilution method. This bioassay is optimized for the reduction of resazurin to fluorescent resorufin by the fungal hyphae. The resazurin bioassay showed inhibitory rates and IC50 values comparable to the agar dilution method and to previously reported IC50 or MIC values for M. oryzae and other fungi. The present method can screen a range of test agents from different chemical classes with different modes of action for antifungal activity in a simple, sensitive, time- and cost-effective manner. A simple fluorescence-based high-throughput method is developed to test the effects of stress and antifungal agents on the viability of the filamentous fungus Magnaporthe oryzae. This resazurin fluorescence assay can detect inhibitory effects comparable to those obtained using the growth inhibition assay, with the added advantages of simplicity and time and cost effectiveness. This high-throughput viability assay has great potential for large-scale screening of chemical libraries of antifungal agents, for evaluating the effects of environmental conditions, and for hyphal kinetic studies in mutant and natural populations of filamentous fungi. © 2015 The Society for Applied Microbiology.
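
    The IC50 values mentioned here can be extracted from a fluorescence dose-response series by computing percent inhibition against untreated controls and interpolating the 50% crossing on a log-concentration scale. The data values below are invented for illustration; a full analysis would typically fit a logistic model instead.

```python
import math

# Sketch: percent inhibition from raw fluorescence, then log-linear
# interpolation of the 50% crossing to estimate an IC50. Data are invented.

def percent_inhibition(signal, untreated, blank):
    """Inhibition relative to untreated controls, background-corrected."""
    return 100.0 * (1 - (signal - blank) / (untreated - blank))

def ic50(concs, inhibitions):
    """Concentration at 50% inhibition, interpolated on a log10 scale."""
    pairs = list(zip(concs, inhibitions))
    for (c1, i1), (c2, i2) in zip(pairs, pairs[1:]):
        if i1 < 50 <= i2:
            frac = (50 - i1) / (i2 - i1)
            return 10 ** (math.log10(c1)
                          + frac * (math.log10(c2) - math.log10(c1)))
    return None  # curve never crosses 50%

print(percent_inhibition(signal=55.0, untreated=100.0, blank=10.0))  # 50.0
concs = [0.1, 1.0, 10.0, 100.0]  # ug/mL, hypothetical dilution series
inh = [5.0, 20.0, 80.0, 95.0]    # percent inhibition at each dose
print(f"IC50 ~ {ic50(concs, inh):.1f} ug/mL")
```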

  1. Optical designs of reflection and refraction collection optics for a JT-60SA core Thomson scattering system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tojo, H.; Hatae, T.; Hamano, T.

    2013-09-15

    Collection optics for core measurements in a JT-60SA Thomson scattering system were designed. The collection optics will be installed in a limited space and have a wide field of view and wide wavelength range. Two types of the optics are therefore suggested: refraction and reflection types. The reflection system, with a large primary mirror, avoids large chromatic aberrations. Because the size limit of the primary mirror and vignetting due to the secondary mirror affect the total collection throughput, conditions that provide the high throughput are found through an optimization. A refraction system with four lenses forming an Ernostar system is also employed. The use of high-refractive-index glass materials enhances the freedom of the lens curvatures, resulting in suppression of the spherical and coma aberration. Moreover, sufficient throughput can be achieved, even with smaller lenses than that of a previous design given in [H. Tojo, T. Hatae, T. Sakuma, T. Hamano, K. Itami, Y. Aida, S. Suitoh, and D. Fujie, Rev. Sci. Instrum. 81, 10D539 (2010)]. The optical resolutions of the reflection and refraction systems are both sufficient for understanding the spatial structures in plasma. In particular, the spot sizes at the image of the optics are evaluated as ∼0.3 mm and ∼0.4 mm, respectively. The throughput for the two systems, including the pupil size and transmissivity, are also compared. The results show that good measurement accuracy (<10%) even at high electron temperatures (<30 keV) can be expected in the refraction system.

  2. Optical designs of reflection and refraction collection optics for a JT-60SA core Thomson scattering system.

    PubMed

    Tojo, H; Hatae, T; Hamano, T; Sakuma, T; Itami, K

    2013-09-01

    Collection optics for core measurements in a JT-60SA Thomson scattering system were designed. The collection optics will be installed in a limited space and have a wide field of view and wide wavelength range. Two types of the optics are therefore suggested: refraction and reflection types. The reflection system, with a large primary mirror, avoids large chromatic aberrations. Because the size limit of the primary mirror and vignetting due to the secondary mirror affect the total collection throughput, conditions that provide the high throughput are found through an optimization. A refraction system with four lenses forming an Ernostar system is also employed. The use of high-refractive-index glass materials enhances the freedom of the lens curvatures, resulting in suppression of the spherical and coma aberration. Moreover, sufficient throughput can be achieved, even with smaller lenses than that of a previous design given in [H. Tojo, T. Hatae, T. Sakuma, T. Hamano, K. Itami, Y. Aida, S. Suitoh, and D. Fujie, Rev. Sci. Instrum. 81, 10D539 (2010)]. The optical resolutions of the reflection and refraction systems are both sufficient for understanding the spatial structures in plasma. In particular, the spot sizes at the image of the optics are evaluated as ~0.3 mm and ~0.4 mm, respectively. The throughput for the two systems, including the pupil size and transmissivity, are also compared. The results show that good measurement accuracy (<10%) even at high electron temperatures (<30 keV) can be expected in the refraction system.

  3. Multinode acoustic focusing for parallel flow cytometry

    PubMed Central

    Piyasena, Menake E.; Suthanthiraraj, Pearlson P. Austin; Applegate, Robert W.; Goumas, Andrew M.; Woods, Travis A.; López, Gabriel P.; Graves, Steven W.

    2012-01-01

    Flow cytometry can simultaneously measure and analyze multiple properties of single cells or particles with high sensitivity and precision. Yet, conventional flow cytometers have fundamental limitations with regard to analyzing particles larger than about 70 microns, analyzing at flow rates greater than a few hundred microliters per minute, and providing analysis rates greater than 50,000 per second. To overcome these limits, we have developed multi-node acoustic focusing flow cells that can position particles (as small as a red blood cell and as large as 107 microns in diameter) into as many as 37 parallel flow streams. We demonstrate the potential of such flow cells for the development of high-throughput, parallel flow cytometers by precision focusing of flow cytometry alignment microspheres and red blood cells, and by the analysis of a CD4+ cellular immunophenotyping assay. This approach will have significant impact on the creation of high-throughput flow cytometers for rare cell detection applications (e.g., circulating tumor cells), applications requiring large particle analysis, and high-volume flow cytometry. PMID:22239072
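
    The multinode geometry follows from the standing-wave resonance condition: a channel of width w driven so that w = n·λ/2 develops n pressure nodes, i.e. n parallel particle streams. A sketch of that relation; the channel width and sound speed below are illustrative assumptions, not the paper's device parameters.

```python
# Resonance condition for a half-wave acoustic resonator: channel width
# w = n * (c / f) / 2, so f = n * c / (2 * w). n half-wavelengths give n
# pressure nodes, i.e. n parallel focused streams. Values are illustrative.

def drive_frequency_hz(channel_width_m, n_nodes, c_m_per_s=1480.0):
    """Frequency that places n_nodes pressure nodes across the channel."""
    return n_nodes * c_m_per_s / (2 * channel_width_m)

# A 2.3 mm channel driven for 37 nodes (speed of sound in water ~1480 m/s):
f = drive_frequency_hz(2.3e-3, 37)
print(f"{f / 1e6:.1f} MHz")  # ~11.9 MHz
```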

  4. Variant-aware saturating mutagenesis using multiple Cas9 nucleases identifies regulatory elements at trait-associated loci.

    PubMed

    Canver, Matthew C; Lessard, Samuel; Pinello, Luca; Wu, Yuxuan; Ilboudo, Yann; Stern, Emily N; Needleman, Austen J; Galactéros, Frédéric; Brugnara, Carlo; Kutlar, Abdullah; McKenzie, Colin; Reid, Marvin; Chen, Diane D; Das, Partha Pratim; A Cole, Mitchel; Zeng, Jing; Kurita, Ryo; Nakamura, Yukio; Yuan, Guo-Cheng; Lettre, Guillaume; Bauer, Daniel E; Orkin, Stuart H

    2017-04-01

    Cas9-mediated, high-throughput, saturating in situ mutagenesis permits fine-mapping of function across genomic segments. Disease- and trait-associated variants identified in genome-wide association studies largely cluster at regulatory loci. Here we demonstrate the use of multiple designer nucleases and variant-aware library design to interrogate trait-associated regulatory DNA at high resolution. We developed a computational tool for the creation of saturating-mutagenesis libraries with single or multiple nucleases with incorporation of variants. We applied this methodology to the HBS1L-MYB intergenic region, which is associated with red-blood-cell traits, including fetal hemoglobin levels. This approach identified putative regulatory elements that control MYB expression. Analysis of genomic copy number highlighted potential false-positive regions, thus emphasizing the importance of off-target analysis in the design of saturating-mutagenesis experiments. Together, these data establish a widely applicable high-throughput and high-resolution methodology to identify minimal functional sequences within large disease- and trait-associated regions.
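
    The library-design step reduces, at its core, to enumerating nuclease target sites along the region of interest. The sketch below scans only the forward strand for SpCas9 NGG PAMs and takes the 20-nt protospacer upstream; the published tool additionally handles the reverse strand, multiple nucleases, and known variants.

```python
# Forward-strand scan for SpCas9 NGG PAM sites; the guide (protospacer) is
# the 20 nt immediately 5' of the PAM. A real saturating-mutagenesis design
# also scans the reverse strand and accounts for known variants.

def forward_guides(seq, protospacer_len=20):
    guides = []
    for i in range(protospacer_len, len(seq) - 2):
        if seq[i + 1 : i + 3] == "GG":  # PAM occupies positions i..i+2 (NGG)
            guides.append(seq[i - protospacer_len : i])
    return guides

toy = "A" * 20 + "TGG" + "C" * 5  # one PAM ("TGG") after a 20-nt stretch
print(forward_guides(toy))        # ['AAAAAAAAAAAAAAAAAAAA']
```

    Tiling density, and hence mapping resolution, is set by how often PAMs occur, which is one motivation for using multiple nucleases with different PAM requirements.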

  5. Bibliometrics for Social Validation.

    PubMed

    Hicks, Daniel J

    2016-01-01

    This paper introduces a bibliometric, citation network-based method for assessing the social validation of novel research, and applies this method to the development of high-throughput toxicology research at the US Environmental Protection Agency. Social validation refers to the acceptance of novel research methods by a relevant scientific community; it is formally independent of the technical validation of methods, and is frequently studied in history, philosophy, and social studies of science using qualitative methods. The quantitative methods introduced here find that high-throughput toxicology methods are spread throughout a large and well-connected research community, which suggests high social validation. Further assessment of social validation, involving mixed qualitative and quantitative methods, is discussed in the conclusion.

  6. Bibliometrics for Social Validation

    PubMed Central

    2016-01-01

    This paper introduces a bibliometric, citation network-based method for assessing the social validation of novel research, and applies this method to the development of high-throughput toxicology research at the US Environmental Protection Agency. Social validation refers to the acceptance of novel research methods by a relevant scientific community; it is formally independent of the technical validation of methods, and is frequently studied in history, philosophy, and social studies of science using qualitative methods. The quantitative methods introduced here find that high-throughput toxicology methods are spread throughout a large and well-connected research community, which suggests high social validation. Further assessment of social validation, involving mixed qualitative and quantitative methods, is discussed in the conclusion. PMID:28005974

  7. ARQiv-HTS, a versatile whole-organism screening platform enabling in vivo drug discovery at high-throughput rates

    PubMed Central

    White, David T; Eroglu, Arife Unal; Wang, Guohua; Zhang, Liyun; Sengupta, Sumitra; Ding, Ding; Rajpurohit, Surendra K; Walker, Steven L; Ji, Hongkai; Qian, Jiang; Mumm, Jeff S

    2017-01-01

    The zebrafish has emerged as an important model for whole-organism small-molecule screening. However, most zebrafish-based chemical screens have achieved only mid-throughput rates. Here we describe a versatile whole-organism drug discovery platform that can achieve true high-throughput screening (HTS) capacities. This system combines our automated reporter quantification in vivo (ARQiv) system with customized robotics, and is termed ‘ARQiv-HTS’. We detail the process of establishing and implementing ARQiv-HTS: (i) assay design and optimization, (ii) calculation of sample size and hit criteria, (iii) large-scale egg production, (iv) automated compound titration, (v) dispensing of embryos into microtiter plates, and (vi) reporter quantification. We also outline what we see as best practice strategies for leveraging the power of ARQiv-HTS for zebrafish-based drug discovery, and address technical challenges of applying zebrafish to large-scale chemical screens. Finally, we provide a detailed protocol for a recently completed inaugural ARQiv-HTS effort, which involved the identification of compounds that elevate insulin reporter activity. Compounds that increased the number of insulin-producing pancreatic beta cells represent potential new therapeutics for diabetic patients. For this effort, individual screening sessions took 1 week to conclude, and sessions were performed iteratively approximately every other day to increase throughput. At the conclusion of the screen, more than a half million drug-treated larvae had been evaluated. Beyond this initial example, however, the ARQiv-HTS platform is adaptable to almost any reporter-based assay designed to evaluate the effects of chemical compounds in living small-animal models. ARQiv-HTS thus enables large-scale whole-organism drug discovery for a variety of model species and from numerous disease-oriented perspectives. PMID:27831568
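
    The "hit criteria" step in an HTS campaign is commonly framed with a control-separation statistic. The Z'-factor below is a generic stand-in for such a statistic, with made-up control readings; the published ARQiv protocol defines its own sample-size and hit-criteria calculations.

```python
import statistics

# Z'-factor: a common control-separation statistic for judging HTS assay
# quality (values above ~0.5 are conventionally considered robust).
# Control readings are invented.

def z_prime(pos_controls, neg_controls):
    sp = statistics.stdev(pos_controls)
    sn = statistics.stdev(neg_controls)
    mp = statistics.mean(pos_controls)
    mn = statistics.mean(neg_controls)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

pos = [100, 104, 98, 102, 96]  # positive-control signal, arbitrary units
neg = [10, 12, 9, 11, 8]       # negative-control signal
print(f"Z' = {z_prime(pos, neg):.2f}")
```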

  8. US EPA - ToxCast and the Tox21 program: perspectives

    EPA Science Inventory

    ToxCast is a large-scale project being conducted by the U.S. EPA to screen ~2000 chemicals against a large battery of in vitro high-throughput screening (HTS) assays. ToxCast is complemented by the Tox21 project being jointly carried out by the U.S. NIH Chemical Genomics Center (...

  9. Genome sequencing in microfabricated high-density picolitre reactors.

    PubMed

    Margulies, Marcel; Egholm, Michael; Altman, William E; Attiya, Said; Bader, Joel S; Bemben, Lisa A; Berka, Jan; Braverman, Michael S; Chen, Yi-Ju; Chen, Zhoutao; Dewell, Scott B; Du, Lei; Fierro, Joseph M; Gomes, Xavier V; Godwin, Brian C; He, Wen; Helgesen, Scott; Ho, Chun Heen; Ho, Chun He; Irzyk, Gerard P; Jando, Szilveszter C; Alenquer, Maria L I; Jarvie, Thomas P; Jirage, Kshama B; Kim, Jong-Bum; Knight, James R; Lanza, Janna R; Leamon, John H; Lefkowitz, Steven M; Lei, Ming; Li, Jing; Lohman, Kenton L; Lu, Hong; Makhijani, Vinod B; McDade, Keith E; McKenna, Michael P; Myers, Eugene W; Nickerson, Elizabeth; Nobile, John R; Plant, Ramona; Puc, Bernard P; Ronan, Michael T; Roth, George T; Sarkis, Gary J; Simons, Jan Fredrik; Simpson, John W; Srinivasan, Maithreyan; Tartaro, Karrie R; Tomasz, Alexander; Vogt, Kari A; Volkmer, Greg A; Wang, Shally H; Wang, Yong; Weiner, Michael P; Yu, Pengguang; Begley, Richard F; Rothberg, Jonathan M

    2005-09-15

    The proliferation of large-scale DNA-sequencing projects in recent years has driven a search for alternative methods to reduce time and cost. Here we describe a scalable, highly parallel sequencing system with raw throughput significantly greater than that of state-of-the-art capillary electrophoresis instruments. The apparatus uses a novel fibre-optic slide of individual wells and is able to sequence 25 million bases, at 99% or better accuracy, in one four-hour run. To achieve an approximately 100-fold increase in throughput over current Sanger sequencing technology, we have developed an emulsion method for DNA amplification and an instrument for sequencing by synthesis using a pyrosequencing protocol optimized for solid support and picolitre-scale volumes. Here we show the utility, throughput, accuracy and robustness of this system by shotgun sequencing and de novo assembly of the Mycoplasma genitalium genome with 96% coverage at 99.96% accuracy in one run of the machine.
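
    The throughput figures quoted above reduce to simple arithmetic: 25 million bases per four-hour run, and an approximately 100-fold factor over Sanger capillary instruments. The comparator rate below is back-derived from that stated factor, not an independent Sanger specification.

```python
# The abstract's numbers as arithmetic: 25 Mb per 4 h run, and the ~100x
# factor over Sanger capillary sequencing. The Sanger rate is back-derived
# from that stated factor, not an independent specification.

def bases_per_hour(bases, hours):
    return bases / hours

run_rate = bases_per_hour(25_000_000, 4)  # 6.25 Mb/h
sanger_rate = run_rate / 100              # implied comparator rate
print(f"{run_rate / 1e6:.2f} Mb/h vs {sanger_rate / 1e3:.1f} kb/h")
```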

  10. Potential of dynamically harmonized Fourier transform ion cyclotron resonance cell for high-throughput metabolomics fingerprinting: control of data quality.

    PubMed

    Habchi, Baninia; Alves, Sandra; Jouan-Rimbaud Bouveresse, Delphine; Appenzeller, Brice; Paris, Alain; Rutledge, Douglas N; Rathahao-Paris, Estelle

    2018-01-01

    Due to the presence of pollutants in the environment and food, the assessment of human exposure is required. This necessitates high-throughput approaches enabling large-scale analysis and, as a consequence, the use of high-performance analytical instruments to obtain highly informative metabolomic profiles. In this study, direct introduction mass spectrometry (DIMS) was performed using a Fourier transform ion cyclotron resonance (FT-ICR) instrument equipped with a dynamically harmonized cell. Data quality was evaluated based on mass resolving power (RP), mass measurement accuracy, and ion intensity drifts from repeated injections of a quality control sample (QC) along the analytical process. The large DIMS data size entails the use of bioinformatic tools for the automatic selection of common ions found in all QC injections and for robustness assessment and correction of eventual technical drifts. RP values greater than 10^6 and mass measurement accuracies better than 1 ppm were obtained in broadband mode, resulting in the detection of isotopic fine structure. Hence, a very accurate relative isotopic mass defect (RΔm) value was calculated. This significantly reduces the number of elemental composition (EC) candidates and greatly improves compound annotation. A very satisfactory estimate of the repeatability of both peak intensity and mass measurement was demonstrated. Although a non-negligible ion intensity drift was observed for negative-ion-mode data, a normalization procedure was easily applied to correct this phenomenon. This study illustrates the performance and robustness of the dynamically harmonized FT-ICR cell for large-scale high-throughput metabolomic analyses under routine conditions. Graphical abstract: Analytical performance of an FT-ICR instrument equipped with a dynamically harmonized cell.
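
    Mass measurement accuracy in ppm, the figure of merit quoted above, is a one-line calculation. The example m/z below (protonated glucose, theoretical value approximately 181.0707) is an illustrative assumption, not data from the study.

```python
# Parts-per-million mass error: sub-ppm accuracy is what keeps the list of
# candidate elemental compositions short. Example values are illustrative.

def mass_error_ppm(measured_mz, theoretical_mz):
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Hypothetical measurement of protonated glucose (theoretical m/z ~181.0707):
print(f"{mass_error_ppm(181.07070, 181.07066):+.2f} ppm")  # sub-ppm error
```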

  11. Improvement of uridine production of Bacillus subtilis by atmospheric and room temperature plasma mutagenesis and high-throughput screening

    PubMed Central

    Li, Guoliang; Yuan, Hui; Zhang, Hongchao; Li, Yanjun; Xie, Xixian; Chen, Ning

    2017-01-01

    In the present study, a novel breeding strategy of atmospheric and room temperature plasma (ARTP) mutagenesis was used to improve the uridine production of engineered Bacillus subtilis TD12np. A high-throughput screening method using both resistant plates and 96-well microplates was established to select ideal mutants with diverse phenotypes. Mutant F126 accumulated 5.7 and 30.3 g/L uridine after 30 h in shake-flask and 48 h in fed-batch fermentation, respectively, representing 4.4- and 8.7-fold increases over the parent strain. Sequence analysis of the pyrimidine nucleotide biosynthetic operon in the representative mutants showed that proline 1016 and glutamate 949 in the large subunit of B. subtilis carbamoyl phosphate synthetase are important for the allosteric regulation by uridine 5′-monophosphate. The proposed mutation method with an efficient high-throughput screening assay proved to be an appropriate strategy for obtaining uridine-overproducing strains. PMID:28472077

  12. Improvement of uridine production of Bacillus subtilis by atmospheric and room temperature plasma mutagenesis and high-throughput screening.

    PubMed

    Fan, Xiaoguang; Wu, Heyun; Li, Guoliang; Yuan, Hui; Zhang, Hongchao; Li, Yanjun; Xie, Xixian; Chen, Ning

    2017-01-01

    In the present study, a novel breeding strategy of atmospheric and room temperature plasma (ARTP) mutagenesis was used to improve the uridine production of engineered Bacillus subtilis TD12np. A high-throughput screening method using both resistant plates and 96-well microplates was established to select ideal mutants with diverse phenotypes. Mutant F126 accumulated 5.7 and 30.3 g/L uridine after 30 h in shake-flask and 48 h in fed-batch fermentation, respectively, representing 4.4- and 8.7-fold increases over the parent strain. Sequence analysis of the pyrimidine nucleotide biosynthetic operon in the representative mutants showed that proline 1016 and glutamate 949 in the large subunit of B. subtilis carbamoyl phosphate synthetase are important for the allosteric regulation by uridine 5'-monophosphate. The proposed mutation method with an efficient high-throughput screening assay proved to be an appropriate strategy for obtaining uridine-overproducing strains.

  13. Remodeling Cildb, a popular database for cilia and links for ciliopathies

    PubMed Central

    2014-01-01

    Background New generation technologies in cell and molecular biology generate large amounts of data that are hard to exploit for individual proteins. This is particularly true for ciliary and centrosomal research. Cildb is a multi-species knowledgebase gathering high-throughput studies, which allows advanced searches to identify proteins involved in centrosome, basal body, or cilia biogenesis, composition, and function. Combined with the localization of genetic diseases on human chromosomes given by OMIM links, candidate ciliopathy proteins can be compiled through Cildb searches. Methods Orthology between recent versions of the whole proteomes was computed using Inparanoid, and ciliary high-throughput studies were remapped onto these recent versions. Results Due to the constant evolution of the ciliary and centrosomal field, Cildb has recently been upgraded twice, with new species whole proteomes and new ciliary studies, and the latest version displays a novel BioMart interface, much more intuitive than the previous ones. Conclusions This already popular database is now designed for easier use and is up to date with regard to high-throughput ciliary studies. PMID:25422781

  14. High-throughput technology for novel SO2 oxidation catalysts

    PubMed Central

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed, and the problems encountered in adapting them to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of the elements used for prescreening and of those remaining after the screening of the first catalyst generations. PMID:27877427

  15. A Method of High Throughput Monitoring Crop Physiology Using Chlorophyll Fluorescence and Multispectral Imaging.

    PubMed

    Wang, Heng; Qian, Xiangjie; Zhang, Lan; Xu, Sailong; Li, Haifeng; Xia, Xiaojian; Dai, Liankui; Xu, Liang; Yu, Jingquan; Liu, Xu

    2018-01-01

    We present a high-throughput crop physiology condition monitoring system and a corresponding monitoring method. The monitoring system can perform large-area chlorophyll fluorescence imaging and multispectral imaging. The monitoring method can determine the current crop condition continuously and non-destructively. We use chlorophyll fluorescence parameters and multispectral relative reflectance as indicators of crop physiological status. Using tomato as the experimental subject, typical crop physiological stresses such as drought, nutrient deficiency, and plant disease can be distinguished by the monitoring method. Furthermore, we have studied the correlation between the physiological indicators and the degree of stress. Besides enabling continuous monitoring of crop physiology, the monitoring system and method open up the possibility of automated machine diagnosis of plant physiology. Highlights: A newly designed high-throughput crop physiology monitoring system and the corresponding monitoring method are described in this study. Different types of stress induce distinct fluorescence and spectral characteristics, which can be used to evaluate the physiological status of plants.

  16. In Vitro Testing of Engineered Nanomaterials in the EPA’s ToxCast Program (WC9)

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  17. High Throughput Screening for Inhibitors of Mycobacterium tuberculosis H37Rv

    PubMed Central

    ANANTHAN, SUBRAMANIAM; FAALEOLEA, ELLEN R.; GOLDMAN, ROBERT C.; HOBRATH, JUDITH V.; KWONG, CECIL D.; LAUGHON, BARBARA E.; MADDRY, JOSEPH A.; MEHTA, ALKA; RASMUSSEN, LYNN; REYNOLDS, ROBERT C.; SECRIST, JOHN A.; SHINDO, NICE; SHOWE, DUSTIN N.; SOSA, MELINDA I.; SULING, WILLIAM J.; WHITE, E. LUCILE

    2009-01-01

    There is an urgent need for the discovery and development of new antitubercular agents that target new biochemical pathways and treat drug-resistant forms of the disease. One approach to addressing this need is through high-throughput screening of medicinally relevant libraries against the whole bacterium in order to discover a variety of new, active scaffolds that will stimulate new biological research and drug discovery. Through the Tuberculosis Antimicrobial Acquisition and Coordinating Facility (www.taacf.org), a large, medicinally relevant chemical library was screened against M. tuberculosis strain H37Rv. The screening methods and a medicinal chemistry analysis of the results are reported herein. PMID:19758845

  18. High-throughput synchronization of mammalian cell cultures by spiral microfluidics.

    PubMed

    Lee, Wong Cheng; Bhagat, Ali Asgar S; Lim, Chwee Teck

    2014-01-01

    The development of mammalian cell cycle synchronization techniques has greatly advanced our understanding of many cellular regulatory events and mechanisms specific to different phases of the cell cycle. In this chapter, we describe a high-throughput microfluidic-based approach for cell cycle synchronization. By exploiting the relationship between cell size and its phase in the cell cycle, large numbers of synchronized cells can be obtained by size fractionation in a spiral microfluidic channel. Protocols for the synchronization of primary cells such as mesenchymal stem cells, and immortal cell lines such as Chinese hamster ovarian cells (CHO-CD36) and HeLa cells are provided as examples.

  19. Development of Novel Random Network Theory-Based Approaches to Identify Network Interactions among Nitrifying Bacteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Cindy

    2015-07-17

    The interactions among different microbial populations in a community could play more important roles in determining ecosystem functioning than species numbers and their abundances, but very little is known about such network interactions at the community level. The goal of this project is to develop novel framework approaches and associated software tools to characterize the network interactions in microbial communities based on large-scale, high-throughput metagenomics data, and to apply these approaches to understand the impacts of environmental changes (e.g., climate change, contamination) on network interactions among different nitrifying populations and associated microbial communities.

  20. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Wei; Shabbir, Faizan; Gong, Chao

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer, as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  1. SnO as a potential oxide thermoelectric candidate

    DOE PAGES

    Miller, Samuel A.; Gorai, Prashun; Aydemir, Umut; ...

    2017-08-08

    In the search for new thermoelectric materials, high-throughput calculations using a combination of semiempirical models and first-principles density functional theory present a path to screening large numbers of compounds for the most promising candidates.

  2. In vitro data and in silico models for computational toxicology (Teratology Society ILSI HESI workshop)

    EPA Science Inventory

    The challenge of assessing the potential developmental health risks for the tens of thousands of environmental chemicals is beyond the capacity for resource-intensive animal protocols. Large data streams coming from high-throughput (HTS) and high-content (HCS) profiling of biolog...

  3. Field-based high throughput phenotyping rapidly identifies genomic regions controlling yield components in rice

    PubMed Central

    Tanger, Paul; Klassen, Stephen; Mojica, Julius P.; Lovell, John T.; Moyers, Brook T.; Baraoidan, Marietta; Naredo, Maria Elizabeth B.; McNally, Kenneth L.; Poland, Jesse; Bush, Daniel R.; Leung, Hei; Leach, Jan E.; McKay, John K.

    2017-01-01

    To ensure food security in the face of population growth, decreasing water and land for agriculture, and increasing climate variability, crop yields must increase faster than the current rates. Increased yields will require implementing novel approaches in genetic discovery and breeding. Here we demonstrate the potential of field-based high throughput phenotyping (HTP) on a large recombinant population of rice to identify genetic variation underlying important traits. We find that detecting quantitative trait loci (QTL) with HTP phenotyping is as accurate and effective as traditional labor-intensive measures of flowering time, height, biomass, grain yield, and harvest index. Genetic mapping in this population, derived from a cross of a modern cultivar (IR64) with a landrace (Aswina), identified four alleles with negative effects on grain yield that are fixed in IR64, demonstrating the potential of HTP of large populations as a strategy for the second green revolution. PMID:28220807

  4. CellCognition: time-resolved phenotype annotation in high-throughput live cell imaging.

    PubMed

    Held, Michael; Schmitz, Michael H A; Fischer, Bernd; Walter, Thomas; Neumann, Beate; Olma, Michael H; Peter, Matthias; Ellenberg, Jan; Gerlich, Daniel W

    2010-09-01

    Fluorescence time-lapse imaging has become a powerful tool to investigate complex dynamic processes such as cell division or intracellular trafficking. Automated microscopes generate time-resolved imaging data at high throughput, yet tools for quantification of large-scale movie data are largely missing. Here we present CellCognition, a computational framework to annotate complex cellular dynamics. We developed a machine-learning method that combines state-of-the-art classification with hidden Markov modeling for annotation of the progression through morphologically distinct biological states. Incorporation of time information into the annotation scheme was essential to suppress classification noise at state transitions and confusion between different functional states with similar morphology. We demonstrate generic applicability in different assays and perturbation conditions, including a candidate-based RNA interference screen for regulators of mitotic exit in human cells. CellCognition is published as open source software, enabling live-cell imaging-based screening with assays that directly score cellular dynamics.

  5. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    NASA Astrophysics Data System (ADS)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.

  6. A high-throughput assay for quantifying appetite and digestive dynamics.

    PubMed

    Jordi, Josua; Guggiana-Nilo, Drago; Soucy, Edward; Song, Erin Yue; Lei Wee, Caroline; Engert, Florian

    2015-08-15

    Food intake and digestion are vital functions, and their dysregulation is fundamental to many human diseases. Current methods do not support their dynamic quantification on large scales in unrestrained vertebrates. Here, we combine an infrared macroscope with fluorescently labeled food to quantify feeding behavior and intestinal nutrient metabolism with high temporal resolution, sensitivity, and throughput in naturally behaving zebrafish larvae. Using this method and rate-based modeling, we demonstrate that zebrafish larvae match nutrient intake to their bodily demand and that larvae adjust their digestion rate according to the ingested meal size. Such adaptive feedback mechanisms make this model system amenable to identifying potential chemical modulators. As proof of concept, we demonstrate that nicotine, l-lysine, ghrelin, and insulin have analogous impacts on food intake as in mammals. Consequently, the method presented here will promote large-scale translational research of food intake and digestive function in a naturally behaving vertebrate. Copyright © 2015 the American Physiological Society.
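    The "rate-based modeling" mentioned here can be illustrated with a toy first-order model: gut fluorescence gains from intake and decays at a digestion rate. This is a hypothetical sketch with invented parameters, not the authors' model.

```python
# Hypothetical toy version of a rate-based digestion model: gut
# fluorescence F gains from food intake and decays with a first-order
# digestion rate k. All parameter values are invented.

def simulate_gut_content(intake_rate, k, t_end, dt=0.01):
    """Euler integration of dF/dt = intake_rate - k * F, from F(0) = 0."""
    f, t = 0.0, 0.0
    while t < t_end:
        f += (intake_rate - k * f) * dt
        t += dt
    return f

# F approaches the steady state intake_rate / k = 4.0:
print(round(simulate_gut_content(intake_rate=2.0, k=0.5, t_end=50.0), 2))
```

    Fitting k to the measured decay of gut fluorescence after meals of different sizes is the kind of analysis that would reveal a meal-size-dependent digestion rate.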

  7. A high-throughput assay for quantifying appetite and digestive dynamics

    PubMed Central

    Guggiana-Nilo, Drago; Soucy, Edward; Song, Erin Yue; Lei Wee, Caroline; Engert, Florian

    2015-01-01

    Food intake and digestion are vital functions, and their dysregulation is fundamental to many human diseases. Current methods do not support their dynamic quantification on large scales in unrestrained vertebrates. Here, we combine an infrared macroscope with fluorescently labeled food to quantify feeding behavior and intestinal nutrient metabolism with high temporal resolution, sensitivity, and throughput in naturally behaving zebrafish larvae. Using this method and rate-based modeling, we demonstrate that zebrafish larvae match nutrient intake to their bodily demand and that larvae adjust their digestion rate according to the ingested meal size. Such adaptive feedback mechanisms make this model system amenable to identifying potential chemical modulators. As proof of concept, we demonstrate that nicotine, l-lysine, ghrelin, and insulin have analogous impacts on food intake as in mammals. Consequently, the method presented here will promote large-scale translational research of food intake and digestive function in a naturally behaving vertebrate. PMID:26108871

  8. New Tools For Understanding Microbial Diversity Using High-throughput Sequence Data

    NASA Astrophysics Data System (ADS)

    Knight, R.; Hamady, M.; Liu, Z.; Lozupone, C.

    2007-12-01

    High-throughput sequencing techniques such as 454 are straining the limits of tools traditionally used to build trees, choose OTUs, and perform other essential sequencing tasks. We have developed a workflow for phylogenetic analysis of large-scale sequence data sets that combines existing tools, such as the Arb phylogeny package and the NAST multiple sequence alignment tool, with new methods for choosing and clustering OTUs and for performing phylogenetic community analysis with UniFrac. This talk discusses the cyberinfrastructure we are developing to support the human microbiome project, and the application of these workflows to analyze very large data sets that contrast the gut microbiota with a range of physical environments. These tools will ultimately help to define core and peripheral microbiomes in a range of environments, and will allow us to understand the physical and biotic factors that contribute most to differences in microbial diversity.

  9. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large-aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  10. Arioc: high-throughput read alignment with GPU-accelerated exploration of the seed-and-extend search space

    PubMed Central

    Budavari, Tamas; Langmead, Ben; Wheelan, Sarah J.; Salzberg, Steven L.; Szalay, Alexander S.

    2015-01-01

    When computing alignments of DNA sequences to a large genome, a key element in achieving high processing throughput is to prioritize locations in the genome where high-scoring mappings might be expected. We formulated this task as a series of list-processing operations that can be efficiently performed on graphics processing unit (GPU) hardware. We followed this approach in implementing a read aligner called Arioc that uses GPU-based parallel sort and reduction techniques to identify high-priority locations where potential alignments may be found. We then carried out a read-by-read comparison of Arioc’s reported alignments with the alignments found by several leading read aligners. With simulated reads, Arioc has comparable or better accuracy than the other read aligners we tested. With human sequencing reads, Arioc demonstrates significantly greater throughput than the other aligners we evaluated across a wide range of sensitivity settings. The Arioc software is available at https://github.com/RWilton/Arioc. It is released under a BSD open-source license. PMID:25780763

  11. Process in manufacturing high efficiency AlGaAs/GaAs solar cells by MO-CVD

    NASA Technical Reports Server (NTRS)

    Yeh, Y. C. M.; Chang, K. I.; Tandon, J.

    1984-01-01

    Manufacturing technology for mass-producing high-efficiency GaAs solar cells is discussed, including progress in using a high-throughput MO-CVD reactor to produce high-efficiency GaAs solar cells. The thickness and doping-concentration uniformity of metal-organic chemical vapor deposition (MO-CVD) GaAs and AlGaAs layer growth are discussed. In addition, new tooling designs are given which increase the throughput of solar cell processing. To date, 2 cm x 2 cm AlGaAs/GaAs solar cells with efficiencies of up to 16.5% have been produced. In order to meet throughput goals for mass-producing GaAs solar cells, a large MO-CVD system (Cambridge Instrument Model MR-200) was installed, with a susceptor initially capable of processing 20 wafers (up to 75 mm diameter) in a single growth run. In the MR-200, the sequencing of the gases and the heating power are controlled by a microprocessor-based programmable control console. Hence, operator errors can be reduced, leading to a more reproducible production sequence.

  12. Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.

    2015-01-01

    NASA's plan to launch several spacecraft into low Earth orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high-data-rate link between the LEO spacecraft and the ground via relay satellites in geostationary orbit (GEO); second, a high-data-rate direct-to-ground link from LEO. The paper then presents results from computer simulations carried out for both types of links, taking into consideration spacecraft transmitter frequency, EIRP, and waveform; elevation-angle-dependent path loss through Earth's atmosphere; and ground station receiver G/T.
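    The terabits-per-day requirement translates into a link rate once contact time is fixed. A back-of-the-envelope sketch with assumed pass geometry (the numbers below are illustrative, not figures from the paper):

```python
# Back-of-the-envelope sketch (assumed pass geometry, not figures from
# the paper): the sustained downlink rate a LEO spacecraft needs to
# return a given daily data volume during short ground-station passes.

def required_rate_gbps(terabits_per_day, passes_per_day, pass_minutes):
    """Required link rate in Gb/s given total daily contact time."""
    contact_seconds = passes_per_day * pass_minutes * 60
    return terabits_per_day * 1e12 / contact_seconds / 1e9

# e.g. 5 Tb/day over six 8-minute passes needs roughly 1.7 Gb/s:
print(round(required_rate_gbps(5, 6, 8), 1))
```

    A GEO relay extends the contact time dramatically, which is one way the first proposed solution relaxes the required instantaneous data rate.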

  13. Development and operation of a high-throughput accurate-wavelength lens-based spectrometer

    DOE PAGES

    Bell, Ronald E.

    2014-07-11

    A high-throughput spectrometer for the 400-820 nm wavelength range has been developed for charge exchange recombination spectroscopy or general spectroscopy. A large 2160 mm⁻¹ grating is matched with fast f/1.8 200 mm lenses, which provide stigmatic imaging. A precision optical encoder measures the grating angle with an accuracy ≤ 0.075 arc seconds. A high-quantum-efficiency, low-etaloning CCD detector allows operation at longer wavelengths. A patch panel allows input fibers to interface with interchangeable fiber holders that attach to a kinematic mount behind the entrance slit. The computer-controlled hardware allows automated control of wavelength, timing, and f-number, as well as automated data collection and wavelength calibration.

  14. Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.

    2015-01-01

    NASA's plan to launch several spacecraft into low Earth orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high-data-rate link between the LEO spacecraft and the ground via relay satellites in geostationary orbit (GEO); second, a high-data-rate direct-to-ground link from LEO. The paper then presents results from computer simulations carried out for both types of links, taking into consideration spacecraft transmitter frequency, EIRP, and waveform; elevation-angle-dependent path loss through Earth's atmosphere; and ground station receiver G/T.

  15. A Dual-Mode Large-Arrayed CMOS ISFET Sensor for Accurate and High-Throughput pH Sensing in Biomedical Diagnosis.

    PubMed

    Huang, Xiwei; Yu, Hao; Liu, Xu; Jiang, Yu; Yan, Mei; Wu, Dongping

    2015-09-01

    The existing ISFET-based DNA sequencing detects hydrogen ions released during the polymerization of DNA strands on microbeads, which are scattered into a microwell array above the ISFET sensor with unknown distribution. However, false pH detection occurs at empty microwells due to crosstalk from neighboring microbeads. In this paper, a dual-mode CMOS ISFET sensor is proposed for accurate pH detection toward DNA sequencing. Dual-mode sensing, optical and chemical, is realized by integrating a CMOS image sensor (CIS) with the ISFET pH sensor, fabricated in a standard 0.18-μm CIS process. By accurately determining microbead physical locations with the CIS pixels through contact imaging, the dual-mode sensor can correlate the local pH of one DNA slice with one location-determined microbead, resulting in improved pH detection accuracy. Moreover, toward high-throughput DNA sequencing, a correlated-double-sampling readout that supports large arrays in both modes is deployed to reduce pixel-to-pixel nonuniformity such as threshold voltage mismatch. The proposed CMOS dual-mode sensor is experimentally shown to produce a well-correlated pH map and optical image for microbeads, with a pH sensitivity of 26.2 mV/pH, a fixed pattern noise (FPN) reduction from 4% to 0.3%, and a readout speed of 1200 frames/s. A dual-mode CMOS ISFET sensor with suppressed FPN for accurate large-arrayed pH sensing is thus demonstrated with state-of-the-art measured results toward accurate and high-throughput DNA sequencing. The developed sensor has great potential for future personal genome diagnostics with high accuracy and low cost.
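    The FPN reduction comes from the correlated-double-sampling readout, which subtracts each pixel's reset level from its signal level so that static per-pixel offsets cancel. A toy illustration with invented numbers:

```python
# Toy illustration (invented numbers) of the correlated-double-sampling
# idea: subtracting each pixel's reset sample from its signal sample
# cancels the static per-pixel offset that causes fixed pattern noise.

def correlated_double_sample(reset_frame, signal_frame):
    """Per-pixel difference removes the pixel-specific offset."""
    return [s - r for r, s in zip(reset_frame, signal_frame)]

reset  = [1.02, 0.97, 1.10]   # differing per-pixel offsets (FPN)
signal = [1.52, 1.47, 1.60]   # offset plus an identical 0.5 signal
print(correlated_double_sample(reset, signal))  # offsets cancel
```

    Because the offset (including threshold-voltage mismatch) appears identically in both samples, only the true signal difference survives the subtraction.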

  16. An Automated High-Throughput Metabolic Stability Assay Using an Integrated High-Resolution Accurate Mass Method and Automated Data Analysis Software.

    PubMed

    Shah, Pranav; Kerns, Edward; Nguyen, Dac-Trung; Obach, R Scott; Wang, Amy Q; Zakharov, Alexey; McKew, John; Simeonov, Anton; Hop, Cornelis E C A; Xu, Xin

    2016-10-01

    Advancement of in silico tools would be enabled by the availability of data for metabolic reaction rates and intrinsic clearance (CLint) of a structurally diverse compound set by specific metabolic enzymes. Our goal is to measure CLint for a large set of compounds with each major human cytochrome P450 (P450) isozyme. To achieve our goal, it is of utmost importance to develop an automated, robust, sensitive, high-throughput metabolic stability assay that can efficiently handle large compound sets. The substrate depletion method [in vitro half-life (t1/2) method] was chosen to determine CLint. The assay (384-well format) consisted of three parts: 1) a robotic system for incubation and sample cleanup; 2) two different integrated ultraperformance liquid chromatography/mass spectrometry (UPLC/MS) platforms to determine the percent of parent compound remaining; and 3) an automated data analysis system. The CYP3A4 assay was evaluated using two long-t1/2 compounds, carbamazepine and antipyrine (t1/2 > 30 minutes); one moderate-t1/2 compound, ketoconazole (10 < t1/2 < 30 minutes); and two short-t1/2 compounds, loperamide and buspirone (t1/2 < 10 minutes). Interday and intraday precision and accuracy of the assay were within the acceptable range (∼12%) over the linear range observed. Using this assay, CYP3A4 CLint and t1/2 values for more than 3000 compounds were measured. This high-throughput, automated, and robust assay allows rapid metabolic stability screening of large compound sets and enables advanced computational modeling for individual human P450 isozymes. U.S. Government work not protected by U.S. copyright.
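    The substrate-depletion method converts a measured in vitro half-life into CLint via first-order kinetics. A sketch of the standard conversion; the incubation volume and protein amount below are assumed values, not the paper's protocol:

```python
# Sketch of the standard substrate-depletion conversion from an in vitro
# half-life to intrinsic clearance. The incubation volume and protein
# amount are assumed values, not the paper's protocol.
import math

def clint_ul_per_min_per_mg(t_half_min, incubation_ul=500.0, protein_mg=0.25):
    """CLint = (ln 2 / t1/2) scaled by incubation volume per mg protein."""
    k_dep = math.log(2) / t_half_min      # first-order depletion constant
    return k_dep * incubation_ul / protein_mg

# A 10-minute half-life under these assumptions gives ~139 uL/min/mg:
print(round(clint_ul_per_min_per_mg(10.0), 1))
```

    The half-life itself comes from fitting ln(% parent remaining) versus incubation time, which is why accurate percent-remaining measurements from the UPLC/MS platforms are the critical input.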

  17. Biased ligand quantification in drug discovery: from theory to high throughput screening to identify new biased μ opioid receptor agonists

    PubMed Central

    Winpenny, David; Clark, Mellissa

    2016-01-01

    Background and Purpose Biased GPCR ligands are able to engage their target receptor in a manner that preferentially activates distinct downstream signalling and offers potential for next-generation therapeutics. However, accurate quantification of ligand bias in vitro is complex, and current best practice is not amenable to testing large numbers of compounds. We have therefore sought to apply ligand bias theory to an industrial-scale screening campaign for the identification of new biased μ receptor agonists. Experimental Approach μ receptor assays with appropriate dynamic range were developed for both Gαi‐dependent signalling and β‐arrestin2 recruitment. Δlog(Emax/EC50) analysis was validated as an alternative to the operational model of agonism for calculating pathway bias towards Gαi‐dependent signalling. The analysis was applied to a high throughput screen to characterize the prevalence and nature of pathway bias among a diverse set of compounds with μ receptor agonist activity. Key Results A high throughput screening campaign yielded 440 hits with greater than 10‐fold bias relative to DAMGO. To validate these results, we quantified the pathway bias of a subset of hits using the operational model of agonism. The high degree of correlation across these biased hits confirmed that Δlog(Emax/EC50) is a suitable method for identifying genuine biased ligands within a large collection of diverse compounds. Conclusions and Implications This work demonstrates that, using Δlog(Emax/EC50), drug discovery can apply the concept of biased ligand quantification on a large scale and accelerate the deliberate discovery of novel therapeutics acting via this complex pharmacology. PMID:26791140
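    The Δlog(Emax/EC50) approach scores each pathway by log10(Emax/EC50) and compares the test compound's pathway difference with the reference agonist's. An illustrative sketch with hypothetical potency values (not data from the study):

```python
# Illustrative sketch (hypothetical potency values) of the
# Delta-log(Emax/EC50) approach: score each pathway by log10(Emax/EC50),
# then compare the test compound's pathway difference with the reference
# agonist's (DAMGO in the study) to obtain a fold-bias factor.
import math

def log_emax_ec50(emax, ec50):
    return math.log10(emax / ec50)

def bias_factor(test_g, test_arr, ref_g, ref_arr):
    """10**(DeltaDelta-log(Emax/EC50)), G protein vs. beta-arrestin2;
    each argument is an (Emax, EC50) pair for one pathway."""
    d_test = log_emax_ec50(*test_g) - log_emax_ec50(*test_arr)
    d_ref = log_emax_ec50(*ref_g) - log_emax_ec50(*ref_arr)
    return 10 ** (d_test - d_ref)

# Hypothetical G-protein-biased hit vs. a balanced reference agonist:
print(round(bias_factor((95, 3.0), (40, 150.0), (100, 10.0), (100, 30.0)), 1))
```

    Normalizing to the reference agonist cancels system- and assay-dependent amplification, which is what makes the simpler Δlog(Emax/EC50) metric track the operational model so closely.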

  18. Infrastructure to Support Ultra High Throughput Biodosimetry Screening after a Radiological Event

    PubMed Central

    Garty, G.; Karam, P.A.; Brenner, D. J.

    2011-01-01

    Purpose After a large-scale radiological event, there will be a pressing need to assess, within a few days, the radiation doses received by tens or hundreds of thousands of individuals. This is for triage, to prevent treatment locations from being overwhelmed in what is sure to be a resource-limited scenario, as well as to facilitate dose-dependent treatment decisions. In addition, there are psychosocial considerations, in that active reassurance of minimal exposure is a potentially effective antidote to mass panic, as well as long-term considerations, to facilitate later studies of cancer and other long-term disease risks. Materials and Methods As described elsewhere in this issue, we are developing a Rapid Automated Biodosimetry Tool (RABiT). The RABiT allows high throughput analysis of thousands of blood samples per day, providing a dose estimate that can be used to support clinical triage and treatment decisions. Results Development of the RABiT has motivated us to consider the logistics of incorporating such a system into the existing emergency response scenarios of a large metropolitan area. We present here a view of how one or more centralized biodosimetry readout devices might be incorporated into an infrastructure in which fingerstick blood samples are taken at many distributed locations within an affected city or region and transported to centralized locations. Conclusions High throughput biodosimetry systems offer the opportunity to perform biodosimetric assessments on a large number of persons. As such systems reach a high level of maturity, emergency response scenarios will need to be adapted to make use of these powerful tools. This can be done relatively easily within the framework of current scenarios. PMID:21675819

  19. Three applications of backscatter x-ray imaging technology to homeland defense

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2005-05-01

    We briefly review backscatter x-ray imaging and describe three systems currently applying it to homeland defense missions (BodySearch, ZBV, and ZBP). These missions include detection of concealed weapons, explosives, and contraband on personnel, in vehicles, and in large cargo containers. An overview of the x-ray imaging subsystems is provided, as well as sample images from each system. Key features such as x-ray safety, throughput, and detection are discussed. Recent trends in operational modes are described that facilitate 100% inspection at high-throughput chokepoints.

  20. Experiments and Analyses of Data Transfers Over Wide-Area Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata

    Dedicated wide-area network connections are increasingly employed in high-performance computing and big data scenarios. One might expect the performance and dynamics of data transfers over such connections to be easy to analyze due to the lack of competing traffic. However, non-linear transport dynamics and end-system complexities (e.g., multi-core hosts and distributed filesystems) can in fact make analysis surprisingly challenging. We present extensive measurements of memory-to-memory and disk-to-disk file transfers over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory-to-memory transfers, profiles of both TCP and UDT throughput as a function of RTT show concave and convex regions; large buffer sizes and more parallel flows lead to wider concave regions, which are highly desirable. TCP and UDT both also display complex throughput dynamics, as indicated by their Poincare maps and Lyapunov exponents. For disk-to-disk transfers, we determine that high throughput can be achieved via a combination of parallel I/O threads, parallel network threads, and direct I/O mode. Our measurements also show that Lustre filesystems can be mounted over long-haul connections using LNet routers, although challenges remain in jointly optimizing file I/O and transport method parameters to achieve peak throughput.
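
    The throughput-vs-RTT profiles reported here are bounded by the bandwidth-delay product: a window-based transport can carry at most window/RTT, so long-haul 10 Gbps transfers need very large buffers or many parallel flows. A quick sketch of that sizing arithmetic (generic networking math, not the paper's model):

```python
def bdp_bytes(rate_gbps, rtt_ms):
    """Bandwidth-delay product: bytes that must be in flight to keep a
    rate_gbps link full at the given round-trip time."""
    return rate_gbps * 1e9 / 8 * rtt_ms / 1e3

def per_flow_buffer_bytes(rate_gbps, rtt_ms, n_flows):
    # With n parallel flows, each socket buffer covers 1/n of the BDP.
    return bdp_bytes(rate_gbps, rtt_ms) / n_flows

# 10 Gbps at 100 ms RTT needs ~125 MB in flight; splitting the transfer
# across 8 parallel flows drops the per-flow buffer to ~15.6 MB.
pipe = bdp_bytes(10, 100)
per_flow = per_flow_buffer_bytes(10, 100, 8)
```

    This is consistent with the observation above that larger buffers and more parallel flows widen the desirable concave region of the throughput profile.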

  1. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    PubMed

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.

  2. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types

    PubMed Central

    Pagès, Hervé

    2018-01-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set. PMID:29723188
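
    The memory/speed trade-off that motivates beachmat can be illustrated with back-of-the-envelope storage arithmetic. A sketch (beachmat itself is a C++/R interface; the sizes below assume 8-byte values and 4-byte indices in a coordinate-format sparse layout, purely for illustration):

```python
def dense_bytes(n_genes, n_cells, itemsize=8):
    # Dense storage: every entry costs one value slot.
    return n_genes * n_cells * itemsize

def coo_sparse_bytes(n_genes, n_cells, density, itemsize=8, index_bytes=4):
    # Coordinate-format sparse storage: each non-zero entry costs one
    # value slot plus a row index and a column index.
    nnz = int(n_genes * n_cells * density)
    return nnz * (itemsize + 2 * index_bytes)

# A 20,000-gene x 100,000-cell scRNA-seq matrix at 5% non-zeros:
# ~16 GB dense vs ~1.6 GB sparse; with these element sizes, sparse
# storage wins until the matrix is about 50% non-zero.
dense = dense_bytes(20_000, 100_000)
sparse = coo_sparse_bytes(20_000, 100_000, 0.05)
```

    File-backed representations push the trade-off further: memory use becomes nearly constant at the cost of slower, chunked access, which is why a common C++ access layer across all three layouts is valuable.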

  3. Genome-wide association study of rice (Oryza sativa L.) leaf traits with a high-throughput leaf scorer.

    PubMed

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Wang, Ke; Jiang, Ni; Feng, Hui; Chen, Guoxing; Liu, Qian; Xiong, Lizhong

    2015-09-01

    Leaves are the plant's solar panel and food factory, and leaf traits are always key issues to investigate in plant research. Traditional methods for leaf trait measurement are time-consuming. In this work, an engineering prototype has been established for high-throughput leaf scoring (HLS) of a large number of Oryza sativa accessions. The mean absolute per cent errors of HLS relative to traditional measurements were below 5% for leaf number, area, shape, and colour. Moreover, HLS can measure up to 30 leaves per minute. To demonstrate the usefulness of HLS in dissecting the genetic bases of leaf traits, a genome-wide association study (GWAS) was performed for 29 leaf traits related to leaf size, shape, and colour at three growth stages using HLS on a panel of 533 rice accessions. Nine associated loci contained known leaf-related genes, such as Nal1, which controls leaf width. In addition, a total of 73, 123, and 177 new loci were detected for traits associated with leaf size, colour, and shape, respectively. In summary, after evaluating the performance with a large number of rice accessions, the combination of GWAS and high-throughput leaf phenotyping (HLS) has proven a valuable strategy to identify the genetic loci controlling rice leaf traits. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  4. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types.

    PubMed

    Lun, Aaron T L; Pagès, Hervé; Smith, Mike L

    2018-05-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set.

  5. Study on high throughput nanomanufacturing of photopatternable nanofibers using tube nozzle electrospinning with multi-tubes and multi-nozzles

    NASA Astrophysics Data System (ADS)

    Fang, Sheng-Po; Jao, PitFee; Senior, David E.; Kim, Kyoung-Tae; Yoon, Yong-Kyu

    2017-12-01

    High throughput nanomanufacturing of photopatternable nanofibers and subsequent photopatterning is reported. For the production of high density nanofibers, the tube nozzle electrospinning (TNE) process has been used, where an array of micronozzles on the sidewall of a plastic tube serves as spinnerets. By increasing the density of nozzles, the electric fields of adjacent nozzles confine the electrospinning cone and give a higher density of nanofibers. With TNE, higher nozzle densities are easily achievable compared to metallic nozzles; e.g. an inter-nozzle distance as small as 0.5 cm and an average semi-vertical repulsion angle of 12.28° for 8 nozzles were achieved. Nanofiber diameter distribution, mass throughput rate, and growth rate of nanofiber stacks in different operating conditions and with different numbers of nozzles (2, 4, and 8), as well as scalability with single and double tube configurations, are discussed. Nanofibers made of SU-8, a photopatternable epoxy, have been collected to a thickness of over 80 μm in 240 s of electrospinning, and a production rate of 0.75 g/h is achieved using the 2-tube, 8-nozzle system, followed by photolithographic micropatterning. TNE is scalable to a large number of nozzles and offers high throughput production, plug and play capability with standard electrospinning equipment, and little waste of polymer.

  6. High throughput techniques to reveal the molecular physiology and evolution of digestion in spiders.

    PubMed

    Fuzita, Felipe J; Pinkse, Martijn W H; Patane, José S L; Verhaert, Peter D E M; Lopes, Adriana R

    2016-09-07

    Spiders are known for their predatory efficiency and for their high capacity of digesting relatively large prey. They do this by combining both extracorporeal and intracellular digestion. Whereas many high throughput ("-omics") techniques focus on biomolecules in spider venom, so far this approach has not yet been applied to investigate the protein composition of spider midgut diverticula (MD) and digestive fluid (DF). We here report on our investigations of both MD and DF of the spider Nephilingis (Nephilengys) cruentata through the use of next generation sequencing and shotgun proteomics. This shows that the DF is composed of a variety of hydrolases including peptidases, carbohydrases, lipases and nucleases, as well as of toxins and regulatory proteins. We detect 25 astacins in the DF. Phylogenetic analysis of the corresponding transcript(s) in Arachnida suggests that astacins have acquired an unprecedented role for extracorporeal digestion in Araneae, with different orthologs used by each family. The results of a comparative study of spiders in distinct physiological conditions allow us to propose some digestion mechanisms in this interesting animal taxon. Together, the high-throughput data demonstrate that the DF is a secretion originating from the MD. We identified enzymes involved in the extracellular and intracellular phases of digestion. In addition, the analyses reveal a large gene duplication event in the evolution of digestion in Araneae, mainly involving astacin genes. We were also able to identify proteins expressed and translated in the digestive system, which until now had been exclusively associated with venom glands.

  7. Searching for microbial protein over-expression in a complex matrix using automated high throughput MS-based proteomics tools.

    PubMed

    Akeroyd, Michiel; Olsthoorn, Maurien; Gerritsma, Jort; Gutker-Vermaas, Diana; Ekkelkamp, Laurens; van Rij, Tjeerd; Klaassen, Paul; Plugge, Wim; Smit, Ed; Strupat, Kerstin; Wenzel, Thibaut; van Tilborg, Marcel; van der Hoeven, Rob

    2013-03-10

    In the discovery of new enzymes, genomic and cDNA expression libraries containing thousands of differential clones are generated to obtain biodiversity. These libraries need to be screened for the activity of interest. Removing so-called empty and redundant clones significantly reduces the size of these expression libraries and therefore speeds up new enzyme discovery. Here, we present a sensitive, generic workflow for high throughput screening of successful microbial protein over-expression in microtiter plates containing a complex matrix, based on mass spectrometry techniques. MALDI-LTQ-Orbitrap screening followed by principal component analysis and peptide mass fingerprinting was developed to obtain a throughput of ∼12,000 samples per week. Alternatively, a UHPLC-MS(2) approach including MS(2) protein identification was developed for microorganisms with a complex protein secretome, with a throughput of ∼2000 samples per week. TCA-induced protein precipitation, enhanced by addition of bovine serum albumin, is used for protein purification prior to MS detection. We show that this generic workflow can effectively reduce large expression libraries from fungi and bacteria to their minimal size by detection of successful protein over-expression using MS. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. High-throughput discovery of rare human nucleotide polymorphisms by Ecotilling

    PubMed Central

    Till, Bradley J.; Zerr, Troy; Bowers, Elisabeth; Greene, Elizabeth A.; Comai, Luca; Henikoff, Steven

    2006-01-01

    Human individuals differ from one another at only ∼0.1% of nucleotide positions, but these single nucleotide differences account for most heritable phenotypic variation. Large-scale efforts to discover and genotype human variation have been limited to common polymorphisms. However, these efforts overlook rare nucleotide changes that may contribute to phenotypic diversity and genetic disorders, including cancer. Thus, there is an increasing need for high-throughput methods to robustly detect rare nucleotide differences. Toward this end, we have adapted the mismatch discovery method known as Ecotilling for the discovery of human single nucleotide polymorphisms. To increase throughput and reduce costs, we developed a universal primer strategy and implemented algorithms for automated band detection. Ecotilling was validated by screening 90 human DNA samples for nucleotide changes in 5 gene targets and by comparing results to public resequencing data. To increase throughput for discovery of rare alleles, we pooled samples 8-fold and found Ecotilling to be efficient relative to resequencing, with a false negative rate of 5% and a false discovery rate of 4%. We identified 28 new rare alleles, including some that are predicted to damage protein function. The detection of rare damaging mutations has implications for models of human disease. PMID:16893952
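
    The efficiency gain from 8-fold pooling above comes from simple arithmetic: mismatch-cleavage screens scale with the number of pools rather than the number of individuals. A sketch of that accounting (the reaction-count framing is an illustration, not the paper's cost model):

```python
import math

def reactions_needed(n_samples, pool_size, n_targets):
    """Screening reactions required when samples are pooled pool_size-fold
    and each pool is screened once per gene target."""
    return math.ceil(n_samples / pool_size) * n_targets

# 90 DNA samples pooled 8-fold across 5 gene targets: 60 screening
# reactions instead of 450 individual ones, a 7.5x reduction.
pooled = reactions_needed(90, 8, 5)
individual = reactions_needed(90, 1, 5)
```

    The trade-off is sensitivity: as reported above, pooling at this depth carried a 5% false negative rate and a 4% false discovery rate relative to resequencing.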

  9. Recycling isoelectric focusing with computer controlled data acquisition system. [for high resolution electrophoretic separation and purification of biomolecules

    NASA Technical Reports Server (NTRS)

    Egen, N. B.; Twitty, G. E.; Bier, M.

    1979-01-01

    Isoelectric focusing is a high-resolution technique for separating and purifying large peptides, proteins, and other biomolecules. The apparatus described in the present paper constitutes a new approach to fluid stabilization and increased throughput. Stabilization is achieved by flowing the process fluid uniformly through an array of closely spaced filter elements oriented parallel both to the electrodes and the direction of the flow. This seems to overcome the major difficulties of parabolic flow and electroosmosis at the walls, while limiting the convection to chamber compartments defined by adjacent spacers. Increased throughput is achieved by recirculating the process fluid through external heat exchange reservoirs, where the Joule heat is dissipated.

  10. Automated Analysis of siRNA Screens of Virus Infected Cells Based on Immunofluorescence Microscopy

    NASA Astrophysics Data System (ADS)

    Matula, Petr; Kumar, Anil; Wörz, Ilka; Harder, Nathalie; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl

    We present an image analysis approach as part of a high-throughput microscopy screening system based on cell arrays for the identification of genes involved in Hepatitis C and Dengue virus replication. Our approach comprises: cell nucleus segmentation, quantification of virus replication level in cells, localization of regions with transfected cells, cell classification by infection status, and quality assessment of an experiment. The approach is fully automatic and has been successfully applied to a large number of cell array images from screening experiments. The experimental results show a good agreement with the expected behavior of positive as well as negative controls and encourage the application to screens from further high-throughput experiments.

  11. Microplate-Based Method for High-Throughput Screening (HTS) of Chromatographic Conditions Studies for Recombinant Protein Purification.

    PubMed

    Carvalho, Rimenys J; Cruz, Thayana A

    2018-01-01

    High-throughput screening (HTS) systems have emerged as important tools that provide fast and low-cost evaluation of many conditions at once, since they require only small quantities of material and small sample volumes. These characteristics are extremely valuable for experiments with large numbers of variables, enabling the application of design of experiments (DoE) strategies or simple experimental planning approaches. Once the capacity of HTS systems to mimic chromatographic purification steps was established, several studies were performed successfully, including scale-down purification. Here, we propose a method for studying different purification conditions that can be used for any recombinant protein, including complex and glycosylated proteins, using low-binding filter microplates.

  12. An HTRF® Assay for the Protein Kinase ATM.

    PubMed

    Adams, Phillip; Clark, Jonathan; Hawdon, Simon; Hill, Jennifer; Plater, Andrew

    2017-01-01

    Ataxia telangiectasia mutated (ATM) is a serine/threonine kinase that plays a key role in the regulation of DNA damage pathways and checkpoint arrest. In recent years, there has been growing interest in ATM as a therapeutic target due to its association with cancer cell survival following genotoxic stress such as radio- and chemotherapy. Large-scale targeted drug screening campaigns have been hampered, however, by technical issues associated with the production of sufficient quantities of purified ATM and the availability of a suitable high-throughput assay. Using a purified, functionally active recombinant ATM and one of its physiological substrates, p53, we have developed an in vitro FRET-based activity assay that is suitable for high-throughput drug screening.

  13. Differentiating high priority pathway-based toxicity from non-specific effects in high throughput toxicity data: A foundation for prioritizing AOP development.

    EPA Science Inventory

    The ToxCast chemical screening approach enables the rapid assessment of large numbers of chemicals for biological effects, primarily at the molecular level. Adverse outcome pathways (AOPs) offer a means to link biomolecular effects with potential adverse outcomes at the level of...

  14. Cascading and Parallelising Curvilinear Inertial Focusing Systems for High Volume, Wide Size Distribution, Separation and Concentration of Particles

    PubMed Central

    Miller, B.; Jimenez, M.; Bridle, H.

    2016-01-01

    Inertial focusing is a microfluidic based separation and concentration technology that has expanded rapidly in the last few years. Throughput is high compared to other microfluidic approaches, although sample volumes have typically remained in the millilitre range. Here we present a strategy for achieving rapid high volume processing with stacked and cascaded inertial focusing systems, allowing for separation and concentration of particles with a large size range, demonstrated here from 30 μm to 300 μm. The system is based on curved channels in a novel toroidal configuration, and a stack of 20 devices has been shown to operate at 1 L/min. Recirculation allows for efficient removal of large particles, whereas a cascading strategy enables sequential removal of particles down to a final stage where the target particle size can be concentrated. The demonstration of curved stacked channels operating in a cascaded manner allows for high throughput applications, potentially replacing filtration in applications such as environmental monitoring, industrial cleaning processes, biomedical and bioprocessing and many more. PMID:27808244

  15. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system.

    PubMed

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
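
    The first detection stage described above extracts candidate cell regions from each frame. A pure-Python sketch of that stage using 4-connected component labeling on a thresholded image (the published pipeline additionally uses distance transform, watershed splitting, and GMM classification, which are not reproduced here):

```python
from collections import deque

def label_regions(binary):
    """Stage-1 detection sketch: return candidate cell regions as
    4-connected components of a thresholded image (rows of 0/1 values),
    each region a list of (row, col) pixel coordinates."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for i in range(h):
        for j in range(w):
            if binary[i][j] and not seen[i][j]:
                queue = deque([(i, j)])
                seen[i][j] = True
                region = []
                while queue:  # breadth-first flood fill of one region
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions

# Toy thresholded frame with three separate bright regions
img = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 1, 0],
]
regions = label_regions(img)
```

    In the full pipeline, per-region features (area, shape, intensity statistics) would then feed the GMM classifier; clustered cells that survive this stage as one component are what the distance-transform/watershed second stage splits apart.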

  16. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.

  17. Large-scale high-throughput computer-aided discovery of advanced materials using cloud computing

    NASA Astrophysics Data System (ADS)

    Bazhirov, Timur; Mohammadi, Mohammad; Ding, Kevin; Barabash, Sergey

    Recent advances in cloud computing have made it possible to access large-scale computational resources completely on demand in a rapid and efficient manner. When combined with high-fidelity simulations, they serve as an alternative pathway to enable computational discovery and design of new materials through large-scale high-throughput screening. Here, we present a case study for a cloud platform implemented at Exabyte Inc. We perform calculations to screen lightweight ternary alloys for thermodynamic stability. Due to the lack of experimental data for most such systems, we rely on theoretical approaches based on first-principles pseudopotential density functional theory. We calculate the formation energies for a set of ternary compounds approximated by special quasirandom structures. During an example run we were able to scale to 10,656 CPUs within 7 minutes from the start and obtain results for 296 compounds within 38 hours. The results indicate that the formation enthalpy of ternary systems can be negative for some lightweight alloys, including Li and Mg compounds. We conclude that, compared to the traditional capital-intensive approach that requires on-premises hardware resources, cloud computing is agile and cost-effective, yet scalable and delivers similar performance.
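
    The stability screen above hinges on the formation enthalpy: the compound's total energy minus the energies of its elemental references, normalized per atom. A minimal sketch with entirely hypothetical energies (the composition and reference values below are made up for illustration):

```python
def formation_enthalpy_per_atom(e_total, composition, elemental_ref):
    """composition: {element: atom count}; elemental_ref: per-atom
    reference energies (e.g. eV/atom from the elemental ground states).
    A negative result means the compound is favored over decomposition
    into its elements."""
    n_atoms = sum(composition.values())
    e_ref = sum(n * elemental_ref[el] for el, n in composition.items())
    return (e_total - e_ref) / n_atoms

# Hypothetical Li-Mg-Al cell: 8 atoms, total energy -20.0 eV, with
# made-up elemental reference energies per atom.
dhf = formation_enthalpy_per_atom(
    e_total=-20.0,
    composition={"Li": 4, "Mg": 2, "Al": 2},
    elemental_ref={"Li": -1.90, "Mg": -1.50, "Al": -3.75},
)
```

    A high-throughput screen simply maps this function over the DFT total energies of all candidate structures (296 compounds in the run above) and keeps those with negative enthalpy for further study.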

  18. Extruder system for high-throughput/steady-state hydrogen ice supply and application for pellet fueling of reactor-scale fusion experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combs, S.K.; Foust, C.R.; Qualls, A.L.

    Pellet injection systems for the next-generation fusion devices, such as the proposed International Thermonuclear Experimental Reactor (ITER), will require feed systems capable of providing a continuous supply of hydrogen ice at high throughputs. A straightforward concept in which multiple extruder units operate in tandem has been under development at the Oak Ridge National Laboratory. A prototype with three large-volume extruder units has been fabricated and tested in the laboratory. In experiments, it was found that each extruder could provide volumetric ice flow rates of up to ∼1.3 cm³/s (for ∼10 s), which is sufficient for fueling fusion reactors at the gigawatt power level. With the three extruders of the prototype operating in sequence, a steady rate of ∼0.33 cm³/s was maintained for a duration of 1 h. Even steady-state rates approaching the full ITER design value (∼1 cm³/s) may be feasible with the prototype. However, additional extruder units (1–3) would facilitate operations at the higher throughputs and reduce the duty cycle of each unit. The prototype can easily accommodate steady-state pellet fueling of present large tokamaks or other near-term plasma experiments.

  19. Evaluating the Toxicity Pathways Using High-Throughput Environmental Chemical Data

    EPA Science Inventory

    The application of HTS methods to the characterization of human phenotypic response to environmental chemicals is a largely unexplored area of pharmacogenomics. The U.S. Environmental Protection Agency (EPA), through its ToxCast program, is developing predictive toxicity approach...

  20. High-throughput on-chip in vivo neural regeneration studies using femtosecond laser nano-surgery and microfluidics

    NASA Astrophysics Data System (ADS)

    Rohde, Christopher B.; Zeng, Fei; Gilleland, Cody; Samara, Chrysanthi; Yanik, Mehmet F.

    2009-02-01

    In recent years, the advantages of using small invertebrate animals as model systems for human disease have become increasingly apparent and have resulted in three Nobel Prizes in medicine or chemistry during the last six years for studies conducted on the nematode Caenorhabditis elegans (C. elegans). The availability of a wide array of species-specific genetic techniques, along with the transparency of the worm and its ability to grow in minute volumes, makes C. elegans an extremely powerful model organism. We present a suite of technologies for complex high-throughput whole-animal genetic and drug screens. We demonstrate a high-speed microfluidic sorter that can isolate and immobilize C. elegans in a well-defined geometry, an integrated chip containing individually addressable screening chambers for incubation and exposure of individual animals to biochemical compounds, and a device for delivery of compound libraries in standard multiwell plates to microfluidic devices. The immobilization stability obtained by these devices is comparable to that of chemical anesthesia, and the immobilization process does not affect lifespan, progeny production, or other aspects of animal health. This high stability enables the use of a variety of key optical techniques, which we demonstrate with femtosecond-laser nanosurgery and three-dimensional multiphoton microscopy. Used alone or in various combinations, these devices facilitate a variety of high-throughput whole-animal assays, including mutagenesis, RNAi, and drug screens at subcellular resolution, as well as high-throughput, high-precision manipulations such as femtosecond-laser nanosurgery for large-scale in vivo neural degeneration and regeneration studies.

  1. Combining high-throughput sequencing with fruit body surveys reveals contrasting life-history strategies in fungi

    PubMed Central

    Ovaskainen, Otso; Schigel, Dmitry; Ali-Kovero, Heini; Auvinen, Petri; Paulin, Lars; Nordén, Björn; Nordén, Jenni

    2013-01-01

    Before the recent revolution in molecular biology, field studies on fungal communities were mostly confined to fruit bodies, whereas mycelial interactions were studied in the laboratory. Here we combine high-throughput sequencing with a fruit body inventory to study simultaneously mycelial and fruit body occurrences in a community of fungi inhabiting dead wood of Norway spruce. We studied mycelial occurrence by extracting DNA from wood samples followed by 454-sequencing of the ITS1 and ITS2 regions and an automated procedure for species identification. In total, we detected 198 species as mycelia and 137 species as fruit bodies. The correlation between mycelial and fruit body occurrences was high for the majority of the species, suggesting that high-throughput sequencing can successfully characterize the dominant fungal communities, despite possible biases related to sampling, PCR, sequencing and molecular identification. We used the fruit body and molecular data to test hypothesized links between life-history and population-dynamic parameters. We show that species with a high average mycelial abundance also have a high fruiting rate and produce large fruit bodies, leading to a positive feedback loop in their population dynamics. Earlier studies have shown that species with specialized resource requirements are rarely seen fruiting, which is why they are often red-listed. We show with the help of high-throughput sequencing that some of these species are more abundant as mycelia in wood than would be expected from their occurrence as fruit bodies. PMID:23575372

  2. Research progress of plant population genomics based on high-throughput sequencing.

    PubMed

    Wang, Yun-sheng

    2016-08-01

    Population genomics, a new paradigm for population genetics, combines the concepts and techniques of genomics with the theoretical framework of population genetics, improving our understanding of microevolution through the identification of site-specific and genome-wide effects by genotyping genome-wide polymorphic sites. With the advent and improvement of next-generation high-throughput sequencing technology, the number of plant species with complete genome sequences has increased rapidly, and large-scale resequencing has also been carried out in recent years. Parallel sequencing has also been performed in some plant species without complete genome sequences. These studies have greatly promoted the development of population genomics and deepened our understanding of genetic diversity, levels of linkage disequilibrium, selection effects, demographic history and the molecular mechanisms of complex traits in plant populations at the genomic level. In this review, I briefly introduce the concepts and research methods of population genomics and summarize the research progress of plant population genomics based on high-throughput sequencing. I also discuss the prospects and existing problems of plant population genomics in order to provide a reference for related studies.

  3. Development and application of a fluorescent glucose uptake assay for the high-throughput screening of non-glycoside SGLT2 inhibitors.

    PubMed

    Wu, Szu-Huei; Yao, Chun-Hsu; Hsieh, Chieh-Jui; Liu, Yu-Wei; Chao, Yu-Sheng; Song, Jen-Shin; Lee, Jinq-Chyi

    2015-07-10

    Sodium-dependent glucose co-transporter 2 (SGLT2) inhibitors are of current interest as a treatment for type 2 diabetes. Efforts have been made to discover phlorizin-related glycosides with good SGLT2 inhibitory activity. To increase structural diversity and better understand the role of non-glycoside SGLT2 inhibitors in glycemic control, we initiated a research program to identify non-glycoside hits from high-throughput screening. Here, we report the development of a novel, fluorogenic probe-based glucose uptake assay based on a Cu(I)-catalyzed [3+2] cycloaddition. Its safer procedures and cheaper reagents made the developed assay our first choice for large-scale primary screening over the well-known [(14)C]-labeled α-methyl-D-glucopyranoside ([(14)C]-AMG) radioactive assay. This effort culminated in the identification of a benzimidazole non-glycoside SGLT2 hit with an EC50 value of 0.62 μM through high-throughput screening of 41,000 compounds. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. High-throughput imaging of adult fluorescent zebrafish with an LED fluorescence macroscope

    PubMed Central

    Blackburn, Jessica S; Liu, Sali; Raimondi, Aubrey R; Ignatius, Myron S; Salthouse, Christopher D; Langenau, David M

    2011-01-01

    Zebrafish are a useful vertebrate model for the study of development, behavior, disease and cancer. A major advantage of zebrafish is that large numbers of animals can be economically used for experimentation; however, high-throughput methods for imaging live adult zebrafish had not been developed. Here, we describe protocols for building a light-emitting diode (LED) fluorescence macroscope and for using it to simultaneously image up to 30 adult animals that transgenically express a fluorescent protein, are transplanted with fluorescently labeled tumor cells or are tagged with fluorescent elastomers. These protocols show that the LED fluorescence macroscope is capable of distinguishing five fluorescent proteins and can image unanesthetized swimming adult zebrafish in multiple fluorescent channels simultaneously. The macroscope can be built and used for imaging within 1 day, whereas creating fluorescently labeled adult zebrafish requires 1 hour to several months, depending on the method chosen. The LED fluorescence macroscope provides a low-cost, high-throughput method to rapidly screen adult fluorescent zebrafish and it will be useful for imaging transgenic animals, screening for tumor engraftment, and tagging individual fish for long-term analysis. PMID:21293462

  5. Projection Exposure with Variable Axis Immersion Lenses: A High-Throughput Electron Beam Approach to “Suboptical” Lithography

    NASA Astrophysics Data System (ADS)

    Pfeiffer, Hans

    1995-12-01

    IBM's high-throughput e-beam stepper approach PRojection Exposure with Variable Axis Immersion Lenses (PREVAIL) is reviewed. The PREVAIL concept combines technology building blocks of our probe-forming EL-3 and EL-4 systems with the exposure efficiency of pattern projection. The technology represents an extension of the shaped-beam approach toward massively parallel pixel projection. As demonstrated, the use of variable-axis lenses can provide large field coverage through reduction of the off-axis aberrations which limit the performance of conventional projection systems. Subfield pattern sections containing 10⁷ or more pixels can be electronically selected (mask plane), projected and positioned (wafer plane) at high speed. To generate the entire chip pattern, subfields must be stitched together sequentially in a combination of electronic and mechanical positioning of mask and wafer. The PREVAIL technology promises throughput levels competitive with those of optical steppers at superior resolution. The PREVAIL project is being pursued to demonstrate the viability of the technology and to develop an e-beam alternative to “suboptical” lithography.

  6. Lateral Temperature-Gradient Method for High-Throughput Characterization of Material Processing by Millisecond Laser Annealing.

    PubMed

    Bell, Robert T; Jacobs, Alan G; Sorg, Victoria C; Jung, Byungki; Hill, Megan O; Treml, Benjamin E; Thompson, Michael O

    2016-09-12

    A high-throughput method for characterizing the temperature dependence of material properties following microsecond to millisecond thermal annealing, exploiting the temperature gradients created by a lateral gradient laser spike anneal (lgLSA), is presented. Laser scans generate spatial thermal gradients of up to 5 °C/μm with peak temperatures ranging from ambient to in excess of 1400 °C, limited only by laser power and materials' thermal limits. Discrete spatial property measurements across the temperature gradient are then equivalent to independent measurements after varying temperature anneals. Accurate temperature calibration is essential to quantitative analysis, and methods for characterizing both peak temperature and spatial/temporal temperature profiles are presented. These include absolute temperature calibrations based on melting and thermal decomposition, and time-resolved profiles measured using platinum thermistors. A variety of spatially resolved measurement probes, ranging from point-like continuous profiling to large-area sampling, are discussed. Examples from annealing of III-V semiconductors, CdSe quantum dots, low-κ dielectrics, and block copolymers are included to demonstrate the flexibility, high throughput, and precision of this technique.

  7. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide

    PubMed Central

    Bae, Hyeonhu; Park, Minwoo; Jang, Byungryul; Kang, Yura; Park, Jinwoo; Lee, Hosik; Chung, Haegeun; Chung, ChiHye; Hong, Suklyun; Kwon, Yongkyung; Yakobson, Boris I.; Lee, Hoonkyung

    2016-01-01

    Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first-principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures (~10⁻³ bar) at 300 K and release it at ~450 K. CO2 binding to elements involves hybridization of the metal d orbitals with the CO2 π orbitals and CO2-transition metal complexes were observed in experiments. This result allows us to perform high-throughput screening to discover novel, promising CO2 capture materials with empty d orbitals (e.g., Sc– or V–porphyrin-like graphene) and predict their capture performance under various conditions. Moreover, these findings provide physical insights into selective CO2 capture and open a new path to explore CO2 capture materials. PMID:26902156
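
    The capture-at-300 K / release-at-450 K window described above can be illustrated with a simple Langmuir-type occupancy model; the binding energy and entropy penalty used below are illustrative assumptions, not values from the study:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K


def co2_occupancy(temp_k, pressure_bar, dE=-0.9, dS=2.2e-3):
    """Langmuir-type CO2 site occupancy at a given temperature and pressure.

    dE: assumed binding energy (eV, negative = bound state favored)
    dS: assumed entropy penalty for binding gaseous CO2 (eV/K)
    Both parameters are illustrative, not taken from the paper.
    """
    dG = dE + temp_k * dS                 # free energy of adsorption
    k_eq = math.exp(-dG / (K_B * temp_k))  # equilibrium constant
    x = k_eq * pressure_bar               # activity at this partial pressure
    return x / (1.0 + x)                  # Langmuir isotherm

# Flue-gas-like CO2 partial pressure of ~1e-3 bar:
print(co2_occupancy(300, 1e-3))  # high occupancy: capture
print(co2_occupancy(450, 1e-3))  # low occupancy: release
```

    With these assumed parameters the sites are mostly occupied at 300 K and nearly empty at 450 K, reproducing the qualitative capture/release behavior the abstract reports.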

  8. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide.

    PubMed

    Bae, Hyeonhu; Park, Minwoo; Jang, Byungryul; Kang, Yura; Park, Jinwoo; Lee, Hosik; Chung, Haegeun; Chung, ChiHye; Hong, Suklyun; Kwon, Yongkyung; Yakobson, Boris I; Lee, Hoonkyung

    2016-02-23

    Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first-principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures (~10⁻³ bar) at 300 K and release it at ~450 K. CO2 binding to elements involves hybridization of the metal d orbitals with the CO2 π orbitals and CO2-transition metal complexes were observed in experiments. This result allows us to perform high-throughput screening to discover novel, promising CO2 capture materials with empty d orbitals (e.g., Sc- or V-porphyrin-like graphene) and predict their capture performance under various conditions. Moreover, these findings provide physical insights into selective CO2 capture and open a new path to explore CO2 capture materials.

  9. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide

    NASA Astrophysics Data System (ADS)

    Bae, Hyeonhu; Park, Minwoo; Jang, Byungryul; Kang, Yura; Park, Jinwoo; Lee, Hosik; Chung, Haegeun; Chung, Chihye; Hong, Suklyun; Kwon, Yongkyung; Yakobson, Boris I.; Lee, Hoonkyung

    2016-02-01

    Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first-principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures (~10⁻³ bar) at 300 K and release it at ~450 K. CO2 binding to elements involves hybridization of the metal d orbitals with the CO2 π orbitals and CO2-transition metal complexes were observed in experiments. This result allows us to perform high-throughput screening to discover novel, promising CO2 capture materials with empty d orbitals (e.g., Sc- or V-porphyrin-like graphene) and predict their capture performance under various conditions. Moreover, these findings provide physical insights into selective CO2 capture and open a new path to explore CO2 capture materials.

  10. Development and use of molecular markers: past and present.

    PubMed

    Grover, Atul; Sharma, P C

    2016-01-01

    Molecular markers, owing to their stability, cost-effectiveness and ease of use, provide an immensely popular tool for a variety of applications including genome mapping, gene tagging, genetic diversity analysis, phylogenetic analysis and forensic investigations. In the last three decades, a number of molecular marker techniques have been developed and exploited worldwide in different systems. However, only a handful of these techniques, namely RFLPs, RAPDs, AFLPs, ISSRs, SSRs and SNPs, have received global acceptance. The recent revolution in DNA sequencing techniques has taken the discovery and application of molecular markers to high-throughput and ultrahigh-throughput levels. Although the choice of marker will obviously depend on the targeted use, microsatellites, SNPs and genotyping by sequencing (GBS) largely fulfill most user requirements. Further, modern transcriptomic and functional markers, in combination with other high-throughput techniques, will drive high-density genetic map construction, identification of QTLs, and breeding and conservation strategies in the years to come. This review presents an overview of different marker technologies and their variants with a comparative account of their characteristic features and applications.

  11. A high-throughput method for GMO multi-detection using a microfluidic dynamic array.

    PubMed

    Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J

    2014-02-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods to enforce genetically modified organism (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly as the number of GMOs potentially present in an individual sample grows. The present work reports the results of an innovative approach to DNA-based analysis of genetically modified crops: the use of a microfluidic dynamic array as a high-throughput multi-detection system. To evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed on the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested gave the expected results. The power of low-level detection was assessed, and elements present at concentrations as low as 0.06% were successfully detected. The approach proposed in this work establishes the Fluidigm system as a suitable and promising platform for GMO multi-detection.

  12. High-Throughput Assay and Discovery of Small Molecules that Interrupt Malaria Transmission

    PubMed Central

    Plouffe, David M.; Wree, Melanie; Du, Alan Y.; Meister, Stephan; Li, Fengwu; Patra, Kailash; Lubar, Aristea; Okitsu, Shinji L.; Flannery, Erika L.; Kato, Nobutaka; Tanaseichuk, Olga; Comer, Eamon; Zhou, Bin; Kuhen, Kelli; Zhou, Yingyao; Leroy, Didier; Schreiber, Stuart L.; Scherer, Christina A.; Vinetz, Joseph; Winzeler, Elizabeth A.

    2016-01-01

    Preventing transmission is an important element of malaria control. However, most currently available methods to assay for malaria transmission blocking are relatively low throughput and cannot be applied to large chemical libraries. We have developed a high-throughput and cost-effective assay, the Saponin-lysis Sexual Stage Assay (SaLSSA), for identifying small molecules with transmission-blocking capacity. SaLSSA analysis of 13,983 unique compounds uncovered that >90% of well-characterized antimalarials, including endoperoxides and 4-aminoquinolines, as well as compounds active against asexual blood stages, lost most of their killing activity when parasites developed into metabolically quiescent stage V gametocytes. On the other hand, we identified compounds with consistent low-nanomolar transmission-blocking activity, some of which showed cross-reactivity against asexual blood and liver stages. The data clearly emphasize substantial physiological differences between sexual and asexual parasites and provide a tool and starting points for the discovery and development of transmission-blocking drugs. PMID:26749441

  13. Line-Focused Optical Excitation of Parallel Acoustic Focused Sample Streams for High Volumetric and Analytical Rate Flow Cytometry.

    PubMed

    Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W

    2017-09-19

    Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.
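
    One rationale for parallelization in the abstract above is that, at a fixed total event rate, splitting the sample across N streams lowers the per-stream rate and hence the chance of two particles coinciding in one interrogation window. A minimal sketch assuming Poisson arrivals; the 10 μs transit window is an assumed value for illustration, not a figure from the paper:

```python
import math


def coincidence_fraction(total_rate_hz, n_streams, window_s=10e-6):
    """Fraction of events that arrive within one transit window of another
    event in the same stream, assuming Poisson arrivals (illustrative)."""
    per_stream_rate = total_rate_hz / n_streams
    return 1.0 - math.exp(-per_stream_rate * window_s)

# 100K events/s pushed through one stream vs. 16 parallel streams:
print(coincidence_fraction(100_000, 1))   # severe coincidence losses
print(coincidence_fraction(100_000, 16))  # losses drop by ~10x
```

    The same total analytical rate that would swamp a single focused stream becomes tractable when divided across 16 acoustically focused analysis points.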

  14. High-radiance LDP source for mask inspection and beam line applications (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Teramoto, Yusuke; Santos, Bárbara; Mertens, Guido; Kops, Ralf; Kops, Margarete; von Wezyk, Alexander; Bergmann, Klaus; Yabuta, Hironobu; Nagano, Akihisa; Ashizawa, Noritaka; Taniguchi, Yuta; Yamatani, Daiki; Shirai, Takahiro; Kasama, Kunihiko

    2017-04-01

    High-throughput actinic mask inspection tools are needed as EUVL enters the volume-production phase. One of the key technologies to realize such inspection tools is a high-radiance EUV source whose radiance must be as high as 100 W/mm²/sr. Ushio is developing laser-assisted discharge-produced plasma (LDP) sources. Ushio's LDP source provides sufficient radiance as well as cleanliness, stability and reliability. Radiance behind the debris mitigation system was confirmed to be 120 W/mm²/sr at 9 kHz, and peak radiance at the plasma was increased to over 200 W/mm²/sr in recent development, which supports high-throughput, high-precision mask inspection in current and future technology nodes. One of the unique features of Ushio's LDP source is cleanliness. Cleanliness evaluation using both grazing-incidence Ru mirrors and normal-incidence Mo/Si mirrors showed no considerable damage to the mirrors other than smooth sputtering of the surface at a pace of a few nm per gigapulse. To prove the system's reliability, several long-term tests were performed. Data recorded during the tests were analyzed to assess two-dimensional radiance stability. In addition, several operating parameters were monitored to determine which contribute to radiance stability. The latest model, which features a large opening angle, was recently developed so that a tool can utilize a large number of debris-free photons behind the debris shield. The model was designed for both beam-line applications and high-throughput mask inspection. At the time of publication, the first product is expected to be in use at a customer site.

  15. Developing science gateways for drug discovery in a grid environment.

    PubMed

    Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra

    2016-01-01

    Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, providing the life-sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources becomes an increasing challenge. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service applicable to large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating throughput by several orders of magnitude.

  16. High-speed Fourier ptychographic microscopy based on programmable annular illuminations.

    PubMed

    Sun, Jiasong; Zuo, Chao; Zhang, Jialin; Fan, Yao; Chen, Qian

    2018-05-16

    High-throughput quantitative phase imaging (QPI) is essential to cellular phenotype characterization as it allows high-content cell analysis and avoids adverse effects of staining reagents on cellular viability and cell signaling. Among different approaches, Fourier ptychographic microscopy (FPM) is probably the most promising technique to realize high-throughput QPI by synthesizing a wide-field, high-resolution complex image from multiple angle-variably illuminated, low-resolution images. However, the large dataset requirement in conventional FPM significantly limits its imaging speed, resulting in low temporal throughput. Moreover, the underlying theoretical mechanism as well as the optimum illumination scheme for high-accuracy phase imaging in FPM remains unclear. Herein, we report a high-speed FPM technique based on programmable annular illuminations (AIFPM). The optical-transfer-function (OTF) analysis of FPM reveals that the low-frequency phase information can only be correctly recovered if the LEDs are precisely located at the edge of the objective numerical aperture (NA) in the frequency space. By using only 4 low-resolution images corresponding to 4 tilted illuminations matching a 10×, 0.4 NA objective, we present high-speed imaging results of in vitro HeLa cell mitosis and apoptosis at a frame rate of 25 Hz with a full-pitch resolution of 655 nm at a wavelength of 525 nm (effective NA = 0.8) across a wide field-of-view (FOV) of 1.77 mm², corresponding to a space-bandwidth-time product of 411 megapixels per second. Our work reveals an important capability of FPM towards high-speed, high-throughput imaging of in vitro live cells, achieving video-rate QPI performance across a wide range of scales, both spatial and temporal.
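
    The quoted space-bandwidth-time product can be sanity-checked from the other reported numbers, assuming Nyquist sampling at half the full-pitch resolution:

```python
def space_bandwidth_time(fov_mm2, full_pitch_res_nm, frame_rate_hz):
    """Resolvable megapixels per second, assuming a Nyquist pixel pitch
    of half the full-pitch resolution (back-of-envelope check)."""
    pixel_nm = full_pitch_res_nm / 2.0      # Nyquist pixel pitch
    fov_nm2 = fov_mm2 * 1e12                # mm^2 -> nm^2
    pixels = fov_nm2 / pixel_nm ** 2        # resolvable pixels per frame
    return pixels * frame_rate_hz / 1e6     # megapixels per second

# 1.77 mm^2 FOV, 655 nm full-pitch resolution, 25 Hz frame rate:
print(space_bandwidth_time(1.77, 655, 25))  # ~413, close to the quoted 411
```

    The back-of-envelope result lands within about half a percent of the 411 megapixels/s quoted in the abstract, consistent with the stated resolution, FOV, and frame rate.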

  17. Medium-throughput processing of whole mount in situ hybridisation experiments into gene expression domains.

    PubMed

    Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes

    2012-01-01

    Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution and high-throughput image-processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
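
    The boundary-determination step can be sketched as follows. This toy version substitutes linear interpolation for the spline approximation used in the actual pipeline, and the gradient profile is synthetic data invented for illustration:

```python
import math


def boundary_position(positions, intensities, frac=0.5):
    """Position where an expression profile first drops below `frac` of
    its maximum, posterior to the peak. Linear interpolation between
    samples stands in for the pipeline's spline fit."""
    peak = max(range(len(intensities)), key=intensities.__getitem__)
    thresh = frac * intensities[peak]
    for i in range(peak, len(intensities) - 1):
        y0, y1 = intensities[i], intensities[i + 1]
        if y0 >= thresh > y1:
            t = (y0 - thresh) / (y0 - y1)  # fraction between samples
            return positions[i] + t * (positions[i + 1] - positions[i])
    return None  # profile never crosses the threshold

# Synthetic anterior-high gradient sampled along % egg length:
xs = list(range(0, 101, 5))
ys = [1.0 / (1.0 + math.exp((x - 40) / 5.0)) for x in xs]
print(boundary_position(xs, ys))  # boundary near 40% egg length
```

    Repeating this per gene and per time point yields the sets of boundary positions the pipeline sorts and analyses for variability and dynamics.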

  18. Biases in the Experimental Annotations of Protein Function and Their Effect on Our Understanding of Protein Function Space

    PubMed Central

    Schnoes, Alexandra M.; Ream, David C.; Thorman, Alexander W.; Babbitt, Patricia C.; Friedberg, Iddo

    2013-01-01

    The ongoing functional annotation of proteins relies upon the work of curators to capture experimental findings from the scientific literature and apply them to protein sequence and structure data. However, with the increasing use of high-throughput experimental assays, a small number of experimental studies dominate the functional protein annotations collected in databases. Here, we investigate just how prevalent the “few articles - many proteins” phenomenon is. We examine the experimentally validated annotation of proteins provided by several groups in the GO Consortium, and show that the distribution of proteins per published study is exponential, with 0.14% of articles providing the source of annotations for 25% of the proteins in the UniProt-GOA compilation. Since each of the dominant articles describes the use of an assay that can find only one function or a small group of functions, this leads to substantial biases in what we know about the function of many proteins. Mass spectrometry, microscopy and RNAi experiments dominate high-throughput experiments. Consequently, the functional information derived from these experiments mostly concerns the subcellular location of proteins and the participation of proteins in embryonic developmental pathways. For some organisms, the information provided by different studies overlaps substantially. We also show that the information provided by high-throughput experiments is less specific than that provided by low-throughput experiments. Given the experimental techniques available, certain biases in protein function annotation due to high-throughput experiments are unavoidable. Knowing that these biases exist and understanding their characteristics and extent is important for database curators, developers of function annotation programs, and anyone who uses protein function annotation data to plan experiments. PMID:23737737
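
    The "0.14% of articles provide 25% of annotations" statistic is a concentration measure: the smallest fraction of articles, taken in descending order of annotation count, that covers a target share of all annotations. A sketch on synthetic counts (the distribution below is invented for illustration, not the UniProt-GOA data):

```python
def fraction_of_articles_covering(counts, target_share=0.25):
    """Smallest fraction of articles (sorted by annotation count,
    descending) whose annotations sum to `target_share` of the total."""
    ordered = sorted(counts, reverse=True)
    total = sum(ordered)
    running = 0
    for i, c in enumerate(ordered, start=1):
        running += c
        if running >= target_share * total:
            return i / len(ordered)
    return 1.0

# Skewed synthetic distribution: one mega-study plus many small ones.
counts = [500] + [1] * 999
print(fraction_of_articles_covering(counts))  # tiny fraction dominates
```

    On a heavily skewed distribution like the one above, a fraction of articles far below 1% already accounts for a quarter of all annotations, mirroring the pattern the study reports.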

  19. A conifer-friendly high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis

    NASA Astrophysics Data System (ADS)

    Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.

    2012-12-01

    Wood stable isotope ratios (δ13C and δ18O) offer insight into water sources and plant water-use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell-wall deposition in wood rings differ in their discrimination between light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin-exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin-exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.
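
    The δ13C and δ18O values discussed above follow standard delta notation, which expresses an isotope ratio relative to a reference standard in per mil. A minimal sketch; the sample ratio below is hypothetical, while the VPDB value is the conventional 13C/12C reference:

```python
def delta_permil(r_sample, r_standard):
    """Delta notation for stable isotope ratios, in per mil (1/1000):
    e.g. delta13C with R = 13C/12C measured against the VPDB standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.0112372  # conventional 13C/12C ratio of the VPDB standard

# A hypothetical sample depleted in 13C by 2.5% relative to VPDB:
print(delta_permil(R_VPDB * 0.975, R_VPDB))  # -25 per mil, typical of C3 plants
```

    Negative δ13C values in this range are characteristic of C3 plants such as loblolly pine, and drought-driven shifts in WUE appear as small year-to-year changes in these values.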

  20. High Throughput, Real-time, Dual-readout Testing of Intracellular Antimicrobial Activity and Eukaryotic Cell Cytotoxicity

    PubMed Central

    Chiaraviglio, Lucius; Kang, Yoon-Suk; Kirby, James E.

    2016-01-01

    Traditional measures of intracellular antimicrobial activity and eukaryotic cell cytotoxicity rely on endpoint assays. Such endpoint assays require several additional experimental steps prior to readout, such as cell lysis, colony forming unit determination, or reagent addition. When performing thousands of assays, for example, during high-throughput screening, the downstream effort required for these types of assays is considerable. Therefore, to facilitate high-throughput antimicrobial discovery, we developed a real-time assay to simultaneously identify inhibitors of intracellular bacterial growth and assess eukaryotic cell cytotoxicity. Specifically, real-time intracellular bacterial growth detection was enabled by marking bacterial screening strains with either a bacterial lux operon (1st generation assay) or fluorescent protein reporters (2nd generation, orthogonal assay). A non-toxic, cell-membrane-impermeant, nucleic acid-binding dye was also added during initial infection of macrophages. These dyes are excluded from viable cells. However, non-viable host cells lose membrane integrity, permitting entry and fluorescent labeling of nuclear DNA (deoxyribonucleic acid). Notably, DNA binding is associated with a large increase in fluorescence quantum yield that provides a solution-based readout of host cell death. We have used this combined assay to perform a high-throughput screen in microplate format, and to assess intracellular growth and cytotoxicity by microscopy. Notably, antimicrobials may demonstrate synergy, in which the combined effect of two or more antimicrobials applied together is greater than when they are applied separately. Testing for in vitro synergy against intracellular pathogens is normally a prodigious task, as combinatorial permutations of antibiotics at different concentrations must be assessed. However, we found that our real-time assay combined with automated, digital dispensing technology permitted facile synergy testing. Using these approaches, we were able to systematically survey the action of a large number of antimicrobials alone and in combination against the intracellular pathogen Legionella pneumophila. PMID:27911388

  1. Review of microfluidic microbioreactor technology for high-throughput submerged microbiological cultivation

    PubMed Central

    Hegab, Hanaa M.; ElMekawy, Ahmed; Stakenborg, Tim

    2013-01-01

Microbial fermentation process development pursues high production yields. This requires high-throughput screening and optimization of microbial strains, which is nowadays commonly achieved by slow and labor-intensive submerged cultivation in shake flasks or microtiter plates. These methods are also limited to end-point measurements, low analytical data output, and little control over the fermentation process. These drawbacks could be overcome by means of scaled-down microfluidic microbioreactors (μBR) that allow online control over cultivation data and automation, hence reducing cost and time. This review goes beyond previous work by providing a detailed update on current μBR fabrication techniques and by comparing the operation and control of μBRs to large-scale fermentation reactors. PMID:24404006

  2. Predicting Novel Bulk Metallic Glasses via High-Throughput Calculations

    NASA Astrophysics Data System (ADS)

    Perim, E.; Lee, D.; Liu, Y.; Toher, C.; Gong, P.; Li, Y.; Simmons, W. N.; Levy, O.; Vlassak, J.; Schroers, J.; Curtarolo, S.

    Bulk metallic glasses (BMGs) are materials which may combine key properties of crystalline metals, such as high hardness, with others typically presented by plastics, such as easy processability. However, the cost of the known BMGs poses a significant obstacle to the development of applications, which has led to a long search for novel, economically viable BMGs. The emergence of high-throughput DFT calculations, such as the library provided by the AFLOWLIB consortium, has provided new tools for materials discovery. We have used these data to develop a new glass-forming descriptor combining structural factors with thermodynamics in order to quickly screen a large number of alloy systems in the AFLOWLIB database, identifying the most promising systems and the optimal compositions for glass formation. National Science Foundation (DMR-1436151, DMR-1435820, DMR-1436268).

  3. pH measurement and a rational and practical pH control strategy for high throughput cell culture system.

    PubMed

    Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli

    2010-01-01

    The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as the scale down model for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(-and 6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO(2), HCO(3)(-), and the buffer system, i.e., HEPES, we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the required %CO(2) from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work shed light on mini-bioreactor scale down model construction and paved the way for cell culture process development to improve productivity or product quality using high throughput systems. Copyright 2009 American Institute of Chemical Engineers
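The CO2/bicarbonate equilibrium underlying the abstract's pH-control model can be illustrated with the Henderson-Hasselbalch relation. The sketch below is a minimal illustration, not the Eli Lilly model itself (which also accounts for HEPES buffering); it assumes textbook constants for 37 °C and a water-saturated incubator atmosphere:

```python
import math

# Textbook constants for the bicarbonate buffer at 37 degC (assumed values)
PKA = 6.1            # apparent pKa of the CO2/HCO3- system
S_CO2 = 0.0307       # CO2 solubility, mmol/L per mmHg
P_GAS = 760.0 - 47.0 # dry-gas pressure in a humidified incubator, mmHg

def required_co2_percent(ph_target, hco3_mm):
    """%CO2 incubator setpoint that holds ph_target for a given bicarbonate (mM).

    Henderson-Hasselbalch: pH = pKa + log10([HCO3-] / (s * pCO2)),
    solved here for pCO2 and converted to a gas-phase percentage.
    """
    pco2 = hco3_mm / (S_CO2 * 10 ** (ph_target - PKA))
    return 100.0 * pco2 / P_GAS

def required_bicarbonate_mm(ph_target, co2_percent):
    """Bicarbonate (mM) needed to reach ph_target at a fixed incubator %CO2."""
    pco2 = co2_percent / 100.0 * P_GAS
    return S_CO2 * pco2 * 10 ** (ph_target - PKA)
```

With 24 mM bicarbonate and a pH 7.4 target, the model returns roughly the familiar 5% CO2 setpoint, which is a useful sanity check on the constants.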

  4. High Throughput Screening of Toxicity Pathways Perturbed by Environmental Chemicals

    EPA Science Inventory

    Toxicology, a field largely unchanged over the past several decades, is undergoing a significant transformation driven by a number of forces – the increasing number of chemicals needing assessment, changing legal requirements, advances in biology and computer science, and concern...

  5. Overview of ToxCast™

    EPA Science Inventory

    In 2007, EPA launched ToxCast™ in order to develop a cost-effective approach for prioritizing the toxicity testing of large numbers of chemicals in a short period of time. Using data from state-of-the-art high throughput screening (HTS) bioassays developed in the pharmaceutical i...

  6. Phenotypic screening for developmental neurotoxicity: mechanistic data at the level of the cell

    EPA Science Inventory

    There are large numbers of environmental chemicals with little or no available information on their toxicity, including developmental neurotoxicity. Because of the resource-intensive nature of traditional animal tests, high-throughput (HTP) methods that can rapidly evaluate chemi...

  7. Protocols and programs for high-throughput growth and aging phenotyping in yeast.

    PubMed

    Jung, Paul P; Christian, Nils; Kay, Daniel P; Skupin, Alexander; Linster, Carole L

    2015-01-01

    In microorganisms, and more particularly in yeasts, a standard phenotyping approach consists in the analysis of fitness by growth rate determination in different conditions. One growth assay that combines high throughput with high resolution involves the generation of growth curves from 96-well plate microcultivations in thermostated and shaking plate readers. To push the throughput of this method to the next level, we have adapted it in this study to the use of 384-well plates. The values of the extracted growth parameters (lag time, doubling time and yield of biomass) correlated well between experiments carried out in 384-well plates as compared to 96-well plates or batch cultures, validating the higher-throughput approach for phenotypic screens. The method is not restricted to the use of the budding yeast Saccharomyces cerevisiae, as shown by consistent results for other species selected from the Hemiascomycete class. Furthermore, we used the 384-well plate microcultivations to develop and validate a higher-throughput assay for yeast Chronological Life Span (CLS), a parameter that is still commonly determined by a cumbersome method based on counting "Colony Forming Units". To accelerate analysis of the large datasets generated by the described growth and aging assays, we developed the freely available software tools GATHODE and CATHODE. These tools allow for semi-automatic determination of growth parameters and CLS behavior from typical plate reader output files. The described protocols and programs will increase the time- and cost-efficiency of a number of yeast-based systems genetics experiments as well as various types of screens.
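The growth parameters the assay extracts (lag time, doubling time, yield of biomass) can be estimated from a plate-reader OD series by a sliding log-linear fit, as tools of this kind commonly do. The following is a simplified stand-in for the GATHODE program, not its actual algorithm:

```python
import math

def growth_parameters(t, od, window=5):
    """Extract (lag time, doubling time, biomass yield) from a growth curve.

    Fits a least-squares line to log2(OD) over a sliding window and takes
    the steepest slope as the maximum specific growth rate.
    """
    log_od = [math.log2(v) for v in od]
    best_slope, best_icept = 0.0, 0.0
    for i in range(len(t) - window + 1):
        xs, ys = t[i:i + window], log_od[i:i + window]
        mx, my = sum(xs) / window, sum(ys) / window
        sxx = sum((x - mx) ** 2 for x in xs)
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
        if slope > best_slope:
            best_slope, best_icept = slope, my - slope * mx
    doubling_time = 1.0 / best_slope  # slope is in doublings per unit time
    # Lag: where the tangent through the exponential phase crosses initial OD
    lag = (log_od[0] - best_icept) / best_slope
    return lag, doubling_time, max(od)
```

On a synthetic curve with a 2 h lag, 1.5 h doubling time, and a carrying capacity of OD 1.0, the routine recovers all three parameters.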

  8. Strategic and Operational Plan for Integrating Transcriptomics ...

    EPA Pesticide Factsheets

    Plans for incorporating high throughput transcriptomics into the current high throughput screening activities at NCCT; details are in the attached slide presentation, given at the OECD meeting on June 23, 2016.

  9. High-Throughput Experimental Approach Capabilities | Materials Science |

    Science.gov Websites

    NREL's high-throughput experimental approach capabilities include combinatorial thin-film synthesis chambers, such as oxysulfide sputtering and Combi-5 for nitride and oxynitride sputtering.

  10. Polymorphism discovery and allele frequency estimation using high-throughput DNA sequencing of target-enriched pooled DNA samples

    PubMed Central

    2012-01-01

    Background The central role of the somatotrophic axis in animal post-natal growth, development and fertility is well established. Therefore, the identification of genetic variants affecting quantitative traits within this axis is an attractive goal. However, large sample numbers are a pre-requisite for the identification of genetic variants underlying complex traits and although technologies are improving rapidly, high-throughput sequencing of large numbers of complete individual genomes remains prohibitively expensive. Therefore using a pooled DNA approach coupled with target enrichment and high-throughput sequencing, the aim of this study was to identify polymorphisms and estimate allele frequency differences across 83 candidate genes of the somatotrophic axis, in 150 Holstein-Friesian dairy bulls divided into two groups divergent for genetic merit for fertility. Results In total, 4,135 SNPs and 893 indels were identified during the resequencing of the 83 candidate genes. Nineteen percent (n = 952) of variants were located within 5' and 3' UTRs. Seventy-two percent (n = 3,612) were intronic and 9% (n = 464) were exonic, including 65 indels and 236 SNPs resulting in non-synonymous substitutions (NSS). Significant (P < 0.01) mean allele frequency differentials between the low and high fertility groups were observed for 720 SNPs (58 NSS). Allele frequencies for 43 of the SNPs were also determined by genotyping the 150 individual animals (Sequenom® MassARRAY). No significant differences (P > 0.1) were observed between the two methods for any of the 43 SNPs across both pools (i.e., 86 tests in total). Conclusions The results of the current study support previous findings of the use of DNA sample pooling and high-throughput sequencing as a viable strategy for polymorphism discovery and allele frequency estimation. 
Using this approach we have characterised the genetic variation within genes of the somatotrophic axis and related pathways, central to mammalian post-natal growth and development and subsequent lactogenesis and fertility. We have identified a large number of variants segregating at significantly different frequencies between cattle groups divergent for calving interval plausibly harbouring causative variants contributing to heritable variation. To our knowledge, this is the first report describing sequencing of targeted genomic regions in any livestock species using groups with divergent phenotypes for an economically important trait. PMID:22235840
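Allele frequency estimation from pooled reads, and a test for a frequency differential between two pools, can be sketched as below. This is a plain two-proportion z-test for illustration, not the study's actual statistical pipeline, and it ignores the extra variance introduced by unequal individual contributions to a DNA pool:

```python
import math

def pool_allele_freq(alt_reads, total_reads):
    """Allele frequency estimate from pooled-DNA read counts at one site."""
    return alt_reads / total_reads

def two_pool_z_test(alt1, n1, alt2, n2):
    """Two-proportion z-test for an allele-frequency differential between
    two sequenced pools (treats reads as independent draws)."""
    p1, p2 = alt1 / n1, alt2 / n2
    p = (alt1 + alt2) / (n1 + n2)          # pooled frequency under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value
```

For example, 30% versus 20% alternate-allele reads at 1000x coverage per pool yields a clearly significant differential, while identical counts give z = 0.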

  11. Automated Purification of Recombinant Proteins: Combining High-throughput with High Yield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Chiann Tso; Moore, Priscilla A.; Auberry, Deanna L.

    2006-05-01

    Protein crystallography, mapping of protein interactions, and other current functional genomics approaches require not only purifying large numbers of proteins but also obtaining sufficient yield and homogeneity for downstream high-throughput applications. Robust, automated, high-throughput protein expression and purification processes are needed to meet these requirements. We developed and compared two alternative workflows for automated purification of recombinant proteins based on expression of bacterial genes in Escherichia coli: first, a filtration separation protocol based on expression in 800 mL E. coli cultures followed by filtration purification using Ni2+-NTA Agarose (Qiagen); second, a smaller-scale magnetic separation method based on expression in 25 mL cultures of E. coli followed by 96-well purification on MagneHis Ni2+ Agarose (Promega). Both workflows provided comparable average yields of about 8 μg of purified protein per unit of OD at 600 nm of bacterial culture. We discuss advantages and limitations of the automated workflows, which can provide proteins more than 90% pure in the range of 100 μg to 45 mg per purification run, as well as strategies for optimizing these protocols.

  12. A high-throughput media design approach for high performance mammalian fed-batch cultures

    PubMed Central

    Rouiller, Yolande; Périlleux, Arnaud; Collet, Natacha; Jordan, Martin; Stettler, Matthieu; Broly, Hervé

    2013-01-01

    An innovative high-throughput medium development method based on media blending was successfully used to improve the performance of a Chinese hamster ovary fed-batch medium in shaking 96-deepwell plates. Starting from a proprietary chemically-defined medium, 16 formulations testing 43 of 47 components at 3 different levels were designed. Media blending was performed following a custom-made mixture design of experiments considering binary blends, resulting in 376 different blends that were tested during both cell expansion and fed-batch production phases in one single experiment. Three approaches were chosen to provide the best output of the large amount of data obtained. A simple ranking of conditions was first used as a quick approach to select new formulations with promising features. Then, prediction of the best mixes was done to maximize both growth and titer using the Design Expert software. Finally, a multivariate analysis enabled identification of individual potential critical components for further optimization. Applying this high-throughput method on a fed-batch, rather than on a simple batch, process opens new perspectives for medium and feed development that enables identification of an optimized process in a short time frame. PMID:23563583

  13. Automatic Segmentation of High-Throughput RNAi Fluorescent Cellular Images

    PubMed Central

    Yan, Pingkum; Zhou, Xiaobo; Shah, Mubarak; Wong, Stephen T. C.

    2010-01-01

    High-throughput genome-wide RNA interference (RNAi) screening is emerging as an essential tool to assist biologists in understanding complex cellular processes. The large number of images produced in each study make manual analysis intractable; hence, automatic cellular image analysis becomes an urgent need, where segmentation is the first and one of the most important steps. In this paper, a fully automatic method for segmentation of cells from genome-wide RNAi screening images is proposed. Nuclei are first extracted from the DNA channel by using a modified watershed algorithm. Cells are then extracted by modeling the interaction between them as well as combining both gradient and region information in the Actin and Rac channels. A new energy functional is formulated based on a novel interaction model for segmenting tightly clustered cells with significant intensity variance and specific phenotypes. The energy functional is minimized by using a multiphase level set method, which leads to a highly effective cell segmentation method. Promising experimental results demonstrate that automatic segmentation of high-throughput genome-wide multichannel screening can be achieved by using the proposed method, which may also be extended to other multichannel image segmentation problems. PMID:18270043
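The nuclei-extraction step described above starts from the DNA channel; before any watershed splitting, foreground pixels must be thresholded and grouped into connected components. The pure-Python sketch below shows only that labeling step, not the paper's modified watershed or the level-set cell segmentation:

```python
from collections import deque

def label_nuclei(image, threshold):
    """Threshold a 2D intensity grid and label 4-connected foreground
    components via breadth-first flood fill. A full pipeline would follow
    this with a distance-transform watershed to split touching nuclei."""
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for sy in range(h):
        for sx in range(w):
            if image[sy][sx] >= threshold and labels[sy][sx] == 0:
                current += 1                      # start a new component
                labels[sy][sx] = current
                q = deque([(sy, sx)])
                while q:
                    y, x = q.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return current, labels
```

On a toy image with two separated bright blobs, the function returns two distinct labels.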

  14. Development of Low-cost, High Energy-per-unit-area Solar Cell Modules

    NASA Technical Reports Server (NTRS)

    Jones, G. T.; Chitre, S.; Rhee, S. S.

    1978-01-01

    The development of two hexagonal solar cell process sequences, a laser-scribing technique for hexagonal and modified hexagonal solar cells, a large-throughput diffusion process, and two surface macrostructure processes suitable for large-scale production is reported. Experimental analysis was made of automated spin-on anti-reflective coating equipment and high-pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of optimum silicon utilization by modified hexagonal solar cells.

  15. 20180312 - Application of a Multiplexed High Content Imaging (HCI) Based Cell Viability and Apoptosis Chemical Screening Assay with Results in MCF-7 Cells (SOT)

    EPA Science Inventory

    The NCCT high throughput transcriptomics (HTTr) screening program uses a whole-transcriptome profiling assay in human-derived cells to collect concentration-response data for large numbers (100s-1000s) of environmental chemicals. To contextualize HTTr data, chemical effects on cell...

  16. Recent advances in high-throughput QCL-based infrared microspectral imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Rowlette, Jeremy A.; Fotheringham, Edeline; Nichols, David; Weida, Miles J.; Kane, Justin; Priest, Allen; Arnone, David B.; Bird, Benjamin; Chapman, William B.; Caffey, David B.; Larson, Paul; Day, Timothy

    2017-02-01

    The field of infrared spectral imaging and microscopy is advancing rapidly due in large measure to the recent commercialization of the first high-throughput, high-spatial-definition quantum cascade laser (QCL) microscope. Having speed, resolution and noise performance advantages while also eliminating the need for cryogenic cooling, its introduction has established a clear path to translating the well-established diagnostic capability of infrared spectroscopy into clinical and pre-clinical histology, cytology and hematology workflows. Demand for even higher throughput while maintaining high-spectral fidelity and low-noise performance continues to drive innovation in QCL-based spectral imaging instrumentation. In this talk, we will present for the first time, recent technological advances in tunable QCL photonics which have led to an additional 10X enhancement in spectral image data collection speed while preserving the high spectral fidelity and SNR exhibited by the first generation of QCL microscopes. This new approach continues to leverage the benefits of uncooled microbolometer focal plane array cameras, which we find to be essential for ensuring both reproducibility of data across instruments and achieving the high-reliability needed in clinical applications. We will discuss the physics underlying these technological advancements as well as the new biomedical applications these advancements are enabling, including automated whole-slide infrared chemical imaging on clinically relevant timescales.

  17. A Protocol for Functional Assessment of Whole-Protein Saturation Mutagenesis Libraries Utilizing High-Throughput Sequencing.

    PubMed

    Stiffler, Michael A; Subramanian, Subu K; Salinas, Victor H; Ranganathan, Rama

    2016-07-03

    Site-directed mutagenesis has long been used as a method to interrogate protein structure, function and evolution. Recent advances in massively-parallel sequencing technology have opened up the possibility of assessing the functional or fitness effects of large numbers of mutations simultaneously. Here, we present a protocol for experimentally determining the effects of all possible single amino acid mutations in a protein of interest utilizing high-throughput sequencing technology, using the 263 amino acid antibiotic resistance enzyme TEM-1 β-lactamase as an example. In this approach, a whole-protein saturation mutagenesis library is constructed by site-directed mutagenic PCR, randomizing each position individually to all possible amino acids. The library is then transformed into bacteria, and selected for the ability to confer resistance to β-lactam antibiotics. The fitness effect of each mutation is then determined by deep sequencing of the library before and after selection. Importantly, this protocol introduces methods which maximize sequencing read depth and permit the simultaneous selection of the entire mutation library, by mixing adjacent positions into groups of length accommodated by high-throughput sequencing read length and utilizing orthogonal primers to barcode each group. Representative results using this protocol are provided by assessing the fitness effects of all single amino acid mutations in TEM-1 at a clinically relevant dosage of ampicillin. The method should be easily extendable to other proteins for which a high-throughput selection assay is in place.
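The per-mutation fitness determined from deep sequencing before and after selection is commonly computed as a wild-type-normalized log enrichment ratio. The sketch below follows that common scheme; the exact normalization and filtering in the published protocol may differ, and the pseudocount is an illustrative choice:

```python
import math

def fitness_score(mut_before, mut_after, wt_before, wt_after, pseudo=0.5):
    """Fitness of one mutant from sequencing read counts before/after
    selection, as log2 of its enrichment relative to wild type.
    Pseudocounts guard against division by zero for depleted mutants."""
    mut_enrichment = (mut_after + pseudo) / (mut_before + pseudo)
    wt_enrichment = (wt_after + pseudo) / (wt_before + pseudo)
    return math.log2(mut_enrichment / wt_enrichment)
```

A neutral mutation that enriches exactly like wild type scores near 0, while a mutant that vanishes under ampicillin selection gets a strongly negative score.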

  18. Generalized empirical Bayesian methods for discovery of differential data in high-throughput biology.

    PubMed

    Hardcastle, Thomas J

    2016-01-15

    High-throughput data are now commonplace in biological research. Rapidly changing technologies and applications mean that novel methods for detecting differential behaviour that account for a 'large P, small n' setting are required at an increasing rate. The development of such methods is, in general, done on an ad hoc basis, requiring further development cycles and lacking standardization between analyses. We present here a generalized method for identifying differential behaviour within high-throughput biological data through empirical Bayesian methods. This approach is based on our baySeq algorithm for identification of differential expression in RNA-seq data based on a negative binomial distribution, and in paired data based on a beta-binomial distribution. Here we show how the same empirical Bayesian approach can be applied to any parametric distribution, removing the need for lengthy development of novel methods for differently distributed data. Comparisons with existing methods developed to address specific problems in high-throughput biological data show that these generic methods can achieve equivalent or better performance. A number of enhancements to the basic algorithm are also presented to increase flexibility and reduce computational costs. The methods are implemented in the R baySeq (v2) package, available on Bioconductor http://www.bioconductor.org/packages/release/bioc/html/baySeq.html. tjh48@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
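The empirical Bayesian idea of weighing a "no difference" model against a "differential" model via marginal likelihoods can be illustrated with a conjugate Poisson-Gamma model, which has a closed form. This is a deliberate simplification of baySeq's negative binomial machinery, and the prior weight below is an illustrative constant rather than an empirically estimated one:

```python
import math

def log_marginal_poisson(counts, a=1.0, b=1.0):
    """Log marginal likelihood of i.i.d. Poisson counts with a conjugate
    Gamma(a, b) prior on the rate, integrated out analytically."""
    n, s = len(counts), sum(counts)
    return (a * math.log(b) + math.lgamma(a + s) - math.lgamma(a)
            - sum(math.lgamma(c + 1) for c in counts)
            - (a + s) * math.log(b + n))

def prob_differential(group1, group2, prior_de=0.1):
    """Posterior probability that the two groups have different rates,
    comparing separate-rate vs shared-rate marginal likelihoods."""
    log_de = log_marginal_poisson(group1) + log_marginal_poisson(group2)
    log_same = log_marginal_poisson(list(group1) + list(group2))
    num = math.log(prior_de) + log_de
    den = math.log(1.0 - prior_de) + log_same
    m = max(num, den)  # log-sum-exp for numerical stability
    return math.exp(num - m) / (math.exp(num - m) + math.exp(den - m))
```

Clearly separated count groups get a posterior probability of differential behaviour near 1, while interchangeable groups are penalized by the extra model complexity and the prior.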

  19. Integration into Big Data: First Steps to Support Reuse of Comprehensive Toxicity Model Modules (SOT)

    EPA Science Inventory

    Data surrounding the needs of human disease and toxicity modeling are largely siloed limiting the ability to extend and reuse modules across knowledge domains. Using an infrastructure that supports integration across knowledge domains (animal toxicology, high-throughput screening...

  20. Activity profiles of 676 ToxCast Phase II compounds in 231 biochemical high-throughput screening assays

    EPA Science Inventory

    Understanding potential health risks posed by environmental chemicals is a significant challenge elevated by large numbers of diverse chemicals with generally uncharacterized exposures, mechanisms and toxicities. The present study is a performance evaluation and critical analysis...

  1. Influence relevance voting: an accurate and interpretable virtual high throughput screening method.

    PubMed

    Swamidass, S Joshua; Azencott, Chloé-Agathe; Lin, Ting-Wan; Gramajo, Hugo; Tsai, Shiou-Chuan; Baldi, Pierre

    2009-04-01

    Given activity training data from high-throughput screening (HTS) experiments, virtual high-throughput screening (vHTS) methods aim to predict in silico the activity of untested chemicals. We present a novel method, the Influence Relevance Voter (IRV), specifically tailored for the vHTS task. The IRV is a low-parameter neural network which refines a k-nearest neighbor classifier by nonlinearly combining the influences of a chemical's neighbors in the training set. Influences are decomposed, also nonlinearly, into a relevance component and a vote component. The IRV is benchmarked using the data and rules of two large, open competitions, and its performance compared to the performance of other participating methods, as well as of an in-house support vector machine (SVM) method. On these benchmark data sets, IRV achieves state-of-the-art results, comparable to the SVM in one case, and significantly better than the SVM in the other, retrieving three times as many actives in the top 1% of its prediction-sorted list. The IRV presents several other important advantages over SVMs and other methods: (1) the output predictions have probabilistic semantics; (2) the underlying inferences are interpretable; (3) the training time is very short, on the order of minutes even for very large data sets; (4) the risk of overfitting is minimal, due to the small number of free parameters; and (5) additional information can easily be incorporated into the IRV architecture. Combined with its performance, these qualities make the IRV particularly well suited for vHTS.
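The decomposition of each neighbor's influence into a relevance and a vote component can be sketched as follows. The weights here are illustrative constants rather than fitted network parameters, and Tanimoto similarity on fingerprint bit sets stands in for whatever chemical similarity a real screen would use:

```python
import math

def tanimoto(a, b):
    """Tanimoto similarity between two chemicals as fingerprint bit sets."""
    return len(a & b) / len(a | b)

def irv_score(query_fp, train, k=5, w_sim=4.0, w_rank=-0.2, w_vote=1.0):
    """Toy Influence Relevance Voter: each of the k nearest training
    neighbors casts a vote (+1 active / -1 inactive) scaled by a
    nonlinear relevance of its similarity and rank; the summed
    influences pass through a sigmoid to give a probability-like score."""
    sims = sorted(((tanimoto(query_fp, fp), y) for fp, y in train),
                  reverse=True)[:k]
    score = 0.0
    for rank, (sim, label) in enumerate(sims, start=1):
        relevance = math.tanh(w_sim * sim + w_rank * rank)
        vote = w_vote * (1.0 if label else -1.0)
        score += relevance * vote
    return 1.0 / (1.0 + math.exp(-score))
```

A query close to known actives scores above 0.5, while a query matching only an inactive scores below it.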

  2. Representing high throughput expression profiles via perturbation barcodes reveals compound targets.

    PubMed

    Filzen, Tracey M; Kutchukian, Peter S; Hermes, Jeffrey D; Li, Jing; Tudor, Matthew

    2017-02-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound's high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data.

  3. Representing high throughput expression profiles via perturbation barcodes reveals compound targets

    PubMed Central

    Kutchukian, Peter S.; Li, Jing; Tudor, Matthew

    2017-01-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound’s high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data. PMID:28182661

  4. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    NASA Astrophysics Data System (ADS)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

    We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences in specific pattern characteristics. We expect that the analysis methodology can be applied to defect probability modeling as well as general process qualification in the future.
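Estimating a defect probability from the tail of a local CD distribution can be sketched under a Gaussian assumption, as below. The numbers are illustrative; part of the paper's point is that only very large measurement volumes let one check whether such a tail model actually holds, since real stochastic tails are often heavier than Gaussian:

```python
import math

def defect_probability(mean_cd, sigma_cd, threshold):
    """Probability that a single via prints below a failure threshold,
    assuming the local CD distribution is Gaussian(mean_cd, sigma_cd).
    Phi(z) is evaluated via the complementary error function."""
    z = (threshold - mean_cd) / sigma_cd
    return 0.5 * math.erfc(-z / math.sqrt(2))

def expected_defects(mean_cd, sigma_cd, threshold, n_vias):
    """Expected defect count over an array of n_vias identical features."""
    return n_vias * defect_probability(mean_cd, sigma_cd, threshold)
```

A failure threshold three sigma below the mean gives a per-via probability near 1.3e-3, which across 112 million measured vias would already imply on the order of 1e5 failures, showing why tail behaviour dominates defectivity.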

  5. Field-based high throughput phenotyping rapidly identifies genomic regions controlling yield components in rice

    DOE PAGES

    Tanger, Paul; Klassen, Stephen; Mojica, Julius P.; ...

    2017-02-21

    In order to ensure food security in the face of population growth, decreasing water and land for agriculture, and increasing climate variability, crop yields must increase faster than current rates. Increased yields will require implementing novel approaches in genetic discovery and breeding. We demonstrate the potential of field-based high throughput phenotyping (HTP) on a large recombinant population of rice to identify genetic variation underlying important traits. We find that detecting quantitative trait loci (QTL) with HTP phenotyping is as accurate and effective as traditional labor-intensive measures of flowering time, height, biomass, grain yield, and harvest index. Furthermore, genetic mapping in this population, derived from a cross of a modern cultivar (IR64) with a landrace (Aswina), identified four alleles with negative effect on grain yield that are fixed in IR64, demonstrating the potential for HTP of large populations as a strategy for the second green revolution.

  6. Field-based high throughput phenotyping rapidly identifies genomic regions controlling yield components in rice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanger, Paul; Klassen, Stephen; Mojica, Julius P.

    In order to ensure food security in the face of population growth, decreasing water and land for agriculture, and increasing climate variability, crop yields must increase faster than current rates. Increased yields will require implementing novel approaches in genetic discovery and breeding. We demonstrate the potential of field-based high throughput phenotyping (HTP) on a large recombinant population of rice to identify genetic variation underlying important traits. We find that detecting quantitative trait loci (QTL) with HTP phenotyping is as accurate and effective as traditional labor-intensive measures of flowering time, height, biomass, grain yield, and harvest index. Furthermore, genetic mapping in this population, derived from a cross of a modern cultivar (IR64) with a landrace (Aswina), identified four alleles with negative effect on grain yield that are fixed in IR64, demonstrating the potential for HTP of large populations as a strategy for the second green revolution.

  7. A data set from flash X-ray imaging of carboxysomes

    NASA Astrophysics Data System (ADS)

    Hantke, Max F.; Hasse, Dirk; Ekeberg, Tomas; John, Katja; Svenda, Martin; Loh, Duane; Martin, Andrew V.; Timneanu, Nicusor; Larsson, Daniel S. D.; van der Schot, Gijs; Carlsson, Gunilla H.; Ingelman, Margareta; Andreasson, Jakob; Westphal, Daniel; Iwan, Bianca; Uetrecht, Charlotte; Bielecki, Johan; Liang, Mengning; Stellato, Francesco; Deponte, Daniel P.; Bari, Sadia; Hartmann, Robert; Kimmel, Nils; Kirian, Richard A.; Seibert, M. Marvin; Mühlig, Kerstin; Schorb, Sebastian; Ferguson, Ken; Bostedt, Christoph; Carron, Sebastian; Bozek, John D.; Rolles, Daniel; Rudenko, Artem; Foucar, Lutz; Epp, Sascha W.; Chapman, Henry N.; Barty, Anton; Andersson, Inger; Hajdu, Janos; Maia, Filipe R. N. C.

    2016-08-01

    Ultra-intense femtosecond X-ray pulses from X-ray lasers permit structural studies on single particles and biomolecules without crystals. We present a large data set on inherently heterogeneous, polyhedral carboxysome particles. Carboxysomes are cell organelles that vary in size and facilitate up to 40% of Earth’s carbon fixation by cyanobacteria and certain proteobacteria. Variation in size hinders crystallization. Carboxysomes appear icosahedral in the electron microscope. A protein shell encapsulates a large number of Rubisco molecules in paracrystalline arrays inside the organelle. We used carboxysomes with a mean diameter of 115±26 nm from Halothiobacillus neapolitanus. A new aerosol sample-injector allowed us to record 70,000 low-noise diffraction patterns in 12 min. Every diffraction pattern is a unique structure measurement, and high-throughput imaging allows sampling of the space of structural variability. The different structures can be separated and phased directly from the diffraction data, opening a way for accurate, high-throughput studies of structures and structural heterogeneity in biology and elsewhere.

  8. Impact of automation on mass spectrometry.

    PubMed

    Zhang, Yan Victoria; Rockwood, Alan

    2015-10-23

    Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity, multiplex capability, and the ability to measure analytes in various matrices. Despite these advantages, LC-MS/MS remains costly and labor intensive, and its throughput is limited. This specialized technology requires highly trained personnel and therefore has largely been limited to large institutions, academic organizations and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on essential features for LC-MS/MS total automation and proposes a step-wise and incremental approach to achieve total automation through reducing human intervention, increasing throughput and eventually integrating the LC-MS/MS system into automated clinical laboratory operations. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. The effect has been especially noticeable in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
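
    As a concrete illustration of the exploratory class of techniques the review covers, the sketch below runs an unconstrained ordination (PCA via singular value decomposition) on a toy sample-by-taxon abundance matrix. The data and function name are invented for illustration; they are not taken from the review.

```python
import numpy as np

# Toy abundance matrix: 4 samples (rows) x 3 taxa (columns).
# Values are illustrative counts, not real data.
X = np.array([
    [10.0,  2.0, 1.0],
    [12.0,  1.0, 0.0],
    [ 1.0,  9.0, 8.0],
    [ 0.0, 11.0, 7.0],
])

def pca_ordination(X, n_components=2):
    """Unconstrained ordination by PCA: center columns, then SVD."""
    Xc = X - X.mean(axis=0)                 # center each taxon
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]   # sample coordinates
    explained = (s ** 2) / np.sum(s ** 2)   # variance explained per axis
    return scores, explained[:n_components]

scores, explained = pca_ordination(X)
print(scores.shape)        # (4, 2)
print(explained[0] > 0.9)  # True: the first axis dominates this contrived matrix
```

    In a real study the sample scores would then be plotted, with the first two axes separating community types, which is the typical first step before interpretive or discriminatory analyses.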

  10. Plate-based diversity subset screening: an efficient paradigm for high throughput screening of a large screening file.

    PubMed

    Bell, Andrew S; Bradley, Joseph; Everett, Jeremy R; Knight, Michelle; Loesel, Jens; Mathias, John; McLoughlin, David; Mills, James; Sharp, Robert E; Williams, Christine; Wood, Terence P

    2013-05-01

    The screening files of many large companies, including Pfizer, have grown considerably due to internal chemistry efforts, company mergers and acquisitions, external contracted synthesis, or compound purchase schemes. In order to screen the targets of interest in a cost-effective fashion, we devised an easy-to-assemble, plate-based diversity subset (PBDS) that represents almost the entire computed chemical space of the screening file whilst comprising only a fraction of the plates in the collection. In order to create this file, we developed new design principles for the quality assessment of screening plates: the Rule of 40 (Ro40) and a plate selection process that ensured excellent coverage of both library chemistry and legacy chemistry space. This paper describes the rationale, design, construction, and performance of the PBDS, which has evolved into the standard paradigm for singleton (one compound per well) high-throughput screening in Pfizer since its introduction in 2006.
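
    The abstract does not detail the Ro40 or the plate selection algorithm, but the general idea of diversity-based subset selection can be sketched with a standard greedy MaxMin procedure on descriptor vectors. Everything here is hypothetical, not Pfizer's actual fingerprints or plate logic:

```python
import numpy as np

def greedy_diverse_subset(fps, k):
    """MaxMin diversity selection sketch: start from an arbitrary item,
    then repeatedly add the item farthest from everything chosen so far."""
    chosen = [0]
    dists = np.linalg.norm(fps - fps[0], axis=1)   # distance to chosen set
    while len(chosen) < k:
        nxt = int(np.argmax(dists))                # farthest remaining item
        chosen.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(fps - fps[nxt], axis=1))
    return chosen

# Three tight clusters of "compounds"; selecting k=3 should pick one per cluster.
rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
fps = np.vstack([c + 0.1 * rng.standard_normal((5, 2)) for c in centers])
picks = greedy_diverse_subset(fps, 3)
clusters = sorted(p // 5 for p in picks)
print(clusters)  # [0, 1, 2]: one pick from each cluster
```

    Applied at the plate level rather than the compound level, the same greedy idea favors plates that extend coverage of chemical space, which is the intuition behind a plate-based diversity subset.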

  11. MS-REDUCE: an ultrafast technique for reduction of big mass spectrometry data for high-throughput processing.

    PubMed

    Awan, Muaaz Gul; Saeed, Fahad

    2016-05-15

    Modern proteomics studies utilize high-throughput mass spectrometers which can produce data at an astonishing rate. These big mass spectrometry (MS) datasets can easily reach the peta-scale level, creating storage and analytic problems for large-scale systems biology studies. Each spectrum consists of thousands of peaks which have to be processed to deduce the peptide. However, only a small percentage of peaks in a spectrum are useful for peptide deduction, as most of the peaks are either noise or otherwise not useful for a given spectrum. This redundant processing of non-useful peaks is a bottleneck for streaming high-throughput processing of big MS data. One way to reduce the amount of computation required in a high-throughput environment is to eliminate non-useful peaks. Existing noise-removal algorithms are limited in their data-reduction capability and are compute intensive, making them unsuitable for big data and high-throughput environments. In this paper we introduce a novel low-complexity technique based on classification, quantization and sampling of MS peaks. We present a novel data-reductive strategy for analysis of big MS data. Our algorithm, called MS-REDUCE, is capable of eliminating noisy peaks as well as peaks that do not contribute to peptide deduction before any peptide deduction is attempted. Our experiments have shown up to 100× speedup over existing state-of-the-art noise elimination algorithms while maintaining comparably high-quality matches. Using our approach we were able to process a million spectra in just under an hour on a moderate server. The developed tool and strategy have been made available to the wider proteomics and parallel computing community; the code can be found at https://github.com/pcdslab/MSREDUCE. Contact: fahad.saeed@wmich.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
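
    MS-REDUCE's actual classification, quantization and sampling steps are not specified in the abstract, but the general pattern can be sketched as follows: quantize peaks into intensity classes, then sample a fraction from each class so that high- and low-intensity peaks are both represented. The function, thresholds and toy spectrum below are illustrative assumptions, not the published algorithm.

```python
import random

def reduce_spectrum(peaks, n_classes=4, keep_frac=0.2, seed=0):
    """Sketch of intensity-based peak reduction: quantize peaks into
    intensity classes, then sample a fraction from each class.
    `peaks` is a list of (m/z, intensity) tuples."""
    rng = random.Random(seed)
    lo = min(i for _, i in peaks)
    hi = max(i for _, i in peaks)
    width = (hi - lo) / n_classes or 1.0
    classes = [[] for _ in range(n_classes)]
    for mz, inten in peaks:
        k = min(int((inten - lo) / width), n_classes - 1)
        classes[k].append((mz, inten))       # quantization step
    kept = []
    for cls in classes:
        if cls:                              # sampling step, per class
            kept.extend(rng.sample(cls, max(1, int(len(cls) * keep_frac))))
    return sorted(kept)

# Toy spectrum: 100 synthetic peaks with varied intensities.
spectrum = [(100.0 + i, float((i * 37) % 50 + 1)) for i in range(100)]
reduced = reduce_spectrum(spectrum)
print(len(reduced) < len(spectrum))  # True: substantial reduction
```

    Because each spectrum is reduced independently, a scheme like this parallelizes trivially across spectra, which is what makes it attractive for streaming high-throughput pipelines.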

  12. Holes generation in glass using large spot femtosecond laser pulses

    NASA Astrophysics Data System (ADS)

    Berg, Yuval; Kotler, Zvi; Shacham-Diamand, Yosi

    2018-03-01

    We demonstrate high-throughput generation of symmetrical holes in fused silica glass using large-spot-size femtosecond IR-laser irradiation, which modifies the glass properties and yields an enhanced chemical etching rate. The process relies on a balanced interplay between the nonlinear Kerr effect and multiphoton absorption in the glass, which translates into symmetrical glass modification and an increased etching rate. The use of a large laser spot size makes it possible to process thick glasses at high speeds over a large area. We have demonstrated such fabricated holes with an aspect ratio of 1:10 in 1 mm-thick glass samples.

  13. Integrative Systems Biology for Data Driven Knowledge Discovery

    PubMed Central

    Greene, Casey S.; Troyanskaya, Olga G.

    2015-01-01

    Integrative systems biology is an approach that brings together diverse high throughput experiments and databases to gain new insights into biological processes or systems, from molecular through physiological levels. These approaches rely on diverse high-throughput experimental techniques that generate heterogeneous data by assaying varying aspects of complex biological processes. Computational approaches are necessary to provide an integrative view of these experimental results and enable data-driven knowledge discovery. Hypotheses generated from these approaches can direct definitive molecular experiments in a cost-effective manner. Using integrative systems biology approaches, we can leverage existing biological knowledge and large-scale data to improve our understanding of yet unknown components of a system of interest and how its malfunction leads to disease. PMID:21044756

  14. Mining high-throughput experimental data to link gene and function

    PubMed Central

    Blaby-Haas, Crysten E.; de Crécy-Lagard, Valérie

    2011-01-01

    Nearly 2200 genomes encoding some 6 million proteins have now been sequenced. Around 40% of these proteins are of unknown function even when function is loosely and minimally defined as “belonging to a superfamily”. In addition to in silico methods, the swelling stream of high-throughput experimental data can give valuable clues for linking these “unknowns” with precise biological roles. The goal is to develop integrative data-mining platforms that allow the scientific community at large to access and utilize this rich source of experimental knowledge. To this end, we review recent advances in generating whole-genome experimental datasets, where this data can be accessed, and how it can be used to drive prediction of gene function. PMID:21310501

  15. A catalogue of polymorphisms related to xenobiotic metabolism and cancer susceptibility.

    PubMed

    Gemignani, Federica; Landi, Stefano; Vivant, Franck; Zienolddiny, Shanbeh; Brennan, Paul; Canzian, Federico

    2002-08-01

    High-throughput genotyping of multiple genes in large samples of cases and controls is likely to be important in identifying common genes that have a moderate effect on the development of specific diseases. We present here a comprehensive list of 313 known, experimentally confirmed polymorphisms in 54 genes which are particularly relevant for the metabolism of drugs, alcohol, tobacco, and other potential carcinogens. We have compiled a catalog with a standardized format that summarizes the genetic and biochemical properties of the selected polymorphisms. We have also confirmed or redesigned experimental conditions for simplex or multiplex PCR amplification of a subset of 168 SNPs of particular interest, which will provide the basis for the design of assays compatible with high-throughput genotyping.

  16. Nanowire-nanopore transistor sensor for DNA detection during translocation

    NASA Astrophysics Data System (ADS)

    Xie, Ping; Xiong, Qihua; Fang, Ying; Qing, Quan; Lieber, Charles

    2011-03-01

    Nanopore sequencing, a promising low-cost, high-throughput sequencing technique, was proposed more than a decade ago. Because of the incompatibility between the small ionic current signal and the fast translocation speed, and the technical difficulty of large-scale integration of nanopores for direct ionic-current sequencing, alternative methods relying on integrated DNA sensors have been proposed, such as capacitive coupling or tunnelling current, but none of them had been experimentally demonstrated. Here we show, for the first time, an amplified sensor signal experimentally recorded from a nanowire-nanopore field-effect transistor sensor during DNA translocation. Independent multi-channel recording was also demonstrated for the first time. Our results suggest that the signal arises from a highly localized potential change caused by DNA translocation under non-balanced buffer conditions. Given that this method may produce larger signals for smaller nanopores, we hope our experiment can be a starting point for a new generation of nanopore sequencing devices with larger signal, higher bandwidth and large-scale multiplexing capability, finally realizing the ultimate goal of low-cost, high-throughput sequencing.

  17. Bayesian inference with historical data-based informative priors improves detection of differentially expressed genes

    PubMed Central

    Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S.

    2016-01-01

    Motivation: Modern high-throughput biotechnologies such as microarray are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment only a limited number of samples are assayed, giving rise to the classical ‘large p, small n’ problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Results: Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods, including the popular, state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the ‘large p, small n’ problem. Availability and implementation: Our method is implemented in the R package IPBT, which is freely available from https://github.com/benliemory/IPBT. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26519502
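
    The core intuition of a historical-data-based informative prior can be shown with a minimal shrinkage sketch: per-gene variance estimates from a new, small-n experiment are pulled toward estimates derived from historical data. This is not the IPBT model itself; the function, the pseudo-observation weight, and the toy numbers are assumptions made for illustration.

```python
import numpy as np

def shrink_variances(new_vars, hist_vars, n_new=3.0, prior_weight=10.0):
    """Precision-weighted compromise between a new experiment's noisy
    per-gene variances (from n_new samples) and historical estimates;
    prior_weight plays the role of prior pseudo-observations."""
    new_vars = np.asarray(new_vars, dtype=float)
    hist_vars = np.asarray(hist_vars, dtype=float)
    return (n_new * new_vars + prior_weight * hist_vars) / (n_new + prior_weight)

# Historical data suggest these genes have stable unit variance;
# a 3-sample experiment produces noisy estimates.
hist = np.array([1.0, 1.0, 1.0])
noisy_new = np.array([0.1, 5.0, 1.2])
post = shrink_variances(noisy_new, hist)
# Every shrunken estimate is closer to the historical truth of 1.0.
print(np.all(np.abs(post - 1.0) < np.abs(noisy_new - 1.0)))  # True
```

    Stabilized variances of this kind are what make downstream test statistics for differential expression far less erratic when n is tiny, which is the practical payoff of informative priors in the ‘large p, small n’ setting.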

  18. On Data Transfers Over Wide-Area Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang

    Dedicated wide-area network connections are employed in big data and high-performance computing scenarios, since the absence of cross-traffic promises to make it easier to analyze and optimize data transfers over them. However, nonlinear transport dynamics and end-system complexity due to multi-core hosts and distributed file systems make these tasks surprisingly challenging. We present an overview of methods to analyze memory and disk file transfers using extensive measurements over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory transfers, we derive performance profiles of TCP and UDT throughput as a function of RTT, which show concave regions in contrast to entirely convex regions predicted by previous models. These highly desirable concave regions can be expanded by utilizing large buffers and more parallel flows. We also present Poincaré maps and Lyapunov exponents of TCP and UDT throughput traces that indicate complex throughput dynamics. For disk file transfers, we show that throughput can be optimized using a combination of parallel I/O and network threads under direct I/O mode. Our initial throughput measurements of Lustre filesystems mounted over long-haul connections using LNet routers show convex profiles indicative of I/O limits.

  19. High-throughput selection for cellulase catalysts using chemical complementation.

    PubMed

    Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W

    2008-12-24

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.

  20. A High-throughput Selection for Cellulase Catalysts Using Chemical Complementation

    PubMed Central

    Peralta-Yahya, Pamela; Carter, Brian T.; Lin, Hening; Tao, Haiyan; Cornish, Virginia W.

    2010-01-01

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Due to the large number of enzyme variants that selections can test compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity. PMID:19053460

  1. Novel microscale approaches for easy, rapid determination of protein stability in academic and commercial settings

    PubMed Central

    Alexander, Crispin G.; Wanner, Randy; Johnson, Christopher M.; Breitsprecher, Dennis; Winter, Gerhard; Duhr, Stefan; Baaske, Philipp; Ferguson, Neil

    2014-01-01

    Chemical denaturant titrations can be used to accurately determine protein stability. However, data acquisition is typically labour intensive, has low throughput and is difficult to automate. These factors, combined with high protein consumption, have limited the adoption of chemical denaturant titrations in commercial settings. Thermal denaturation assays can be automated, sometimes with very high throughput. However, thermal denaturation assays are incompatible with proteins that aggregate at high temperatures, and large extrapolation of stability parameters to physiological temperatures can introduce significant uncertainties. We used capillary-based instruments to measure chemical denaturant titrations by intrinsic fluorescence and microscale thermophoresis. This allowed higher throughput and consumed several hundred-fold less protein than conventional, cuvette-based methods, yet maintained the high quality of the conventional approaches. We also established efficient strategies for automated, direct determination of protein stability at a range of temperatures via chemical denaturation, which has utility for characterising stability for proteins that are difficult to purify in high yield. This approach may also have merit for proteins that irreversibly denature or aggregate in classical thermal denaturation assays. We also developed procedures for affinity ranking of protein–ligand interactions from ligand-induced changes in chemical denaturation data, and proved the principle for this by correctly ranking the affinity of previously unreported peptide–PDZ domain interactions. The increased throughput, automation and low protein consumption of protein stability determinations afforded by using capillary-based methods to measure denaturant titrations can help to revolutionise protein research. We believe that the strategies reported are likely to find wide applications in academia, biotherapeutic formulation and drug discovery programmes. PMID:25262836

  2. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships

    PubMed Central

    2010-01-01

    Background The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. Results In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. Conclusion High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data. PMID:20122245

  3. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships.

    PubMed

    Seok, Junhee; Kaushal, Amit; Davis, Ronald W; Xiao, Wenzhong

    2010-01-18

    The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data.

  4. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    PubMed

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons" enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC).
    DIRT is a high-volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks. As such it enables plant scientists to spend more time on science rather than on technology. All stored and computed data is easily accessible to the public and broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.

  5. USDA Potato Small RNA Database

    USDA-ARS?s Scientific Manuscript database

    Small RNAs (sRNAs) are now understood to be involved in gene regulation, function and development. High throughput sequencing (HTS) of sRNAs generates large data sets for analyzing the abundance, source and roles for specific sRNAs. These sRNAs result from transcript degradation as well as specific ...

  6. Modeling Reproductive Toxicity for Chemical Prioritization into an Integrated Testing Strategy

    EPA Science Inventory

    The EPA ToxCast research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I tested 309 well-characterized chemicals in over 500 assays of different molecular targets, cellular responses and cell-states. Of th...

  7. The ToxCast Chemical Prioritization Program at the US EPA (UCLA Molecular Toxicology Program)

    EPA Science Inventory

    To meet the needs of chemical regulators reviewing large numbers of data-poor chemicals for safety, the EPA's National Center for Computational Toxicology is developing a means of efficiently testing thousands of compounds for potential toxicity. High-throughput bioactivity profi...

  8. PGAS in-memory data processing for the Processing Unit of the Upgraded Electronics of the Tile Calorimeter of the ATLAS Detector

    NASA Astrophysics Data System (ADS)

    Ohene-Kwofie, Daniel; Otoo, Ekow

    2015-10-01

    The ATLAS detector, operated at the Large Hadron Collider (LHC), records proton-proton collisions at CERN every 50 ns, resulting in a sustained data flow of up to PB/s. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPUs/GPGPUs assembled for high performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem then is whether one can implement an I/O subsystem infrastructure capable of meeting the computational speeds of advanced computing systems at the petascale and exascale level. We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data store for the Processing Unit (PU) of the upgraded electronics of the Tile Calorimeter, which is proposed to be used as a high-throughput general-purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memory of the PUs is aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput.

  9. Read count-based method for high-throughput allelic genotyping of transposable elements and structural variants.

    PubMed

    Kuhn, Alexandre; Ong, Yao Min; Quake, Stephen R; Burkholder, William F

    2015-07-08

    Like other structural variants, transposable element insertions can be highly polymorphic across individuals. Their functional impact, however, remains poorly understood. Current genome-wide approaches for genotyping insertion-site polymorphisms based on targeted or whole-genome sequencing remain very expensive and can lack accuracy, hence new large-scale genotyping methods are needed. We describe a high-throughput method for genotyping transposable element insertions and other types of structural variants that can be assayed by breakpoint PCR. The method relies on next-generation sequencing of multiplex, site-specific PCR amplification products and read count-based genotype calls. We show that this method is flexible, efficient (it does not require rounds of optimization), cost-effective and highly accurate. This method can benefit a wide range of applications from the routine genotyping of animal and plant populations to the functional study of structural variants in humans.
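A read count-based genotype call of the kind described can be sketched as a threshold on the fraction of sequencing reads supporting the insertion allele at a breakpoint (the function name, depth cutoff, and thresholds below are illustrative assumptions, not the authors' published parameters):

```python
def call_genotype(ins_reads, ref_reads, min_depth=10, hom_frac=0.9, het_frac=0.1):
    """Genotype a transposable-element insertion from breakpoint read counts.

    ins_reads -- reads supporting the insertion-bearing allele
    ref_reads -- reads supporting the empty (reference) allele
    """
    depth = ins_reads + ref_reads
    if depth < min_depth:
        return "no_call"          # too few reads for a confident call
    frac = ins_reads / depth      # fraction of reads supporting the insertion
    if frac >= hom_frac:
        return "ins/ins"          # homozygous for the insertion
    if frac <= het_frac:
        return "ref/ref"          # homozygous reference
    return "ins/ref"              # heterozygous
```

In the multiplexed setting of the paper, a call like this would be made per sample and per insertion site from demultiplexed read counts.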

  10. Optimal forwarding ratio on dynamical networks with heterogeneous mobility

    NASA Astrophysics Data System (ADS)

    Gan, Yu; Tang, Ming; Yang, Hanxin

    2013-05-01

    Since the discovery of non-Poisson statistics in human mobility trajectories, more attention has been paid to understanding the role of these patterns in different dynamics. In this study, we first introduce the heterogeneous mobility of mobile agents into dynamical networks, and then investigate packet forwarding strategies on the heterogeneous dynamical networks. We find that faster speeds and a higher proportion of high-speed agents can enhance the network throughput and reduce the mean traveling time in random forwarding. A hierarchical structure in the dependence on high speed is observed: the network throughput remains unchanged at both small and large high-speed values. It is also interesting to find that slightly preferential forwarding to high-speed agents can maximize the network capacity. Through theoretical analysis and numerical simulations, we show that the optimal forwarding ratio stems from the local structural heterogeneity of low-speed agents.

  11. High-throughput accurate-wavelength lens-based visible spectrometer.

    PubMed

    Bell, Ronald E; Scotti, Filippo

    2010-10-01

    A scanning visible spectrometer has been prototyped to complement fixed-wavelength transmission grating spectrometers for charge exchange recombination spectroscopy. Fast f/1.8 200 mm commercial lenses are used with a large 2160 mm(-1) grating for high throughput. A stepping-motor-controlled sine drive positions the grating, which is mounted on a precision rotary table. A high-resolution optical encoder on the grating stage allows the grating angle to be measured with an absolute accuracy of 0.075 arcsec, corresponding to a wavelength error ≤0.005 Å. At this precision, changes in grating groove density due to thermal expansion and variations in the refractive index of air are important. An automated calibration procedure determines all the relevant spectrometer parameters to high accuracy. Changes in bulk grating temperature, atmospheric temperature, and pressure are monitored between the time of calibration and the time of measurement to ensure a persistent wavelength calibration.
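As a sanity check on the quoted figures, the grating equation in a Littrow-like configuration (an assumption for illustration; the actual spectrometer geometry differs) relates the 0.075 arcsec angular accuracy to a milliangstrom-scale wavelength error for a 2160 mm^-1 grating:

```python
import math

def littrow_wavelength(theta_deg, groove_density_per_mm=2160.0, order=1):
    """Wavelength (angstroms) diffracted at grating angle theta in a
    Littrow-like configuration: m * lambda = 2 * d * sin(theta)."""
    d_angstrom = 1e7 / groove_density_per_mm   # groove spacing in angstroms
    return 2.0 * d_angstrom * math.sin(math.radians(theta_deg)) / order

def wavelength_error(theta_deg, dtheta_arcsec, groove_density_per_mm=2160.0):
    """Wavelength error from an angular error, via d(lambda)/d(theta)."""
    dtheta_rad = math.radians(dtheta_arcsec / 3600.0)
    d_angstrom = 1e7 / groove_density_per_mm
    return 2.0 * d_angstrom * math.cos(math.radians(theta_deg)) * dtheta_rad

# Grating angle that diffracts 5000 A in first order, and the wavelength
# error produced there by a 0.075 arcsec encoder error.
theta = math.degrees(math.asin(5000.0 / (2e7 / 2160.0)))
err = wavelength_error(theta, 0.075)
```

Near 5000 Å this gives an error of a few milliangstroms, consistent with the ≤0.005 Å figure quoted in the abstract.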

  12. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    PubMed

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J; Cox, David D

    2009-11-01

    While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.
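The screening strategy above, sample many parameter instantiations cheaply and keep only the promising ones for deeper analysis, can be sketched as a random search (the parameter space and toy scoring function are illustrative stand-ins; the paper's candidates are full multi-layer visual architectures evaluated on object recognition tasks):

```python
import random

def screen_models(evaluate, param_space, n_candidates=1000, top_k=10, seed=0):
    """Randomly sample model configurations and keep the top performers,
    analogous to a high-throughput screen in molecular biology."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_candidates):
        params = {name: rng.choice(values) for name, values in param_space.items()}
        results.append((evaluate(params), params))
    results.sort(key=lambda r: r[0], reverse=True)
    return results[:top_k]

# Hypothetical parameter space and a toy score standing in for a full
# train-and-test cycle of a candidate vision model.
space = {"units_per_layer": [64, 128, 256],
         "pool_size": [3, 5, 7],
         "norm_exponent": [1.0, 2.0]}
toy_score = lambda p: p["units_per_layer"] / 256 - abs(p["pool_size"] - 5) * 0.1
best = screen_models(toy_score, space, n_candidates=200, top_k=5)
```

Each surviving (score, params) pair would then, as in the paper, be re-examined in depth rather than trusted from the screen alone.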

  13. A High-Throughput Screening Approach to Discovering Good Forms of Biologically Inspired Visual Representation

    PubMed Central

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J.; Cox, David D.

    2009-01-01

    While many models of biological object recognition share a common set of “broad-stroke” properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model—e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct “parts” have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision. PMID:19956750

  14. Fabrication of nanostructured transmissive optical devices on ITO-glass with UV1116 photoresist using high-energy electron beam lithography.

    PubMed

    Williams, Calum; Bartholomew, Richard; Rughoobur, Girish; Gordon, George S D; Flewitt, Andrew J; Wilkinson, Timothy D

    2016-12-02

    High-energy electron beam lithography for patterning nanostructures on insulating substrates can be challenging. For high resolution, conventional resists require large exposure doses, and for reasonable throughput, using typical beam currents leads to charge-dissipation problems. Here, we use UV1116 photoresist (Dow Chemical Company), designed for photolithographic technologies, with a relatively low area dose at a standard operating current (80 kV, 40-50 μC cm-2, 1 nAs-1) to pattern over large areas on commercially coated ITO-glass cover slips. The minimum linewidth fabricated was ∼33 nm with 80 nm spacing; for isolated structures, ∼45 nm structural width with 50 nm separation. Due to the low beam dose and nA current, throughput is high. This work highlights the use of UV1116 photoresist as an alternative to conventional e-beam resists on insulating substrates. To evaluate suitability, we fabricate a range of transmissive optical devices that could find application in customized wire-grid polarisers and spectral filters for imaging, which operate based on the excitation of surface plasmon polaritons in nanosized geometries, with arrays encompassing areas ∼0.25 cm2.

  15. Fabrication of nanostructured transmissive optical devices on ITO-glass with UV1116 photoresist using high-energy electron beam lithography

    NASA Astrophysics Data System (ADS)

    Williams, Calum; Bartholomew, Richard; Rughoobur, Girish; Gordon, George S. D.; Flewitt, Andrew J.; Wilkinson, Timothy D.

    2016-12-01

    High-energy electron beam lithography for patterning nanostructures on insulating substrates can be challenging. For high resolution, conventional resists require large exposure doses and for reasonable throughput, using typical beam currents leads to charge dissipation problems. Here, we use UV1116 photoresist (Dow Chemical Company), designed for photolithographic technologies, with a relatively low area dose at a standard operating current (80 kV, 40-50 μC cm-2, 1 nAs-1) to pattern over large areas on commercially coated ITO-glass cover slips. The minimum linewidth fabricated was ˜33 nm with 80 nm spacing; for isolated structures, ˜45 nm structural width with 50 nm separation. Due to the low beam dose, and nA current, throughput is high. This work highlights the use of UV1116 photoresist as an alternative to conventional e-beam resists on insulating substrates. To evaluate suitability, we fabricate a range of transmissive optical devices, that could find application for customized wire-grid polarisers and spectral filters for imaging, which operate based on the excitation of surface plasmon polaritons in nanosized geometries, with arrays encompassing areas ˜0.25 cm2.

  16. The LAMAR: A high throughput X-ray astronomy facility for a moderate cost mission

    NASA Technical Reports Server (NTRS)

    Gorenstein, P.; Schwartz, D.

    1981-01-01

    The performance of a large area modular array of reflectors (LAMAR) is considered in several hypothetical observations relevant to: (1) cosmology, the X-ray background, and large scale structure of the universe; (2) clusters of galaxies and their evolution; (3) quasars and other active galactic nuclei; (4) compact objects in our galaxy; (5) stellar coronae; and (6) energy input to the interstellar medium.

  17. High speed optical object recognition processor with massive holographic memory

    NASA Technical Reports Server (NTRS)

    Chao, T.; Zhou, H.; Reyes, G.

    2002-01-01

    Real-time object recognition using a compact grayscale optical correlator will be introduced. A holographic memory module for storing a large bank of optimum correlation filters, to accommodate the large data throughput rate needed for many real-world applications, has also been developed. System architecture of the optical processor and the holographic memory will be presented. Application examples of this object recognition technology will also be demonstrated.

  18. Large-scale generation of human iPSC-derived neural stem cells/early neural progenitor cells and their neuronal differentiation.

    PubMed

    D'Aiuto, Leonardo; Zhi, Yun; Kumar Das, Dhanjit; Wilcox, Madeleine R; Johnson, Jon W; McClain, Lora; MacDonald, Matthew L; Di Maio, Roberto; Schurdak, Mark E; Piazza, Paolo; Viggiano, Luigi; Sweet, Robert; Kinchington, Paul R; Bhattacharjee, Ayantika G; Yolken, Robert; Nimgaonkar, Vishwajit L

    2014-01-01

    Induced pluripotent stem cell (iPSC)-based technologies offer an unprecedented opportunity to perform high-throughput screening of novel drugs for neurological and neurodegenerative diseases. Such screenings require a robust and scalable method for generating large numbers of mature, differentiated neuronal cells. Currently available methods based on differentiation of embryoid bodies (EBs) or directed differentiation of adherent culture systems are either expensive or are not scalable. We developed a protocol for large-scale generation of neuronal stem cells (NSCs)/early neural progenitor cells (eNPCs) and their differentiation into neurons. Our scalable protocol allows robust and cost-effective generation of NSCs/eNPCs from iPSCs. Following culture in neurobasal medium supplemented with B27 and BDNF, NSCs/eNPCs differentiate predominantly into vesicular glutamate transporter 1 (VGLUT1) positive neurons. Targeted mass spectrometry analysis demonstrates that iPSC-derived neurons express ligand-gated channels and other synaptic proteins and whole-cell patch-clamp experiments indicate that these channels are functional. The robust and cost-effective differentiation protocol described here for large-scale generation of NSCs/eNPCs and their differentiation into neurons paves the way for automated high-throughput screening of drugs for neurological and neurodegenerative diseases.

  19. High-throughput NGL electron-beam direct-write lithography system

    NASA Astrophysics Data System (ADS)

    Parker, N. William; Brodie, Alan D.; McCoy, John H.

    2000-07-01

    Electron beam lithography systems have historically had low throughput. The only practical solution to this limitation is an approach using many beams writing simultaneously. For single-column multi-beam systems, including projection optics (SCALPEL and PREVAIL) and blanked aperture arrays, throughput and resolution are limited by space-charge effects. Multi-beam micro-column (one beam per column) systems are limited by the need for low-voltage operation, electrical connection density and fabrication complexities. In this paper, we discuss a new multi-beam concept employing multiple columns, each with multiple beams, to generate a very large total number of parallel writing beams. This overcomes the limitations of space-charge interactions and low-voltage operation. We also discuss a rationale leading to the optimum number of columns and beams per column. Using this approach we show how production throughputs >= 60 wafers per hour can be achieved at CDs

  20. A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers

    PubMed Central

    Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-01-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715

  1. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  2. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  3. A novel multiplex poliovirus binding inhibition assay applicable for large serosurveillance and vaccine studies, without the use of live poliovirus.

    PubMed

    Schepp, Rutger M; Berbers, Guy A M; Ferreira, José A; Reimerink, Johan H; van der Klis, Fiona R

    2017-03-01

    Large-scale serosurveillance or vaccine studies for poliovirus using the "gold standard" WHO neutralisation test (NT) are very laborious and time consuming. With the polio eradication at hand and with the removal of live attenuated Sabin strains from the oral poliovirus vaccine (OPV), starting with type 2 (as of April 2016), laboratories will need to conform to much more stringent laboratory biosafety regulations when handling live poliovirus strains. In this study, a poliovirus binding inhibition multiplex immunoassay (polio MIA) using inactivated poliovirus vaccine (IPV-Salk) was developed for simultaneous quantification of serum antibodies directed to all three poliovirus types. Our assay shows a good correlation with the NT and an excellent correlation with the ELISA-based binding inhibition assay (POBI). The assay is highly type-specific and reproducible. Additionally, serum sample throughput increases about fivefold relative to NT and POBI and the amount of serum needed is reduced by more than 90%. In conclusion, the polio MIA can be used as a safe and high throughput application, especially for large-scale surveillance and vaccine studies, reducing laboratory time and serum amounts needed. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  4. An UPLC-ESI-MS/MS Assay Using 6-Aminoquinolyl-N-Hydroxysuccinimidyl Carbamate Derivatization for Targeted Amino Acid Analysis: Application to Screening of Arabidopsis thaliana Mutants.

    PubMed

    Salazar, Carolina; Armenta, Jenny M; Shulaev, Vladimir

    2012-07-06

    In spite of the large arsenal of methodologies developed for amino acid assessment in complex matrices, their implementation in metabolomics studies involving wide-ranging mutant screening is hampered by their lack of high-throughput, sensitivity, reproducibility, and/or wide dynamic range. In response to the challenge of developing amino acid analysis methods that satisfy the criteria required for metabolomic studies, improved reverse-phase high-performance liquid chromatography-mass spectrometry (RP-HPLC-MS) methods have been recently reported for large-scale screening of metabolic phenotypes. However, these methods focus on the direct analysis of underivatized amino acids and, therefore, problems associated with insufficient retention and resolution are observed due to the hydrophilic nature of amino acids. It is well known that derivatization methods render amino acids more amenable to reverse-phase chromatographic analysis by introducing highly hydrophobic tags in their carboxylic acid or amino functional group. Therefore, an analytical platform that combines the 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate (AQC) pre-column derivatization method with ultra-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (UPLC-ESI-MS/MS) is presented in this article. For numerous reasons, typical amino acid derivatization methods would be inadequate for large-scale metabolic projects. However, AQC derivatization is a simple, rapid and reproducible way of obtaining stable amino acid adducts amenable to UPLC-ESI-MS/MS, and the applicability of the method for high-throughput metabolomic analysis in Arabidopsis thaliana is demonstrated in this study. Overall, the major advantages offered by this amino acid analysis method include high throughput and enhanced sensitivity and selectivity, characteristics that showcase its utility for the rapid screening of preselected plant metabolites without compromising the quality of the metabolic data. The presented method enabled thirty-eight metabolites (proteinogenic amino acids and related compounds) to be analyzed within 10 min, with detection limits down to 1.02 × 10−11 M (i.e., attomole level on column), which represents an improved sensitivity of 1 to 5 orders of magnitude compared to existing methods. Our UPLC-ESI-MS/MS method is one of the seven analytical platforms used by the Arabidopsis Metabolomics Consortium. The amino acid dataset obtained by analysis of Arabidopsis T-DNA mutant stocks with our platform is captured and open to the public in the web portal PlantMetabolomics.org. The analytical platform herein described could find important applications in other studies where the rapid, high-throughput and sensitive assessment of low-abundance amino acids in complex biosamples is necessary.

  5. An UPLC-ESI-MS/MS Assay Using 6-Aminoquinolyl-N-Hydroxysuccinimidyl Carbamate Derivatization for Targeted Amino Acid Analysis: Application to Screening of Arabidopsis thaliana Mutants

    PubMed Central

    Salazar, Carolina; Armenta, Jenny M.; Shulaev, Vladimir

    2012-01-01

    In spite of the large arsenal of methodologies developed for amino acid assessment in complex matrices, their implementation in metabolomics studies involving wide-ranging mutant screening is hampered by their lack of high-throughput, sensitivity, reproducibility, and/or wide dynamic range. In response to the challenge of developing amino acid analysis methods that satisfy the criteria required for metabolomic studies, improved reverse-phase high-performance liquid chromatography-mass spectrometry (RP-HPLC-MS) methods have been recently reported for large-scale screening of metabolic phenotypes. However, these methods focus on the direct analysis of underivatized amino acids and, therefore, problems associated with insufficient retention and resolution are observed due to the hydrophilic nature of amino acids. It is well known that derivatization methods render amino acids more amenable to reverse-phase chromatographic analysis by introducing highly hydrophobic tags in their carboxylic acid or amino functional group. Therefore, an analytical platform that combines the 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate (AQC) pre-column derivatization method with ultra-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (UPLC-ESI-MS/MS) is presented in this article. For numerous reasons, typical amino acid derivatization methods would be inadequate for large-scale metabolic projects. However, AQC derivatization is a simple, rapid and reproducible way of obtaining stable amino acid adducts amenable to UPLC-ESI-MS/MS, and the applicability of the method for high-throughput metabolomic analysis in Arabidopsis thaliana is demonstrated in this study. Overall, the major advantages offered by this amino acid analysis method include high throughput and enhanced sensitivity and selectivity, characteristics that showcase its utility for the rapid screening of preselected plant metabolites without compromising the quality of the metabolic data. The presented method enabled thirty-eight metabolites (proteinogenic amino acids and related compounds) to be analyzed within 10 min, with detection limits down to 1.02 × 10−11 M (i.e., attomole level on column), which represents an improved sensitivity of 1 to 5 orders of magnitude compared to existing methods. Our UPLC-ESI-MS/MS method is one of the seven analytical platforms used by the Arabidopsis Metabolomics Consortium. The amino acid dataset obtained by analysis of Arabidopsis T-DNA mutant stocks with our platform is captured and open to the public in the web portal PlantMetabolomics.org. The analytical platform herein described could find important applications in other studies where the rapid, high-throughput and sensitive assessment of low-abundance amino acids in complex biosamples is necessary. PMID:24957640

  6. High-throughput Cloning and Expression of Integral Membrane Proteins in Escherichia coli

    PubMed Central

    Bruni, Renato

    2014-01-01

    Recently, several structural genomics centers have been established and a remarkable number of three-dimensional structures of soluble proteins have been solved. For membrane proteins, the number of structures solved has been significantly trailing those for their soluble counterparts, not least because over-expression and purification of membrane proteins is a much more arduous process. By using high throughput technologies, a large number of membrane protein targets can be screened simultaneously and a greater number of expression and purification conditions can be employed, leading to a higher probability of successfully determining the structure of membrane proteins. This unit describes the cloning, expression and screening of membrane proteins using high throughput methodologies developed in our laboratory. Basic Protocol 1 deals with the cloning of inserts into expression vectors by ligation-independent cloning. Basic Protocol 2 describes the expression and purification of the target proteins on a miniscale. Lastly, for the targets that express at the miniscale, basic protocols 3 and 4 outline the methods employed for the expression and purification of targets at the midi-scale, as well as a procedure for detergent screening and identification of detergent(s) in which the target protein is stable. PMID:24510647

  7. Candidiasis and the impact of flow cytometry on antifungal drug discovery.

    PubMed

    Ku, Tsun Sheng N; Bernardo, Stella; Walraven, Carla J; Lee, Samuel A

    2017-11-01

    Invasive candidiasis continues to be associated with significant morbidity and mortality as well as substantial health care costs nationally and globally. One of the contributing factors is the development of resistance to antifungal agents that are already in clinical use. Moreover, there are known treatment limitations with all of the available antifungal agents. Since traditional techniques in novel drug discovery are time consuming, high-throughput screening using flow cytometry presents as a potential tool to identify new antifungal agents that would be useful in the management of these patients. Areas covered: In this review, the authors discuss the use of automated high-throughput screening assays based upon flow cytometry to identify potential antifungals from a library comprised of a large number of bioactive compounds. They also review studies that employed the use of this research methodology that has identified compounds with antifungal activity. Expert opinion: High-throughput screening using flow cytometry has substantially decreased the processing time necessary for screening thousands of compounds, and has helped enhance our understanding of fungal pathogenesis. Indeed, the authors see this technology as a powerful tool to help scientists identify new antifungal agents that can be added to the clinician's arsenal in their fight against invasive candidiasis.

  8. Computational Approaches to Phenotyping

    PubMed Central

    Lussier, Yves A.; Liu, Yang

    2007-01-01

    The recent completion of the Human Genome Project has made possible a high-throughput “systems approach” for accelerating the elucidation of molecular underpinnings of human diseases, and subsequent derivation of molecular-based strategies to more effectively prevent, diagnose, and treat these diseases. Although altered phenotypes are among the most reliable manifestations of altered gene functions, research using systematic analysis of phenotype relationships to study human biology is still in its infancy. This article focuses on the emerging field of high-throughput phenotyping (HTP) phenomics research, which aims to capitalize on novel high-throughput computation and informatics technology developments to derive genomewide molecular networks of genotype–phenotype associations, or “phenomic associations.” The HTP phenomics research field faces the challenge of technological research and development to generate novel tools in computation and informatics that will allow researchers to amass, access, integrate, organize, and manage phenotypic databases across species and enable genomewide analysis to associate phenotypic information with genomic data at different scales of biology. Key state-of-the-art technological advancements critical for HTP phenomics research are covered in this review. In particular, we highlight the power of computational approaches to conduct large-scale phenomics studies. PMID:17202287

  9. Revealing complex function, process and pathway interactions with high-throughput expression and biological annotation data.

    PubMed

    Singh, Nitesh Kumar; Ernst, Mathias; Liebscher, Volkmar; Fuellen, Georg; Taher, Leila

    2016-10-20

    The biological relationships both between and within the functions, processes and pathways that operate within complex biological systems are only poorly characterized, making the interpretation of large scale gene expression datasets extremely challenging. Here, we present an approach that integrates gene expression and biological annotation data to identify and describe the interactions between biological functions, processes and pathways that govern a phenotype of interest. The product is a global, interconnected network, not of genes but of functions, processes and pathways, that represents the biological relationships within the system. We validated our approach on two high-throughput expression datasets describing organismal and organ development. Our findings are well supported by the available literature, confirming that developmental processes and apoptosis play key roles in cell differentiation. Furthermore, our results suggest that processes related to pluripotency and lineage commitment, which are known to be critical for development, interact mainly indirectly, through genes implicated in more general biological processes. Moreover, we provide evidence that supports the relevance of cell spatial organization in the developing liver for proper liver function. Our strategy can be viewed as an abstraction that is useful to interpret high-throughput data and devise further experiments.
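One way to build such a network of functions rather than genes, assuming gene-set annotations as input, is to connect terms whose annotated gene sets overlap (the terms, gene sets, and Jaccard threshold below are illustrative assumptions, not the authors' method or data):

```python
from itertools import combinations

def annotation_network(annotations, min_overlap=0.2):
    """Link annotation terms whose gene sets overlap (Jaccard index),
    yielding a network of functions/processes rather than of genes."""
    edges = {}
    for (t1, g1), (t2, g2) in combinations(annotations.items(), 2):
        jaccard = len(g1 & g2) / len(g1 | g2)   # shared / total genes
        if jaccard >= min_overlap:
            edges[(t1, t2)] = round(jaccard, 3)
    return edges

# Toy gene sets for three terms; real inputs would come from an
# annotation database, weighted further by expression data.
terms = {
    "apoptosis":       {"TP53", "CASP3", "BAX"},
    "differentiation": {"TP53", "SOX2", "PAX6"},
    "pluripotency":    {"SOX2", "POU5F1", "NANOG"},
}
net = annotation_network(terms)
```

In this toy example, apoptosis and pluripotency end up linked only indirectly through differentiation, mirroring the kind of indirect interaction the abstract describes.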

  10. High-throughput two-dimensional root system phenotyping platform facilitates genetic analysis of root growth and development.

    PubMed

    Clark, Randy T; Famoso, Adam N; Zhao, Keyan; Shaff, Jon E; Craft, Eric J; Bustamante, Carlos D; McCouch, Susan R; Aneshansley, Daniel J; Kochian, Leon V

    2013-02-01

    High-throughput phenotyping of root systems requires a combination of specialized techniques and adaptable plant growth, root imaging and software tools. A custom phenotyping platform was designed to capture images of whole root systems, and novel software tools were developed to process and analyse these images. The platform and its components are adaptable to a wide range of root phenotyping studies using diverse growth systems (hydroponics, paper pouches, gel and soil) involving several plant species, including, but not limited to, rice, maize, sorghum, tomato and Arabidopsis. The RootReader2D software tool is free and publicly available and was designed with both user-guided and automated features that increase flexibility and enhance efficiency when measuring root growth traits from specific roots or entire root systems during large-scale phenotyping studies. To demonstrate the unique capabilities and high-throughput capacity of this phenotyping platform for studying root systems, genome-wide association studies on rice (Oryza sativa) and maize (Zea mays) root growth were performed and root traits related to aluminium (Al) tolerance were analysed on the parents of the maize nested association mapping (NAM) population. © 2012 Blackwell Publishing Ltd.

  11. A web platform for the network analysis of high-throughput data in melanoma and its use to investigate mechanisms of resistance to anti-PD1 immunotherapy.

    PubMed

    Dreyer, Florian S; Cantone, Martina; Eberhardt, Martin; Jaitly, Tanushree; Walter, Lisa; Wittmann, Jürgen; Gupta, Shailendra K; Khan, Faiz M; Wolkenhauer, Olaf; Pützer, Brigitte M; Jäck, Hans-Martin; Heinzerling, Lucie; Vera, Julio

    2018-06-01

    Cellular phenotypes are established and controlled by complex and precisely orchestrated molecular networks. In cancer, mutations and dysregulations of multiple molecular factors perturb the regulation of these networks and lead to malignant transformation. High-throughput technologies are a valuable source of information to establish the complex molecular relationships behind the emergence of malignancy, but full exploitation of this massive amount of data requires bioinformatics tools that rely on network-based analyses. In this report we present the Virtual Melanoma Cell, an online tool developed to facilitate the mining and interpretation of high-throughput data on melanoma by biomedical researchers. The platform is based on a comprehensive, manually generated and expert-validated regulatory map composed of signaling pathways important in malignant melanoma. The Virtual Melanoma Cell is a tool designed to accept, visualize and analyze user-generated datasets. It is available at: https://www.vcells.net/melanoma. To illustrate the utilization of the web platform and the regulatory map, we have analyzed a large publicly available dataset accounting for anti-PD1 immunotherapy treatment of malignant melanoma patients. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Micellar Surfactant Association in the Presence of a Glucoside-based Amphiphile Detected via High-Throughput Small Angle X-ray Scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanic, Vesna; Broadbent, Charlotte; DiMasi, Elaine

    2016-11-14

    The interactions of mixtures of anionic and amphoteric surfactants with sugar amphiphiles were studied via high throughput small angle x-ray scattering (SAXS). The sugar amphiphile was composed of Caprate, Caprylate, and Oleate mixed ester of methyl glucoside, MeGCCO. Optimal surfactant interactions are sought which have desirable physical properties, which must be identified in a cost effective manner that can access the large phase space of possible molecular combinations. X-ray scattering patterns obtained via high throughput SAXS can probe a combinatorial sample space and reveal the incorporation of MeGCCO into the micelles and the molecular associations between surfactant molecules. Such data make it possible to efficiently assess the effects of the new amphiphiles in the formulation. A specific finding of this study is that formulations containing comparatively monodisperse and homogeneous surfactant mixtures can be reliably tuned by addition of NaCl, which swells the surfactant micelles with a monotonic dependence on salt concentration. In contrast, the presence of multiple different surfactants destroys clear correlations with NaCl concentration, even in otherwise similar series of formulations.

  13. Application Of Empirical Phase Diagrams For Multidimensional Data Visualization Of High Throughput Microbatch Crystallization Experiments.

    PubMed

    Klijn, Marieke E; Hubbuch, Jürgen

    2018-04-27

    Protein phase diagrams are a tool to investigate cause and consequence of solution conditions on protein phase behavior. The effects are scored according to aggregation morphologies such as crystals or amorphous precipitates. Solution conditions affect morphological features, such as crystal size, as well as kinetic features, such as crystal growth time. Commonly used data visualization techniques include individual line graphs or symbol-based phase diagrams. These techniques have limitations in terms of handling large datasets, comprehensiveness or completeness. To eliminate these limitations, morphological and kinetic features obtained from crystallization images generated with high throughput microbatch experiments have been visualized with radar charts in combination with the empirical phase diagram (EPD) method. Morphological features (crystal size, shape, and number, as well as precipitate size) and kinetic features (crystal and precipitate onset and growth time) are extracted for 768 solutions with varying chicken egg white lysozyme concentration, salt type, ionic strength and pH. Image-based aggregation morphology and kinetic features were compiled into a single and easily interpretable figure, thereby showing that the EPD method can support high throughput crystallization experiments in terms of both data volume and data complexity. Copyright © 2018. Published by Elsevier Inc.

  14. Comprehensive optical and data management infrastructure for high-throughput light-sheet microscopy of whole mouse brains.

    PubMed

    Müllenbroich, M Caroline; Silvestri, Ludovico; Onofri, Leonardo; Costantini, Irene; Hoff, Marcel Van't; Sacconi, Leonardo; Iannello, Giulio; Pavone, Francesco S

    2015-10-01

    Comprehensive mapping and quantification of neuronal projections in the central nervous system requires high-throughput imaging of large volumes with microscopic resolution. To this end, we have developed a confocal light-sheet microscope that has been optimized for three-dimensional (3-D) imaging of structurally intact clarified whole-mount mouse brains. We describe the optical and electromechanical arrangement of the microscope and give details on the organization of the microscope management software. The software orchestrates all components of the microscope, coordinates critical timing and synchronization, and has been written in a versatile and modular structure using the LabVIEW language. It can easily be adapted and integrated to other microscope systems and has been made freely available to the light-sheet community. The tremendous amount of data routinely generated by light-sheet microscopy further requires novel strategies for data handling and storage. To complete the full imaging pipeline of our high-throughput microscope, we further elaborate on big data management from streaming of raw images up to stitching of 3-D datasets. The mesoscale neuroanatomy imaged at micron-scale resolution in those datasets allows characterization and quantification of neuronal projections in unsectioned mouse brains.

  15. High-Throughput Fabrication of Flexible and Transparent All-Carbon Nanotube Electronics.

    PubMed

    Chen, Yong-Yang; Sun, Yun; Zhu, Qian-Bing; Wang, Bing-Wei; Yan, Xin; Qiu, Song; Li, Qing-Wen; Hou, Peng-Xiang; Liu, Chang; Sun, Dong-Ming; Cheng, Hui-Ming

    2018-05-01

    This study reports a simple and effective technique for the high-throughput fabrication of flexible all-carbon nanotube (CNT) electronics using a photosensitive dry film instead of traditional liquid photoresists. A 10 in. photosensitive dry film is laminated onto a flexible substrate by a roll-to-roll technology, and a 5 µm pattern resolution of the resulting CNT films is achieved for the construction of flexible and transparent all-CNT thin-film transistors (TFTs) and integrated circuits. The fabricated TFTs exhibit a desirable electrical performance including an on-off current ratio of more than 10^5, a carrier mobility of 33 cm^2 V^-1 s^-1, and a small hysteresis. The standard deviations of on-current and mobility are, respectively, 5% and 2% of the average value, demonstrating the excellent reproducibility and uniformity of the devices, which allows constructing a large noise margin inverter circuit with a voltage gain of 30. This study indicates that a photosensitive dry film is very promising for the low-cost, fast, reliable, and scalable fabrication of flexible and transparent CNT-based integrated circuits, and opens up opportunities for future high-throughput CNT-based printed electronics.

  16. RAMICS: trainable, high-speed and biologically relevant alignment of high-throughput sequencing reads to coding DNA

    PubMed Central

    Wright, Imogen A.; Travers, Simon A.

    2014-01-01

    The challenge presented by high-throughput sequencing necessitates the development of novel tools for accurate alignment of reads to reference sequences. Current approaches focus on using heuristics to map reads quickly to large genomes, rather than generating highly accurate alignments in coding regions. Such approaches are, thus, unsuited for applications such as amplicon-based analysis and the realignment phase of exome sequencing and RNA-seq, where accurate and biologically relevant alignment of coding regions is critical. To facilitate such analyses, we have developed a novel tool, RAMICS, that is tailored to mapping large numbers of sequence reads to short lengths (<10 000 bp) of coding DNA. RAMICS utilizes profile hidden Markov models to discover the open reading frame of each sequence and aligns to the reference sequence in a biologically relevant manner, distinguishing between genuine codon-sized indels and frameshift mutations. This approach facilitates the generation of highly accurate alignments, accounting for the error biases of the sequencing machine used to generate reads, particularly at homopolymer regions. Performance improvements are gained through the use of graphics processing units, which increase the speed of mapping through parallelization. RAMICS substantially outperforms all other mapping approaches tested in terms of alignment quality while maintaining highly competitive speed performance. PMID:24861618
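The distinction RAMICS draws between genuine codon-sized indels and frameshift mutations reduces to whether an indel length is a multiple of three. A minimal sketch, using a CIGAR-style alignment string as a stand-in for the aligner's output (the input format here is illustrative, not RAMICS's actual interface):

```python
import re

def frameshift_events(cigar):
    """Scan a CIGAR-style alignment string (e.g. '30M3D12M1I6M') and
    report each insertion (I) or deletion (D) as in-frame when its
    length is a multiple of three, otherwise as a frameshift."""
    events = []
    for length, op in re.findall(r"(\d+)([MID])", cigar):
        if op in "ID":
            kind = "in-frame" if int(length) % 3 == 0 else "frameshift"
            events.append((op, int(length), kind))
    return events

print(frameshift_events("30M3D12M1I6M"))
```

In homopolymer-error-prone data (e.g. pyrosequencing), most 1 bp events flagged this way are sequencing artifacts rather than true frameshifts, which is why RAMICS models machine-specific error biases rather than applying the rule naively.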

  17. 20180312 - Evaluating the applicability of read-across tools and high throughput screening data for food relevant chemicals (SOT)

    EPA Science Inventory

    Alternative toxicity assessment methods to characterize the hazards of chemical substances have been proposed to reduce animal testing and screen thousands of chemicals in an efficient manner. Resources to accomplish these goals include utilizing large in vitro chemical screening...

  18. High-Throughput resequencing of maize landraces at genomic regions associated with flowering time

    USDA-ARS?s Scientific Manuscript database

    Despite the reduction in the price of sequencing, it remains expensive to sequence and assemble whole, complex genomes of multiple samples for population studies, particularly for large genomes like those of many crop species. Enrichment of target genome regions coupled with next generation sequenci...

  19. Comparison of L1000 and Affymetrix Microarray for In Vitro Concentration-Response Gene Expression Profiling (SOT)

    EPA Science Inventory

    Advances in high-throughput screening technologies and in vitro systems have opened doors for cost-efficient evaluation of chemical effects on a diversity of biological endpoints. However, toxicogenomics platforms remain too costly to evaluate large libraries of chemicals in conc...

  20. NREL Opens Large Database of Inorganic Thin-Film Materials | News | NREL

    Science.gov Websites

    April 3, 2018. An extensive experimental database of inorganic thin-film materials developed at the National Renewable Energy Laboratory (NREL) is now publicly available as the High Throughput Experimental Materials Database (HTEM DB).

  1. Identification of vascular disruptor compounds by analysis in zebrafish embryos and mouse embryonic endothelial cells

    EPA Science Inventory

    The large number of diverse chemicals in production or in the environment has motivated medium to high throughput in vitro or small animal approaches to efficiently profile chemical-biological interactions and to utilize this information to assess risks of chemical exposures on h...

  2. A Call for Nominations of Quantitative High-Throughput Screening Assays from Relevant Human Toxicity Pathways

    EPA Science Inventory

    The National Research Council of the United States National Academies of Science has recently released a document outlining a long-range vision and strategy for transforming toxicity testing from largely whole animal-based testing to one based on in vitro assays. “Toxicity Testin...

  3. Validation, acceptance, and extension of a predictive model of reproductive toxicity using ToxCast data

    EPA Science Inventory

    The EPA ToxCast research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I tested 309 well-characterized chemicals (mostly pesticides) in over 500 assays of different molecular targets, cellular responses an...

  4. Computational prediction of dermal diffusivity for large number of chemicals – challenges and applications

    EPA Science Inventory

    The assessment of risk from dermal exposure for thousands of chemicals, such as consumer products, due to their potential to enter the environment as contaminants is a daunting task. A strategy has been developed to integrate high-throughput technologies with toxicity, known as ...

  5. Biological profiling and dose-response modeling tools, characterizing uncertainty

    EPA Science Inventory

    Through its ToxCast project, the U.S. EPA has developed a battery of in vitro high throughput screening (HTS) assays designed to assess the potential toxicity of environmental chemicals. At present, over 1800 chemicals have been tested in up to 600 assays, yielding a large number...

  6. The US EPAs ToxCast Program for the Prioritization and Prediction of Environmental Chemical Toxicity

    EPA Science Inventory

    To meet the need for evaluating large numbers of chemicals for potential toxicity, the U.S. Environmental Protection Agency has initiated a research project called ToxCast that makes use of recent advances in molecular biology and high-throughput screening. These technologies have ...

  7. Predictive Signatures of Developmental Toxicity Modeled with HTS Data from ToxCast™ Bioactivity Profiles

    EPA Science Inventory

    The EPA ToxCast™ research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I contains 309 well-characterized chemicals which are mostly pesticides tested in over 600 assays of different molecular targets, cel...

  8. Use of High-Throughput Cell-Based and Model Organism Assays for Understanding the Potential Toxicity of Engineered Nanomaterials

    EPA Science Inventory

    The rapidly expanding field of nanotechnology is introducing a large number and diversity of engineered nanomaterials into research and commerce with concordant uncertainty regarding the potential adverse health and ecological effects. With costs and time of traditional animal to...

  9. Extraction of drainage networks from large terrain datasets using high throughput computing

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Xie, Jibo

    2009-02-01

    Advanced digital photogrammetry and remote sensing technology produces large terrain datasets (LTD). How to process and use these LTD has become a big challenge for GIS users. Extracting drainage networks, which are fundamental to hydrological applications, from LTD is one of the typical applications of digital terrain analysis (DTA) in geographical information applications. Existing serial drainage algorithms cannot deal with large data volumes in a timely fashion, and few GIS platforms can process LTD beyond the gigabyte scale. High throughput computing (HTC), a distributed parallel computing mode, is proposed to improve the efficiency of drainage network extraction from LTD. Drainage network extraction using HTC involves two key issues: (1) how to decompose the large DEM datasets into independent computing units and (2) how to merge the separate outputs into a final result. A new decomposition method is presented in which the large datasets are partitioned into independent computing units using natural watershed boundaries instead of regular one-dimensional (strip-wise) or two-dimensional (block-wise) decomposition. Because the distribution of drainage networks is strongly related to watershed boundaries, the new decomposition method is more effective and natural. The method to extract natural watershed boundaries was improved by using multi-scale DEMs instead of single-scale DEMs. An HTC environment is employed to test the proposed methods with real datasets.
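The decompose/process/merge pattern described above can be illustrated with a toy harness. The watershed units and the "extraction" criterion below are placeholders; a real deployment would dispatch DEM tiles to an HTC scheduler rather than an in-process thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def extract_unit(unit):
    """Stand-in for per-watershed drainage extraction: here we just
    keep cells below a toy flow-accumulation threshold."""
    name, cells = unit
    return name, [c for c in cells if c < 10]

def extract_parallel(units, workers=4):
    """Decompose into independent units, process them concurrently,
    then merge the per-unit outputs into one result."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(extract_unit, units))

# Hypothetical watershed units: (name, list of DEM cell values)
units = [("basin_a", [3, 15, 7]), ("basin_b", [22, 1])]
print(extract_parallel(units))
```

Because watershed boundaries make the units independent, the merge step is a plain dictionary union with no cross-unit reconciliation, which is exactly the property the abstract's decomposition method is designed to secure.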

  10. Arrayed water-in-oil droplet bilayers for membrane transport analysis.

    PubMed

    Watanabe, R; Soga, N; Hara, M; Noji, H

    2016-08-02

    The water-in-oil droplet bilayer is a simple and useful lipid bilayer system for membrane transport analysis. The droplet interface bilayer is readily formed by the contact of two water-in-oil droplets enwrapped by a phospholipid monolayer. However, it is difficult to control the size of individual droplets with femtoliter volumes in a high-throughput manner, resulting in low sensitivity and throughput of membrane transport analysis. To overcome this drawback, in this study, we developed a novel micro-device in which a large number of droplet interface bilayers (>500) are formed at a time by using femtoliter-sized droplet arrays immobilized on a hydrophobic/hydrophilic substrate. The droplet volume was controllable from 3.5 to 350 fL by changing the hydrophobic/hydrophilic pattern on the device, allowing high-throughput analysis of membrane transport mechanisms including membrane permeability to solutes (e.g., ions or small molecules) with or without the aid of transport proteins. Thus, this novel platform broadens the versatility of water-in-oil droplet bilayers and will pave the way for novel analytical and pharmacological applications such as drug screening.

  11. PREVAIL: IBM's e-beam technology for next generation lithography

    NASA Astrophysics Data System (ADS)

    Pfeiffer, Hans C.

    2000-07-01

    PREVAIL - Projection Reduction Exposure with Variable Axis Immersion Lenses represents the high throughput e-beam projection approach to NGL which IBM is pursuing in cooperation with Nikon Corporation as alliance partner. This paper discusses the challenges and accomplishments of the PREVAIL project. The supreme challenge facing all e-beam lithography approaches has been and still is throughput. Since the throughput of e-beam projection systems is severely limited by the available optical field size, the key to success is the ability to overcome this limitation. The PREVAIL technique overcomes field-limiting off-axis aberrations through the use of variable axis lenses, which electronically shift the optical axis simultaneously with the deflected beam so that the beam effectively remains on axis. The resist images obtained with the Proof-of-Concept (POC) system demonstrate that PREVAIL effectively eliminates off-axis aberrations affecting both resolution and placement accuracy of pixels. As part of the POC system a high emittance gun has been developed to provide uniform illumination of the patterned subfield and to fill the large numerical aperture projection optics designed to significantly reduce beam blur caused by Coulomb interaction.

  12. High Throughput PBTK: Open-Source Data and Tools for ...

    EPA Pesticide Factsheets

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  13. Mining high-throughput experimental data to link gene and function.

    PubMed

    Blaby-Haas, Crysten E; de Crécy-Lagard, Valérie

    2011-04-01

    Nearly 2200 genomes that encode around 6 million proteins have now been sequenced. Around 40% of these proteins are of unknown function, even when function is loosely and minimally defined as 'belonging to a superfamily'. In addition to in silico methods, the swelling stream of high-throughput experimental data can give valuable clues for linking these unknowns with precise biological roles. The goal is to develop integrative data-mining platforms that allow the scientific community at large to access and utilize this rich source of experimental knowledge. To this end, we review recent advances in generating whole-genome experimental datasets, where this data can be accessed, and how it can be used to drive prediction of gene function. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Solid-Phase Extraction Strategies to Surmount Body Fluid Sample Complexity in High-Throughput Mass Spectrometry-Based Proteomics

    PubMed Central

    Bladergroen, Marco R.; van der Burgt, Yuri E. M.

    2015-01-01

    For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS-acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply for both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE) including affinity enrichment strategies have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071

  15. Integrated crystal mounting and alignment system for high-throughput biological crystallography

    DOEpatents

    Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William F.; Yegian, Derek T.; Earnest, Thomas N.; Jaklevich, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.

    2007-09-25

    A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.

  16. Integrated crystal mounting and alignment system for high-throughput biological crystallography

    DOEpatents

    Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William; Yegian, Derek; Earnest, Thomas N.; Jaklevic, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.

    2005-07-19

    A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.

  17. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  18. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein I; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-05

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  19. SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis.

    PubMed

    Tanabata, Takanari; Shibaya, Taeko; Hori, Kiyosumi; Ebana, Kaworu; Yano, Masahiro

    2012-12-01

    Seed shape and size are among the most important agronomic traits because they affect yield and market price. To obtain accurate seed size data, a large number of measurements are needed because there is little difference in size among seeds from one plant. To promote genetic analysis and selection for seed shape in plant breeding, efficient, reliable, high-throughput seed phenotyping methods are required. We developed SmartGrain software for high-throughput measurement of seed shape. This software uses a new image analysis method to reduce the time taken in the preparation of seeds and in image capture. Outlines of seeds are automatically recognized from digital images, and several shape parameters, such as seed length, width, area, and perimeter length, are calculated. To validate the software, we performed a quantitative trait locus (QTL) analysis for rice (Oryza sativa) seed shape using backcrossed inbred lines derived from a cross between japonica cultivars Koshihikari and Nipponbare, which showed small differences in seed shape. SmartGrain removed areas of awns and pedicels automatically, and several QTLs were detected for six shape parameters. The allelic effect of a QTL for seed length detected on chromosome 11 was confirmed in advanced backcross progeny; the cv Nipponbare allele increased seed length and, thus, seed weight. High-throughput measurement with SmartGrain reduced sampling error and made it possible to distinguish between lines with small differences in seed shape. SmartGrain could accurately recognize seed not only of rice but also of several other species, including Arabidopsis (Arabidopsis thaliana). The software is free to researchers.
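The basic shape parameters SmartGrain reports (length, width, area, perimeter) can be computed from a seed's outline polygon with standard geometry. A minimal sketch using the shoelace formula for area, summed edge lengths for perimeter, and an axis-aligned bounding box for length/width; SmartGrain's own image-analysis pipeline is considerably more sophisticated:

```python
import math

def shape_parameters(contour):
    """Area (shoelace), perimeter, and bounding-box length/width
    for a closed polygon contour given as (x, y) vertices."""
    n = len(contour)
    area = 0.0
    perimeter = 0.0
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        area += x1 * y2 - x2 * y1       # shoelace cross term
        perimeter += math.hypot(x2 - x1, y2 - y1)
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    extent_x = max(xs) - min(xs)
    extent_y = max(ys) - min(ys)
    return {"area": abs(area) / 2, "perimeter": perimeter,
            "length": max(extent_x, extent_y),
            "width": min(extent_x, extent_y)}

# A 4 x 2 rectangle stands in for a seed outline traced from an image
print(shape_parameters([(0, 0), (4, 0), (4, 2), (0, 2)]))
```

In practice the contour would come from thresholding and boundary tracing of the scanned image, and length/width would be measured along the seed's principal axis rather than the image axes.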

  20. A paper-based microbial fuel cell array for rapid and high-throughput screening of electricity-producing bacteria.

    PubMed

    Choi, Gihoon; Hassett, Daniel J; Choi, Seokheun

    2015-06-21

    There is a large global effort to improve microbial fuel cell (MFC) techniques and advance their translational potential toward practical, real-world applications. Significant boosts in MFC performance can be achieved with the development of new techniques in synthetic biology that can regulate microbial metabolic pathways or control their gene expression. For these new directions, a high-throughput and rapid screening tool for microbial biopower production is needed. In this work, a 48-well, paper-based sensing platform was developed for the high-throughput and rapid characterization of the electricity-producing capability of microbes. 48 spatially distinct wells of a sensor array were prepared by patterning 48 hydrophilic reservoirs on paper with hydrophobic wax boundaries. This paper-based platform exploited the ability of paper to quickly wick fluid and promoted bacterial attachment to the anode pads, resulting in instant current generation upon loading of the bacterial inoculum. We validated the utility of our MFC array by studying how strategic genetic modifications impacted the electrochemical activity of various Pseudomonas aeruginosa mutant strains. Within just 20 minutes, we successfully determined the electricity generation capacity of eight isogenic mutants of P. aeruginosa. These efforts demonstrate that our MFC array displays highly comparable performance characteristics and identifies genes in P. aeruginosa that can trigger a higher power density.

  1. High-throughput Molecular Simulations of MOFs for CO2 Separation: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Erucar, Ilknur; Keskin, Seda

    2018-02-01

Metal organic frameworks (MOFs) have emerged as great alternatives to traditional nanoporous materials for CO2 separation applications. MOFs are porous materials that are formed by self-assembly of transition metals and organic ligands. The most important advantage of MOFs over well-known porous materials is the possibility to generate multiple materials with varying structural properties and chemical functionalities by changing the combination of metal centers and organic linkers during the synthesis. This leads to a large diversity of materials with various pore sizes and shapes that can be efficiently used for CO2 separations. Since the number of synthesized MOFs has already reached several thousand, experimental investigation of each MOF at the lab-scale is not practical. High-throughput computational screening of MOFs is a great opportunity to identify the best materials for CO2 separation and to gain molecular-level insights into the structure-performance relationships. This type of knowledge can be used to design new materials with the desired structural features that can lead to extraordinarily high CO2 selectivities. In this mini-review, we focused on developments in high-throughput molecular simulations of MOFs for CO2 separations. After reviewing the current studies on this topic, we discussed the opportunities and challenges in the field and addressed the potential future developments.

  2. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    PubMed Central

    2010-01-01

    Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. 
We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Conclusions Our results indicate that the preferred transformation for fluorescence channels is a parameter-optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available. PMID:21050468

  3. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    PubMed

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. 
We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Our results indicate that the preferred transformation for fluorescence channels is a parameter-optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available.
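The parameter-optimization idea described in these two records can be sketched for a single transformation. Here the generalized hyperbolic arcsine is tuned by minimizing absolute skewness, a crude stand-in for the paper's maximum-likelihood criteria; none of the names below belong to the actual flowTrans API:

```python
import numpy as np

def arcsinh_transform(x, b=1.0):
    """Generalized hyperbolic arcsine: near-linear around zero and
    logarithmic for large values, so it compresses data spanning
    several orders of magnitude."""
    return np.arcsinh(x / b)

def optimize_scale(x, candidates=np.logspace(-1, 3, 50)):
    """Choose the scale parameter b whose transformed data is most
    symmetric; absolute skewness stands in for a likelihood criterion."""
    def abs_skew(y):
        y = y - y.mean()
        return abs(float((y ** 3).mean() / (y ** 2).mean() ** 1.5))
    return min(candidates, key=lambda b: abs_skew(arcsinh_transform(x, b)))

rng = np.random.default_rng(0)
# Simulated fluorescence intensities spanning several orders of magnitude
x = np.exp(rng.normal(5.0, 1.5, 10_000))
b = optimize_scale(x)
y = arcsinh_transform(x, b)
```

With a well-chosen scale, the heavily right-skewed raw intensities become approximately symmetric, which is what makes downstream automated gating more stable.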

  4. Scalable Manufacturing of Plasmonic Nanodisk Dimers and Cusp Nanostructures using Salting-out Quenching Method and Colloidal Lithography

    PubMed Central

    Juluri, Bala Krishna; Chaturvedi, Neetu; Hao, Qingzhen; Lu, Mengqian; Velegol, Darrell; Jensen, Lasse; Huang, Tony Jun

    2014-01-01

    Localization of large electric fields in plasmonic nanostructures enables various processes such as single molecule detection, higher harmonic light generation, and control of molecular fluorescence and absorption. High-throughput, simple nanofabrication techniques are essential for implementing plasmonic nanostructures with large electric fields for practical applications. In this article we demonstrate a scalable, rapid, and inexpensive fabrication method based on the salting-out quenching technique and colloidal lithography for the fabrication of two types of nanostructures with large electric field: nanodisk dimers and cusp nanostructures. Our technique relies on fabricating polystyrene doublets from single beads by controlled aggregation and later using them as soft masks to fabricate metal nanodisk dimers and nanocusp structures. Both of these structures have a well-defined geometry for the localization of large electric fields comparable to structures fabricated by conventional nanofabrication techniques. We also show that various parameters in the fabrication process can be adjusted to tune the geometry of the final structures and control their plasmonic properties. With advantages in throughput, cost, and geometric tunability, our fabrication method can be valuable in many applications that require plasmonic nanostructures with large electric fields. PMID:21692473

  5. Bayesian inference with historical data-based informative priors improves detection of differentially expressed genes.

    PubMed

    Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S

    2016-03-01

Modern high-throughput biotechnologies such as microarray are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples are assayed, giving rise to the classical 'large p, small n' problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods, including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the 'large p, small n' problem. Our method is implemented in R package IPBT, which is freely available from https://github.com/benliemory/IPBT. CONTACT: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
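The prior-building strategy can be illustrated with the simplest conjugate case: a normal prior on a gene's expression level whose mean and variance come from historical data. This is a hedged sketch of the general idea, not the IPBT implementation:

```python
def posterior_mean(new_obs, prior_mean, prior_var, obs_var):
    """Conjugate normal-normal update: the historical repository supplies an
    informative prior N(prior_mean, prior_var); the posterior mean shrinks
    the new-sample mean toward it, weighted by the relative precisions
    (1/variance) of the data and the prior."""
    n = len(new_obs)
    xbar = sum(new_obs) / n
    data_precision = n / obs_var
    prior_precision = 1 / prior_var
    w = data_precision / (data_precision + prior_precision)
    return w * xbar + (1 - w) * prior_mean

# A small new experiment (n = 3, the 'small n' side) pulled toward a
# historical prior centered at 0 (all numbers hypothetical)
shrunk = posterior_mean([2.0, 2.0, 2.0], prior_mean=0.0, prior_var=1.0,
                        obs_var=4.0)
```

The smaller the new sample or the noisier its measurements, the more weight the historical prior receives, which is exactly how borrowing strength from existing data mitigates the 'large p, small n' problem.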

  6. Systematic Analysis of Zn2Cys6 Transcription Factors Required for Development and Pathogenicity by High-Throughput Gene Knockout in the Rice Blast Fungus

    PubMed Central

    Huang, Pengyun; Lin, Fucheng

    2014-01-01

Because of the great challenges and heavy workload involved in deleting genes on a large scale, the functions of most genes in pathogenic fungi are still unclear. In this study, we developed a high-throughput gene knockout system using a novel yeast-Escherichia-Agrobacterium shuttle vector, pKO1B, in the rice blast fungus Magnaporthe oryzae. Using this method, we deleted 104 fungal-specific Zn2Cys6 transcription factor (TF) genes in M. oryzae. We then analyzed the phenotypes of these mutants with regard to growth, asexual and infection-related development, pathogenesis, and responses to nine abiotic stresses. The resulting data provide new insights into how this rice pathogen of global significance regulates important traits in the infection cycle through Zn2Cys6 TF genes. A large variation in biological functions of Zn2Cys6 TF genes was observed under the conditions tested. Sixty-one of 104 Zn2Cys6 TF genes were found to be required for fungal development. In-depth analysis of TF genes revealed that TF genes involved in pathogenicity frequently tend to function in multiple development stages, and disclosed many highly conserved but unidentified functional TF genes of importance in the fungal kingdom. We further found that the virulence-required TF genes GPF1 and CNF2 have similar regulation mechanisms in the gene expression involved in pathogenicity. These experimental validations clearly demonstrated the value of a high-throughput gene knockout system in understanding the biological functions of genes on a genome scale in fungi, and provided a solid foundation for elucidating the gene expression network that regulates the development and pathogenicity of M. oryzae. PMID:25299517

  7. Application of ToxCast High-Throughput Screening and ...

    EPA Pesticide Factsheets

Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  8. Cell-based medicinal chemistry optimization of high-throughput screening (HTS) hits for orally active antimalarials. Part 1: challenges in potency and absorption, distribution, metabolism, excretion/pharmacokinetics (ADME/PK).

    PubMed

    Chatterjee, Arnab K

    2013-10-24

    Malaria represents a significant health issue, and novel and effective drugs are needed to address parasite resistance that has emerged to the current drug arsenal. Antimalarial drug discovery has historically benefited from a whole-cell (phenotypic) screening approach to identify lead molecules. This approach has been utilized by several groups to optimize weakly active antimalarial pharmacophores, such as the quinolone scaffold, to yield potent and highly efficacious compounds that are now poised to enter clinical trials. More recently, GNF/Novartis, GSK, and others have employed the same approach in high-throughput screening (HTS) of large compound libraries to find novel scaffolds that have also been optimized to clinical candidates by GNF/Novartis. This perspective outlines some of the inherent challenges in cell-based medicinal chemistry optimization, including optimization of oral exposure and hERG activity.

  9. Multiplex amplification of large sets of human exons.

    PubMed

    Porreca, Gregory J; Zhang, Kun; Li, Jin Billy; Xie, Bin; Austin, Derek; Vassallo, Sara L; LeProust, Emily M; Peck, Bill J; Emig, Christopher J; Dahl, Fredrik; Gao, Yuan; Church, George M; Shendure, Jay

    2007-11-01

    A new generation of technologies is poised to reduce DNA sequencing costs by several orders of magnitude. But our ability to fully leverage the power of these technologies is crippled by the absence of suitable 'front-end' methods for isolating complex subsets of a mammalian genome at a scale that matches the throughput at which these platforms will routinely operate. We show that targeting oligonucleotides released from programmable microarrays can be used to capture and amplify approximately 10,000 human exons in a single multiplex reaction. Additionally, we show integration of this protocol with ultra-high-throughput sequencing for targeted variation discovery. Although the multiplex capture reaction is highly specific, we found that nonuniform capture is a key issue that will need to be resolved by additional optimization. We anticipate that highly multiplexed methods for targeted amplification will enable the comprehensive resequencing of human exons at a fraction of the cost of whole-genome resequencing.

  10. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    PubMed

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

Structural information and activity data have increased rapidly for many protein targets over the past decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. High-throughput measurement of recombination rates and genetic interference in Saccharomyces cerevisiae.

    PubMed

    Raffoux, Xavier; Bourge, Mickael; Dumas, Fabrice; Martin, Olivier C; Falque, Matthieu

    2018-06-01

    Allelic recombination owing to meiotic crossovers is a major driver of genome evolution, as well as a key player for the selection of high-performing genotypes in economically important species. Therefore, we developed a high-throughput and low-cost method to measure recombination rates and crossover patterning (including interference) in large populations of the budding yeast Saccharomyces cerevisiae. Recombination and interference were analysed by flow cytometry, which allows time-consuming steps such as tetrad microdissection or spore growth to be avoided. Moreover, our method can also be used to compare recombination in wild-type vs. mutant individuals or in different environmental conditions, even if the changes in recombination rates are small. Furthermore, meiotic mutants often present recombination and/or pairing defects affecting spore viability but our method does not involve growth steps and thus avoids filtering out non-viable spores. Copyright © 2018 John Wiley & Sons, Ltd.

  12. A Low-Cost, Reliable, High-Throughput System for Rodent Behavioral Phenotyping in a Home Cage Environment

    PubMed Central

    Parkison, Steven A.; Carlson, Jay D.; Chaudoin, Tammy R.; Hoke, Traci A.; Schenk, A. Katrin; Goulding, Evan H.; Pérez, Lance C.; Bonasera, Stephen J.

    2016-01-01

    Inexpensive, high-throughput, low maintenance systems for precise temporal and spatial measurement of mouse home cage behavior (including movement, feeding, and drinking) are required to evaluate products from large scale pharmaceutical design and genetic lesion programs. These measurements are also required to interpret results from more focused behavioral assays. We describe the design and validation of a highly-scalable, reliable mouse home cage behavioral monitoring system modeled on a previously described, one-of-a-kind system [1]. Mouse position was determined by solving static equilibrium equations describing the force and torques acting on the system strain gauges; feeding events were detected by a photobeam across the food hopper, and drinking events were detected by a capacitive lick sensor. Validation studies show excellent agreement between mouse position and drinking events measured by the system compared with video-based observation – a gold standard in neuroscience. PMID:23366406
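The static-equilibrium position solve described above reduces to a force-weighted centroid of the gauge locations; a minimal sketch with hypothetical gauge coordinates and readings:

```python
def load_position(gauge_xy, forces):
    """Static equilibrium for a platform resting on several strain gauges.
    Force balance:   sum(F_i) = W
    Torque balance:  sum(F_i * x_i) = W * x   (and likewise for y),
    so the load sits at the force-weighted centroid of the gauges."""
    W = sum(forces)
    x = sum(f * gx for f, (gx, _gy) in zip(forces, gauge_xy)) / W
    y = sum(f * gy for f, (_gx, gy) in zip(forces, gauge_xy)) / W
    return x, y

# Four gauges under the corners of a hypothetical 20 x 30 cm cage floor,
# with hypothetical readings in grams-force
corners = [(0, 0), (20, 0), (0, 30), (20, 30)]
readings = [10.0, 10.0, 5.0, 5.0]
pos = load_position(corners, readings)
```

Because the readings weight the front gauges more heavily, the solved position sits forward of the floor's geometric center; tracking this centroid over time yields the movement trace the system records.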

  13. ClusCo: clustering and comparison of protein models.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej

    2013-02-22

The development, optimization and validation of protein modeling methods require efficient tools for structural comparison. Frequently, a large number of models need to be compared with the target native structure. The main reason for the development of Clusco software was to create a high-throughput tool for all-versus-all comparison, because calculating the similarity matrix is one of the bottlenecks in the protein modeling pipeline. Clusco is fast and easy-to-use software for high-throughput comparison of protein models with different similarity measures (cRMSD, dRMSD, GDT_TS, TM-Score, MaxSub, Contact Map Overlap) and clustering of the comparison results with standard methods: K-means Clustering or Hierarchical Agglomerative Clustering. The application was highly optimized and written in C/C++, including the code for parallel execution on CPU and GPU, which resulted in a significant speedup over similar clustering and scoring computation programs.
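Of the similarity measures listed, dRMSD is the simplest to sketch because it needs no superposition; the following is an illustrative implementation, not ClusCo's optimized C/C++ code:

```python
import itertools
import math

def drmsd(a, b):
    """Distance RMSD between two conformations (lists of 3D coordinates):
    the RMS difference of all intramolecular pairwise distances. It is
    alignment-free, unlike cRMSD, which requires an optimal superposition."""
    def pair_dists(xyz):
        return [math.dist(p, q) for p, q in itertools.combinations(xyz, 2)]
    da, db = pair_dists(a), pair_dists(b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(da, db)) / len(da))

# Two toy 4-atom "models"; the second is a rigid translation of the first,
# so their internal distance sets are identical and dRMSD is zero
m1 = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 1)]
m2 = [(5, 5, 5), (6, 5, 5), (6, 6, 5), (5, 6, 6)]
```

An all-versus-all comparison simply evaluates this measure over every model pair, which is why the similarity matrix grows quadratically and becomes the pipeline bottleneck the abstract mentions.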

  14. Characterization of DNA-protein interactions using high-throughput sequencing data from pulldown experiments

    NASA Astrophysics Data System (ADS)

    Moreland, Blythe; Oman, Kenji; Curfman, John; Yan, Pearlly; Bundschuh, Ralf

    Methyl-binding domain (MBD) protein pulldown experiments have been a valuable tool in measuring the levels of methylated CpG dinucleotides. Due to the frequent use of this technique, high-throughput sequencing data sets are available that allow a detailed quantitative characterization of the underlying interaction between methylated DNA and MBD proteins. Analyzing such data sets, we first found that two such proteins cannot bind closer to each other than 2 bp, consistent with structural models of the DNA-protein interaction. Second, the large amount of sequencing data allowed us to find rather weak but nevertheless clearly statistically significant sequence preferences for several bases around the required CpG. These results demonstrate that pulldown sequencing is a high-precision tool in characterizing DNA-protein interactions. This material is based upon work supported by the National Science Foundation under Grant No. DMR-1410172.

  15. Precise, High-throughput Analysis of Bacterial Growth.

    PubMed

    Kurokawa, Masaomi; Ying, Bei-Wen

    2017-09-19

    Bacterial growth is a central concept in the development of modern microbial physiology, as well as in the investigation of cellular dynamics at the systems level. Recent studies have reported correlations between bacterial growth and genome-wide events, such as genome reduction and transcriptome reorganization. Correctly analyzing bacterial growth is crucial for understanding the growth-dependent coordination of gene functions and cellular components. Accordingly, the precise quantitative evaluation of bacterial growth in a high-throughput manner is required. Emerging technological developments offer new experimental tools that allow updates of the methods used for studying bacterial growth. The protocol introduced here employs a microplate reader with a highly optimized experimental procedure for the reproducible and precise evaluation of bacterial growth. This protocol was used to evaluate the growth of several previously described Escherichia coli strains. The main steps of the protocol are as follows: the preparation of a large number of cell stocks in small vials for repeated tests with reproducible results, the use of 96-well plates for high-throughput growth evaluation, and the manual calculation of two major parameters (i.e., maximal growth rate and population density) representing the growth dynamics. In comparison to the traditional colony-forming unit (CFU) assay, which counts the cells that are cultured in glass tubes over time on agar plates, the present method is more efficient and provides more detailed temporal records of growth changes, but has a stricter detection limit at low population densities. In summary, the described method is advantageous for the precise and reproducible high-throughput analysis of bacterial growth, which can be used to draw conceptual conclusions or to make theoretical observations.
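The two growth parameters mentioned (maximal growth rate and saturated population density) can be extracted from an OD time series by taking the steepest sliding-window slope of log OD; a minimal sketch with hypothetical readings, not the protocol's exact calculation:

```python
import math

def growth_parameters(times, ods, window=3):
    """Estimate the maximal growth rate as the largest sliding-window slope
    of ln(OD) per unit time, and the population density as the maximum OD."""
    log_od = [math.log(od) for od in ods]
    rates = []
    for i in range(len(times) - window):
        dt = times[i + window] - times[i]
        rates.append((log_od[i + window] - log_od[i]) / dt)
    return max(rates), max(ods)

# Hypothetical hourly OD600 readings: lag phase, exponential phase, saturation
times = [0, 1, 2, 3, 4, 5, 6, 7, 8]
ods = [0.01, 0.011, 0.02, 0.04, 0.08, 0.16, 0.30, 0.45, 0.50]
mu_max, od_max = growth_parameters(times, ods)
```

In this toy series the culture doubles every hour between hours 2 and 5, so the estimated maximal rate is ln(2) per hour; the windowed slope is what makes the estimate robust to single-read noise.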

  16. A high throughput architecture for a low complexity soft-output demapping algorithm

    NASA Astrophysics Data System (ADS)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

Iterative channel decoders such as Turbo-Code and LDPC decoders show exceptional performance and therefore they are a part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the logarithmic likelihood ratio (LLR) of the received bits with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high order modulation systems, and therefore low complexity demapping algorithms are indispensable in low power receivers. In the presence of multiple wireless communication standards, where each standard defines multiple modulation schemes, there is a need for an efficient demapper architecture covering the flexibility requirements of all these standards. Another challenge associated with hardware implementation of the demapper is to achieve a very high throughput in double iterative systems, for instance, MIMO and Code-Aided Synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high throughput, flexible, and area efficient architecture. We describe architectures to execute the investigated algorithms. We implement these architectures on an FPGA device to evaluate their hardware performance. The work has resulted in a hardware architecture, based on the best-performing low-complexity algorithm, that delivers a high throughput of 166 Msymbols/second for Gray mapped 16-QAM modulation on Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs and 2 DSP48Es.
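A common low-complexity baseline for the soft demapper described here is the max-log LLR approximation; the sketch below uses QPSK for brevity (the paper targets Gray-mapped 16-QAM) and is an algorithmic illustration, not the hardware architecture:

```python
def maxlog_llrs(r, constellation, bits_per_symbol, noise_var):
    """Max-log LLR approximation: for each bit position, the LLR is the
    difference between the minimum squared Euclidean distances from the
    received symbol r to the constellation points carrying bit 1 vs bit 0,
    scaled by the noise variance. Positive values favor bit 0 here."""
    llrs = []
    for b in range(bits_per_symbol):
        d0 = min(abs(r - s) ** 2 for s, bits in constellation if bits[b] == 0)
        d1 = min(abs(r - s) ** 2 for s, bits in constellation if bits[b] == 1)
        llrs.append((d1 - d0) / noise_var)
    return llrs

# Gray-mapped QPSK: (bit0, bit1) -> complex symbol
qpsk = [((1 + 1j), (0, 0)), ((-1 + 1j), (0, 1)),
        ((-1 - 1j), (1, 1)), ((1 - 1j), (1, 0))]
# A received noisy symbol near 1+1j: both LLRs should favor bits (0, 0)
llr = maxlog_llrs(0.9 + 1.1j, qpsk, 2, noise_var=0.5)
```

Replacing the exact log-sum of exponentials with a minimum-distance search is what reduces the hardware cost; for Gray-mapped constellations the per-bit searches simplify further to the piecewise-linear expressions that low-complexity demappers exploit.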

  17. Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.

    PubMed

    Buske, Christine; Gerlai, Robert

    2014-08-30

    Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data the main bottleneck that slows throughput is the time consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.

  18. Synthetic Biomaterials to Rival Nature's Complexity-a Path Forward with Combinatorics, High-Throughput Discovery, and High-Content Analysis.

    PubMed

    Zhang, Douglas; Lee, Junmin; Kilian, Kristopher A

    2017-10-01

    Cells in tissue receive a host of soluble and insoluble signals in a context-dependent fashion, where integration of these cues through a complex network of signal transduction cascades will define a particular outcome. Biomaterials scientists and engineers are tasked with designing materials that can at least partially recreate this complex signaling milieu towards new materials for biomedical applications. In this progress report, recent advances in high throughput techniques and high content imaging approaches that are facilitating the discovery of efficacious biomaterials are described. From microarrays of synthetic polymers, peptides and full-length proteins, to designer cell culture systems that present multiple biophysical and biochemical cues in tandem, it is discussed how the integration of combinatorics with high content imaging and analysis is essential to extracting biologically meaningful information from large scale cellular screens to inform the design of next generation biomaterials. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    PubMed

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  20. High Throughput Screening For Hazard and Risk of Environmental Contaminants

    EPA Science Inventory

    High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...

  1. The impact of the condenser on cytogenetic image quality in digital microscope system.

    PubMed

    Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong

    2013-01-01

Optimizing operational parameters of the digital microscope system is an important technique to acquire high quality cytogenetic images and facilitate the process of karyotyping so that the efficiency and accuracy of diagnosis can be improved. This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Both theoretical analysis and experimental validations through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens were conducted. The results show that the optimal image quality and large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%-70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and less restriction for the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using the high throughput continuous scanning microscopes in clinical practice.

  2. Loss of heterozygosity assay for molecular detection of cancer using energy-transfer primers and capillary array electrophoresis.

    PubMed

    Medintz, I L; Lee, C C; Wong, W W; Pirkola, K; Sidransky, D; Mathies, R A

    2000-08-01

    Microsatellite DNA loci are useful markers for the detection of loss of heterozygosity (LOH) and microsatellite instability (MI) associated with primary cancers. To carry out large-scale studies of LOH and MI in cancer progression, high-throughput instrumentation and assays with high accuracy and sensitivity need to be validated. DNA was extracted from 26 renal tumor and paired lymphocyte samples and amplified with two-color energy-transfer (ET) fluorescent primers specific for loci associated with cancer-induced chromosomal changes. PCR amplicons were separated on the MegaBACE-1000 96 capillary array electrophoresis (CAE) instrument and analyzed with MegaBACE Genetic Profiler v.1.0 software. Ninety-six separations were achieved in parallel in 75 minutes. Loss of heterozygosity was easily detected in tumor samples as was the gain/loss of microsatellite core repeats. Allelic ratios were determined with a precision of ±10% or better. Prior analysis of these samples with slab gel electrophoresis and radioisotope labeling had not detected these changes with as much sensitivity or precision. This study establishes the validity of this assay and the MegaBACE instrument for large-scale, high-throughput studies of the molecular genetic changes associated with cancer.
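
LOH calling from paired tumor/normal peak data is commonly scored as a ratio of allelic ratios. A hedged sketch of that arithmetic (the function names and the 0.5 cutoff are illustrative assumptions, not taken from the paper, which reports ±10% precision on the ratios themselves):

```python
def allelic_imbalance(normal_a, normal_b, tumor_a, tumor_b):
    """Ratio-of-ratios score commonly used for LOH calling:
    (tumor_a/tumor_b) / (normal_a/normal_b). A value near 1.0 means
    both alleles are retained; a strong deviation suggests LOH."""
    if min(normal_a, normal_b, tumor_a, tumor_b) <= 0:
        raise ValueError("peak areas must be positive")
    return (tumor_a / tumor_b) / (normal_a / normal_b)

def is_loh(score, threshold=0.5):
    """Flag LOH when one allele's signal drops by the given factor
    (the threshold is a hypothetical cutoff for illustration)."""
    return score <= threshold or score >= 1.0 / threshold

# hypothetical peak areas: tumor allele B is strongly reduced
print(allelic_imbalance(1000, 980, 900, 310))
```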

  3. Prediction-based association control scheme in dense femtocell networks.

    PubMed

    Sung, Nak Woon; Pham, Ngoc-Thai; Huynh, Thong; Hwang, Won-Joo; You, Ilsun; Choo, Kim-Kwang Raymond

    2017-01-01

    The deployment of a large number of femtocell base stations allows coverage to be extended and resources to be utilized efficiently at low cost. However, the small cell size of femtocell networks can result in frequent handovers for mobile users and, consequently, throughput degradation. In this paper, we therefore propose predictive association control schemes to improve the system's effective throughput. Our design focuses on reducing handover frequency without sacrificing throughput. The proposed schemes determine the handover decisions that contribute most to network throughput and are suitable for distributed implementation. Simulation results show significant gains over existing methods in terms of both handover frequency and network throughput.

  4. Methods for fabrication of positional and compositionally controlled nanostructures on substrate

    DOEpatents

    Zhu, Ji; Grunes, Jeff; Choi, Yang-Kyu; Bokor, Jeffrey; Somorjai, Gabor

    2013-07-16

    Fabrication methods disclosed herein provide for a nanoscale structure or a pattern comprising a plurality of nanostructures of specific predetermined position, shape, and composition, including nanostructure arrays with the large area and high throughput necessary for industrial production. The resultant nanostructure patterns are useful for nanostructure arrays, specifically sensor and catalytic arrays.

  5. Evaluation of Microelectrode Array Data using Bayesian Modeling as an Approach to Screening and Prioritization for Neurotoxicity Testing*

    EPA Science Inventory

    The need to assess large numbers of chemicals for their potential toxicities has resulted in increased emphasis on medium- and high-throughput in vitro screening approaches. For such approaches to be useful, efficient and reliable data analysis and hit detection methods are also ...

  6. Phased genotyping-by-sequencing enhances analysis of genetic diversity and reveals divergent copy number variants in maize

    USDA-ARS?s Scientific Manuscript database

    High-throughput sequencing of reduced representation genomic libraries has ushered in an era of genotyping-by-sequencing (GBS), where genome-wide genotype data can be obtained for nearly any species. However, there remains a need for imputation-free GBS methods for genotyping large samples taken fr...

  7. A Novel Two-Step Hierarchical Quantitative Structure-Activity Relationship Modeling Workflow for Predicting Acute Toxicity of Chemicals in Rodents

    EPA Science Inventory

    Background: Accurate prediction of in vivo toxicity from in vitro testing is a challenging problem. Large public–private consortia have been formed with the goal of improving chemical safety assessment by means of high-throughput screening. Methods and results: A database co...

  8. Arbovirus Detection in Insect Vectors by Rapid, High-Throughput Pyrosequencing

    DTIC Science & Technology

    2010-11-09

    large contigs that by BLAST had their best hit to a rRNA of various fungal origins (including the genera Penicillium and Aspergillus ) and in all five...bioinformatic workflows need to be streamlined for non-expert users. For this approach to ever become part of the public health arsenal, our calculations

  9. Using experimental design and spatial analyses to improve the precision of NDVI estimates in upland cotton field trials

    USDA-ARS?s Scientific Manuscript database

    Controlling for spatial variability is important in high-throughput phenotyping studies that enable large numbers of genotypes to be evaluated across time and space. In the current study, we compared the efficacy of different experimental designs and spatial models in the analysis of canopy spectral...

  10. New DArT markers for oat provide enhanced map coverage and global germplasm characterization

    USDA-ARS?s Scientific Manuscript database

    Genomic discovery in oat and its application to oat improvement have been hindered by a lack of common markers on different genetic maps, and by the difficulty of conducting whole-genome analysis using high-throughput markers. In this study we developed, characterized, and applied a large set of oat g...

  11. Evaluating ToxCast™ High-Throughput Assays For Their Ability To Detect Direct-Acting Genotoxicants

    EPA Science Inventory

    A standard battery of tests has been in use for the several decades to screen chemicals for genotoxicity. However, the large number of environmental and industrial chemicals that need to be tested overwhelms our ability to test them. ToxCast™ is a multi-year effort to develop a ...

  12. Microfabricated particle focusing device

    DOEpatents

    Ravula, Surendra K.; Arrington, Christian L.; Sigman, Jennifer K.; Branch, Darren W.; Brener, Igal; Clem, Paul G.; James, Conrad D.; Hill, Martyn; Boltryk, Rosemary June

    2013-04-23

    A microfabricated particle focusing device comprises an acoustic portion to preconcentrate particles over large spatial dimensions into particle streams and a dielectrophoretic portion for finer particle focusing into single-file columns. The device can be used for high throughput assays for which it is necessary to isolate and investigate small bundles of particles and single particles.

  13. Semi-high throughput screening for potential drought-tolerance in lettuce (Lactuca sativa) germplasm collections

    USDA-ARS?s Scientific Manuscript database

    This protocol describes a method by which a large collection of the leafy green vegetable lettuce (Lactuca sativa L.) germplasm was screened for likely drought-tolerance traits. Fresh water availability for agricultural use is a growing concern across the United States as well as many regions of th...

  14. High-Rate Field Demonstration of Large-Alphabet Quantum Key Distribution

    DTIC Science & Technology

    2016-12-13

    COW , 2015 This work Figure 4: Comparison of our P&M DO-QKD results to previously published QKD system records, chosen to represent either secure...record for continuous-variable QKD (33). BBM92: secure throughput record for two-dimensional entanglement-based QKD (34). COW : distance record for QKD (19). 15

  15. High-Rate Field Demonstration of Large-Alphabet Quantum Key Distribution

    DTIC Science & Technology

    2016-10-12

    BBM92, 2009 COW , 2015 This work FIG. 4. Comparison of our P&M DO-QKD results to previously published QKD system records, chosen to represent either...distance record for continuous-variable QKD [29]. BBM92: secure throughput record for two-dimensional entanglement-based QKD [30]. COW : distance record for

  17. Optimization of a resazurin-based microplate assay for large-scale compound screenings against Klebsiella pneumoniae.

    PubMed

    Kim, Hyung Jun; Jang, Soojin

    2018-01-01

    A new resazurin-based assay was evaluated and optimized using a microplate (384-well) format for high-throughput screening of antibacterial molecules against Klebsiella pneumoniae. Growth of the bacteria in 384-well plates was more effectively measured and had a more than sixfold higher signal-to-background ratio using the resazurin-based assay compared with absorbance measurements at 600 nm. Determination of minimum inhibitory concentrations of the antibiotics revealed that the optimized assay quantitatively measured antibacterial activity of various antibiotics. An edge effect observed in the initial assay was significantly reduced using a 1-h incubation of the bacteria-containing plates at room temperature. There was an approximately 10% decrease in signal variability between the edge and the middle wells along with improvement in the assay robustness (Z' = 0.99). This optimized resazurin-based assay is an efficient, inexpensive, and robust assay that can quantitatively measure antibacterial activity using a high-throughput screening system to assess a large number of compounds for discovery of new antibiotics against K. pneumoniae.
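
The Z' value quoted above is the standard Zhang screening-window statistic, computable from positive- and negative-control replicates: Z' = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|. A minimal sketch with hypothetical control readings:

```python
from statistics import mean, stdev

def z_prime(positive, negative):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|.
    Values above ~0.5 indicate an excellent screening assay."""
    mp, mn = mean(positive), mean(negative)
    sp, sn = stdev(positive), stdev(negative)
    return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)

# hypothetical resorufin fluorescence controls (arbitrary units)
growth = [1000, 1010, 990, 1005]   # untreated bacteria
sterile = [100, 98, 102, 101]      # medium-only background
print(round(z_prime(growth, sterile), 3))
```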

  18. High Throughput, Label-free Screening Small Molecule Compound Libraries for Protein-Ligands using Combination of Small Molecule Microarrays and a Special Ellipsometry-based Optical Scanner.

    PubMed

    Landry, James P; Fei, Yiyan; Zhu, X D

    2011-12-01

    Small-molecule compounds remain the major source of therapeutic and preventative drugs. Developing new drugs against a protein target often requires screening large collections of compounds with diverse structures for ligands or ligand fragments that exhibit sufficient affinity and a desirable inhibitory effect on the target before further optimization and development. Since the number of small-molecule compounds is large, high-throughput screening (HTS) methods are needed. Small-molecule microarrays (SMM) on a solid support, in combination with a suitable binding assay, form a viable HTS platform. We demonstrate that by combining an oblique-incidence reflectivity difference optical scanner with SMM we can screen 10,000 small-molecule compounds on a single glass slide for protein ligands without fluorescence labeling. Furthermore, using such a label-free assay platform we can simultaneously acquire binding curves of a solution-phase protein to over 10,000 immobilized compounds, enabling full characterization of protein-ligand interactions over a wide range of affinity constants.

  19. Reducing the critical particle diameter in (highly) asymmetric sieve-based lateral displacement devices.

    PubMed

    Dijkshoorn, J P; Schutyser, M A I; Sebris, M; Boom, R M; Wagterveld, R M

    2017-10-26

    Deterministic lateral displacement technology was originally developed in the realm of microfluidics, but has potential for larger-scale separation as well. In our previous studies, we proposed a sieve-based lateral displacement device inspired by the principle of deterministic lateral displacement. The advantages of this new device are a lower pressure drop, a lower risk of particle accumulation, higher throughput, and simpler manufacture. However, until now this device has only been investigated for the separation of large particles of around 785 µm diameter. To separate smaller particles, we investigated several design parameters for their influence on the critical particle diameter. In a dimensionless evaluation, device designs with different geometries and dimensions were compared. It was found that sieve-based lateral displacement devices are able to displace particles, owing to the crucial role of the flow profile, despite their unusual and asymmetric design. These results demonstrate the possibility of actively steering the velocity profile in order to reduce the critical diameter in deterministic lateral displacement devices, which makes this separation principle more accessible for large-scale, high-throughput applications.

  20. High performance computing environment for multidimensional image analysis

    PubMed Central

    Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo

    2007-01-01

    Background The processing of images acquired through microscopy is a challenging task due to the large size of datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact in microscopy applications. Results We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors, and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 Ghz Intel CPU, the task of 3D median filtering on a typical 256 megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478× speedup. Conclusion Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large scale experiments with massive datasets. PMID:17634099

  1. High performance computing environment for multidimensional image analysis.

    PubMed

    Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo

    2007-07-10

    The processing of images acquired through microscopy is a challenging task due to the large size of datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact in microscopy applications. We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors, and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 Ghz Intel CPU, the task of 3D median filtering on a typical 256 megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478x speedup. Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large scale experiments with massive datasets.
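
The quoted 478x figure follows directly from the timings in the abstract (2.5 hours serially vs 18.8 seconds on 1024 Blue Gene/L nodes):

```python
# Checking the quoted speedup: 2.5 hours on one CPU vs 18.8 s on 1024 nodes.
serial_s = 2.5 * 3600          # 9000 s
parallel_s = 18.8
speedup = serial_s / parallel_s
efficiency = speedup / 1024    # fraction of ideal linear scaling
print(f"speedup ~ {speedup:.0f}x, parallel efficiency ~ {efficiency:.0%}")
```

Note the parallel efficiency is well under 100%, consistent with the nearest-neighbor communication cost the paper describes.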

  2. Intellicount: High-Throughput Quantification of Fluorescent Synaptic Protein Puncta by Machine Learning

    PubMed Central

    Fantuzzo, J. A.; Mirabella, V. R.; Zahn, J. D.

    2017-01-01

    Abstract Synapse formation analyses can be performed by imaging and quantifying fluorescent signals of synaptic markers. Traditionally, these analyses are done using simple or multiple thresholding and segmentation approaches or by labor-intensive manual analysis by a human observer. Here, we describe Intellicount, a high-throughput, fully-automated synapse quantification program which applies a novel machine learning (ML)-based image processing algorithm to systematically improve region of interest (ROI) identification over simple thresholding techniques. Through processing large datasets from both human and mouse neurons, we demonstrate that this approach allows image processing to proceed independently of carefully set thresholds, thus reducing the need for human intervention. As a result, this method can efficiently and accurately process large image datasets with minimal interaction by the experimenter, making it less prone to bias and less liable to human error. Furthermore, Intellicount is integrated into an intuitive graphical user interface (GUI) that provides a set of valuable features, including automated and multifunctional figure generation, routine statistical analyses, and the ability to run full datasets through nested folders, greatly expediting the data analysis process. PMID:29218324
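
The "simple thresholding" baseline that Intellicount improves on can be sketched in a few lines: binarize the image at a fixed cutoff and count connected bright regions. This toy implementation (not the authors' code) shows why such counts depend entirely on a well-chosen threshold:

```python
def count_puncta(image, threshold):
    """Baseline puncta counter: binarize by a fixed threshold, then count
    4-connected components via flood fill. This is the simple-thresholding
    approach that ML-based ROI detection aims to improve on."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:  # flood-fill one connected component
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and not seen[y][x] \
                            and image[y][x] >= threshold:
                        seen[y][x] = True
                        stack.extend([(y+1, x), (y-1, x), (y, x+1), (y, x-1)])
    return count

# toy intensity grid with three separate bright regions
toy = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 0, 8],
    [7, 0, 0, 0, 0],
]
print(count_puncta(toy, threshold=5))
```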

  3. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. The effect has been particularly noticeable in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.

  4. Mapping the Human Toxome by Systems Toxicology

    PubMed Central

    Bouhifd, Mounir; Hogberg, Helena T.; Kleensang, Andre; Maertens, Alexandra; Zhao, Liang; Hartung, Thomas

    2014-01-01

    Toxicity testing typically involves studying adverse health outcomes in animals subjected to high doses of toxicants with subsequent extrapolation to expected human responses at lower doses. The low throughput of current toxicity testing approaches (which are largely the same for industrial chemicals, pesticides and drugs) has led to a backlog of more than 80,000 chemicals to which human beings are potentially exposed whose potential toxicity remains largely unknown. By employing new testing strategies that use predictive, high-throughput cell-based assays (of human origin) to evaluate perturbations in key pathways, referred to as pathways of toxicity, and to conduct targeted testing against those pathways, we can begin to greatly accelerate our ability to test the vast “storehouses” of chemical compounds using a rational, risk-based approach to chemical prioritization, and to provide test results that are more predictive of human toxicity than current methods. The NIH Transformative Research Grant project Mapping the Human Toxome by Systems Toxicology aims at developing the tools for pathway mapping, annotation and validation as well as the respective knowledge base to share this information. PMID:24443875

  5. High-throughput determination of biochemical oxygen demand (BOD) by a microplate-based biosensor.

    PubMed

    Pang, Hei-Leung; Kwok, Nga-Yan; Chan, Pak-Ho; Yeung, Chi-Hung; Lo, Waihung; Wong, Kwok-Yin

    2007-06-01

    The use of the conventional 5-day biochemical oxygen demand (BOD5) method in BOD determination is greatly hampered by its time-consuming sampling procedure and its technical difficulty in the handling of a large pool of wastewater samples. Thus, it is highly desirable to develop a fast and high-throughput biosensor for BOD measurements. This paper describes the construction of a microplate-based biosensor consisting of an organically modified silica (ORMOSIL) oxygen sensing film for high-throughput determination of BOD in wastewater. The ORMOSIL oxygen sensing film was prepared by reacting tetramethoxysilane with dimethyldimethoxysilane in the presence of the oxygen-sensitive dye tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) chloride. The silica composite formed a homogeneous, crack-free oxygen sensing film on polystyrene microtiter plates with high stability, and the embedded ruthenium dye interacted with the dissolved oxygen in wastewater according to the Stern-Volmer relation. The bacterium Stenotrophomonas maltophilia was loaded into the ORMOSIL/PVA composite (deposited on the top of the oxygen sensing film) and used to metabolize the organic compounds in wastewater. This BOD biosensor was found to be able to determine the BOD values of wastewater samples within 20 min by monitoring the dissolved oxygen concentrations. Moreover, the BOD values determined by the BOD biosensor were in good agreement with those obtained by the conventional BOD5 method.
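
The Stern-Volmer relation mentioned above, I0/I = 1 + Ksv*[O2], inverts directly to give dissolved oxygen from the quenched dye intensity. A sketch (the Ksv value and units here are hypothetical calibration inputs, not from the paper):

```python
def oxygen_from_intensity(i0, i, ksv):
    """Invert the Stern-Volmer relation I0/I = 1 + Ksv*[O2] to recover
    dissolved-oxygen concentration from the quenched dye intensity.
    i0: unquenched intensity, i: measured intensity, ksv: Stern-Volmer
    constant (hypothetical units, e.g. L/mg)."""
    if i0 <= 0 or i <= 0 or ksv <= 0:
        raise ValueError("intensities and ksv must be positive")
    return (i0 / i - 1.0) / ksv

# hypothetical calibration: Ksv = 0.15 L/mg, unquenched intensity 1000 a.u.
print(oxygen_from_intensity(1000, 500, 0.15))
```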

  6. High-Throughput Analysis and Automation for Glycomics Studies.

    PubMed

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also expanding interest in the use of glycomics, for example in Genome Wide Association Studies, to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  7. High Throughput Transcriptomics: From screening to pathways

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  8. Development of a high-throughput screening assay for stearoyl-CoA desaturase using rat liver microsomes, deuterium labeled stearoyl-CoA and mass spectrometry.

    PubMed

    Soulard, Patricia; McLaughlin, Meg; Stevens, Jessica; Connolly, Brendan; Coli, Rocco; Wang, Leyu; Moore, Jennifer; Kuo, Ming-Shang T; LaMarr, William A; Ozbal, Can C; Bhat, B Ganesh

    2008-10-03

    Several recent reports suggest that stearoyl-CoA desaturase 1 (SCD1), the rate-limiting enzyme in monounsaturated fatty acid synthesis, plays an important role in regulating lipid homeostasis and lipid oxidation in metabolically active tissues. As several manifestations of type 2 diabetes and related metabolic disorders are associated with alterations in intracellular lipid partitioning, pharmacological manipulation of SCD1 activity might be of benefit in the treatment of these disease states. In an effort to identify small molecule inhibitors of SCD1, we have developed a mass spectrometry based high-throughput screening (HTS) assay using a deuterium-labeled stearoyl-CoA substrate and induced rat liver microsomes. The methodology developed allows the use of a nonradioactive substrate, which avoids interference from the endogenous SCD1 substrate and/or product present in the non-purified enzyme source. Throughput of the assay was up to twenty 384-well assay plates per day. The assay was linear with protein concentration and time, and was saturable for the stearoyl-CoA substrate (Km = 10.5 microM). The assay was highly reproducible, with an average Z' value of 0.6. Conjugated linoleic acid and sterculic acid, known inhibitors of SCD1, exhibited IC50 values of 0.88 and 0.12 microM, respectively. High-throughput mass spectrometry screening of over 1.7 million compounds in compressed format demonstrated that the enzyme target is druggable. A total of 2515 hits were identified (0.1% hit rate), and 346 were confirmed active (>40% inhibition of total SCD activity at 20 microM; a 14% confirmation rate). Of the confirmed hits, 172 had IC50 values of <10 microM, including 111 <1 microM and 48 <100 nM. A large number of potent drug-like (MW <450) hits representing six different chemical series were identified. The application of mass spectrometry to high-throughput screening permitted the development of a high-quality screening protocol for an otherwise intractable target, SCD1. Further medicinal chemistry and characterization of SCD inhibitors should lead to the development of reagents to treat metabolic disorders.
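
The screening-funnel percentages quoted in the abstract can be checked directly from the raw counts (the primary hit rate works out to about 0.15%, consistent with the abstract's order-of-magnitude "0.1%"):

```python
# Reproducing the screening-funnel numbers quoted in the abstract.
screened = 1_700_000
primary_hits = 2515
confirmed = 346

hit_rate = primary_hits / screened          # quoted as "0.1% hit rate"
confirmation_rate = confirmed / primary_hits
print(f"primary hit rate: {hit_rate:.2%}")
print(f"confirmation rate: {confirmation_rate:.0%}")
```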

  9. Assessment of Genetic Diversity and Structure of Large Garlic (Allium sativum) Germplasm Bank, by Diversity Arrays Technology “Genotyping-by-Sequencing” Platform (DArTseq)

    PubMed Central

    Egea, Leticia A.; Mérida-García, Rosa; Kilian, Andrzej; Hernandez, Pilar; Dorado, Gabriel

    2017-01-01

    Garlic (Allium sativum) is used worldwide in cooking and industry, including pharmacology/medicine and cosmetics, for its interesting properties. Identifying redundancies in germplasm banks to generate core collections is a major concern, mostly in large stocks, in order to reduce space and maintenance costs. Yet, similar appearance and phenotypic plasticity of garlic varieties hinder their morphological classification. Molecular studies are challenging, due to the large and expectedly complex genome of this species, with asexual reproduction. Classical molecular markers, like isozymes, RAPD, SSR, or AFLP, are not convenient to generate germplasm core-collections for this species. The recent emergence of high-throughput genotyping-by-sequencing (GBS) approaches, like DArTseq, makes it possible to overcome such limitations to characterize and protect genetic diversity. Therefore, such technology was used in this work to: (i) assess genetic diversity and structure of a large garlic-germplasm bank (417 accessions); (ii) create a core collection; (iii) relate genotype to agronomical features; and (iv) describe a cost-effective method to manage genetic diversity in garlic-germplasm banks. Hierarchical-cluster analysis, principal-coordinates analysis and STRUCTURE showed general consistency, generating three main garlic-groups, mostly determined by variety and geographical origin. In addition, high-resolution genotyping identified 286 unique and 131 redundant accessions, used to select a reduced-size germplasm-bank core collection. This demonstrates that DArTseq is a cost-effective method to analyze species with large and expectedly complex genomes, like garlic. To the best of our knowledge, this is the first report of high-throughput genotyping of a large garlic germplasm. This is particularly interesting for garlic adaptation and improvement, to fight biotic and abiotic stresses, in the current context of climate change and global warming. PMID:28775737

  10. Assessment of Genetic Diversity and Structure of Large Garlic (Allium sativum) Germplasm Bank, by Diversity Arrays Technology "Genotyping-by-Sequencing" Platform (DArTseq).

    PubMed

    Egea, Leticia A; Mérida-García, Rosa; Kilian, Andrzej; Hernandez, Pilar; Dorado, Gabriel

    2017-01-01

    Garlic (Allium sativum) is used worldwide in cooking and industry, including pharmacology/medicine and cosmetics, for its interesting properties. Identifying redundancies in germplasm banks to generate core collections is a major concern, mostly in large stocks, in order to reduce space and maintenance costs. Yet, similar appearance and phenotypic plasticity of garlic varieties hinder their morphological classification. Molecular studies are challenging, due to the large and expectedly complex genome of this species, with asexual reproduction. Classical molecular markers, like isozymes, RAPD, SSR, or AFLP, are not convenient to generate germplasm core-collections for this species. The recent emergence of high-throughput genotyping-by-sequencing (GBS) approaches, like DArTseq, makes it possible to overcome such limitations to characterize and protect genetic diversity. Therefore, such technology was used in this work to: (i) assess genetic diversity and structure of a large garlic-germplasm bank (417 accessions); (ii) create a core collection; (iii) relate genotype to agronomical features; and (iv) describe a cost-effective method to manage genetic diversity in garlic-germplasm banks. Hierarchical-cluster analysis, principal-coordinates analysis and STRUCTURE showed general consistency, generating three main garlic-groups, mostly determined by variety and geographical origin. In addition, high-resolution genotyping identified 286 unique and 131 redundant accessions, used to select a reduced-size germplasm-bank core collection. This demonstrates that DArTseq is a cost-effective method to analyze species with large and expectedly complex genomes, like garlic. To the best of our knowledge, this is the first report of high-throughput genotyping of a large garlic germplasm. This is particularly interesting for garlic adaptation and improvement, to fight biotic and abiotic stresses, in the current context of climate change and global warming.

  11. Monolayer graphene-insulator-semiconductor emitter for large-area electron lithography

    NASA Astrophysics Data System (ADS)

    Kirley, Matthew P.; Aloui, Tanouir; Glass, Jeffrey T.

    2017-06-01

    The rapid adoption of nanotechnology in fields as varied as semiconductors, energy, and medicine requires the continual improvement of nanopatterning tools. Lithography is central to this evolving nanotechnology landscape, but current production systems are subject to high costs, low throughput, or low resolution. Herein, we present a solution to these problems with the use of monolayer graphene in a graphene-insulator-semiconductor (GIS) electron emitter device for large-area electron lithography. Our GIS device displayed high emission efficiency (up to 13%) and transferred large patterns (500 × 500 μm) with high fidelity (<50% spread). The performance of our device demonstrates a feasible path to dramatic improvements in lithographic patterning systems, enabling continued progress in existing industries and opening opportunities in nanomanufacturing.

  12. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  13. Evaluation of Sequencing Approaches for High-Throughput Transcriptomics - (BOSC)

    EPA Science Inventory

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. The generation of high-throughput global gene expression...

  14. Next Generation MUT-MAP, a High-Sensitivity High-Throughput Microfluidics Chip-Based Mutation Analysis Panel

    PubMed Central

    Patel, Rajesh; Tsan, Alison; Sumiyoshi, Teiko; Fu, Ling; Desai, Rupal; Schoenbrunner, Nancy; Myers, Thomas W.; Bauer, Keith; Smith, Edward; Raja, Rajiv

    2014-01-01

    Molecular profiling of tumor tissue to detect alterations, such as oncogenic mutations, plays a vital role in determining treatment options in oncology. Hence, there is an increasing need for a robust and high-throughput technology to detect oncogenic hotspot mutations. Although commercial assays are available to detect genetic alterations in single genes, only a limited amount of tissue is often available from patients, requiring multiplexing to allow for simultaneous detection of mutations in many genes using low DNA input. Even though next-generation sequencing (NGS) platforms provide powerful tools for this purpose, they face challenges such as high cost, large DNA input requirement, complex data analysis, and long turnaround times, limiting their use in clinical settings. We report the development of the next-generation mutation multi-analyte panel (MUT-MAP), a high-throughput microfluidic panel for detecting 120 somatic mutations across eleven genes of therapeutic interest (AKT1, BRAF, EGFR, FGFR3, FLT3, HRAS, KIT, KRAS, MET, NRAS, and PIK3CA) using allele-specific PCR (AS-PCR) and TaqMan technology. This mutation panel requires as little as 2 ng of high-quality DNA from fresh-frozen tissue or 100 ng of DNA from formalin-fixed paraffin-embedded (FFPE) tissue. Mutation calling, including an automated data-analysis process, has been implemented to run 88 samples per day. Validation of this platform using plasmids showed robust signal and low cross-reactivity in all of the newly added assays, and mutation calls in cell-line samples were found to be consistent with the Catalogue of Somatic Mutations in Cancer (COSMIC) database, allowing for direct comparison of our platform to Sanger sequencing. High correlation with NGS when compared to the SuraSeq500 panel run on the Ion Torrent platform in an FFPE dilution experiment showed assay sensitivity down to 0.45%. This multiplexed mutation panel is a valuable tool for high-throughput biomarker discovery in personalized medicine and cancer drug development. PMID:24658394

  15. Time to "go large" on biofilm research: advantages of an omics approach.

    PubMed

    Azevedo, Nuno F; Lopes, Susana P; Keevil, Charles W; Pereira, Maria O; Vieira, Maria J

    2009-04-01

    In nature, the biofilm mode of life is of great importance in the cell cycle for many microorganisms. Perhaps because of biofilm complexity and variability, the characterization of a given microbial system, in terms of biofilm formation potential, structure and associated physiological activity, in a large-scale, standardized and systematic manner has been hindered by the absence of high-throughput methods. This outlook is now starting to change as new methods involving the utilization of microtiter-plates and automated spectrophotometry and microscopy systems are being developed to perform large-scale testing of microbial biofilms. Here, we evaluate if the time is ripe to start an integrated omics approach, i.e., the generation and interrogation of large datasets, to biofilms--"biofomics". This omics approach would bring much needed insight into how biofilm formation ability is affected by a number of environmental, physiological and mutational factors and how these factors interplay between themselves in a standardized manner. This could then lead to the creation of a database where biofilm signatures are identified and interrogated. Nevertheless, and before embarking on such an enterprise, the selection of a versatile, robust, high-throughput biofilm growing device and of appropriate methods for biofilm analysis will have to be performed. Whether such device and analytical methods are already available, particularly for complex heterotrophic biofilms is, however, very debatable.

  16. "First generation" automated DNA sequencing technology.

    PubMed

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  17. A Proteomic Workflow Using High-Throughput De Novo Sequencing Towards Complementation of Genome Information for Improved Comparative Crop Science.

    PubMed

    Turetschek, Reinhard; Lyon, David; Desalegn, Getinet; Kaul, Hans-Peter; Wienkoop, Stefanie

    2016-01-01

    The proteomic study of non-model organisms, such as many crop plants, is challenging due to the lack of comprehensive genome information. Changing environmental conditions require the study and selection of adapted cultivars. Mutations, inherent to cultivars, hamper protein identification and thus considerably complicate the qualitative and quantitative comparison in large-scale systems biology approaches. With this workflow, cultivar-specific mutations are detected from high-throughput comparative MS analyses by extracting sequence polymorphisms with de novo sequencing. Stringent criteria are suggested to filter for high-confidence mutations. Subsequently, these polymorphisms complement the initially used database, which is then ready to use with any preferred database search algorithm. In our example, we thereby identified 26 specific mutations in two cultivars of Pisum sativum and achieved an increased number (17%) of peptide spectrum matches.

  18. New classes of piezoelectrics, ferroelectrics, and antiferroelectrics by first-principles high-throughput materials design

    NASA Astrophysics Data System (ADS)

    Bennett, Joseph

    2013-03-01

    Functional materials, such as piezoelectrics, ferroelectrics, and antiferroelectrics, exhibit large changes with applied fields and stresses. This behavior enables their incorporation into a wide variety of devices in technological fields such as energy conversion/storage and information processing/storage. Discovery of functional materials with improved performance or even new types of responses is thus not only a scientific challenge, but can have major impacts on society. In this talk I will review our efforts to uncover new families of functional materials using a combined crystallographic database/high-throughput first-principles approach. I will describe our work on the design and discovery of thousands of new functional materials, specifically the LiAlSi family as piezoelectrics, the LiGaGe family as ferroelectrics, and the MgSrSi family as antiferroelectrics.

  19. Systems biology coupled with label-free high-throughput detection as a novel approach for diagnosis of chronic obstructive pulmonary disease

    PubMed Central

    Richens, Joanna L; Urbanowicz, Richard A; Lunt, Elizabeth AM; Metcalf, Rebecca; Corne, Jonathan; Fairclough, Lucy; O'Shea, Paul

    2009-01-01

    Chronic obstructive pulmonary disease (COPD) is a treatable and preventable disease state, characterised by progressive airflow limitation that is not fully reversible. Although COPD is primarily a disease of the lungs there is now an appreciation that many of the manifestations of disease are outside the lung, leading to the notion that COPD is a systemic disease. Currently, diagnosis of COPD relies on largely descriptive measures to enable classification, such as symptoms and lung function. Here the limitations of existing diagnostic strategies of COPD are discussed and systems biology approaches to diagnosis that build upon current molecular knowledge of the disease are described. These approaches rely on new 'label-free' sensing technologies, such as high-throughput surface plasmon resonance (SPR), that we also describe. PMID:19386108

  20. Mapping specificity landscapes of RNA-protein interactions by high throughput sequencing.

    PubMed

    Jankowsky, Eckhard; Harris, Michael E

    2017-04-15

    To function in a biological setting, RNA binding proteins (RBPs) have to discriminate between alternative binding sites in RNAs. This discrimination can occur in the ground state of an RNA-protein binding reaction, in its transition state, or in both. The extent to which RBPs discriminate at these reaction states defines RBP specificity landscapes. Here, we describe the HiTS-Kin and HiTS-EQ techniques, which combine kinetic and equilibrium binding experiments with high throughput sequencing to quantitatively assess substrate discrimination for large numbers of substrate variants at ground and transition states of RNA-protein binding reactions. We discuss experimental design, practical considerations and data analysis and outline how a combination of HiTS-Kin and HiTS-EQ allows the mapping of RBP specificity landscapes. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Identification of antigen-specific human monoclonal antibodies using high-throughput sequencing of the antibody repertoire.

    PubMed

    Liu, Ju; Li, Ruihua; Liu, Kun; Li, Liangliang; Zai, Xiaodong; Chi, Xiangyang; Fu, Ling; Xu, Junjie; Chen, Wei

    2016-04-22

    High-throughput sequencing of the antibody repertoire provides a large number of antibody variable region sequences that can be used to generate human monoclonal antibodies. However, current screening methods for identifying antigen-specific antibodies are inefficient. In the present study, we developed an antibody clone screening strategy based on clone dynamics and relative frequency, and used it to identify antigen-specific human monoclonal antibodies. Enzyme-linked immunosorbent assay showed that at least 52% of putative positive immunoglobulin heavy chains formed antigen-specific antibodies. Combining information on clone dynamics and relative frequency improved the identification of positive clones, the elimination of negative clones, and the credibility of putative positive clones. The screening strategy could therefore simplify subsequent experimental screening and may facilitate the generation of antigen-specific antibodies. Copyright © 2016 Elsevier Inc. All rights reserved.
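
    The clone-selection idea above (dynamics plus relative frequency) can be illustrated as a simple filter. The monotonic-rise rule and the frequency threshold below are assumptions for illustration, not the authors' exact criteria:

```python
# Hedged sketch of dynamics/frequency-based clone screening. Clone IDs,
# frequencies, the "rising across snapshots" rule, and the 1% cutoff
# are all invented for illustration.

def putative_positive(clone_freqs, min_final=0.01):
    """clone_freqs: relative frequencies of one heavy-chain clone over
    successive repertoire snapshots. Flag the clone if its frequency
    never decreases and its final frequency exceeds min_final."""
    rising = all(a <= b for a, b in zip(clone_freqs, clone_freqs[1:]))
    return rising and clone_freqs[-1] >= min_final

repertoire = {
    "IGH-1": [0.001, 0.004, 0.020],   # expanding clone -> candidate
    "IGH-2": [0.030, 0.010, 0.002],   # contracting clone -> rejected
    "IGH-3": [0.001, 0.002, 0.003],   # rising but still rare -> rejected
}
hits = [clone for clone, freqs in repertoire.items()
        if putative_positive(freqs)]
```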

  2. Biofuel metabolic engineering with biosensors.

    PubMed

    Morgan, Stacy-Anne; Nadler, Dana C; Yokoo, Rayka; Savage, David F

    2016-12-01

    Metabolic engineering offers the potential to renewably produce important classes of chemicals, particularly biofuels, at an industrial scale. DNA synthesis and editing techniques can generate large pathway libraries, yet identifying the best variants is slow and cumbersome. Traditionally, analytical methods like chromatography and mass spectrometry have been used to evaluate pathway variants, but such techniques cannot be performed with high throughput. Biosensors - genetically encoded components that actuate a cellular output in response to a change in metabolite concentration - are therefore a promising tool for rapid and high-throughput evaluation of candidate pathway variants. Applying biosensors can also dynamically tune pathways in response to metabolic changes, improving balance and productivity. Here, we describe the major classes of biosensors and briefly highlight recent progress in applying them to biofuel-related metabolic pathway engineering. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Automated recycling of chemistry for virtual screening and library design.

    PubMed

    Vainio, Mikko J; Kogej, Thierry; Raubacher, Florian

    2012-07-23

    An early stage drug discovery project needs to identify a number of chemically diverse and attractive compounds. These hit compounds are typically found through high-throughput screening campaigns. The diversity of the chemical libraries used in screening is therefore important. In this study, we describe a virtual high-throughput screening system called Virtual Library. The system automatically "recycles" validated synthetic protocols and available starting materials to generate a large number of virtual compound libraries, and allows for fast searches in the generated libraries using a 2D fingerprint based screening method. Virtual Library links the returned virtual hit compounds back to experimental protocols to quickly assess the synthetic accessibility of the hits. The system can be used as an idea generator for library design to enrich the screening collection and to explore the structure-activity landscape around a specific active compound.
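
    The "2D fingerprint based screening method" mentioned above typically reduces to Tanimoto similarity between bit-vector fingerprints. A toy sketch with invented fingerprints and an assumed similarity cutoff (real systems would use hashed substructure fingerprints such as ECFP):

```python
# Hedged sketch of fingerprint-based virtual screening. Fingerprints are
# toy sets of "on" bit positions, not real chemical fingerprints; the
# 0.6 cutoff and compound names are illustrative assumptions.

def tanimoto(fp1, fp2):
    """Tanimoto coefficient of two fingerprints given as sets of on-bits."""
    union = len(fp1 | fp2)
    return len(fp1 & fp2) / union if union else 0.0

def screen(query, library, cutoff=0.6):
    """Return names of library members at least `cutoff` similar to query."""
    return sorted(name for name, fp in library.items()
                  if tanimoto(query, fp) >= cutoff)

query = {1, 4, 7, 9, 12}
library = {
    "virt-001": {1, 4, 7, 9, 12},   # identical to query, similarity 1.0
    "virt-002": {1, 4, 7, 9, 13},   # 4 shared of 6 total bits, ~0.67
    "virt-003": {2, 5, 8},          # disjoint, similarity 0.0
}
hits = screen(query, library)
```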

  4. A kinase-focused compound collection: compilation and screening strategy.

    PubMed

    Sun, Dongyu; Chuaqui, Claudio; Deng, Zhan; Bowes, Scott; Chin, Donovan; Singh, Juswinder; Cullen, Patrick; Hankins, Gretchen; Lee, Wen-Cherng; Donnelly, Jason; Friedman, Jessica; Josiah, Serene

    2006-06-01

    Lead identification by high-throughput screening of large compound libraries has been supplemented with virtual screening and focused compound libraries. To complement existing approaches for lead identification at Biogen Idec, a kinase-focused compound collection was designed, developed and validated. Two strategies were adopted to populate the compound collection: a ligand shape-based virtual screening and a receptor-based approach (structural interaction fingerprint). Compounds selected with the two approaches were cherry-picked from an existing high-throughput screening compound library, ordered from suppliers and supplemented with specific medicinal compounds from internal programs. Promising hits and leads have been generated from the kinase-focused compound collection against multiple kinase targets. The principle of the collection design and screening strategy was validated and the use of the kinase-focused compound collection for lead identification has been added to existing strategies.

  5. Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach

    PubMed Central

    Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.

    2007-01-01

    Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408

  6. Electric Propulsion of a Different Class: The Challenges of Testing for MegaWatt Missions

    DTIC Science & Technology

    2012-08-01

    ...mode akin to steady state. Realizing that the pumping capacity of the Large Vacuum Test Facility (LVTF) at PEPL... High T/P thruster testing requires high propellant throughput. This reality necessitates the careful survey and selection of appropriate test facilities to ensure that they have 1) sufficient pumping speed to maintain desired operating pressures and 2) adequate size to mitigate facility...

  7. Investigation of rare and low-frequency variants using high-throughput sequencing with pooled DNA samples

    PubMed Central

    Wang, Jingwen; Skoog, Tiina; Einarsdottir, Elisabet; Kaartokallio, Tea; Laivuori, Hannele; Grauers, Anna; Gerdhem, Paul; Hytönen, Marjo; Lohi, Hannes; Kere, Juha; Jiao, Hong

    2016-01-01

    High-throughput sequencing using pooled DNA samples can facilitate genome-wide studies on rare and low-frequency variants in a large population. Some major questions concerning the pooled sequencing strategy are whether rare and low-frequency variants can be detected reliably, and whether estimated minor allele frequencies (MAFs) can represent the actual values obtained from individually genotyped samples. In this study, we evaluated MAF estimates using three variant detection tools with two sets of pooled whole exome sequencing (WES) and one set of pooled whole genome sequencing (WGS) data. Both GATK and Freebayes displayed high sensitivity, specificity and accuracy when detecting rare or low-frequency variants. For the WGS study, 56% of the low-frequency variants on the Illumina array had identical MAFs, and 26% differed by one allele count, between the sequencing and individual genotyping data. The MAF estimates from WGS correlated well (r = 0.94) with those from Illumina arrays. The MAFs from the pooled WES data also showed high concordance (r = 0.88) with those from the individual genotyping data. In conclusion, the MAFs estimated from pooled DNA sequencing data reflect the MAFs in individually genotyped samples well. The pooling strategy can thus be a rapid and cost-effective approach for the initial screening in large-scale association studies. PMID:27633116
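
    The core of the pooling strategy is that pooled read counts estimate allele frequencies directly, which can then be checked against individually genotyped samples, as the authors did with Pearson correlation (r = 0.94 and 0.88). A sketch with invented read counts and array MAFs:

```python
# Hedged sketch: MAF estimation from pooled reads plus a Pearson check
# against per-individual genotyping. All counts and MAFs are invented.

def pooled_maf(alt_reads, total_reads):
    """MAF estimate from a DNA pool: fraction of reads carrying the
    alternate allele (assumes equal DNA contribution per individual)."""
    return alt_reads / total_reads

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented (alt, total) read counts at three variant sites in one pool.
pool_est = [pooled_maf(a, t) for a, t in [(12, 400), (30, 380), (5, 420)]]
# Invented MAFs for the same sites from individual array genotyping.
array_maf = [0.028, 0.081, 0.010]
r = pearson(pool_est, array_maf)
```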

  8. High Content Screening of a Kinase-Focused Library Reveals Compounds Broadly-Active against Dengue Viruses

    PubMed Central

    Li, Xiaolan; Milan Bonotto, Rafaela; No, Joo Hwan; Kim, Keum Hyun; Baek, Sungmin; Kim, Hee Young; Windisch, Marc Peter; Pamplona Mosimann, Ana Luiza; de Borba, Luana; Liuzzi, Michel; Hansen, Michael Adsetts Edberg; Nunes Duarte dos Santos, Claudia; Freitas-Junior, Lucio Holanda

    2013-01-01

    Dengue virus is a mosquito-borne flavivirus that has a large impact on global health. It is considered one of the medically important arboviruses, and developing a preventive or therapeutic solution remains a top priority in the medical and scientific community. Drug discovery programs for potential dengue antivirals have increased dramatically over the last decade, owing largely to the introduction of high-throughput assays. In this study, we have developed an image-based dengue high-throughput/high-content assay (HT/HCA) using an innovative computer-vision approach to screen a kinase-focused library for anti-dengue compounds. Using this dengue HT/HCA, we identified a group of compounds with a 4-(1-aminoethyl)-N-methylthiazol-2-amine as a common core structure that inhibits dengue viral infection in a human liver-derived cell line (Huh-7.5 cells). Compounds CND1201, CND1203 and CND1243 exhibited strong antiviral activities against all four dengue serotypes. Plaque reduction and time-of-addition assays suggest that these compounds interfere with the late stage of the viral infection cycle. These findings demonstrate that our image-based dengue HT/HCA is a reliable tool that can be used to screen various chemical libraries for potential dengue antiviral candidates. PMID:23437413

  9. High throughput screening using acoustic droplet ejection to combine protein crystals and chemical libraries on crystallization plates at high density

    DOE PAGES

    Teplitsky, Ella; Joshi, Karan; Ericson, Daniel L.; ...

    2015-07-01

    We describe a high throughput method for screening up to 1728 distinct chemicals with protein crystals on a single microplate. Acoustic droplet ejection (ADE) was used to co-position 2.5 nL of protein, precipitant, and chemicals on a MiTeGen in situ-1 crystallization plate™ for screening by co-crystallization or soaking. ADE-transferred droplets follow a precise trajectory which allows all components to be transferred through small apertures in the microplate lid. The apertures were large enough for 2.5 nL droplets to pass through them, but small enough that they did not disrupt the internal environment created by the mother liquor. Using this system, thermolysin and trypsin crystals were efficiently screened for binding to a heavy-metal mini-library. Fluorescence and X-ray diffraction were used to confirm that each chemical in the heavy-metal library was correctly paired with the intended protein crystal. A fragment mini-library was screened to observe two known lysozyme ligands using both co-crystallization and soaking, and a similar approach was used to identify multiple, novel thaumatin binding sites for ascorbic acid. This technology pushes towards a faster, automated, and more flexible strategy for high throughput screening of chemical libraries (such as fragment libraries) using as little as 2.5 nL of each component.

  10. High Throughput Determination of Critical Human Dosing Parameters (SOT)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...

  11. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  12. Use of Activity-Based Probes to Develop High Throughput Screening Assays That Can Be Performed in Complex Cell Extracts

    PubMed Central

    Deu, Edgar; Yang, Zhimou; Wang, Flora; Klemba, Michael; Bogyo, Matthew

    2010-01-01

    Background: High throughput screening (HTS) is one of the primary tools used to identify novel enzyme inhibitors. However, its applicability is generally restricted to targets that can either be expressed recombinantly or purified in large quantities. Methodology and Principal Findings: Here, we describe a method that uses activity-based probes (ABPs) to identify substrates that are sufficiently selective to allow HTS in complex biological samples. Because ABPs label their target enzymes through the formation of a permanent covalent bond, we can correlate labeling of target enzymes in a complex mixture with inhibition of turnover of a substrate in that same mixture. Thus, substrate specificity can be determined, and substrates with sufficiently high selectivity for HTS can be identified. In this study, we demonstrate this method by using an ABP for dipeptidyl aminopeptidases to identify (Pro-Arg)2-Rhodamine as a specific substrate for DPAP1 in Plasmodium falciparum lysates and Cathepsin C in rat liver extracts. We then used this substrate to develop highly sensitive HTS assays (Z' > 0.8) that are suitable for use in screening large collections of small molecules (i.e., >300,000) for inhibitors of these proteases. Finally, we demonstrate that it is possible to use broad-spectrum ABPs to identify target-specific substrates. Conclusions: We believe that this approach will have value for many enzymatic systems where access to large amounts of active enzyme is problematic. PMID:20700487
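
    The Z' > 0.8 figure quoted above is the standard screening-window statistic of Zhang et al., computed from positive and negative control wells. A minimal sketch with invented fluorescence readings:

```python
# Hedged sketch of the Z' (Z-prime) assay-quality statistic. The control
# well readings below are invented; Z' > 0.5 is conventionally considered
# an excellent screening window.

from statistics import mean, stdev

def z_prime(positives, negatives):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    window = abs(mean(positives) - mean(negatives))
    return 1 - 3 * (stdev(positives) + stdev(negatives)) / window

# Invented controls: uninhibited substrate turnover vs. fully inhibited.
pos = [1000, 1010, 990, 1005, 995]
neg = [100, 105, 95, 102, 98]
z = z_prime(pos, neg)
```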

  13. RAMICS: trainable, high-speed and biologically relevant alignment of high-throughput sequencing reads to coding DNA.

    PubMed

    Wright, Imogen A; Travers, Simon A

    2014-07-01

    The challenge presented by high-throughput sequencing necessitates the development of novel tools for accurate alignment of reads to reference sequences. Current approaches focus on using heuristics to map reads quickly to large genomes, rather than generating highly accurate alignments in coding regions. Such approaches are, thus, unsuited for applications such as amplicon-based analysis and the realignment phase of exome sequencing and RNA-seq, where accurate and biologically relevant alignment of coding regions is critical. To facilitate such analyses, we have developed a novel tool, RAMICS, that is tailored to mapping large numbers of sequence reads to short lengths (<10 000 bp) of coding DNA. RAMICS utilizes profile hidden Markov models to discover the open reading frame of each sequence and aligns to the reference sequence in a biologically relevant manner, distinguishing between genuine codon-sized indels and frameshift mutations. This approach facilitates the generation of highly accurate alignments, accounting for the error biases of the sequencing machine used to generate reads, particularly at homopolymer regions. Performance improvements are gained through the use of graphics processing units, which increase the speed of mapping through parallelization. RAMICS substantially outperforms all other mapping approaches tested in terms of alignment quality while maintaining highly competitive speed performance. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
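
    Two ideas in the RAMICS description can be illustrated simply: discovering the reading frame of a read (here approximated by counting stop codons per frame, a crude stand-in for the profile hidden Markov models the tool actually uses) and distinguishing genuine codon-sized indels from frameshifts:

```python
# Hedged sketch, not RAMICS itself: frame discovery by stop-codon
# counting and a codon-sized vs. frameshift indel test. The example
# sequence is invented.

STOPS = ("TAA", "TAG", "TGA")

def best_frame(seq):
    """Pick the reading frame (0, 1, or 2) with the fewest stop codons,
    a rough proxy for finding the open reading frame of a read."""
    def stop_count(frame):
        codons = (seq[i:i + 3] for i in range(frame, len(seq) - 2, 3))
        return sum(c in STOPS for c in codons)
    return min(range(3), key=stop_count)

def indel_class(length):
    """An indel whose length is a multiple of 3 preserves the reading
    frame; any other length causes a frameshift."""
    return "in-frame" if length % 3 == 0 else "frameshift"

frame = best_frame("ATGATTGACAAA")   # frames 1 and 2 each hit a TGA stop
```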

  14. Systematic Identification of Combinatorial Drivers and Targets in Cancer Cell Lines

    PubMed Central

    Tabchy, Adel; Eltonsy, Nevine; Housman, David E.; Mills, Gordon B.

    2013-01-01

    There is an urgent need to elicit and validate highly efficacious targets for combinatorial intervention from large scale ongoing molecular characterization efforts of tumors. We established an in silico bioinformatic platform in concert with a high throughput screening platform evaluating 37 novel targeted agents in 669 extensively characterized cancer cell lines reflecting the genomic and tissue-type diversity of human cancers, to systematically identify combinatorial biomarkers of response and co-actionable targets in cancer. Genomic biomarkers discovered in a 141 cell line training set were validated in an independent 359 cell line test set. We identified co-occurring and mutually exclusive genomic events that represent potential drivers and combinatorial targets in cancer. We demonstrate multiple cooperating genomic events that predict sensitivity to drug intervention independent of tumor lineage. The coupling of scalable in silico and biologic high throughput cancer cell line platforms for the identification of co-events in cancer delivers rational combinatorial targets for synthetic lethal approaches with a high potential to pre-empt the emergence of resistance. PMID:23577104

  15. A high-throughput approach to profile RNA structure.

    PubMed

    Delli Ponti, Riccardo; Marti, Stefanie; Armaos, Alexandros; Tartaglia, Gian Gaetano

    2017-03-17

    Here we introduce the Computational Recognition of Secondary Structure (CROSS) method to calculate the structural profile of an RNA sequence (single- or double-stranded state) at single-nucleotide resolution and without sequence length restrictions. We trained CROSS using data from high-throughput experiments such as Selective 2'-Hydroxyl Acylation analyzed by Primer Extension (SHAPE; mouse and HIV transcriptomes) and Parallel Analysis of RNA Structure (PARS; human and yeast transcriptomes) as well as high-quality NMR/X-ray structures (PDB database). The algorithm uses primary structure information alone to predict experimental structural profiles with >80% accuracy, showing high performance on large RNAs such as Xist (17,900 nucleotides; area under the ROC curve (AUC) of 0.75 on dimethyl sulfate (DMS) experiments). We integrated CROSS into thermodynamics-based methods to predict secondary structure and observed an increase in their predictive power by up to 30%. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
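
    The AUC of 0.75 quoted above is the usual rank-based area under the ROC curve: the probability that a randomly chosen positive outranks a randomly chosen negative. A small sketch with invented per-nucleotide scores and probing labels:

```python
# Hedged sketch of rank-based AUC scoring for a structural-profile
# predictor. Scores and labels are invented, not CROSS output.

def auc(scores, labels):
    """AUC as the fraction of (positive, negative) pairs where the
    positive scores higher; ties count half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]   # predicted double-strandedness
labels = [1,   1,   0,   1,   0,   0]     # 1 = paired in the probing data
a = auc(scores, labels)                    # 8 of 9 pairs ranked correctly
```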

  16. Systematic identification of combinatorial drivers and targets in cancer cell lines.

    PubMed

    Tabchy, Adel; Eltonsy, Nevine; Housman, David E; Mills, Gordon B

    2013-01-01

    There is an urgent need to elicit and validate highly efficacious targets for combinatorial intervention from large scale ongoing molecular characterization efforts of tumors. We established an in silico bioinformatic platform in concert with a high throughput screening platform evaluating 37 novel targeted agents in 669 extensively characterized cancer cell lines reflecting the genomic and tissue-type diversity of human cancers, to systematically identify combinatorial biomarkers of response and co-actionable targets in cancer. Genomic biomarkers discovered in a 141 cell line training set were validated in an independent 359 cell line test set. We identified co-occurring and mutually exclusive genomic events that represent potential drivers and combinatorial targets in cancer. We demonstrate multiple cooperating genomic events that predict sensitivity to drug intervention independent of tumor lineage. The coupling of scalable in silico and biologic high throughput cancer cell line platforms for the identification of co-events in cancer delivers rational combinatorial targets for synthetic lethal approaches with a high potential to pre-empt the emergence of resistance.

  17. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

    PubMed

    Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

    2014-12-01

    As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.

  18. RAD sequencing yields a high success rate for westslope cutthroat and rainbow trout species-diagnostic SNP assays

    USGS Publications Warehouse

    Stephen J. Amish,; Paul A. Hohenlohe,; Sally Painter,; Robb F. Leary,; Muhlfeld, Clint C.; Fred W. Allendorf,; Luikart, Gordon

    2012-01-01

    Hybridization with introduced rainbow trout threatens most native westslope cutthroat trout populations. Understanding the genetic effects of hybridization and introgression requires a large set of high-throughput, diagnostic genetic markers to inform conservation and management. Recently, we identified several thousand candidate single-nucleotide polymorphism (SNP) markers based on RAD sequencing of 11 westslope cutthroat trout and 13 rainbow trout individuals. Here, we used flanking sequence for 56 of these candidate SNP markers to design high-throughput genotyping assays. We validated the assays on a total of 92 individuals from 22 populations and seven hatchery strains. Forty-six assays (82%) amplified consistently and allowed easy identification of westslope cutthroat and rainbow trout alleles as well as heterozygote controls. The 46 SNPs will provide high power for early detection of population admixture and improved identification of hybrid and nonhybridized individuals. This technique shows promise as a very low-cost, reliable and relatively rapid method for developing and testing SNP markers for nonmodel organisms with limited genomic resources.

  19. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    PubMed

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection, when hundreds of samples must be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol, from endoglycosidase digestion through fluorophore labeling and cleanup, with high-throughput sample processing in 96-well plate format using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.

  20. A confidence interval analysis of sampling effort, sequencing depth, and taxonomic resolution of fungal community ecology in the era of high-throughput sequencing.

    PubMed

    Oono, Ryoko

    2017-01-01

    High-throughput sequencing technology has helped microbial community ecologists explore ecological and evolutionary patterns at unprecedented scales. The benefits of a large sample size still typically outweigh those of greater sequencing depth per sample for accurate estimation of ecological inferences. However, excluding or not sequencing rare taxa may bias the answers to the questions 'how and why are communities different?' This study evaluates the confidence intervals of ecological inferences from high-throughput sequencing data of foliar fungal endophytes as case studies across a range of sampling efforts, sequencing depths, and taxonomic resolutions to understand how technical and analytical practices may affect our interpretations. Increasing sample size reliably decreased confidence intervals across multiple community comparisons. However, the effect of sequencing depth on confidence intervals depended on how rare taxa influenced the dissimilarity estimates among communities, and greater depth did not significantly decrease confidence intervals for all community comparisons. A comparison of simulated communities under random drift suggests that sequencing depth is important in estimating dissimilarities between microbial communities under neutral selective processes. Confidence interval analyses reveal important biases as well as biological trends in microbial community studies that may otherwise be ignored when communities are compared only for statistically significant differences.
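The confidence-interval idea described above can be sketched in a few lines: given raw count vectors for two communities, resample reads to a chosen sequencing depth and take percentile bounds on the dissimilarity. A minimal numpy sketch; the Bray-Curtis metric and the multinomial resampling scheme are illustrative choices, not the paper's exact pipeline:

```python
import numpy as np

def bray_curtis(a, b):
    # Bray-Curtis dissimilarity between two taxon count vectors.
    return np.abs(a - b).sum() / (a + b).sum()

def bootstrap_ci(a, b, n_boot=2000, alpha=0.05, seed=0):
    # Resample reads with replacement (simulating a given sequencing
    # depth) from each community, then report a percentile CI for the
    # dissimilarity estimate.
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_boot):
        ra = rng.multinomial(a.sum(), a / a.sum())
        rb = rng.multinomial(b.sum(), b / b.sum())
        stats.append(bray_curtis(ra, rb))
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

Varying the depth passed to the resampler (rather than reusing `a.sum()`) is one way to probe how rare taxa widen or narrow the interval.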

  1. Profiling Cholinesterase Adduction: A High-Throughput Prioritization Method for Organophosphate Exposure Samples

    PubMed Central

    Carter, Melissa D.; Crow, Brian S.; Pantazides, Brooke G.; Watson, Caroline M.; deCastro, B. Rey; Thomas, Jerry D.; Blake, Thomas A.; Johnson, Rudolph C.

    2017-01-01

    A high-throughput prioritization method was developed for use with a validated confirmatory method detecting organophosphorus nerve agent exposure by immunomagnetic separation-HPLC-MS/MS. A ballistic gradient was incorporated into this analytical method in order to profile unadducted butyrylcholinesterase (BChE) in clinical samples. With a Z′-factor (Zhang et al., 1999) of 0.88 ± 0.01 (SD) for the control analytes and a Z-factor of 0.25 ± 0.06 (SD) for serum samples, the assay is rated an “excellent assay” for the synthetic peptide controls used and a “doable assay” when used to prioritize clinical samples. Hits, defined as samples containing BChE Ser-198 adducts or no BChE present, were analyzed in a confirmatory method for identification and quantitation of the BChE adduct, if present. The ability to prioritize samples by highest exposure for confirmatory analysis is of particular importance in an exposure to cholinesterase inhibitors such as organophosphorus nerve agents, where a large number of clinical samples may be collected. In an initial blind screen, 67 out of 70 samples were accurately identified, giving an assay accuracy of 96% with no false negatives. The method is the first to provide a high-throughput prioritization assay for profiling adduction of Ser-198 of BChE in clinical samples. PMID:23954929
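The quality ratings quoted above follow the screening-window coefficient of Zhang et al. (1999), which is straightforward to compute from positive- and negative-control plate statistics. A minimal sketch; the band labels follow the 1999 paper, and the input values in the usage below are illustrative:

```python
def z_prime(mu_pos, sd_pos, mu_neg, sd_neg):
    # Z'-factor of Zhang et al. (1999):
    # Z' = 1 - 3*(sd_pos + sd_neg) / |mu_pos - mu_neg|
    return 1.0 - 3.0 * (sd_pos + sd_neg) / abs(mu_pos - mu_neg)

def rate(z):
    # Classification bands from Zhang et al. (1999).
    if z >= 0.5:
        return "excellent assay"
    if z > 0:
        return "doable assay"
    return "screening not feasible"
```

For example, `z_prime(100.0, 2.0, 10.0, 1.0)` gives 0.9, an "excellent assay", while a value of 0.25, as reported for the serum samples, falls in the "doable" band.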

  2. Modeling Steroidogenesis Disruption Using High-Throughput ...

    EPA Pesticide Factsheets

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  3. Supplemental treatment of air in airborne infection isolation rooms using high-throughput in-room air decontamination units.

    PubMed

    Bergeron, Vance; Chalfine, Annie; Misset, Benoît; Moules, Vincent; Laudinet, Nicolas; Carlet, Jean; Lina, Bruno

    2011-05-01

    Evidence has recently emerged indicating that in addition to large airborne droplets, fine aerosol particles can be an important mode of influenza transmission that may have been hitherto underestimated. Furthermore, recent performance studies evaluating airborne infection isolation (AII) rooms designed to house infectious patients have revealed major discrepancies between what is prescribed and what is actually measured. We conducted an experimental study to investigate the use of high-throughput in-room air decontamination units for supplemental protection against airborne contamination in areas that host infectious patients. The study included both intrinsic performance tests of the air-decontamination unit against biological aerosols of particular epidemiologic interest and field tests in a hospital AII room under different ventilation scenarios. The unit tested efficiently eradicated airborne H5N2 influenza and Mycobacterium bovis (a 4- to 5-log single-pass reduction) and, when implemented with a room extractor, reduced the peak contamination levels by a factor of 5, with decontamination rates at least 33% faster than those achieved with the extractor alone. High-throughput in-room air treatment units can provide supplemental control of airborne pathogen levels in patient isolation rooms. Copyright © 2011 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.

  4. A confidence interval analysis of sampling effort, sequencing depth, and taxonomic resolution of fungal community ecology in the era of high-throughput sequencing

    PubMed Central

    2017-01-01

    High-throughput sequencing technology has helped microbial community ecologists explore ecological and evolutionary patterns at unprecedented scales. The benefits of a large sample size still typically outweigh those of greater sequencing depth per sample for accurate estimation of ecological inferences. However, excluding or not sequencing rare taxa may bias the answers to the questions ‘how and why are communities different?’ This study evaluates the confidence intervals of ecological inferences from high-throughput sequencing data of foliar fungal endophytes as case studies across a range of sampling efforts, sequencing depths, and taxonomic resolutions to understand how technical and analytical practices may affect our interpretations. Increasing sample size reliably decreased confidence intervals across multiple community comparisons. However, the effect of sequencing depth on confidence intervals depended on how rare taxa influenced the dissimilarity estimates among communities, and greater depth did not significantly decrease confidence intervals for all community comparisons. A comparison of simulated communities under random drift suggests that sequencing depth is important in estimating dissimilarities between microbial communities under neutral selective processes. Confidence interval analyses reveal important biases as well as biological trends in microbial community studies that may otherwise be ignored when communities are compared only for statistically significant differences. PMID:29253889

  5. Clinical application of high throughput molecular screening techniques for pharmacogenomics

    PubMed Central

    Wiita, Arun P; Schrijver, Iris

    2011-01-01

    Genetic analysis is one of the fastest-growing areas of clinical diagnostics. Fortunately, as our knowledge of clinically relevant genetic variants rapidly expands, so does our ability to detect these variants in patient samples. Increasing demand for genetic information may necessitate the use of high throughput diagnostic methods as part of clinically validated testing. Here we provide a general overview of our current and near-future abilities to perform large-scale genetic testing in the clinical laboratory. First we review in detail molecular methods used for high throughput mutation detection, including techniques able to monitor thousands of genetic variants for a single patient or to genotype a single genetic variant for thousands of patients simultaneously. These methods are analyzed in the context of pharmacogenomic testing in the clinical laboratories, with a focus on tests that are currently validated as well as those that hold strong promise for widespread clinical application in the near future. We further discuss the unique economic and clinical challenges posed by pharmacogenomic markers. Our ability to detect genetic variants frequently outstrips our ability to accurately interpret them in a clinical context, carrying implications both for test development and introduction into patient management algorithms. These complexities must be taken into account prior to the introduction of any pharmacogenomic biomarker into routine clinical testing. PMID:23226057

  6. Rapid high-throughput cloning and stable expression of antibodies in HEK293 cells.

    PubMed

    Spidel, Jared L; Vaessen, Benjamin; Chan, Yin Yin; Grasso, Luigi; Kline, J Bradford

    2016-12-01

    Single-cell based amplification of immunoglobulin variable regions is a rapid and powerful technique for cloning antigen-specific monoclonal antibodies (mAbs) for purposes ranging from general laboratory reagents to therapeutic drugs. From the initial screening process involving small quantities of hundreds or thousands of mAbs through in vitro characterization and subsequent in vivo experiments requiring large quantities of only a few, having a robust system for generating mAbs from cloning through stable cell line generation is essential. A protocol was developed to decrease the time, cost, and effort required by traditional cloning and expression methods by eliminating bottlenecks in these processes. Removing the clonal selection steps from the cloning process using a highly efficient ligation-independent protocol and from the stable cell line process by utilizing bicistronic plasmids to generate stable semi-clonal cell pools facilitated an increased throughput of the entire process from plasmid assembly through transient transfections and selection of stable semi-clonal cell pools. Furthermore, the time required by a single individual to clone, express, and select stable cell pools in a high-throughput format was reduced from 4 to 6 months to only 4 to 6 weeks. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Development of a high-throughput screen to detect inhibitors of TRPS1 sumoylation.

    PubMed

    Brandt, Martin; Szewczuk, Lawrence M; Zhang, Hong; Hong, Xuan; McCormick, Patricia M; Lewis, Tia S; Graham, Taylor I; Hung, Sunny T; Harper-Jones, Amber D; Kerrigan, John J; Wang, Da-Yuan; Dul, Edward; Hou, Wangfang; Ho, Thau F; Meek, Thomas D; Cheung, Mui H; Johanson, Kyung O; Jones, Christopher S; Schwartz, Benjamin; Kumar, Sanjay; Oliff, Allen I; Kirkpatrick, Robert B

    2013-06-01

    Small ubiquitin-like modifier (SUMO) belongs to the family of ubiquitin-like proteins (Ubls) that can be reversibly conjugated to target-specific lysines on substrate proteins. Although covalently sumoylated products are readily detectable in gel-based assays, there has been little progress toward the development of robust quantitative sumoylation assay formats for the evaluation of large compound libraries. In an effort to identify inhibitors of ubiquitin carrier protein 9 (Ubc9)-dependent sumoylation, a high-throughput fluorescence polarization assay was developed, which allows detection of Lys-1201 sumoylation, corresponding to the major site of functional sumoylation within the transcriptional repressor trichorhinophalangeal syndrome type I protein (TRPS1). A minimal hexapeptide substrate, TMR-VVK₁₂₀₁TEK, was used in this assay format to afford high-throughput screening of the GlaxoSmithKline diversity compound collection. A total of 728 hits were confirmed, but no specific noncovalent inhibitors of Ubc9-dependent trans-sumoylation were found. However, several diaminopyrimidine compounds were identified as inhibitors in the assay, with IC₅₀ values of 12.5 μM. These were further characterized as competent substrates that were themselves sumoylated by Ubc9 and that competed with sumoylation of the TRPS1 peptide substrate.

  8. GPU Lossless Hyperspectral Data Compression System

    NASA Technical Reports Server (NTRS)

    Aranki, Nazeeh I.; Keymeulen, Didier; Kiely, Aaron B.; Klimesh, Matthew A.

    2014-01-01

    Hyperspectral imaging systems onboard aircraft or spacecraft can acquire large amounts of data, putting a strain on limited downlink and storage resources. Onboard data compression can mitigate this problem but may require a system capable of a high throughput. In order to achieve a high throughput with a software compressor, a graphics processing unit (GPU) implementation of a compressor was developed targeting the current state-of-the-art GPUs from NVIDIA(R). The implementation is based on the fast lossless (FL) compression algorithm reported in "Fast Lossless Compression of Multispectral-Image Data" (NPO-42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which uses an adaptive filtering method to achieve state-of-the-art compression effectiveness on hyperspectral data at low computational complexity. The new Consultative Committee for Space Data Systems (CCSDS) Standard for Lossless Multispectral & Hyperspectral image compression (CCSDS 123) is based on the FL compressor. The software makes use of the highly parallel processing capability of GPUs to achieve a throughput at least six times higher than that of a software implementation running on a single-core CPU. This implementation provides a practical real-time solution for compression of data from airborne hyperspectral instruments.
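The core idea behind such predictive lossless coders, namely predict each sample from previously coded data and store only the integer residual, can be shown with a toy inter-band predictor. This is a didactic sketch only, not the FL or CCSDS 123 algorithm, which uses an adaptive filter rather than a fixed previous-band prediction:

```python
import numpy as np

def encode(cube):
    # Toy predictive coder for a (bands, rows, cols) integer cube:
    # predict each band as the previous band and keep the residual.
    # Lossless, because the decoder can rebuild each prediction from
    # already-decoded data.
    residuals = np.empty_like(cube)
    residuals[0] = cube[0]                  # first band sent verbatim
    residuals[1:] = cube[1:] - cube[:-1]    # inter-band differences
    return residuals

def decode(residuals):
    # Invert the differencing: cumulative sum along the band axis.
    return np.cumsum(residuals, axis=0)
```

Because hyperspectral bands are highly correlated, the residuals are small and compress well with any entropy coder; the real FL/CCSDS 123 predictor adapts its filter weights per sample to sharpen this effect.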

  9. Multilength Scale Patterning of Functional Layers by Roll-to-Roll Ultraviolet-Light-Assisted Nanoimprint Lithography.

    PubMed

    Leitgeb, Markus; Nees, Dieter; Ruttloff, Stephan; Palfinger, Ursula; Götz, Johannes; Liska, Robert; Belegratis, Maria R; Stadlober, Barbara

    2016-05-24

    Top-down fabrication of nanostructures with high throughput is still a challenge. We demonstrate the fast (>10 m/min) and continuous fabrication of multilength scale structures by roll-to-roll UV-nanoimprint lithography on a 250 mm wide web. The large-area nanopatterning is enabled by a multicomponent UV-curable resist system (JRcure) with viscous, mechanical, and surface properties that are tunable over a wide range to allow its use either as a polymer stamp material or as an imprint resist. The adjustable elasticity and surface chemistry of the resist system enable multistep self-replication of structured resist layers. Minimizing the surface energies of stamp and resist is decisive for defect-free roll-to-roll UV-nanoimprinting, and a stepwise reduction of stiffness from one layer to the next is essential for optimizing reproduction fidelity, especially for nanoscale features. Accordingly, we demonstrate the continuous replication of 3D nanostructures and the high-throughput fabrication of multilength scale resist structures, resulting in flexible polyethylene terephthalate film rolls with superhydrophobic properties. Moreover, a water-soluble UV-imprint resist (JRlift) is introduced that enables residue-free roll-to-roll nanoimprinting. Thereby we demonstrate high-throughput fabrication of metallic patterns with line widths of only 200 nm.

  10. High-throughput screening of dye-ligands for chromatography.

    PubMed

    Kumar, Sunil; Punekar, Narayan S

    2014-01-01

    Dye-ligand-based chromatography became popular after Cibacron Blue, the first reactive textile dye, found application in protein purification. Many other textile dyes have since been successfully used to purify a number of proteins and enzymes. While the exact nature of their interaction with target proteins is often unclear, dye-ligands are thought to mimic the structural features of their corresponding substrates, cofactors, etc. Dye-ligand affinity matrices are therefore considered pseudo-affinity matrices. In addition, dye-ligands may simply bind proteins through electrostatic, hydrophobic, and hydrogen-bonding interactions. Because of their low cost, ready availability, and structural stability, dye-ligand affinity matrices have gained much popularity. The choice among a large number of dye structures offers a range of matrices to be prepared and tested. When presented in high-throughput screening mode, these dye-ligand matrices provide a formidable tool for protein purification. One could pick from the list of dye-ligands already available or build a systematic library of such structures. A high-throughput screen may be set up to choose the best dye-ligand matrix, as well as ideal conditions for binding and elution, for a given protein. The mode of operation can be either manual or automated. The technology is available to test the performance of dye-ligand matrices in small volumes on an automated liquid-handling workstation. Screening a systematic library of dye-ligand structures can help establish a structure-activity relationship. While the origins of dye-ligand chromatography lay in exploiting pseudo-affinity, it is now possible to design very specific biomimetic dye structures. High-throughput screening will be of value in this endeavor as well.

  11. Analysis of high-throughput screening reveals the effect of surface topographies on cellular morphology.

    PubMed

    Hulsman, Marc; Hulshof, Frits; Unadkat, Hemant; Papenburg, Bernke J; Stamatialis, Dimitrios F; Truckenmüller, Roman; van Blitterswijk, Clemens; de Boer, Jan; Reinders, Marcel J T

    2015-03-01

    Surface topographies of materials considerably impact cellular behavior as they have been shown to affect cell growth, provide cell guidance, and even induce cell differentiation. Consequently, for successful application in tissue engineering, the contact interface of biomaterials needs to be optimized to induce the required cell behavior. However, a rational design of biomaterial surfaces is severely hampered because knowledge is lacking on the underlying biological mechanisms. Therefore, we previously developed a high-throughput screening device (TopoChip) that measures cell responses to large libraries of parameterized topographical material surfaces. Here, we introduce a computational analysis of high-throughput materiome data to capture the relationship between the surface topographies of materials and cellular morphology. We apply robust statistical techniques to find surface topographies that best promote a certain specified cellular response. By augmenting surface screening with data-driven modeling, we determine which properties of the surface topographies influence the morphological properties of the cells. With this information, we build models that predict the cellular response to surface topographies that have not yet been measured. We analyze cellular morphology on 2176 surfaces, and find that the surface topography significantly affects various cellular properties, including the roundness and size of the nucleus, as well as the perimeter and orientation of the cells. Our learned models capture and accurately predict these relationships and reveal a spectrum of topographies that induce various levels of cellular morphologies. Taken together, this novel approach of high-throughput screening of materials and subsequent analysis opens up possibilities for a rational design of biomaterial surfaces. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
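A data-driven model of the kind described can be as simple as penalized linear regression from topography parameters to a morphology readout, fit on measured surfaces and used to predict responses for unmeasured ones. A minimal numpy stand-in; the feature names and the ridge formulation are assumptions for illustration, not the paper's exact modeling method:

```python
import numpy as np

def fit_ridge(X, y, lam=1.0):
    # Closed-form ridge regression: w = (X'X + lam*I)^{-1} X'y.
    # X: rows = surfaces, cols = topography parameters (hypothetical
    # features, e.g. pillar size, spacing); y: one morphology readout
    # per surface (e.g. nucleus roundness).
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def predict(X, w):
    # Predicted morphology response for (possibly unmeasured) surfaces.
    return X @ w
```

The penalty `lam` plays the role of the robustness the authors emphasize: it keeps coefficients stable when thousands of surfaces share correlated topographical features.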

  12. WE-E-BRE-03: Biological Validation of a Novel High-Throughput Irradiator for Predictive Radiation Sensitivity Bioassays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, TL; Martin, JA; Shepard, AJ

    2014-06-15

    Purpose: The large dose-response variation in both tumor and normal cells between individual patients has led to the recent implementation of predictive bioassays of patient-specific radiation sensitivity in order to personalize radiation therapy. This exciting new clinical paradigm has led us to develop a novel high-throughput, variable dose-rate irradiator to accompany these efforts. Here we present the biological validation of this irradiator through the use of human cells as a relative dosimeter, assessed by two metrics: DNA double-strand break repair pathway modulation and intracellular reactive oxygen species production. Methods: Immortalized human tonsilar epithelial cells were cultured in 96-well microtiter plates and irradiated in groups of eight wells to absorbed doses of 0, 0.5, 1, 2, 4, and 8 Gy. High-throughput immunofluorescence microscopy was used to detect γH2AX, a recruiter of the DNA double-strand break repair machinery. The same analysis was performed with the cells stained with CM-H2DCFDA, which produces a fluorescent adduct when exposed to reactive oxygen species during the irradiation cycle. Results: Irradiations of the immortalized human tonsilar epithelial cells at absorbed doses of 0, 0.5, 1, 2, 4, and 8 Gy produced excellent linearity in the γH2AX and CM-H2DCFDA signals, with R² values of 0.9939 and 0.9595, respectively. Single-cell gel electrophoresis experimentation for the detection of physical DNA double-strand breaks is ongoing. Conclusions: This work indicates significant potential for our high-throughput, variable dose-rate irradiator in patient-specific predictive radiation sensitivity bioassays. This irradiator provides a powerful tool by increasing the efficiency and number of assay techniques available to help personalize radiation therapy.
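The linearity figures quoted (0.9939 and 0.9595) come from a straight-line dose-response fit; the coefficient of determination is worth making explicit. A generic numpy sketch, not the authors' analysis code:

```python
import numpy as np

def r_squared(dose, response):
    # Coefficient of determination for a linear dose-response fit:
    # R^2 = 1 - SS_res / SS_tot after fitting response ~ slope*dose + b.
    slope, intercept = np.polyfit(dose, response, 1)
    pred = slope * dose + intercept
    ss_res = np.sum((response - pred) ** 2)
    ss_tot = np.sum((response - np.mean(response)) ** 2)
    return 1 - ss_res / ss_tot
```

With the dose levels used in the study (0 to 8 Gy), a perfectly linear signal returns 1.0; values near 0.99, as reported for γH2AX, indicate an almost ideal linear dose response.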

  13. 1001 Ways to run AutoDock Vina for virtual screening

    NASA Astrophysics Data System (ADS)

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.
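Points (1) and (2) above, ligand-level parallelism plus explicit seed capture, can be sketched with stdlib tools. The file names below are hypothetical and the actual process launch is left commented so the sketch stays self-contained; `--receptor`, `--ligand`, `--out`, `--seed`, and `--exhaustiveness` are standard AutoDock Vina command-line flags:

```python
from concurrent.futures import ProcessPoolExecutor  # for ligand-level fan-out

def vina_command(receptor, ligand, out, seed, exhaustiveness=8):
    # Record the seed per ligand: capturing it is necessary (though,
    # as the paper notes, not sufficient) for reproducibility on
    # heterogeneous distributed systems.
    return (f"vina --receptor {receptor} --ligand {ligand} "
            f"--out {out} --seed {seed} --exhaustiveness {exhaustiveness}")

def build_screen(ligands, receptor, seed0=42):
    # One Vina invocation per ligand; on a multi-core node these
    # commands would be dispatched concurrently, e.g.:
    #   with ProcessPoolExecutor() as ex:
    #       ex.map(lambda c: subprocess.run(c.split(), check=True), cmds)
    return [vina_command(receptor, lig, lig.replace(".pdbqt", "_out.pdbqt"),
                         seed0 + i)
            for i, lig in enumerate(ligands)]
```

Deriving each seed deterministically from a base seed (here `seed0 + i`) keeps a record of every run while still letting the per-ligand jobs land on any worker.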

  14. 1001 Ways to run AutoDock Vina for virtual screening.

    PubMed

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.

  15. A high throughput mechanical screening device for cartilage tissue engineering.

    PubMed

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  16. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and the complexity of biomedical data collected from various sources. This planet-scale data poses serious challenges for storage and computing technologies. Cloud computing is an attractive alternative because it jointly addresses scalable storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical research make this vast amount of diverse data meaningful and usable. PMID:24288665

  17. Prediction-based association control scheme in dense femtocell networks

    PubMed Central

    Pham, Ngoc-Thai; Huynh, Thong; Hwang, Won-Joo; You, Ilsun; Choo, Kim-Kwang Raymond

    2017-01-01

    The deployment of a large number of femtocell base stations allows us to extend coverage and efficiently utilize resources in a low-cost manner. However, the small cell size of femtocell networks can subject mobile users to frequent handovers and, consequently, throughput degradation. Thus, in this paper, we propose predictive association control schemes to improve the system's effective throughput. Our design focuses on reducing handover frequency without sacrificing throughput. The proposed schemes make the handover decisions that contribute most to network throughput and are suitable for distributed implementation. The simulation results show significant gains over existing methods in terms of both handover frequency and network throughput. PMID:28328992

  18. Spatial tuning of acoustofluidic pressure nodes by altering net sonic velocity enables high-throughput, efficient cell sorting

    DOE PAGES

    Jung, Seung-Yong; Notton, Timothy; Fong, Erika; ...

    2015-01-07

    Particle sorting using acoustofluidics has enormous potential but widespread adoption has been limited by complex device designs and low throughput. Here, we report high-throughput separation of particles and T lymphocytes (600 μL min⁻¹) by altering the net sonic velocity to reposition acoustic pressure nodes in a simple two-channel device. Finally, the approach is generalizable to other microfluidic platforms for rapid, high-throughput analysis.

  19. Quantitative characterization of conformational-specific protein-DNA binding using a dual-spectral interferometric imaging biosensor.

    PubMed

    Zhang, Xirui; Daaboul, George G; Spuhler, Philipp S; Dröge, Peter; Ünlü, M Selim

    2016-03-14

    DNA-binding proteins play crucial roles in the maintenance and functions of the genome, and yet their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites through an indirect readout mechanism, by sensing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have enabled effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information about DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and the nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of protein bound to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines the average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformation-specific protein-DNA interactions.

  20. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures for evaluating the statistical significance of local trend scores have limited its application to high-throughput time series data, e.g., data from studies based on next-generation sequencing technology. By extending the theory for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with a delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach that integrates the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting dynamic co-occurrence patterns among organisms. The software tool is integrated into the eLSA software package, which now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
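    The permutation procedure that the approximation replaces can be sketched in a few lines. This is a simplified, hypothetical version (trend signs only, no delays, ungapped local runs), not the eLSA implementation:

```python
import random

def trend(series):
    """Map a time series to its local trend signs: +1 up, -1 down, 0 flat."""
    return [(x2 > x1) - (x2 < x1) for x1, x2 in zip(series, series[1:])]

def local_trend_score(t1, t2):
    """Highest-scoring contiguous run of agreeing trends
    (Kadane's maximum-subarray on the elementwise trend products)."""
    best = run = 0
    for a, b in zip(t1, t2):
        run = max(0, run + a * b)
        best = max(best, run)
    return best

def permutation_pvalue(s1, s2, n_perm=2000, seed=0):
    """Slow permutation estimate of significance; this is the step the
    Markov-chain approximation in the paper replaces."""
    rng = random.Random(seed)
    t1, t2 = trend(s1), trend(s2)
    observed = local_trend_score(t1, t2)
    hits = 0
    for _ in range(n_perm):
        shuffled = t2[:]
        rng.shuffle(shuffled)
        if local_trend_score(t1, shuffled) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)

s1 = [1, 2, 3, 2, 1, 2, 3, 4, 3, 2]
s2 = [0, 1, 2, 1, 0, 1, 2, 3, 2, 1]  # co-changing with s1
score, p = permutation_pvalue(s1, s2)
```

The cost of the loop over `n_perm` shuffles for every pair is exactly what makes all-versus-all comparisons impractical and motivates the closed-form approximation.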

  1. Development of a High-Throughput Resequencing Array for the Detection of Pathogenic Mutations in Osteogenesis Imperfecta

    PubMed Central

    Wang, Yao; Cui, Yazhou; Zhou, Xiaoyan; Han, Jinxiang

    2015-01-01

    Objective Osteogenesis imperfecta (OI) is a rare inherited skeletal disease characterized by bone fragility and low bone density. The mutations in this disorder have been widely reported to lie on various exonic hotspots of the candidate genes, including COL1A1, COL1A2, CRTAP, LEPRE1, and FKBP10, creating a great demand for precise genetic tests. However, the large sizes of these genes make the process daunting and the analyses inefficient and expensive. Therefore, we aimed to develop a fast, accurate, efficient, and less expensive sequencing platform for OI diagnosis; to this end, the use of an advanced array-based technique was proposed. Method A CustomSeq Affymetrix Resequencing Array was established for high-throughput sequencing of five genes simultaneously. Genomic DNA extraction from 13 OI patients and 85 normal controls and amplification using long-range PCR (LR-PCR) were followed by DNA fragmentation and chip hybridization, according to standard Affymetrix protocols. Hybridization signals were determined using GeneChip Sequence Analysis Software (GSEQ). To examine feasibility, the outcome of the new resequencing approach was validated against the conventional capillary sequencing method. Result Overall call rates using the resequencing array were 96–98%, and the agreement between microarray and capillary sequencing was 99.99%. Eleven of the 13 OI patients with pathogenic mutations were successfully detected by the chip analysis without adjustment, and one further mutation could be identified by manual visual inspection. Conclusion A high-throughput resequencing array was developed that detects the disease-associated mutations in OI, providing a potential tool to facilitate large-scale genetic screening of OI patients. Through this method, a novel mutation was also found. PMID:25742658
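    The two headline metrics, call rate and cross-platform agreement, are straightforward to compute. A minimal sketch with hypothetical call strings (the real comparison covers five genes across 13 patients and 85 controls):

```python
def call_rate(calls, no_call="N"):
    """Fraction of positions assigned a base (i.e., not a no-call)."""
    return sum(c != no_call for c in calls) / len(calls)

def concordance(calls_a, calls_b, no_call="N"):
    """Agreement restricted to positions called by both platforms."""
    both = [(a, b) for a, b in zip(calls_a, calls_b)
            if a != no_call and b != no_call]
    return sum(a == b for a, b in both) / len(both)

array_calls     = "ACGTNACGTACGTNACGTAC"  # hypothetical resequencing-array calls
capillary_calls = "ACGTAACGTACGTAACGTAC"  # hypothetical capillary reference calls
rate = call_rate(array_calls)
agree = concordance(array_calls, capillary_calls)
```

Note that concordance is conventionally computed only over positions called on both platforms, which is why a 96–98% call rate can coexist with 99.99% agreement.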

  2. A rapid and high-throughput microplate spectrophotometric method for field measurement of nitrate in seawater and freshwater.

    PubMed

    Wu, Jiapeng; Hong, Yiguo; Guan, Fengjie; Wang, Yan; Tan, Yehui; Yue, Weizhong; Wu, Meilin; Bin, Liying; Wang, Jiaping; Wen, Jiali

    2016-02-01

    The well-known zinc-cadmium reduction method is frequently used for the determination of nitrate. However, it is seldom applied in field research on nitrate because it is time-consuming and demands large sample volumes. Here, we report a modified zinc-cadmium reduction method (MZCRM) for measuring nitrate at natural-abundance levels in both seawater and freshwater. The main improvements of the MZCRM include small-volume disposable reaction tubes, a vortex apparatus for shaking to increase the reduction rate, and a microplate reader for high-throughput spectrophotometric measurements. To account for the salt effect, two salinity ranges (5–10 psu and 20–35 psu) were set up for more accurate determination of nitrate under low- and high-salinity conditions, respectively. Under optimized experimental conditions, the reduction rates stabilized at 72% and 63% at salinities of 5 and 20 psu, respectively. The detection limit for nitrate was 0.5 μM, and the response was linear up to 100 μM (RSD 4.8%). Assays of environmental samples demonstrated that the MZCRM agreed well with the conventional zinc-cadmium reduction method. Overall, this modified method greatly improves accuracy and operational efficiency, enabling rapid, high-throughput, low-cost determination of nitrate in the field.
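    Converting microplate absorbances to nitrate concentrations rests on a linear calibration against standards within the stated linear range. A minimal sketch with illustrative (not measured) values:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical absorbance readings for nitrate standards (μM),
# within the method's stated linear range (up to 100 μM).
standards  = [0.0, 5.0, 25.0, 50.0, 100.0]
absorbance = [0.010, 0.060, 0.260, 0.510, 1.010]  # illustrative values only

slope, intercept = fit_line(standards, absorbance)

def nitrate_uM(a):
    """Invert the calibration line to convert absorbance to concentration."""
    return (a - intercept) / slope
```

In practice a separate calibration would be fitted for each salinity range, since the reduction rate (72% vs. 63%) changes the slope.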

  3. Morphology control in polymer blend fibers—a high throughput computing approach

    NASA Astrophysics Data System (ADS)

    Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar

    2016-08-01

    Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat, anisotropic wicking of moisture, coaxial morphologies for optical applications, as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent-based fabrication. We focus on a PS:PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for a specific morphology class. We finally demonstrate the potential of time-dependent processing conditions to obtain desired morphological features. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.

  4. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith

    2014-11-15

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitors of JCV DNA replication.
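    The abstract does not give assay-quality statistics, but a standard way to validate a luciferase-based HTS assay like this one is the Z'-factor, which compares the separation of positive and negative controls to their spread. A sketch with hypothetical readouts (the metric, not the numbers, comes from common HTS practice, not from this paper):

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor, a standard quality metric for HTS assays:
    values above 0.5 indicate an excellent assay window."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

# Hypothetical luciferase readouts (relative light units)
replicating    = [9800, 10100, 9950, 10050]  # active T-ag, intact origin
no_replication = [210, 190, 205, 195]        # replication-dead control
zp = z_prime(replicating, no_replication)
```

A wide, reproducible window between the replicating and non-replicating controls is what makes the assay usable for inhibitor screening.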

  5. Performance Evaluation of the Sysmex CS-5100 Automated Coagulation Analyzer.

    PubMed

    Chen, Liming; Chen, Yu

    2015-01-01

    Coagulation testing is widely applied clinically, and laboratories increasingly demand automated coagulation analyzers with short turnaround times and high throughput. The purpose of this study was to evaluate the performance of the Sysmex CS-5100 automated coagulation analyzer for routine use in a clinical laboratory. The prothrombin time (PT), international normalized ratio (INR), activated partial thromboplastin time (APTT), fibrinogen (Fbg), and D-dimer were compared between the Sysmex CS-5100 and Sysmex CA-7000 analyzers, and the imprecision, comparability, throughput, STAT function, and performance for abnormal samples were measured for each. The within-run and between-run coefficients of variation (CV) for the PT, APTT, INR, and D-dimer analyses showed excellent results in both the normal and pathologic ranges. Results from the Sysmex CS-5100 and Sysmex CA-7000 were highly correlated. The throughput of the Sysmex CS-5100 was faster than that of the Sysmex CA-7000, and there was no interference from total bilirubin or triglyceride concentrations in the Sysmex CS-5100 analyzer. We demonstrated that the Sysmex CS-5100 performs with satisfactory imprecision and is well suited for coagulation analysis in laboratories processing large sample numbers and icteric and lipemic samples.

  6. Real-time traffic sign detection and recognition

    NASA Astrophysics Data System (ADS)

    Herbschleb, Ernst; de With, Peter H. N.

    2009-01-01

    The continuous growth of imaging databases increasingly requires analysis tools for feature extraction. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images at resolutions up to 4,800×2,400 pixels. Because of the size of the database, both high reliability and high throughput are required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining both color and specific spatial information. The first stage contains an area-limitation step which is performance-critical for both the detection rate and the overall processing time. The second stage locates candidate traffic signs using recently published feature processing. The third stage contains a validation step to enhance the reliability of the algorithm; during this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput is 35 Hz for line-of-sight images of 800×600 pixels and 4 Hz for panorama images. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.

  7. WIYN bench upgrade: a revitalized spectrograph

    NASA Astrophysics Data System (ADS)

    Bershady, M.; Barden, S.; Blanche, P.-A.; Blanco, D.; Corson, C.; Crawford, S.; Glaspey, J.; Habraken, S.; Jacoby, G.; Keyes, J.; Knezek, P.; Lemaire, P.; Liang, M.; McDougall, E.; Poczulp, G.; Sawyer, D.; Westfall, K.; Willmarth, D.

    2008-07-01

    We describe the redesign and upgrade of the versatile fiber-fed Bench Spectrograph on the WIYN 3.5m telescope. The spectrograph is fed by either the Hydra multi-object positioner or integral-field units (IFUs) at two other ports, and can be configured with an adjustable camera-collimator angle to use low-order and echelle gratings. The upgrade, including a new collimator, charge-coupled device (CCD) and modern controller, and volume-phase holographic gratings (VPHG), has high performance-to-cost ratio by combining new technology with a system reconfiguration that optimizes throughput while utilizing as much of the existing instrument as possible. A faster, all-refractive collimator enhances throughput by 60%, nearly eliminates the slit-function due to vignetting, and improves image quality to maintain instrumental resolution. Two VPH gratings deliver twice the diffraction efficiency of existing surface-relief gratings: A 740 l/mm grating (float-glass and post-polished) used in 1st and 2nd-order, and a large 3300 l/mm grating (spectral resolution comparable to the R2 echelle). The combination of collimator, high-quantum efficiency (QE) CCD, and VPH gratings yields throughput gain-factors of up to 3.5.

  8. Development of multitissue microfluidic dynamic array for assessing changes in gene expression associated with channel catfish appetite, growth, metabolism, and intestinal health

    USDA-ARS?s Scientific Manuscript database

    Large-scale, gene expression methods allow for high throughput analysis of physiological pathways at a fraction of the cost of individual gene expression analysis. Systems, such as the Fluidigm quantitative PCR array described here, can provide powerful assessments of the effects of diet, environme...

  9. Comparison and Analysis of Toxcast Data with In Vivo Data for Food-Relevant Compounds Using The Risk21 Approach

    EPA Science Inventory

    The ToxCast program has generated a great wealth of in vitro high throughput screening (HTS) data on a large number of compounds, providing a unique resource of information on the bioactivity of these compounds. However, analysis of these data are ongoing, and interpretation and ...

  10. Evaluating differences in the homeostatic intestinal microbiome between selected strains of catfish Ictalurus punctatus and blue catfish Ictalurus furcatus

    USDA-ARS?s Scientific Manuscript database

    The intestinal microbiome (IM), or the community of commensal and pathogenic microbes living within the gut, have long been thought to play a large role in digestion and fish performance. Recently, high-throughput molecular sequencing has yielded insights into the importance of the IM in aquaculture...

  11. Investigation of the influence of leaf thickness on canopy reflectance and physiological traits in upland and Pima cotton populations

    USDA-ARS?s Scientific Manuscript database

    Field-based, high-throughput phenotyping (FB-HTP) methods are becoming more prevalent in plant genetics and breeding because they enable the evaluation of large numbers of genotypes under actual field conditions. Many systems for FB-HTP quantify and characterize the reflected radiation from the crop...

  12. High-throughput screening (HTS) and modeling of the retinoid ...

    EPA Pesticide Factsheets

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  13. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    EPA Science Inventory

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  14. High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes

    USDA-ARS?s Scientific Manuscript database

    High-throughput SNP genotyping provides a rapid way of developing resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...

  15. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    EPA Science Inventory

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO SimmonsNational Center for Computational Toxicology, US EPA, Research Triangle Park, NC USAThe EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  16. A quantitative literature-curated gold standard for kinase-substrate pairs

    PubMed Central

    2011-01-01

    We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431
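    Assessing a high-throughput dataset against the gold standard reduces to set precision and recall. A sketch with hypothetical kinase-substrate pairs (real entries would come from KID):

```python
def assess(dataset, gold):
    """Precision and recall of a high-throughput interaction set
    against a gold-standard set of kinase-substrate pairs."""
    tp = len(dataset & gold)
    return tp / len(dataset), tp / len(gold)

# Hypothetical pairs for illustration only.
gold = {("CDC28", "SIC1"), ("SNF1", "MIG1"), ("HOG1", "SKO1")}
hts  = {("CDC28", "SIC1"), ("SNF1", "MIG1"), ("PKA1", "XYZ9")}
precision, recall = assess(hts, gold)
```

This is the sense in which a literature-curated gold standard "can be used to assess published high-throughput datasets": each dataset is scored against the 517 high-confidence pairs.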

  17. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    PubMed

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, and surface defects, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  18. Outlook for Development of High-throughput Cryopreservation for Small-bodied Biomedical Model Fishes★

    PubMed Central

    Tiersch, Terrence R.; Yang, Huiping; Hu, E.

    2011-01-01

    With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace in human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies using these fish models have yielded tens of thousands of specific strains and lines, and the number increases every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied to sperm cryopreservation in livestock for decades, its application in biomedical model fishes is still at the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts of and needs for high-throughput technology for model fishes, proposes approaches for technical development, and surveys future directions for this approach. PMID:21440666

  19. Optimisation of insect cell growth in deep-well blocks: development of a high-throughput insect cell expression screen.

    PubMed

    Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian

    2005-01-01

    This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid-handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.

  20. TCP throughput adaptation in WiMax networks using replicator dynamics.

    PubMed

    Anastasopoulos, Markos P; Petraki, Dionysia K; Kannan, Rajgopal; Vasilakos, Athanasios V

    2010-06-01

    The high-frequency segment (10-66 GHz) of the IEEE 802.16 standard seems promising for the implementation of wireless backhaul networks carrying large volumes of Internet traffic. In contrast to wireline backbone networks, where channel errors seldom occur, the TCP protocol in IEEE 802.16 Worldwide Interoperability for Microwave Access networks is conditioned exclusively by wireless channel impairments rather than by congestion. This renders a cross-layer design approach between the transport and physical layers more appropriate during fading periods. In this paper, an adaptive coding and modulation (ACM) scheme for TCP throughput maximization is presented. In the current approach, Internet traffic is modulated and coded employing an adaptive scheme that is mathematically equivalent to the replicator dynamics model. The stability of the proposed ACM scheme is proven, and the dependence of the speed of convergence on various physical-layer parameters is investigated. It is also shown that convergence to the strategy that maximizes TCP throughput may be further accelerated by increasing the amount of information from the physical layer.
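    The replicator dynamics behind the ACM scheme can be illustrated with a discrete-time update in which each strategy's population share grows in proportion to its fitness advantage, fitness here being expected TCP throughput. A toy sketch with hypothetical per-strategy throughputs, not the paper's continuous-time model:

```python
def replicator_step(shares, fitness, dt=0.1):
    """One discrete replicator-dynamics update: strategies whose fitness
    exceeds the population average grow; the others shrink."""
    avg = sum(s * f for s, f in zip(shares, fitness))
    return [s + dt * s * (f - avg) for s, f in zip(shares, fitness)]

# Hypothetical expected throughputs (Mb/s) for three modulation/coding
# strategies under the current channel state.
fitness = [2.0, 5.0, 3.0]
shares = [1 / 3] * 3  # start with traffic split evenly across strategies
for _ in range(500):
    shares = replicator_step(shares, fitness)
```

The update conserves the total share (the correction term subtracts the average), and the population converges toward the strategy with the highest expected throughput, which is the fixed point the paper's stability analysis concerns.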

  1. Microscale screening systems for 3D cellular microenvironments: platforms, advances, and challenges

    PubMed Central

    Montanez-Sauri, Sara I.; Beebe, David J.; Sung, Kyung Eun

    2015-01-01

    The increasing interest in studying cells using more in vivo-like three-dimensional (3D) microenvironments has created a need for advanced 3D screening platforms with enhanced functionalities and increased throughput. 3D screening platforms that better mimic in vivo microenvironments with enhanced throughput would provide more in-depth understanding of the complexity and heterogeneity of microenvironments. The platforms would also better predict the toxicity and efficacy of potential drugs in physiologically relevant conditions. Traditional 3D culture models (e.g. spinner flasks, gyratory rotation devices, non-adhesive surfaces, polymers) were developed to create 3D multicellular structures. However, these traditional systems require large volumes of reagents and cells, and are not compatible with high throughput screening (HTS) systems. Microscale technology offers the miniaturization of 3D cultures and allows efficient screening of various conditions. This review will discuss the development, most influential works, and current advantages and challenges of microscale culture systems for screening cells in 3D microenvironments. PMID:25274061

  2. Double Sampling with Multiple Imputation to Answer Large Sample Meta-Research Questions: Introduction and Illustration by Evaluating Adherence to Two Simple CONSORT Guidelines

    PubMed Central

    Capers, Patrice L.; Brown, Andrew W.; Dawson, John A.; Allison, David B.

    2015-01-01

    Background: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. The creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. Objective: To evaluate the use of double sampling combined with multiple imputation (DS + MI) to address meta-research questions, using as an example the adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. Methods: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT, human, abstract available, and English language (n = 322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower-rigor, higher-throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher-rigor, lower-throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. Results: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title = 1.00, abstract = 0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS + MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050–1.174 vs. DS + MI 1.082–1.151). As evidence of improved accuracy, DS + MI coefficient estimates were closer to RHITLO than the large-sample RLOTHI estimates. Conclusion: Our results support our hypothesis that DS + MI results in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of literature. PMID:25988135
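    The double-sampling idea can be illustrated with a simple difference estimator standing in for the paper's multiple-imputation machinery: the cheap, high-throughput rate on the full corpus is corrected by the bias observed on the rigorously rated subsample. The numbers below are hypothetical:

```python
def double_sampling_estimate(cheap_full, cheap_sub, rigorous_sub):
    """Difference estimator: correct the high-throughput (cheap) compliance
    rate on the full corpus by the bias measured on the subsample that was
    also rated rigorously. A simpler stand-in for DS + MI."""
    bias = (sum(cheap_sub) / len(cheap_sub)
            - sum(rigorous_sub) / len(rigorous_sub))
    return sum(cheap_full) / len(cheap_full) - bias

# Hypothetical compliance flags (1 = compliant):
cheap_full   = [1] * 70 + [0] * 30  # search heuristic on all 100 entries
cheap_sub    = [1] * 7 + [0] * 3    # heuristic on the 10 subsampled entries
rigorous_sub = [1] * 6 + [0] * 4    # human rating of the same 10 entries
estimate = double_sampling_estimate(cheap_full, cheap_sub, rigorous_sub)
```

The full DS + MI approach goes further by imputing the missing rigorous ratings per entry, conditioned on covariates such as country and year, which yields the tighter confidence intervals reported above.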

  3. High-throughput measurements of biochemical responses using the plate::vision multimode 96 minilens array reader.

    PubMed

    Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich

    2006-01-01

    The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance is quite comparable with that of other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have greatly sped up our drug discovery process.

  4. High-Throughput Fabrication of Ultradense Annular Nanogap Arrays for Plasmon-Enhanced Spectroscopy.

    PubMed

    Cai, Hongbing; Meng, Qiushi; Zhao, Hui; Li, Mingling; Dai, Yanmeng; Lin, Yue; Ding, Huaiyi; Pan, Nan; Tian, Yangchao; Luo, Yi; Wang, Xiaoping

    2018-06-13

    The confinement of light into nanometer-sized metallic nanogaps can lead to extremely high field enhancement, resulting in dramatically enhanced absorption, emission, and surface-enhanced Raman scattering (SERS) of molecules embedded in the gaps. However, low-cost, high-throughput, and reliable fabrication of ultradense nanogap arrays with precise control of the gap size remains a challenge. Here, by combining colloidal lithography and atomic layer deposition, a reproducible method for fabricating ultradense arrays of hexagonally close-packed annular nanogaps over large areas is demonstrated. Annular nanogap arrays with a minimum diameter below 100 nm and sub-1 nm gap widths have been produced, showing excellent SERS performance with a typical enhancement factor of up to 3.1 × 10⁶ and a detection limit of 10⁻¹¹ M. Moreover, the arrays can also serve as a high-quality field-enhancement substrate for studying two-dimensional materials, such as MoSe₂. Our method provides an attractive approach to producing controllable nanogaps for enhanced light-matter interaction at the nanoscale.

  5. Using ALFA for high throughput, distributed data transmission in the ALICE O2 system

    NASA Astrophysics Data System (ADS)

    Wegrzynek, A.; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is a heavy-ion detector designed to study the physics of strongly interacting matter (the Quark-Gluon Plasma) at the CERN LHC (Large Hadron Collider). ALICE has been successfully collecting physics data in Run 2 since spring 2015. In parallel, preparations are being made for a major upgrade of the computing system, called O2 (Online-Offline), scheduled for the Long Shutdown 2 in 2019-2020. One of the major requirements of the system is the capacity to transport data between the so-called FLPs (First Level Processors), equipped with readout cards, and the EPNs (Event Processing Nodes), which perform data aggregation, frame building, and partial reconstruction. It is foreseen to have 268 FLPs dispatching data to 1500 EPNs with an average output of 20 Gb/s each. Overall, the O2 processing system will operate at terabits per second of throughput while handling millions of concurrent connections. The ALFA framework will standardize and handle software tasks such as readout, data transport, frame building, calibration, and online reconstruction in the upgraded computing system. ALFA supports two data transport libraries: ZeroMQ and nanomsg. This paper discusses the efficiency of ALFA in terms of high-throughput data transport. The tests were performed with multiple FLPs pushing data to multiple EPNs. The transfer was done using push-pull communication patterns and two socket configurations: bind and connect. The set of benchmarks was prepared to obtain the best results on each hardware setup. The paper presents the measurement process and final results: data throughput combined with computing resource usage as a function of block size. The high number of nodes and connections in the final setup may cause race conditions that can lead to uneven load balancing and poor scalability. The performed tests allow us to validate whether the traffic is distributed evenly over all receivers. 
They also measure the behaviour of the network under saturation and evaluate scalability from a 1-to-1 to an N-to-M solution.
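    The aggregate figures quoted above can be sanity-checked with simple arithmetic: 268 FLPs at an average of 20 Gb/s each do indeed put the system in the terabits-per-second regime.

```python
# Back-of-envelope check of the O2 throughput figures from the abstract.
n_flp, gbps_per_flp, n_epn = 268, 20, 1500

total_gbps = n_flp * gbps_per_flp      # aggregate FLP output in Gb/s
total_tbps = total_gbps / 1000         # -> "terabits per second"
per_epn_gbps = total_gbps / n_epn      # average fan-in per EPN if evenly spread
print(total_gbps, total_tbps, round(per_epn_gbps, 2))
```

    The per-EPN figure assumes perfectly even distribution, which is exactly what the paper's load-balancing tests set out to validate.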

  6. Using High-Throughput Sequencing to Leverage Surveillance of Genetic Diversity and Oseltamivir Resistance: A Pilot Study during the 2009 Influenza A(H1N1) Pandemic

    PubMed Central

    Téllez-Sosa, Juan; Rodríguez, Mario Henry; Gómez-Barreto, Rosa E.; Valdovinos-Torres, Humberto; Hidalgo, Ana Cecilia; Cruz-Hervert, Pablo; Luna, René Santos; Carrillo-Valenzo, Erik; Ramos, Celso; García-García, Lourdes; Martínez-Barnetche, Jesús

    2013-01-01

    Background Influenza viruses display a high mutation rate and complex evolutionary patterns. Next-generation sequencing (NGS) has been widely used for qualitative and semi-quantitative assessment of genetic diversity in complex biological samples. The “deep sequencing” approach, enabled by the enormous throughput of current NGS platforms, allows the identification of rare genetic viral variants in targeted genetic regions, but is usually limited to a small number of samples. Methodology and Principal Findings We designed a proof-of-principle study to test whether redistributing sequencing throughput from a high-depth, small-sample-number approach towards a low-depth, large-sample-number approach is feasible and contributes to influenza epidemiological surveillance. Using 454-Roche sequencing, we sequenced, at relatively low depth, a 307 bp amplicon of the neuraminidase (NA) gene of the Influenza A(H1N1) pandemic (A(H1N1)pdm) virus from cDNA amplicons pooled in 48 barcoded libraries, obtained from nasal swab samples of infected patients (n = 299) collected in Mexico during the May-November 2009 pandemic period. This approach revealed that during the transition from the first (May-July) to the second wave (September-November) of the pandemic, the initial genetic variants were replaced by the N248D mutation in the NA gene, and it enabled the establishment of temporal and geographic associations with genetic diversity and the identification of mutations associated with oseltamivir resistance. Conclusions NGS sequencing of a short amplicon from the NA gene at low sequencing depth allowed genetic screening of a large number of samples, providing insight into viral genetic diversity dynamics and identifying genetic variants associated with oseltamivir resistance. Further research is needed to explain the observed replacement of the genetic variants seen during the second wave. 
As sequencing throughput rises and library multiplexing and automation improve, we foresee that the approach presented here can be scaled up for global genetic surveillance of influenza and other infectious diseases. PMID:23843978
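    The depth-vs-breadth trade-off behind this design is simple division: a fixed run capacity split over more barcoded libraries leaves proportionally fewer reads per sample. The run size below is illustrative, not the study's actual 454 run statistics.

```python
# Illustrative reads-per-sample arithmetic for multiplexed sequencing:
# the same run capacity divided over 8, 48, or 299 barcoded samples.
run_reads = 1_000_000
for n_samples in (8, 48, 299):
    print(f"{n_samples:4d} samples -> {run_reads // n_samples:7d} reads each")
```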

  7. Discovering the desirable alleles contributing to the lignocellulosic biomass traits in Saccharum germplasm collections for energy cane improvement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianping; Sandhu, Hardev

    1) The success in crop improvement programs depends largely on the extent of genetic variability available. Germplasm collections assembles all the available genetic resources and are critical for long-term crop improvement. This world sugarcane germplasm collection contains enormous genetic variability for various morphological traits, biomass yield components, adaptation and many quality traits, prospectively imbeds a large number of valuable alleles for biofuel traits such as high biomass yield, quantity and quality of lignocelluloses, stress tolerance, and nutrient use efficiency. The germplasm collection is of little value unless it is characterized and utilized for crop improvement. In this project, we phenotypicallymore » and genotypically characterized the sugarcane world germplasm collection (The results were published in two papers already and another two papers are to be published). This data will be made available for public to refer to for germplasm unitization specifically in the sugarcane and energy cane breeding programs. In addition, we are identifying the alleles contributing to the biomass traits in sugarcane germplasm. This part of project is very challenging due to the large genome and highly polyploid level of this crop. We firstly established a high throughput sugarcane genotyping pipeline in the genome and bioinformatics era (a paper is published in 2016). We identified and modified a software for genome-wide association analysis of polyploid species. The results of the alleles associated to the biomass traits will be published soon, which will help the scientific community understand the genetic makeup of the biomass components of sugarcane. Molecular breeders can develop markers for marker assisted selection of biomass traits improvement. Further, the development and release of new energy cane cultivars through this project not only improved genetic diversity but also improved dry biomass yields and resistance to diseases. 
These new cultivars were tested on marginal soils in Florida and showed very promising yield potential that is important for the successful use of energy cane as a dedicated feedstock for lignocellulosic ethanol production. 2) Multiple techniques at different project progress stages were utilized. For example, for the whole world germplasm accession genotyping, a cheap widely used SSR marker genotyping platform was utilized due to the large number of samples (over thousand). But the throughput of this technique is low in generating data points. However, the purpose the genotyping is to form a core collection for further high throughput genotyping. Thus the results from the SSR genotyping was quite good enough to generated the core collection. To genotype the few hundred core collection accessions, an target enrichment sequencing technology was used, which is not only high throughput in generating large number of genotyping data, but also has the candidate genes targeted to genotyping. The data generated would be sufficient in identifying the alleles contributing to the traits of interests. All the techniques used in this project are effective though extensive time was invested specifically for establish the pipeline in the experimental design, data analysis, and different approach comparison. 3) the research can benefit to the public in polyploid genotyping and new and cost efficient genotyping platform development« less

  8. Throughput assurance of wireless body area networks coexistence based on stochastic geometry

    PubMed Central

    Wang, Yinglong; Shu, Minglei; Wu, Shangbin

    2017-01-01

    Wireless body area networks (WBANs) are expected to influence the traditional medical model by assisting caretakers with health telemonitoring. Within WBANs, the transmit power of the nodes should be as small as possible owing to their limited energy capacity but sufficiently large to guarantee the quality of the signal at the receiving nodes. When multiple WBANs coexist in a small area, communication reliability and overall throughput can be seriously affected by resource competition and interference. We show that the total network throughput largely depends on the WBAN distribution density (λp), the transmit power of the nodes (Pt), and the carrier-sensing threshold (γ). Using stochastic geometry, a joint carrier-sensing threshold and power control strategy is proposed to meet the demands of coexisting WBANs based on the IEEE 802.15.4 standard. Given different network distributions and carrier-sensing thresholds, the proposed strategy derives a minimum transmit power according to the varying surrounding environment. We obtain expressions for the transmission success probability and throughput under this strategy. Using numerical examples, we show that the joint carrier-sensing threshold and transmit power strategy can effectively improve the overall system throughput and reduce interference. Additionally, this paper studies the effects of a guard zone on the throughput using a Matern hard-core point process (HCPP) type II model. Theoretical analysis and simulation results show that the HCPP model can increase the success probability and throughput of networks. PMID:28141841
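    The Matern type II hard-core model mentioned above has a compact simulation recipe: thin a Poisson point process so that no two retained transmitters fall within a guard-zone distance of each other. The sketch below is a generic textbook construction on the unit square, not the paper's calibrated WBAN model; intensity and guard-zone values are arbitrary.

```python
import math
import random

random.seed(1)

def poisson(mean):
    # Knuth's multiplicative algorithm for one Poisson draw.
    L, k, p = math.exp(-mean), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= L:
            return k - 1

def matern_type2(lam, d_min):
    """Matern type II hard-core process on the unit square: start from a
    Poisson point process of intensity lam, give every point a random
    mark, and retain a point only if it carries the lowest mark among
    all points within the hard-core (guard-zone) distance d_min."""
    pts = [(random.random(), random.random(), random.random())
           for _ in range(poisson(lam))]
    kept = [(x, y) for x, y, m in pts
            if all(m < m2 or (x - x2) ** 2 + (y - y2) ** 2 > d_min ** 2
                   for x2, y2, m2 in pts if (x2, y2, m2) != (x, y, m))]
    return pts, kept

pts, kept = matern_type2(lam=50, d_min=0.1)
print(len(pts), "parents ->", len(kept), "retained")
```

    By construction, every pair of retained points is separated by more than the guard-zone distance, which is what suppresses mutual interference relative to the unthinned Poisson layout.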

  9. Enhancing high throughput toxicology - development of putative adverse outcome pathways linking US EPA ToxCast screening targets to relevant apical hazards.

    EPA Science Inventory

    High throughput toxicology programs, such as ToxCast and Tox21, have provided biological effects data for thousands of chemicals at multiple concentrations. Compared to traditional, whole-organism approaches, high throughput assays are rapid and cost-effective, yet they generall...

  10. Evaluation of High-Throughput Chemical Exposure Models via Analysis of Matched Environmental and Biological Media Measurements

    EPA Science Inventory

    The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...

  11. The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD

    NASA Astrophysics Data System (ADS)

    Cox, M. A.; Reed, R.; Mellado, B.

    2015-01-01

    After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data, such as spectral analysis and histograms, to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM systems-on-chip, but high data throughput is feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given, and the results of performance and throughput testing of four different ARM Cortex systems-on-chip are presented.

  12. [Current applications of high-throughput DNA sequencing technology in antibody drug research].

    PubMed

    Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong

    2012-03-01

    Since the 2005 publication of a high-throughput DNA sequencing technology based on PCR carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.
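    "Identification of potent antibodies based on sequence frequency," as mentioned above, reduces to counting: clones enriched across selection rounds dominate the read pool, so ranking sequences by count flags candidate binders. The CDR3-like sequences below are made up for illustration.

```python
from collections import Counter

# Hypothetical CDR3-like reads after a selection round: enriched clones
# appear at high frequency, so a count ranking surfaces candidate binders.
reads = (["CARDYYGSSYFDYW"] * 120      # strongly enriched clone
         + ["CARGGLLTGFDYW"] * 45      # moderately enriched clone
         + ["CARDPRGYSSGWFAYW"] * 3)   # background
freq = Counter(reads)
top_clone, top_count = freq.most_common(1)[0]
print(top_clone, top_count)
```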

  13. A high-throughput screen against pantothenate synthetase (PanC) identifies 3-biphenyl-4-cyanopyrrole-2-carboxylic acids as a new class of inhibitor with activity against Mycobacterium tuberculosis.

    PubMed

    Kumar, Anuradha; Casey, Allen; Odingo, Joshua; Kesicki, Edward A; Abrahams, Garth; Vieth, Michal; Masquelin, Thierry; Mizrahi, Valerie; Hipskind, Philip A; Sherman, David R; Parish, Tanya

    2013-01-01

    The enzyme pantothenate synthetase, PanC, is an attractive drug target in Mycobacterium tuberculosis. It is essential for the in vitro growth of M. tuberculosis and for survival of the bacteria in the mouse model of infection, and PanC is absent from mammals. We developed an enzyme-based assay to identify inhibitors of PanC, optimized it for high-throughput screening, and tested a large and diverse library of compounds for activity. Two compounds belonging to the same chemical class of 3-biphenyl-4-cyanopyrrole-2-carboxylic acids had activity against the purified recombinant protein and also inhibited growth of live M. tuberculosis in a manner consistent with PanC inhibition. Thus, we have identified a new class of PanC inhibitors with whole-cell activity that can be further developed.

  14. A spectrophotometric assay for fatty acid amide hydrolase suitable for high-throughput screening.

    PubMed

    De Bank, Paul A; Kendall, David A; Alexander, Stephen P H

    2005-04-15

    Signalling via the endocannabinoids anandamide and 2-arachidonylglycerol appears to be terminated largely through the action of the enzyme fatty acid amide hydrolase (FAAH). In this report, we describe a simple spectrophotometric assay to detect FAAH activity in vitro, using the ability of the enzyme to hydrolyze oleamide and measuring the resultant production of ammonia with a NADH/NAD+-coupled enzyme reaction. This dual-enzyme assay was used to determine Km and Vmax values of 104 microM and 5.7 nmol/min/mg protein, respectively, for rat liver FAAH-catalyzed oleamide hydrolysis. Inhibitor potency was determined, with the resultant rank order methyl arachidonyl fluorophosphonate > phenylmethylsulphonyl fluoride > anandamide. This assay system was also adapted for use in microtiter plates, and its ability to detect a known inhibitor of FAAH was demonstrated, highlighting its potential for use in high-throughput screening.
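    The Km and Vmax figures reported above plug directly into the Michaelis-Menten rate law, v = Vmax·[S]/(Km + [S]); the substrate concentrations below are illustrative, not assay conditions from the paper.

```python
# Michaelis-Menten rate using the reported rat-liver FAAH parameters
# (Km = 104 uM, Vmax = 5.7 nmol/min/mg protein).
def mm_rate(s_uM, km_uM=104.0, vmax=5.7):
    # v = Vmax * [S] / (Km + [S])
    return vmax * s_uM / (km_uM + s_uM)

half_max = mm_rate(104.0)       # at [S] = Km, v = Vmax / 2
print(round(half_max, 3))       # 2.85
```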

  15. Control structures for high speed processors

    NASA Technical Reports Server (NTRS)

    Maki, G. K.; Mankin, R.; Owsley, P. A.; Kim, G. M.

    1982-01-01

    A special processor was designed to function as a Reed-Solomon decoder with a throughput data rate in the MHz range. This data rate is significantly greater than is possible with conventional digital architectures. To achieve this rate, the processor design combines sequential, pipelined, distributed, and parallel processing. The processor was designed using a high-level register transfer language (RTL), which can be used to describe how the different processes are implemented by the hardware. One problem of special interest was the development of dependent processes, which are analogous to software subroutines. For greater flexibility, the RTL control structure was implemented in ROM. The special-purpose hardware required approximately 1000 SSI and MSI components. The data rate throughput is 2.5 megabits/second, achieved through the use of pipelined and distributed processing. This can be compared with 800 kilobits/second in a recently proposed very-large-scale-integration (VLSI) design of a Reed-Solomon encoder.

  16. Large-Scale Discovery of Induced Point Mutations With High-Throughput TILLING

    PubMed Central

    Till, Bradley J.; Reynolds, Steven H.; Greene, Elizabeth A.; Codomo, Christine A.; Enns, Linda C.; Johnson, Jessica E.; Burtner, Chris; Odden, Anthony R.; Young, Kim; Taylor, Nicholas E.; Henikoff, Jorja G.; Comai, Luca; Henikoff, Steven

    2003-01-01

    TILLING (Targeting Induced Local Lesions in Genomes) is a general reverse-genetic strategy that provides an allelic series of induced point mutations in genes of interest. High-throughput TILLING allows the rapid and low-cost discovery of induced point mutations in populations of chemically mutagenized individuals. As chemical mutagenesis is widely applicable and mutation detection for TILLING is dependent only on sufficient yield of PCR products, TILLING can be applied to most organisms. We have developed TILLING as a service to the Arabidopsis community known as the Arabidopsis TILLING Project (ATP). Our goal is to rapidly deliver allelic series of ethylmethanesulfonate-induced mutations in target 1-kb loci requested by the international research community. In the first year of public operation, ATP has discovered, sequenced, and delivered >1000 mutations in >100 genes ordered by Arabidopsis researchers. The tools and methodologies described here can be adapted to create similar facilities for other organisms. PMID:12618384

  17. High-Throughput Mapping of Single-Neuron Projections by Sequencing of Barcoded RNA.

    PubMed

    Kebschull, Justus M; Garcia da Silva, Pedro; Reid, Ashlan P; Peikon, Ian D; Albeanu, Dinu F; Zador, Anthony M

    2016-09-07

    Neurons transmit information to distant brain regions via long-range axonal projections. In the mouse, area-to-area connections have only been systematically mapped using bulk labeling techniques, which obscure the diverse projections of intermingled single neurons. Here we describe MAPseq (Multiplexed Analysis of Projections by Sequencing), a technique that can map the projections of thousands or even millions of single neurons by labeling large sets of neurons with random RNA sequences ("barcodes"). Axons are filled with barcode mRNA, each putative projection area is dissected, and the barcode mRNA is extracted and sequenced. Applying MAPseq to the locus coeruleus (LC), we find that individual LC neurons have preferred cortical targets. By recasting neuroanatomy, which is traditionally viewed as a problem of microscopy, as a problem of sequencing, MAPseq harnesses advances in sequencing technology to permit high-throughput interrogation of brain circuits. Copyright © 2016 Elsevier Inc. All rights reserved.
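    The core MAPseq data reduction described above is a counting exercise: each barcode marks one putative neuron, and tallying its mRNA copies per dissected area yields a barcode-by-area projection matrix. The reads below are a toy stand-in, with hypothetical barcodes and area names; real pipelines also handle sequencing errors and spike-in normalization.

```python
from collections import Counter, defaultdict

# Hypothetical (barcode, dissected_area) pairs recovered by sequencing.
reads = [
    ("AAGT", "cortex"), ("AAGT", "cortex"), ("AAGT", "striatum"),
    ("CCGA", "cortex"),
    ("GTTC", "striatum"), ("GTTC", "striatum"),
]

# Barcode x area count matrix: one row per putative neuron, one column
# per projection target, entries = barcode mRNA copies recovered there.
matrix = defaultdict(Counter)
for barcode, area in reads:
    matrix[barcode][area] += 1

print(dict(matrix["AAGT"]))   # {'cortex': 2, 'striatum': 1}
```

    Row-wise comparison of such a matrix is what reveals target preferences, e.g., barcode AAGT projecting more strongly to cortex than striatum.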

  18. Robust ridge regression estimators for nonlinear models with applications to high throughput screening assay data.

    PubMed

    Lim, Changwon

    2015-03-30

    Nonlinear regression is often used to evaluate the toxicity of a chemical or a drug by fitting data from a dose-response study. Toxicologists and pharmacologists may draw a conclusion about whether a chemical is toxic by testing the significance of the estimated parameters. However, sometimes the null hypothesis cannot be rejected even though the fit is quite good. One possible reason for such cases is that the estimated standard errors of the parameter estimates are extremely large. In this paper, we propose robust ridge regression estimation procedures for nonlinear models to solve this problem. The asymptotic properties of the proposed estimators are investigated; in particular, their mean squared errors are derived. The performances of the proposed estimators are compared with several standard estimators using simulation studies. The proposed methodology is also illustrated using high throughput screening assay data obtained from the National Toxicology Program. Copyright © 2014 John Wiley & Sons, Ltd.
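    The shrinkage mechanism behind the abstract's proposal can be shown in its simplest form. The sketch below is a one-predictor *linear* ridge fit through the origin, not the paper's robust nonlinear estimator: with a narrow dose range, sum(x²) is tiny and the unpenalized slope is unstable; the penalty λ enlarges the denominator and shrinks the estimate. All data are simulated.

```python
import random

random.seed(2)

# One-predictor ridge slope through the origin:
# beta = sum(x*y) / (sum(x*x) + lam); lam = 0 gives ordinary least squares.
def ridge_slope(xs, ys, lam):
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

# Hypothetical dose-response data over a narrow dose range, so sum(x*x)
# is tiny and the unpenalised slope estimate is highly variable.
xs = [0.01 * i for i in range(10)]
ys = [2.0 * x + random.gauss(0, 0.05) for x in xs]

ols = ridge_slope(xs, ys, 0.0)
shrunk = ridge_slope(xs, ys, 0.1)
print(round(ols, 3), round(shrunk, 3))
```

    The penalized estimate is always smaller in magnitude than the unpenalized one, trading a little bias for a large reduction in variance; this is the precision gain the paper quantifies via mean squared error.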

  19. Systems-Level Synthetic Biology for Advanced Biofuel Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large-scale production. New genetic tools and high-throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolism for enhanced biofuel production. Towards this goal, this project advances the development of a multiple-promoter-replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high-throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple-promoter-replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  20. Automated Phase Segmentation for Large-Scale X-ray Diffraction Data Using a Graph-Based Phase Segmentation (GPhase) Algorithm.

    PubMed

    Xiong, Zheng; He, Yinyan; Hattrick-Simpers, Jason R; Hu, Jianjun

    2017-03-13

    The creation of composition-processing-structure relationships currently represents a key bottleneck in data analysis for high-throughput experimental (HTE) material studies. Here we propose an automated phase diagram attribution algorithm for HTE data analysis that uses a graph-based segmentation algorithm and Delaunay tessellation to create a crystal phase diagram from high-throughput libraries of X-ray diffraction (XRD) patterns. We also propose sample-pair-based objective evaluation measures for the phase diagram prediction problem. Our approach was validated using 278 diffraction patterns from a Fe-Ga-Pd composition spread sample, with a prediction precision of 0.934 and a Matthews Correlation Coefficient of 0.823. The algorithm was then applied to the open Ni-Mn-Al thin-film composition spread sample to obtain the first predicted phase diagram mapping for that sample.
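    A minimal sketch of graph-based phase segmentation in the spirit described above (not the GPhase implementation, which also uses Delaunay tessellation over the composition spread): connect samples whose diffraction patterns are sufficiently similar, then read phase regions off the connected components via union-find. The toy "patterns" and threshold are invented for illustration.

```python
# Toy graph-based segmentation: cluster samples by XRD-pattern similarity.
def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den

def segment(patterns, threshold=0.95):
    # Union-find over the similarity graph; components = phase regions.
    parent = list(range(len(patterns)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i in range(len(patterns)):
        for j in range(i + 1, len(patterns)):
            if cosine(patterns[i], patterns[j]) >= threshold:
                parent[find(i)] = find(j)
    return [find(i) for i in range(len(patterns))]

# Toy diffraction "patterns": two obvious phase groups.
patterns = [[1, 0, 0.1], [0.9, 0.1, 0.1], [0, 1, 0.2], [0.1, 0.9, 0.2]]
labels = segment(patterns)
print(labels)
```

    Samples 0-1 and samples 2-3 land in separate components, i.e., two predicted phase regions; the sample-pair evaluation measures in the abstract would score such predictions against expert labels pair by pair.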
