Sample records for high throughput procedure

  1. An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery

    PubMed Central

    Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing

    2010-01-01

    The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897

  2. HIGH-THROUGHPUT IDENTIFICATION OF CATALYTIC REDOX-ACTIVE CYSTEINE RESIDUES

    EPA Science Inventory

    Cysteine (Cys) residues often play critical roles in proteins; however, identification of their specific functions has been limited to case-by-case experimental approaches. We developed a procedure for high-throughput identification of catalytic redox-active Cys in proteins by se...

  3. Improvement of High-throughput Genotype Analysis After Implementation of a Dual-curve Sybr Green I-based Quantification and Normalization Procedure

    USDA-ARS's Scientific Manuscript database

    The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...

  4. A high-throughput assay format for determination of nitrate reductase and nitrite reductase enzyme activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNally, N.; Liu, Xiang Yang; Choudary, P.V.

    1997-01-01

    The authors describe a microplate-based high-throughput procedure for rapid assay of the enzyme activities of nitrate reductase and nitrite reductase, using extremely small volumes of reagents. The new procedure offers the advantages of rapidity, small sample size (nanoliter volumes), low cost, and a dramatic increase in the number of samples that can be analyzed simultaneously. Additional advantages can be accessed by using microplate reader application software packages that permit assigning a group type to the wells, recording the data in exportable data files, and choosing between kinetic and endpoint reading modes. The assay can also be used independently for detecting nitrite residues/contamination in environmental/food samples. 10 refs., 2 figs.

  5. High-throughput assay of oxygen radical absorbance capacity (ORAC) using a multichannel liquid handling system coupled with a microplate fluorescence reader in 96-well format.

    PubMed

    Huang, Dejian; Ou, Boxin; Hampsch-Woodill, Maureen; Flanagan, Judith A; Prior, Ronald L

    2002-07-31

    The oxygen radical absorbance capacity (ORAC) assay has been widely accepted as a standard tool to measure the antioxidant activity in the nutraceutical, pharmaceutical, and food industries. However, the ORAC assay has been criticized for a lack of accessibility due to the unavailability of the COBAS FARA II analyzer, an instrument discontinued by the manufacturer. In addition, the manual sample preparation is time-consuming and labor-intensive. The objective of this study was to develop a high-throughput instrument platform that can fully automate the ORAC assay procedure. The new instrument platform consists of a robotic eight-channel liquid handling system and a microplate fluorescence reader. By using the high-throughput platform, the efficiency of the assay is improved with at least a 10-fold increase in sample throughput over the current procedure. The mean of intra- and interday CVs was
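    ORAC results are commonly quantified from the net area under the fluorescence decay curve (AUC), compared against a Trolox standard; the sketch below illustrates that calculation with made-up readings and is an assumption, since this record does not spell out the exact computation used by the authors.

      # Hedged sketch of the net-AUC calculation commonly used for ORAC values;
      # the readings and timing below are illustrative, not the study's data.
      import numpy as np

      def normalized_auc(times_min, fluorescence):
          """Trapezoidal AUC of the decay curve, normalized to the first reading."""
          f = np.asarray(fluorescence, dtype=float)
          return np.trapz(f / f[0], np.asarray(times_min, dtype=float))

      def net_auc(times_min, sample_curve, blank_curve):
          """Net AUC = AUC(sample) - AUC(blank); calibrated against a Trolox series."""
          return normalized_auc(times_min, sample_curve) - normalized_auc(times_min, blank_curve)

      t = np.arange(0, 60, 2)                 # readings every 2 min in a 96-well plate
      sample = 1000 * np.exp(-t / 40.0)       # slower decay: antioxidant present
      blank = 1000 * np.exp(-t / 12.0)        # faster decay: no antioxidant
      print(f"net AUC (arbitrary units): {net_auc(t, sample, blank):.1f}")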

  6. Carbohydrate Microarray Technology Applied to High-Throughput Mapping of Plant Cell Wall Glycans Using Comprehensive Microarray Polymer Profiling (CoMPP).

    PubMed

    Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho

    2017-01-01

    Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.

  7. Novel method for the high-throughput processing of slides for the comet assay

    PubMed Central

    Karbaschi, Mahsa; Cooke, Marcus S.

    2014-01-01

    Single cell gel electrophoresis (the comet assay) continues to gain popularity as a means of assessing DNA damage. However, the assay's low sample throughput and laborious sample workup procedure are limiting factors to its application. “Scoring”, or individually determining DNA damage levels in 50 cells per treatment, is time-consuming, but with the advent of high-throughput scoring, the limitation is now the ability to process significant numbers of comet slides. We have developed a novel method by which multiple slides may be manipulated and undergo electrophoresis in batches of 25 rather than individually; importantly, the method retains the use of standard microscope comet slides, which are the assay convention. This decreases assay time by 60% and benefits from an electrophoresis tank with a substantially smaller footprint and a more uniform orientation of gels during electrophoresis. Our high-throughput variant of the comet assay greatly increases the number of samples analysed and decreases assay time, the number of individual slide manipulations, reagent requirements and the risk of damage to slides. The compact nature of the electrophoresis tank is of particular benefit to laboratories where bench space is at a premium. This novel approach is a significant advance on the current comet assay procedure. PMID:25425241

  8. Novel method for the high-throughput processing of slides for the comet assay.

    PubMed

    Karbaschi, Mahsa; Cooke, Marcus S

    2014-11-26

    Single cell gel electrophoresis (the comet assay) continues to gain popularity as a means of assessing DNA damage. However, the assay's low sample throughput and laborious sample workup procedure are limiting factors to its application. "Scoring", or individually determining DNA damage levels in 50 cells per treatment, is time-consuming, but with the advent of high-throughput scoring, the limitation is now the ability to process significant numbers of comet slides. We have developed a novel method by which multiple slides may be manipulated and undergo electrophoresis in batches of 25 rather than individually; importantly, the method retains the use of standard microscope comet slides, which are the assay convention. This decreases assay time by 60% and benefits from an electrophoresis tank with a substantially smaller footprint and a more uniform orientation of gels during electrophoresis. Our high-throughput variant of the comet assay greatly increases the number of samples analysed and decreases assay time, the number of individual slide manipulations, reagent requirements and the risk of damage to slides. The compact nature of the electrophoresis tank is of particular benefit to laboratories where bench space is at a premium. This novel approach is a significant advance on the current comet assay procedure.

  9. A high performance hardware implementation image encryption with AES algorithm

    NASA Astrophysics Data System (ADS)

    Farmani, Ali; Jafari, Mohamad; Miremadi, Seyed Sohrab

    2011-06-01

    This paper describes the implementation of a high-speed, high-throughput algorithm for image encryption. We select the highly secure symmetric-key encryption algorithm AES (Advanced Encryption Standard) and increase its speed and throughput using a four-stage pipeline, a control unit based on logic gates, an optimized design of the multiplier blocks in the MixColumns phase, and simultaneous production of round keys. This makes AES suitable for fast image encryption. A 128-bit AES was implemented on an Altera FPGA, yielding a throughput of 6 Gbps at 471 MHz. The encryption time for a 32×32 test image is 1.15 ms.
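    A quick sanity check of the reported figures (my own arithmetic, not taken from the paper): 6 Gbps at 471 MHz corresponds to roughly 10 clock cycles per 128-bit block, which is consistent with the 10 rounds of AES-128.

      # Back-of-the-envelope throughput check; numbers are from the record above.
      block_bits = 128
      clock_hz = 471e6
      throughput_bps = 6e9

      bits_per_cycle = throughput_bps / clock_hz
      cycles_per_block = block_bits / bits_per_cycle
      print(f"{bits_per_cycle:.1f} bits/cycle, ~{cycles_per_block:.0f} cycles per 128-bit block")
      # A fully unrolled design retiring one block per cycle would instead reach
      # block_bits * clock_hz ~= 60 Gbps.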

  10. Functional Metagenomics: Construction and High-Throughput Screening of Fosmid Libraries for Discovery of Novel Carbohydrate-Active Enzymes.

    PubMed

    Ufarté, Lisa; Bozonnet, Sophie; Laville, Elisabeth; Cecchini, Davide A; Pizzut-Serin, Sandra; Jacquiod, Samuel; Demanèche, Sandrine; Simonet, Pascal; Franqueville, Laure; Veronese, Gabrielle Potocki

    2016-01-01

    Activity-based metagenomics is one of the most efficient approaches to boost the discovery of novel biocatalysts from the huge reservoir of uncultivated bacteria. In this chapter, we describe a highly generic procedure of metagenomic library construction and high-throughput screening for carbohydrate-active enzymes. Applicable to any bacterial ecosystem, it enables the swift identification of functional enzymes that are highly efficient, alone or acting in synergy, to break down polysaccharides and oligosaccharides.

  11. High Throughput Determination of Tetramine in Drinking ...

    EPA Pesticide Factsheets

    Report: The sampling and analytical procedure (SAP) presented herein describes a method for the high-throughput determination of tetramethylene disulfotetramine in drinking water by solid phase extraction and isotope dilution gas chromatography/mass spectrometry. This method, which will be included in the SAM, is expected to provide the Water Laboratory Alliance, as part of EPA’s Environmental Response Laboratory Network, with a more reliable and faster means of analyte collection and measurement.

  12. High-Throughput Screening of Therapeutic Neural Stimulation Targets: Toward Principles of Preventing and Treating Post-Traumatic Stress Disorder

    DTIC Science & Technology

    2009-09-01

    onset and averaged across all excited units tested (mean ± SE). SUPPLEMENTAL EXPERIMENTAL PROCEDURES: Virus design and production...to baseline level 355 ± 505 ms later. The level of post-light firing did not vary with repeated light exposure (p > 0.7, paired t-test comparing...High-Throughput Screening of Therapeutic Neural Stimulation Targets: Toward Principles of Preventing and Treating Post-Traumatic Stress Disorder

  13. Turbulent flow chromatography TFC-tandem mass spectrometry supporting in vitro/vivo studies of NCEs in high throughput fashion.

    PubMed

    Verdirame, Maria; Veneziano, Maria; Alfieri, Anna; Di Marco, Annalise; Monteagudo, Edith; Bonelli, Fabio

    2010-03-11

    Turbulent Flow Chromatography (TFC) is a powerful approach for on-line extraction in bioanalytical studies. It improves sensitivity and reduces sample preparation time, two factors that are of primary importance in drug discovery. In this paper the application of the ARIA system to the analytical support of in vivo pharmacokinetics (PK) and in vitro drug metabolism studies is described, with an emphasis on high-throughput optimization. For PK studies, a comparison between acetonitrile plasma protein precipitation (APPP) and TFC was carried out. Our optimized TFC methodology gave better S/N ratios and a lower limit of quantification (LOQ) than conventional procedures. A robust and high throughput analytical method to support hepatocyte metabolic stability screening of new chemical entities was developed by hyphenation of TFC with mass spectrometry. An in-loop dilution injection procedure was implemented to overcome one of the main issues when using TFC, namely the early elution of hydrophilic compounds, which results in low recoveries. A comparison between off-line solid phase extraction (SPE) and TFC was also carried out, and recovery, sensitivity (LOQ), matrix effect and robustness were evaluated. The use of two parallel columns in the configuration of the system provided a further increase in throughput. Copyright 2009 Elsevier B.V. All rights reserved.

  14. High throughput workflow for coacervate formation and characterization in shampoo systems.

    PubMed

    Kalantar, T H; Tucker, C J; Zalusky, A S; Boomgaard, T A; Wilson, B E; Ladika, M; Jordan, S L; Li, W K; Zhang, X; Goh, C G

    2007-01-01

    Cationic cellulosic polymers find wide utility as benefit agents in shampoo. Deposition of these polymers onto hair has been shown to mend split-ends, improve appearance and wet combing, as well as provide controlled delivery of insoluble actives. The deposition is thought to be enhanced by the formation of a polymer/surfactant complex that phase-separates from the bulk solution upon dilution. A standard method exists for characterizing coacervate formation upon dilution, but the test is prohibitive in both time and material. We have developed a semi-automated high throughput workflow to characterize the coacervate-forming behavior of different shampoo formulations. A procedure that allows testing of real use shampoo dilutions without first formulating a complete shampoo was identified. This procedure was adapted to a Tecan liquid handler by optimizing the parameters for liquid dispensing as well as for mixing. The high throughput workflow enabled preparation and testing of hundreds of formulations with different types and levels of cationic cellulosic polymers and surfactants, and for each formulation a haze diagram was constructed. Optimal formulations and their dilutions that give substantial coacervate formation (determined by haze measurements) were identified. Results from this high throughput workflow were shown to reproduce standard haze and bench-top turbidity measurements, and this workflow has the advantages of using less material and allowing more variables to be tested with significant time savings.

  15. Loeffler 4.0: Diagnostic Metagenomics.

    PubMed

    Höper, Dirk; Wylezich, Claudia; Beer, Martin

    2017-01-01

    A new world of possibilities for "virus discovery" was opened up with high-throughput sequencing becoming available in the last decade. Although metagenomic analysis was scientifically established before the era of high-throughput sequencing, the availability of the first second-generation sequencers was the starting point for diagnosticians to use sequencing for the detection of novel pathogens. Today, diagnostic metagenomics is becoming the standard procedure for the detection and genetic characterization of new viruses or novel virus variants. Here, we provide an overview of technical considerations of high-throughput sequencing-based diagnostic metagenomics together with selected examples of "virus discovery" for animal diseases or zoonoses and metagenomics for food safety or basic veterinary research. © 2017 Elsevier Inc. All rights reserved.

  16. Optimization of High-Throughput Sequencing Kinetics for determining enzymatic rate constants of thousands of RNA substrates

    PubMed Central

    Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.

    2016-01-01

    Quantification of the specificity of RNA binding proteins and RNA processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high throughput sequencing and internal competition kinetics to simultaneously monitor the rate constants for processing of thousands of substrates by RNA processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and these are the main sources of imprecision in the quantified results when otherwise optimized guidelines are followed. PMID:27296633
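    Internal competition kinetics rests on the relation that, for substrates competing in the same first-order reaction, the ratio of the natural logs of their fractions remaining equals the ratio of their rate constants. The snippet below is a minimal sketch of that relation, with made-up fractions; it is not the HTS-Kin analysis pipeline, in which the fractions would themselves be estimated from normalized read counts.

      import math

      def relative_k(f_substrate, f_reference):
          """k_substrate / k_reference from fractions remaining in the same reaction."""
          return math.log(f_substrate) / math.log(f_reference)

      # Example: reference substrate is 50% consumed, variant only 20% consumed
      print(relative_k(0.80, 0.50))   # ~0.32, i.e. the variant is processed ~3x more slowly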

  17. NCBI GEO: archive for high-throughput functional genomic data.

    PubMed

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron

    2009-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.
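    Records can also be retrieved programmatically from the GEO accession viewer. The sketch below is not an official NCBI client; the accession number and query parameters are illustrative assumptions, and network access is required.

      import urllib.request

      accession = "GSE1133"  # hypothetical example accession
      url = ("https://www.ncbi.nlm.nih.gov/geo/query/acc.cgi"
             f"?acc={accession}&targ=self&form=text&view=brief")
      with urllib.request.urlopen(url) as response:
          record = response.read().decode("utf-8", errors="replace")
      print("\n".join(record.splitlines()[:10]))  # first lines of the text summary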

  18. Alignment of high-throughput sequencing data inside in-memory databases.

    PubMed

    Firnkorn, Daniel; Knaup-Gregori, Petra; Lorenzo Bermejo, Justo; Ganzinger, Matthias

    2014-01-01

    In the era of high-throughput DNA sequencing techniques, performant analysis of DNA sequences is of great importance. Computer-supported DNA analysis is still a time-intensive task. In this paper we explore the potential of a new In-Memory database technology by using SAP's High Performance Analytic Appliance (HANA). We focus on read alignment as one of the first steps in DNA sequence analysis. In particular, we examined the widely used Burrows-Wheeler Aligner (BWA) and implemented stored procedures in both HANA and the free database system MySQL, to compare execution time and memory management. To ensure that the results are comparable, MySQL was run in memory as well, utilizing its integrated memory engine for database table creation. We implemented stored procedures containing exact and inexact searching of DNA reads within the reference genome GRCh37. Due to technical restrictions in SAP HANA concerning recursion, the inexact matching problem could not be implemented on this platform. Hence, performance analysis between HANA and MySQL was made by comparing the execution time of the exact search procedures. Here, HANA was approximately 27 times faster than MySQL, which means that there is high potential in the new In-Memory concepts, leading to further developments of DNA analysis procedures in the future.
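    For illustration, the exact-matching task compared above can be stated as: report every position where a read occurs verbatim in the reference. The sketch below is a naive Python scan over toy data, not the BWA/FM-index approach or the SQL stored procedures used in the study.

      def exact_matches(reference: str, read: str) -> list[int]:
          """Return all 0-based positions where `read` occurs exactly in `reference`."""
          positions = []
          start = reference.find(read)
          while start != -1:
              positions.append(start)
              start = reference.find(read, start + 1)
          return positions

      reference = "ACGTACGTTACGATCGACGTACGT"   # stand-in for a GRCh37 chromosome
      print(exact_matches(reference, "ACGTACGT"))  # -> [0, 16]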

  19. Scalable software-defined optical networking with high-performance routing and wavelength assignment algorithms.

    PubMed

    Lee, Chankyun; Cao, Xiaoyuan; Yoshikane, Noboru; Tsuritani, Takehiro; Rhee, June-Koo Kevin

    2015-10-19

    The feasibility of software-defined optical networking (SDON) for a practical application critically depends on the scalability of centralized control performance. In this paper, highly scalable routing and wavelength assignment (RWA) algorithms are investigated on an OpenFlow-based SDON testbed for a proof-of-concept demonstration. Efficient RWA algorithms are proposed to achieve high network capacity at reduced computation cost, which is a significant attribute in a scalable centralized-control SDON. The proposed heuristic RWA algorithms differ in the order in which requests are processed and in the routing table update procedures. Combined with a shortest-path-based routing algorithm, a hottest-request-first processing policy that considers demand intensity and end-to-end distance information offers both the highest network throughput and acceptable computation scalability. We further investigate the trade-off between network throughput and the computational complexity of the routing table update procedure in a simulation study.
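    To make the heuristic concrete, the sketch below implements a greedy RWA in the same spirit: requests are sorted hottest-first (by demand intensity, then path length), routed over shortest paths, and assigned the lowest-index wavelength free on every hop. The topology, requests and tie-breaking rule are illustrative assumptions, not the authors' exact algorithm.

      from collections import deque

      def shortest_path(graph, src, dst):
          prev, seen, q = {}, {src}, deque([src])
          while q:
              u = q.popleft()
              if u == dst:
                  break
              for v in graph[u]:
                  if v not in seen:
                      seen.add(v)
                      prev[v] = u
                      q.append(v)
          if dst not in seen:
              return None
          path, node = [dst], dst
          while node != src:
              node = prev[node]
              path.append(node)
          return path[::-1]

      def greedy_rwa(graph, requests, n_wavelengths):
          """requests: list of (src, dst, intensity) -> {(src, dst): (path, wavelength)}."""
          free = {}  # undirected link -> set of free wavelength indices
          for u in graph:
              for v in graph[u]:
                  free.setdefault(tuple(sorted((u, v))), set(range(n_wavelengths)))
          assignment = {}
          # "Hottest first": highest intensity, then longest shortest path
          for src, dst, intensity in sorted(
                  requests,
                  key=lambda r: (-r[2], -len(shortest_path(graph, r[0], r[1]) or []))):
              path = shortest_path(graph, src, dst)
              if path is None:
                  continue
              links = [tuple(sorted((path[i], path[i + 1]))) for i in range(len(path) - 1)]
              common = set.intersection(*(free[l] for l in links))
              if common:  # first-fit: lowest wavelength free on every hop
                  wl = min(common)
                  for l in links:
                      free[l].discard(wl)
                  assignment[(src, dst)] = (path, wl)
          return assignment

      topology = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
      print(greedy_rwa(topology, [("A", "D", 3), ("B", "D", 1)], n_wavelengths=2))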

  20. Handheld Fluorescence Microscopy based Flow Analyzer.

    PubMed

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive and time-consuming procedure. The article outlines a cost-effective, high throughput alternative to conventional fluorescence imaging techniques. With system level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC labeled fluorescent beads per minute. This demonstrates the high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
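    One common way to "computationally deblur" motion-blurred flow images is frequency-domain (Wiener) deconvolution with a linear motion kernel. The sketch below is a minimal illustration on synthetic data; the kernel length and noise parameter are assumptions, not the authors' reconstruction procedure.

      import numpy as np

      def motion_kernel(length, shape):
          """Horizontal box kernel of `length` pixels, zero-padded to `shape`."""
          k = np.zeros(shape)
          k[0, :length] = 1.0 / length
          return k

      def wiener_deblur(image, kernel, noise_ratio=0.01):
          H = np.fft.fft2(kernel)
          G = np.fft.fft2(image)
          # Wiener filter: conj(H) / (|H|^2 + K)
          F_hat = np.conj(H) * G / (np.abs(H) ** 2 + noise_ratio)
          return np.real(np.fft.ifft2(F_hat))

      rng = np.random.default_rng(0)
      sharp = rng.random((64, 64))
      kernel = motion_kernel(9, sharp.shape)
      blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(kernel)))
      restored = wiener_deblur(blurred, kernel)
      print("RMS error, blurred: ", np.sqrt(np.mean((blurred - sharp) ** 2)))
      print("RMS error, restored:", np.sqrt(np.mean((restored - sharp) ** 2)))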

  1. Gas pressure assisted microliquid-liquid extraction coupled online to direct infusion mass spectrometry: a new automated screening platform for bioanalysis.

    PubMed

    Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas

    2014-10-21

    In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-Liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R(2) ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots showing excellent linearity (R(2) of 0.998) and precision (RSD of 9%). In conclusion, we present the proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated microLLE method, coupled online to a commercially available nano-ESI-DI-MS.

  2. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
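    The core measurement Spotsizer automates, thresholding an image of arrayed colonies and reporting the area of each spot, can be illustrated with a toy example. The sketch below uses a synthetic image and a simple global threshold; it is not Spotsizer's algorithm.

      import numpy as np
      from scipy import ndimage

      # Synthetic plate: dark background with two bright "colonies"
      img = np.zeros((100, 100))
      yy, xx = np.mgrid[:100, :100]
      img[(yy - 30) ** 2 + (xx - 30) ** 2 < 8 ** 2] = 1.0
      img[(yy - 70) ** 2 + (xx - 60) ** 2 < 12 ** 2] = 1.0

      mask = img > 0.5                        # simple global threshold
      labels, n = ndimage.label(mask)         # connected-component labelling
      areas = ndimage.sum(mask, labels, index=list(range(1, n + 1)))
      print(n, "colonies, areas (px):", areas)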

  3. Evaluation of high throughput gene expression platforms using a genomic biomarker signature for prediction of skin sensitization.

    PubMed

    Forreryd, Andy; Johansson, Henrik; Albrekt, Ann-Sofie; Lindstedt, Malin

    2014-05-16

    Allergic contact dermatitis (ACD) develops upon exposure to certain chemical compounds termed skin sensitizers. To reduce the occurrence of skin sensitizers, chemicals are regularly screened for their capacity to induce sensitization. The recently developed Genomic Allergen Rapid Detection (GARD) assay is an in vitro alternative to animal testing for identification of skin sensitizers, classifying chemicals by evaluating transcriptional levels of a genomic biomarker signature. During assay development and biomarker identification, genome-wide expression analysis was applied using microarrays covering approximately 30,000 transcripts. However, the microarray platform suffers from drawbacks in terms of low sample throughput, high cost per sample and time-consuming protocols, and is a limiting factor for adaptation of GARD into a routine assay for screening of potential sensitizers. With the purpose of simplifying assay procedures, improving technical parameters and increasing sample throughput, we assessed the performance of three high throughput gene expression platforms--nCounter®, BioMark HD™ and OpenArray®--and correlated their performance metrics against our previously generated microarray data. We measured the levels of 30 transcripts from the GARD biomarker signature across 48 samples. Detection sensitivity, reproducibility, correlations and overall structure of gene expression measurements were compared across platforms. Gene expression data from all of the evaluated platforms could be used to separate most of the sensitizers from non-sensitizers in the GARD assay. Results also showed high data quality and acceptable reproducibility for all platforms, but only medium to poor correlations of expression measurements across platforms. In addition, the evaluated platforms were superior to the microarray platform in terms of cost efficiency, simplicity of protocols and sample throughput. We evaluated the performance of three non-array based platforms using a limited set of transcripts from the GARD biomarker signature. We demonstrated that it was possible to achieve acceptable discriminatory power in terms of separation between sensitizers and non-sensitizers in the GARD assay while reducing assay costs, simplifying assay procedures and increasing sample throughput by using an alternative platform, providing a first step towards the goal of preparing GARD for formal validation and adaptation of the assay for industrial screening of potential sensitizers.

  4. “Drinking in the Dark” (DID) Procedures: A Model of Binge-Like Ethanol Drinking in Non-Dependent Mice

    PubMed Central

    Thiele, Todd E.; Navarro, Montserrat

    2013-01-01

    This review provides an overview of an animal model of binge-like ethanol drinking that has come to be called “drinking in the dark” (DID), a procedure that promotes high levels of ethanol drinking and pharmacologically relevant blood ethanol concentrations (BECs) in ethanol-preferring strains of mice. Originally described by Rhodes et al. (2005), the most common variation of the DID procedure, using singly housed mice, involves replacing the water bottle with a bottle containing 20% ethanol for 2 to 4 hours, beginning 3 hours into the dark cycle. Using this procedure, high ethanol drinking strains of mice (e.g., C57BL/6J) typically consume enough ethanol to achieve BECs greater than 100 mg/dL and to exhibit behavioral evidence of intoxication. This limited access procedure takes advantage of the time in the animal’s dark cycle in which the levels of ingestive behaviors are high, yet high ethanol intake does not appear to stem from caloric need. Mice have the choice of drinking or avoiding the ethanol solution, eliminating the stressful conditions that are inherent in other models of binge-like ethanol exposure in which ethanol is administered by the experimenter, and in some cases, potentially painful. The DID procedure is a high throughput approach that does not require extensive training or the inclusion of sweet compounds to motivate high levels of ethanol intake. The high throughput nature of the DID procedure makes it useful for rapid screening of pharmacological targets that are protective against binge-like drinking and for identifying strains of mice that exhibit binge-like drinking behavior. Additionally, the simplicity of DID procedures allows for easy integration into other paradigms, such as prenatal ethanol exposure and adolescent ethanol drinking. It is suggested that the DID model is a useful tool for studying the neurobiology and genetics underlying binge-like ethanol drinking, and may be useful for studying the transition to ethanol dependence. PMID:24275142

  5. BIOREL: the benchmark resource to estimate the relevance of the gene networks.

    PubMed

    Antonov, Alexey V; Mewes, Hans W

    2006-02-06

    The progress of high-throughput methodologies in functional genomics has led to the development of statistical procedures to infer gene networks from various types of high-throughput data. However, due to the lack of common standards, the biological significance of the results of the different studies is hard to compare. To overcome this problem we propose a benchmark procedure and have developed a web resource (BIOREL), which is useful for estimating the biological relevance of any genetic network by integrating different sources of biological information. The associations of each gene from the network are classified as biologically relevant or not. The proportion of genes in the network classified as "relevant" is used as the overall network relevance score. Employing synthetic data, we demonstrated that such a score ranks the networks fairly with respect to the relevance level. Using BIOREL as the benchmark resource, we compared the quality of experimental and theoretically predicted protein interaction data.
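    The relevance score defined above is simply the fraction of network genes whose associations are classified as biologically relevant; a minimal illustration with hypothetical data:

      # Hypothetical per-gene classifications for one network
      classified = {"geneA": True, "geneB": False, "geneC": True, "geneD": True}
      relevance_score = sum(classified.values()) / len(classified)
      print(f"network relevance score = {relevance_score:.2f}")  # 0.75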

  6. A low cost and high throughput magnetic bead-based immuno-agglutination assay in confined droplets.

    PubMed

    Teste, Bruno; Ali-Cherif, Anaïs; Viovy, Jean Louis; Malaquin, Laurent

    2013-06-21

    Although passive immuno-agglutination assays consist of one step and simple procedures, they are usually not adapted for high throughput analyses and they require expensive and bulky equipment for quantitation steps. Here we demonstrate a low cost, multimodal and high throughput immuno-agglutination assay that relies on a combination of magnetic beads (MBs), droplets microfluidics and magnetic tweezers. Antibody coated MBs were used as a capture support in the homogeneous phase. Following the immune interaction, water in oil droplets containing MBs and analytes were generated and transported in Teflon tubing. When passing in between magnetic tweezers, the MBs contained in the droplets were magnetically confined in order to enhance the agglutination rate and kinetics. When releasing the magnetic field, the internal recirculation flows in the droplet induce shear forces that favor MBs redispersion. In the presence of the analyte, the system preserves specific interactions and MBs stay in the aggregated state while in the case of a non-specific analyte, redispersion of particles occurs. The analyte quantitation procedure relies on the MBs redispersion rate within the droplet. The influence of different parameters such as magnetic field intensity, flow rate and MBs concentration on the agglutination performances have been investigated and optimized. Although the immuno-agglutination assay described in this work may not compete with enzyme linked immunosorbent assay (ELISA) in terms of sensitivity, it offers major advantages regarding the reagents consumption (analysis is performed in sub microliter droplet) and the platform cost that yields to very cheap analyses. Moreover the fully automated analysis procedure provides reproducible analyses with throughput well above those of existing technologies. We demonstrated the detection of biotinylated phosphatase alkaline in 100 nL sample volumes with an analysis rate of 300 assays per hour and a limit of detection of 100 pM.

  7. Quantifying the Onset and Progression of Plant Senescence by Color Image Analysis for High Throughput Applications

    PubMed Central

    Cai, Jinhai; Okamoto, Mamoru; Atieno, Judith; Sutton, Tim; Li, Yongle; Miklavcic, Stanley J.

    2016-01-01

    Leaf senescence, an indicator of plant age and ill health, is an important phenotypic trait for the assessment of a plant’s response to stress. Manual inspection of senescence, however, is time consuming, inaccurate and subjective. In this paper we propose an objective evaluation of plant senescence by color image analysis for use in a high throughput plant phenotyping pipeline. As high throughput phenotyping platforms are designed to capture whole-of-plant features, camera lenses and camera settings are inappropriate for the capture of fine detail. Specifically, plant colors in images may not represent true plant colors, leading to errors in senescence estimation. Our algorithm features a color distortion correction and image restoration step prior to a senescence analysis. We apply our algorithm to two time series of images of wheat and chickpea plants to quantify the onset and progression of senescence. We compare our results with senescence scores resulting from manual inspection. We demonstrate that our procedure is able to process images in an automated way for an accurate estimation of plant senescence even from color distorted and blurred images obtained under high throughput conditions. PMID:27348807
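    As a simplified illustration of scoring senescence from color, one can classify each plant pixel as green (healthy) or yellow/brown (senescent) and report the senescent fraction. The HSV threshold and data below are illustrative assumptions; the published pipeline additionally corrects color distortion and blur before any such scoring.

      import colorsys
      import numpy as np

      def senescence_fraction(rgb_image, plant_mask):
          """rgb_image: float array (H, W, 3) in [0, 1]; plant_mask: boolean (H, W)."""
          pixels = rgb_image[plant_mask]
          senescent = 0
          for r, g, b in pixels:
              h, _, _ = colorsys.rgb_to_hsv(r, g, b)
              if h < 70 / 360:   # hues below ~70 deg: yellow/brown rather than green
                  senescent += 1
          return senescent / max(len(pixels), 1)

      # Tiny synthetic example: half green, half yellowish pixels
      img = np.zeros((1, 4, 3))
      img[0, :2] = [0.2, 0.6, 0.1]   # green
      img[0, 2:] = [0.7, 0.6, 0.1]   # yellowish
      mask = np.ones((1, 4), dtype=bool)
      print(senescence_fraction(img, mask))   # 0.5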

  8. Robot-Based High-Throughput Engineering of Alcoholic Polymer: Fullerene Nanoparticle Inks for an Eco-Friendly Processing of Organic Solar Cells.

    PubMed

    Xie, Chen; Tang, Xiaofeng; Berlinghof, Marvin; Langner, Stefan; Chen, Shi; Späth, Andreas; Li, Ning; Fink, Rainer H; Unruh, Tobias; Brabec, Christoph J

    2018-06-27

    Development of high-quality organic nanoparticle inks is a significant scientific challenge for the industrial production of solution-processed organic photovoltaics (OPVs) with eco-friendly processing methods. In this work, we demonstrate a novel, robot-based, high-throughput procedure performing automatic poly(3-hexylthio-phene-2,5-diyl) and indene-C 60 bisadduct nanoparticle ink synthesis in nontoxic alcohols. A novel methodology to prepare particle dispersions for fully functional OPVs by manipulating the particle size and solvent system was studied in detail. The ethanol dispersion with a particle diameter of around 80-100 nm exhibits reduced degradation, yielding a power conversion efficiency of 4.52%, which is the highest performance reported so far for water/alcohol-processed OPV devices. By successfully deploying the high-throughput robot-based approach for an organic nanoparticle ink preparation, we believe that the findings demonstrated in this work will trigger more research interest and effort on eco-friendly industrial production of OPVs.

  9. High-throughput and selective solid-phase extraction of urinary catecholamines by crown ether-modified resin composite fiber.

    PubMed

    Chen, LiQin; Wang, Hui; Xu, Zhen; Zhang, QiuYue; Liu, Jia; Shen, Jun; Zhang, WanQi

    2018-08-03

    In the present study, we developed a simple and high-throughput solid phase extraction (SPE) procedure for selective extraction of catecholamines (CAs) in urine samples. The SPE adsorbents were electrospun composite fibers functionalized with 4-carboxybenzo-18-crown-6 ether modified XAD resin and polystyrene, which were packed into 96-well columns and used for high-throughput selective extraction of CAs in healthy human urine samples. Moreover, the extraction efficiency of packed-fiber SPE (PFSPE) was examined by high performance liquid chromatography coupled with fluorescence detector. The parameters affecting the extraction efficiency and impurity removal efficiency were optimized, and good linearity ranging from 0.5 to 400 ng/mL was obtained with a low limit of detection (LOD, 0.2-0.5 ng/mL) and a good repeatability (2.7%-3.7%, n = 6). The extraction recoveries of three CAs ranged from 70.5% to 119.5%. Furthermore, stable and reliable results obtained by the fluorescence detector were superior to those obtained by the electrochemical detector. Collectively, PFSPE coupled with 96-well columns was a simple, rapid, selective, high-throughput and cost-efficient method, and the proposed method could be applied in clinical chemistry. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    PubMed

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

    Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides a simpler, high-throughput classification of cells as pluripotent or nonpluripotent in a 7 min analysis, while being more cost-effective than conventional genomic tests.

  11. Mapping the miRNA interactome by crosslinking ligation and sequencing of hybrids (CLASH)

    PubMed Central

    Helwak, Aleksandra; Tollervey, David

    2014-01-01

    RNA-RNA interactions play critical roles in many cellular processes but studying them is difficult and laborious. Here, we describe an experimental procedure, termed crosslinking ligation and sequencing of hybrids (CLASH), which allows high-throughput identification of sites of RNA-RNA interaction. During CLASH, a tagged bait protein is UV crosslinked in vivo to stabilise RNA interactions and purified under denaturing conditions. RNAs associated with the bait protein are partially truncated, and the ends of RNA-duplexes are ligated together. Following linker addition, cDNA library preparation and high-throughput sequencing, the ligated duplexes give rise to chimeric cDNAs, which unambiguously identify RNA-RNA interaction sites independent of bioinformatic predictions. This protocol is optimized for studying miRNA targets bound by Argonaute proteins, but should be easily adapted for other RNA-binding proteins and classes of RNA. The protocol requires around 5 days to complete, excluding the time required for high-throughput sequencing and bioinformatic analyses. PMID:24577361

  12. A high-throughput solid-phase extraction microchip combined with inductively coupled plasma-mass spectrometry for rapid determination of trace heavy metals in natural water.

    PubMed

    Shih, Tsung-Ting; Hsieh, Cheng-Chuan; Luo, Yu-Ting; Su, Yi-An; Chen, Ping-Hung; Chuang, Yu-Chen; Sun, Yuh-Chang

    2016-04-15

    Herein, a hyphenated system combining a high-throughput solid-phase extraction (htSPE) microchip with inductively coupled plasma-mass spectrometry (ICP-MS) for rapid determination of trace heavy metals was developed. Rather than performing multiple analyses in parallel for the enhancement of analytical throughput, we improved the processing speed for individual samples by increasing the operation flow rate during SPE procedures. To this end, an innovative device combining a micromixer and a multi-channeled extraction unit was designed. Furthermore, a programmable valve manifold was used to interface the developed microchip and ICP-MS instrumentation in order to fully automate the system, leading to a dramatic reduction in operation time and human error. Under the optimized operation conditions for the established system, detection limits of 1.64-42.54 ng L(-1) for the analyte ions were achieved. Validation procedures demonstrated that the developed method could be satisfactorily applied to the determination of trace heavy metals in natural water. Each analysis could be readily accomplished within just 186 s using the established system. This represents, to the best of our knowledge, an unprecedented speed for the analysis of trace heavy metal ions. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. RNA isolation from mammalian cells using porous polymer monoliths: an approach for high-throughput automation.

    PubMed

    Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F

    2010-06-01

    The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits is sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.

  14. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    PubMed Central

    2011-01-01

    The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments. PMID:22136293

  15. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format.

    PubMed

    Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E

    2011-12-02

    The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments.

  16. Benchmarking Procedures for High-Throughput Context Specific Reconstruction Algorithms

    PubMed Central

    Pacheco, Maria P.; Pfau, Thomas; Sauter, Thomas

    2016-01-01

    Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate collected information. Context specific reconstruction based on generic genome scale models like ReconX or HMR has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context specific reconstruction algorithms were published in the last 10 years, only a fraction of them is suitable for model building based on human high-throughput data. Besides other reasons, this might be due to problems arising from the limitation to only one metabolic target function or arbitrary thresholding. This review describes and analyses common validation methods used for testing model building algorithms. Two major methods can be distinguished: consistency testing and comparison based testing. The first is concerned with robustness against noise, e.g., missing data due to the impossibility of distinguishing between the signal and the background of non-specific binding of probes in a microarray experiment, and whether distinct sets of input expressed genes corresponding to, e.g., different tissues yield distinct models. The latter covers methods comparing sets of functionalities, comparison with existing networks or additional databases. We test those methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms. PMID:26834640
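    One concrete form such a consistency test can take is comparing the reaction sets of models reconstructed from distinct input gene sets (which should differ) and from noisy replicates of the same input (which should stay similar). The Jaccard similarity used below is an assumed, illustrative metric with made-up reaction identifiers, not a score prescribed by the review.

      def jaccard(reactions_a: set, reactions_b: set) -> float:
          return len(reactions_a & reactions_b) / len(reactions_a | reactions_b)

      liver_model = {"R_ALA", "R_GLC", "R_URE"}
      brain_model = {"R_ALA", "R_GLC", "R_GLU"}
      liver_replicate = {"R_ALA", "R_GLC", "R_URE", "R_FAO"}

      print("liver vs brain:          ", jaccard(liver_model, brain_model))       # 0.50
      print("liver vs noisy replicate:", jaccard(liver_model, liver_replicate))   # 0.75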

  17. High-throughput biomonitoring of dioxins and polychlorinated biphenyls at the sub-picogram level in human serum.

    PubMed

    Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin

    2006-10-13

    We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. The sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting, once the procedure has been started and series of samples keep being produced. Four analysts are required to ensure proper performances of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested over more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency and long-term stability regarding the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.

  18. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately but throughput is still a major issue and automation is very essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome-aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.

  19. A bioinformatics roadmap for the human vaccines project.

    PubMed

    Scheuermann, Richard H; Sinkovits, Robert S; Schenkelberg, Theodore; Koff, Wayne C

    2017-06-01

    Biomedical research has become a data intensive science in which high throughput experimentation is producing comprehensive data about biological systems at an ever-increasing pace. The Human Vaccines Project is a new public-private partnership, with the goal of accelerating development of improved vaccines and immunotherapies for global infectious diseases and cancers by decoding the human immune system. To achieve its mission, the Project is developing a Bioinformatics Hub as an open-source, multidisciplinary effort with the overarching goal of providing an enabling infrastructure to support the data processing, analysis and knowledge extraction procedures required to translate high throughput, high complexity human immunology research data into biomedical knowledge, to determine the core principles driving specific and durable protective immune responses.

  20. Application of a Permethrin Immunosorbent Assay Method to Residential Soil and Dust Samples

    EPA Science Inventory

    A low-cost, high throughput bioanalytical screening method was developed for monitoring cis/trans-permethrin in dust and soil samples. The method consisted of a simple sample preparation procedure [sonication with dichloromethane followed by a solvent exchange into methanol:wate...

  1. ADAPTING THE MEDAKA EMBRYO ASSAY TO A HIGH-THROUGHPUT APPROACH FOR DEVELOPMENTAL TOXICITY TESTING.

    EPA Science Inventory

    Chemical exposure during embryonic development may cause persistent effects, yet developmental toxicity data exist for very few chemicals. Current testing procedures are time consuming and costly, underlining the need for rapid and low cost screening strategies. While in vitro ...

  2. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    PubMed

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive, imaging-based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high-throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high-throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high-throughput-amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high-throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high-throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis leads to a substantial extension of the feature spectrum to be assessed in the presented high-throughput automated plant phenotyping platforms, thus enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high-throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m height) and architectures. A deeper understanding of the relations among plant architecture, biomass formation and photosynthetic efficiency has great potential with respect to crop and yield improvement strategies.
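    As an aside for readers new to these parameters, the sketch below shows how the two PSII efficiencies named above are conventionally computed from kinetic chlorophyll fluorescence readings, using the standard definitions ΦPSII = (Fm' - Fs)/Fm' and Fv/Fm = (Fm - Fo)/Fm; the function names and example values are illustrative and are not taken from the paper.

```python
# Illustrative sketch (not from the paper): computing the PSII operating
# efficiency (Phi_PSII) and the maximum PSII efficiency (Fv/Fm) from kinetic
# chlorophyll fluorescence readings, using the standard definitions
# Phi_PSII = (Fm' - Fs) / Fm'  and  Fv/Fm = (Fm - Fo) / Fm.

def max_psii_efficiency(fo: float, fm: float) -> float:
    """Maximum PSII efficiency (Fv/Fm) from a dark-adapted measurement."""
    return (fm - fo) / fm

def psii_operating_efficiency(fs: float, fm_prime: float) -> float:
    """PSII operating efficiency (Phi_PSII) from a light-adapted measurement."""
    return (fm_prime - fs) / fm_prime

if __name__ == "__main__":
    # Hypothetical fluorescence values for one plant
    print(round(max_psii_efficiency(fo=400.0, fm=2000.0), 3))              # 0.8
    print(round(psii_operating_efficiency(fs=900.0, fm_prime=1500.0), 3))  # 0.4
```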

  3. High-rate dead-time corrections in a general purpose digital pulse processing system

    PubMed Central

    Abbene, Leonardo; Gerardi, Gaetano

    2015-01-01

    Dead-time losses are well recognized and studied drawbacks in counting and spectroscopic systems. In this work, the dead-time correction capabilities of a real-time digital pulse processing (DPP) system for high-rate, high-resolution radiation measurements are presented. The DPP system, through a fast and slow analysis of the output waveform from radiation detectors, is able to perform multi-parameter analysis (arrival time, pulse width, pulse height, pulse shape, etc.) at high input counting rates (ICRs), allowing accurate counting loss corrections even for variable or transient radiations. The fast analysis is used to obtain both the ICR and energy spectra with high throughput, while the slow analysis is used to obtain high-resolution energy spectra. A complete characterization of the counting capabilities, through both theoretical and experimental approaches, was performed. The dead-time modeling, the throughput curves, the experimental time-interval distributions (TIDs) and the counting uncertainty of the recorded events of both the fast and the slow channels, measured with a planar CdTe (cadmium telluride) detector, are presented. The throughput formula for a series combination of two types of dead-time is also derived. The results of dead-time corrections, performed through different methods, are reported and discussed, pointing out the error in ICR estimation and the simplicity of the procedure. Accurate ICR estimations (nonlinearity < 0.5%) were performed by using the time widths and the TIDs (using a 10 ns time bin width) of the detected pulses up to 2.2 Mcps. The digital system allows, after a simple parameter setting, different and sophisticated procedures for dead-time correction, traditionally implemented in complex/dedicated systems and time-consuming set-ups. PMID:26289270
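    For context, the sketch below shows the two classical dead-time models (non-paralyzable and paralyzable) that underlie corrections of this kind, inverting the measured output rate to recover the input counting rate; it is a generic illustration, not the specific correction procedures or the series dead-time formula derived in the paper.

```python
# Illustrative sketch of the two classical dead-time models used in counting systems
# (not the specific corrections implemented in the paper).
# Non-paralyzable: m = n / (1 + n*tau)  ->  n = m / (1 - m*tau)
# Paralyzable:     m = n * exp(-n*tau)  ->  n recovered numerically (low-rate branch)
import math

def icr_nonparalyzable(measured_rate: float, dead_time: float) -> float:
    """Input counting rate from the measured rate for a non-paralyzable dead-time."""
    return measured_rate / (1.0 - measured_rate * dead_time)

def icr_paralyzable(measured_rate: float, dead_time: float, iters: int = 50) -> float:
    """Invert m = n*exp(-n*tau) by fixed-point iteration."""
    n = measured_rate
    for _ in range(iters):
        n = measured_rate * math.exp(n * dead_time)
    return n

# Example: 1.0 Mcps measured with a hypothetical 100 ns dead-time
print(icr_nonparalyzable(1.0e6, 100e-9))  # ~1.11e6 cps
print(icr_paralyzable(1.0e6, 100e-9))     # ~1.12e6 cps
```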

  4. Effort versus reward: preparing samples for fungal community characterization in high-throughput sequencing surveys of soils

    USDA-ARS?s Scientific Manuscript database

    Next generation fungal amplicon sequencing is being used with increasing frequency to study fungal diversity in various ecosystems; however, the influence of sample preparation on the characterization of fungal community is poorly understood. We investigated the effects of four procedural modificati...

  5. Performance comparison of SNP detection tools with illumina exome sequencing data—an assessment using both family pedigree information and sample-matched SNP array data

    PubMed Central

    Yi, Ming; Zhao, Yongmei; Jia, Li; He, Mei; Kebebew, Electron; Stephens, Robert M.

    2014-01-01

    To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We have used an Illumina exome-seq dataset as a benchmark, with two validation scenarios—family pedigree information and SNP array data for the same samples, permitting global high-throughput cross-validation, to evaluate the quality of SNP calls derived from several popular variant discovery tools from both the open-source and commercial communities using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data, which allows us to gain insights into the accuracy of SNP calling through such high-throughput validation in an unprecedented way, whereas the previously reported comparison studies have only assessed concordance of these tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools with a focus on exome-seq, which can be used to compare any forthcoming tool(s) of interest. PMID:24831545
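    As a minimal illustration of the pedigree-based validation idea, the sketch below checks whether a child's genotype at a SNP is consistent with Mendelian inheritance from the parents; the genotype encoding and function name are assumptions chosen for illustration, not the authors' pipeline.

```python
# Minimal sketch (assumed encoding, not the authors' pipeline) of the Mendelian
# consistency check that underlies pedigree-based validation of SNP calls.
# Genotypes are unordered allele pairs, e.g. ("A", "G").
from itertools import product

def mendelian_consistent(child, father, mother) -> bool:
    """True if the child's genotype can be formed from one paternal and one maternal allele."""
    possible = {frozenset((p, m)) if p != m else frozenset((p,))
                for p, m in product(father, mother)}
    c = frozenset(child) if child[0] != child[1] else frozenset((child[0],))
    return c in possible

print(mendelian_consistent(("A", "G"), ("A", "A"), ("G", "G")))  # True
print(mendelian_consistent(("G", "G"), ("A", "A"), ("A", "G")))  # False (Mendelian error)
```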

  6. High throughput and miniaturised systems for biodegradability assessments.

    PubMed

    Cregut, Mickael; Jouanneau, Sulivan; Brillet, François; Durand, Marie-José; Sweetlove, Cyril; Chenèble, Jean-Charles; L'Haridon, Jacques; Thouand, Gérald

    2014-01-01

    Society demands safer products with a better ecological profile. Regulatory criteria have been developed to prevent risks for human health and the environment, for example, within the framework of the European regulation REACH (Regulation (EC) No 1907, 2006). This has driven industry to consider the development of high throughput screening methodologies for assessing chemical biodegradability. These new screening methodologies must be scalable for miniaturisation, reproducible and as reliable as existing procedures for enhanced biodegradability assessment. Here, we evaluate two alternative systems that can be scaled for high throughput screening and conveniently miniaturised to limit costs in comparison with traditional testing. These systems are based on two dyes: an invasive fluorescent dye that serves as a cellular activity marker (a resazurin-like dye reagent) and a noninvasive fluorescent oxygen optosensor dye (an optical sensor). The advantages and limitations of these platforms for biodegradability assessment are presented. Our results confirm the feasibility of these systems for evaluating and screening chemicals for ready biodegradability. The optosensor is a miniaturised version of a component already used in traditional ready biodegradability testing, whereas the resazurin dye offers an interesting new screening mechanism for chemical concentrations greater than 10 mg/l that are not amenable to traditional closed bottle tests. The use of these approaches allows generalisation of high throughput screening methodologies to meet the need for developing new compounds with a favourable ecological profile and for assessment for regulatory purposes.

  7. Evaluation of High Density Air Traffic Operations with Automation for Separation Assurance, Weather Avoidance and Schedule Conformance

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Mercer, Joey S.; Martin, Lynne Hazel; Homola, Jeffrey R.; Cabrall, Christopher D.; Brasil, Connie L.

    2011-01-01

    In this paper we discuss the development and evaluation of our prototype technologies and procedures for far-term air traffic control operations with automation for separation assurance, weather avoidance and schedule conformance. Controller-in-the-loop simulations in the Airspace Operations Laboratory at the NASA Ames Research Center in 2010 have shown very promising results. We found the operations to provide high airspace throughput, excellent efficiency and schedule conformance. The simulation also highlighted areas for improvements: Short-term conflict situations sometimes resulted in separation violations, particularly for transitioning aircraft in complex traffic flows. The combination of heavy metering and growing weather resulted in an increased number of aircraft penetrating convective weather cells. To address these shortcomings technologies and procedures have been improved and the operations are being re-evaluated with the same scenarios. In this paper we will first describe the concept and technologies for automating separation assurance, weather avoidance, and schedule conformance. Second, the results from the 2010 simulation will be reviewed. We report human-systems integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. Next, improvements will be discussed that were made to address identified shortcomings. We conclude that, with further refinements, air traffic control operations with ground-based automated separation assurance can routinely provide currently unachievable levels of traffic throughput in the en route airspace.

  8. Essential attributes identified in the design of a Laboratory Information Management System for a high throughput siRNA screening laboratory.

    PubMed

    Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey

    2011-11-01

    In recent years high throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning, an effective Laboratory Information Management System (LIMS) can be developed and implemented in order to streamline all phases of a workflow. Just as important as data mining and analysis procedures at the end of complex processes is the tracking of individual steps of applications that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example is provided to highlight the important aspects of the design of a LIMS to effectively modulate all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling and interaction with investigators, statisticians and administrators. All these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers

  9. Combining high-throughput sequencing with fruit body surveys reveals contrasting life-history strategies in fungi

    PubMed Central

    Ovaskainen, Otso; Schigel, Dmitry; Ali-Kovero, Heini; Auvinen, Petri; Paulin, Lars; Nordén, Björn; Nordén, Jenni

    2013-01-01

    Before the recent revolution in molecular biology, field studies on fungal communities were mostly confined to fruit bodies, whereas mycelial interactions were studied in the laboratory. Here we combine high-throughput sequencing with a fruit body inventory to study simultaneously mycelial and fruit body occurrences in a community of fungi inhabiting dead wood of Norway spruce. We studied mycelial occurrence by extracting DNA from wood samples followed by 454-sequencing of the ITS1 and ITS2 regions and an automated procedure for species identification. In total, we detected 198 species as mycelia and 137 species as fruit bodies. The correlation between mycelial and fruit body occurrences was high for the majority of the species, suggesting that high-throughput sequencing can successfully characterize the dominating fungal communities, despite possible biases related to sampling, PCR, sequencing and molecular identification. We used the fruit body and molecular data to test hypothesized links between life history and population dynamic parameters. We show that the species that have on average a high mycelial abundance also have a high fruiting rate and produce large fruit bodies, leading to a positive feedback loop in their population dynamics. Earlier studies have shown that species with specialized resource requirements are rarely seen fruiting, for which reason they are often classified as red-listed. We show with the help of high-throughput sequencing that some of these species are more abundant as mycelium in wood than what could be expected from their occurrence as fruit bodies. PMID:23575372

  10. High-Throughput Sequencing of 16S rRNA Gene Amplicons: Effects of Extraction Procedure, Primer Length and Annealing Temperature

    PubMed Central

    Sergeant, Martin J.; Constantinidou, Chrystala; Cogan, Tristan; Penn, Charles W.; Pallen, Mark J.

    2012-01-01

    The analysis of 16S-rDNA sequences to assess the bacterial community composition of a sample is a widely used technique that has increased with the advent of high throughput sequencing. Although considerable effort has been devoted to identifying the most informative region of the 16S gene and the optimal informatics procedures to process the data, little attention has been paid to the PCR step, in particular annealing temperature and primer length. To address this, amplicons derived from 16S-rDNA were generated from chicken caecal content DNA using different annealing temperatures, primers and different DNA extraction procedures. The amplicons were pyrosequenced to determine the optimal protocols for capture of maximum bacterial diversity from a chicken caecal sample. Even at very low annealing temperatures there was little effect on the community structure, although the abundance of some OTUs such as Bifidobacterium increased. Using shorter primers did not reveal any novel OTUs but did change the community profile obtained. Mechanical disruption of the sample by bead beating had a significant effect on the results obtained, as did repeated freezing and thawing. In conclusion, existing primers and standard annealing temperatures captured as much diversity as lower annealing temperatures and shorter primers. PMID:22666455

  11. Effort versus Reward: Preparing Samples for Fungal Community Characterization in High-Throughput Sequencing Surveys of Soils

    PubMed Central

    Song, Zewei; Schlatter, Dan; Kennedy, Peter; Kinkel, Linda L.; Kistler, H. Corby; Nguyen, Nhu; Bates, Scott T.

    2015-01-01

    Next generation fungal amplicon sequencing is being used with increasing frequency to study fungal diversity in various ecosystems; however, the influence of sample preparation on the characterization of fungal community is poorly understood. We investigated the effects of four procedural modifications to library preparation for high-throughput sequencing (HTS). The following treatments were considered: 1) the amount of soil used in DNA extraction, 2) the inclusion of additional steps (freeze/thaw cycles, sonication, or hot water bath incubation) in the extraction procedure, 3) the amount of DNA template used in PCR, and 4) the effect of sample pooling, either physically or computationally. Soils from two different ecosystems in Minnesota, USA, one prairie and one forest site, were used to assess the generality of our results. The first three treatments did not significantly influence observed fungal OTU richness or community structure at either site. Physical pooling captured more OTU richness compared to individual samples, but total OTU richness at each site was highest when individual samples were computationally combined. We conclude that standard extraction kit protocols are well optimized for fungal HTS surveys, but because sample pooling can significantly influence OTU richness estimates, it is important to carefully consider the study aims when planning sampling procedures. PMID:25974078

  12. High-throughput sequencing of 16S rRNA gene amplicons: effects of extraction procedure, primer length and annealing temperature.

    PubMed

    Sergeant, Martin J; Constantinidou, Chrystala; Cogan, Tristan; Penn, Charles W; Pallen, Mark J

    2012-01-01

    The analysis of 16S-rDNA sequences to assess the bacterial community composition of a sample is a widely used technique that has increased with the advent of high throughput sequencing. Although considerable effort has been devoted to identifying the most informative region of the 16S gene and the optimal informatics procedures to process the data, little attention has been paid to the PCR step, in particular annealing temperature and primer length. To address this, amplicons derived from 16S-rDNA were generated from chicken caecal content DNA using different annealing temperatures, primers and different DNA extraction procedures. The amplicons were pyrosequenced to determine the optimal protocols for capture of maximum bacterial diversity from a chicken caecal sample. Even at very low annealing temperatures there was little effect on the community structure, although the abundance of some OTUs such as Bifidobacterium increased. Using shorter primers did not reveal any novel OTUs but did change the community profile obtained. Mechanical disruption of the sample by bead beating had a significant effect on the results obtained, as did repeated freezing and thawing. In conclusion, existing primers and standard annealing temperatures captured as much diversity as lower annealing temperatures and shorter primers.

  13. Determination of Microbial Extracellular Enzyme Activity in Waters, Soils, and Sediments using High Throughput Microplate Assays

    PubMed Central

    Jackson, Colin R.; Tyler, Heather L.; Millar, Justin J.

    2013-01-01

    Much of the nutrient cycling and carbon processing in natural environments occurs through the activity of extracellular enzymes released by microorganisms. Thus, measurement of the activity of these extracellular enzymes can give insights into the rates of ecosystem level processes, such as organic matter decomposition or nitrogen and phosphorus mineralization. Assays of extracellular enzyme activity in environmental samples typically involve exposing the samples to artificial colorimetric or fluorometric substrates and tracking the rate of substrate hydrolysis. Here we describe microplate based methods for these procedures that allow the analysis of large numbers of samples within a short time frame. Samples are allowed to react with artificial substrates within 96-well microplates or deep well microplate blocks, and enzyme activity is subsequently determined by absorption or fluorescence of the resulting end product using a typical microplate reader or fluorometer. Such high throughput procedures not only facilitate comparisons between spatially separate sites or ecosystems, but also substantially reduce the cost of such assays by reducing overall reagent volumes needed per sample. PMID:24121617
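    To make the rate calculation concrete, the sketch below converts hypothetical microplate fluorescence readings into an enzyme activity (product formed per hour per gram of sample) using a linear standard curve; all parameter values are illustrative and the exact normalization (per gram, per millilitre, per gram dry mass) depends on the sample type.

```python
# Illustrative calculation (hypothetical values) of extracellular enzyme activity from
# microplate fluorescence readings: fluorescence is converted to product amount with a
# linear standard curve, and activity is expressed per hour and per gram of sample.

def enzyme_activity(fluor_final, fluor_initial, hours, slope, intercept, sample_mass_g):
    """Activity in nmol of substrate hydrolysed per hour per gram of sample."""
    nmol_initial = (fluor_initial - intercept) / slope  # standard curve: fluorescence = slope*nmol + intercept
    nmol_final = (fluor_final - intercept) / slope
    return (nmol_final - nmol_initial) / hours / sample_mass_g

# Hypothetical well: readings at time zero and after a 4 h incubation
print(round(enzyme_activity(fluor_final=5200.0, fluor_initial=800.0, hours=4.0,
                            slope=40.0, intercept=0.0, sample_mass_g=0.5), 1))  # 55.0
```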

  14. Determination of microbial extracellular enzyme activity in waters, soils, and sediments using high throughput microplate assays.

    PubMed

    Jackson, Colin R; Tyler, Heather L; Millar, Justin J

    2013-10-01

    Much of the nutrient cycling and carbon processing in natural environments occurs through the activity of extracellular enzymes released by microorganisms. Thus, measurement of the activity of these extracellular enzymes can give insights into the rates of ecosystem level processes, such as organic matter decomposition or nitrogen and phosphorus mineralization. Assays of extracellular enzyme activity in environmental samples typically involve exposing the samples to artificial colorimetric or fluorometric substrates and tracking the rate of substrate hydrolysis. Here we describe microplate based methods for these procedures that allow the analysis of large numbers of samples within a short time frame. Samples are allowed to react with artificial substrates within 96-well microplates or deep well microplate blocks, and enzyme activity is subsequently determined by absorption or fluorescence of the resulting end product using a typical microplate reader or fluorometer. Such high throughput procedures not only facilitate comparisons between spatially separate sites or ecosystems, but also substantially reduce the cost of such assays by reducing overall reagent volumes needed per sample.

  15. Advantages and application of label-free detection assays in drug screening.

    PubMed

    Cunningham, Brian T; Laing, Lance G

    2008-08-01

    Adoption is accelerating for a new family of label-free optical biosensors incorporated into standard format microplates owing to their ability to enable highly sensitive detection of small molecules, proteins and cells for high-throughput drug discovery applications. Label-free approaches are displacing other detection technologies owing to their ability to provide simple assay procedures for hit finding/validation, accessing difficult target classes, screening the interaction of cells with drugs and analyzing the affinity of small molecule inhibitors to target proteins. This review describes several new drug discovery applications that are under development for microplate-based photonic crystal optical biosensors and the key issues that will drive adoption of the technology. Microplate-based optical biosensors are enabling a variety of cell-based assays, inhibition assays, protein-protein binding assays and protein-small molecule binding assays to be performed with high-throughput and high sensitivity.

  16. High-throughput accurate-wavelength lens-based visible spectrometer.

    PubMed

    Bell, Ronald E; Scotti, Filippo

    2010-10-01

    A scanning visible spectrometer has been prototyped to complement fixed-wavelength transmission grating spectrometers for charge exchange recombination spectroscopy. Fast f/1.8 200 mm commercial lenses are used with a large 2160 mm(-1) grating for high throughput. A stepping-motor controlled sine drive positions the grating, which is mounted on a precision rotary table. A high-resolution optical encoder on the grating stage allows the grating angle to be measured with an absolute accuracy of 0.075 arcsec, corresponding to a wavelength error ≤0.005 Å. At this precision, changes in grating groove density due to thermal expansion and variations in the refractive index of air are important. An automated calibration procedure determines all the relevant spectrometer parameters to high accuracy. Changes in bulk grating temperature, atmospheric temperature, and pressure are monitored between the time of calibration and the time of measurement to ensure a persistent wavelength calibration.
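    For reference, the conversion from grating geometry to wavelength that such a calibration ultimately rests on is the plane-grating equation m*lambda = d(sin alpha + sin beta); the sketch below evaluates it for an illustrative near-Littrow configuration with the 2160 mm(-1) groove density mentioned above (the angles are hypothetical, not the instrument's calibration).

```python
# Minimal sketch (not the instrument's calibration code) of the plane-grating equation
# m*lambda = d*(sin(alpha) + sin(beta)), converting a grating angle into wavelength;
# the groove density matches the abstract, the angles are illustrative.
import math

def wavelength_angstrom(alpha_deg, beta_deg, grooves_per_mm=2160.0, order=1):
    """Diffracted wavelength in Angstrom for incidence angle alpha and diffraction angle beta."""
    d_angstrom = 1.0e7 / grooves_per_mm  # groove spacing in Angstrom (1 mm = 1e7 A)
    return d_angstrom * (math.sin(math.radians(alpha_deg)) +
                         math.sin(math.radians(beta_deg))) / order

# Near-Littrow example at ~45 degrees with a 2160 mm^-1 grating
print(round(wavelength_angstrom(45.0, 45.0), 1))  # ~6547 A (visible red)
```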

  17. Identification and Correction of Additive and Multiplicative Spatial Biases in Experimental High-Throughput Screening.

    PubMed

    Mazoure, Bogdan; Caraus, Iurie; Nadon, Robert; Makarenkov, Vladimir

    2018-06-01

    Data generated by high-throughput screening (HTS) technologies are prone to spatial bias. Traditionally, bias correction methods used in HTS assume either a simple additive or, more recently, a simple multiplicative spatial bias model. These models do not, however, always provide an accurate correction of measurements in wells located at the intersection of rows and columns affected by spatial bias. The measurements in these wells depend on the nature of interaction between the involved biases. Here, we propose two novel additive and two novel multiplicative spatial bias models accounting for different types of bias interactions. We describe a statistical procedure that allows for detecting and removing different types of additive and multiplicative spatial biases from multiwell plates. We show how this procedure can be applied by analyzing data generated by the four HTS technologies (homogeneous, microorganism, cell-based, and gene expression HTS), the three high-content screening (HCS) technologies (area, intensity, and cell-count HCS), and the only small-molecule microarray technology available in the ChemBank small-molecule screening database. The proposed methods are included in the AssayCorrector program, implemented in R, and available on CRAN.
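    For orientation, the sketch below removes a simple additive row/column bias from a simulated plate by median polish, i.e. the traditional additive model that the abstract contrasts with its new models; it is a generic illustration and is not the AssayCorrector implementation or the interaction-aware bias models proposed in the paper.

```python
# Generic sketch of removing a simple *additive* row/column spatial bias from a
# multiwell plate via median polish, illustrating the traditional model
# x_ij = mu + r_i + c_j + e_ij. Not the AssayCorrector algorithm.
import numpy as np

def median_polish_residuals(plate: np.ndarray, n_iter: int = 10) -> np.ndarray:
    """Return plate measurements with estimated row and column effects removed."""
    resid = plate.astype(float).copy()
    for _ in range(n_iter):
        resid -= np.median(resid, axis=1, keepdims=True)  # remove row effects
        resid -= np.median(resid, axis=0, keepdims=True)  # remove column effects
    return resid + np.median(plate)  # add back the overall plate level

rng = np.random.default_rng(0)
plate = 100 + rng.normal(0, 5, size=(8, 12))   # simulated 96-well plate
plate[2, :] += 30                              # simulate a biased row
corrected = median_polish_residuals(plate)
print(plate[2, :].mean().round(1), corrected[2, :].mean().round(1))  # bias removed
```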

  18. Establishment of a Bioenergy-Focused Microalgae Strain Collection Using Rapid, High-Throughput Methodologies: Cooperative Research and Development Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pienkos, Philip T.

    2013-11-01

    This project is part of the overall effort by and among NREL, Colorado State University, University of Colorado, and Colorado School of Mines known as the Colorado Center for Biorefining and Biofuels. This is part of a larger statewide effort provided for in House Bill 06-1322, establishing a Colorado Collaboratory that envisions these four institutions working together as part of the state's energy plan. This individual project with Colorado School of Mines is the first of many envisioned in this overall effort. The project focuses on development of high throughput procedures aimed at rapidly isolating and purifying novel microalgal strains (specifically green algae and diatoms) from water samples obtained from unique aquatic environments.

  19. Diffraction efficiency of radially-profiled off-plane reflection gratings

    NASA Astrophysics Data System (ADS)

    Miles, Drew M.; Tutt, James H.; DeRoo, Casey T.; Marlowe, Hannah; Peterson, Thomas J.; McEntaffer, Randall L.; Menz, Benedikt; Burwitz, Vadim; Hartner, Gisela; Laubis, Christian; Scholze, Frank

    2015-09-01

    Future X-ray missions will require gratings with high throughput and high spectral resolution. Blazed off-plane reflection gratings are capable of meeting these demands. A blazed grating profile optimizes grating efficiency, providing higher throughput to one side of zero-order on the arc of diffraction. This paper presents efficiency measurements made in the 0.3-1.5 keV energy band at the Physikalisch-Technische Bundesanstalt (PTB) BESSY II facility for three holographically-ruled gratings, two of which are blazed. Each blazed grating was tested in both the Littrow and anti-Littrow configurations in order to test the alignment sensitivity of these gratings with regard to throughput. This paper outlines the procedure of the grating experiment performed at BESSY II and discusses the resulting efficiency measurements across various energies. Experimental results are generally consistent with theory and demonstrate that the blaze does increase throughput to one side of zero-order. However, the total efficiency of the non-blazed, sinusoidal grating is greater than that of the blazed gratings, which suggests that the method of manufacturing these blazed profiles fails to produce facets with the desired level of precision. Finally, evidence of a successful blaze implementation from the first diffraction results of prototype blazed gratings produced via a new fabrication technique at the University of Iowa is presented.

  20. Anesthesiology and gastroenterology.

    PubMed

    de Villiers, Willem J S

    2009-03-01

    A successful population-based colorectal cancer screening requires efficient colonoscopy practices that incorporate high throughput, safety, and patient satisfaction. There are several different modalities of nonanesthesiologist-administered sedation currently available and in development that may fulfill these requirements. Modern-day gastroenterology endoscopic procedures are complex and demand the full attention of the attending gastroenterologist and the complete cooperation of the patient. Many of these procedures will also require the anesthesiologist's knowledge, skills, abilities, and experience to ensure optimal procedure results and good patient outcomes. The goal of this review is (1) to provide a gastroenterology perspective on the use of propofol in gastroenterology endoscopic practice, and (2) to describe newer GI endoscopy procedures that gastroenterologists perform that might involve anesthesiologists.

  1. Fluorescent and Lanthanide Labeling for Ligand Screens, Assays, and Imaging

    PubMed Central

    Josan, Jatinder S.; De Silva, Channa R.; Yoo, Byunghee; Lynch, Ronald M.; Pagel, Mark D.; Vagner, Josef; Hruby, Victor J.

    2012-01-01

    The use of fluorescent (or luminescent) and metal contrast agents in high-throughput screens, in vitro assays, and molecular imaging procedures has rapidly expanded in recent years. Here we describe the development and utility of high-affinity ligands for cancer theranostics and other in vitro screening studies. In this context, we also illustrate the syntheses and use of heteromultivalent ligands as targeted imaging agents. PMID:21318902

  2. Sensitive high-throughput screening for the detection of reducing sugars.

    PubMed

    Mellitzer, Andrea; Glieder, Anton; Weis, Roland; Reisinger, Christoph; Flicker, Karlheinz

    2012-01-01

    The exploitation of renewable resources for the production of biofuels relies on efficient processes for the enzymatic hydrolysis of lignocellulosic materials. The development of enzymes and strains for these processes requires reliable and fast activity-based screening assays. Additionally, these assays are also required to operate on the microscale and on the high-throughput level. Herein, we report the development of a highly sensitive reducing-sugar assay in a 96-well microplate screening format. The assay is based on the formation of osazones from reducing sugars and para-hydroxybenzoic acid hydrazide. By using this sensitive assay, the enzyme loads and conversion times during lignocellulose hydrolysis can be reduced, thus allowing higher throughput. The assay is about five times more sensitive than the widely applied dinitrosalicylic acid based assay and can reliably detect reducing sugars down to 10 μM. The assay-specific variation over one microplate was determined for three different lignocellulolytic enzymes and ranges from 2 to 8%. Furthermore, the assay was combined with a microscale cultivation procedure for the activity-based screening of Pichia pastoris strains expressing functional Thermomyces lanuginosus xylanase A, Trichoderma reesei β-mannanase, or T. reesei cellobiohydrolase 2. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. High-throughput Cloning and Expression of Integral Membrane Proteins in Escherichia coli

    PubMed Central

    Bruni, Renato

    2014-01-01

    Recently, several structural genomics centers have been established and a remarkable number of three-dimensional structures of soluble proteins have been solved. For membrane proteins, the number of structures solved has been significantly trailing those for their soluble counterparts, not least because over-expression and purification of membrane proteins is a much more arduous process. By using high throughput technologies, a large number of membrane protein targets can be screened simultaneously and a greater number of expression and purification conditions can be employed, leading to a higher probability of successfully determining the structure of membrane proteins. This unit describes the cloning, expression and screening of membrane proteins using high throughput methodologies developed in our laboratory. Basic Protocol 1 deals with the cloning of inserts into expression vectors by ligation-independent cloning. Basic Protocol 2 describes the expression and purification of the target proteins on a miniscale. Lastly, for the targets that express at the miniscale, basic protocols 3 and 4 outline the methods employed for the expression and purification of targets at the midi-scale, as well as a procedure for detergent screening and identification of detergent(s) in which the target protein is stable. PMID:24510647

  4. Detecting and overcoming systematic bias in high-throughput screening technologies: a comprehensive review of practical issues and methodological solutions.

    PubMed

    Caraus, Iurie; Alsuwailem, Abdulaziz A; Nadon, Robert; Makarenkov, Vladimir

    2015-11-01

    Significant efforts have been made recently to improve data throughput and data quality in screening technologies related to drug design. The modern pharmaceutical industry relies heavily on high-throughput screening (HTS) and high-content screening (HCS) technologies, which include small molecule, complementary DNA (cDNA) and RNA interference (RNAi) types of screening. Data generated by these screening technologies are subject to several environmental and procedural systematic biases, which introduce errors into the hit identification process. We first review systematic biases typical of HTS and HCS screens. We highlight that study design issues and the way in which data are generated are crucial for providing unbiased screening results. Considering various data sets, including the publicly available ChemBank data, we assess the rates of systematic bias in experimental HTS by using plate-specific and assay-specific error detection tests. We describe main data normalization and correction techniques and introduce a general data preprocessing protocol. This protocol can be recommended for academic and industrial researchers involved in the analysis of current or next-generation HTS data. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
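    As a concrete example of the kind of preprocessing step such a protocol covers, the sketch below applies a robust (median/MAD-based) Z-score to a single plate and flags putative hits; the data, threshold and scoring choice are illustrative assumptions, not a prescription taken from this review.

```python
# Illustrative robust Z-score normalization of one HTS plate, a common preprocessing
# step of the kind surveyed in the review; the data and threshold are hypothetical.
import numpy as np

def robust_zscore(values: np.ndarray) -> np.ndarray:
    """Z*-score using median and MAD, which resists outliers better than mean/SD."""
    med = np.median(values)
    mad = np.median(np.abs(values - med)) * 1.4826  # scale MAD to SD units
    return (values - med) / mad

plate = np.random.default_rng(1).normal(1000, 50, size=384)  # simulated 384-well readout
plate[10] = 400                                              # a strong inhibitor ("hit")
z = robust_zscore(plate)
hits = np.flatnonzero(z < -3.0)  # wells more than 3 robust SDs below the plate median
print(hits)                      # well 10 should be flagged
```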

  5. Novel microscale approaches for easy, rapid determination of protein stability in academic and commercial settings

    PubMed Central

    Alexander, Crispin G.; Wanner, Randy; Johnson, Christopher M.; Breitsprecher, Dennis; Winter, Gerhard; Duhr, Stefan; Baaske, Philipp; Ferguson, Neil

    2014-01-01

    Chemical denaturant titrations can be used to accurately determine protein stability. However, data acquisition is typically labour intensive, has low throughput and is difficult to automate. These factors, combined with high protein consumption, have limited the adoption of chemical denaturant titrations in commercial settings. Thermal denaturation assays can be automated, sometimes with very high throughput. However, thermal denaturation assays are incompatible with proteins that aggregate at high temperatures and large extrapolation of stability parameters to physiological temperatures can introduce significant uncertainties. We used capillary-based instruments to measure chemical denaturant titrations by intrinsic fluorescence and microscale thermophoresis. This allowed higher throughput, consumed several hundred-fold less protein than conventional, cuvette-based methods yet maintained the high quality of the conventional approaches. We also established efficient strategies for automated, direct determination of protein stability at a range of temperatures via chemical denaturation, which has utility for characterising stability for proteins that are difficult to purify in high yield. This approach may also have merit for proteins that irreversibly denature or aggregate in classical thermal denaturation assays. We also developed procedures for affinity ranking of protein–ligand interactions from ligand-induced changes in chemical denaturation data, and proved the principle for this by correctly ranking the affinity of previously unreported peptide–PDZ domain interactions. The increased throughput, automation and low protein consumption of protein stability determinations afforded by using capillary-based methods to measure denaturant titrations, can help to revolutionise protein research. We believe that the strategies reported are likely to find wide applications in academia, biotherapeutic formulation and drug discovery programmes. PMID:25262836
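    To illustrate how stability parameters are typically extracted from a chemical denaturant titration, the sketch below fits a two-state model with the linear extrapolation relation dG([D]) = dG(H2O) - m[D] to synthetic data; the data, constants and starting values are illustrative, and this is not the authors' analysis code.

```python
# Minimal sketch (synthetic data) of fitting a two-state chemical denaturation curve
# with the linear extrapolation model dG([D]) = dG_H2O - m*[D]; the observed signal is
# a population-weighted average of folded and unfolded baselines.
import numpy as np
from scipy.optimize import curve_fit

R, T = 8.314e-3, 298.0  # gas constant (kJ mol^-1 K^-1) and temperature (K)

def two_state(d, dg_h2o, m, y_folded, y_unfolded):
    k = np.exp(-(dg_h2o - m * d) / (R * T))  # unfolding equilibrium constant at denaturant [D]
    f_unfolded = k / (1.0 + k)
    return y_folded + (y_unfolded - y_folded) * f_unfolded

# Synthetic titration: dG_H2O = 20 kJ/mol, m = 5 kJ/mol/M, midpoint at 4 M denaturant
d = np.linspace(0, 8, 25)
y = two_state(d, 20.0, 5.0, 1.0, 0.2) + np.random.default_rng(2).normal(0, 0.01, d.size)
popt, _ = curve_fit(two_state, d, y, p0=[15.0, 4.0, 1.0, 0.2])
print(popt.round(2))  # roughly recovers [20, 5, 1.0, 0.2]
```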

  6. High-throughput determination of urinary hexosamines for diagnosis of mucopolysaccharidoses by capillary electrophoresis and high-performance liquid chromatography.

    PubMed

    Coppa, Giovanni V; Galeotti, Fabio; Zampini, Lucia; Maccari, Francesca; Galeazzi, Tiziana; Padelia, Lucia; Santoro, Lucia; Gabrielli, Orazio; Volpi, Nicola

    2011-04-01

    Diagnosis of mucopolysaccharidoses (MPS) is often delayed, and irreversible organ damage can occur, making possible therapies less effective. This highlights the importance of early and accurate diagnosis. A high-throughput procedure for the simultaneous determination of glucosamine and galactosamine produced from urinary galactosaminoglycans and glucosaminoglycans by capillary electrophoresis (CE) and HPLC has been performed and validated in subjects affected by various MPS including their mild and severe forms, Hurler and Hurler-Scheie, Hunter, Sanfilippo, Morquio, and Maroteaux-Lamy. Unlike other analytical approaches, this single analytical procedure, which is able to measure the total abnormal amounts of urinary GAGs, both high-molecular-mass species and related fragments, as well as the specific hexosamines belonging to a group of GAGs, would be useful for their early diagnosis. After a rapid urine pretreatment, free hexosamines are generated by acidic hydrolysis, derivatized with 2-aminobenzoic acid and separated by CE/UV in ∼10 min and by reverse-phase (RP) HPLC with fluorescence detection in ∼21 min. The total content of hexosamines was found to be indicative of abnormal urinary excretion of GAGs in patients compared to the controls, and the galactosamine/glucosamine ratio was observed to be related to specific MPS syndromes in regard to both their mild and severe forms. As a consequence, important correlations between analytical response and clinical diagnosis and the severity of the disorders were observed. Furthermore, we can assume that the severity of the syndrome may be ascribed to the quantity of total GAGs, as high-molecular-mass polymers and fragments, accumulated in cells and directly excreted in the urine. Finally, due to the high-throughput nature of this approach and to the equipment commonly available in laboratories, this method is suitable for newborn screening in preventive public health programs for early detection of MPS disorders, diagnosis, and their treatment. Copyright © 2010 Elsevier Inc. All rights reserved.

  7. High throughput determination of cleaning solutions to prevent the fouling of an anion exchange resin.

    PubMed

    Elich, Thomas; Iskra, Timothy; Daniels, William; Morrison, Christopher J

    2016-06-01

    Effective cleaning of chromatography resin is required to prevent fouling and maximize the number of processing cycles which can be achieved. Optimization of resin cleaning procedures, however, can lead to prohibitive material, labor, and time requirements, even when using milliliter scale chromatography columns. In this work, high throughput (HT) techniques were used to evaluate cleaning agents for a monoclonal antibody (mAb) polishing step utilizing Fractogel(®) EMD TMAE HiCap (M) anion exchange (AEX) resin. For this particular mAb feed stream, the AEX resin could not be fully restored with traditional NaCl and NaOH cleaning solutions, resulting in a loss of impurity capacity with resin cycling. Miniaturized microliter scale chromatography columns and an automated liquid handling system (LHS) were employed to evaluate various experimental cleaning conditions. Cleaning agents were monitored for their ability to maintain resin impurity capacity over multiple processing cycles by analyzing the flowthrough material for turbidity and high molecular weight (HMW) content. HT experiments indicated that a 167 mM acetic acid strip solution followed by a 0.5 M NaOH, 2 M NaCl sanitization provided approximately 90% cleaning improvement over solutions containing solely NaCl and/or NaOH. Results from the microliter scale HT experiments were confirmed in subsequent evaluations at the milliliter scale. These results identify cleaning agents which may restore resin performance for applications involving fouling species in ion exchange systems. In addition, this work demonstrates the use of miniaturized columns operated with an automated LHS for HT evaluation of chromatographic cleaning procedures, effectively decreasing material requirements while simultaneously increasing throughput. Biotechnol. Bioeng. 2016;113: 1251-1259. © 2015 Wiley Periodicals, Inc.

  8. A High-Throughput Screening Method for Identification of Inhibitors of the Deubiquitinating Enzyme USP14

    PubMed Central

    Lee, Byung-Hoon; Finley, Daniel; King, Randall W.

    2013-01-01

    Deubiquitinating enzymes (DUBs) reverse the process of ubiquitination, and number nearly 100 in humans. In principle, DUBs represent promising drug targets, as several of the enzymes have been implicated in human diseases. The isopeptidase activity of DUBs can be selectively inhibited by targeting the catalytic site with drug-like compounds. Notably, the mammalian 26S proteasome is associated with three major DUBs: RPN11, UCH37 and USP14. Because the ubiquitin ‘chain-trimming’ activity of USP14 can inhibit proteasome function, inhibitors of USP14 can stimulate proteasomal degradation. We recently established a high-throughput screening (HTS) method to discover small-molecule inhibitors specific for USP14. The protocols in this article cover the necessary procedures for preparing assay reagents, performing HTS for USP14 inhibitors, and carrying out post-HTS analysis. PMID:23788557

  9. Delta-Doping at Wafer Level for High Throughput, High Yield Fabrication of Silicon Imaging Arrays

    NASA Technical Reports Server (NTRS)

    Hoenk, Michael E. (Inventor); Nikzad, Shoulch (Inventor); Jones, Todd J. (Inventor); Greer, Frank (Inventor); Carver, Alexander G. (Inventor)

    2014-01-01

    Systems and methods for producing high-quantum-efficiency silicon devices are described. A silicon MBE system has a preparation chamber that provides for cleaning silicon surfaces with an oxygen plasma to remove impurities and a gaseous (dry) NH3 + NF3 room-temperature oxide removal process that leaves the silicon surface hydrogen-terminated. Devices can be fabricated on silicon wafers up to 8 inches in diameter using these cleaning procedures and MBE processing, including delta doping.

  10. Toward Streamlined Identification of Dioxin-like Compounds in Environmental Samples through Integration of Suspension Bioassay.

    PubMed

    Xiao, Hongxia; Brinkmann, Markus; Thalmann, Beat; Schiwy, Andreas; Große Brinkhaus, Sigrid; Achten, Christine; Eichbaum, Kathrin; Gembé, Carolin; Seiler, Thomas-Benjamin; Hollert, Henner

    2017-03-21

    Effect-directed analysis (EDA) is a powerful strategy to identify biologically active compounds in environmental samples. However, in current EDA studies, fractionation and handling procedures are laborious, consist of multiple evaporation steps, and thus bear the risk of contamination and decreased recoveries of the target compounds. The low resulting throughput has been one of the major bottlenecks of EDA. Here, we propose a high-throughput EDA (HT-EDA) work-flow combining reversed phase high-performance liquid chromatography fractionation of samples into 96-well microplates, followed by toxicity assessment in the micro-EROD bioassay with the wild-type rat hepatoma H4IIE cells, and chemical analysis of bioactive fractions. The approach was evaluated using single substances, binary mixtures, and extracts of sediment samples collected at the Three Gorges Reservoir, Yangtze River, China, as well as the rivers Rhine and Elbe, Germany. Selected bioactive fractions were analyzed by highly sensitive gas chromatography-atmospheric pressure laser ionization-time-of-flight-mass spectrometry. In addition, we optimized the work-flow by seeding previously adapted suspension-cultured H4IIE cells directly into the microplate used for fractionation, which makes any transfers of fractionated samples unnecessary. The proposed HT-EDA work-flow simplifies the procedure for wider application in ecotoxicology and environmental routine programs.

  11. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundreds of plants at a time. Compared to small scale plant cultivations, HT systems have much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as the actual plant cultivation conditions, and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed that elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested, and optimized with regard to growth substrate, soil coverage, watering regime, experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system did match well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data of phenotypic variation for a broad range of applications. PMID:25653655

  12. High Throughput Determination of Ricinine Abrine and Alpha ...

    EPA Pesticide Factsheets

    Analytical Method This document provides the standard operating procedure for determination of ricinine (RIC), abrine (ABR), and α-amanitin (AMAN) in drinking water by isotope dilution liquid chromatography tandem mass spectrometry (LC/MS/MS). This method is designed to support site-specific cleanup goals of environmental remediation activities following a homeland security incident involving one or a combination of these analytes.

  13. High-throughput single-molecule force spectroscopy for membrane proteins

    NASA Astrophysics Data System (ADS)

    Bosshart, Patrick D.; Casagrande, Fabio; Frederix, Patrick L. T. M.; Ratera, Merce; Bippes, Christian A.; Müller, Daniel J.; Palacin, Manuel; Engel, Andreas; Fotiadis, Dimitrios

    2008-09-01

    Atomic force microscopy-based single-molecule force spectroscopy (SMFS) is a powerful tool for studying the mechanical properties, intermolecular and intramolecular interactions, unfolding pathways, and energy landscapes of membrane proteins. One limiting factor for the large-scale applicability of SMFS on membrane proteins is its low efficiency in data acquisition. We have developed a semi-automated high-throughput SMFS (HT-SMFS) procedure for efficient data acquisition. In addition, we present a coarse filter to efficiently extract protein unfolding events from large data sets. The HT-SMFS procedure and the coarse filter were validated using the proton pump bacteriorhodopsin (BR) from Halobacterium salinarum and the L-arginine/agmatine antiporter AdiC from the bacterium Escherichia coli. To screen for molecular interactions between AdiC and its substrates, we recorded data sets in the absence and in the presence of L-arginine, D-arginine, and agmatine. Altogether ~400 000 force-distance curves were recorded. Application of coarse filtering to this wealth of data yielded six data sets with ~200 (AdiC) and ~400 (BR) force-distance spectra in each. Importantly, the raw data for most of these data sets were acquired in one to two days, opening new perspectives for HT-SMFS applications.
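    The abstract does not specify the coarse filter's criteria, so the sketch below is a hypothetical illustration of how such a filter might operate: a retraction curve is kept only if the attachment persists beyond a minimum tip-sample separation and shows several force peaks (candidate unfolding events); all thresholds and the peak-detection choice are assumptions, not the authors' published criteria.

```python
# Hypothetical coarse filter for SMFS force-distance curves (assumed criteria):
# keep curves whose attachment spans at least a minimum length and which contain
# several force peaks that could correspond to sequential unfolding events.
import numpy as np
from scipy.signal import find_peaks

def coarse_filter(distance_nm, force_pn, min_length_nm=60.0, min_peaks=3, min_force_pn=50.0):
    """Return True if a retraction curve looks like a candidate full unfolding event."""
    attached = distance_nm[force_pn > min_force_pn]
    if attached.size == 0 or attached.max() < min_length_nm:
        return False  # tip detached before spanning the length of the protein
    peaks, _ = find_peaks(force_pn, height=min_force_pn, distance=20)
    return len(peaks) >= min_peaks

# Usage: keep only the curves that pass the coarse filter before detailed analysis
# selected = [c for c in curves if coarse_filter(c["distance_nm"], c["force_pn"])]
```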

  14. High-throughput identification of proteins with AMPylation using self-assembled human protein (NAPPA) microarrays.

    PubMed

    Yu, Xiaobo; LaBaer, Joshua

    2015-05-01

    AMPylation (adenylylation) has been recognized as an important post-translational modification that is used by pathogens to regulate host cellular proteins and their associated signaling pathways. AMPylation has potential functions in various cellular processes, and it is widely conserved across both prokaryotes and eukaryotes. However, despite the identification of many AMPylators, relatively few candidate substrates of AMPylation are known. This is changing with the recent development of a robust and reliable method for identifying new substrates using protein microarrays, which can markedly expand the list of potential substrates. Here we describe procedures for detecting AMPylated and auto-AMPylated proteins in a sensitive, high-throughput and nonradioactive manner. The approach uses high-density protein microarrays fabricated using nucleic acid programmable protein array (NAPPA) technology, which enables the highly successful display of fresh recombinant human proteins in situ. The modification of target proteins is determined via copper-catalyzed azide-alkyne cycloaddition (CuAAC). The assay can be accomplished within 11 h.

  15. High-throughput determination of biochemical oxygen demand (BOD) by a microplate-based biosensor.

    PubMed

    Pang, Hei-Leung; Kwok, Nga-Yan; Chan, Pak-Ho; Yeung, Chi-Hung; Lo, Waihung; Wong, Kwok-Yin

    2007-06-01

    The use of the conventional 5-day biochemical oxygen demand (BOD5) method in BOD determination is greatly hampered by its time-consuming sampling procedure and its technical difficulty in the handling of a large pool of wastewater samples. Thus, it is highly desirable to develop a fast and high-throughput biosensor for BOD measurements. This paper describes the construction of a microplate-based biosensor consisting of an organically modified silica (ORMOSIL) oxygen sensing film for high-throughput determination of BOD in wastewater. The ORMOSIL oxygen sensing film was prepared by reacting tetramethoxysilane with dimethyldimethoxysilane in the presence of the oxygen-sensitive dye tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) chloride. The silica composite formed a homogeneous, crack-free oxygen sensing film on polystyrene microtiter plates with high stability, and the embedded ruthenium dye interacted with the dissolved oxygen in wastewater according to the Stern-Volmer relation. The bacterium Stenotrophomonas maltophilia was loaded into the ORMOSIL/PVA composite (deposited on the top of the oxygen sensing film) and used to metabolize the organic compounds in wastewater. This BOD biosensor was found to be able to determine the BOD values of wastewater samples within 20 min by monitoring the dissolved oxygen concentrations. Moreover, the BOD values determined by the BOD biosensor were in good agreement with those obtained by the conventional BOD5 method.
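    For readers unfamiliar with the quenching relation mentioned above, the sketch below applies the Stern-Volmer equation I0/I = 1 + Ksv[O2] to recover dissolved oxygen from the sensing-film fluorescence; the Ksv value and intensities are illustrative, not calibration constants from the paper.

```python
# Minimal sketch of the Stern-Volmer relation obeyed by the oxygen-sensing film:
# I0 / I = 1 + Ksv * [O2], rearranged to recover dissolved oxygen from the quenched
# fluorescence intensity. The Ksv value below is illustrative only.

def dissolved_oxygen(i0: float, i: float, ksv: float = 0.25) -> float:
    """Dissolved oxygen (mg/L) from unquenched (i0) and quenched (i) intensities."""
    return (i0 / i - 1.0) / ksv

# Example: 40% quenching relative to the oxygen-free signal
print(round(dissolved_oxygen(i0=1000.0, i=600.0, ksv=0.25), 2))  # ~2.67 mg/L
```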

  16. Precise, High-throughput Analysis of Bacterial Growth.

    PubMed

    Kurokawa, Masaomi; Ying, Bei-Wen

    2017-09-19

    Bacterial growth is a central concept in the development of modern microbial physiology, as well as in the investigation of cellular dynamics at the systems level. Recent studies have reported correlations between bacterial growth and genome-wide events, such as genome reduction and transcriptome reorganization. Correctly analyzing bacterial growth is crucial for understanding the growth-dependent coordination of gene functions and cellular components. Accordingly, the precise quantitative evaluation of bacterial growth in a high-throughput manner is required. Emerging technological developments offer new experimental tools that allow updates of the methods used for studying bacterial growth. The protocol introduced here employs a microplate reader with a highly optimized experimental procedure for the reproducible and precise evaluation of bacterial growth. This protocol was used to evaluate the growth of several previously described Escherichia coli strains. The main steps of the protocol are as follows: the preparation of a large number of cell stocks in small vials for repeated tests with reproducible results, the use of 96-well plates for high-throughput growth evaluation, and the manual calculation of two major parameters (i.e., maximal growth rate and population density) representing the growth dynamics. In comparison to the traditional colony-forming unit (CFU) assay, which counts the cells that are cultured in glass tubes over time on agar plates, the present method is more efficient and provides more detailed temporal records of growth changes, but has a stricter detection limit at low population densities. In summary, the described method is advantageous for the precise and reproducible high-throughput analysis of bacterial growth, which can be used to draw conceptual conclusions or to make theoretical observations.
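    As a minimal illustration of the two growth parameters named above, the sketch below estimates the maximal specific growth rate as the steepest sliding-window slope of ln(OD) and takes the maximal OD as the saturated population density; the synthetic growth curve and window size are assumptions, not the authors' exact calculation procedure.

```python
# Illustrative calculation (not the authors' exact procedure) of the two growth
# parameters named in the abstract: the maximal specific growth rate, taken as the
# steepest slope of ln(OD) over a sliding window, and the maximal population density.
import numpy as np

def growth_parameters(time_h: np.ndarray, od: np.ndarray, window: int = 5):
    ln_od = np.log(od)
    rates = [np.polyfit(time_h[i:i + window], ln_od[i:i + window], 1)[0]
             for i in range(len(od) - window + 1)]
    return max(rates), od.max()  # (mu_max in h^-1, maximal population density)

# Synthetic growth curve: exponential growth at 0.6 h^-1 saturating after 8 h
t = np.arange(0, 12, 0.5)
od = 0.01 * np.exp(0.6 * np.minimum(t, 8.0))
mu_max, density = growth_parameters(t, od)
print(round(mu_max, 2), round(density, 3))  # ~0.6 h^-1, ~1.2
```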

  17. Genetic Interaction Mapping in Schizosaccharomyces pombe Using the Pombe Epistasis Mapper (PEM) System and a ROTOR HDA Colony Replicating Robot in a 1536 Array Format.

    PubMed

    Roguev, Assen; Xu, Jiewei; Krogan, Nevan

    2018-02-01

    This protocol describes an optimized high-throughput procedure for generating double deletion mutants in Schizosaccharomyces pombe using the colony replicating robot ROTOR HDA and the PEM (pombe epistasis mapper) system. The method is based on generating high-density colony arrays (1536 colonies per agar plate) and passaging them through a series of antidiploid and mating-type selection (ADS-MTS) and double-mutant selection (DMS) steps. Detailed program parameters for each individual replication step are provided. Using this procedure, batches of 25 or more screens can be routinely performed. © 2018 Cold Spring Harbor Laboratory Press.

  18. Validation of a high-throughput real-time polymerase chain reaction assay for the detection of capripoxviral DNA.

    PubMed

    Stubbs, Samuel; Oura, Chris A L; Henstock, Mark; Bowden, Timothy R; King, Donald P; Tuppurainen, Eeva S M

    2012-02-01

    Capripoxviruses, which are endemic in much of Africa and Asia, are the aetiological agents of economically devastating poxviral diseases in cattle, sheep and goats. The aim of this study was to validate a high-throughput real-time PCR assay for routine diagnostic use in a capripoxvirus reference laboratory. The performance of two previously published real-time PCR methods were compared using commercially available reagents including the amplification kits recommended in the original publication. Furthermore, both manual and robotic extraction methods used to prepare template nucleic acid were evaluated using samples collected from experimentally infected animals. The optimised assay had an analytical sensitivity of at least 63 target DNA copies per reaction, displayed a greater diagnostic sensitivity compared to conventional gel-based PCR, detected capripoxviruses isolated from outbreaks around the world and did not amplify DNA from related viruses in the genera Orthopoxvirus or Parapoxvirus. The high-throughput robotic DNA extraction procedure did not adversely affect the sensitivity of the assay compared to manual preparation of PCR templates. This laboratory-based assay provides a rapid and robust method to detect capripoxviruses following suspicion of disease in endemic or disease-free countries. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  19. Development of a high-throughput screening system for identification of novel reagents regulating DNA damage in human dermal fibroblasts.

    PubMed

    Bae, Seunghee; An, In-Sook; An, Sungkwan

    2015-09-01

    Ultraviolet (UV) radiation is a major inducer of skin aging and accumulated exposure to UV radiation increases DNA damage in skin cells, including dermal fibroblasts. In the present study, we developed a novel DNA repair regulating material discovery (DREAM) system for the high-throughput screening and identification of putative materials regulating DNA repair in skin cells. First, we established a modified lentivirus expressing the luciferase and hypoxanthine phosphoribosyl transferase (HPRT) genes. Then, human dermal fibroblast WS-1 cells were infected with the modified lentivirus and selected with puromycin to establish cells that stably expressed luciferase and HPRT (DREAM-F cells). The first step in the DREAM protocol was a 96-well-based screening procedure, involving the analysis of cell viability and luciferase activity after pretreatment of DREAM-F cells with reagents of interest and post-treatment with UVB radiation, and vice versa. In the second step, we validated certain effective reagents identified in the first step by analyzing the cell cycle, evaluating cell death, and performing HPRT-DNA sequencing in DREAM-F cells treated with these reagents and UVB. This DREAM system is scalable and forms a time-saving high-throughput screening system for identifying novel anti-photoaging reagents regulating DNA damage in dermal fibroblasts.

  20. Robust ridge regression estimators for nonlinear models with applications to high throughput screening assay data.

    PubMed

    Lim, Changwon

    2015-03-30

    Nonlinear regression is often used to evaluate the toxicity of a chemical or a drug by fitting data from a dose-response study. Toxicologists and pharmacologists may draw a conclusion about whether a chemical is toxic by testing the significance of the estimated parameters. However, sometimes the null hypothesis cannot be rejected even though the fit is quite good. One possible reason for such cases is that the estimated standard errors of the parameter estimates are extremely large. In this paper, we propose robust ridge regression estimation procedures for nonlinear models to solve this problem. The asymptotic properties of the proposed estimators are investigated; in particular, their mean squared errors are derived. The performances of the proposed estimators are compared with several standard estimators using simulation studies. The proposed methodology is also illustrated using high throughput screening assay data obtained from the National Toxicology Program. Copyright © 2014 John Wiley & Sons, Ltd.
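
    As a rough illustration of the idea, and not the authors' estimator, the sketch below fits a Hill-type dose-response model with a robust (Huber) loss and a ridge penalty on the parameters using scipy; the data, starting values, bounds, and penalty weight are all invented.

```python
# Illustrative sketch only: ridge-penalized, Huber-robust nonlinear fit of a
# dose-response curve. This is a generic stand-in for the paper's estimators.
import numpy as np
from scipy.optimize import least_squares

def hill(theta, dose):
    bottom, top, ec50, slope = theta
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** slope)

def residuals(theta, dose, resp, lam):
    # ridge penalty implemented as extra pseudo-residuals sqrt(lam) * theta
    return np.concatenate([resp - hill(theta, dose), np.sqrt(lam) * theta])

rng = np.random.default_rng(0)
dose = np.logspace(-2, 2, 9)
resp = hill(np.array([0.0, 1.0, 1.5, 1.2]), dose) + rng.normal(0, 0.05, dose.size)

fit = least_squares(
    residuals, x0=[0.05, 0.9, 1.0, 1.0],
    bounds=([-1.0, 0.0, 1e-3, 0.1], [1.0, 2.0, 100.0, 5.0]),
    args=(dose, resp, 1e-3), loss="huber", f_scale=0.1)
print("bottom, top, EC50, slope =", np.round(fit.x, 3))
```

    The penalty shrinks the parameter estimates and thereby their standard errors, which is the motivation given in the abstract for ridge-type estimators in this setting.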

  1. High-throughput real-time quantitative reverse transcription PCR.

    PubMed

    Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F

    2006-02-01

    Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected DeltaCt method, and the comparative cycle time, or DeltaDeltaCt method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
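
    A compact numeric illustration of the comparative DeltaDeltaCt method mentioned above is given below; the Ct values are invented, and approximately 100% amplification efficiency is assumed.

```python
# Relative quantification by the comparative (DeltaDeltaCt) method:
# expression of a target gene is normalized to a reference gene, then to a
# calibrator sample, and the fold change is 2 ** (-DeltaDeltaCt).
def ddct_fold_change(ct_target_sample, ct_ref_sample, ct_target_calib, ct_ref_calib):
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calib = ct_target_calib - ct_ref_calib
    return 2.0 ** -(d_ct_sample - d_ct_calib)

# Treated sample: target Ct 24.1, reference Ct 18.0
# Calibrator:     target Ct 26.5, reference Ct 18.2
print(f"fold change = {ddct_fold_change(24.1, 18.0, 26.5, 18.2):.2f}")
```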

  2. Measuring Sister Chromatid Cohesion Protein Genome Occupancy in Drosophila melanogaster by ChIP-seq.

    PubMed

    Dorsett, Dale; Misulovin, Ziva

    2017-01-01

    This chapter presents methods to conduct and analyze genome-wide chromatin immunoprecipitation of the cohesin complex and the Nipped-B cohesin loading factor in Drosophila cells using high-throughput DNA sequencing (ChIP-seq). Procedures for isolation of chromatin, immunoprecipitation, and construction of sequencing libraries for the Ion Torrent Proton high throughput sequencer are detailed, and computational methods to calculate occupancy as input-normalized fold-enrichment are described. The results obtained by ChIP-seq are compared to those obtained by ChIP-chip (genomic ChIP using tiling microarrays), and the effects of sequencing depth on the accuracy are analyzed. ChIP-seq provides similar sensitivity and reproducibility as ChIP-chip, and identifies the same broad regions of occupancy. The locations of enrichment peaks, however, can differ between ChIP-chip and ChIP-seq, and low sequencing depth can splinter broad regions of occupancy into distinct peaks.
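
    The occupancy measure described above, input-normalized fold-enrichment, reduces to scaling per-bin ChIP and input read counts by their library sizes and taking the ratio. The sketch below shows that arithmetic on invented bin counts with a pseudocount; it is a simplification of the chapter's actual pipeline.

```python
# Hedged sketch of input-normalized fold-enrichment per genomic bin.
import numpy as np

def fold_enrichment(chip_counts, input_counts, pseudocount=1.0):
    chip = np.asarray(chip_counts, dtype=float) + pseudocount
    inp = np.asarray(input_counts, dtype=float) + pseudocount
    chip_rate = chip / chip.sum()    # fraction of ChIP library in each bin
    input_rate = inp / inp.sum()     # fraction of input library in each bin
    return chip_rate / input_rate

chip = [120, 15, 400, 9]   # ChIP read counts per bin (toy values)
inp = [100, 20, 110, 12]   # input read counts per bin (toy values)
print(np.round(fold_enrichment(chip, inp), 2))
```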

  3. A High-Throughput Automated Microfluidic Platform for Calcium Imaging of Taste Sensing.

    PubMed

    Hsiao, Yi-Hsing; Hsu, Chia-Hsien; Chen, Chihchen

    2016-07-08

    The human enteroendocrine L cell line NCI-H716, expressing taste receptors and taste signaling elements, constitutes a unique model for the studies of cellular responses to glucose, appetite regulation, gastrointestinal motility, and insulin secretion. Targeting these gut taste receptors may provide novel treatments for diabetes and obesity. However, NCI-H716 cells are cultured in suspension and tend to form multicellular aggregates, preventing high-throughput calcium imaging due to interferences caused by laborious immobilization and stimulus delivery procedures. Here, we have developed an automated microfluidic platform that is capable of trapping more than 500 single cells into microwells with a loading efficiency of 77% within two minutes, delivering multiple chemical stimuli and performing calcium imaging with enhanced spatial and temporal resolutions when compared to bath perfusion systems. Results revealed the presence of heterogeneity in cellular responses to the type, concentration, and order of applied sweet and bitter stimuli. Sucralose and denatonium benzoate elicited robust increases in the intracellular Ca(2+) concentration. However, glucose evoked a rapid elevation of intracellular Ca(2+) followed by reduced responses to subsequent glucose stimulation. Using Gymnema sylvestre as a blocking agent for the sweet taste receptor confirmed that different taste receptors were utilized for sweet and bitter tastes. This automated microfluidic platform is cost-effective, easy to fabricate and operate, and may be generally applicable for high-throughput and high-content single-cell analysis and drug screening.

  4. Rapid automation of a cell-based assay using a modular approach: case study of a flow-based Varicella Zoster Virus infectivity assay.

    PubMed

    Joelsson, Daniel; Gates, Irina V; Pacchione, Diana; Wang, Christopher J; Bennett, Philip S; Zhang, Yuhua; McMackin, Jennifer; Frey, Tina; Brodbeck, Kristin C; Baxter, Heather; Barmat, Scott L; Benetti, Luca; Bodmer, Jean-Luc

    2010-06-01

    Vaccine manufacturing requires constant analytical monitoring to ensure reliable quality and a consistent safety profile of the final product. Concentration and bioactivity of active components of the vaccine are key attributes routinely evaluated throughout the manufacturing cycle and for product release and dosage. In the case of live attenuated virus vaccines, bioactivity is traditionally measured in vitro by infection of susceptible cells with the vaccine followed by quantification of virus replication, cytopathology or expression of viral markers. These assays are typically multi-day procedures that require trained technicians and constant attention. Considering the need for high volumes of testing, automation and streamlining of these assays is highly desirable. In this study, the automation and streamlining of a complex infectivity assay for Varicella Zoster Virus (VZV) containing test articles is presented. The automation procedure was completed using existing liquid handling infrastructure in a modular fashion, limiting custom-designed elements to a minimum to facilitate transposition. In addition, cellular senescence data provided an optimal population doubling range for long term, reliable assay operation at high throughput. The results presented in this study demonstrate a successful automation paradigm resulting in an eightfold increase in throughput while maintaining assay performance characteristics comparable to the original assay. Copyright 2010 Elsevier B.V. All rights reserved.

  5. A decade of improvements in Mimiviridae and Marseilleviridae isolation from amoeba.

    PubMed

    Pagnier, Isabelle; Reteno, Dorine-Gaelle Ikanga; Saadi, Hanene; Boughalmi, Mondher; Gaia, Morgan; Slimani, Meriem; Ngounga, Tatsiana; Bekliz, Meriem; Colson, Philippe; Raoult, Didier; La Scola, Bernard

    2013-01-01

    Since the isolation of the first giant virus, the Mimivirus, by T.J. Rowbotham in a cooling tower in Bradford, UK, and after its characterisation by our group in 2003, we have continued to develop novel strategies to isolate additional strains. By first focusing on cooling towers using our original time-consuming procedure, we were able to isolate a new lineage of giant virus called Marseillevirus and a new Mimivirus strain called Mamavirus. In the following years, we have accumulated the world's largest unique collection of giant viruses by improving the use of antibiotic combinations to avoid bacterial contamination of amoeba, developing strategies of preliminary screening of samples by molecular methods, and using a high-throughput isolation method developed by our group. Based on the inoculation of nearly 7,000 samples, our collection currently contains 43 strains of Mimiviridae (14 in lineage A, 6 in lineage B, and 23 in lineage C) and 17 strains of Marseilleviridae isolated from various environments, including 3 of human origin. This study details the procedures used to build this collection and paves the way for the high-throughput isolation of new isolates to improve the record of giant virus distribution in the environment and the determination of their pangenome.

  6. Air Traffic Management Technology Demonstration-1 Concept of Operations (ATD-1 ConOps), Version 2.0

    NASA Technical Reports Server (NTRS)

    Baxley, Brian T.; Johnson, William C.; Swenson, Harry N.; Robinson, John E.; Prevot, Tom; Callantine, Todd J.; Scardina, John; Greene, Michael

    2013-01-01

    This document is an update to the operations and procedures envisioned for NASA's Air Traffic Management (ATM) Technology Demonstration #1 (ATD-1). The ATD-1 Concept of Operations (ConOps) integrates three NASA technologies to achieve high throughput, fuel-efficient arrival operations into busy terminal airspace. They are Traffic Management Advisor with Terminal Metering (TMA-TM) for precise time-based schedules to the runway and points within the terminal area, Controller-Managed Spacing (CMS) decision support tools for terminal controllers to better manage aircraft delay using speed control, and Flight deck Interval Management (FIM) avionics and flight crew procedures to conduct airborne spacing operations. The ATD-1 concept provides de-conflicted and efficient operations of multiple arrival streams of aircraft, passing through multiple merge points, from top-of-descent (TOD) to the Final Approach Fix. These arrival streams are Optimized Profile Descents (OPDs) from en route altitude to the runway, using primarily speed control to maintain separation and schedule. The ATD-1 project is currently addressing the challenges of integrating the three technologies and their implementation in an operational environment. The ATD-1 goals include increasing the throughput of high-density airports, reducing controller workload, increasing efficiency of arrival operations and the frequency of trajectory-based operations, and promoting aircraft ADS-B equipage.

  7. High-Throughput RNA Interference Screening: Tricks of the Trade

    PubMed Central

    Nebane, N. Miranda; Coric, Tatjana; Whig, Kanupriya; McKellip, Sara; Woods, LaKeisha; Sosa, Melinda; Sheppard, Russell; Rasmussen, Lynn; Bjornsti, Mary-Ann; White, E. Lucile

    2016-01-01

    The process of validating an assay for high-throughput screening (HTS) involves identifying sources of variability and developing procedures that minimize the variability at each step in the protocol. The goal is to produce a robust and reproducible assay with good metrics. In all good cell-based assays, this means coefficient of variation (CV) values of less than 10% and a signal window of fivefold or greater. HTS assays are usually evaluated using Z′ factor, which incorporates both standard deviation and signal window. A Z′ factor value of 0.5 or higher is acceptable for HTS. We used a standard HTS validation procedure in developing small interfering RNA (siRNA) screening technology at the HTS center at Southern Research. Initially, our assay performance was similar to published screens, with CV values greater than 10% and Z′ factor values of 0.51 ± 0.16 (average ± standard deviation). After optimizing the siRNA assay, we got CV values averaging 7.2% and a robust Z′ factor value of 0.78 ± 0.06 (average ± standard deviation). We present an overview of the problems encountered in developing this whole-genome siRNA screening program at Southern Research and how equipment optimization led to improved data quality. PMID:23616418
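
    For reference, the two quality metrics quoted above can be computed directly from control wells as sketched below; the well values are hypothetical and serve only to show the arithmetic.

```python
# Assay-quality metrics for HTS validation: coefficient of variation (CV) and
#   Z' = 1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|
# computed from hypothetical positive- and negative-control wells.
import numpy as np

def cv_percent(values):
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def z_prime(pos, neg):
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

pos = [980, 1010, 1005, 995, 990]   # e.g. wells with non-targeting control siRNA
neg = [110, 95, 105, 100, 98]       # e.g. wells with a cell-death control siRNA
print(f"CV = {cv_percent(pos):.1f}%, Z' = {z_prime(pos, neg):.2f}")
```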

  8. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    PubMed

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers, between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.

  9. NASA's ATM Technology Demonstration-1: Integrated Concept of Arrival Operations

    NASA Technical Reports Server (NTRS)

    Baxley, Brian T.; Swenson, Harry N.; Prevot, Thomas; Callantine, Todd J.

    2012-01-01

    This paper describes operations and procedures envisioned for NASA's Air Traffic Management (ATM) Technology Demonstration #1 (ATD-1). The ATD-1 Concept of Operations (ConOps) demonstration will integrate three NASA technologies to achieve high throughput, fuel-efficient arrival operations into busy terminal airspace. They are Traffic Management Advisor with Terminal Metering (TMA-TM) for precise time-based schedules to the runway and points within the terminal area, Controller-Managed Spacing (CMS) decision support tools for terminal controllers to better manage aircraft delay using speed control, and Flight deck Interval Management (FIM) avionics and flight crew procedures to conduct airborne spacing operations. The ATD-1 concept provides de-conflicted and efficient operations of multiple arrival streams of aircraft, passing through multiple merge points, from top-of-descent (TOD) to touchdown. It also enables aircraft to conduct Optimized Profile Descents (OPDs) from en route altitude to the runway, using primarily speed control to maintain separation and schedule. The ATD-1 project is currently addressing the challenges of integrating the three technologies and their implementation in an operational environment. Goals of the ATD-1 demonstration include increasing the throughput of high-density airports, reducing controller workload, increasing efficiency of arrival operations and the frequency of trajectory-based operations, and promoting aircraft ADS-B equipage.

  10. High Throughput Strontium Isotope Method for Monitoring Fluid Flow Related to Geological CO2 Storage

    NASA Astrophysics Data System (ADS)

    Capo, R. C.; Wall, A. J.; Stewart, B. W.; Phan, T. T.; Jain, J. C.; Hakala, J. A.; Guthrie, G. D.

    2012-12-01

    Natural isotope tracers, such as strontium (Sr), can be a unique and powerful component of a monitoring strategy at a CO2 storage site, facilitating both the quantification of reaction progress for fluid-rock interactions and the tracking of brine migration caused by CO2 injection. Several challenges must be overcome, however, to enable the routine use of isotopic tracers, including the ability to rapidly analyze numerous aqueous samples with potentially complex chemical compositions. In a field situation, it might be necessary to analyze tens of samples over a short period of time to identify subsurface reactions and respond to unexpected fluid movement in the host formation. These conditions require streamlined Sr separation chemistry for samples ranging from pristine groundwaters to those containing high total dissolved solids, followed by rapid measurement of isotope ratios with high analytical precision. We have optimized Sr separation chemistry and MC-ICP-MS methods to provide rapid and precise measurements of isotope ratios in geologic, hydrologic, and environmental samples. These improvements will allow an operator to independently prepare samples for Sr isotope analysis off-site using fast, low cost chemical separation procedures and commercially available components. Existing vacuum-assisted Sr separation procedures were modified by using inexpensive disposable parts to eliminate cross contamination. Experimental results indicate that the modified columns provide excellent separation of Sr from chemically complex samples and that Sr can be effectively isolated from problematic matrix elements (e.g., Ca, Ba, K) associated with oilfield brines and formation waters. The separation procedure is designed for high sample throughput in which batches of 24 samples can be processed in approximately 2 hours, and are ready for Sr isotope measurements by MC-ICP-MS immediately after collection from the columns. Precise Sr isotope results can be achieved by MC-ICP-MS with a throughput of 4 to 5 samples per hour. Our mean measured value of NIST Sr isotope standard SRM 987 is 0.710265 ± 0.000014 (2σ, n = 94). A range of brines and CO2-rich fluids analyzed by this method yielded results within the analytical uncertainty of 87Sr/86Sr ratios previously determined by standard column separation and thermal ionization mass spectrometry. This method provides a fast and effective way to use Sr isotopes for monitoring purposes related to geological CO2 storage.

  11. 18 CFR 2.104 - Mechanisms for passthrough of pipeline take-or-pay buyout and buydown costs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... surcharge or a volumetric surcharge on total throughput. (b) Cost allocation procedures. A pipeline's volume... surcharges, together with any necessary accounting procedures, designed to assure that revenues recovered by...

  12. A novel procedure to assess the non-enzymatic hydrogen-peroxide antioxidant capacity of metabolites with high UV absorption.

    PubMed

    Csepregi, Kristóf; Hideg, Éva

    2016-12-01

    Assays assessing non-enzymatic hydrogen peroxide antioxidant capacities are often hampered by the high UV absorption of the sample itself. This is a typical problem in studies using plant extracts with high polyphenol content. Our assay is based on comparing the 405 nm absorption of the product of potassium iodide and hydrogen peroxide in the presence and absence of a putative hydrogen peroxide reactive antioxidant. This method is free of interference with either hydrogen peroxide or antioxidant self-absorption and it is also suitable for high-throughput plate reader applications.
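
    In practice, the comparison described above reduces to a percent-inhibition calculation from the plate-reader absorbances; a minimal worked example with made-up A405 readings is shown below.

```python
# Made-up numbers only: antioxidant capacity expressed as the percentage
# decrease in A405 of the KI/H2O2 reaction product when the putative
# antioxidant is present.
def h2o2_scavenging_percent(a405_without_antioxidant, a405_with_antioxidant, a405_blank=0.0):
    control = a405_without_antioxidant - a405_blank
    sample = a405_with_antioxidant - a405_blank
    return 100.0 * (control - sample) / control

print(f"{h2o2_scavenging_percent(0.82, 0.31, 0.05):.1f}% of the H2O2 was scavenged")
```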

  13. High-throughput profiling of nanoparticle-protein interactions by fluorescamine labeling.

    PubMed

    Ashby, Jonathan; Duan, Yaokai; Ligans, Erik; Tamsi, Michael; Zhong, Wenwan

    2015-02-17

    A rapid, high throughput fluorescence assay was designed to screen interactions between proteins and nanoparticles. The assay employs fluorescamine, a primary-amine specific fluorogenic dye, to label proteins. Because fluorescamine specifically targets the surface amines on proteins, a conformational change of the protein upon interaction with nanoparticles will result in a change in fluorescence. In the present study, the assay was applied to test the interactions between a selection of proteins and nanoparticles made of polystyrene, silica, or iron oxide. The particles were also different in their hydrodynamic diameter, synthesis procedure, or surface modification. Significant labeling differences were detected when the same protein was incubated with different particles. Principal component analysis (PCA) on the collected fluorescence profiles revealed clear grouping effects of the particles based on their properties. The results prove that fluorescamine labeling is capable of detecting protein-nanoparticle interactions, and the resulting fluorescence profile is sensitive to differences in the nanoparticles' physical properties. The assay can be carried out in a high-throughput manner, and is rapid with low operation cost. Thus, it is well suited for evaluating interactions between a larger number of proteins and nanoparticles. Such assessment can help to improve our understanding of the molecular basis that governs the biological behaviors of nanomaterials. It will also be useful for initial examination of the bioactivity and reproducibility of nanomaterials employed in biomedical fields.
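
    The principal component analysis step mentioned above can be reproduced in outline with scikit-learn, as in the toy sketch below; the profile matrix and particle labels are invented purely to show how grouping by particle properties would appear in the first two components.

```python
# Toy PCA of fluorescamine-labeling profiles
# (rows = nanoparticle types, columns = proteins). All numbers are invented.
import numpy as np
from sklearn.decomposition import PCA

profiles = np.array([
    [0.90, 0.40, 0.75, 0.20],   # polystyrene, small
    [0.85, 0.45, 0.70, 0.25],   # polystyrene, large
    [0.30, 0.80, 0.35, 0.60],   # silica, small
    [0.28, 0.85, 0.30, 0.65],   # silica, large
    [0.55, 0.60, 0.50, 0.45],   # iron oxide
])
scores = PCA(n_components=2).fit_transform(profiles)
labels = ["PS-small", "PS-large", "SiO2-small", "SiO2-large", "FeOx"]
for label, (pc1, pc2) in zip(labels, scores):
    print(f"{label:11s} PC1={pc1:+.2f} PC2={pc2:+.2f}")
```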

  14. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    PubMed Central

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  15. Accelerating evaluation of converged lattice thermal conductivity

    NASA Astrophysics Data System (ADS)

    Qin, Guangzhao; Hu, Ming

    2018-01-01

    High-throughput computational materials design is an emerging area in materials science, which is based on the fast evaluation of physical properties. The lattice thermal conductivity (κ) is a key property of materials with enormous practical implications. However, the high-throughput evaluation of κ remains a challenge due to the large resource costs and time-consuming procedures. In this paper, we propose a concise strategy to efficiently accelerate the evaluation process of obtaining accurate and converged κ. The strategy is in the framework of the phonon Boltzmann transport equation (BTE) coupled with first-principles calculations. Based on the analysis of harmonic interatomic force constants (IFCs), a large enough cutoff radius (rcutoff), a critical parameter involved in calculating the anharmonic IFCs, can be directly determined to get satisfactory results. Moreover, we find a simple way to largely (~10 times) accelerate the computations by fast reconstructing the anharmonic IFCs in the convergence test of κ with respect to the rcutoff, which finally confirms that the chosen rcutoff is appropriate. Two-dimensional graphene and phosphorene along with bulk SnSe are presented to validate our approach, and the long-debated divergence problem of thermal conductivity in low-dimensional systems is studied. The quantitative strategy proposed herein can be a good candidate for fast evaluating a reliable κ and thus provides a useful tool for high-throughput materials screening and design with targeted thermal transport properties.
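
    The convergence test described above amounts to recomputing κ at increasing cutoff radii until the change falls below a tolerance. The sketch below shows only that control flow, with a toy stand-in for the expensive BTE/first-principles step; it is not part of the authors' code.

```python
# Generic convergence-test sketch: compute_kappa() is a placeholder for the
# expensive anharmonic-IFC/BTE calculation, not an actual implementation.
def converge_kappa(compute_kappa, radii, rel_tol=0.02):
    previous = None
    for r in radii:
        kappa = compute_kappa(r)
        if previous is not None and abs(kappa - previous) / previous < rel_tol:
            return r, kappa            # converged at this cutoff radius
        previous = kappa
    raise RuntimeError("kappa did not converge over the tested cutoff radii")

def demo_kappa(r):
    # toy stand-in: kappa that saturates with increasing cutoff radius
    return 100.0 * (1.0 - 2.0 ** (-r))

print(converge_kappa(demo_kappa, radii=[2, 3, 4, 5, 6, 7]))
```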

  16. A Human-in-the Loop Evaluation of a Coordinated Arrival Departure Scheduling Operations for Managing Departure Delays at LaGuardia Airport

    NASA Technical Reports Server (NTRS)

    Lee, Paul U.; Smith, Nancy M.; Bienert, Nancy; Brasil, Connie; Buckley, Nathan; Chevalley, Eric; Homola, Jeffrey; Omar, Faisal; Parke, Bonny; Yoo, Hyo-Sang

    2016-01-01

    LaGuardia (LGA) departure delay was identified by the stakeholders and subject matter experts as a significant bottleneck in the New York metropolitan area. Departure delay at LGA is primarily due to dependency between LGA's arrival and departure runways: LGA departures cannot begin takeoff until arrivals have cleared the runway intersection. If one-in one-out operations are not maintained and a significant arrival-to-departure imbalance occurs, the departure backup can persist through the rest of the day. At NASA Ames Research Center, a solution called "Departure-sensitive Arrival Spacing" (DSAS) was developed to maximize the departure throughput without creating significant delays in the arrival traffic. The concept leverages Terminal Sequencing and Spacing (TSS) operations that create and manage the arrival schedule to the runway threshold, and adds an interface enhancement to the traffic manager's timeline to provide the ability to manually adjust inter-arrival spacing to build precise gaps for multiple departures between arrivals. A more complete solution would include a TSS algorithm enhancement that could automatically build these multi-departure gaps. With this set of capabilities, inter-arrival spacing could be controlled for optimal departure throughput. The concept was prototyped in a human-in-the-loop (HITL) simulation environment so that operational requirements such as coordination procedures, timing and magnitude of TSS schedule adjustments, and display features for Tower, TRACON and Traffic Management Unit could be determined. A HITL simulation was conducted in August 2014 to evaluate the concept in terms of feasibility, controller workload impact, and potential benefits. Three conditions were tested, namely a Baseline condition without scheduling, a TSS condition that schedules the arrivals to the runway threshold, and a TSS+DSAS condition that adjusts the arrival schedule to maximize the departure throughput. The results showed that during high arrival demand periods, departure throughput could be incrementally increased under the TSS and TSS+DSAS conditions without compromising the arrival throughput. The concept, operational procedures, and summary results were originally published in ATM2015, but detailed results were omitted. This paper expands on the earlier paper to provide the detailed results on throughput, conformance, safety, flight time/distance, etc., that provide extra insights into the feasibility and the potential benefits of the concept.

  17. Strategic and Operational Plan for Integrating Transcriptomics ...

    EPA Pesticide Factsheets

    Plans for incorporating high throughput transcriptomics into the current high throughput screening activities at NCCT; details are in the attached slides from a presentation given at the OECD meeting on June 23, 2016.

  18. High-Throughput Experimental Approach Capabilities | Materials Science |

    Science.gov Websites

    Overview of NREL's high-throughput experimental approach capabilities, including combinatorial sputtering chambers for (S,Se,Te) chalcogenide and oxysulfide deposition and a Combi-5 chamber for nitride and oxynitride sputtering.

  19. High Throughput Assay for Bacterial Adhesion on Acellular Dermal Matrices and Synthetic Surgical Materials

    PubMed Central

    Nyame, Theodore T.; Lemon, Katherine P.; Kolter, Roberto; Liao, Eric C.

    2013-01-01

    Background There has been increasing use of various synthetic and biologically derived materials in surgery. Biologic surgical materials are used in many plastic surgery procedures, ranging from breast reconstruction to hernia repairs. In particular, acellular dermal matrix (ADM) material has gained popularity in these applications. There is a paucity of data on how ADM compares to other surgical materials as a substrate for bacterial adhesion, the first step in biofilm formation, which occurs in prosthetic wound infections. We have designed a high throughput assay to evaluate Staphylococcus aureus adherence on various synthetic and biologically derived materials. Methods Clinical isolates of Staphylococcus aureus (strains SC-1 and UAMS-1) were cultured with different materials and bacterial adherence was measured using a resazurin cell vitality reporter microtiter assay. Four materials that are commonly utilized in reconstructive procedures were evaluated: prolene mesh, vicryl mesh, and two different ADM preparations (AlloDerm®, FlexHD®). We were able to develop a high throughput and reliable assay for quantifying bacterial adhesion on synthetic and biologically derived materials. Results The resazurin vitality assay can be reliably used to quantify bacterial adherence to acellular dermal matrix material, as well as synthetic material. S. aureus strains SC-1 and UAMS-1 both adhered better to ADM materials (AlloDerm® and FlexHD®) than to the synthetic material prolene. S. aureus also adhered better to vicryl than to prolene. Strain UAMS-1 adhered better to vicryl and ADM materials than did strain SC-1. Conclusion Our results suggest that S. aureus adheres more readily to ADM material than to synthetic material. We have developed an assay to rapidly test bacterial biofilm formation on surgical materials, using two S. aureus bacterial strains. This provides a standard method to evaluate existing and new materials with regard to bacterial adherence and potential propensity for infection. This assay is particularly important in the clinical context of the severe sequelae of post-operative infection. PMID:22030489

  20. High-throughput protein analysis integrating bioinformatics and experimental assays

    PubMed Central

    del Val, Coral; Mehrle, Alexander; Falkenhahn, Mechthild; Seiler, Markus; Glatting, Karl-Heinz; Poustka, Annemarie; Suhai, Sandor; Wiemann, Stefan

    2004-01-01

    The wealth of transcript information that has been made publicly available in recent years requires the development of high-throughput functional genomics and proteomics approaches for its analysis. Such approaches need suitable data integration procedures and a high level of automation in order to gain maximum benefit from the results generated. We have designed an automatic pipeline to analyse annotated open reading frames (ORFs) stemming from full-length cDNAs produced mainly by the German cDNA Consortium. The ORFs are cloned into expression vectors for use in large-scale assays such as the determination of subcellular protein localization or kinase reaction specificity. Additionally, all identified ORFs undergo exhaustive bioinformatic analysis such as similarity searches, protein domain architecture determination and prediction of physicochemical characteristics and secondary structure, using a wide variety of bioinformatic methods in combination with the most up-to-date public databases (e.g. PRINTS, BLOCKS, INTERPRO, PROSITE, SWISS-PROT). Data from experimental results and from the bioinformatic analysis are integrated and stored in a relational database (MS SQL-Server), which makes it possible for researchers to find answers to biological questions easily, thereby speeding up the selection of targets for further analysis. The designed pipeline constitutes a new automatic approach to obtaining and administering relevant biological data from high-throughput investigations of cDNAs in order to systematically identify and characterize novel genes, as well as to comprehensively describe the function of the encoded proteins. PMID:14762202

  1. New On-Orbit Sensitivity Calibration for All STIS Echelle Modes

    NASA Astrophysics Data System (ADS)

    Aloisi, Alessandra; Bohlin, Ralph; Quijano, Jessica Kim

    2007-01-01

    On-orbit sensitivities for the 32 medium- and high-resolution STIS echelle secondary modes were determined for the first time using observations of the fundamental DA white dwarf standard star G191-B2B. Revised on-orbit sensitivities for the 12 medium- and high-resolution echelle prime modes based on observations of the same standard star are also presented. We review the procedures and assumptions used to derive the adopted throughputs and implement them into the pipeline.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, D. T.

    Ion beam interference coating (IBIC) is a sputter-deposition process for multiple layers of optical thin films employing a Kaufman gun. It has achieved coatings of extremely low optical loss and high mechanical strength. It has many potential applications for a wide spectral range. This coating process is described in terms of principle, fabrication procedure, and optical measurements. A discussion of the history and outlook of IBIC follows, with emphasis on how to achieve low loss and on throughput improvements.

  3. Evaluation of Bias-Variance Trade-Off for Commonly Used Post-Summarizing Normalization Procedures in Large-Scale Gene Expression Studies

    PubMed Central

    Qiu, Xing; Hu, Rui; Wu, Zhixin

    2014-01-01

    Normalization procedures are widely used in high-throughput genomic data analyses to remove various technological noise and variations. They are known to have a profound impact on the subsequent differential gene expression analysis. Although there has been some research in evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performances of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample sizes. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and multiple testing procedures (MTPs) under different configurations of effect size and sample size. We conduct theoretical evaluation based on a random effect model, as well as simulation and biological data analyses to verify the results. Based on our findings, we provide some practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114

  4. NMRbot: Python scripts enable high-throughput data collection on current Bruker BioSpin NMR spectrometers.

    PubMed

    Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L

    2013-06-01

    To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.

  5. On-the-fly machine-learning for high-throughput experiments: search for rare-earth-free permanent magnets

    PubMed Central

    Kusne, Aaron Gilad; Gao, Tieren; Mehta, Apurva; Ke, Liqin; Nguyen, Manh Cuong; Ho, Kai-Ming; Antropov, Vladimir; Wang, Cai-Zhuang; Kramer, Matthew J.; Long, Christian; Takeuchi, Ichiro

    2014-01-01

    Advanced materials characterization techniques with ever-growing data acquisition speed and storage capabilities represent a challenge in modern materials science, and new procedures to quickly assess and analyze the data are needed. Machine learning approaches are effective in reducing the complexity of data and rapidly homing in on the underlying trend in multi-dimensional data. Here, we show that by employing an algorithm called the mean shift theory to a large amount of diffraction data in high-throughput experimentation, one can streamline the process of delineating the structural evolution across compositional variations mapped on combinatorial libraries with minimal computational cost. Data collected at a synchrotron beamline are analyzed on the fly, and by integrating experimental data with the inorganic crystal structure database (ICSD), we can substantially enhance the accuracy in classifying the structural phases across ternary phase spaces. We have used this approach to identify a novel magnetic phase with enhanced magnetic anisotropy which is a candidate for rare-earth free permanent magnet. PMID:25220062
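
    As a toy illustration of the clustering idea described above, and not the authors' analysis code, the sketch below applies scikit-learn's MeanShift to two synthetic families of simplified "diffraction patterns" and reports how many groups it recovers; real inputs would be full diffraction spectra from the combinatorial library.

```python
# Illustrative only: mean-shift clustering of synthetic pattern vectors.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(1)
phase_a = rng.normal(loc=[1.0, 0.2, 0.8], scale=0.05, size=(10, 3))
phase_b = rng.normal(loc=[0.3, 0.9, 0.1], scale=0.05, size=(10, 3))
patterns = np.vstack([phase_a, phase_b])   # 20 simplified "patterns"

labels = MeanShift().fit_predict(patterns)
print("cluster labels:", labels)
print("number of phases found:", len(set(labels)))
```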

  6. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. A particularly noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791

  7. High-throughput STR analysis for DNA database using direct PCR.

    PubMed

    Sim, Jeong Eun; Park, Su Jeong; Lee, Han Chul; Kim, Se-Yong; Kim, Jong Yeol; Lee, Seung Hwan

    2013-07-01

    Since the Korean criminal DNA database was launched in 2010, we have focused on establishing an automated DNA database profiling system that analyzes short tandem repeat loci in a high-throughput and cost-effective manner. We established a DNA database profiling system without DNA purification using a direct PCR buffer system. The quality of direct PCR procedures was compared with that of conventional PCR system under their respective optimized conditions. The results revealed not only perfect concordance but also an excellent PCR success rate, good electropherogram quality, and an optimal intra/inter-loci peak height ratio. In particular, the proportion of DNA extraction required due to direct PCR failure could be minimized to <3%. In conclusion, the newly developed direct PCR system can be adopted for automated DNA database profiling systems to replace or supplement conventional PCR system in a time- and cost-saving manner. © 2013 American Academy of Forensic Sciences Published 2013. This article is a U.S. Government work and is in the public domain in the U.S.A.

  8. Automated solid-phase subcloning based on beads brought into proximity by magnetic force.

    PubMed

    Hudson, Elton P; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescent-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors and more than 95% correct clones were obtained in a number of various applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high throughput applications.

  9. Automated Solid-Phase Subcloning Based on Beads Brought into Proximity by Magnetic Force

    PubMed Central

    Hudson, Elton P.; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescent-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors and more than 95% correct clones were obtained in a number of various applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high throughput applications. PMID:22624028

  10. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  11. Optimization of throughput in semipreparative chiral liquid chromatography using stacked injection.

    PubMed

    Taheri, Mohammadreza; Fotovati, Mohsen; Hosseini, Seyed-Kiumars; Ghassempour, Alireza

    2017-10-01

    An interesting mode of chromatography for preparation of pure enantiomers from pure samples is the method of stacked injection as a pseudocontinuous procedure. Maximum throughput and minimal production costs can be achieved by the use of the total chiral column length in this mode of chromatography. When sample loading is maximized, touching bands of the two enantiomers are often automatically achieved. Conventional equations show a direct correlation between touching-band loadability and the selectivity factor of the two enantiomers. The important question for one who wants to obtain the highest throughput is "How to optimize different factors including selectivity, resolution, run time, and loading of the sample in order to save time without missing the touching-band resolution?" To answer this question, tramadol and propranolol were separated on cellulose 3,5-dimethyl phenyl carbamate, as two pure racemic mixtures with low and high solubilities in the mobile phase, respectively. The mobile phase composition consisted of n-hexane solvent with alcohol modifier and diethylamine as the additive. A response surface methodology based on central composite design was used to optimize separation factors against the main responses. According to the stacked injection properties, two processes were investigated for maximizing throughput: one with a poorly soluble and another with a highly soluble racemic mixture. For each case, different optimization possibilities were inspected. It was revealed that resolution is a crucial response for separations of this kind. Peak area and run time are two critical parameters in optimization of stacked injection for binary mixtures which have low solubility in the mobile phase. © 2017 Wiley Periodicals, Inc.

  12. Using Six Sigma and Lean methodologies to improve OR throughput.

    PubMed

    Fairbanks, Catharine B

    2007-07-01

    Improving patient flow in the perioperative environment is challenging, but it has positive implications for both staff members and the facility. One facility in Vermont improved patient throughput by incorporating Six Sigma and Lean methodologies for patients undergoing elective procedures. The results of the project were significantly improved patient flow and increased teamwork and pride among perioperative staff members. (c) AORN, Inc, 2007.

  13. Bimodal imprint chips for peptide screening: integration of high-throughput sequencing by MS and affinity analyses by surface plasmon resonance imaging.

    PubMed

    Wang, Weizhi; Li, Menglin; Wei, Zewen; Wang, Zihua; Bu, Xiangli; Lai, Wenjia; Yang, Shu; Gong, He; Zheng, Hui; Wang, Yuqiao; Liu, Ying; Li, Qin; Fang, Qiaojun; Hu, Zhiyuan

    2014-04-15

    Peptide probes and drugs have widespread applications in disease diagnostics and therapy. The demand for peptide ligands with high affinity and high specificity toward various targets has surged in the biomedical field in recent years. The traditional peptide screening procedure involves selection, sequencing, and characterization steps, and each step is manual and tedious. Herein, we developed a bimodal imprint microarray system to embrace the whole peptide screening process. A silver-sputtered silicon chip fabricated with a microwell array can trap and pattern the candidate peptide beads in a one-well-one-bead manner. Peptides on beads were photocleaved in situ. A portion of the peptide in each well was transferred to a gold-coated chip to print the peptide array for high-throughput affinity analyses by surface plasmon resonance imaging (SPRi), and the peptide left in the silver-sputtered chip was ready for in situ single bead sequencing by matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS). Using the bimodal imprint chip system, affinity peptides toward AHA were efficiently screened from the 7 × 10(4) peptide library. The method provides a solution for high efficiency peptide screening.

  14. Digital transcriptome profiling using selective hexamer priming for cDNA synthesis.

    PubMed

    Armour, Christopher D; Castle, John C; Chen, Ronghua; Babak, Tomas; Loerch, Patrick; Jackson, Stuart; Shah, Jyoti K; Dey, John; Rohl, Carol A; Johnson, Jason M; Raymond, Christopher K

    2009-09-01

    We developed a procedure for the preparation of whole transcriptome cDNA libraries depleted of ribosomal RNA from only 1 microg of total RNA. The method relies on a collection of short, computationally selected oligonucleotides, called 'not-so-random' (NSR) primers, to obtain full-length, strand-specific representation of nonribosomal RNA transcripts. In this study we validated the technique by profiling human whole brain and universal human reference RNA using ultra-high-throughput sequencing.

  15. MerMade: An Oligodeoxyribonucleotide Synthesizer for High Throughput Oligonucleotide Production in Dual 96-Well Plates

    PubMed Central

    Rayner, Simon; Brignac, Stafford; Bumeister, Ron; Belosludtsev, Yuri; Ward, Travis; Grant, O’dell; O’Brien, Kevin; Evans, Glen A.; Garner, Harold R.

    1998-01-01

    We have designed and constructed a machine that synthesizes two standard 96-well plates of oligonucleotides in a single run using standard phosphoramidite chemistry. The machine is capable of making a combination of standard, degenerate, or modified oligos in a single plate. The run time is typically 17 hr for two plates of 20-mers and a reaction scale of 40 nmol. The reaction vessel is a standard polypropylene 96-well plate with a hole drilled in the bottom of each well. The two plates are placed in separate vacuum chucks and mounted on an xy table. Each well in turn is positioned under the appropriate reagent injection line and the reagent is injected by switching a dedicated valve. All aspects of machine operation are controlled by a Macintosh computer, which also guides the user through the startup and shutdown procedures, provides a continuous update on the status of the run, and facilitates a number of service procedures that need to be carried out periodically. Over 25,000 oligos have been synthesized for use in dye terminator sequencing reactions, polymerase chain reactions (PCRs), hybridization, and RT–PCR. Oligos up to 100 bases in length have been made with a coupling efficiency in excess of 99%. These machines, working in conjunction with our oligo prediction code, are particularly well suited to application in automated high throughput genomic sequencing. PMID:9685322

  16. MerMade: an oligodeoxyribonucleotide synthesizer for high throughput oligonucleotide production in dual 96-well plates.

    PubMed

    Rayner, S; Brignac, S; Bumeister, R; Belosludtsev, Y; Ward, T; Grant, O; O'Brien, K; Evans, G A; Garner, H R

    1998-07-01

    We have designed and constructed a machine that synthesizes two standard 96-well plates of oligonucleotides in a single run using standard phosphoramidite chemistry. The machine is capable of making a combination of standard, degenerate, or modified oligos in a single plate. The run time is typically 17 hr for two plates of 20-mers and a reaction scale of 40 nmol. The reaction vessel is a standard polypropylene 96-well plate with a hole drilled in the bottom of each well. The two plates are placed in separate vacuum chucks and mounted on an xy table. Each well in turn is positioned under the appropriate reagent injection line and the reagent is injected by switching a dedicated valve. All aspects of machine operation are controlled by a Macintosh computer, which also guides the user through the startup and shutdown procedures, provides a continuous update on the status of the run, and facilitates a number of service procedures that need to be carried out periodically. Over 25,000 oligos have been synthesized for use in dye terminator sequencing reactions, polymerase chain reactions (PCRs), hybridization, and RT-PCR. Oligos up to 100 bases in length have been made with a coupling efficiency in excess of 99%. These machines, working in conjunction with our oligo prediction code, are particularly well suited to application in automated high throughput genomic sequencing.
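
    As a side note on the coupling-efficiency figure quoted in both records above, the expected full-length yield falls off geometrically with oligo length; the short calculation below, which is not from the papers, shows why stepwise coupling in excess of 99% matters for 100-mers.

```python
# Expected full-length yield of a chemically synthesized oligo:
# with stepwise coupling efficiency p and length n, yield ~= p ** (n - 1).
def full_length_yield(coupling_efficiency, length):
    return coupling_efficiency ** (length - 1)

for n in (20, 60, 100):
    pct = 100 * full_length_yield(0.99, n)
    print(f"{n}-mer at 99% coupling: {pct:.0f}% full-length product")
```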

  17. Optimization of Time-Resolved Fluorescence Assay for Detection of Eu-DOTA-labeled Ligand-Receptor Interactions

    PubMed Central

    De Silva, Channa R.; Vagner, Josef; Lynch, Ronald; Gillies, Robert J.; Hruby, Victor J.

    2010-01-01

    Lanthanide-based luminescent ligand binding assays are superior to traditional radiolabel assays due to improved sensitivity and affordability in high throughput screening while eliminating the use of radioactivity. Despite significant progress using lanthanide(III)-coordinated chelators such as DTPA derivatives, dissociation-enhanced lanthanide fluoroimmunoassays (DELFIA) have not yet been successfully used with more stable chelators, e.g. DOTA derivatives, due to the incomplete release of lanthanide(III) ions from the complex. Here, a modified and an optimized DELFIA procedure incorporating an acid treatment protocol is introduced for use with Eu(III)-DOTA labeled peptides. Complete release of Eu(III) ions from DOTA labeled ligands was observed using hydrochloric acid (2.0 M) prior to the luminescent enhancement step. NDP-α-MSH labeled with Eu(III)-DOTA was synthesized and the binding affinity to cells overexpressing the human melanocortin-4 receptors (hMC4R) was evaluated using the modified protocol. Binding data indicate that the Eu(III)-DOTA linked peptide bound to these cells with an affinity similar to its DTPA analogue. The modified DELFIA procedure was further used to monitor the binding of an Eu(III)-DOTA labeled heterobivalent peptide to the cells expressing both hMC4R and CCK-2 (Cholecystokinin) receptors. The modified assay provides superior results and is appropriate for high-throughput screening of ligand libraries. PMID:19852924

  18. Comparison of mapping algorithms used in high-throughput sequencing: application to Ion Torrent data

    PubMed Central

    2014-01-01

    Background: The rapid evolution in high-throughput sequencing (HTS) technologies has opened up new perspectives in several research fields and led to the production of large volumes of sequence data. A fundamental step in HTS data analysis is the mapping of reads onto reference sequences. Choosing a suitable mapper for a given technology and a given application is a subtle task because of the difficulty of evaluating mapping algorithms. Results: In this paper, we present a benchmark procedure to compare mapping algorithms used in HTS using both real and simulated datasets and considering four evaluation criteria: computational resource and time requirements, robustness of mapping, ability to report positions for reads in repetitive regions, and ability to retrieve true genetic variation positions. To measure robustness, we introduced a new definition for a correctly mapped read taking into account not only the expected start position of the read but also the end position and the number of indels and substitutions. We developed CuReSim, a new read simulator that is able to generate customized benchmark data for any kind of HTS technology by adjusting parameters to the error types. CuReSim and CuReSimEval, a tool to evaluate the mapping quality of the CuReSim simulated reads, are freely available. We applied our benchmark procedure to evaluate 14 mappers in the context of whole genome sequencing of small genomes with Ion Torrent data for which such a comparison has not yet been established. Conclusions: A benchmark procedure to compare HTS data mappers is introduced with a new definition for the mapping correctness as well as tools to generate simulated reads and evaluate mapping quality. The application of this procedure to Ion Torrent data from the whole genome sequencing of small genomes has allowed us to validate our benchmark procedure and demonstrate that it is helpful for selecting a mapper based on the intended application, questions to be addressed, and the technology used. This benchmark procedure can be used to evaluate existing or in-development mappers as well as to optimize parameters of a chosen mapper for any application and any sequencing platform. PMID:24708189
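
    As a rough illustration of the correctness definition introduced above (start position, end position, and the number of indels and substitutions all taken into account), the following Python sketch checks a mapped read against the simulated truth. Field names and thresholds are illustrative assumptions, not the criteria implemented in CuReSimEval.

      # Hedged sketch of a read-correctness check in the spirit of the definition above:
      # a mapped read is counted correct only if both its start and end positions fall
      # within a tolerance of the simulated truth and its edit operations stay within
      # stated limits. Thresholds and field names are illustrative only.
      from dataclasses import dataclass

      @dataclass
      class Alignment:
          start: int
          end: int
          n_indels: int
          n_substitutions: int

      def is_correctly_mapped(aln: Alignment, true_start: int, true_end: int,
                              pos_tol: int = 5, max_indels: int = 2, max_subs: int = 3) -> bool:
          return (abs(aln.start - true_start) <= pos_tol
                  and abs(aln.end - true_end) <= pos_tol
                  and aln.n_indels <= max_indels
                  and aln.n_substitutions <= max_subs)

      print(is_correctly_mapped(Alignment(100, 199, 1, 2), true_start=100, true_end=200))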

  19. A rapid and simple procedure to detect the presence of MVM in conditioned cell fluids or culture media.

    PubMed

    Chang, A; Havas, S; Borellini, F; Ostrove, J M; Bird, R E

    1997-12-01

    During the manufacture of biopharmaceuticals, numerous adventitious agents have been detected in Master Cell Banks, end-of-production cells as well as bulk harvest fluid. Recently, a number of large-scale production bioreactors have become infected with Minute Virus of Mice (MVM) during cGMP (current good manufacturing practices) operations, and this has resulted in both the loss of product and the need for major cleaning validation procedures to be put in place. We have developed a simple DNA extraction/PCR assay to detect the presence of MVM in cell culture supernatant (conditioned cell fluids). This highly specific assay can detect 10 or fewer genome equivalents (copies) of MVM following PCR and gel electrophoresis visualization. For routine high-throughput detection, 300-100 copies could be consistently detected. The extraction procedure was shown to reliably detect MVM at a concentration of 1 TCID50/ml. The combination of the extraction/PCR procedure establishes a powerful, sensitive, specific assay that can detect the presence of MVM sequences with a 1-day turnaround time.

  20. Using Meta-Reflection to Improve Learning and Throughput: Redesigning Assessment Procedures in a Political Science Course on Power

    ERIC Educational Resources Information Center

    Hagström, Linus; Scheja, Max

    2014-01-01

    The aim of this article is to contribute to the discussion on how examinations can be designed to enhance students' learning and increase throughput in terms of the number of students who sit, and pass, the course examination. The context of the study is a basic level political science course on power analysis, which initially suffered from low…

  1. High-throughput microplate technique for enzymatic hydrolysis of lignocellulosic biomass.

    PubMed

    Chundawat, Shishir P S; Balan, Venkatesh; Dale, Bruce E

    2008-04-15

    Several factors will influence the viability of a biochemical platform for manufacturing lignocellulosic based fuels and chemicals, for example, genetically engineering energy crops, reducing pre-treatment severity, and minimizing enzyme loading. Past research on biomass conversion has focused largely on acid based pre-treatment technologies that fractionate lignin and hemicellulose from cellulose. However, for alkaline based (e.g., AFEX) and other lower severity pre-treatments it becomes critical to co-hydrolyze cellulose and hemicellulose using an optimized enzyme cocktail. Lignocellulosics are appropriate substrates to assess hydrolytic activity of enzyme mixtures compared to conventional unrealistic substrates (e.g., filter paper, chromogenic, and fluorigenic compounds) for studying synergistic hydrolysis. However, there are few, if any, high-throughput lignocellulosic digestibility analytical platforms for optimizing biomass conversion. The 96-well Biomass Conversion Research Lab (BCRL) microplate method is a high-throughput assay to study digestibility of lignocellulosic biomass as a function of biomass composition, pre-treatment severity, and enzyme composition. The most suitable method for delivering milled biomass to the microplate was through multi-pipetting slurry suspensions. A rapid bio-enzymatic, spectrophotometric assay was used to determine fermentable sugars. The entire procedure was automated using a robotic pipetting workstation. Several parameters that affect hydrolysis in the microplate were studied and optimized (i.e., particle size reduction, slurry solids concentration, glucan loading, mass transfer issues, and time period for hydrolysis). The microplate method was optimized for crystalline cellulose (Avicel) and ammonia fiber expansion (AFEX) pre-treated corn stover. Copyright 2008 Wiley Periodicals, Inc.

  2. High Throughput PBTK: Open-Source Data and Tools for ...

    EPA Pesticide Factsheets

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  3. Efficient 41Ca measurements for biomedical applications

    NASA Astrophysics Data System (ADS)

    Vockenhuber, C.; Schulze-König, T.; Synal, H.-A.; Aeberli, I.; Zimmermann, M. B.

    2015-10-01

    We present the performance of 41Ca measurements using low-energy Accelerator Mass Spectrometry (AMS) at the 500 kV facility TANDY at ETH Zurich. We optimized the measurement procedure for biomedical applications where reliability and high sample throughput are required. The main challenge for AMS measurements of 41Ca is the interfering stable isobar 41K. We use a simplified sample preparation procedure to produce calcium fluoride (CaF2) and extract calcium tri-fluoride (CaF3-) ions to suppress the stable isobar 41K. Although 41K is not completely suppressed, we reach a 41Ca/40Ca background level in the 10^-12 range, which is adequate for biomedical studies. With helium as a stripper gas we can use charge state 2+ at high transmission (∼50%). The new measurement procedure, with approximately 10-fold improved efficiency and higher accuracy due to the 41K correction, allowed us to measure more than 600 samples for a large biomedical study within only a few weeks of measurement time.

  4. Detecting and removing multiplicative spatial bias in high-throughput screening technologies.

    PubMed

    Caraus, Iurie; Mazoure, Bogdan; Nadon, Robert; Makarenkov, Vladimir

    2017-10-15

    Considerable attention has been paid recently to improve data quality in high-throughput screening (HTS) and high-content screening (HCS) technologies widely used in drug development and chemical toxicity research. However, several environmentally- and procedurally-induced spatial biases in experimental HTS and HCS screens decrease measurement accuracy, leading to increased numbers of false positives and false negatives in hit selection. Although effective bias correction methods and software have been developed over the past decades, almost all of these tools have been designed to reduce the effect of additive bias only. Here, we address the case of multiplicative spatial bias. We introduce three new statistical methods meant to reduce multiplicative spatial bias in screening technologies. We assess the performance of the methods with synthetic and real data affected by multiplicative spatial bias, including comparisons with current bias correction methods. We also describe a wider data correction protocol that integrates methods for removing both assay and plate-specific spatial biases, which can be either additive or multiplicative. The methods for removing multiplicative spatial bias and the data correction protocol are effective in detecting and cleaning experimental data generated by screening technologies. As our protocol is of a general nature, it can be used by researchers analyzing current or next-generation high-throughput screens. The AssayCorrector program, implemented in R, is available on CRAN. makarenkov.vladimir@uqam.ca. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
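
    A minimal sketch of one generic way to remove multiplicative row/column bias from a plate, assuming the bias factorizes into row and column effects: estimate the effects from medians on the log scale and divide them out. This is only an illustration of the general idea, not the algorithm implemented in AssayCorrector, and the simulated plate values are invented.

      # Generic sketch: multiplicative row/column bias on a plate can be estimated from
      # row and column medians on the log scale and divided out (equivalent to an
      # additive correction in log space).
      import numpy as np

      def remove_multiplicative_bias(plate: np.ndarray) -> np.ndarray:
          log_plate = np.log(plate)
          row_effect = np.median(log_plate, axis=1, keepdims=True)
          col_effect = np.median(log_plate, axis=0, keepdims=True)
          overall = np.median(log_plate)
          corrected = log_plate - row_effect - col_effect + overall
          return np.exp(corrected)

      rng = np.random.default_rng(0)
      plate = rng.lognormal(mean=0.0, sigma=0.1, size=(8, 12))   # 96-well layout
      plate *= np.linspace(0.8, 1.2, 8)[:, None]                 # simulated row bias
      print(remove_multiplicative_bias(plate).round(2))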

  5. A modern space simulation facility to accommodate high production acceptance testing

    NASA Technical Reports Server (NTRS)

    Glover, J. D.

    1986-01-01

    A space simulation laboratory that supports acceptance testing of spacecraft and associated subsystems at throughput rates as high as nine per year is discussed. The laboratory includes a computer-operated 27' by 30' space simulation chamber, a 20' by 20' by 20' thermal cycle chamber, and an eight-station thermal cycle/thermal vacuum test system. The design philosophy and unique features of each system are discussed. The development of operating procedures, test team requirements, test team integration, and other peripheral activation details are described. A discussion of special accommodations for the efficient utilization of the systems in support of high rate production is presented.

  6. Application of ToxCast High-Throughput Screening and ...

    EPA Pesticide Factsheets

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors

  7. High-throughput process development of an alternative platform for the production of virus-like particles in Escherichia coli.

    PubMed

    Ladd Effio, Christopher; Baumann, Pascal; Weigel, Claudia; Vormittag, Philipp; Middelberg, Anton; Hubbuch, Jürgen

    2016-02-10

    The production of safe vaccines against untreatable or new diseases has pushed research in the field of virus-like particles (VLPs). Currently, a large number of commercial VLP-based human vaccines and vaccine candidates are available or under development. A promising VLP production route is the controlled in vitro assembly of virus proteins into capsids. In the study reported here, a high-throughput screening (HTS) procedure was implemented for the upstream process development of a VLP platform in bacterial cell systems. Miniaturized cultivations were carried out in 48-well format in the BioLector system (m2p-Labs, Germany) using an Escherichia coli strain with a tac promoter producing the murine polyomavirus capsid protein (VP1). The screening procedure incorporated micro-scale cultivations, HTS cell disruption by sonication and HTS-compatible analytics by capillary gel electrophoresis. Cultivation temperatures, shaking speeds, induction and medium conditions were varied to optimize the product expression in E. coli. The most efficient system was selected based on an evaluation of soluble and insoluble product concentrations as well as on the percentage of product in the total soluble protein fraction. The optimized system was scaled up to a 2.5 L shake-flask cultivation and the product was purified using an anion exchange chromatography membrane adsorber, followed by a size exclusion chromatography polishing procedure. For proof of concept, purified VP1 capsomeres were assembled under defined buffer conditions into empty capsids and characterized using transmission electron microscopy (TEM). The presented HTS procedure allowed for a fast development of an efficient production process of VLPs in E. coli. Under optimized cultivation conditions, the VP1 product totalled up to 43% of the total soluble protein fraction, yielding 1.63 mg VP1 per mL of applied cultivation medium. The developed production process strongly promotes the murine polyoma-VLP platform, moving towards an industrially feasible technology for new chimeric vaccines. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    PubMed

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  9. High Throughput Screening For Hazard and Risk of Environmental Contaminants

    EPA Science Inventory

    High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...

  10. Potential of dynamically harmonized Fourier transform ion cyclotron resonance cell for high-throughput metabolomics fingerprinting: control of data quality.

    PubMed

    Habchi, Baninia; Alves, Sandra; Jouan-Rimbaud Bouveresse, Delphine; Appenzeller, Brice; Paris, Alain; Rutledge, Douglas N; Rathahao-Paris, Estelle

    2018-01-01

    Due to the presence of pollutants in the environment and food, the assessment of human exposure is required. This necessitates high-throughput approaches enabling large-scale analysis and, as a consequence, the use of high-performance analytical instruments to obtain highly informative metabolomic profiles. In this study, direct introduction mass spectrometry (DIMS) was performed using a Fourier transform ion cyclotron resonance (FT-ICR) instrument equipped with a dynamically harmonized cell. Data quality was evaluated based on mass resolving power (RP), mass measurement accuracy, and ion intensity drifts from the repeated injections of a quality control sample (QC) along the analytical process. The large DIMS data size entails the use of bioinformatic tools for the automatic selection of common ions found in all QC injections and for robustness assessment and correction of eventual technical drifts. RP values greater than 10^6 and mass measurement accuracy better than 1 ppm were obtained using broadband mode, resulting in the detection of isotopic fine structure. Hence, a very accurate relative isotopic mass defect (RΔm) value was calculated. This significantly reduces the number of elemental composition (EC) candidates and greatly improves compound annotation. A very satisfactory estimate of repeatability of both peak intensity and mass measurement was demonstrated. Although a non-negligible ion intensity drift was observed for negative ion mode data, a normalization procedure was easily applied to correct this phenomenon. This study illustrates the performance and robustness of the dynamically harmonized FT-ICR cell to perform large-scale high-throughput metabolomic analyses in routine conditions. Graphical abstract: Analytical performance of an FT-ICR instrument equipped with a dynamically harmonized cell.
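
    The drift correction mentioned above can be illustrated with a generic QC-anchored normalization: fit the intensity trend of a feature across the repeated QC injections and divide every measurement by the trend interpolated at its injection order. The sketch below is an assumption about one common way to do this, not the authors' exact procedure, and the values are made up.

      # Hedged sketch of QC-based drift correction: fit a linear intensity trend on the
      # QC injections only, then rescale every sample by the trend at its injection order.
      import numpy as np

      def qc_drift_correct(intensities, injection_order, qc_mask):
          intensities = np.asarray(intensities, dtype=float)
          order = np.asarray(injection_order, dtype=float)
          qc_mask = np.asarray(qc_mask, dtype=bool)
          slope, intercept = np.polyfit(order[qc_mask], intensities[qc_mask], deg=1)
          trend = slope * order + intercept
          return intensities / trend * np.median(intensities[qc_mask])

      signal = [100, 98, 97, 95, 93, 92, 90, 88]   # one feature, drifting downward
      order = [1, 2, 3, 4, 5, 6, 7, 8]
      qc = [True, False, False, True, False, False, False, True]
      print(np.round(qc_drift_correct(signal, order, qc), 1))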

  11. Rapid one-step recombinational cloning

    PubMed Central

    Fu, Changlin; Wehr, Daniel R.; Edwards, Janice; Hauge, Brian

    2008-01-01

    As an increasing number of genes and open reading frames of unknown function are discovered, expression of the encoded proteins is critical toward establishing function. Accordingly, there is an increased need for highly efficient, high-fidelity methods for directional cloning. Among the available methods, site-specific recombination-based cloning techniques, which eliminate the use of restriction endonucleases and ligase, have been widely used for high-throughput (HTP) procedures. We have developed a recombination cloning method, which uses truncated recombination sites to clone PCR products directly into destination/expression vectors, thereby bypassing the requirement for first producing an entry clone. Cloning efficiencies in excess of 80% are obtained providing a highly efficient method for directional HTP cloning. PMID:18424799

  12. High Throughput Transcriptomics: From screening to pathways

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  13. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    NASA Astrophysics Data System (ADS)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.

  14. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations.

    PubMed

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.

  15. Chapter 17 Sterile Plate-Based Vitrification of Adherent Human Pluripotent Stem Cells and Their Derivatives Using the TWIST Method.

    PubMed

    Neubauer, Julia C; Stracke, Frank; Zimmermann, Heiko

    2017-01-01

    Due to their high biological complexity, e.g., their close cell-to-cell contacts, cryopreservation of human pluripotent stem cells with standard slow-rate protocols is often inefficient and can hardly be standardized. Vitrification, i.e., ultrafast freezing, has already shown very good viability and recovery rates for this sensitive cell system, but it is only applicable to low cell numbers, bears a high risk of contamination, and can hardly be implemented under GxP regulations. In this chapter, a sterile plate-based vitrification method for adherent pluripotent stem cells and their derivatives is presented, based on a procedure and device for human embryonic stem cells developed by Beier et al. (Cryobiology 66:8-16, 2013). This protocol overcomes the limitations of conventional vitrification procedures, resulting in the highly efficient preservation of ready-to-use adherent pluripotent stem cells, with the possibility of vitrifying cells in multi-well formats for direct application in high-throughput screenings.

  16. High Throughput Experimental Materials Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakutayev, Andriy; Perkins, John; Schwarting, Marcus

    The mission of the High Throughput Experimental Materials Database (HTEM DB) is to enable discovery of new materials with useful properties by releasing large amounts of high-quality experimental data to the public. The HTEM DB contains information about materials obtained from high-throughput experiments at the National Renewable Energy Laboratory (NREL).

  17. An LC-MS/MS method for rapid and sensitive high-throughput simultaneous determination of various protein kinase inhibitors in human plasma.

    PubMed

    Abdelhameed, Ali S; Attwa, Mohamed W; Kadi, Adnan A

    2017-02-01

    A reliable, high-throughput and sensitive LC-MS/MS procedure was developed and validated for the determination of five tyrosine kinase inhibitors in human plasma. Following their extraction from human plasma, samples were eluted on an RP Luna®-PFP 100 Å column using a mobile phase composed of acetonitrile and 0.01 M ammonium formate in water (pH ~4.1) at a ratio of 50:50 (v/v), flowing at 0.3 mL min^-1. The mass spectrometer was operated with electrospray ionization in the positive ion multiple reaction monitoring mode. The proposed methodology resulted in linear calibration plots with correlation coefficient values of r^2 = 0.9995-0.9999 over concentration ranges of 2.5-100 ng mL^-1 for imatinib, 5.0-100 ng mL^-1 for sorafenib, tofacitinib and afatinib, and 1.0-100 ng mL^-1 for cabozantinib. The procedure was validated in terms of its specificity, limit of detection (0.32-1.71 ng mL^-1), lower limit of quantification (0.97-5.07 ng mL^-1), intra- and inter-assay accuracy (-3.83 to +2.40%) and precision (<3.37%), matrix effect, recovery, and stability. Our results demonstrated that the proposed method is highly reliable for routine quantification of the investigated tyrosine kinase inhibitors in human plasma and can be efficiently applied in the rapid and sensitive analysis of their clinical samples. Copyright © 2016 John Wiley & Sons, Ltd.
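
    The calibration statistics quoted above (r^2 values and back-calculated concentrations) come from an ordinary linear calibration of response against nominal concentration. The sketch below shows that calculation on made-up data; it is an illustration of the general calculation, not the validated method itself.

      # Illustrative calibration sketch (made-up data): fit a linear calibration of
      # analyte/IS peak-area ratio vs. nominal concentration, report r^2, and
      # back-calculate an unknown from its measured response.
      import numpy as np

      nominal = np.array([2.5, 5, 10, 25, 50, 100])                   # ng/mL
      response = np.array([0.051, 0.099, 0.205, 0.498, 1.02, 1.99])   # area ratio

      slope, intercept = np.polyfit(nominal, response, deg=1)
      predicted = slope * nominal + intercept
      ss_res = np.sum((response - predicted) ** 2)
      ss_tot = np.sum((response - response.mean()) ** 2)
      r_squared = 1 - ss_res / ss_tot

      unknown_response = 0.75
      back_calculated = (unknown_response - intercept) / slope
      print(f"r^2 = {r_squared:.4f}, unknown ≈ {back_calculated:.1f} ng/mL")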

  18. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality.

    PubMed

    Albert, Océane; Reintsch, Wolfgang E; Chan, Peter; Robaire, Bernard

    2016-05-01

    Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3-5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi-manual analysis software. Using this method, a cross-sectional study on 123 men showed no significant correlation between sperm concentration and sperm DNA damage, confirming the existence of hidden chromatin damage in men with apparently normal semen characteristics, and a significant correlation between percentage DNA in the tail and percentage of progressively motile spermatozoa. Finally, the use of DNA damage profiles helped to distinguish subjects between and within sperm concentration categories, and allowed a determination of the proportion of highly damaged cells. The main limitations of the HT-COMET are the high, yet indispensable, investment in an automated liquid handling system and heating block to ensure accuracy, and the availability of an automated plate reading microscope and analysis software. This standardized HT-COMET assay offers many advantages, including higher accuracy and evenness due to automation of sensitive steps, a 14.4-fold increase in sample analysis capacity, and an imaging and scoring time of 1 min/well. Overall, HT-COMET offers a decrease in total experimental time of more than 90%. Hence, this assay constitutes a more efficient option to assess sperm chromatin quality, paves the way to using this assay to screen large cohorts, and holds prognostic value for infertile patients. Funded by the CIHR Institute of Human Development, Child and Youth Health (IHDCYH; RHF 100625). O.A. is a fellow supported by the Fonds de la Recherche du Québec - Santé (FRQS) and the CIHR Training Program in Reproduction, Early Development, and the Impact on Health (REDIH). B.R. is a James McGill Professor. The authors declare no conflicts of interest. © The Author 2016. 
Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  20. Evaluation of Sequencing Approaches for High-Throughput Transcriptomics - (BOSC)

    EPA Science Inventory

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. The generation of high-throughput global gene expression...

  1. Comparative Transcriptomes and EVO-DEVO Studies Depending on Next Generation Sequencing.

    PubMed

    Liu, Tiancheng; Yu, Lin; Liu, Lei; Li, Hong; Li, Yixue

    2015-01-01

    High-throughput technology has driven the progress of omics studies, including genomics and transcriptomics. We review improvements in comparative omics studies that are attributable to high-throughput measurement by next-generation sequencing technology. Comparative genomics has been successfully applied to evolutionary analysis, while comparative transcriptomics is used to compare the expression profiles of two subjects by differential expression or differential coexpression, which enables its application in evolutionary developmental biology (EVO-DEVO) studies. EVO-DEVO studies focus on the evolutionary pressures affecting the morphogenesis of development, and previous work has sought to identify the most conserved stages of embryonic development. Older studies assessed similarity at the macroscopic level of morphology, whereas new technology enables detection of similarity at the level of molecular mechanism. Evolutionary models of embryo development, including the "funnel-like" model and the "hourglass" model, have been evaluated by combining these new comparative transcriptomic methods with prior comparative genomic information. Although the technology has brought EVO-DEVO studies into a new era, technological and material limitations still exist, and further investigations require more careful study design and procedures.

  2. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of the target analytes quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of MEPS technique, including the role of sample preparation in bioanalysis, the MEPS description namely MEPS formats (on- and off-line), sorbents, experimental and protocols, factors that affect the MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize MEPS recent applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
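
    As a minimal example of one exploratory technique covered by such reviews, the sketch below applies principal component analysis to a tiny mock sample-by-taxon abundance table; scikit-learn is assumed to be available, and the data are invented.

      # Minimal PCA example on a mock sample-by-taxon abundance table.
      import numpy as np
      from sklearn.decomposition import PCA

      abundances = np.array([   # rows: samples, columns: taxa (mock counts)
          [120,  30,  5,  0],
          [110,  25, 10,  2],
          [ 10, 200, 80, 40],
          [  8, 180, 90, 55],
      ])
      relative = abundances / abundances.sum(axis=1, keepdims=True)   # simple normalization
      pca = PCA(n_components=2)
      scores = pca.fit_transform(relative)
      print("explained variance:", np.round(pca.explained_variance_ratio_, 2))
      print("sample scores:\n", np.round(scores, 3))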

  4. Vaccinomics, adversomics, and the immune response network theory: Individualized vaccinology in the 21st century

    PubMed Central

    Poland, Gregory A.; Kennedy, Richard B.; McKinney, Brett A.; Ovsyannikova, Inna G.; Lambert, Nathaniel D.; Jacobson, Robert M.; Oberg, Ann L.

    2013-01-01

    Vaccines, like drugs and medical procedures, are increasingly amenable to individualization or personalization, often based on novel data resulting from high throughput “omics” technologies. As a result of these technologies, 21st century vaccinology will increasingly see the abandonment of a “one size fits all” approach to vaccine dosing and delivery, as well as the abandonment of the empiric “isolate–inactivate–inject” paradigm for vaccine development. In this review, we discuss the immune response network theory and its application to the new field of vaccinomics and adversomics, and illustrate how vaccinomics can lead to new vaccine candidates, new understandings of how vaccines stimulate immune responses, new biomarkers for vaccine response, and facilitate the understanding of what genetic and other factors might be responsible for rare side effects due to vaccines. Perhaps most exciting will be the ability, at a systems biology level, to integrate increasingly complex high throughput data into descriptive and predictive equations for immune responses to vaccines. Herein, we discuss the above with a view toward the future of vaccinology. PMID:23755893

  5. Comparative Microbial Modules Resource: Generation and Visualization of Multi-species Biclusters

    PubMed Central

    Bate, Ashley; Eichenberger, Patrick; Bonneau, Richard

    2011-01-01

    The increasing abundance of large-scale, high-throughput datasets for many closely related organisms provides opportunities for comparative analysis via the simultaneous biclustering of datasets from multiple species. These analyses require a reformulation of how to organize multi-species datasets and visualize comparative genomics data analyses results. Recently, we developed a method, multi-species cMonkey, which integrates heterogeneous high-throughput datatypes from multiple species to identify conserved regulatory modules. Here we present an integrated data visualization system, built upon the Gaggle, enabling exploration of our method's results (available at http://meatwad.bio.nyu.edu/cmmr.html). The system can also be used to explore other comparative genomics datasets and outputs from other data analysis procedures – results from other multiple-species clustering programs or from independent clustering of different single-species datasets. We provide an example use of our system for two bacteria, Escherichia coli and Salmonella Typhimurium. We illustrate the use of our system by exploring conserved biclusters involved in nitrogen metabolism, uncovering a putative function for yjjI, a currently uncharacterized gene that we predict to be involved in nitrogen assimilation. PMID:22144874

  6. Comparative microbial modules resource: generation and visualization of multi-species biclusters.

    PubMed

    Kacmarczyk, Thadeous; Waltman, Peter; Bate, Ashley; Eichenberger, Patrick; Bonneau, Richard

    2011-12-01

    The increasing abundance of large-scale, high-throughput datasets for many closely related organisms provides opportunities for comparative analysis via the simultaneous biclustering of datasets from multiple species. These analyses require a reformulation of how to organize multi-species datasets and visualize comparative genomics data analyses results. Recently, we developed a method, multi-species cMonkey, which integrates heterogeneous high-throughput datatypes from multiple species to identify conserved regulatory modules. Here we present an integrated data visualization system, built upon the Gaggle, enabling exploration of our method's results (available at http://meatwad.bio.nyu.edu/cmmr.html). The system can also be used to explore other comparative genomics datasets and outputs from other data analysis procedures - results from other multiple-species clustering programs or from independent clustering of different single-species datasets. We provide an example use of our system for two bacteria, Escherichia coli and Salmonella Typhimurium. We illustrate the use of our system by exploring conserved biclusters involved in nitrogen metabolism, uncovering a putative function for yjjI, a currently uncharacterized gene that we predict to be involved in nitrogen assimilation. © 2011 Kacmarczyk et al.

  7. Combining high-throughput MALDI-TOF mass spectrometry and isoelectric focusing gel electrophoresis for virtual 2D gel-based proteomics.

    PubMed

    Lohnes, Karen; Quebbemann, Neil R; Liu, Kate; Kobzeff, Fred; Loo, Joseph A; Ogorzalek Loo, Rachel R

    2016-07-15

    The virtual two-dimensional gel electrophoresis/mass spectrometry (virtual 2D gel/MS) technology combines the premier, high-resolution capabilities of 2D gel electrophoresis with the sensitivity and high mass accuracy of mass spectrometry (MS). Intact proteins separated by isoelectric focusing (IEF) gel electrophoresis are imaged from immobilized pH gradient (IPG) polyacrylamide gels (the first dimension of classic 2D-PAGE) by matrix-assisted laser desorption/ionization (MALDI) MS. Obtaining accurate intact masses from sub-picomole-level proteins embedded in 2D-PAGE gels or in IPG strips is desirable to elucidate how the protein of one spot identified as protein 'A' on a 2D gel differs from the protein of another spot identified as the same protein, whenever tryptic peptide maps fail to resolve the issue. This task, however, has been extremely challenging. Virtual 2D gel/MS provides access to these intact masses. Modifications to our matrix deposition procedure improve the reliability with which IPG gels can be prepared; the new procedure is described. Development of this MALDI MS imaging (MSI) method for high-throughput MS with integrated 'top-down' MS to elucidate protein isoforms from complex biological samples is described and it is demonstrated that a 4-cm IPG gel segment can now be imaged in approximately 5min. Gel-wide chemical and enzymatic methods with further interrogation by MALDI MS/MS provide identifications, sequence-related information, and post-translational/transcriptional modification information. The MSI-based virtual 2D gel/MS platform may potentially link the benefits of 'top-down' and 'bottom-up' proteomics. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. A high-efficiency real-time digital signal averager for time-of-flight mass spectrometry.

    PubMed

    Wang, Yinan; Xu, Hui; Li, Qingjiang; Li, Nan; Huang, Zhengxu; Zhou, Zhen; Liu, Husheng; Sun, Zhaolin; Xu, Xin; Yu, Hongqi; Liu, Haijun; Li, David D-U; Wang, Xi; Dong, Xiuzhen; Gao, Wei

    2013-05-30

    Analog-to-digital converter (ADC)-based acquisition systems are widely applied in time-of-flight mass spectrometers (TOFMS) due to their ability to record the signal intensity of all ions within the same pulse. However, the acquisition system raises the requirement for data throughput, along with increasing the conversion rate and resolution of the ADC. It is therefore of considerable interest to develop a high-performance real-time acquisition system, which can relieve the limitation of data throughput. We present in this work a high-efficiency real-time digital signal averager, consisting of a signal conditioner, a data conversion module and a signal processing module. Two optimization strategies are implemented using field programmable gate arrays (FPGAs) to enhance the efficiency of the real-time processing. A pipeline procedure is used to reduce the time consumption of the accumulation strategy. To realize continuous data transfer, a high-efficiency transmission strategy is developed, based on a ping-pong procedure. The digital signal averager features good responsiveness, analog bandwidth and dynamic performance. The optimal effective number of bits reaches 6.7 bits. For a 32 µs record length, the averager can realize 100% efficiency with an extraction frequency below 31.23 kHz by modifying the number of accumulation steps. In unit time, the averager yields superior signal-to-noise ratio (SNR) compared with data accumulation in a computer. The digital signal averager is combined with a vacuum ultraviolet single-photon ionization time-of-flight mass spectrometer (VUV-SPI-TOFMS). The efficiency of the real-time processing is tested by analyzing the volatile organic compounds (VOCs) from ordinary printed materials. In these experiments, 22 kinds of compounds are detected, and the dynamic range exceeds 3 orders of magnitude. Copyright © 2013 John Wiley & Sons, Ltd.
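
    The benefit of on-board accumulation can be seen from the usual square-root law: averaging N repeated transients improves the signal-to-noise ratio by roughly sqrt(N) when the noise is uncorrelated. The toy Python sketch below demonstrates this principle; it is an illustration only, not the FPGA firmware described in the record.

      # Toy demonstration: accumulating N repeated transients improves SNR roughly as
      # sqrt(N) for uncorrelated noise, which is the payoff of real-time averaging.
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 2000)
      peak = np.exp(-((t - 0.5) ** 2) / (2 * 0.001))        # ideal TOF-like peak

      def snr_after_averaging(n_transients: int) -> float:
          acc = np.zeros_like(t)
          for _ in range(n_transients):
              acc += peak + rng.normal(scale=0.5, size=t.size)
          avg = acc / n_transients
          return avg.max() / avg[:500].std()                # peak height / baseline noise

      for n in (1, 16, 256):
          print(f"N = {n:4d}  SNR ≈ {snr_after_averaging(n):6.1f}")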

  9. High Throughput Determination of Critical Human Dosing Parameters (SOT)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...

  10. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  11. Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos

    EPA Science Inventory

    Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...

  12. Large scale validation of an efficient CRISPR/Cas-based multi gene editing protocol in Escherichia coli.

    PubMed

    Zerbini, Francesca; Zanella, Ilaria; Fraccascia, Davide; König, Enrico; Irene, Carmela; Frattini, Luca F; Tomasi, Michele; Fantappiè, Laura; Ganfini, Luisa; Caproni, Elena; Parri, Matteo; Grandi, Alberto; Grandi, Guido

    2017-04-24

    The exploitation of the CRISPR/Cas9 machinery coupled to lambda (λ) recombinase-mediated homologous recombination (recombineering) is becoming the method of choice for genome editing in E. coli. First proposed by Jiang and co-workers, the strategy has been subsequently fine-tuned by several authors who demonstrated, by using few selected loci, that the efficiency of mutagenesis (number of mutant colonies over total number of colonies analyzed) can be extremely high (up to 100%). However, from published data it is difficult to appreciate the robustness of the technology, defined as the number of successfully mutated loci over the total number of targeted loci. This information is particularly relevant in high-throughput genome editing, where repetition of experiments to rescue missing mutants would be impractical. This work describes a "brute force" validation activity, which culminated in the definition of a robust, simple and rapid protocol for single or multiple gene deletions. We first set up our own version of the CRISPR/Cas9 protocol and then we evaluated the mutagenesis efficiency by changing different parameters including sequence of guide RNAs, length and concentration of donor DNAs, and use of single stranded and double stranded donor DNAs. We then validated the optimized conditions targeting 78 "dispensable" genes. This work led to the definition of a protocol, featuring the use of double stranded synthetic donor DNAs, which guarantees mutagenesis efficiencies consistently higher than 10% and a robustness of 100%. The procedure can be applied also for simultaneous gene deletions. This work defines for the first time the robustness of a CRISPR/Cas9-based protocol based on a large sample size. Since the technical solutions here proposed can be applied to other similar procedures, the data could be of general interest for the scientific community working on bacterial genome editing and, in particular, for those involved in synthetic biology projects requiring high throughput procedures.
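
    The two metrics defined above, efficiency (mutant colonies over colonies screened, per locus) and robustness (successfully mutated loci over targeted loci), are simple ratios; the sketch below computes both from hypothetical colony-screening counts. The locus names and numbers are invented.

      # Sketch of the two metrics defined in the record, on made-up screening counts:
      # efficiency = mutant colonies / colonies screened (per locus);
      # robustness = loci with at least one confirmed mutant / loci targeted.
      def efficiency(mutant_colonies: int, screened_colonies: int) -> float:
          return mutant_colonies / screened_colonies

      def robustness(per_locus_mutant_counts) -> float:
          successful = sum(1 for n in per_locus_mutant_counts if n > 0)
          return successful / len(per_locus_mutant_counts)

      screen = {"lacZ": (3, 10), "araB": (8, 10), "galK": (1, 10)}   # hypothetical loci
      for locus, (mut, total) in screen.items():
          print(f"{locus}: efficiency {efficiency(mut, total):.0%}")
      print(f"robustness: {robustness([mut for mut, _ in screen.values()]):.0%}")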

  13. Machine vision for digital microfluidics

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun; Lee, Jeong-Bong

    2010-01-01

    Machine vision is widely used in an industrial environment today. It can perform various tasks, such as inspecting and controlling production processes, that may require humanlike intelligence. The importance of imaging technology for biological research or medical diagnosis is greater than ever. For example, fluorescent reporter imaging enables scientists to study the dynamics of gene networks with high spatial and temporal resolution. Such high-throughput imaging is increasingly demanding the use of machine vision for real-time analysis and control. Digital microfluidics is a relatively new technology with expectations of becoming a true lab-on-a-chip platform. Utilizing digital microfluidics, only small amounts of biological samples are required and the experimental procedures can be automatically controlled. There is a strong need for the development of a digital microfluidics system integrated with machine vision for innovative biological research today. In this paper, we show how machine vision can be applied to digital microfluidics by demonstrating two applications: machine vision-based measurement of the kinetics of biomolecular interactions and machine vision-based droplet motion control. It is expected that digital microfluidics-based machine vision system will add intelligence and automation to high-throughput biological imaging in the future.

  14. Microbial dynamics in mixed culture biofilms of bacteria surviving sanitation of conveyor belts in salmon-processing plants.

    PubMed

    Langsrud, S; Moen, B; Møretrø, T; Løype, M; Heir, E

    2016-02-01

    The microbiota surviving sanitation of salmon-processing conveyor belts was identified and its growth dynamics further investigated in a model mimicking processing surfaces in such plants. A diverse microbiota dominated by Gram-negative bacteria was isolated after regular sanitation in three salmon processing plants. A cocktail of 14 bacterial isolates representing all genera isolated from conveyor belts (Listeria, Pseudomonas, Stenotrophomonas, Brochothrix, Serratia, Acinetobacter, Rhodococcus and Chryseobacterium) formed stable biofilms of about 10^9 CFU cm^-2 on steel coupons (12°C, salmon broth) after 2 days. High-throughput sequencing showed that Listeria monocytogenes represented 0.1-0.01% of the biofilm population and that Pseudomonas spp. dominated. Interestingly, both Brochothrix sp. and a Pseudomonas sp. dominated in the surrounding suspension. The microbiota surviving sanitation is dominated by Pseudomonas spp. The background microbiota in biofilms inhibit, but do not eliminate, L. monocytogenes. The results highlight that sanitation procedures have to be improved in the salmon-processing industry, as high numbers of a diverse microbiota survived practical sanitation. High-throughput sequencing enables strain-level studies of population dynamics in biofilms. © 2015 The Society for Applied Microbiology.

  15. Schinus terebinthifolius scale-up countercurrent chromatography (Part I): High performance countercurrent chromatography fractionation of triterpene acids with off-line detection using atmospheric pressure chemical ionization mass spectrometry.

    PubMed

    Vieira, Mariana Neves; Costa, Fernanda das Neves; Leitão, Gilda Guimarães; Garrard, Ian; Hewitson, Peter; Ignatova, Svetlana; Winterhalter, Peter; Jerz, Gerold

    2015-04-10

    'Countercurrent chromatography' (CCC) is an ideal technique for the recovery, purification and isolation of bioactive natural products, due to the liquid nature of the stationary phase, process predictability and the possibility of scale-up from analytical to preparative scale. In this work, a method developed for the fractionation of Schinus terebinthifolius Raddi berries dichloromethane extract was thoroughly optimized to achieve maximal throughput with minimal solvent and time consumption per gram of processed crude extract, using analytical, semi-preparative and preparative 'high performance countercurrent chromatography' (HPCCC) instruments. The method using the biphasic solvent system composed of n-heptane-ethyl acetate-methanol-water (6:1:6:1, v/v/v/v) was volumetrically scaled up to increase sample throughput up to 120 times, while maintaining separation efficiency and time. As a fast and specific detection alternative, the fractions collected from the CCC-separations were injected to an 'atmospheric pressure chemical ionization mass-spectrometer' (APCI-MS/MS) and reconstituted molecular weight MS-chromatograms of the APCI-ionizable compounds from S. terebinthifolius were obtained. This procedure led to the direct isolation of tirucallane type triterpenes such as masticadienonic and 3β-masticadienolic acids. Also oleanonic and moronic acids have been identified for the first time in the species. In summary, this approach can be used for other CCC scale-up processes, enabling MS-target-guided isolation procedures. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Optimization and quality control of genome-wide Hi-C library preparation.

    PubMed

    Zhang, Xiang-Yuan; He, Chao; Ye, Bing-Yu; Xie, De-Jian; Shi, Ming-Lei; Zhang, Yan; Shen, Wen-Long; Li, Ping; Zhao, Zhi-Hu

    2017-09-20

    High-throughput chromosome conformation capture (Hi-C) is one of the key assays for genome-wide chromatin interaction studies. It is a time-consuming process that involves many steps and many different kinds of reagents, consumables, and equipment. At present, its reproducibility is unsatisfactory. By optimizing the key steps of the Hi-C experiment, such as crosslinking, pretreatment for digestion, inactivation of the restriction enzyme, and in situ ligation, we established a robust Hi-C procedure and prepared two biological replicates of Hi-C libraries from GM12878 cells. After preliminary quality control by Sanger sequencing, the two replicates were high-throughput sequenced. Bioinformatics analysis of the raw sequencing data revealed that the mappability and pair-mate rate of the raw data were around 90% and 72%, respectively. Additionally, after removal of self-circular ligations and dangling-end products, a valid-pair fraction of more than 96% was reached. Genome-wide interactome profiling shows clear topologically associated domains (TADs), which is consistent with previous reports. Further correlation analysis showed that the two biological replicates strongly correlate with each other in terms of both bin coverage and all bin pairs. All these results indicate that the optimized Hi-C procedure is robust and stable, which will be very helpful for wide application of the Hi-C assay.
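
    The quality-control fractions quoted above (mapping rate, pair-mate rate, and the valid-pair fraction after removing self-circular ligations and dangling ends) can be computed from classified read-pair counts as in the simplified sketch below; the categories and counts are illustrative, not the output of a specific Hi-C pipeline.

      # Hedged sketch (simplified categories): compute Hi-C QC fractions from counts
      # of classified read pairs.
      def hic_qc(total_reads, mapped_reads, paired_reads, self_circle, dangling_end, valid_pairs):
          mapping_rate = mapped_reads / total_reads
          pair_mate_rate = paired_reads / total_reads
          informative = paired_reads - self_circle - dangling_end
          valid_fraction = valid_pairs / informative
          return mapping_rate, pair_mate_rate, valid_fraction

      m, p, v = hic_qc(total_reads=100_000_000, mapped_reads=90_000_000,
                       paired_reads=72_000_000, self_circle=1_500_000,
                       dangling_end=1_200_000, valid_pairs=67_000_000)
      print(f"mapping {m:.0%}, pair-mate {p:.0%}, valid pairs {v:.0%}")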

  17. New insights into old methods for identifying causal rare variants.

    PubMed

    Wang, Haitian; Huang, Chien-Hsun; Lo, Shaw-Hwa; Zheng, Tian; Hu, Inchi

    2011-11-29

    The advance of high-throughput next-generation sequencing technology makes possible the analysis of rare variants. However, the investigation of rare variants in unrelated-individuals data sets faces the challenge of low power, and most methods circumvent the difficulty by using various collapsing procedures based on genes, pathways, or gene clusters. We suggest a new way to identify causal rare variants using the F-statistic and sliced inverse regression. The procedure is tested on the data set provided by the Genetic Analysis Workshop 17 (GAW17). After preliminary data reduction, we ranked markers according to their F-statistic values. Top-ranked markers were then subjected to sliced inverse regression, and those with higher absolute coefficients in the most significant sliced inverse regression direction were selected. The procedure yields good false discovery rates for the GAW17 data and thus is a promising method for future study on rare variants.
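
    The first stage of the procedure described above, ranking markers by an F statistic, can be sketched as a one-way ANOVA of the trait across genotype groups; the sliced inverse regression step is omitted here, and the simulated data and library calls (scipy) are illustrative rather than the authors' implementation.

      # Hedged sketch of the F-statistic ranking stage only: score each marker by a
      # one-way ANOVA of the trait across its genotype groups, then rank.
      import numpy as np
      from scipy.stats import f_oneway

      rng = np.random.default_rng(2)
      n_individuals, n_markers = 200, 50
      genotypes = rng.integers(0, 3, size=(n_individuals, n_markers))   # 0/1/2 copies
      trait = rng.normal(size=n_individuals) + 0.8 * genotypes[:, 7]    # marker 7 is causal

      def marker_f(geno_column, trait):
          groups = [trait[geno_column == g] for g in (0, 1, 2) if np.any(geno_column == g)]
          return f_oneway(*groups).statistic

      f_values = np.array([marker_f(genotypes[:, j], trait) for j in range(n_markers)])
      top = np.argsort(f_values)[::-1][:5]
      print("top-ranked markers:", top, "F =", np.round(f_values[top], 1))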

  18. A Target-Based High Throughput Screen Yields Trypanosoma brucei Hexokinase Small Molecule Inhibitors with Antiparasitic Activity

    PubMed Central

    Sharlow, Elizabeth R.; Lyda, Todd A.; Dodson, Heidi C.; Mustata, Gabriela; Morris, Meredith T.; Leimgruber, Stephanie S.; Lee, Kuo-Hsiung; Kashiwada, Yoshiki; Close, David; Lazo, John S.; Morris, James C.

    2010-01-01

    Background: The parasitic protozoan Trypanosoma brucei utilizes glycolysis exclusively for ATP production during infection of the mammalian host. The first step in this metabolic pathway is mediated by hexokinase (TbHK), an enzyme essential to the parasite that transfers the γ-phosphate of ATP to a hexose. Here we describe the identification and confirmation of novel small molecule inhibitors of bacterially expressed TbHK1, one of two TbHKs expressed by T. brucei, using a high throughput screening assay. Methodology/Principal Findings: Exploiting optimized high throughput screening assay procedures, we interrogated 220,233 unique compounds and identified 239 active compounds from which ten small molecules were further characterized. Computational chemical cluster analyses indicated that six compounds were structurally related while the remaining four compounds were classified as unrelated or singletons. All ten compounds were ∼20-17,000-fold more potent than lonidamine, a previously identified TbHK1 inhibitor. Seven compounds inhibited T. brucei blood stage form parasite growth (0.03≤EC50<3 µM) with parasite specificity of the compounds being demonstrated using insect stage T. brucei parasites, Leishmania promastigotes, and mammalian cell lines. Analysis of two structurally related compounds, ebselen and SID 17387000, revealed that both were mixed inhibitors of TbHK1 with respect to ATP. Additionally, both compounds inhibited parasite lysate-derived HK activity. None of the compounds displayed structural similarity to known hexokinase inhibitors or human African trypanosomiasis therapeutics. Conclusions/Significance: The novel chemotypes identified here could represent leads for future therapeutic development against the African trypanosome. PMID:20405000

  19. FOREWORD: Focus on Combinatorial Materials Science Focus on Combinatorial Materials Science

    NASA Astrophysics Data System (ADS)

    Chikyo, Toyohiro

    2011-10-01

    About 15 years have passed since the introduction of modern combinatorial synthesis and high-throughput techniques for the development of novel inorganic materials; however, similar methods existed before. The most famous was reported in 1970 by Hanak, who prepared composition-spread films of metal alloys by sputtering mixed-material targets. Although this method was innovative, it was rarely used because of the large amount of data to be processed. This problem is solved in modern combinatorial materials research, which is strongly related to computer data analysis and robotics. This field is still at a developing stage and may be enriched by new methods. Nevertheless, given the progress in measurement equipment and procedures, we believe the combinatorial approach will become a major and standard tool of materials screening and development. The first article of this journal, published in 2000, was titled 'Combinatorial solid state materials science and technology', and this focus issue aims to reintroduce this topic to the Science and Technology of Advanced Materials audience. It covers recent progress in combinatorial materials research, describing new results in catalysis, phosphors, polymers and metal alloys for shape memory materials. Sophisticated high-throughput characterization schemes and innovative synthesis tools are also presented, such as spray deposition using nanoparticles or ion plating. On a technical note, data handling systems are introduced to familiarize researchers with the combinatorial methodology. We hope that through this focus issue a wide audience of materials scientists can learn about recent and future trends in combinatorial materials science and high-throughput experimentation.

  20. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    PubMed

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Spatial tuning of acoustofluidic pressure nodes by altering net sonic velocity enables high-throughput, efficient cell sorting

    DOE PAGES

    Jung, Seung-Yong; Notton, Timothy; Fong, Erika; ...

    2015-01-07

    Particle sorting using acoustofluidics has enormous potential but widespread adoption has been limited by complex device designs and low throughput. Here, we report high-throughput separation of particles and T lymphocytes (600 μL min(-1)) by altering the net sonic velocity to reposition acoustic pressure nodes in a simple two-channel device. Finally, the approach is generalizable to other microfluidic platforms for rapid, high-throughput analysis.

  2. Quantitative description on structure–property relationships of Li-ion battery materials for high-throughput computations

    PubMed Central

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only bulk capacity, voltage, and volume change were considered. It is important to include more structure–property relationships, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure–property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure–property relationships, a possible high-throughput computational screening flow path is proposed for obtaining high-performance battery materials. PMID:28458737

  3. Improved False Discovery Rate Estimation Procedure for Shotgun Proteomics.

    PubMed

    Keich, Uri; Kertesz-Farkas, Attila; Noble, William Stafford

    2015-08-07

    Interpreting the potentially vast number of hypotheses generated by a shotgun proteomics experiment requires a valid and accurate procedure for assigning statistical confidence estimates to identified tandem mass spectra. Despite the crucial role such procedures play in most high-throughput proteomics experiments, the scientific literature has not reached a consensus about the best confidence estimation methodology. In this work, we evaluate, using theoretical and empirical analysis, four previously proposed protocols for estimating the false discovery rate (FDR) associated with a set of identified tandem mass spectra: two variants of the target-decoy competition protocol (TDC) of Elias and Gygi and two variants of the separate target-decoy search protocol of Käll et al. Our analysis reveals significant biases in the two separate target-decoy search protocols. Moreover, the one TDC protocol that provides an unbiased FDR estimate among the target PSMs does so at the cost of forfeiting a random subset of high-scoring spectrum identifications. We therefore propose the mix-max procedure to provide unbiased, accurate FDR estimates in the presence of well-calibrated scores. The method avoids biases associated with the two separate target-decoy search protocols and also avoids the propensity for target-decoy competition to discard a random subset of high-scoring target identifications.
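
    For readers unfamiliar with the baseline being evaluated, a minimal sketch of the standard target-decoy competition (TDC) estimate follows. It is not the mix-max procedure itself, whose details go beyond this abstract, and the +1 correction and the input format are assumptions made for illustration.

        # Sketch of target-decoy competition (TDC) FDR estimation.
        def tdc_accept(psms, score_threshold):
            """
            psms: list of (spectrum_id, score, is_decoy) triples, already reduced to
            the single best-scoring match per spectrum (the 'competition' step).
            Returns the accepted target PSMs and the estimated FDR at the threshold.
            """
            targets = [p for p in psms if p[1] >= score_threshold and not p[2]]
            decoys = [p for p in psms if p[1] >= score_threshold and p[2]]
            # With an equal-size shuffled or reversed decoy database, the decoy count
            # estimates the number of incorrect target matches above the threshold.
            fdr = (len(decoys) + 1) / max(len(targets), 1)
            return targets, min(fdr, 1.0)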

  4. Improved False Discovery Rate Estimation Procedure for Shotgun Proteomics

    PubMed Central

    2016-01-01

    Interpreting the potentially vast number of hypotheses generated by a shotgun proteomics experiment requires a valid and accurate procedure for assigning statistical confidence estimates to identified tandem mass spectra. Despite the crucial role such procedures play in most high-throughput proteomics experiments, the scientific literature has not reached a consensus about the best confidence estimation methodology. In this work, we evaluate, using theoretical and empirical analysis, four previously proposed protocols for estimating the false discovery rate (FDR) associated with a set of identified tandem mass spectra: two variants of the target-decoy competition protocol (TDC) of Elias and Gygi and two variants of the separate target-decoy search protocol of Käll et al. Our analysis reveals significant biases in the two separate target-decoy search protocols. Moreover, the one TDC protocol that provides an unbiased FDR estimate among the target PSMs does so at the cost of forfeiting a random subset of high-scoring spectrum identifications. We therefore propose the mix-max procedure to provide unbiased, accurate FDR estimates in the presence of well-calibrated scores. The method avoids biases associated with the two separate target-decoy search protocols and also avoids the propensity for target-decoy competition to discard a random subset of high-scoring target identifications. PMID:26152888

  5. High-throughput screening (HTS) and modeling of the retinoid ...

    EPA Pesticide Factsheets

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  6. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    EPA Science Inventory

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  7. High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes

    USDA-ARS?s Scientific Manuscript database

    High-throughput SNP genotyping provides a rapid way of developing resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...

  8. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    EPA Science Inventory

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO SimmonsNational Center for Computational Toxicology, US EPA, Research Triangle Park, NC USAThe EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  9. A quantitative literature-curated gold standard for kinase-substrate pairs

    PubMed Central

    2011-01-01

    We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431

  10. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    PubMed

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., prepare, process, and evaluate samples in parallel), and miniaturization (i.e., reduce sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, surface defects, among others have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  11. Outlook for Development of High-throughput Cryopreservation for Small-bodied Biomedical Model Fishes★

    PubMed Central

    Tiersch, Terrence R.; Yang, Huiping; Hu, E.

    2011-01-01

    With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts of and needs for high-throughput technology for model fishes, proposes approaches for technical development, and outlines future directions for this approach. PMID:21440666

  12. High-throughput measurements of biochemical responses using the plate::vision multimode 96 minilens array reader.

    PubMed

    Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich

    2006-01-01

    The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be quite comparable with other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have speeded up our drug discovery process greatly.

  13. Improved integrating-sphere throughput with a lens and nonimaging concentrator.

    PubMed

    Chenault, D B; Snail, K A; Hanssen, L M

    1995-12-01

    A reflectometer design utilizing an integrating sphere with a lens and nonimaging concentrator is described. Compared with previous designs where a collimator was used to restrict the detector field of view, the concentrator-lens combination significantly increases the throughput of the reflectometer. A procedure for designing lens-concentrators is given along with the results of parametric studies. The measured angular response of a lens-concentrator system is compared with ray-trace predictions and with the response of an ideal system.

  14. Microwave assisted saponification (MAS) followed by on-line liquid chromatography (LC)-gas chromatography (GC) for high-throughput and high-sensitivity determination of mineral oil in different cereal-based foodstuffs.

    PubMed

    Moret, Sabrina; Scolaro, Marianna; Barp, Laura; Purcaro, Giorgia; Conte, Lanfranco S

    2016-04-01

    A high-throughput, high-sensitivity procedure, involving simultaneous microwave-assisted saponification (MAS) and extraction of the unsaponifiable fraction, followed by on-line liquid chromatography (LC)-gas chromatography (GC), has been optimised for rapid and efficient extraction and analytical determination of mineral oil saturated hydrocarbons (MOSH) and mineral oil aromatic hydrocarbons (MOAH) in cereal-based products of different composition. MAS has the advantage of eliminating fat before LC-GC analysis, allowing an increase in the amount of sample extract injected, and hence in sensitivity. The proposed method gave practically quantitative recoveries and good repeatability. Among the different cereal-based products analysed (dry semolina and egg pasta, bread, biscuits, and cakes), egg pasta packed in direct contact with recycled paperboard had on average the highest total MOSH level (15.9 mg kg(-1)), followed by cakes (10.4 mg kg(-1)) and bread (7.5 mg kg(-1)). About 50% of the pasta and bread samples and 20% of the biscuits and cake samples had detectable MOAH amounts. The highest concentrations were found in an egg pasta in direct contact with recycled paperboard (3.6 mg kg(-1)) and in a milk bread (3.6 mg kg(-1)). Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.

    PubMed

    Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao

    2014-08-08

    Next-generation sequencers (NGSs) have become one of the main tools of current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in sequencing fluidics. We developed SUGAR (subtile-based GUI-assisted refiner), software that can handle ultra-high-throughput data through a user-friendly graphical user interface (GUI) with interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signals of technical errors during sequencing. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function, based on sequence read quality (Phred) scores, was applied to public whole-human-genome sequencing data, and we showed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR should reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. Therefore, the software will be especially useful for controlling the quality of variant calls for low-abundance cell populations, e.g., cancer cells, in samples affected by technical errors of sequencing procedures.
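
    As a rough illustration of Phred-score-based cleaning of the kind SUGAR automates, the sketch below filters a FASTQ file by mean read quality. It is not SUGAR's own code; the quality cutoff and the plain four-line FASTQ parsing are assumptions for demonstration.

        # Keep FASTQ records whose mean Phred quality meets an example cutoff.
        def mean_phred(quality_line, offset=33):
            scores = [ord(c) - offset for c in quality_line]
            return sum(scores) / len(scores)

        def filter_fastq(in_path, out_path, min_mean_q=20):
            kept = total = 0
            with open(in_path) as fin, open(out_path, "w") as fout:
                while True:
                    record = [fin.readline() for _ in range(4)]   # FASTQ: 4 lines per read
                    if not record[0]:
                        break
                    total += 1
                    if mean_phred(record[3].rstrip("\n")) >= min_mean_q:
                        fout.writelines(record)
                        kept += 1
            return kept, total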

  16. Purification of microalgae from bacterial contamination using a disposable inertia-based microfluidic device

    NASA Astrophysics Data System (ADS)

    Godino, Neus; Jorde, Felix; Lawlor, Daryl; Jaeger, Magnus; Duschl, Claus

    2015-08-01

    Microalgae are a promising source of bioactive ingredients for the food, pharmaceutical and cosmetic industries. Every microalgae research group or production facility is facing one major problem regarding the potential contamination of the algal cell with bacteria. Prior to the storage of the microalgae in strain collections or to cultivation in bioreactors, it is necessary to carry out laborious purification procedures to separate the microalgae from the undesired bacterial cells. In this work, we present a disposable microfluidic cartridge for the high-throughput purification of microalgae samples based on inertial microfluidics. Some of the most relevant microalgae strains have a larger size than the relatively small, few micron bacterial cells, so making them distinguishable by size. The inertial microfluidic cartridge was fabricated with inexpensive materials, like pressure sensitive adhesive (PSA) and thin plastic layers, which were patterned using a simple cutting plotter. In spite of fabrication restrictions and the intrinsic difficulties of biological samples, the separation of microalgae from bacteria reached values in excess of 99%, previously only achieved using conventional high-end and high cost lithography methods. Moreover, due to the simple and high-throughput characteristic of the separation, it is possible to concatenate serial purification to exponentially decrease the absolute amount of bacteria in the final purified sample.

  17. Cell-free measurements of brightness of fluorescently labeled antibodies

    PubMed Central

    Zhou, Haiying; Tourkakis, George; Shi, Dennis; Kim, David M.; Zhang, Hairong; Du, Tommy; Eades, William C.; Berezin, Mikhail Y.

    2017-01-01

    Validation of imaging contrast agents, such as fluorescently labeled imaging antibodies, has been recognized as a critical challenge in clinical and preclinical studies. As the number of applications for imaging antibodies grows, these materials are increasingly being subjected to careful scrutiny. Antibody fluorescence brightness is one of the key parameters of critical importance. Direct measurements of the brightness with common spectroscopy methods are challenging, because the fluorescent properties of the imaging antibodies are highly sensitive to the methods of conjugation, degree of labeling, and contamination with free dyes. Traditional methods rely on cell-based assays that lack reproducibility and accuracy. In this manuscript, we present a novel and general approach for measuring the brightness using antibody-avid polystyrene beads and flow cytometry. As compared to a cell-based method, the described technique is rapid, quantitative, and highly reproducible. The proposed method requires less than ten micrograms of sample and is applicable for optimizing synthetic conjugation procedures, testing commercial imaging antibodies, and performing high-throughput validation of conjugation procedures. PMID:28150730

  18. TCP Throughput Profiles Using Measurements over Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata

    Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
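
    A highly simplified reading of the ramp-up/sustainment abstraction can be sketched as below. This is not the authors' model; the linear ramp, the ramp-up constant, and the peak rate are illustrative assumptions meant only to show why average throughput over a fixed-length transfer degrades slowly with RTT while the sustainment phase dominates.

        # Toy ramp-up/sustainment throughput profile versus RTT (illustrative values only).
        def average_throughput(rtt_s, duration_s=60.0, peak_gbps=9.5, rampup_rtts=200):
            """Average throughput (Gb/s) of one flow over a fixed-length transfer."""
            ramp = min(rampup_rtts * rtt_s, duration_s)   # ramp-up time grows with RTT
            # Linear ramp to the peak rate, then sustained peak for the remainder.
            transferred = 0.5 * peak_gbps * ramp + peak_gbps * (duration_s - ramp)
            return transferred / duration_s

        # Example profile over the 0-366 ms RTT range mentioned above.
        profile = [(rtt_ms, average_throughput(rtt_ms / 1000.0)) for rtt_ms in range(0, 367, 50)]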

  19. Enhancing high throughput toxicology - development of putative adverse outcome pathways linking US EPA ToxCast screening targets to relevant apical hazards.

    EPA Science Inventory

    High throughput toxicology programs, such as ToxCast and Tox21, have provided biological effects data for thousands of chemicals at multiple concentrations. Compared to traditional, whole-organism approaches, high throughput assays are rapid and cost-effective, yet they generall...

  20. Evaluation of High-Throughput Chemical Exposure Models via Analysis of Matched Environmental and Biological Media Measurements

    EPA Science Inventory

    The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...

  1. The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD

    NASA Astrophysics Data System (ADS)

    Cox, M. A.; Reed, R.; Mellado, B.

    2015-01-01

    After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.

  2. [Current applications of high-throughput DNA sequencing technology in antibody drug research].

    PubMed

    Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong

    2012-03-01

    Since the publication in 2005 of a high-throughput DNA sequencing technology based on PCR carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.

  3. Evaluation of Flight Deck-Based Interval Management Crew Procedure Feasibility

    NASA Technical Reports Server (NTRS)

    Wilson, Sara R.; Murdoch, Jennifer L.; Hubbs, Clay E.; Swieringa, Kurt A.

    2013-01-01

    Air traffic demand is predicted to increase over the next 20 years, creating a need for new technologies and procedures to support this growth in a safe and efficient manner. The National Aeronautics and Space Administration's (NASA) Air Traffic Management Technology Demonstration - 1 (ATD-1) will operationally demonstrate the feasibility of efficient arrival operations combining ground-based and airborne NASA technologies. The integration of these technologies will increase throughput, reduce delay, conserve fuel, and minimize environmental impacts. The ground-based tools include Traffic Management Advisor with Terminal Metering for precise time-based scheduling and Controller Managed Spacing decision support tools for better managing aircraft delay with speed control. The core airborne technology in ATD-1 is Flight deck-based Interval Management (FIM). FIM tools provide pilots with speed commands calculated using information from Automatic Dependent Surveillance - Broadcast. The precise merging and spacing enabled by FIM avionics and flight crew procedures will reduce excess spacing buffers and result in higher terminal throughput. This paper describes a human-in-the-loop experiment designed to assess the acceptability and feasibility of the ATD-1 procedures used in a voice communications environment. This experiment utilized the ATD-1 integrated system of ground-based and airborne technologies. Pilot participants flew a high-fidelity fixed base simulator equipped with an airborne spacing algorithm and a FIM crew interface. Experiment scenarios involved multiple air traffic flows into the Dallas-Fort Worth Terminal Radar Control airspace. Results indicate that the proposed procedures were feasible for use by flight crews in a voice communications environment. The delivery accuracy at the achieve-by point was within +/- five seconds and the delivery precision was less than five seconds. Furthermore, FIM speed commands occurred at a rate of less than one per minute, and pilots found the frequency of the speed commands to be acceptable at all times throughout the experiment scenarios.

  4. High Throughput Measurement of Locomotor Sensitization to Volatilized Cocaine in Drosophila melanogaster.

    PubMed

    Filošević, Ana; Al-Samarai, Sabina; Andretić Waldowski, Rozi

    2018-01-01

    Drosophila melanogaster can be used to identify genes with novel functional roles in neuronal plasticity induced by repeated consumption of addictive drugs. Behavioral sensitization is a relatively simple behavioral output of plastic changes that occur in the brain after repeated exposures to drugs of abuse. The development of screening procedures for genes that control behavioral sensitization has stalled due to a lack of high-throughput behavioral tests that can be used in genetically tractable organisms, such as Drosophila. We have developed a new behavioral test, FlyBong, which combines delivery of volatilized cocaine (vCOC) to individually housed flies with objective quantification of their locomotor activity. There are two main advantages of FlyBong: it is high-throughput and it allows for comparisons of locomotor activity of individual flies before and after single or multiple exposures. At the population level, exposure to vCOC leads to a transient, concentration-dependent increase in locomotor activity, representing sensitivity to an acute dose. A second exposure leads to a further increase in locomotion, representing locomotor sensitization. We validate FlyBong by showing that locomotor sensitization at either the population or individual level is absent in the mutants for circadian genes period (per), Clock (Clk), and cycle (cyc). The locomotor sensitization that is present in timeless (tim) and pigment dispersing factor (pdf) mutant flies is in large part not cocaine specific, but derives from increased sensitivity to warm air. Circadian genes are not only an integral part of the neural mechanism required for the development of locomotor sensitization; they also modulate the intensity of locomotor sensitization as a function of the time of day. Motor-activating effects of cocaine are sexually dimorphic and require a functional dopaminergic transporter. FlyBong is a new and improved method for inducing and measuring locomotor sensitization to cocaine in individual Drosophila. Because of its high-throughput nature, FlyBong can be used in genetic screens or in selection experiments aimed at the unbiased identification of functional genes involved in acute or chronic effects of volatilized psychoactive substances.

  5. High Throughput Measurement of Locomotor Sensitization to Volatilized Cocaine in Drosophila melanogaster

    PubMed Central

    Filošević, Ana; Al-samarai, Sabina; Andretić Waldowski, Rozi

    2018-01-01

    Drosophila melanogaster can be used to identify genes with novel functional roles in neuronal plasticity induced by repeated consumption of addictive drugs. Behavioral sensitization is a relatively simple behavioral output of plastic changes that occur in the brain after repeated exposures to drugs of abuse. The development of screening procedures for genes that control behavioral sensitization has stalled due to a lack of high-throughput behavioral tests that can be used in genetically tractable organisms, such as Drosophila. We have developed a new behavioral test, FlyBong, which combines delivery of volatilized cocaine (vCOC) to individually housed flies with objective quantification of their locomotor activity. There are two main advantages of FlyBong: it is high-throughput and it allows for comparisons of locomotor activity of individual flies before and after single or multiple exposures. At the population level, exposure to vCOC leads to a transient, concentration-dependent increase in locomotor activity, representing sensitivity to an acute dose. A second exposure leads to a further increase in locomotion, representing locomotor sensitization. We validate FlyBong by showing that locomotor sensitization at either the population or individual level is absent in the mutants for circadian genes period (per), Clock (Clk), and cycle (cyc). The locomotor sensitization that is present in timeless (tim) and pigment dispersing factor (pdf) mutant flies is in large part not cocaine specific, but derives from increased sensitivity to warm air. Circadian genes are not only an integral part of the neural mechanism required for the development of locomotor sensitization; they also modulate the intensity of locomotor sensitization as a function of the time of day. Motor-activating effects of cocaine are sexually dimorphic and require a functional dopaminergic transporter. FlyBong is a new and improved method for inducing and measuring locomotor sensitization to cocaine in individual Drosophila. Because of its high-throughput nature, FlyBong can be used in genetic screens or in selection experiments aimed at the unbiased identification of functional genes involved in acute or chronic effects of volatilized psychoactive substances. PMID:29459820

  6. Polymer surface functionalities that control human embryoid body cell adhesion revealed by high throughput surface characterization of combinatorial material microarrays

    PubMed Central

    Yang, Jing; Mei, Ying; Hook, Andrew L.; Taylor, Michael; Urquhart, Andrew J.; Bogatyrev, Said R.; Langer, Robert; Anderson, Daniel G.; Davies, Martyn C.; Alexander, Morgan R.

    2010-01-01

    High throughput materials discovery using combinatorial polymer microarrays to screen for new biomaterials with new and improved function is established as a powerful strategy. Here we combine this screening approach with high throughput surface characterisation (HT-SC) to identify surface structure-function relationships. We explore how this combination can help to identify surface chemical moieties that control protein adsorption and subsequent cellular response. The adhesion of human embryoid body (hEB) cells to a large number (496) of different acrylate polymers synthesized in a microarray format is screened using a high throughput procedure. To determine the role of the polymer surface properties on hEB cell adhesion, detailed HT-SC of these acrylate polymers is carried out using time of flight secondary ion mass spectrometry (ToF SIMS), x-ray photoelectron spectroscopy (XPS), pico litre drop sessile water contact angle (WCA) measurement and atomic force microscopy (AFM). A structure-function relationship is identified between the ToF SIMS analysis of the surface chemistry after a fibronectin (Fn) pre-conditioning step and the cell adhesion to each spot using the multivariate analysis technique partial least squares (PLS) regression. Secondary ions indicative of the adsorbed Fn correlate with increased cell adhesion whereas glycol and other functionalities from the polymers are identified that reduce cell adhesion. Furthermore, a strong relationship between the ToF SIMS spectra of bare polymers and the cell adhesion to each spot is identified using PLS regression. This identifies a role for both the surface chemistry of the bare polymer and the pre-adsorbed Fn, as-represented in the ToF SIMS spectra, in controlling cellular adhesion. In contrast, no relationship is found between cell adhesion and wettability, surface roughness, elemental or functional surface composition. The correlation between ToF SIMS data of the surfaces and the cell adhesion demonstrates the ability of identifying surface moieties that control protein adsorption and subsequent cell adhesion using ToF SIMS and multivariate analysis. PMID:20832108
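
    A generic sketch of the partial least squares (PLS) step relating ToF-SIMS ion intensities to cell adhesion is given below. It uses scikit-learn rather than the authors' software, and the array shapes, component count, and placeholder data are assumptions chosen for illustration.

        # PLS regression of cell adhesion on secondary-ion intensities (placeholder data).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        # X: (n_spots, n_ions) matrix of normalised secondary-ion intensities per spot.
        # y: (n_spots,) measured cell adhesion per polymer spot.
        rng = np.random.default_rng(0)
        X = rng.random((496, 120))          # placeholder data for 496 polymer spots
        y = rng.random(496)

        pls = PLSRegression(n_components=5)
        scores = cross_val_score(pls, X, y, cv=5, scoring="r2")   # predictive power
        pls.fit(X, y)

        # Ions with the largest absolute regression coefficients are the candidate
        # surface moieties that promote or suppress adhesion.
        top_ions = np.argsort(np.abs(pls.coef_.ravel()))[::-1][:10]
        print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))
        print("most influential ion indices:", top_ions)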

  7. Fluorescence Adherence Inhibition Assay: A Novel Functional Assessment of Blocking Virus Attachment by Vaccine-Induced Antibodies

    PubMed Central

    Asati, Atul; Kachurina, Olga; Karol, Alex; Dhir, Vipra; Nguyen, Michael; Parkhill, Robert; Kouiavskaia, Diana; Chumakov, Konstantin; Warren, William; Kachurin, Anatoly

    2016-01-01

    Neutralizing antibodies induced by vaccination or natural infection play a critically important role in protection against viral diseases. In general, neutralization of the viral infection occurs via two major pathways: pre- and post-attachment modes, the first being the most important for such infections as influenza and polio, the latter being significant for filoviruses. Neutralizing capacity of antibodies is typically evaluated by virus neutralization assays that assess reduction of viral infectivity to the target cells in the presence of functional antibodies. Plaque reduction neutralization test, microneutralization and immunofluorescent assays are often used as gold standard virus neutralization assays. However, these methods are associated with several important prerequisites such as use of live virus requiring safety precautions, tedious evaluation procedures and long assessment times. Hence, there is a need for a robust, inexpensive high-throughput functional assay that can be performed rapidly using inactivated virus, without extensive safety precautions. Herein, we report a novel high-throughput Fluorescence Adherence Inhibition assay (fADI) using inactivated virus labeled with fluorescent secondary antibodies, and Vero cells or erythrocytes as targets. It requires only a few hours to assess the pre-attachment neutralizing capacity of donor sera. The fADI assay was tested successfully on donors immunized with polio, yellow fever and influenza vaccines. To further simplify and improve the throughput of the assay, we have developed a mathematical approach for calculating the 50% titers from a single sample dilution, without the need to analyze multi-point titration curves. Assessment of pre- and post-vaccination human sera from subjects immunized with IPOL®, YF-VAX® and 2013–2014 Fluzone® vaccines demonstrated high efficiency of the assay. The results correlated very well with the microneutralization assay performed independently by the FDA Center for Biologics Evaluation and Research, with the plaque reduction neutralization test performed by Focus Diagnostics, and with the hemagglutination inhibition assay performed in-house at Sanofi Pasteur. Taken together, the fADI assay appears to be a useful high-throughput functional immunoassay for assessment of antibody-related neutralization of the viral infections for which the pre-attachment neutralization pathway is predominant, such as polio, influenza, yellow fever and dengue. PMID:26863313

  8. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    PubMed Central

    JR, Luft; EH, Snell; GT, DeTitta

    2011-01-01

    Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  9. High-throughput screening based on label-free detection of small molecule microarrays

    NASA Astrophysics Data System (ADS)

    Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong

    2017-02-01

    Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput preliminary drug screening platform based on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this screening platform, the OI-RD scanner serves as a label-free detection instrument that is able to monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening through the immobilization of tens of thousands of different compounds on a single phenyl-isocyanate-functionalized glass slide. Using this platform, we sequentially screened five target proteins (purified target proteins or cell lysates containing the target protein) in high-throughput, label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by follow-up functional assays. Compared to traditional high-throughput screening assays, the platform has many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening, giving it great potential as a complementary screening platform in the field of drug discovery.

  10. High-throughput analysis of yeast replicative aging using a microfluidic system

    PubMed Central

    Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong

    2015-01-01

    Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cell, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317

  11. Determination of strontium-90 from direct separation of yttrium-90 by solid phase extraction using DGA Resin for seawater monitoring.

    PubMed

    Tazoe, Hirofumi; Obata, Hajime; Yamagata, Takeyasu; Karube, Zin'ichi; Nagai, Hisao; Yamada, Masatoshi

    2016-05-15

    It is important for public safety to monitor strontium-90 in aquatic environments in the vicinity of nuclear-related facilities. Strontium-90 concentrations in seawater exceeding the background level have been observed after accidents at nuclear facilities. However, the analytical procedure for measuring strontium-90 in seawater is highly demanding. Here we show a simple and high-throughput analytical technique for the determination of strontium-90 in seawater samples using direct yttrium-90 separation. The DGA Resin is used to determine the abundance of strontium-90 by detecting yttrium-90 decay (beta-emission) in secular equilibrium. The DGA Resin can selectively collect yttrium-90 and remove naturally occurring radionuclides such as (40)K, (210)Pb, (214)Bi, (238)U, and (232)Th and anthropogenic radionuclides such as (140)Ba and (140)La. Through the sample separation procedure, a high chemical yield of yttrium-90 (95.5±2.3%) was achieved. The result of IAEA-443 certified seawater analysis (107.7±3.4 mBq kg(-1)) was in good agreement with the certified value (110±5 mBq kg(-1)). With the developed method, eight samples per day can be analyzed after secular equilibrium is reached, which is a reasonably fast throughput for actual seawater monitoring. By processing 3 L of seawater and applying a counting time of 20 h, the minimum detectable activity can be as low as 1.5 mBq kg(-1), which is suitable for monitoring contaminated marine environments. Reproducibility was found to be 3.4% according to 10 independent analyses of natural seawater samples from the vicinity of the Fukushima Daiichi Nuclear Power Plant in September 2013. Copyright © 2016 Elsevier B.V. All rights reserved.
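
    The waiting time implied by "after secular equilibrium is reached" can be estimated from the standard yttrium-90 ingrowth relation, sketched below. This is not a calculation from the paper; the 64.1 h half-life is the common literature value for yttrium-90, and the chosen times are illustrative.

        # Yttrium-90 ingrowth toward secular equilibrium with strontium-90.
        import math

        Y90_HALF_LIFE_H = 64.1                      # standard literature value, hours
        LAMBDA_Y = math.log(2) / Y90_HALF_LIFE_H

        def ingrowth_fraction(hours):
            """Fraction of the equilibrium 90Y activity present after 'hours' of ingrowth."""
            return 1.0 - math.exp(-LAMBDA_Y * hours)

        for days in (3, 7, 14, 21):
            print("%2d days: %.1f%% of equilibrium" % (days, 100 * ingrowth_fraction(24 * days)))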

  12. Measuring molecular biomarkers in epidemiologic studies: laboratory techniques and biospecimen considerations.

    PubMed

    Erickson, Heidi S

    2012-09-28

    The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.

  13. POWER-ENHANCED MULTIPLE DECISION FUNCTIONS CONTROLLING FAMILY-WISE ERROR AND FALSE DISCOVERY RATES.

    PubMed

    Peña, Edsel A; Habiger, Joshua D; Wu, Wensong

    2011-02-01

    Improved procedures, in terms of smaller missed discovery rates (MDR), for performing multiple hypotheses testing with weak and strong control of the family-wise error rate (FWER) or the false discovery rate (FDR) are developed and studied. The improvement over existing procedures such as the Šidák procedure for FWER control and the Benjamini-Hochberg (BH) procedure for FDR control is achieved by exploiting possible differences in the powers of the individual tests. Results signal the need to take into account the powers of the individual tests and to have multiple hypotheses decision functions which are not limited to simply using the individual p-values, as is the case, for example, with the Šidák, Bonferroni, or BH procedures. They also enhance understanding of the role of the powers of individual tests, or more precisely the receiver operating characteristic (ROC) functions of decision processes, in the search for better multiple hypotheses testing procedures. A decision-theoretic framework is utilized, and through auxiliary randomizers the procedures could be used with discrete or mixed-type data or with rank-based nonparametric tests. This is in contrast to existing p-value based procedures whose theoretical validity is contingent on each of these p-value statistics being stochastically equal to or greater than a standard uniform variable under the null hypothesis. Proposed procedures are relevant in the analysis of high-dimensional "large M, small n" data sets arising in the natural, physical, medical, economic and social sciences, whose generation and creation is accelerated by advances in high-throughput technology, notably, but not limited to, microarray technology.
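
    For reference, a textbook implementation of the Benjamini-Hochberg (BH) step-up procedure that serves as the FDR-controlling baseline above is sketched here; it is not the proposed power-enhanced decision functions, and the default alpha is an example value.

        # Benjamini-Hochberg step-up procedure for FDR control.
        def benjamini_hochberg(p_values, alpha=0.05):
            """Return a boolean rejection decision for each hypothesis."""
            m = len(p_values)
            order = sorted(range(m), key=lambda i: p_values[i])
            k_max = 0
            for rank, i in enumerate(order, start=1):
                if p_values[i] <= rank * alpha / m:
                    k_max = rank                       # largest rank satisfying the bound
            reject = [False] * m
            for rank, i in enumerate(order, start=1):
                if rank <= k_max:
                    reject[i] = True
            return reject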

  14. 40 CFR Table 9 to Subpart Eeee of... - Continuous Compliance With Operating Limits-High Throughput Transfer Racks

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 CFR Part 63, Subpart EEEE, Table 9 (Protection of Environment, 2010-07-01): Continuous Compliance With Operating Limits, High Throughput Transfer Racks. As stated in §§ 63.2378(a) and (b...

  15. Accelerating the design of solar thermal fuel materials through high throughput simulations.

    PubMed

    Liu, Yun; Grossman, Jeffrey C

    2014-12-10

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.

  16. Phenotypic approaches to drought in cassava: review

    PubMed Central

    Okogbenin, Emmanuel; Setter, Tim L.; Ferguson, Morag; Mutegi, Rose; Ceballos, Hernan; Olasanmi, Bunmi; Fregene, Martin

    2012-01-01

    Cassava is an important crop in Africa, Asia, Latin America, and the Caribbean. Cassava can be produced adequately under drought conditions, making it an ideal food security crop for marginal environments. Although cassava can tolerate drought stress, it can be genetically improved to enhance productivity in such environments. Drought adaptation studies in cassava over more than three decades have identified relevant mechanisms which have been explored in conventional breeding. Drought is a quantitative trait, and its multigenic nature makes it very challenging to effectively manipulate and combine genes in breeding for rapid genetic gain and an efficient selection process. Cassava has a long growth cycle of 12–18 months, which invariably contributes to a long breeding scheme for the crop. Modern breeding, using advances in genomics and improved genotyping, is facilitating the dissection and genetic analysis of complex traits including drought tolerance, thus helping to better elucidate and understand the genetic basis of such traits. A beneficial goal of new innovative breeding strategies is to shorten the breeding cycle using minimized, efficient or fast phenotyping protocols. While high-throughput genotyping has been achieved, this is rarely the case for phenotyping for drought adaptation. Much of the storage-root phenotyping in cassava is done very late in the evaluation cycle, making the selection process very slow. This paper highlights some modified traits suitable for early-growth-phase phenotyping that may be used to shorten the drought phenotyping cycle in cassava. Such modified traits can significantly complement high-throughput genotyping procedures to fast-track breeding of improved drought-tolerant varieties. Metabolite profiling, improved phenomics that take advantage of next-generation sequencing technologies, and high-throughput phenotyping are basic steps for future work to improve genetic gain and maximize the speed of drought tolerance breeding. PMID:23717282

  17. GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.

    PubMed

    Subhash, Santhilal; Kanduri, Chandrasekhar

    2016-09-13

    High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate a huge volume of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries from the functional database repositories. Hence, there is a need for a simplified tool that can perform functional enrichment analysis using updated information directly from source databases such as KEGG, Reactome or Gene Ontology. In this study, we focused on designing a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations) that can predict the functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information from more than 4000 organisms from freely available prominent functional databases like KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without the need to install additional dependencies. GeneSCF is more reliable compared to other enrichment tools because of its ability to use reference functional databases in real time to perform enrichment analysis. It is easy to integrate with other pipelines for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms, thereby saving time for users. Since the tool is designed to be ready to use, there is no need for any complex compilation and installation procedures.

  18. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its application to high-throughput time series data analysis, e.g., data from studies based on next-generation sequencing technology. By extending the theories for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and find interesting dynamic patterns of organism co-occurrence. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
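
    The slow permutation baseline that the Markov-chain approximation replaces can be sketched as follows. This is not the eLSA implementation; reducing each series to up/down signs, scoring with a maximum-sum contiguous run (no delay), and the permutation count are simplifying assumptions for illustration.

        # Permutation p-value for a simplified local trend score between two series (lists).
        import random

        def trend_signs(series):
            return [(1 if b > a else -1 if b < a else 0) for a, b in zip(series, series[1:])]

        def local_trend_score(x, y):
            """Maximum-sum contiguous segment of elementwise sign products (Kadane's rule)."""
            best = cur = 0
            for a, b in zip(trend_signs(x), trend_signs(y)):
                cur = max(0, cur + a * b)
                best = max(best, cur)
            return best

        def permutation_pvalue(x, y, n_perm=1000, seed=0):
            rng = random.Random(seed)
            observed = local_trend_score(x, y)
            hits = 0
            for _ in range(n_perm):
                shuffled = y[:]
                rng.shuffle(shuffled)
                if local_trend_score(x, shuffled) >= observed:
                    hits += 1
            return (hits + 1) / (n_perm + 1)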

  19. The combination of gas-phase fluorophore technology and automation to enable high-throughput analysis of plant respiration.

    PubMed

    Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K

    2017-01-01

    Mitochondrial respiration in the dark (Rdark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of Rdark is essential for agronomic and ecological studies. However, methods currently used to measure Rdark in plant tissues are typically low throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry, to determine accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of Rdark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of Rdark in 138 genotypes of wheat; and (2) quantification of rarely-assessed whole-plant Rdark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute Rdark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas-phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided similar values of Rdark to the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of Rdark on multiple samples simultaneously, irrespective of plant or tissue type.

  20. Simultaneous Measurements of Auto-Immune and Infectious Disease Specific Antibodies Using a High Throughput Multiplexing Tool

    PubMed Central

    Asati, Atul; Kachurina, Olga; Kachurin, Anatoly

    2012-01-01

    Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bio-Plex/Luminex platform. In this report, we demonstrate that the ganglioside high throughput multiplexing tool is robust, highly specific and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided an added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for the assay of both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders. PMID:22952605

  1. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    NASA Astrophysics Data System (ADS)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for rate-distortion cost calculation is proposed to reduce the computational complexity in the mode decision of SAO. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meet the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
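
    For readers unfamiliar with SAO, the sketch below illustrates the edge-offset classification that the HEVC standard defines for the filter, shown for the horizontal edge class on one row of 8-bit samples with made-up offset values; it does not reproduce the paper's bitrate-estimation method or hardware architecture.

      # Hedged sketch of HEVC SAO edge-offset classification (horizontal class).
      import numpy as np

      def sao_edge_offset_row(samples, offsets):
          """Classify each interior sample against its two horizontal neighbours
          and add the offset signalled for its category (8-bit range assumed)."""
          out = samples.astype(np.int64)
          for i in range(1, len(samples) - 1):
              c, a, b = int(samples[i]), int(samples[i - 1]), int(samples[i + 1])
              edge_idx = int(np.sign(c - a) + np.sign(c - b))
              category = {-2: 1, -1: 2, 1: 3, 2: 4}.get(edge_idx, 0)  # 0 = no offset
              if category:
                  out[i] += offsets[category]
          return np.clip(out, 0, 255)

      print(sao_edge_offset_row(np.array([10, 12, 9, 15, 15, 13]),
                                {1: 2, 2: 1, 3: -1, 4: -2}))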

  2. Adenylylation of small RNA sequencing adapters using the TS2126 RNA ligase I.

    PubMed

    Lama, Lodoe; Ryan, Kevin

    2016-01-01

    Many high-throughput small RNA next-generation sequencing protocols use 5' preadenylylated DNA oligonucleotide adapters during cDNA library preparation. Preadenylylation of the DNA adapter's 5' end makes ligation of the adapter to RNA collections independent of ATP, thereby avoiding ATP-dependent side reactions. However, preadenylylation of the DNA adapters can be costly and difficult. The currently available method for chemical adenylylation of DNA adapters is inefficient and uses techniques not typically practiced in laboratories profiling cellular RNA expression. An alternative enzymatic method using a commercial RNA ligase was recently introduced, but this enzyme works best as a stoichiometric adenylylating reagent rather than a catalyst and can therefore prove costly when several variant adapters are needed or during scale-up or high-throughput adenylylation procedures. Here, we describe a simple, scalable, and highly efficient method for the 5' adenylylation of DNA oligonucleotides using the thermostable RNA ligase 1 from bacteriophage TS2126. Adapters with 3' blocking groups are adenylylated at >95% yield at catalytic enzyme-to-adapter ratios and need not be gel purified before ligation to RNA acceptors. Experimental conditions are also reported that enable DNA adapters with free 3' ends to be 5' adenylylated at >90% efficiency. © 2015 Lama and Ryan; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  3. Ribosomal Binding Site Switching: An Effective Strategy for High-Throughput Cloning Constructions

    PubMed Central

    Li, Yunlong; Zhang, Yong; Lu, Pei; Rayner, Simon; Chen, Shiyun

    2012-01-01

    TA cloning and blunt-end ligation are two simple methods for direct cloning of PCR fragments that would greatly benefit high-throughput (HTP) cloning constructions if their efficiency could be improved. In this study, we have developed a ribosomal binding site (RBS) switching strategy for direct cloning of PCR fragments. The RBS is an A/G-rich region upstream of the translational start codon and is essential for gene expression. Changing A/G to T/C in the RBS blocks its activity and thereby abolishes gene expression. Based on this property, we introduced an inactive RBS upstream of a selectable marker gene, and designed a fragment insertion site within this inactive RBS. Forward and reverse insertions of specifically tailed fragments will respectively form an active and an inactive RBS, thus all background from vector self-ligation and fragment reverse insertions is eliminated because the marker gene is not expressed. The effectiveness of our strategy for TA cloning and blunt-end ligation is confirmed. Application of this strategy to gene over-expression, a bacterial two-hybrid system, a bacterial one-hybrid system, and promoter bank construction is also verified. The advantages of this simple procedure, together with its low cost and high efficiency, make our strategy extremely useful in HTP cloning constructions. PMID:23185557

  4. Identification of Novel Pro-Migratory, Cancer-Associated Genes Using Quantitative, Microscopy-Based Screening

    PubMed Central

    Naffar-Abu-Amara, Suha; Shay, Tal; Galun, Meirav; Cohen, Naomi; Isakoff, Steven J.; Kam, Zvi; Geiger, Benjamin

    2008-01-01

    Background Cell migration is a highly complex process, regulated by multiple genes, signaling pathways and external stimuli. To discover genes or pharmacological agents that can modulate the migratory activity of cells, screening strategies that enable the monitoring of diverse migratory parameters in a large number of samples are necessary. Methodology In the present study, we describe the development of a quantitative, high-throughput cell migration assay, based on a modified phagokinetic tracks (PKT) procedure, and apply it to identify novel pro-migratory genes in a cancer-related gene library. In brief, cells are seeded on fibronectin-coated 96-well plates, covered with a monolayer of carboxylated latex beads. Motile cells clear the beads located along their migratory paths, forming tracks that are visualized using an automated, transmitted-light screening microscope. The tracks are then segmented and characterized by multi-parametric, morphometric analysis, resolving a variety of morphological and kinetic features. Conclusions In this screen we identified 4 novel genes derived from a breast carcinoma-related cDNA library whose over-expression induces major alterations in the migration of otherwise stationary MCF7 cells. This approach can serve as a high-throughput screen for novel ways to modulate cellular migration in pathological states such as tumor metastasis and invasion. PMID:18213366

  5. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    PubMed

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with a focus on reproducibility. A statistical design of 32 experiments was performed, and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams'E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 degrees C was better than 22 degrees C. After thawing, 1 h of preculture was better than 3 h. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient for quickly determining optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.
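
    The abstract names seven two-level factors but not the generators of the 32-run design, so the sketch below simply enumerates the full two-level factor grid from which such a fractional design would be drawn; the level labels are taken from the abstract and the code illustrates the bookkeeping only, not the authors' actual design matrix.

      # Hedged sketch: enumerate the two-level factor space behind such a screen.
      from itertools import product

      factors = {
          "freezing_solution":  ["Williams'E", "modified UW"],
          "cryoprotectant":     ["mixture", "pure DMSO"],
          "cryo_concentration": ["10%", "20%"],
          "fetal_calf_serum":   ["50%", "80%"],
          "incubation_time":    ["10 min", "20 min"],
          "thaw_temperature":   ["22 C", "42 C"],
          "preculture_time":    ["1 h", "3 h"],
      }

      full_grid = [dict(zip(factors, combo)) for combo in product(*factors.values())]
      print(len(full_grid))  # 128 candidate runs; the study screened a 32-run fraction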

  6. High Throughput T Epitope Mapping and Vaccine Development

    PubMed Central

    Li Pira, Giuseppina; Ivaldi, Federico; Moretti, Paolo; Manca, Fabrizio

    2010-01-01

    Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th) and by cytolytic T lymphocytes (CTL) is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high throughput (HTP) approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost. PMID:20617148

  7. A novel approach to the simultaneous extraction and non-targeted analysis of the small molecules metabolome and lipidome using 96-well solid phase extraction plates with column-switching technology.

    PubMed

    Li, Yubo; Zhang, Zhenzhu; Liu, Xinyu; Li, Aizhu; Hou, Zhiguo; Wang, Yuming; Zhang, Yanjun

    2015-08-28

    This study combines solid phase extraction (SPE) using 96-well plates with column-switching technology to construct a rapid and high-throughput method for the simultaneous extraction and non-targeted analysis of the small-molecule metabolome and lipidome based on ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry. This study first investigated the columns and analytical conditions for the small-molecule metabolome and lipidome, separated on HSS T3 and BEH C18 columns, respectively. Next, the loading capacity and actuation duration of SPE were further optimized. Subsequently, SPE and column switching were used together to rapidly and comprehensively analyze the biological samples. The experimental results showed that the new analytical procedure had good precision and maintained sample stability (RSD < 15%). The method was then satisfactorily applied to a broader analysis of the small-molecule metabolome and lipidome to test its throughput. The resulting method represents a new analytical approach for biological samples, and a highly useful tool for research in metabolomics and lipidomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Porous silicon mass spectrometry as an alternative confirmatory assay for compliance testing of methadone.

    PubMed

    Guinan, Taryn M; Neldner, Declan; Stockham, Peter; Kobus, Hilton; Della Vedova, Christopher B; Voelcker, Nicolas H

    2017-05-01

    Porous silicon based surface-assisted laser desorption ionization mass spectrometry (pSi SALDI-MS) is an analytical technique well suited for high throughput analysis of low molecular weight compounds from biological samples. A potential application of this technology is the compliance monitoring of opioid addiction programmes, where methadone is used as a pharmacological treatment for drugs such as heroin. Here, we present the detection and quantification of methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (EDDP) from water and clinical samples (saliva, urine, and plasma) from opioid dependent participants using pSi SALDI-MS. A one-step solvent phase extraction using chloroform was developed for the detection of methadone from clinical samples for analysis by pSi SALDI-MS. Liquid chromatography-mass spectrometry (LC-MS) was used as a comparative technique for the quantification of methadone from clinical saliva and plasma samples. In all cases, we obtained a good correlation of pSi SALDI-MS and LC-MS results, suggesting that pSi SALDI-MS may be an alternative procedure for high-throughput screening and quantification for application in opioid compliance testing. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Development and validation of a high throughput assay for the quantification of multiple green tea-derived catechins in human plasma.

    PubMed

    Mawson, Deborah H; Jeffrey, Keon L; Teale, Philip; Grace, Philip B

    2018-06-19

    A rapid, accurate and robust method for the determination of catechin (C), epicatechin (EC), gallocatechin (GC), epigallocatechin (EGC), catechin gallate (Cg), epicatechin gallate (ECg), gallocatechin gallate (GCg) and epigallocatechin gallate (EGCg) concentrations in human plasma has been developed. The method utilises protein precipitation following enzyme hydrolysis, with chromatographic separation and detection using reversed-phase liquid chromatography - tandem mass spectrometry (LC-MS/MS). Traditional issues such as lengthy chromatographic run times, sample and extract stability, and lack of suitable internal standards have been addressed. The method has been evaluated using a comprehensive validation procedure, confirming linearity over appropriate concentration ranges, and inter/intra batch precision and accuracies within suitable thresholds (precisions within 13.8% and accuracies within 12.4%). Recoveries of analytes were found to be consistent between different matrix samples, compensated for using suitable internal markers and within the performance of the instrumentation used. Similarly, chromatographic interferences have been corrected using the internal markers selected. Stability of all analytes in matrix is demonstrated over 32 days and throughout extraction conditions. This method is suitable for high throughput sample analysis studies. This article is protected by copyright. All rights reserved.

  10. High throughput light absorber discovery, Part 1: An algorithm for automated tauc analysis

    DOE PAGES

    Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.

    2016-09-23

    High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
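
    The standard Tauc construction that such an algorithm automates can be sketched as follows, taking absorbance as a proxy for the absorption coefficient and assuming a direct-allowed transition; the steepest-slope fitting heuristic and the synthetic spectrum are illustrative choices, not the authors' expert-mimicking segment selection.

      # Hedged sketch of a Tauc-plot band-gap estimate on a synthetic spectrum.
      import numpy as np

      def tauc_band_gap(energy_eV, absorbance, r=0.5, fit_window=10):
          """Fit a line to the steepest region of the Tauc plot and return its
          x-intercept as the band-gap estimate (eV). r = 0.5 for direct-allowed."""
          y = (absorbance * energy_eV) ** (1.0 / r)      # Tauc variable, here (A*E)^2
          centre = int(np.argmax(np.gradient(y, energy_eV)))
          lo, hi = max(0, centre - fit_window), min(len(y), centre + fit_window)
          slope, intercept = np.polyfit(energy_eV[lo:hi], y[lo:hi], 1)
          return -intercept / slope                      # where the fit crosses y = 0

      E = np.linspace(1.5, 3.5, 400)                     # hypothetical spectrum
      A = 1.0 - np.exp(-np.clip(E - 2.1, 0.0, None) / 0.15)
      print(round(tauc_band_gap(E, A), 2))               # lands close to the 2.1 eV onset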

  11. Ultra-High Throughput Synthesis of Nanoparticles with Homogeneous Size Distribution Using a Coaxial Turbulent Jet Mixer

    PubMed Central

    2015-01-01

    High-throughput production of nanoparticles (NPs) with controlled quality is critical for their clinical translation into effective nanomedicines for diagnostics and therapeutics. Here we report a simple and versatile coaxial turbulent jet mixer that can synthesize a variety of NPs at high throughput up to 3 kg/d, while maintaining the advantages of homogeneity, reproducibility, and tunability that are normally accessible only in specialized microscale mixing devices. The device fabrication does not require specialized machining and is easy to operate. As one example, we show reproducible, high-throughput formulation of siRNA-polyelectrolyte polyplex NPs that exhibit effective gene knockdown but exhibit significant dependence on batch size when formulated using conventional methods. The coaxial turbulent jet mixer can accelerate the development of nanomedicines by providing a robust and versatile platform for preparation of NPs at throughputs suitable for in vivo studies, clinical trials, and industrial-scale production. PMID:24824296

  12. The Newick utilities: high-throughput phylogenetic tree processing in the UNIX shell.

    PubMed

    Junier, Thomas; Zdobnov, Evgeny M

    2010-07-01

    We present a suite of Unix shell programs for processing any number of phylogenetic trees of any size. They perform frequently-used tree operations without requiring user interaction. They also allow tree drawing as scalable vector graphics (SVG), suitable for high-quality presentations and further editing, and as ASCII graphics for command-line inspection. As an example we include an implementation of bootscanning, a procedure for finding recombination breakpoints in viral genomes. C source code, Python bindings and executables for various platforms are available from http://cegg.unige.ch/newick_utils. The distribution includes a manual and example data. The package is distributed under the BSD License. thomas.junier@unige.ch

  13. Automated sample-preparation technologies in genome sequencing projects.

    PubMed

    Hilbert, H; Lauber, J; Lubenow, H; Düsterhöft, A

    2000-01-01

    A robotic workstation system (BioRobot 9600, QIAGEN) and a 96-well UV spectrophotometer (Spectramax 250, Molecular Devices) were integrated into the process of high-throughput automated sequencing of double-stranded plasmid DNA templates. An automated 96-well miniprep kit protocol (QIAprep Turbo, QIAGEN) provided high-quality plasmid DNA from shotgun clones. The DNA prepared by this procedure was used to generate more than two megabases of final sequence data for two genomic projects (Arabidopsis thaliana and Schizosaccharomyces pombe), three thousand expressed sequence tags (ESTs) plus half a megabase of human full-length cDNA clones, and approximately 53,000 single reads for a whole genome shotgun project (Pseudomonas putida).

  14. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, much effort has been put into improving MDLC-based strategies, including "top-down" and "bottom-up" approaches, to enable highly sensitive qualitative and quantitative analysis of proteins and to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification were also developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes recent advances in such techniques and their applications in qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Preparation of Protein Samples for NMR Structure, Function, and Small Molecule Screening Studies

    PubMed Central

    Acton, Thomas B.; Xiao, Rong; Anderson, Stephen; Aramini, James; Buchwald, William A.; Ciccosanti, Colleen; Conover, Ken; Everett, John; Hamilton, Keith; Huang, Yuanpeng Janet; Janjua, Haleema; Kornhaber, Gregory; Lau, Jessica; Lee, Dong Yup; Liu, Gaohua; Maglaqui, Melissa; Ma, Lichung; Mao, Lei; Patel, Dayaban; Rossi, Paolo; Sahdev, Seema; Shastry, Ritu; Swapna, G.V.T.; Tang, Yeufeng; Tong, Saichiu; Wang, Dongyan; Wang, Huang; Zhao, Li; Montelione, Gaetano T.

    2014-01-01

    In this chapter, we concentrate on the production of high quality protein samples for NMR studies. In particular, we provide an in-depth description of recent advances in the production of NMR samples and their synergistic use with recent advancements in NMR hardware. We describe the protein production platform of the Northeast Structural Genomics Consortium, and outline our high-throughput strategies for producing high quality protein samples for nuclear magnetic resonance (NMR) studies. Our strategy is based on the cloning, expression and purification of 6X-His-tagged proteins using T7-based Escherichia coli systems and isotope enrichment in minimal media. We describe 96-well ligation-independent cloning and analytical expression systems, parallel preparative scale fermentation, and high-throughput purification protocols. The 6X-His affinity tag allows for a similar two-step purification procedure implemented in a parallel high-throughput fashion that routinely results in purity levels sufficient for NMR studies (> 97% homogeneity). Using this platform, the protein open reading frames of over 17,500 different targeted proteins (or domains) have been cloned as over 28,000 constructs. Nearly 5,000 of these proteins have been purified to homogeneity in tens of milligram quantities (see Summary Statistics, http://nesg.org/statistics.html), resulting in more than 950 new protein structures, including more than 400 NMR structures, deposited in the Protein Data Bank. The Northeast Structural Genomics Consortium pipeline has been effective in producing protein samples of both prokaryotic and eukaryotic origin. Although this paper describes our entire pipeline for producing isotope-enriched protein samples, it focuses on the major updates introduced during the last 5 years (Phase 2 of the National Institute of General Medical Sciences Protein Structure Initiative). Our advanced automated and/or parallel cloning, expression, purification, and biophysical screening technologies are suitable for implementation in a large individual laboratory or by a small group of collaborating investigators for structural biology, functional proteomics, ligand screening and structural genomics research. PMID:21371586

  16. Characterization of matrix effects in developing rugged high-throughput LC-MS/MS methods for bioanalysis.

    PubMed

    Li, Fumin; Wang, Jun; Jenkins, Rand

    2016-05-01

    There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.

  17. Raman microspectrometer combined with scattering microscopy and lensless imaging for bacteria identification

    NASA Astrophysics Data System (ADS)

    Strola, S. A.; Schultz, E.; Allier, C. P.; DesRoches, B.; Lemmonier, J.; Dinten, J.-M.

    2013-03-01

    In this paper, we report on a compact prototype capable of lensfree imaging, Raman spectrometry and scattering microscopy of bacteria samples. This instrument allows high-throughput real-time characterization without the need for markers, making it potentially suitable for label-free biomedical and environmental field applications. Samples are illuminated from above with a focused, collimated 532 nm laser beam and can be scanned in x, y and z. Bacteria detection is based on emerging lensfree imaging technology able to localize cells of interest over a large field of view of 24 mm2. The Raman signal and scattered light are then collected simultaneously by separate measurement arms. In the first arm, the emission light is fed by a fiber into a prototype spectrometer developed by Tornado Spectral Systems based on Tornado's novel High Throughput Virtual Slit (HTVS) technology. The enhanced light throughput in the spectral region of interest (500-1800 cm-1) reduces Raman acquisition time to a few seconds, thus facilitating experimental protocols and avoiding the bacterial deterioration induced by laser thermal heating. Scattered light impinging on the second arm is collected onto a charge-coupled device. The reconstructed image allows study of single-bacterium diffraction patterns and their specific structural features. The characterization and identification of different bacteria have been performed to validate and optimize the acquisition system and the component setup. The results obtained demonstrate the benefits of combining these three techniques by providing precise bacteria localization, chemical composition and a description of morphology. The procedure for rapid identification of particular pathogenic bacteria in a sample is illustrated.

  18. High throughput system for magnetic manipulation of cells, polymers, and biomaterials

    PubMed Central

    Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.

    2008-01-01

    In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and a power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of S2R+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357
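
    The power-law calibration quoted above can be recovered from force-distance data by a straight-line fit in log-log space; the bead forces below are made up to follow roughly F ~ r^-2.7, so the snippet only illustrates the fitting step, not the authors' calibration procedure.

      # Hedged sketch: extract a power-law exponent from hypothetical calibration data.
      import numpy as np

      r = np.array([5.0, 7.5, 10.0, 15.0, 20.0])      # pole-to-bead distance (um)
      F = np.array([1.10, 0.37, 0.17, 0.057, 0.026])  # measured force (nN), synthetic

      exponent, log_prefactor = np.polyfit(np.log(r), np.log(F), 1)
      print(f"F ~ r^{exponent:.2f}")                  # expect an exponent near -2.7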

  19. Identification of functional modules using network topology and high-throughput data.

    PubMed

    Ulitsky, Igor; Shamir, Ron

    2007-01-26

    With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in analysis of high throughput data.
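
    A minimal sketch of the module-finding idea described above, assuming a networkx graph and a dictionary of pairwise similarity values over hypothetical node names: a connected sub-network is grown greedily from a seed for as long as its average pairwise similarity improves. The authors' algorithm is considerably more sophisticated; this only illustrates the concept.

      # Hedged sketch: greedy growth of a high-similarity connected module.
      import networkx as nx

      def grow_module(G, similarity, seed):
          """Greedily add neighbours that raise the module's average pairwise similarity.
          `similarity` maps frozenset({u, v}) -> score (missing pairs count as 0)."""
          def avg_sim(nodes):
              pairs = [(u, v) for i, u in enumerate(nodes) for v in nodes[i + 1:]]
              return sum(similarity.get(frozenset(p), 0.0) for p in pairs) / len(pairs) if pairs else 0.0

          module = [seed]
          while True:
              frontier = {n for m in module for n in G.neighbors(m)} - set(module)
              best = max(frontier, key=lambda n: avg_sim(module + [n]), default=None)
              if best is None or avg_sim(module + [best]) <= avg_sim(module):
                  return module
              module.append(best)

      G = nx.Graph([("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")])   # toy network
      sim = {frozenset(p): s for p, s in [(("a", "b"), 0.9), (("b", "c"), 0.8),
                                          (("a", "c"), 0.7), (("c", "d"), 0.1)]}
      print(grow_module(G, sim, seed="a"))   # -> ['a', 'b']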

  20. Evaluation of a new modified QuEChERS method for the monitoring of carbamate residues in high-fat cheeses by using UHPLC-MS/MS.

    PubMed

    Hamed, Ahmed M; Moreno-González, David; Gámiz-Gracia, Laura; García-Campaña, Ana M

    2017-01-01

    A simple and efficient method for the determination of 28 carbamates in high-fat cheeses is proposed. The methodology is based on a modified quick, easy, cheap, effective, rugged, and safe procedure as sample treatment using a new sorbent (Z-Sep+) followed by ultra-high performance liquid chromatography with tandem mass spectrometry determination. The method has been validated in different kinds of cheese (Gorgonzola, Roquefort, and Camembert), achieving recoveries of 70-115%, relative standard deviations lower than 13% and limits of quantification lower than 5.4 μg/kg, below the maximum residue levels tolerated for these compounds by the European legislation. The matrix effect was lower than ±30% for all the studied pesticides. The combination of ultra-high performance liquid chromatography and tandem mass spectrometry with this modified quick, easy, cheap, effective, rugged, and safe procedure using Z-Sep+ allowed a high sample throughput and an efficient cleaning of extracts for the control of these residues in cheeses with a high fat content. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.

    PubMed

    Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan

    2016-04-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.

  2. A high-throughput AO/PI-based cell concentration and viability detection method using the Celigo image cytometry.

    PubMed

    Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean

    2016-10-01

    To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require the measurement of multiple samples and conditions; thus, automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, greatly reducing the time required compared with the single-sample automated cell counter. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.

  3. Chromato-panning: an efficient new mode of identifying suitable ligands from phage display libraries

    PubMed Central

    Noppe, Wim; Plieva, Fatima; Galaev, Igor Yu; Pottel, Hans; Deckmyn, Hans; Mattiasson, Bo

    2009-01-01

    Background Phage Display technology is a well-established technique for high throughput screening of affinity ligands. Here we describe a new compact chromato-panning procedure for selection of suitable binders from a phage peptide display library. Results Both phages and E. coli cells pass unhindered through the interconnected pores of a macroporous gel, a so-called cryogel. After coupling a ligand to a monolithic cryogel column, the phage library was applied on the column and non-bound phages were washed out. The selection of strong phage-binders was achieved already after the first panning cycle due to the efficient separation of phage-binders from phage-non-binders in chromatographic mode rather than in batch mode as in traditional biopanning procedures. E. coli cells were applied on the column for infection with the specifically bound phages. Conclusion Chromato-panning combines several steps of the panning procedure, resulting in a 4–8-fold decrease in the total time needed for phage selection. PMID:19292898

  4. MultiGeMS: detection of SNVs from multiple samples using model selection on high-throughput sequencing data.

    PubMed

    Murillo, Gabriel H; You, Na; Su, Xiaoquan; Cui, Wei; Reilly, Muredach P; Li, Mingyao; Ning, Kang; Cui, Xinping

    2016-05-15

    Single nucleotide variant (SNV) detection procedures are being utilized as never before to analyze the recent abundance of high-throughput DNA sequencing data, both on single and multiple sample datasets. Building on previously published work with the single sample SNV caller genotype model selection (GeMS), a multiple sample version of GeMS (MultiGeMS) is introduced. Unlike other popular multiple sample SNV callers, the MultiGeMS statistical model accounts for enzymatic substitution sequencing errors. It also addresses the multiple testing problem endemic to multiple sample SNV calling and utilizes high performance computing (HPC) techniques. A simulation study demonstrates that MultiGeMS ranks highest in precision among a selection of popular multiple sample SNV callers, while showing exceptional recall in calling common SNVs. Further, both simulation studies and real data analyses indicate that MultiGeMS is robust to low-quality data. We also demonstrate that accounting for enzymatic substitution sequencing errors not only improves SNV call precision at low mapping quality regions, but also improves recall at reference allele-dominated sites with high mapping quality. The MultiGeMS package can be downloaded from https://github.com/cui-lab/multigems. Contact: xinping.cui@ucr.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
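
    The abstract notes that MultiGeMS addresses the multiple testing problem of calling SNVs across many sites; the Benjamini-Hochberg false discovery rate procedure sketched below is a standard, generic correction shown purely for illustration and may differ from the package's own approach.

      # Hedged sketch: Benjamini-Hochberg FDR control over a vector of p-values.
      import numpy as np

      def benjamini_hochberg(pvalues, alpha=0.05):
          """Return a boolean array marking p-values significant at FDR level alpha."""
          p = np.asarray(pvalues, dtype=float)
          m = len(p)
          order = np.argsort(p)
          thresholds = (np.arange(1, m + 1) / m) * alpha     # k/m * alpha
          passing = p[order] <= thresholds
          significant = np.zeros(m, dtype=bool)
          if passing.any():
              k = np.max(np.nonzero(passing)[0])             # largest passing rank
              significant[order[:k + 1]] = True
          return significant

      print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))
      # -> [ True  True False False False False]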

  5. Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y; Grossman, JC

    2014-12-01

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.

  6. Microplate-based high throughput screening procedure for the isolation of lipid-rich marine microalgae

    PubMed Central

    2011-01-01

    We describe a new selection method based on BODIPY (4,4-difluoro-1,3,5,7-tetramethyl-4-bora-3a,4a-diaza-s-indacene) staining, fluorescence activated cell sorting (FACS) and microplate-based isolation of lipid-rich microalgae from an environmental sample. Our results show that direct sorting onto solid medium upon FACS can save about 3 weeks during the scale-up process as compared with the growth of the same cultures in liquid medium. This approach enabled us to isolate a biodiverse collection of several axenic and unialgal cultures of different phyla. PMID:22192119

  7. Microplate-based high throughput screening procedure for the isolation of lipid-rich marine microalgae.

    PubMed

    Pereira, Hugo; Barreira, Luísa; Mozes, André; Florindo, Cláudia; Polo, Cristina; Duarte, Catarina V; Custódio, Luísa; Varela, João

    2011-12-22

    We describe a new selection method based on BODIPY (4,4-difluoro-1,3,5,7-tetramethyl-4-bora-3a,4a-diaza-s-indacene) staining, fluorescence activated cell sorting (FACS) and microplate-based isolation of lipid-rich microalgae from an environmental sample. Our results show that direct sorting onto solid medium upon FACS can save about 3 weeks during the scale-up process as compared with the growth of the same cultures in liquid medium. This approach enabled us to isolate a biodiverse collection of several axenic and unialgal cultures of different phyla.

  8. Transdermal Diagnosis of Malaria Using Vapor Nanobubbles.

    PubMed

    Lukianova-Hleb, Ekaterina; Bezek, Sarah; Szigeti, Reka; Khodarev, Alexander; Kelley, Thomas; Hurrell, Andrew; Berba, Michail; Kumar, Nirbhay; D'Alessandro, Umberto; Lapotko, Dmitri

    2015-07-01

    A fast, precise, noninvasive, high-throughput, and simple approach for detecting malaria in humans and mosquitoes is not possible with current techniques that depend on blood sampling, reagents, facilities, tedious procedures, and trained personnel. We designed a device for rapid (20-second) noninvasive diagnosis of Plasmodium falciparum infection in a malaria patient without drawing blood or using any reagent. This method uses transdermal optical excitation and acoustic detection of vapor nanobubbles around intraparasite hemozoin. The same device also identified individual malaria parasite-infected Anopheles mosquitoes in a few seconds and can be realized as a low-cost universal tool for clinical and field diagnoses.

  9. Development of high through-put Sr isotope analysis for monitoring reservoir integrity for CO2 storage.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Andy; Jain, Jinesh; Stewart, Brian

    2012-01-01

    Recent innovations in multi-collector ICP-mass spectrometry (MC-ICP-MS) have allowed for rapid and precise measurements of isotope ratios in geological samples. Naturally occurring Sr isotopes have the potential for use in Monitoring, Verification, and Accounting (MVA) associated with geologic CO2 storage. Sr isotopes can be useful for: sensitive tracking of brine migration; determining seal rock leakage; studying fluid/rock reactions. We have optimized separation chemistry procedures that will allow operators to prepare samples for Sr isotope analysis off site using rapid, low-cost methods.

  10. High Throughput Determination of VX in Drinking Water by ...

    EPA Pesticide Factsheets

    Methods Report This document provides the standard operating procedure for determination of the chemical warfare agent VX (O-Ethyl S-2-Diisopropylamino-Ethyl Methylphosphonothioate) in drinking water by isotope dilution liquid chromatography tandem mass spectrometry (LC/MS/MS). This method was adapted from one initially developed by the Centers for Disease Control and Prevention, National Center for Environmental Health, for the determination and quantitation of VX in aqueous matrices. This method is designed to support site-specific cleanup goals of environmental remediation activities following a homeland security incident involving this analyte.

  11. 40 CFR Table 3 to Subpart Eeee of... - Operating Limits-High Throughput Transfer Racks

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 CFR (Protection of Environment), 2010-07-01, Table 3 to Subpart EEEE of Part 63, Operating Limits-High Throughput Transfer Racks: As stated in § 63.2346(e), you must comply with the operating limits for existing...

  12. Sequencing intractable DNA to close microbial genomes.

    PubMed

    Hurt, Richard A; Brown, Steven D; Podar, Mircea; Palumbo, Anthony V; Elias, Dwayne A

    2012-01-01

    Advances in high-throughput DNA sequencing technologies have supported a rapid proliferation of microbial genome sequencing projects, providing the genetic blueprint for in-depth studies. Oftentimes, difficult-to-sequence regions in microbial genomes are ruled "intractable", resulting in a growing number of genomes with sequence gaps deposited in databases. A procedure was developed to sequence such problematic regions in the "non-contiguous finished" Desulfovibrio desulfuricans ND132 genome (6 intractable gaps) and the Desulfovibrio africanus genome (1 intractable gap). The polynucleotides surrounding each gap formed GC-rich secondary structures, making the regions refractory to amplification and sequencing. Strand-displacing DNA polymerases used in concert with a novel ramped PCR extension cycle supported amplification and closure of all gap regions in both genomes. The developed procedures support accurate gene annotation and provide a step-wise method that reduces the effort required for genome finishing.

  13. Validation of a reversed phase high performance thin layer chromatographic-densitometric method for secoisolariciresinol diglucoside determination in flaxseed.

    PubMed

    Coran, Silvia A; Bartolucci, Gianluca; Bambagiotti-Alberti, Massimo

    2008-10-17

    The validation of an HPTLC-densitometric method for the determination of secoisolariciresinol diglucoside (SDG) in flaxseed was performed, improving the reproducibility of a previously reported HPTLC densitometric procedure by the use of fully wettable reversed phase plates (silica gel 60 RP18W F(254S), 10 cm × 10 cm) with a MeOH:HCOOH 0.1% (40:60, v/v) mobile phase. The analysis required only alkaline hydrolysis of undefatted samples in aqueous medium and densitometry of the HPTLC runs at 282 nm. The method was validated following the protocol proposed by the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP), giving rise to a dependable, high-throughput procedure well suited to routine application. SDG was quantified in the range of 321–1071 ng, with RSDs of repeatability and intermediate precision not exceeding 3.61% and accuracy inside the acceptance limits. Flaxseed from five cultivars of different origin was selected as the test bed.

  14. Compound Transfer by Acoustic Droplet Ejection Promotes Quality and Efficiency in Ultra-High-Throughput Screening Campaigns.

    PubMed

    Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H

    2016-02-01

    Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.

  15. Aircraft Configuration and Flight Crew Compliance with Procedures While Conducting Flight Deck Based Interval Management (FIM) Operations

    NASA Technical Reports Server (NTRS)

    Shay, Rick; Swieringa, Kurt A.; Baxley, Brian T.

    2012-01-01

    Flight deck based Interval Management (FIM) applications using ADS-B are being developed to improve both the safety and capacity of the National Airspace System (NAS). FIM is expected to improve the safety and efficiency of the NAS by giving pilots the technology and procedures to precisely achieve an interval behind the preceding aircraft by a specific point. Concurrently but independently, Optimized Profile Descents (OPD) are being developed to help reduce fuel consumption and noise; however, the range of speeds available when flying an OPD decreases the delivery precision of aircraft to the runway. This requires the addition of a spacing buffer between aircraft, reducing system throughput. FIM addresses this problem by providing pilots with speed guidance to achieve a precise interval behind another aircraft, even while flying optimized descents. The Interval Management with Spacing to Parallel Dependent Runways (IMSPiDR) human-in-the-loop experiment employed 24 commercial pilots to explore the use of FIM equipment to conduct spacing operations behind two aircraft arriving to parallel runways, while flying an OPD during high-density operations. This paper describes the impact of variations in pilot operations, in particular how pilots configured the aircraft, their compliance with FIM operating procedures, and their response to changes in the FIM speed. An example of a pilot using the displayed FIM speed incorrectly is also discussed. Finally, this paper examines the relationship between achieving airline operational goals for individual aircraft and the need for ATC to deliver aircraft to the runway with greater precision. The results show that aircraft can fly an OPD and conduct FIM operations to dependent parallel runways, enabling operational goals to be achieved efficiently while maintaining system throughput.

  16. Status of NASA's Evolutionary Xenon Thruster (NEXT) Long-Duration Test as of 50,000 h and 900 kg Throughput

    NASA Technical Reports Server (NTRS)

    Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Patterson, Michael J.

    2015-01-01

    The NASA's Evolutionary Xenon Thruster (NEXT) project is developing the next-generation solar electric propulsion ion propulsion system with significant enhancements beyond the state-of-the-art NASA Solar Electric Propulsion Technology Application Readiness (NSTAR) ion propulsion system in order to provide future NASA science missions with enhanced propulsion capabilities. As part of a comprehensive thruster service life assessment, the NEXT Long-Duration Test (LDT) was initiated in June 2005 to demonstrate throughput capability and validate thruster service life modeling. The NEXT LDT exceeded its original qualification throughput requirement of 450 kg in December 2009. To date, the NEXT LDT has set records for electric propulsion lifetime and has demonstrated 50,170 h of operation, processed 902 kg of propellant, and delivered 34.9 MN-s of total impulse. The NEXT thruster design mitigated several life-limiting mechanisms encountered in the NSTAR design, dramatically increasing service life capability. Various component erosion rates compare favorably to the pretest predictions based upon semi-empirical ion thruster models. The NEXT LDT either met or exceeded all of its original goals regarding lifetime demonstration, performance and wear characterization, and modeling validation. In light of recent budget constraints and to focus on development of other components of the NEXT ion propulsion system, a voluntary termination procedure for the NEXT LDT began in April 2013. As part of this termination procedure, a comprehensive post-test performance characterization was conducted across all operating conditions of the NEXT throttle table. These measurements were found to be consistent with prior data that show minimal degradation of performance over the thruster's 50 kh lifetime. Repair of various diagnostics within the test facility is presently planned while keeping the thruster under high vacuum conditions. These diagnostics will provide additional critical information on the current state of the thruster, in regards to performance and wear, prior to destructive post-test analyses performed on the thruster under atmosphere conditions.

  17. Status of NASA's Evolutionary Xenon Thruster (NEXT) Long-Duration Test as of 50,000 h and 900 kg Throughput

    NASA Technical Reports Server (NTRS)

    Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Patterson, Michael J.

    2013-01-01

    The NASA's Evolutionary Xenon Thruster (NEXT) project is developing the next-generation solar electric propulsion ion propulsion system with significant enhancements beyond the state-of-the-art NASA Solar Electric Propulsion Technology Application Readiness (NSTAR) ion propulsion system in order to provide future NASA science missions with enhanced propulsion capabilities. As part of a comprehensive thruster service life assessment, the NEXT Long-Duration Test (LDT) was initiated in June 2005 to demonstrate throughput capability and validate thruster service life modeling. The NEXT LDT exceeded its original qualification throughput requirement of 450 kg in December 2009. To date, the NEXT LDT has set records for electric propulsion lifetime and has demonstrated 50,170 hours of operation, processed 902 kg of propellant, and delivered 34.9 MN-s of total impulse. The NEXT thruster design mitigated several life-limiting mechanisms encountered in the NSTAR design, dramatically increasing service life capability. Various component erosion rates compare favorably to the pretest predictions based upon semi-empirical ion thruster models. The NEXT LDT either met or exceeded all of its original goals regarding lifetime demonstration, performance and wear characterization, and modeling validation. In light of recent budget constraints and to focus on development of other components of the NEXT ion propulsion system, a voluntary termination procedure for the NEXT LDT began in April 2013. As part of this termination procedure, a comprehensive post-test performance characterization was conducted across all operating conditions of the NEXT throttle table. These measurements were found to be consistent with prior data that show minimal degradation of performance over the thruster's 50 kh lifetime. Repair of various diagnostics within the test facility is presently planned while keeping the thruster under high vacuum conditions. These diagnostics will provide additional critical information on the current state of the thruster, in regards to performance and wear, prior to destructive post-test analyses performed on the thruster under atmosphere conditions.

  18. Exploring Genome-Wide Expression Profiles Using Machine Learning Techniques.

    PubMed

    Kebschull, Moritz; Papapanou, Panos N

    2017-01-01

    Although contemporary high-throughput -omics methods produce high-dimensional data, the resulting wealth of information is difficult to assess using traditional statistical procedures. Machine learning methods facilitate the detection of additional patterns, beyond the mere identification of lists of features that differ between groups. Here, we demonstrate the utility of (1) supervised classification algorithms in class validation, and (2) unsupervised clustering in class discovery. We use data from our previous work that described the transcriptional profiles of gingival tissue samples obtained from subjects suffering from chronic or aggressive periodontitis (1) to test whether the two diagnostic entities were also characterized by differences on the molecular level, and (2) to search for a novel, alternative classification of periodontitis based on the tissue transcriptomes. Using machine learning technology, we provide evidence for diagnostic imprecision in the currently accepted classification of periodontitis, and demonstrate that a novel, alternative classification based on differences in gingival tissue transcriptomes is feasible. The outlined procedures allow for the unbiased interrogation of high-dimensional datasets for characteristic underlying classes, and are applicable to a broad range of -omics data.
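
    To make the two uses of machine learning concrete, the hypothetical Python sketch below runs a cross-validated supervised classifier (class validation) and unsupervised k-means clustering (class discovery) on an expression matrix. The data are simulated placeholders, not the gingival transcriptomes analyzed in the study.

```python
# Minimal sketch of class validation vs. class discovery on an expression matrix.
# Synthetic data stand in for the gingival transcriptomes described in the abstract.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))      # 60 tissue samples x 500 genes (placeholder)
y = np.repeat([0, 1], 30)           # clinical labels, e.g. chronic vs. aggressive
X[y == 1, :25] += 1.0               # inject a weak class signal for illustration

# (1) Class validation: can the clinical labels be recovered from the transcriptome?
clf_acc = cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()
print(f"cross-validated accuracy for the clinical classes: {clf_acc:.2f}")

# (2) Class discovery: do the samples fall into unsupervised groups of their own?
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: silhouette = {silhouette_score(X, labels):.2f}")
```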

  19. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality

    PubMed Central

    Albert, Océane; Reintsch, Wolfgang E.; Chan, Peter; Robaire, Bernard

    2016-01-01

    STUDY QUESTION Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? SUMMARY ANSWER We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. WHAT IS KNOWN ALREADY The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. STUDY DESIGN, SIZE, DURATION The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3–5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. PARTICIPANTS/MATERIALS, SETTING, METHODS Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. MAIN RESULTS AND THE ROLE OF CHANCE We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi-manual analysis software. Using this method, a cross-sectional study on 123 men showed no significant correlation between sperm concentration and sperm DNA damage, confirming the existence of hidden chromatin damage in men with apparently normal semen characteristics, and a significant correlation between percentage DNA in the tail and percentage of progressively motile spermatozoa. Finally, the use of DNA damage profiles helped to distinguish subjects between and within sperm concentration categories, and allowed a determination of the proportion of highly damaged cells. LIMITATIONS, REASONS FOR CAUTION The main limitations of the HT-COMET are the high, yet indispensable, investment in an automated liquid handling system and heating block to ensure accuracy, and the availability of an automated plate reading microscope and analysis software. WIDER IMPLICATIONS OF THE FINDINGS This standardized HT-COMET assay offers many advantages, including higher accuracy and evenness due to automation of sensitive steps, a 14.4-fold increase in sample analysis capacity, and an imaging and scoring time of 1 min/well. Overall, HT-COMET offers a decrease in total experimental time of more than 90%. Hence, this assay constitutes a more efficient option to assess sperm chromatin quality, paves the way to using this assay to screen large cohorts, and holds prognostic value for infertile patients. STUDY FUNDING/COMPETING INTEREST(S) Funded by the CIHR Institute of Human Development, Child and Youth Health (IHDCYH; RHF 100625). O.A. 
is a fellow supported by the Fonds de la Recherche du Québec - Santé (FRQS) and the CIHR Training Program in Reproduction, Early Development, and the Impact on Health (REDIH). B.R. is a James McGill Professor. The authors declare no conflicts of interest. PMID:26975326
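
    The headline comet metric above, percentage DNA in the tail, is computed per cell from the segmented head and tail fluorescence and then summarized per well. The following Python snippet is a hypothetical illustration of that bookkeeping; it is not the integrated high-content scoring pipeline described in the study.

```python
# Hypothetical per-cell comet scoring: % DNA in tail from integrated intensities.
import statistics

def percent_tail_dna(head_intensity: float, tail_intensity: float) -> float:
    """Fraction of total comet fluorescence found in the tail, as a percentage."""
    total = head_intensity + tail_intensity
    return 100.0 * tail_intensity / total if total > 0 else 0.0

# One well = many single cells; summarize with a median and a "highly damaged" fraction.
cells = [(9500, 500), (7000, 3000), (1200, 8800), (8200, 1800)]  # (head, tail) placeholders
scores = [percent_tail_dna(h, t) for h, t in cells]
print("median % tail DNA:", statistics.median(scores))
print("fraction of cells > 50% tail DNA:", sum(s > 50 for s in scores) / len(scores))
```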

  20. Research Of Airborne Precision Spacing to Improve Airport Arrival Operations

    NASA Technical Reports Server (NTRS)

    Barmore, Bryan E.; Baxley, Brian T.; Murdoch, Jennifer L.

    2011-01-01

    In September 2004, the European Organization for the Safety of Air Navigation (EUROCONTROL) and the United States Federal Aviation Administration (FAA) signed a Memorandum of Cooperation to mutually develop, modify, test, and evaluate systems, procedures, facilities, and devices to meet the need for safe and efficient air navigation and air traffic control in the future. In the United States and Europe, these efforts are defined within the architectures of the Next Generation Air Transportation System (NextGen) Program and Single European Sky Air Traffic Management Research (SESAR) Program respectively. Both programs have identified Airborne Spacing as a critical component, with Automatic Dependent Surveillance Broadcast (ADS-B) as a key enabler. Increased interest in reducing airport community noise and the escalating cost of aviation fuel has led to the use of Continuous Descent Arrival (CDA) procedures to reduce noise, emissions, and fuel usage compared to current procedures. To provide these operational enhancements, arrival flight paths into terminal areas are planned around continuous vertical descents that are closer to an optimum trajectory than those in use today. The profiles are designed to be near-idle descents from cruise altitude to the Final Approach Fix (FAF) and are typically without any level segments. By staying higher and faster than conventional arrivals, CDAs also save flight time for the aircraft operator. The drawback is that the variation of optimized trajectories for different types and weights of aircraft requires the Air Traffic Controller to provide more airspace around an aircraft on a CDA than on a conventional arrival procedure. This additional space decreases the throughput rate of the destination airport. Airborne self-spacing concepts have been developed to increase the throughput at high-demand airports by managing the inter-arrival spacing to be more precise and consistent using on-board guidance. It has been proposed that the additional space needed around an aircraft performing a CDA could be reduced or eliminated when using airborne spacing techniques.

  1. High-throughput measurements of the optical redox ratio using a commercial microplate reader.

    PubMed

    Cannon, Taylor M; Shah, Amy T; Walsh, Alex J; Skala, Melissa C

    2015-01-01

    There is a need for accurate, high-throughput, functional measures to gauge the efficacy of potential drugs in living cells. As an early marker of drug response in cells, cellular metabolism provides an attractive platform for high-throughput drug testing. Optical techniques can noninvasively monitor NADH and FAD, two autofluorescent metabolic coenzymes. The autofluorescent redox ratio, defined as the autofluorescence intensity of NADH divided by that of FAD, quantifies relative rates of cellular glycolysis and oxidative phosphorylation. However, current microscopy methods for redox ratio quantification are time-intensive and low-throughput, limiting their practicality in drug screening. Alternatively, high-throughput commercial microplate readers quickly measure fluorescence intensities for hundreds of wells. This study found that a commercial microplate reader can differentiate the receptor status of breast cancer cell lines (p < 0.05) based on redox ratio measurements without extrinsic contrast agents. Furthermore, microplate reader redox ratio measurements resolve response (p < 0.05) and lack of response (p > 0.05) in cell lines that are responsive and nonresponsive, respectively, to the breast cancer drug trastuzumab. These studies indicate that the microplate readers can be used to measure the redox ratio in a high-throughput manner and are sensitive enough to detect differences in cellular metabolism that are consistent with microscopy results.
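
    The optical redox ratio itself is a simple per-well quantity: NADH autofluorescence intensity divided by FAD intensity. A minimal Python sketch of reducing plate-reader intensities to redox ratios is shown below; the well names and background handling are assumptions for illustration, not the authors' protocol.

```python
# Hypothetical reduction of plate-reader intensities to optical redox ratios (NADH / FAD).
nadh = {"A1": 1520.0, "A2": 1480.0, "B1": 2210.0}   # placeholder well intensities
fad  = {"A1":  980.0, "A2": 1010.0, "B1":  760.0}
nadh_blank, fad_blank = 110.0, 95.0                  # assumed background wells

redox = {
    well: (nadh[well] - nadh_blank) / (fad[well] - fad_blank)
    for well in nadh
    if (fad[well] - fad_blank) > 0
}
for well, ratio in sorted(redox.items()):
    print(f"{well}: redox ratio = {ratio:.2f}")
```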

  2. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    PubMed Central

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  3. An image analysis toolbox for high-throughput C. elegans assays

    PubMed Central

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.

    2012-01-01

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656

  4. High-throughput, image-based screening of pooled genetic variant libraries

    PubMed Central

    Emanuel, George; Moffitt, Jeffrey R.; Zhuang, Xiaowei

    2018-01-01

    Image-based, high-throughput screening of genetic perturbations will advance both biology and biotechnology. We report a high-throughput screening method that allows diverse genotypes and corresponding phenotypes to be imaged in numerous individual cells. We achieve genotyping by introducing barcoded genetic variants into cells and using massively multiplexed FISH to measure the barcodes. We demonstrated this method by screening mutants of the fluorescent protein YFAST, yielding brighter and more photostable YFAST variants. PMID:29083401

  5. Experimental Design for Combinatorial and High Throughput Materials Development

    NASA Astrophysics Data System (ADS)

    Cawse, James N.

    2002-12-01

    In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include: -High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space -Combinatorial Mapping of Polymer Blends Phase Behavior -Split-Plot Designs -Artificial Neural Networks in Catalyst Development -The Monte Carlo Approach to Library Design and Redesign This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.

  6. Deciphering the genomic targets of alkylating polyamide conjugates using high-throughput sequencing

    PubMed Central

    Chandran, Anandhakumar; Syed, Junetha; Taylor, Rhys D.; Kashiwazaki, Gengo; Sato, Shinsuke; Hashiya, Kaori; Bando, Toshikazu; Sugiyama, Hiroshi

    2016-01-01

    Chemically engineered small molecules targeting specific genomic sequences play an important role in drug development research. Pyrrole-imidazole polyamides (PIPs) are a group of molecules that can bind to the DNA minor-groove and can be engineered to target specific sequences. Their biological effects rely primarily on their selective DNA binding. However, the binding mechanism of PIPs at the chromatinized genome level is poorly understood. Herein, we report a method using high-throughput sequencing to identify the DNA-alkylating sites of PIP-indole-seco-CBI conjugates. High-throughput sequencing analysis of conjugate 2 showed highly similar DNA-alkylating sites on synthetic oligos (histone-free DNA) and on human genomes (chromatinized DNA context). To our knowledge, this is the first report identifying alkylation sites across genomic DNA by alkylating PIP conjugates using high-throughput sequencing. PMID:27098039

  7. Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.

    PubMed

    Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S

    1994-01-01

    The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.
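
    Competition-binding data of the kind described here are typically reduced to an IC50 by fitting a sigmoidal (four-parameter logistic) curve to bound radioligand versus competitor concentration. The Python sketch below performs a generic fit on simulated counts; it is not the authors' analysis and all parameter values are placeholders.

```python
# Hypothetical one-site competition fit: bound counts vs. log10(competitor concentration).
import numpy as np
from scipy.optimize import curve_fit

def four_pl(logc, top, bottom, log_ic50, hill):
    """Four-parameter logistic: signal falls from `top` to `bottom` around log_ic50."""
    return bottom + (top - bottom) / (1.0 + 10.0 ** (hill * (logc - log_ic50)))

logc = np.linspace(-10, -5, 8)                       # competitor concentration, log10(M)
counts = four_pl(logc, 5000, 400, -7.5, 1.0)         # simulated bound counts (CPM)
counts += np.random.default_rng(1).normal(0, 100, logc.size)

p0 = [counts.max(), counts.min(), -7.0, 1.0]
(top, bottom, log_ic50, hill), _ = curve_fit(four_pl, logc, counts, p0=p0)
print(f"IC50 ~ {10 ** log_ic50:.2e} M (Hill slope {hill:.2f})")
```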

  8. Quantification of strontium in human serum by ICP-MS using alternate analyte-free matrix and its application to a pilot bioequivalence study of two strontium ranelate oral formulations in healthy Chinese subjects.

    PubMed

    Zhang, Dan; Wang, Xiaolin; Liu, Man; Zhang, Lina; Deng, Ming; Liu, Huichen

    2015-01-01

    A rapid, sensitive and accurate ICP-MS method using an alternate analyte-free matrix for calibration standards preparation and a rapid direct dilution procedure for sample preparation was developed and validated for the quantification of exogenous strontium (Sr) from the drug in human serum. Serum was prepared by direct dilution (1:29, v/v) in an acidic solution consisting of nitric acid (0.1%) and germanium (Ge) added as internal standard (IS), to obtain a simple and high-throughput preparation procedure with minimized matrix effect and good repeatability. ICP-MS analysis was performed using collision cell technology (CCT) mode. An alternate matrix method, using distilled water as an analyte-free matrix for the preparation of calibration standards (CS), was used to avoid the influence of endogenous Sr in serum on the quantification. The method was validated in terms of selectivity, carry-over, matrix effects, lower limit of quantification (LLOQ), linearity, precision and accuracy, and stability. Instrumental linearity was verified in the range of 1.00-500 ng/mL, corresponding to a concentration range of 0.0300-15.0 μg/mL in a 50 μL sample of serum matrix and alternate matrix. Intra- and inter-day precision as relative standard deviation (RSD) were less than 8.0% and accuracy as relative error (RE) was within ±3.0%. The method allowed a high sample throughput, and was sensitive and accurate enough for a pilot bioequivalence study in healthy male Chinese subjects following single oral administration of two strontium ranelate formulations containing 2 g strontium ranelate. Copyright © 2014 Elsevier GmbH. All rights reserved.
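
    Because serum is diluted 1:29 (v/v) before analysis, each instrument reading maps back to a serum concentration through a fixed dilution factor of 30, which is how the quoted instrumental range of 1.00-500 ng/mL corresponds to 0.0300-15.0 μg/mL in serum. A hypothetical Python back-calculation is shown below; the internal-standard correction is only schematic.

```python
# Hypothetical back-calculation of serum Sr from ICP-MS readings after a 1:29 (v/v) dilution.
DILUTION_FACTOR = 30.0          # 1 volume serum + 29 volumes diluent

def serum_sr_ug_per_ml(instrument_ng_per_ml: float, is_ratio: float = 1.0) -> float:
    """Convert an instrument reading (ng/mL in the diluted solution) to serum ug/mL.

    is_ratio is a schematic internal-standard correction (measured Ge / expected Ge);
    the actual method normalizes Sr counts to the Ge internal standard during acquisition.
    """
    corrected = instrument_ng_per_ml / is_ratio
    return corrected * DILUTION_FACTOR / 1000.0     # ng/mL -> ug/mL

# The limits of the quoted instrumental range map onto the quoted serum range:
print(serum_sr_ug_per_ml(1.00))    # 0.03 ug/mL
print(serum_sr_ug_per_ml(500.0))   # 15.0 ug/mL
```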

  9. A novel high-throughput screening format to identify inhibitors of secreted acid sphingomyelinase.

    PubMed

    Mintzer, Robert J; Appell, Kenneth C; Cole, Andrew; Johns, Anthony; Pagila, Rene; Polokoff, Mark A; Tabas, Ira; Snider, R Michael; Meurer-Ogden, Janet A

    2005-04-01

    Secreted extracellular acid sphingomyelinase (sASM) activity has been suggested to promote atherosclerosis by enhancing subendothelial aggregation and retention of low-density lipoprotein (LDL) with resultant foam cell formation. Compounds that inhibit sASM activity, at neutral pH, may prevent lipid retention and thus would be expected to be anti-atherosclerotic. With the goal of identifying novel compounds that inhibit sASM at pH 7.4, a high-throughput screen was performed. Initial screening was run using a modification of a proven system that measures the hydrolysis of radiolabeled sphingomyelin presented in detergent micelles in a 96-well format. Separation of the radiolabeled aqueous phosphorylcholine reaction product from uncleaved sphingomyelin lipid substrate was achieved by chloroform/methanol extraction. During the screening campaign, a novel extraction procedure was developed to eliminate the use of the hazardous organic reagents. This new procedure exploited the ability of uncleaved, radiolabeled lipid substrate to interact with hydrophobic phenyl-sepharose beads. A comparison of the organic-based and the bead-based extraction sASM screening assays revealed Z' factor values ranging from 0.7 to 0.95 for both formats. In addition, both assay formats led to the identification of sub- to low micromolar inhibitors of sASM at pH 7.4 with similar IC(50) values. Subsequent studies demonstrated that both methods were also adaptable to run in a 384-well format. In contrast to the results observed at neutral pH, however, only the organic extraction assay was capable of accurately measuring sASM activity at its pH optimum of 5.0. The advantages and disadvantages of both sASM assay formats are discussed.
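
    The Z' factors quoted above follow the standard screening-window statistic Z' = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|, computed from positive- and negative-control wells. A minimal Python computation on placeholder control data:

```python
# Z'-factor for an assay plate: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
import statistics

def z_prime(pos_controls, neg_controls):
    mu_p, mu_n = statistics.mean(pos_controls), statistics.mean(neg_controls)
    sd_p, sd_n = statistics.stdev(pos_controls), statistics.stdev(neg_controls)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Placeholder control wells (e.g. uninhibited vs. fully inhibited sASM activity).
pos = [980, 1010, 995, 1005, 990]     # full-signal controls
neg = [102, 98, 95, 110, 100]         # background controls
print(f"Z' = {z_prime(pos, neg):.2f}")   # > 0.5 is conventionally considered a robust screen
```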

  10. High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  11. High-throughput protein concentration and buffer exchange: comparison of ultrafiltration and ammonium sulfate precipitation.

    PubMed

    Moore, Priscilla A; Kery, Vladimir

    2009-01-01

    High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are the high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, the ammonium sulfate precipitation is much less labor intensive and time consuming than the ultrafiltration.

  12. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
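
    Once the single-cell biophysical phenotypes are extracted, the cell-cycle structure can be explored by embedding the feature matrix with t-SNE, as the abstract describes. The Python sketch below is schematic, using synthetic features rather than the authors' time-stretch QPM pipeline.

```python
# Schematic t-SNE embedding of single-cell biophysical feature vectors.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
n_cells, n_features = 3000, 24                  # e.g. size, dry-mass and phase-texture features
features = rng.normal(size=(n_cells, n_features))
features[:1000, 0] += 3.0                       # crude stand-in for a distinct subpopulation

X = StandardScaler().fit_transform(features)
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(X)
print(embedding.shape)                          # (3000, 2) map for visual inspection
```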

  13. Hazardous waste incinerators under waste uncertainty: balancing and throughput maximization via heat recuperation.

    PubMed

    Tsiliyannis, Christos Aristeides

    2013-09-01

    Hazardous waste incinerators (HWIs) differ substantially from thermal power facilities, since instead of maximizing energy production with the minimum amount of fuel, they aim at maximizing throughput. Variations in quantity or composition of received waste loads may significantly diminish HWI throughput (the decisive profit factor) from its nominal design value. A novel formulation of combustion balance is presented, based on linear operators, which isolates the wastefeed vector from the invariant combustion stoichiometry kernel. Explicit expressions for the throughput are obtained, in terms of incinerator temperature, fluegas heat recuperation ratio and design parameters, for an arbitrary number of wastes, based on fundamental principles (mass and enthalpy balances). The impact of waste variations, of recuperation ratio and of furnace temperature is explicitly determined. It is shown that in the presence of waste uncertainty, the throughput may be a decreasing or increasing function of incinerator temperature and recuperation ratio, depending on the sign of a dimensionless parameter related only to the uncertain wastes. The dimensionless parameter is proposed as a sharp a priori waste 'fingerprint', determining the necessary increase or decrease of manipulated variables (recuperation ratio, excess air, auxiliary fuel feed rate, auxiliary air flow) in order to balance the HWI and maximize throughput under uncertainty in received wastes. A 10-step procedure is proposed for direct application subject to process capacity constraints. The results may be useful for efficient HWI operation and for preparing hazardous waste blends. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.

    PubMed

    Yang, Darren; Wong, Wesley P

    2018-01-01

    We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.
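
    In a centrifuge force microscope, the force applied to each tether is set by the bead's buoyant mass times the centripetal acceleration ω²r. The Python estimate below is hypothetical: the rotor speed, radius, and bead properties are assumed example values, not parameters from this protocol.

```python
# Hypothetical force estimate for a centrifuge force microscope (CFM).
import math

rpm = 3000.0                           # assumed rotor speed
radius_m = 0.10                        # assumed distance of sample from rotation axis, m
bead_diameter_m = 3e-6                 # assumed bead diameter, m
rho_bead, rho_fluid = 2200.0, 1000.0   # silica bead vs. aqueous buffer, kg/m^3

omega = 2.0 * math.pi * rpm / 60.0                        # angular speed, rad/s
volume = math.pi / 6.0 * bead_diameter_m ** 3             # bead volume, m^3
buoyant_mass = (rho_bead - rho_fluid) * volume            # effective mass in fluid, kg
force_pn = buoyant_mass * omega ** 2 * radius_m * 1e12    # centripetal force, pN

print(f"~{force_pn:.0f} pN per bead at {rpm:.0f} rpm")
```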

  15. High throughput single cell counting in droplet-based microfluidics.

    PubMed

    Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie

    2017-05-02

    Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at a high-throughput, used to characterize cell encapsulation and cell viability during incubation in droplets.
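
    Characterizing cell encapsulation in droplets usually comes down to Poisson loading statistics: for a mean occupancy λ, the fraction of droplets containing k cells is λ^k e^(-λ)/k!. A small illustrative Python calculation (λ is an assumed value, not a figure from the paper):

```python
# Poisson loading statistics for droplet encapsulation (hypothetical illustration).
import math

def poisson_pmf(k: int, lam: float) -> float:
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam = 0.1   # assumed mean number of cells per droplet
empty   = poisson_pmf(0, lam)
singles = poisson_pmf(1, lam)
multi   = 1.0 - empty - singles

print(f"empty droplets:         {empty:.1%}")
print(f"single-cell droplets:   {singles:.1%}")
print(f"multi-cell droplets:    {multi:.2%}")
print(f"singles among occupied: {singles / (1 - empty):.1%}")
```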

  16. High-Throughput Sequencing of Germline and Tumor From Men with Early-Onset Metastatic Prostate Cancer

    DTIC Science & Technology

    2016-12-01

    Award Number: W81XWH-13-1-0371. Title: High-Throughput Sequencing of Germline and Tumor From Men with Early-Onset Metastatic Prostate Cancer. Dates covered: 30 Sep 2013 - 29 Sep 2016. The project studied men presenting with metastatic prostate cancer at a young age (before age 60 years); whole exome sequencing identified a panel of germline variants that have...

  17. Highly Efficient Proteolysis Accelerated by Electromagnetic Waves for Peptide Mapping

    PubMed Central

    Chen, Qiwen; Liu, Ting; Chen, Gang

    2011-01-01

    Proteomics will contribute greatly to the understanding of gene functions in the post-genomic era. In proteome research, protein digestion is a key procedure prior to mass spectrometry identification. During the past decade, a variety of electromagnetic waves have been employed to accelerate proteolysis. This review focuses on the recent advances and the key strategies of these novel proteolysis approaches for digesting and identifying proteins. The subjects covered include microwave-accelerated protein digestion, infrared-assisted proteolysis, ultraviolet-enhanced protein digestion, laser-assisted proteolysis, and future prospects. It is expected that these novel proteolysis strategies accelerated by various electromagnetic waves will become powerful tools in proteome research and will find wide applications in high throughput protein digestion and identification. PMID:22379392

  18. Multispectral and DSLR sensors for assessing crop stress in corn and cotton using fixed-wing unmanned air systems

    NASA Astrophysics Data System (ADS)

    Valasek, John; Henrickson, James V.; Bowden, Ezekiel; Shi, Yeyin; Morgan, Cristine L. S.; Neely, Haly L.

    2016-05-01

    As small unmanned aircraft systems become increasingly affordable, reliable, and formally recognized under federal regulation, they become increasingly attractive as novel platforms for civil applications. This paper details the development and demonstration of fixed-wing unmanned aircraft systems for precision agriculture tasks. Tasks such as soil moisture content estimation and high-throughput phenotyping are considered. The rationale for sensor, vehicle, and ground equipment selections is provided, in addition to the flight operation procedures developed for minimal crew numbers. Preliminary imagery results are presented and analyzed, and these results demonstrate that fixed-wing unmanned aircraft systems modified to carry non-traditional sensors at extended endurance durations can provide high-quality data that are usable for serious scientific analysis.

  19. The Newick utilities: high-throughput phylogenetic tree processing in the Unix shell

    PubMed Central

    Junier, Thomas; Zdobnov, Evgeny M.

    2010-01-01

    Summary: We present a suite of Unix shell programs for processing any number of phylogenetic trees of any size. They perform frequently-used tree operations without requiring user interaction. They also allow tree drawing as scalable vector graphics (SVG), suitable for high-quality presentations and further editing, and as ASCII graphics for command-line inspection. As an example we include an implementation of bootscanning, a procedure for finding recombination breakpoints in viral genomes. Availability: C source code, Python bindings and executables for various platforms are available from http://cegg.unige.ch/newick_utils. The distribution includes a manual and example data. The package is distributed under the BSD License. Contact: thomas.junier@unige.ch PMID:20472542

  20. A compression scheme for radio data in high performance computing

    NASA Astrophysics Data System (ADS)

    Masui, K.; Amiri, M.; Connor, L.; Deng, M.; Fandino, M.; Höfer, C.; Halpern, M.; Hanna, D.; Hincks, A. D.; Hinshaw, G.; Parra, J. M.; Newburgh, L. B.; Shaw, J. R.; Vanderlinde, K.

    2015-09-01

    We present a procedure for efficiently compressing astronomical radio data for high performance applications. Integrated, post-correlation data are first passed through a nearly lossless rounding step which compares the precision of the data to a generalized and calibration-independent form of the radiometer equation. This allows the precision of the data to be reduced in a way that has an insignificant impact on the data. The newly developed Bitshuffle lossless compression algorithm is subsequently applied. When the algorithm is used in conjunction with the HDF5 library and data format, data produced by the CHIME Pathfinder telescope is compressed to 28% of its original size and decompression throughputs in excess of 1 GB/s are obtained on a single core.
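
    The pipeline has two stages: round the data to a precision tied to the radiometer noise, then store it with the Bitshuffle filter inside HDF5. The Python sketch below is a simplified stand-in: the noise-based rounding is schematic, and the Bitshuffle filter is assumed to be available through the third-party hdf5plugin package rather than the authors' own tooling.

```python
# Simplified sketch: precision rounding followed by Bitshuffle-compressed HDF5 storage.
# hdf5plugin is assumed to provide the Bitshuffle filter (pip install h5py hdf5plugin).
import numpy as np
import h5py
import hdf5plugin

rng = np.random.default_rng(0)
vis = rng.normal(scale=1.0, size=(1024, 4096)).astype(np.float32)  # stand-in data

# Schematic "nearly lossless" rounding: quantize to a small fraction of the assumed
# per-sample noise so the quantization error is negligible next to the radiometer noise.
noise_rms = 1.0                      # assumed per-sample noise level
step = noise_rms / 16.0
rounded = np.round(vis / step) * step

with h5py.File("vis_compressed.h5", "w") as f:
    f.create_dataset("vis", data=rounded, chunks=(128, 512),
                     **hdf5plugin.Bitshuffle())
```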

  1. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    PubMed

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  2. Squeezing water from a stone: high-throughput sequencing from a 145-year old holotype resolves (barely) a cryptic species problem in flying lizards.

    PubMed

    McGuire, Jimmy A; Cotoras, Darko D; O'Connell, Brendan; Lawalata, Shobi Z S; Wang-Claypool, Cynthia Y; Stubbs, Alexander; Huang, Xiaoting; Wogan, Guinevere O U; Hykin, Sarah M; Reilly, Sean B; Bi, Ke; Riyanto, Awal; Arida, Evy; Smith, Lydia L; Milne, Heather; Streicher, Jeffrey W; Iskandar, Djoko T

    2018-01-01

    We used Massively Parallel High-Throughput Sequencing to obtain genetic data from a 145-year-old holotype specimen of the flying lizard, Draco cristatellus. Obtaining genetic data from this holotype was necessary to resolve an otherwise intractable taxonomic problem involving the status of this species relative to closely related sympatric Draco species that cannot otherwise be distinguished from one another on the basis of museum specimens. Initial analyses suggested that the DNA present in the holotype sample was so degraded as to be unusable for sequencing. However, we used a specialized extraction procedure developed for highly degraded ancient DNA samples and MiSeq shotgun sequencing to obtain just enough low-coverage mitochondrial DNA (721 base pairs) to conclusively resolve the species status of the holotype as well as a second known specimen of this species. The holotype was prepared before the advent of formalin-fixation and therefore was most likely originally fixed with ethanol and never exposed to formalin. Whereas conventional wisdom suggests that formalin-fixed samples should be the most challenging for DNA sequencing, we propose that evaporation during long-term alcohol storage and consequent water-exposure may subject older ethanol-fixed museum specimens to hydrolytic damage. If so, this may pose an even greater challenge for sequencing efforts involving historical samples.

  3. A Fully Automated Microfluidic Femtosecond Laser Axotomy Platform for Nerve Regeneration Studies in C. elegans

    PubMed Central

    Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130

  4. A High Throughput In Vivo Assay for Taste Quality and Palatability

    PubMed Central

    Palmer, R. Kyle; Long, Daniel; Brennan, Francis; Buber, Tulu; Bryant, Robert; Salemme, F. Raymond

    2013-01-01

    Taste quality and palatability are two of the most important properties measured in the evaluation of taste stimuli. Human panels can report both aspects, but are of limited experimental flexibility and throughput capacity. Relatively efficient animal models for taste evaluation have been developed, but each of them is designed to measure either taste quality or palatability as independent experimental endpoints. We present here a new apparatus and method for high throughput quantification of both taste quality and palatability using rats in an operant taste discrimination paradigm. Cohorts of four rats were trained in a modified operant chamber to sample taste stimuli by licking solutions from a 96-well plate that moved in a randomized pattern beneath the chamber floor. As a rat’s tongue entered the well it disrupted a laser beam projecting across the top of the 96-well plate, consequently producing two retractable levers that operated a pellet dispenser. The taste of sucrose was associated with food reinforcement by presses on a sucrose-designated lever, whereas the taste of water and other basic tastes were associated with the alternative lever. Each disruption of the laser was counted as a lick. Using this procedure, rats were trained to discriminate 100 mM sucrose from water, quinine, citric acid, and NaCl with 90-100% accuracy. Palatability was determined by the number of licks per trial and, due to intermediate rates of licking for water, was quantifiable along the entire spectrum of appetitiveness to aversiveness. All 96 samples were evaluated within 90 minute test sessions with no evidence of desensitization or fatigue. The technology is capable of generating multiple concentration–response functions within a single session, is suitable for in vivo primary screening of tastant libraries, and potentially can be used to evaluate stimuli for any taste system. PMID:23951319

  5. High-throughput sequencing methods to study neuronal RNA-protein interactions.

    PubMed

    Ule, Jernej

    2009-12-01

    UV-cross-linking and RNase protection, combined with high-throughput sequencing, have provided global maps of RNA sites bound by individual proteins or ribosomes. Using a stringent purification protocol, UV-CLIP (UV-cross-linking and immunoprecipitation) was able to identify intronic and exonic sites bound by splicing regulators in mouse brain tissue. Ribosome profiling has been used to quantify ribosome density on budding yeast mRNAs under different environmental conditions. Post-transcriptional regulation in neurons requires high spatial and temporal precision, as is evident from the role of localized translational control in synaptic plasticity. It remains to be seen if the high-throughput methods can be applied quantitatively to study the dynamics of RNP (ribonucleoprotein) remodelling in specific neuronal populations during the neurodegenerative process. It is certain, however, that applications of new biochemical techniques followed by high-throughput sequencing will continue to provide important insights into the mechanisms of neuronal post-transcriptional regulation.

  6. Evaluation of Compatibility of ToxCast High-Throughput/High-Content Screening Assays with Engineered Nanomaterials

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  7. Initial Investigations of Controller Tools and Procedures for Schedule-Based Arrival Operations with Mixed Flight-Deck Interval Management Equipage

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Cabrall, Christopher; Kupfer, Michael; Omar, Faisal G.; Prevot, Thomas

    2012-01-01

    NASA's Air Traffic Management Demonstration-1 (ATD-1) is a multi-year effort to demonstrate high-throughput, fuel-efficient arrivals at a major U.S. airport using NASA-developed scheduling automation, controller decision-support tools, and ADS-B-enabled Flight-Deck Interval Management (FIM) avionics. First-year accomplishments include the development of a concept of operations for managing scheduled arrivals flying Optimized Profile Descents with equipped aircraft conducting FIM operations, and the integration of laboratory prototypes of the core ATD-1 technologies. Following each integration phase, a human-in-the-loop simulation was conducted to evaluate and refine controller tools, procedures, and clearance phraseology. From a ground-side perspective, the results indicate the concept is viable and the operations are safe and acceptable. Additional training is required for smooth operations that yield notable benefits, particularly in the areas of FIM operations and clearance phraseology.

  8. Phage display for the discovery of hydroxyapatite-associated peptides.

    PubMed

    Jin, Hyo-Eon; Chung, Woo-Jae; Lee, Seung-Wuk

    2013-01-01

    In nature, proteins play a critical role in the biomineralization process. Understanding how different peptide or protein sequences selectively interact with the target crystal is of great importance. Identifying such protein structures is one of the critical steps in verifying the molecular mechanisms of biomineralization. One of the promising ways to obtain such information for a particular crystal surface is to screen combinatorial peptide libraries in a high-throughput manner. Among the many combinatorial library screening procedures, phage display is a powerful method to isolate such proteins and peptides. In this chapter, we will describe our established methods to perform phage display with inorganic crystal surfaces. Specifically, we will use hydroxyapatite as a model system for discovery of apatite-associated proteins in bone or tooth biomineralization studies. This model approach can be generalized to other desired crystal surfaces using the same experimental design principles with a little modification of the procedures. © 2013 Elsevier Inc. All rights reserved.

  9. Hit-Validation Methodologies for Ligands Isolated from DNA-Encoded Chemical Libraries.

    PubMed

    Zimmermann, Gunther; Li, Yizhou; Rieder, Ulrike; Mattarella, Martin; Neri, Dario; Scheuermann, Jörg

    2017-05-04

    DNA-encoded chemical libraries (DECLs) are large collections of compounds linked to DNA fragments, serving as amplifiable barcodes, which can be screened on target proteins of interest. In typical DECL selections, preferential binders are identified by high-throughput DNA sequencing, by comparing their frequency before and after the affinity capture step. Hits identified in this procedure need to be confirmed, by resynthesis and by performing affinity measurements. In this article we present new methods based on hybridization of oligonucleotide conjugates with fluorescently labeled complementary oligonucleotides; these facilitate the determination of affinity constants and kinetic dissociation constants. The experimental procedures were demonstrated with acetazolamide, a binder to carbonic anhydrase IX with a dissociation constant in the nanomolar range. The detection of binding events was compatible not only with fluorescence polarization methodologies, but also with Alphascreen technology and with microscale thermophoresis. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
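
    Preferential binders in a DECL selection are flagged by comparing barcode frequencies before and after the affinity-capture step; a simple summary is the ratio of the two normalized frequencies. The Python sketch below illustrates that bookkeeping with made-up counts and is separate from the hit-validation chemistry the article describes.

```python
# Hypothetical DECL enrichment bookkeeping: compare barcode frequencies pre/post selection.
pre_counts  = {"cmpd_001": 1200, "cmpd_002": 900, "cmpd_003": 1100, "cmpd_004": 1000}
post_counts = {"cmpd_001":   30, "cmpd_002": 2500, "cmpd_003":   40, "cmpd_004": 1900}

pre_total, post_total = sum(pre_counts.values()), sum(post_counts.values())

enrichment = {
    cmpd: (post_counts[cmpd] / post_total) / (pre_counts[cmpd] / pre_total)
    for cmpd in pre_counts
}
for cmpd, ef in sorted(enrichment.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cmpd}: enrichment factor {ef:.1f}")
```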

  10. Selective determination of aloin in different matrices by HPTLC densitometry in fluorescence mode.

    PubMed

    Coran, Silvia A; Bartolucci, Gianluca; Bambagiotti-Alberti, Massimo

    2011-01-25

    A novel method, based on the fluorescence excited solely on aloin by an H₃BO₃ derivatizing procedure, allowed its rapid and selective determination among the co-occurring components in a variety of complex matrices, such as several dried Aloe extracts and related commercial products. HPTLC LiChrospher silica gel 60 F254S plates (20 cm x 10 cm) with ethyl formate:CH₃OH:H₂O (100:14.5:10, v/v) as the mobile phase were used. Densitometric determinations were performed in fluorescence mode (excitation wavelength 365 nm, optical filter K540) after derivatization with H₃BO₃. The method was validated, giving rise to a dependable, high-throughput procedure well suited to routine application. Aloin was quantified in the range of 110-330 ng with RSD of repeatability and intermediate precision not exceeding 2.3% and accuracy inside the acceptance limits. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Quantitative bioanalysis of strontium in human serum by inductively coupled plasma-mass spectrometry

    PubMed Central

    Somarouthu, Srikanth; Ohh, Jayoung; Shaked, Jonathan; Cunico, Robert L; Yakatan, Gerald; Corritori, Suzana; Tami, Joe; Foehr, Erik D

    2015-01-01

    Aim: A bioanalytical method using inductively-coupled plasma-mass spectrometry to measure endogenous levels of strontium in human serum was developed and validated. Results & methodology: This article details the experimental procedures used for the method development and validation thus demonstrating the application of the inductively-coupled plasma-mass spectrometry method for quantification of strontium in human serum samples. The assay was validated for specificity, linearity, accuracy, precision, recovery and stability. Significant endogenous levels of strontium are present in human serum samples ranging from 19 to 96 ng/ml with a mean of 34.6 ± 15.2 ng/ml (SD). Discussion & conclusion: Calibration procedures and sample pretreatment were simplified for high throughput analysis. The validation demonstrates that the method was sensitive, selective for quantification of strontium (88Sr) and is suitable for routine clinical testing of strontium in human serum samples. PMID:28031925

  12. In situ analysis and structural elucidation of sainfoin (Onobrychis viciifolia) tannins for high-throughput germplasm screening.

    PubMed

    Gea, An; Stringano, Elisabetta; Brown, Ron H; Mueller-Harvey, Irene

    2011-01-26

    A rapid thiolytic degradation and cleanup procedure was developed for analyzing tannins directly in chlorophyll-containing sainfoin (Onobrychis viciifolia) plants. The technique proved suitable for complex tannin mixtures containing catechin, epicatechin, gallocatechin, and epigallocatechin flavan-3-ol units. The reaction time was standardized at 60 min to minimize the loss of structural information as a result of epimerization and degradation of terminal flavan-3-ol units. The results were evaluated by separate analysis of extractable and unextractable tannins, which accounted for 63.6-113.7% of the in situ plant tannins. It is of note that 70% aqueous acetone extracted tannins with a lower mean degree of polymerization (mDP) than was found for tannins analyzed in situ. Extractable tannins had mDP values that were between 4 and 29 units lower. The method was validated by comparing results from individual and mixed sample sets. The tannin composition of different sainfoin accessions covered a range of mDP values from 16 to 83, procyanidin/prodelphinidin (PC/PD) ratios from 19.2/80.8 to 45.6/54.4, and cis/trans ratios from 74.1/25.9 to 88.0/12.0. This is the first high-throughput screening method that is suitable for analyzing condensed tannin contents and structural composition directly in green plant tissue.
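
    The summary statistics reported here (mDP, PC/PD, and cis/trans ratios) are derived from the molar amounts of flavan-3-ol units released by thiolysis, with terminal units commonly appearing as free flavan-3-ols and extension units as thioether adducts. The Python tally below assumes that convention and uses placeholder amounts, not the paper's data.

```python
# Hypothetical summary of thiolysis results for a condensed tannin sample.
# Amounts are molar quantities of released units (placeholders, not measured data).
units = {
    # (unit, released_as): umol
    ("catechin", "terminal"): 1.0, ("catechin", "extension"): 4.0,
    ("epicatechin", "terminal"): 0.5, ("epicatechin", "extension"): 10.0,
    ("gallocatechin", "terminal"): 0.3, ("gallocatechin", "extension"): 9.0,
    ("epigallocatechin", "terminal"): 0.2, ("epigallocatechin", "extension"): 35.0,
}

total = sum(units.values())
terminal = sum(v for (u, kind), v in units.items() if kind == "terminal")
pc = sum(v for (u, _), v in units.items() if u in ("catechin", "epicatechin"))
pd = total - pc
cis = sum(v for (u, _), v in units.items() if u.startswith("epi"))   # 2,3-cis isomers

print(f"mDP       = {total / terminal:.1f}")          # mean degree of polymerization
print(f"PC/PD     = {100 * pc / total:.1f}/{100 * pd / total:.1f}")
print(f"cis/trans = {100 * cis / total:.1f}/{100 * (total - cis) / total:.1f}")
```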

  13. Using Weighted Entropy to Rank Chemicals in Quantitative High Throughput Screening Experiments

    PubMed Central

    Shockley, Keith R.

    2014-01-01

    Quantitative high throughput screening (qHTS) experiments can simultaneously produce concentration-response profiles for thousands of chemicals. In a typical qHTS study, a large chemical library is subjected to a primary screen in order to identify candidate hits for secondary screening, validation studies or prediction modeling. Different algorithms, usually based on the Hill equation logistic model, have been used to classify compounds as active or inactive (or inconclusive). However, observed concentration-response activity relationships may not adequately fit a sigmoidal curve. Furthermore, it is unclear how to prioritize chemicals for follow-up studies given the large uncertainties that often accompany parameter estimates from nonlinear models. Weighted Shannon entropy can address these concerns by ranking compounds according to profile-specific statistics derived from estimates of the probability mass distribution of response at the tested concentration levels. This strategy can be used to rank all tested chemicals in the absence of a pre-specified model structure or the approach can complement existing activity call algorithms by ranking the returned candidate hits. The weighted entropy approach was evaluated here using data simulated from the Hill equation model. The procedure was then applied to a chemical genomics profiling data set interrogating compounds for androgen receptor agonist activity. PMID:24056003
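
    A schematic version of the profile statistic is sketched below in Python: the (baseline-floored) responses across the tested concentrations are normalized into a probability mass function and a weighted Shannon entropy is computed, so that profiles whose response is concentrated at a few concentrations score lower. This is an illustration of the general idea, with uniform placeholder weights, not the exact statistic defined in the paper.

```python
# Schematic weighted-entropy score for a qHTS concentration-response profile.
import math

def weighted_entropy(responses, weights):
    """Entropy of the normalized response mass across concentrations, with weights."""
    mass = [max(r, 0.0) for r in responses]            # crude baseline handling (assumption)
    total = sum(mass)
    if total == 0:
        return 0.0
    p = [m / total for m in mass]
    return -sum(w * pi * math.log(pi) for w, pi in zip(weights, p) if pi > 0)

flat_profile   = [0.0, 0.1, 0.0, 0.1, 0.0, 0.1, 0.0]      # inactive-looking profile
active_profile = [0.0, 0.0, 0.1, 0.5, 2.0, 8.0, 20.0]      # concentration-dependent response
weights = [1.0] * 7                                        # uniform placeholder weights

print(f"flat:   {weighted_entropy(flat_profile, weights):.2f}")
print(f"active: {weighted_entropy(active_profile, weights):.2f}")   # lower score
```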

  14. Noninvasive prenatal screening for fetal common sex chromosome aneuploidies from maternal blood.

    PubMed

    Zhang, Bin; Lu, Bei-Yi; Yu, Bin; Zheng, Fang-Xiu; Zhou, Qin; Chen, Ying-Ping; Zhang, Xiao-Qing

    2017-04-01

    Objective: To explore the feasibility of high-throughput massively parallel genomic DNA sequencing technology for the noninvasive prenatal detection of fetal sex chromosome aneuploidies (SCAs). Methods: The study enrolled pregnant women who were prepared to undergo noninvasive prenatal testing (NIPT) in the second trimester. Cell-free fetal DNA (cffDNA) was extracted from the mother's peripheral venous blood and a high-throughput sequencing procedure was undertaken. Patients identified as having pregnancies associated with SCAs were offered prenatal fetal chromosomal karyotyping. Results: The study enrolled 10,275 pregnant women who were prepared to undergo NIPT. Of these, 57 pregnant women (0.55%) showed fetal SCA, including 27 with Turner syndrome (45,X), eight with Triple X syndrome (47,XXX), 12 with Klinefelter syndrome (47,XXY) and three with 47,XYY. Thirty-three pregnant women agreed to undergo fetal karyotyping and 18 had results consistent with NIPT, while 15 patients received a normal karyotype result. The overall positive predictive value of NIPT for detecting SCAs was 54.54% (18/33) and for detecting Turner syndrome (45,X) was 29.41% (5/17). Conclusion: NIPT can be used to identify fetal SCAs by analysing cffDNA using massively parallel genomic sequencing, although the accuracy needs to be improved particularly for Turner syndrome (45,X).

  15. GreenLight Model 960.

    PubMed

    Fernandes, Richard; Carey, Conn; Hynes, James; Papkovsky, Dmitri

    2013-01-01

    The importance of food safety has resulted in a demand for a more rapid, high-throughput method for total viable count (TVC). The industry standard for TVC determination (ISO 4833:2003) is widely used but presents users with some drawbacks. The method is materials- and labor-intensive, requiring multiple agar plates per sample. More importantly, the method is slow, with 72 h typically required for a definitive result. Luxcel Biosciences has developed the GreenLight Model 960, a microtiter plate-based assay providing a rapid high-throughput method of aerobic bacterial load assessment through analysis of microbial oxygen consumption. Results are generated in 1-12 h, depending on microbial load. The mix and measure procedure allows rapid detection of microbial oxygen consumption and equates oxygen consumption to microbial load (CFU/g), providing a simple, sensitive means of assessing the microbial contamination levels in foods (1). As bacteria in the test sample grow and respire, they deplete O2, which is detected as an increase in the GreenLight probe signal above the baseline level (2). The time required to reach this increase in signal can be used to calculate the CFU/g of the original sample, based on a predetermined calibration. The higher the initial microbial load, the earlier this threshold is reached (1).

  16. High throughput sequencing analysis of RNA libraries reveals the influences of initial library and PCR methods on SELEX efficiency.

    PubMed

    Takahashi, Mayumi; Wu, Xiwei; Ho, Michelle; Chomchan, Pritsana; Rossi, John J; Burnett, John C; Zhou, Jiehua

    2016-09-22

    The systematic evolution of ligands by exponential enrichment (SELEX) technique is a powerful and effective aptamer-selection procedure. However, modifications to the process can dramatically improve selection efficiency and aptamer performance. For example, droplet digital PCR (ddPCR) has been recently incorporated into SELEX selection protocols to putatively reduce the propagation of byproducts and avoid selection bias that results from differences in PCR efficiency of sequences within the random library. However, a detailed, parallel comparison of the efficacy of conventional solution PCR versus the ddPCR modification in the RNA aptamer-selection process is needed to understand effects on overall SELEX performance. In the present study, we took advantage of powerful high throughput sequencing technology and bioinformatics analysis coupled with SELEX (HT-SELEX) to thoroughly investigate the effects of initial library and PCR methods in RNA aptamer identification. Our analysis revealed that distinct "biased sequences" and nucleotide composition existed in the initial, unselected libraries purchased from two different manufacturers and that the fate of the "biased sequences" was target-dependent during selection. Our comparison of solution PCR- and ddPCR-driven HT-SELEX demonstrated that the PCR method affected not only the nucleotide composition of the enriched sequences, but also the overall SELEX efficiency and aptamer efficacy.
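
    One of the comparisons described above, the nucleotide composition of initial versus enriched libraries, reduces to per-position base counting over the random-region reads. A hypothetical Python counting sketch (plain strings stand in for parsed FASTQ records):

```python
# Hypothetical per-position nucleotide composition of a sequenced random region.
from collections import Counter

reads = [
    "ACGTACGTAACGTTACGTAC",
    "ACGGACGTTACGTAACGTAC",
    "TCGTACCTAACGTTACGAAC",
]  # placeholder random-region sequences of equal length

length = len(reads[0])
for pos in range(length):
    counts = Counter(read[pos] for read in reads)
    total = sum(counts.values())
    comp = ", ".join(f"{b}:{counts.get(b, 0) / total:.2f}" for b in "ACGT")
    print(f"pos {pos + 1:2d}  {comp}")
```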

  17. A copy number variation genotyping method for aneuploidy detection in spontaneous abortion specimens.

    PubMed

    Chen, Songchang; Liu, Deyuan; Zhang, Junyu; Li, Shuyuan; Zhang, Lanlan; Fan, Jianxia; Luo, Yuqin; Qian, Yeqing; Huang, Hefeng; Liu, Chao; Zhu, Huanhuan; Jiang, Zhengwen; Xu, Chenming

    2017-02-01

    Chromosomal abnormalities such as aneuploidy have been shown to be responsible for causing spontaneous abortion. Genetic evaluation of abortions is currently underperformed. Screening for aneuploidy in the products of conception can help determine the etiology. We designed a high-throughput ligation-dependent probe amplification (HLPA) assay to examine aneuploidy of 24 chromosomes in miscarriage tissues and aimed to validate the performance of this technique. We carried out aneuploidy screening in 98 fetal tissue samples collected from female subjects with singleton pregnancies who experienced spontaneous abortion. The mean maternal age was 31.6 years (range: 24-43), and the mean gestational age was 10.2 weeks (range: 4.6-14.1). HLPA was performed in parallel with array comparative genomic hybridization, which is the gold standard for aneuploidy detection in clinical practices. The results from the two platforms were compared. Forty-nine out of ninety-eight samples were found to be aneuploid. HLPA showed concordance with array comparative genomic hybridization in diagnosing aneuploidy. High-throughput ligation-dependent probe amplification is a rapid and accurate method for aneuploidy detection. It can be used as a cost-effective screening procedure in clinical spontaneous abortions. © 2016 John Wiley & Sons, Ltd.

  18. The stabilisation of purified, reconstituted P-glycoprotein by freeze drying with disaccharides.

    PubMed

    Heikal, Adam; Box, Karl; Rothnie, Alice; Storm, Janet; Callaghan, Richard; Allen, Marcus

    2009-02-01

    The drug efflux pump P-glycoprotein (P-gp) (ABCB1) confers multidrug resistance, a major cause of failure in the chemotherapy of tumours, exacerbated by a shortage of potent and selective inhibitors. A high throughput assay using purified P-gp to screen and characterise potential inhibitors would greatly accelerate their development. However, long-term stability of purified reconstituted ABCB1 can only be reliably achieved with storage at -80 degrees C. For example, at 20 degrees C, the activity of ABCB1 was abrogated with a half-life of <1 day. The aim of this investigation was to stabilise purified, reconstituted ABCB1 to enable storage at higher temperatures and thereby enable design of a high throughput assay system. The ABCB1 purification procedure was optimised to allow successful freeze drying by substitution of glycerol with the disaccharides trehalose or maltose. Addition of disaccharides resulted in ATPase activity being retained immediately following lyophilisation with no significant difference between the two disaccharides. However, during storage trehalose preserved ATPase activity for several months regardless of the temperature (e.g. 60% retention at 150 days), whereas ATPase activity in maltose purified P-gp was affected by both storage time and temperature. The data provide an effective mechanism for the production of resilient purified, reconstituted ABCB1.

  19. Automated sample preparation using membrane microtiter extraction for bioanalytical mass spectrometry.

    PubMed

    Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H

    1997-01-01

    The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min., about 30 times faster than manual solvent extraction or single cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (Ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.

  20. Streamlined approach to high-quality purification and identification of compound series using high-resolution MS and NMR.

    PubMed

    Mühlebach, Anneke; Adam, Joachim; Schön, Uwe

    2011-11-01

    Automated medicinal chemistry (parallel chemistry) has become an integral part of the drug-discovery process in almost every large pharmaceutical company. Parallel array synthesis of individual organic compounds has been used extensively to generate diverse structural libraries to support different phases of the drug-discovery process, such as hit-to-lead, lead finding, or lead optimization. In order to guarantee effective project support, efficiency in the production of compound libraries has been maximized. As a consequence, throughput in chromatographic purification and analysis has also been adapted. As a recent trend, more laboratories are preparing smaller, yet more focused libraries with ever increasing demands on quality, i.e. optimal purity and unambiguous confirmation of identity. This paper presents an automated approach to combining effective purification and structural confirmation of a lead-optimization library created by microwave-assisted organic synthesis. The results of complementary analytical techniques such as UHPLC-HRMS and NMR are not only considered individually but also merged for fast and easy decision making, providing optimal quality of the compound stock. In comparison with the previous procedures, processing is at least four times faster, while compound consumption could be decreased more than threefold. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. High throughput sequencing analysis of RNA libraries reveals the influences of initial library and PCR methods on SELEX efficiency

    PubMed Central

    Takahashi, Mayumi; Wu, Xiwei; Ho, Michelle; Chomchan, Pritsana; Rossi, John J.; Burnett, John C.; Zhou, Jiehua

    2016-01-01

    The systematic evolution of ligands by exponential enrichment (SELEX) technique is a powerful and effective aptamer-selection procedure. However, modifications to the process can dramatically improve selection efficiency and aptamer performance. For example, droplet digital PCR (ddPCR) has recently been incorporated into SELEX selection protocols to putatively reduce the propagation of byproducts and avoid the selection bias that results from differences in PCR efficiency of sequences within the random library. However, a detailed, parallel comparison of the efficacy of conventional solution PCR versus the ddPCR modification in the RNA aptamer-selection process is needed to understand effects on overall SELEX performance. In the present study, we took advantage of powerful high throughput sequencing technology and bioinformatics analysis coupled with SELEX (HT-SELEX) to thoroughly investigate the effects of initial library and PCR methods on RNA aptamer identification. Our analysis revealed that distinct “biased sequences” and nucleotide composition existed in the initial, unselected libraries purchased from two different manufacturers and that the fate of the “biased sequences” was target-dependent during selection. Our comparison of solution PCR- and ddPCR-driven HT-SELEX demonstrated that the PCR method affected not only the nucleotide composition of the enriched sequences, but also the overall SELEX efficiency and aptamer efficacy. PMID:27652575

  2. Arraycount, an algorithm for automatic cell counting in microwell arrays.

    PubMed

    Kachouie, Nezamoddin; Kang, Lifeng; Khademhosseini, Ali

    2009-09-01

    Microscale technologies have emerged as a powerful tool for studying and manipulating biological systems and miniaturizing experiments. However, the lack of software complementing these techniques has made it difficult to apply them for many high-throughput experiments. This work establishes Arraycount, an approach to automatically count cells in microwell arrays. The procedure consists of fluorescent microscope imaging of cells that are seeded in microwells of a microarray system and then analyzing images via computer to recognize the array and count cells inside each microwell. To start counting, green and red fluorescent images (representing live and dead cells, respectively) are extracted from the original image and processed separately. A template-matching algorithm is proposed in which pre-defined well and cell templates are matched against the red and green images to locate microwells and cells. Subsequently, local maxima in the correlation maps are determined and local maxima maps are thresholded. At the end, the software records the cell counts for each detected microwell on the original image in high-throughput. The automated counting was shown to be accurate compared with manual counting, with a difference of approximately 1-2 cells per microwell: based on cell concentration, the absolute difference between manual and automatic counting measurements was 2.5-13%.
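
    A simplified illustration of the template-matching and local-maxima idea described above, using scikit-image; this is not the Arraycount implementation, and the synthetic image, template, and correlation threshold are assumptions:

        import numpy as np
        from skimage.feature import match_template, peak_local_max

        def count_cells(channel_image, cell_template, corr_threshold=0.6):
            """Count template-like spots (e.g. fluorescent cells) in one colour channel.

            The correlation map is thresholded and its local maxima are taken as
            individual cell detections; the threshold value is an assumption."""
            corr = match_template(channel_image, cell_template, pad_input=True)
            peaks = peak_local_max(corr, min_distance=3, threshold_abs=corr_threshold)
            return len(peaks)

        # Tiny synthetic example: a dark field containing two bright cell-sized blobs.
        img = np.zeros((64, 64))
        img[10:14, 10:14] = 1.0
        img[40:44, 30:34] = 1.0
        template = np.zeros((8, 8))
        template[2:6, 2:6] = 1.0
        print(count_cells(img, template))  # expected: 2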

  3. A Mixture Modeling Framework for Differential Analysis of High-Throughput Data

    PubMed Central

    Taslim, Cenny; Lin, Shili

    2014-01-01

    The inventions of microarray and next-generation sequencing technologies have revolutionized research in genomics; these platforms have generated massive amounts of data on gene expression, methylation, and protein-DNA interactions. A common theme among a number of biological problems using high-throughput technologies is differential analysis. Despite the common theme, different data types have their own unique features, creating a “moving target” scenario. As such, methods specifically designed for one data type may not lead to satisfactory results when applied to another data type. To meet this challenge so that not only currently existing data types but also data from future problems, platforms, or experiments can be analyzed, we propose a mixture modeling framework that is flexible enough to automatically adapt to any moving target. More specifically, the approach considers several classes of mixture models and essentially provides a model-based procedure whose model is adaptive to the particular data being analyzed. We demonstrate the utility of the methodology by applying it to three types of real data: gene expression, methylation, and ChIP-seq. We also carried out simulations to gauge the performance and showed that the approach can be more efficient than any individual model without inflating the type I error. PMID:25057284
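
    As a toy sketch of the model-based idea (competing mixture models fitted to a vector of differential scores, with an information criterion choosing among them), under the assumption of Gaussian components and synthetic data; this is not the authors' actual set of model classes:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Synthetic differential scores: mostly null (mean 0) plus a differential tail (mean 3).
        scores = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)]).reshape(-1, 1)

        # Fit several candidate mixture models and keep the one with the lowest BIC,
        # loosely mimicking an adaptive model-selection step.
        candidates = [GaussianMixture(n_components=k, random_state=0).fit(scores) for k in (1, 2, 3)]
        best = min(candidates, key=lambda m: m.bic(scores))

        # Posterior probability of the highest-mean component gives a simple
        # "differential" call; the 0.9 cut-off is arbitrary.
        if best.n_components > 1:
            diff_comp = int(np.argmax(best.means_.ravel()))
            posterior = best.predict_proba(scores)[:, diff_comp]
            print(best.n_components, "components;", int((posterior > 0.9).sum()), "differential calls")
        else:
            print("single component selected; no differential signal detected")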

  4. SwellGel: an affinity chromatography technology for high-capacity and high-throughput purification of recombinant-tagged proteins.

    PubMed

    Draveling, C; Ren, L; Haney, P; Zeisse, D; Qoronfleh, M W

    2001-07-01

    The revolution in genomics and proteomics is having a profound impact on drug discovery. Today's protein scientist demands a faster, easier, more reliable way to purify proteins. A high capacity, high-throughput new technology has been developed in Perbio Sciences for affinity protein purification. This technology utilizes selected chromatography media that are dehydrated to form uniform aggregates. The SwellGel aggregates will instantly rehydrate upon addition of the protein sample, allowing purification and direct performance of multiple assays in a variety of formats. SwellGel technology has greater stability and is easier to handle than standard wet chromatography resins. The microplate format of this technology provides high-capacity, high-throughput features, recovering milligram quantities of protein suitable for high-throughput screening or biophysical/structural studies. Data will be presented applying SwellGel technology to recombinant 6x His-tagged protein and glutathione-S-transferase (GST) fusion protein purification. Copyright 2001 Academic Press.

  5. Large-scale microfluidics providing high-resolution and high-throughput screening of Caenorhabditis elegans poly-glutamine aggregation model

    NASA Astrophysics Data System (ADS)

    Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela

    2016-10-01

    Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip with a micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.

  6. ToxCast Dashboard

    EPA Pesticide Factsheets

    The ToxCast Dashboard helps users examine high-throughput assay data to inform chemical safety decisions. To date, it has data on over 9,000 chemicals and information from more than 1,000 high-throughput assay endpoint components.

  7. RapidTox Dashboard

    EPA Pesticide Factsheets

    The ToxCast Dashboard helps users examine high-throughput assay data to inform chemical safety decisions. To date, it has data on over 9,000 chemicals and information from more than 1,000 high-throughput assay endpoint components.

  8. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    PubMed Central

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980

  9. Phylogenomics of plant genomes: a methodology for genome-wide searches for orthologs in plants

    PubMed Central

    Conte, Matthieu G; Gaillard, Sylvain; Droc, Gaetan; Perin, Christophe

    2008-01-01

    Background Gene ortholog identification is now a major objective for mining the increasing amount of sequence data generated by complete or partial genome sequencing projects. Comparative and functional genomics urgently need a method for ortholog detection to reduce gene function inference and to aid in the identification of conserved or divergent genetic pathways between several species. As gene functions change during evolution, reconstructing the evolutionary history of genes should be a more accurate way to differentiate orthologs from paralogs. Phylogenomics takes into account phylogenetic information from high-throughput genome annotation and is the most straightforward way to infer orthologs. However, procedures for automatic detection of orthologs are still scarce and suffer from several limitations. Results We developed a procedure for ortholog prediction between Oryza sativa and Arabidopsis thaliana. Firstly, we established an efficient method to cluster A. thaliana and O. sativa full proteomes into gene families. Then, we developed an optimized phylogenomics pipeline for ortholog inference. We validated the full procedure using test sets of orthologs and paralogs to demonstrate that our method outperforms pairwise methods for ortholog predictions. Conclusion Our procedure achieved a high level of accuracy in predicting ortholog and paralog relationships. Phylogenomic predictions for all validated gene families in both species were easily achieved and we can conclude that our methodology outperforms similarly based methods. PMID:18426584

  10. A high-quality annotated transcriptome of swine peripheral blood

    USDA-ARS?s Scientific Manuscript database

    Background: High throughput gene expression profiling assays of peripheral blood are widely used in biomedicine, as well as in animal genetics and physiology research. Accurate, comprehensive, and precise interpretation of such high throughput assays relies on well-characterized reference genomes an...

  11. Using iterative cluster merging with improved gap statistics to perform online phenotype discovery in the context of high-throughput RNAi screens

    PubMed Central

    Yin, Zheng; Zhou, Xiaobo; Bakal, Chris; Li, Fuhai; Sun, Youxian; Perrimon, Norbert; Wong, Stephen TC

    2008-01-01

    Background The recent emergence of high-throughput automated image acquisition technologies has forever changed how cell biologists collect and analyze data. Historically, the interpretation of cellular phenotypes in different experimental conditions has been dependent upon the expert opinions of well-trained biologists. Such qualitative analysis is particularly effective in detecting subtle, but important, deviations in phenotypes. However, while the rapid and continuing development of automated microscope-based technologies now facilitates the acquisition of trillions of cells in thousands of diverse experimental conditions, such as in the context of RNA interference (RNAi) or small-molecule screens, the massive size of these datasets precludes human analysis. Thus, the development of automated methods which aim to identify novel and biologically relevant phenotypes online is one of the major challenges in high-throughput image-based screening. Ideally, phenotype discovery methods should be designed to utilize prior/existing information and tackle three challenging tasks, i.e. restoring pre-defined biologically meaningful phenotypes, differentiating novel phenotypes from known ones and distinguishing novel phenotypes from each other. Arbitrarily extracted information causes biased analysis, while combining the complete existing datasets with each new image is intractable in high-throughput screens. Results Here we present the design and implementation of a novel and robust online phenotype discovery method with broad applicability that can be used in diverse experimental contexts, especially high-throughput RNAi screens. This method features phenotype modelling and iterative cluster merging using improved gap statistics. A Gaussian Mixture Model (GMM) is employed to estimate the distribution of each existing phenotype, and then used as the reference distribution in the gap statistics. This method is broadly applicable to a number of different types of image-based datasets derived from a wide spectrum of experimental conditions and is suitable for adaptively processing new images which are continuously added to existing datasets. Validations were carried out on different datasets, including a published RNAi screen using Drosophila embryos [Additional files 1, 2], a dataset for cell cycle phase identification using HeLa cells [Additional files 1, 3, 4] and a synthetic dataset using polygons; our method tackled the three aforementioned tasks effectively with an accuracy range of 85%–90%. When our method was implemented in the context of a Drosophila genome-scale RNAi image-based screen of cultured cells aimed at identifying the contribution of individual genes towards the regulation of cell shape, it efficiently discovered meaningful new phenotypes and provided novel biological insight. We also propose a two-step procedure to modify the novelty detection method based on one-class SVM, so that it can be used for online phenotype discovery. Under different conditions, we compared the SVM-based method with our method using various datasets, and our method consistently outperformed the SVM-based method in at least two of the three tasks by 2% to 5%. These results demonstrate that our method can be used to better identify novel phenotypes in image-based datasets from a wide range of conditions and organisms. Conclusion We demonstrate that our method can detect various novel phenotypes effectively in complex datasets. Experimental results also validate that our method performs consistently under different orders of image input, variations in starting conditions including the number and composition of existing phenotypes, and datasets from different screens. In our findings, the proposed method is suitable for online phenotype discovery in diverse high-throughput image-based genetic and chemical screens. PMID:18534020
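
    A much-simplified sketch of the underlying idea of modelling each known phenotype with a Gaussian mixture and flagging poorly explained cells as candidates for a novel phenotype; the synthetic 2-D features and the log-likelihood cut-off are arbitrary assumptions, and the paper's gap-statistic cluster merging is not reproduced here:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        # Hypothetical feature vectors for two known phenotypes and a batch of new cells.
        known = {"round": rng.normal([0, 0], 0.3, (200, 2)),
                 "spread": rng.normal([3, 3], 0.3, (200, 2))}
        new_cells = np.vstack([rng.normal([0, 0], 0.3, (50, 2)),   # known phenotype
                               rng.normal([6, -1], 0.3, (30, 2))]) # potentially novel

        # Model each existing phenotype with its own mixture (the reference distributions).
        models = {name: GaussianMixture(n_components=1, random_state=0).fit(x)
                  for name, x in known.items()}

        # Cells whose best log-likelihood under all reference models is low are treated
        # as candidate members of a novel phenotype; the cut-off is an assumption.
        loglik = np.column_stack([m.score_samples(new_cells) for m in models.values()])
        novel = new_cells[loglik.max(axis=1) < -10.0]
        print(f"{len(novel)} of {len(new_cells)} cells flagged as a candidate novel phenotype")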

  12. Transdermal Diagnosis of Malaria Using Vapor Nanobubbles

    PubMed Central

    Lukianova-Hleb, Ekaterina; Bezek, Sarah; Szigeti, Reka; Khodarev, Alexander; Kelley, Thomas; Hurrell, Andrew; Berba, Michail; Kumar, Nirbhay; D’Alessandro, Umberto

    2015-01-01

    A fast, precise, noninvasive, high-throughput, and simple approach for detecting malaria in humans and mosquitoes is not possible with current techniques that depend on blood sampling, reagents, facilities, tedious procedures, and trained personnel. We designed a device for rapid (20-second) noninvasive diagnosis of Plasmodium falciparum infection in a malaria patient without drawing blood or using any reagent. This method uses transdermal optical excitation and acoustic detection of vapor nanobubbles around intraparasite hemozoin. The same device also identified individual malaria parasite–infected Anopheles mosquitoes in a few seconds and can be realized as a low-cost universal tool for clinical and field diagnoses. PMID:26079141

  13. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
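
    An illustrative sketch of the kind of scripted data handling such a standardised format enables: stacking a folder of per-compound Excel concentration-response files into one long-format table. The folder layout, sheet name, and column names are hypothetical and do not reflect the ACuteTox conventions:

        from pathlib import Path
        import pandas as pd

        def collect_concentration_response(folder):
            """Read every standardised Excel file in `folder` and stack the results.

            Assumes each file holds one sheet named 'data' with columns
            'concentration' and 'response'; the compound name is taken from the file name."""
            frames = []
            for path in sorted(Path(folder).glob("*.xlsx")):
                df = pd.read_excel(path, sheet_name="data", usecols=["concentration", "response"])
                df["compound"] = path.stem
                frames.append(df)
            return pd.concat(frames, ignore_index=True)

        # Example: table = collect_concentration_response("raw_data/")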

  14. Industrializing electrophysiology: HT automated patch clamp on SyncroPatch® 96 using instant frozen cells.

    PubMed

    Polonchuk, Liudmila

    2014-01-01

    Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered the profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using the planar patch-clamp technology demonstrated a rather moderate throughput, a few second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch(®) 96 allows throughput to be increased substantially without compromising data quality. This chapter describes features of the innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.

  15. HIGH THROUGHPUT ASSESSMENTS OF CONVENTIONAL AND ALTERNATIVE COMPOUNDS

    EPA Science Inventory

    High throughput approaches for quantifying chemical hazard, exposure, and sustainability have the potential to dramatically impact the pace and nature of risk assessments. Integrated evaluation strategies developed at the US EPA incorporate inherency, bioactivity, bioavailability, ...

  16. A hybrid 2D/3D inspection concept with smart routing optimisation for high throughput, high dynamic range and traceable critical dimension metrology

    NASA Astrophysics Data System (ADS)

    Jones, Christopher W.; O’Connor, Daniel

    2018-07-01

    Dimensional surface metrology is required to enable advanced manufacturing process control for products such as large-area electronics, microfluidic structures, and light management films, where performance is determined by micrometre-scale geometry or roughness formed over metre-scale substrates. While able to perform 100% inspection at a low cost, commonly used 2D machine vision systems are insufficient to assess all of the functionally relevant critical dimensions in such 3D products on their own. While current high-resolution 3D metrology systems are able to assess these critical dimensions, they have a relatively small field of view and are thus much too slow to keep up with full production speeds. A hybrid 2D/3D inspection concept is demonstrated, combining a small field of view, high-performance 3D topography-measuring instrument with a large field of view, high-throughput 2D machine vision system. In this concept, the location of critical dimensions and defects are first registered using the 2D system, then smart routing algorithms and high dynamic range (HDR) measurement strategies are used to efficiently acquire local topography using the 3D sensor. A motion control platform with a traceable position referencing system is used to recreate various sheet-to-sheet and roll-to-roll inline metrology scenarios. We present the artefacts and procedures used to calibrate this hybrid sensor system for traceable dimensional measurement, as well as exemplar measurement of optically challenging industrial test structures.

  17. False discovery rates in spectral identification.

    PubMed

    Jeong, Kyowon; Kim, Sangtae; Bandeira, Nuno

    2012-01-01

    Automated database search engines are one of the fundamental engines of high-throughput proteomics enabling daily identifications of hundreds of thousands of peptides and proteins from tandem mass (MS/MS) spectrometry data. Nevertheless, this automation also makes it humanly impossible to manually validate the vast lists of resulting identifications from such high-throughput searches. This challenge is usually addressed by using a Target-Decoy Approach (TDA) to impose an empirical False Discovery Rate (FDR) at a pre-determined threshold x% with the expectation that at most x% of the returned identifications would be false positives. But despite the fundamental importance of FDR estimates in ensuring the utility of large lists of identifications, there is surprisingly little consensus on exactly how TDA should be applied to minimize the chances of biased FDR estimates. In fact, since less rigorous TDA/FDR estimates tend to result in more identifications (at higher 'true' FDR), there is often little incentive to enforce strict TDA/FDR procedures in studies where the major metric of success is the size of the list of identifications and there are no follow up studies imposing hard cost constraints on the number of reported false positives. Here we address the problem of the accuracy of TDA estimates of empirical FDR. Using MS/MS spectra from samples where we were able to define a factual FDR estimator of 'true' FDR we evaluate several popular variants of the TDA procedure in a variety of database search contexts. We show that the fraction of false identifications can sometimes be over 10× higher than reported and may be unavoidably high for certain types of searches. In addition, we further report that the two-pass search strategy seems the most promising database search strategy. While unavoidably constrained by the particulars of any specific evaluation dataset, our observations support a series of recommendations towards maximizing the number of resulting identifications while controlling database searches with robust and reproducible TDA estimation of empirical FDR.
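
    For reference, a bare-bones sketch of the standard target-decoy estimate of empirical FDR at a score cut-off (decoy hits divided by target hits above the cut-off); the scores below are invented, and the single-factor estimator shown assumes a concatenated search with equal-sized target and decoy databases:

        def tda_fdr(psm_scores, is_decoy, threshold):
            """Empirical TDA estimate of FDR among PSMs scoring at or above `threshold`.

            Assumes a concatenated target-decoy search with equal-sized databases, so
            the decoy count approximates the number of false target identifications."""
            decoys = sum(1 for s, d in zip(psm_scores, is_decoy) if s >= threshold and d)
            targets = sum(1 for s, d in zip(psm_scores, is_decoy) if s >= threshold and not d)
            return decoys / targets if targets else 0.0

        scores = [50, 42, 40, 38, 35, 33, 30, 28]
        decoy = [False, False, False, True, False, False, True, False]
        print(f"FDR at score >= 35: {tda_fdr(scores, decoy, 35):.2%}")  # 1 decoy / 4 targets = 25%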

  18. GiNA, an efficient and high-throughput software for horticultural phenotyping

    USDA-ARS?s Scientific Manuscript database

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed,...

  19. High-throughput quantification of hydroxyproline for determination of collagen.

    PubMed

    Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan

    2011-10-15

    An accurate and high-throughput assay for collagen is essential for collagen research and development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The time required for sample preparation using acid hydrolysis and neutralization prior to assay is what limits the current method for determining hydroxyproline. This work describes the conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline. Copyright © 2011 Elsevier Inc. All rights reserved.
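
    As a small worked illustration of the final conversion step, collagen content is commonly approximated from measured hydroxyproline by assuming a fixed mass fraction of hydroxyproline in collagen (roughly 13-14% for many mammalian collagens); the exact factor depends on the tissue and collagen type and should be checked for the sample at hand:

        def collagen_from_hydroxyproline(hyp_mg, hyp_fraction=0.135):
            """Estimate collagen mass from measured hydroxyproline.

            `hyp_fraction` is the assumed mass fraction of hydroxyproline in collagen
            (an approximation); adjust it for the sample type."""
            return hyp_mg / hyp_fraction

        print(f"{collagen_from_hydroxyproline(1.2):.2f} mg collagen from 1.2 mg hydroxyproline")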

  20. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid the data glut associated with rapid, high-throughput sample analysis.

  1. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakala, Jacqueline Alexandra

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid the data glut associated with rapid, high-throughput sample analysis.

  2. A Memory Efficient Network Encryption Scheme

    NASA Astrophysics Data System (ADS)

    El-Fotouh, Mohamed Abo; Diepold, Klaus

    In this paper, we studied the two encryption schemes widely used in network applications. Shortcomings were found in both schemes: each either consumes more memory to gain high throughput or uses little memory at the cost of low throughput. The need has arisen for a scheme that has low memory requirements and at the same time offers high speed, as the number of internet users increases every day. We used the SSM model [1] to construct an encryption scheme based on the AES. The proposed scheme offers high throughput together with low memory requirements.

  3. HTP-NLP: A New NLP System for High Throughput Phenotyping.

    PubMed

    Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L

    2017-01-01

    Secondary use of clinical data for research requires a method to process the data quickly so that researchers can rapidly extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially-processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.

  4. Combined Effect of Random Transmit Power Control and Inter-Path Interference Cancellation on DS-CDMA Packet Mobile Communications

    NASA Astrophysics Data System (ADS)

    Kudoh, Eisuke; Ito, Haruki; Wang, Zhisen; Adachi, Fumiyuki

    In mobile communication systems, high-speed packet data services are in demand. In high-speed data transmission, throughput degrades severely due to inter-path interference (IPI). Recently, we proposed random transmit power control (TPC) to increase the uplink throughput of DS-CDMA packet mobile communications. In this paper, we apply IPI cancellation in addition to the random TPC. We derive a numerical expression for the received signal-to-interference plus noise power ratio (SINR) and introduce an IPI cancellation factor. We also derive a numerical expression for the system throughput when IPI is cancelled ideally, for comparison with the system throughput evaluated numerically by Monte Carlo computation. Then we evaluate, by the Monte Carlo numerical computation method, the combined effect of random TPC and IPI cancellation on the uplink throughput of DS-CDMA packet mobile communications.
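
    As an illustration only (the paper's actual derivation and notation are not reproduced here), a generic post-RAKE SINR with a fractional IPI cancellation factor might be written as:

        % Illustrative form only; symbols are generic assumptions, not the paper's notation.
        % P_0: desired-path signal power, I_IPI: inter-path interference power,
        % I_MAI: multi-access interference power, N_0: noise power,
        % \varphi: IPI cancellation factor (\varphi = 1 corresponds to ideal cancellation).
        \mathrm{SINR} \approx \frac{P_0}{(1 - \varphi)\, I_{\mathrm{IPI}} + I_{\mathrm{MAI}} + N_0}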

  5. Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)

    EPA Science Inventory

    High throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...

  6. High-throughput high-volume nuclear imaging for preclinical in vivo compound screening§.

    PubMed

    Macholl, Sven; Finucane, Ciara M; Hesterman, Jacob; Mather, Stephen J; Pauplis, Rachel; Scully, Deirdre; Sosabowski, Jane K; Jouannot, Erwan

    2017-12-01

    Preclinical single-photon emission computed tomography (SPECT)/CT imaging studies are hampered by low throughput, hence are found typically within small volume feasibility studies. Here, imaging and image analysis procedures are presented that allow profiling of a large volume of radiolabelled compounds within a reasonably short total study time. Particular emphasis was put on quality control (QC) and on fast and unbiased image analysis. 2-3 His-tagged proteins were simultaneously radiolabelled by 99m Tc-tricarbonyl methodology and injected intravenously (20 nmol/kg; 100 MBq; n = 3) into patient-derived xenograft (PDX) mouse models. Whole-body SPECT/CT images of 3 mice simultaneously were acquired 1, 4, and 24 h post-injection, extended to 48 h and/or by 0-2 h dynamic SPECT for pre-selected compounds. Organ uptake was quantified by automated multi-atlas and manual segmentations. Data were plotted automatically, quality controlled and stored on a collaborative image management platform. Ex vivo uptake data were collected semi-automatically and analysis performed as for imaging data. >500 single animal SPECT images were acquired for 25 proteins over 5 weeks, eventually generating >3500 ROI and >1000 items of tissue data. SPECT/CT images clearly visualized uptake in tumour and other tissues even at 48 h post-injection. Intersubject uptake variability was typically 13% (coefficient of variation, COV). Imaging results correlated well with ex vivo data. The large data set of tumour, background and systemic uptake/clearance data from 75 mice for 25 compounds allows identification of compounds of interest. The number of animals required was reduced considerably by longitudinal imaging compared to dissection experiments. All experimental work and analyses were accomplished within 3 months expected to be compatible with drug development programmes. QC along all workflow steps, blinding of the imaging contract research organization to compound properties and automation provide confidence in the data set. Additional ex vivo data were useful as a control but could be omitted from future studies in the same centre. For even larger compound libraries, radiolabelling could be expedited and the number of imaging time points adapted to increase weekly throughput. Multi-atlas segmentation could be expanded via SPECT/MRI; however, this would require an MRI-compatible mouse hotel. Finally, analysis of nuclear images of radiopharmaceuticals in clinical trials may benefit from the automated analysis procedures developed.

  7. Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment

    DOE PAGES

    Yan, Qimin; Yu, Jie; Suram, Santosh K.; ...

    2017-03-06

    The limited number of known low-band-gap photoelectrocatalytic materials poses a significant challenge for the generation of chemical fuels from sunlight. Here, using high-throughput ab initio theory with experiments in an integrated workflow, we find eight ternary vanadate oxide photoanodes in the target band-gap range (1.2-2.8 eV). Detailed analysis of these vanadate compounds reveals the key role of VO4 structural motifs and electronic band-edge character in efficient photoanodes, initiating a genome for such materials and paving the way for a broadly applicable high-throughput-discovery and materials-by-design feedback loop. Considerably expanding the number of known photoelectrocatalysts for water oxidation, our study establishes ternary metal vanadates as a prolific class of photoanode materials for generation of chemical fuels from sunlight and demonstrates our high-throughput theory-experiment pipeline as a prolific approach to materials discovery.

  8. Microfluidics for cell-based high throughput screening platforms - A review.

    PubMed

    Du, Guansheng; Fang, Qun; den Toonder, Jaap M J

    2016-01-15

    In the last decades, the basic techniques of microfluidics for the study of cells such as cell culture, cell separation, and cell lysis, have been well developed. Based on cell handling techniques, microfluidics has been widely applied in the field of PCR (Polymerase Chain Reaction), immunoassays, organ-on-chip, stem cell research, and analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows rapid analysis of thousands of chemical, biochemical, genetic or pharmacological tests in parallel. In this review, we summarize the application of microfluidics in cell-based high throughput screening. The screening methods mentioned in this paper include approaches using the perfusion flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidic based high throughput screening platform for drug discovery. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Qimin; Yu, Jie; Suram, Santosh K.

    The limited number of known low-band-gap photoelectrocatalytic materials poses a significant challenge for the generation of chemical fuels from sunlight. Here, using high-throughput ab initio theory with experiments in an integrated workflow, we find eight ternary vanadate oxide photoanodes in the target band-gap range (1.2-2.8 eV). Detailed analysis of these vanadate compounds reveals the key role of VO4 structural motifs and electronic band-edge character in efficient photoanodes, initiating a genome for such materials and paving the way for a broadly applicable high-throughput-discovery and materials-by-design feedback loop. Considerably expanding the number of known photoelectrocatalysts for water oxidation, our study establishes ternary metal vanadates as a prolific class of photoanode materials for generation of chemical fuels from sunlight and demonstrates our high-throughput theory-experiment pipeline as a prolific approach to materials discovery.

  10. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  11. Development and Validation of an Automated High-Throughput System for Zebrafish In Vivo Screenings

    PubMed Central

    Virto, Juan M.; Holgado, Olaia; Diez, Maria; Izpisua Belmonte, Juan Carlos; Callol-Massot, Carles

    2012-01-01

    The zebrafish is a vertebrate model compatible with the paradigms of drug discovery. The small size and transparency of zebrafish embryos make them amenable for the automation necessary in high-throughput screenings. We have developed an automated high-throughput platform for in vivo chemical screenings on zebrafish embryos that includes automated methods for embryo dispensation, compound delivery, incubation, imaging and analysis of the results. At present, two different assays to detect cardiotoxic compounds and angiogenesis inhibitors can be automatically run in the platform, showing the versatility of the system. A validation of these two assays with known positive and negative compounds, as well as a screening for the detection of unknown anti-angiogenic compounds, have been successfully carried out in the system developed. We present a totally automated platform that allows for high-throughput screenings in a vertebrate organism. PMID:22615792

  12. BiQ Analyzer HT: locus-specific analysis of DNA methylation by high-throughput bisulfite sequencing

    PubMed Central

    Lutsik, Pavlo; Feuerbach, Lars; Arand, Julia; Lengauer, Thomas; Walter, Jörn; Bock, Christoph

    2011-01-01

    Bisulfite sequencing is a widely used method for measuring DNA methylation in eukaryotic genomes. The assay provides single-base pair resolution and, given sufficient sequencing depth, its quantitative accuracy is excellent. High-throughput sequencing of bisulfite-converted DNA can be applied either genome wide or targeted to a defined set of genomic loci (e.g. using locus-specific PCR primers or DNA capture probes). Here, we describe BiQ Analyzer HT (http://biq-analyzer-ht.bioinf.mpi-inf.mpg.de/), a user-friendly software tool that supports locus-specific analysis and visualization of high-throughput bisulfite sequencing data. The software facilitates the shift from time-consuming clonal bisulfite sequencing to the more quantitative and cost-efficient use of high-throughput sequencing for studying locus-specific DNA methylation patterns. In addition, it is useful for locus-specific visualization of genome-wide bisulfite sequencing data. PMID:21565797
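
    A toy sketch of the basic quantity such tools report, per-CpG methylation rates from reads aligned to a reference amplicon; it assumes bisulfite-converted (unmethylated) cytosines appear as 'T' in the reads and is not BiQ Analyzer HT's algorithm:

        def cpg_methylation_rates(reference, reads):
            """Per-CpG methylation rate from bisulfite reads aligned to `reference`.

            At each CpG position, a read base 'C' is counted as methylated and 'T' as
            unmethylated (bisulfite-converted); other bases are ignored."""
            cpg_positions = [i for i in range(len(reference) - 1) if reference[i:i + 2] == "CG"]
            rates = {}
            for pos in cpg_positions:
                meth = sum(1 for r in reads if len(r) > pos and r[pos] == "C")
                unmeth = sum(1 for r in reads if len(r) > pos and r[pos] == "T")
                total = meth + unmeth
                rates[pos] = meth / total if total else None
            return rates

        reference = "ACGTTCGA"
        reads = ["ACGTTTGA", "ATGTTCGA", "ACGTTCGA"]
        print(cpg_methylation_rates(reference, reads))  # CpG sites at positions 1 and 5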

  13. A Self-Reporting Photocatalyst for Online Fluorescence Monitoring of High Throughput RAFT Polymerization.

    PubMed

    Yeow, Jonathan; Joshi, Sanket; Chapman, Robert; Boyer, Cyrille Andre Jean Marie

    2018-04-25

    Translating controlled/living radical polymerization (CLRP) from batch to the high throughput production of polymer libraries presents several challenges in terms of both polymer synthesis and characterization. Although recently there have been significant advances in the field of low volume, high throughput CLRP, techniques able to simultaneously monitor multiple polymerizations in an "online" manner have not yet been developed. Here, we report our discovery that 5,10,15,20-tetraphenyl-21H,23H-porphine zinc (ZnTPP) is a self-reporting photocatalyst that can mediate PET-RAFT polymerization as well as report on monomer conversion via changes in its fluorescence properties. This enables the use of a microplate reader to conduct high throughput "online" monitoring of PET-RAFT polymerizations performed directly in 384-well, low volume microtiter plates. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE PAGES

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...

    2017-03-28

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  15. Model-based high-throughput design of ion exchange protein chromatography.

    PubMed

    Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo

    2016-08-12

    This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction, high throughput experimentation using experiments in (2) diluted conditions and (3) robotic automated liquid handling workstations (robotic workstation), and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine operating parameter space that allows for satisfactory purification of the protein of interest on the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work. We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate it has the ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

    We have shown that by covalently modifying a subpopulation, less than or equal to 1%, of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and the presence of the probe at low concentrations does not affect the X-ray data quality or the crystallization behavior. The presence of the trace fluorescent label gives a number of advantages when used with high throughput crystallizations. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a dark background. Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Brightly fluorescent crystals are readily found against less bright precipitated phases, which under white light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries as the protein or protein structures is all that shows up. Fluorescence intensity is a faster search parameter, whether visually or by automated methods, than looking for crystalline features. We are now testing the use of high fluorescence intensity regions, in the absence of clear crystalline features or "hits", as a means for determining potential lead conditions. A working hypothesis is that kinetics leading to non-structured phases may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Preliminary experiments with test proteins have resulted in the extraction of a number of crystallization conditions from screening outcomes based solely on the presence of bright fluorescent regions. Subsequent experiments will test this approach using a wider range of proteins. The trace fluorescently labeled crystals will also emit with sufficient intensity to aid in the automation of crystal alignment using relatively low cost optics, further increasing throughput at synchrotrons.

  17. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-08-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple bacterial isolates. This will be a powerful experimental tool facilitating the study of bacterial invasion, drug resistance, and the development of new therapeutics. Copyright © 2017 Elsevier B.V. All rights reserved.
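
    A condensed sketch of the per-cell aggregation step only (label GFP-positive spots, assign each to the nearest nucleus, report counts and sizes); the binary masks and nucleus centres below are synthetic stand-ins for what segmentation of the microscope images would provide:

        import numpy as np
        from scipy import ndimage

        # Hypothetical masks/centres derived from the GFP (bacteria) and nuclear channels.
        gfp_mask = np.zeros((50, 50), dtype=bool)
        gfp_mask[5:9, 5:9] = True      # colony inside cell 1
        gfp_mask[7:10, 30:36] = True   # colony inside cell 2
        gfp_mask[40:43, 40:44] = True  # colony inside cell 3
        nuclei_centres = np.array([[7, 7], [8, 33], [41, 42]])  # one centre per detected cell

        # Label connected GFP-positive regions as individual intracellular colonies.
        labels, n_colonies = ndimage.label(gfp_mask)
        idx = range(1, n_colonies + 1)
        sizes = ndimage.sum(gfp_mask, labels, index=idx)
        centres = np.array(ndimage.center_of_mass(gfp_mask, labels, idx))

        # Assign each colony to the nearest nucleus and report per-cell colony counts.
        nearest = np.argmin(np.linalg.norm(centres[:, None, :] - nuclei_centres[None, :, :], axis=2), axis=1)
        per_cell = np.bincount(nearest, minlength=len(nuclei_centres))
        print("colonies:", n_colonies, "sizes (px):", sizes.astype(int), "per cell:", per_cell)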

  18. High Throughput Genotoxicity Profiling of the US EPA ToxCast Chemical Library

    EPA Science Inventory

    A key aim of the ToxCast project is to investigate modern molecular and genetic high content and high throughput screening (HTS) assays, along with various computational tools to supplement and perhaps replace traditional assays for evaluating chemical toxicity. Genotoxicity is a...

  19. Minimally invasive surgery. Future developments.

    PubMed

    Wickham, J E

    1994-01-15

    The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with an increased throughput of patients. Operating theatres will have to be equipped with complex high technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures.

  20. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering

    PubMed Central

    Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan

    2016-01-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875

  1. A high-throughput next-generation sequencing-based method for detecting the mutational fingerprint of carcinogens

    PubMed Central

    Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella

    2012-01-01

    Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue® mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents. PMID:22735701

  2. A high-throughput next-generation sequencing-based method for detecting the mutational fingerprint of carcinogens.

    PubMed

    Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella

    2012-08-01

    Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents.

  3. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    PubMed

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative about biopolymer folding and ligand interactions, they require considerable manual intervention for cleaning and setup. As such, the throughput of such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort of maintaining high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, has helped attract new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  4. MRM validation of targeted nonglycosylated peptides from N-glycoprotein biomarkers using direct trypsin digestion of undepleted human plasma.

    PubMed

    Lee, Ju Yeon; Kim, Jin Young; Cheon, Mi Hee; Park, Gun Wook; Ahn, Yeong Hee; Moon, Myeong Hee; Yoo, Jong Shin

    2014-02-26

    A rapid, simple, and reproducible MRM-based validation method for serological glycoprotein biomarkers in clinical use was developed by targeting the nonglycosylated tryptic peptides adjacent to N-glycosylation sites. Since changes in protein glycosylation are known to be associated with a variety of diseases, glycoproteins have been major targets in biomarker discovery. We previously found that nonglycosylated tryptic peptides adjacent to N-glycosylation sites differed in concentration between normal and hepatocellular carcinoma (HCC) plasma due to differences in steric hindrance of the glycan moiety in N-glycoproteins to tryptic digestion (Lee et al., 2011). To increase the feasibility and applicability of clinical validation of biomarker candidates (nonglycosylated tryptic peptides), we developed a method to effectively monitor nonglycosylated tryptic peptides from a large number of plasma samples and to reduce the total analysis time while maximizing the effect of steric hindrance by the glycans during digestion of glycoproteins. The AUC values of the targeted nonglycosylated tryptic peptides were excellent (0.955 for GQYCYELDEK, 0.880 for FEDGVLDPDYPR and 0.907 for TEDTIFLR), indicating that these could be effective biomarkers for hepatocellular carcinoma. This method provides the necessary throughput required to validate glycoprotein biomarkers, as well as quantitative accuracy for human plasma analysis, and should be amenable to clinical use. Difficulties in verifying and validating putative protein biomarkers are often caused by the complex sample preparation procedures required to determine their concentrations in a large number of plasma samples. To overcome these difficulties, we developed MRM-based protein biomarker assays that greatly reduce the complex, time-consuming, and poorly reproducible sample pretreatment steps in plasma for clinical implementation. First, we used undepleted human plasma samples without any enrichment procedures. Using nanoLC/MS/MS, we targeted nonglycosylated tryptic peptides adjacent to N-linked glycosylation sites in N-linked glycoprotein biomarkers, which could be detected in human plasma samples without depleting highly abundant proteins. Second, human plasma proteins were digested with trypsin without reduction and alkylation procedures to minimize sample preparation. Third, trypsin digestion times were shortened so as to obtain reproducible results while maximizing the steric hindrance effect of the glycans during enzyme digestion. Finally, this rapid and simple sample preparation method was applied to validate the targeted nonglycosylated tryptic peptides as liver cancer biomarker candidates for diagnosis in 40 normal and 41 hepatocellular carcinoma (HCC) human plasma samples. This strategy provided the necessary throughput required to monitor protein biomarkers, as well as quantitative accuracy in human plasma analysis. From biomarker discovery to clinical implementation, our method will provide a biomarker study platform that is suitable for clinical deployment, and can be applied to high-throughput approaches. Copyright © 2014 Elsevier B.V. All rights reserved.
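
    The reported AUC values summarize how well each surrogate peptide separates HCC from normal plasma; as a purely illustrative sketch (invented peak areas, not the study's data), such an AUC can be computed from MRM intensities and sample labels as follows.

        # Toy ROC AUC for one surrogate peptide; intensities are invented, not from the study.
        from sklearn.metrics import roc_auc_score

        labels      = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]                       # 0 = normal, 1 = HCC plasma
        intensities = [1.2, 0.9, 1.1, 1.3, 1.0, 2.4, 1.9, 2.8, 2.2, 1.7]   # normalized MRM peak areas

        print(f"toy AUC: {roc_auc_score(labels, intensities):.3f}")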

  5. Expedient Caution: Approximating Exposure and Dosimetry to Understand Chemical Risk (OSU EMT Research Day keynote presentation)

    EPA Science Inventory

    I describe research on high throughput exposure and toxicokinetics. These tools provide context for data generated by high throughput toxicity screening to allow risk-based prioritization of thousands of chemicals.

  6. MIPHENO: Data normalization for high throughput metabolic analysis.

    EPA Science Inventory

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  7. High-Throughput Pharmacokinetics for Environmental Chemicals (SOT)

    EPA Science Inventory

    High throughput screening (HTS) promises to allow prioritization of thousands of environmental chemicals with little or no in vivo information. For bioactivity identified by HTS, toxicokinetic (TK) models are essential to predict exposure thresholds below which no significant bio...

  8. High Throughput Computing Impact on Meta Genomics (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Gore, Brooklin

    2018-02-01

    This presentation includes a brief background on High Throughput Computing, correlating gene transcription factors, optical mapping, genotype to phenotype mapping via QTL analysis, and current work on next gen sequencing.

  9. Identifying drought adaptive traits in upland cotton using a proximal sensing cart for high-throughput phenotyping

    USDA-ARS?s Scientific Manuscript database

    Field-based high-throughput phenotyping is an emerging approach to characterize difficult, time-sensitive plant traits in relevant growing conditions. Proximal sensing carts have been developed as an alternative platform to more costly high-clearance tractors for phenotyping dynamic traits in the fi...

  10. High-throughput profiling and analysis of plant responses over time to abiotic stress

    USDA-ARS?s Scientific Manuscript database

    Energy sorghum (Sorghum bicolor (L.) Moench) is a rapidly growing, high-biomass, annual crop prized for abiotic stress tolerance. Measuring genotype-by-environment (G x E) interactions remains a progress bottleneck. High throughput phenotyping within controlled environments has been proposed as a po...

  11. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    EPA Science Inventory

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  12. A high-throughput multiplex method adapted for GMO detection.

    PubMed

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxa endogenous reference genes, from GMO constructions, screening targets, construct-specific, and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  13. RIPiT-Seq: A high-throughput approach for footprinting RNA:protein complexes

    PubMed Central

    Singh, Guramrit; Ricci, Emiliano P.; Moore, Melissa J.

    2013-01-01

    Development of high-throughput approaches to map the RNA interaction sites of individual RNA binding proteins (RBPs) transcriptome-wide is rapidly transforming our understanding of post-transcriptional gene regulatory mechanisms. Here we describe a ribonucleoprotein (RNP) footprinting approach we recently developed for identifying occupancy sites of both individual RBPs and multi-subunit RNP complexes. RNA:protein immunoprecipitation in tandem (RIPiT) yields highly specific RNA footprints of cellular RNPs isolated via two sequential purifications; the resulting RNA footprints can then be identified by high-throughput sequencing (Seq). RIPiT-Seq is broadly applicable to all RBPs regardless of their RNA binding mode and thus provides a means to map the RNA binding sites of RBPs with poor inherent ultraviolet (UV) crosslinkability. Further, among current high-throughput approaches, RIPiT has the unique capacity to differentiate binding sites of RNPs with overlapping protein composition. It is therefore particularly suited for studying dynamic RNP assemblages whose composition evolves as gene expression proceeds. PMID:24096052

  14. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  15. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930

  16. Automatic and integrated micro-enzyme assay (AIμEA) platform for highly sensitive thrombin analysis via an engineered fluorescence protein-functionalized monolithic capillary column.

    PubMed

    Lin, Lihua; Liu, Shengquan; Nie, Zhou; Chen, Yingzhuang; Lei, Chunyang; Wang, Zhen; Yin, Chao; Hu, Huiping; Huang, Yan; Yao, Shouzhuo

    2015-04-21

    Nowadays, large-scale screening for enzyme discovery, enzyme engineering, and drug discovery requires simple, fast, and sensitive enzyme activity assay platforms with high integration and potential for high-throughput detection. Herein, a novel automatic and integrated micro-enzyme assay (AIμEA) platform was proposed based on a unique microreaction system fabricated from an engineered green fluorescent protein (GFP)-functionalized monolithic capillary column, with thrombin as an example. The recombinant GFP probe was rationally engineered to possess a His-tag and a substrate sequence of thrombin, which enable it to be immobilized on the monolith via metal affinity binding and to be released after thrombin digestion. Combined with capillary electrophoresis-laser-induced fluorescence (CE-LIF), all the procedures, including thrombin injection, online enzymatic digestion in the microreaction system, and label-free detection of the released GFP, were integrated in a single electrophoretic process. By taking advantage of the ultrahigh loading capacity of the AIμEA platform and the automated programming of the CE setup, one microreaction column was sufficient for many rounds of digestion without replacement. The novel microreaction system showed significantly enhanced catalytic efficiency, about 30-fold higher than that of the equivalent bulk reaction. Accordingly, the AIμEA platform was highly sensitive, with a limit of detection down to 1 pM of thrombin. Moreover, the AIμEA platform was robust and reliable for detecting thrombin in human serum samples and its inhibition by hirudin. Hence, this AIμEA platform exhibits great potential for high-throughput analysis in future biological applications, disease diagnostics, and drug screening.

  17. Detecting SNPs and estimating allele frequencies in clonal bacterial populations by sequencing pooled DNA.

    PubMed

    Holt, Kathryn E; Teo, Yik Y; Li, Heng; Nair, Satheesh; Dougan, Gordon; Wain, John; Parkhill, Julian

    2009-08-15

    Here, we present a method for estimating the frequencies of SNP alleles present within pooled samples of DNA using high-throughput short-read sequencing. The method was tested on real data from six strains of the highly monomorphic pathogen Salmonella Paratyphi A, sequenced individually and in a pool. A variety of read mapping and quality-weighting procedures were tested to determine the optimal parameters, which afforded > or =80% sensitivity of SNP detection and strong correlation with true SNP frequency at poolwide read depth of 40x, declining only slightly at read depths 20-40x. The method was implemented in Perl and relies on the opensource software Maq for read mapping and SNP calling. The Perl script is freely available from ftp://ftp.sanger.ac.uk/pub/pathogens/pools/.
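
    As a minimal illustration of the underlying calculation (a simplified stand-in for the authors' Maq/Perl pipeline; the quality-weighting scheme shown here is an assumption), the non-reference allele frequency at a single site can be estimated from the pooled pileup of base calls and Phred qualities.

        # Toy quality-weighted allele-frequency estimate at one SNP site in a pooled sample.
        # Each tuple is (called_base, phred_quality) for one read covering the site.
        pileup = [("A", 35), ("A", 30), ("G", 38), ("G", 25), ("A", 40),
                  ("G", 33), ("A", 20), ("G", 37), ("G", 30), ("A", 36)]
        ref, alt = "A", "G"

        def weight(q):
            # Weight each base call by its probability of being correct, 1 - 10^(-Q/10).
            return 1.0 - 10 ** (-q / 10.0)

        ref_w = sum(weight(q) for base, q in pileup if base == ref)
        alt_w = sum(weight(q) for base, q in pileup if base == alt)

        print(f"estimated alt-allele frequency: {alt_w / (ref_w + alt_w):.2f} "
              f"at {len(pileup)}x pooled read depth")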

  18. Post-SM4 Sensitivity Calibration of the STIS Echelle Modes

    NASA Astrophysics Data System (ADS)

    Bostroem, K. Azalee; Aloisi, A.; Bohlin, R.; Hodge, P.; Proffitt, C.

    2012-01-01

    On-orbit sensitivity curves for all echelle modes were derived for post-Servicing Mission 4 data using observations of the DA white dwarf G191-B2B. Additionally, new echelle ripple tables and grating-dependent bad pixel tables were created for the FUV and NUV MAMA. We review the procedures used to derive the adopted throughputs and implement them in the pipeline as well as the motivation for the modification of the additional reference files and pipeline procedures.

  19. A high-throughput label-free nanoparticle analyser.

    PubMed

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.

  20. High Performance Computing Modernization Program Kerberos Throughput Test Report

    DTIC Science & Technology

    2017-10-26

    functionality as Kerberos plugins. The pre-release production kit was used in these tests to compare against the current release kit. YubiKey support... Throughput testing was done to determine the benefits of the pre-... both the current release kit and the pre-release production kit for a total of 378 individual tests in order to note any improvements. Based on work

  1. Metabolomics Approach for Toxicity Screening of Volatile Substances

    EPA Science Inventory

    In 2007 the National Research Council envisioned the need for inexpensive, high throughput, cell based toxicity testing methods relevant to human health. High Throughput Screening (HTS) in vitro screening approaches have addressed these problems by using robotics. However, the ch...

  2. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  3. New High Throughput Methods to Estimate Chemical Exposure

    EPA Science Inventory

    EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...

  4. Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays

    EPA Science Inventory

    High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...

  5. High-throughput crystal-optimization strategies in the South Paris Yeast Structural Genomics Project: one size fits all?

    PubMed

    Leulliot, Nicolas; Trésaugues, Lionel; Bremang, Michael; Sorel, Isabelle; Ulryck, Nathalie; Graille, Marc; Aboulfath, Ilham; Poupon, Anne; Liger, Dominique; Quevillon-Cheruel, Sophie; Janin, Joël; van Tilbeurgh, Herman

    2005-06-01

    Crystallization has long been regarded as one of the major bottlenecks in high-throughput structural determination by X-ray crystallography. Structural genomics projects have addressed this issue by using robots to set up automated crystal screens using nanodrop technology. This has moved the bottleneck from obtaining the first crystal hit to obtaining diffraction-quality crystals, as crystal optimization is a notoriously slow process that is difficult to automate. This article describes the high-throughput optimization strategies used in the Yeast Structural Genomics project, with selected successful examples.

  6. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    NASA Astrophysics Data System (ADS)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
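
    Discriminating positive from negative samples in such an assay comes down to extracting a fluorescence lifetime from each droplet's photon-arrival histogram; the sketch below fits a mono-exponential decay to a synthetic TCSPC trace (all numbers invented; this is not the authors' analysis code) to show that step.

        # Toy mono-exponential lifetime fit of a synthetic TCSPC decay histogram.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.linspace(0.0, 20.0, 200)          # time bins in ns
        true_tau = 3.5
        counts = 1000.0 * np.exp(-t / true_tau) + np.random.poisson(5, t.size)

        def decay(t, amplitude, tau, background):
            return amplitude * np.exp(-t / tau) + background

        popt, _ = curve_fit(decay, t, counts, p0=(1000.0, 2.0, 5.0))
        print(f"fitted lifetime: {popt[1]:.2f} ns (simulated value {true_tau} ns)")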

  7. High Throughput Determination of Critical Human Dosing ...

    EPA Pesticide Factsheets

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data into predicted human equivalent doses that can be linked with biologically relevant exposure scenarios. Thus, HTTK provides essential data for risk prioritization for thousands of chemicals that lack TK data. One critical HTTK parameter that can be measured in vitro is the unbound fraction of a chemical in plasma (Fub). However, for chemicals that bind strongly to plasma, Fub is below the limits of detection (LOD) for high throughput analytical chemistry, and therefore cannot be quantified. A novel method for quantifying Fub was implemented for 85 strategically selected chemicals: measurement of Fub was attempted at 10%, 30%, and 100% of physiological plasma concentrations using rapid equilibrium dialysis assays. Varying plasma concentrations instead of chemical concentrations makes high throughput analytical methodology more likely to be successful. Assays at 100% plasma concentration were unsuccessful for 34 chemicals. For 12 of these 34 chemicals, Fub could be quantified at 10% and/or 30% plasma concentrations; these results imply that the assay failure at 100% plasma concentration was caused by plasma protein binding for these chemicals. Assay failure for the remaining 22 chemicals may
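
    When Fub is measurable only in diluted plasma, the value in 100% plasma is commonly back-calculated with a standard dilution-correction relationship; the sketch below applies that textbook formula (shown as general practice and with an invented example value, not necessarily the exact procedure used in this work).

        # Back-calculate fraction unbound in undiluted plasma from a diluted-plasma measurement.
        # Standard dilution correction: fu = (1/D) / ((1/fu_d - 1) + 1/D), where D is the
        # dilution factor (D = 10 for a 10% plasma assay) and fu_d the measured fraction unbound.
        def undiluted_fu(fu_diluted, dilution_factor):
            d = dilution_factor
            return (1.0 / d) / ((1.0 / fu_diluted - 1.0) + 1.0 / d)

        # Hypothetical example: Fub measured as 0.05 in 10% plasma.
        print(f"Fub in 100% plasma: {undiluted_fu(0.05, 10):.4f}")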

  8. Genome-wide RNAi Screening to Identify Host Factors That Modulate Oncolytic Virus Therapy.

    PubMed

    Allan, Kristina J; Mahoney, Douglas J; Baird, Stephen D; Lefebvre, Charles A; Stojdl, David F

    2018-04-03

    High-throughput genome-wide RNAi (RNA interference) screening technology has been widely used for discovering host factors that impact virus replication. Here we present the application of this technology to uncovering host targets that specifically modulate the replication of Maraba virus, an oncolytic rhabdovirus, and vaccinia virus with the goal of enhancing therapy. While the protocol has been tested for use with oncolytic Maraba virus and oncolytic vaccinia virus, this approach is applicable to other oncolytic viruses and can also be utilized for identifying host targets that modulate virus replication in mammalian cells in general. This protocol describes the development and validation of an assay for high-throughput RNAi screening in mammalian cells, the key considerations and preparation steps important for conducting a primary high-throughput RNAi screen, and a step-by-step guide for conducting a primary high-throughput RNAi screen; in addition, it broadly outlines the methods for conducting secondary screen validation and tertiary validation studies. The benefit of high-throughput RNAi screening is that it allows one to catalogue, in an extensive and unbiased fashion, host factors that modulate any aspect of virus replication for which one can develop an in vitro assay such as infectivity, burst size, and cytotoxicity. It has the power to uncover biotherapeutic targets unforeseen based on current knowledge.

  9. X-ray transparent microfluidic chips for high-throughput screening and optimization of in meso membrane protein crystallization

    PubMed Central

    Schieferstein, Jeremy M.; Pawate, Ashtamurthy S.; Wan, Frank; Sheraden, Paige N.; Broecker, Jana; Ernst, Oliver P.; Gennis, Robert B.

    2017-01-01

    Elucidating and clarifying the function of membrane proteins ultimately requires atomic resolution structures as determined most commonly by X-ray crystallography. Many high impact membrane protein structures have resulted from advanced techniques such as in meso crystallization that present technical difficulties for the set-up and scale-out of high-throughput crystallization experiments. In prior work, we designed a novel, low-throughput X-ray transparent microfluidic device that automated the mixing of protein and lipid by diffusion for in meso crystallization trials. Here, we report X-ray transparent microfluidic devices for high-throughput crystallization screening and optimization that overcome the limitations of scale and demonstrate their application to the crystallization of several membrane proteins. Two complementary chips are presented: (1) a high-throughput screening chip to test 192 crystallization conditions in parallel using as little as 8 nl of membrane protein per well and (2) a crystallization optimization chip to rapidly optimize preliminary crystallization hits through fine-gradient re-screening. We screened three membrane proteins for new in meso crystallization conditions, identifying several preliminary hits that we tested for X-ray diffraction quality. Further, we identified and optimized the crystallization condition for a photosynthetic reaction center mutant and solved its structure to a resolution of 3.5 Å. PMID:28469762

  10. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    PubMed

    Liu, Guangbo; Lanham, Clayton; Buchan, J Ross; Kaplan, Matthew E

    2017-01-01

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium Acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long established method. However, a reliable and optimized high throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high throughput molecular and/or genetic analysis of yeast.

  11. High-Throughput Toxicity Testing: New Strategies for ...

    EPA Pesticide Factsheets

    In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it

  12. Sensitivity of STIS First-Order Medium Resolution Modes

    NASA Astrophysics Data System (ADS)

    Proffitt, Charles R.

    2006-07-01

    The sensitivities for STIS first-order medium resolution modes were redetermined using on-orbit observations of the standard DA white dwarfs G191-B2B, GD 71, and GD 153. We review the procedures and assumptions used to derive the adopted throughputs, and discuss the remaining errors and uncertainties.

  13. Lead discovery for mammalian elongation of long chain fatty acids family 6 using a combination of high-throughput fluorescent-based assay and RapidFire mass spectrometry assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takamiya, Mari; Sakurai, Masaaki

    A high-throughput RapidFire mass spectrometry assay is described for elongation of very long-chain fatty acids family 6 (Elovl6). Elovl6 is a microsomal enzyme that regulates the elongation of C12-16 saturated and monounsaturated fatty acids. Elovl6 may be a new therapeutic target for fat metabolism disorders such as obesity, type 2 diabetes, and nonalcoholic steatohepatitis. To identify new Elovl6 inhibitors, we developed a high-throughput fluorescence screening assay in 1536-well format. However, a number of false positives caused by fluorescent interference have been identified. To pick up the real active compounds among the primary hits from the fluorescence assay, we developed a RapidFire mass spectrometry assay and a conventional radioisotope assay. These assays have the advantage of detecting the main products directly without using fluorescent-labeled substrates. As a result, 276 compounds (30%) of the primary hits (921 compounds) in a fluorescence ultra-high-throughput screening method were identified as common active compounds in these two assays. It is concluded that both methods are very effective to eliminate false positives. Compared with the radioisotope method using an expensive ¹⁴C-labeled substrate, the RapidFire mass spectrometry method using unlabeled substrates is a high-accuracy, high-throughput method. In addition, some of the hit compounds selected from the screening inhibited cellular fatty acid elongation in HEK293 cells expressing Elovl6 transiently. This result suggests that these compounds may be promising lead candidates for therapeutic drugs. Ultra-high-throughput fluorescence screening followed by a RapidFire mass spectrometry assay was a suitable strategy for lead discovery against Elovl6. Highlights: • A novel assay for elongation of very-long-chain fatty acids 6 (Elovl6) is proposed. • RapidFire mass spectrometry (RF-MS) assay is useful to select real screening hits. • RF-MS assay is proved to be beneficial because of its high-throughput and accuracy. • A combination of fluorescent and RF-MS assays is effective for Elovl6 inhibitors.

  14. High-throughput Screening of ToxCast™ Phase I Chemicals in an Embryonic Stem Cell Assay Reveals Potential Disruption of a Critical Developmental Signaling Pathway

    EPA Science Inventory

    Little is known about the developmental toxicity of the expansive chemical landscape in existence today. Significant efforts are being made to apply novel methods to predict developmental activity of chemicals utilizing high-throughput screening (HTS) and high-content screening (...

  15. Comparison of Human Induced Pluripotent Stem Cell-Derived Neurons and Rat Primary Cortical Neurons as In Vitro Models of Neurite Outgrowth

    EPA Science Inventory

    High-throughput assays that can quantify chemical-induced changes at the cellular and molecular level have been recommended for use in chemical safety assessment. High-throughput, high content imaging assays for the key cellular events of neurodevelopment have been proposed to ra...

  16. Evaluation of sequencing approaches for high-throughput toxicogenomics (SOT)

    EPA Science Inventory

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. We present the evaluation of three toxicogenomics platfo...

  17. High Throughput Assays and Exposure Science (ISES annual meeting)

    EPA Science Inventory

    High throughput screening (HTS) data characterizing chemical-induced biological activity has been generated for thousands of environmentally-relevant chemicals by the US inter-agency Tox21 and the US EPA ToxCast programs. For a limited set of chemicals, bioactive concentrations r...

  18. High Throughput Exposure Estimation Using NHANES Data (SOT)

    EPA Science Inventory

    In the ExpoCast project, high throughput (HT) exposure models enable rapid screening of large numbers of chemicals for exposure potential. Evaluation of these models requires empirical exposure data and due to the paucity of human metabolism/exposure data such evaluations includ...

  19. Atlanta I-85 HOV-to-HOT conversion : analysis of vehicle and person throughput.

    DOT National Transportation Integrated Search

    2013-10-01

    This report summarizes the vehicle and person throughput analysis for the High Occupancy Vehicle to High Occupancy Toll Lane : conversion in Atlanta, GA, undertaken by the Georgia Institute of Technology research team. The team tracked changes in : o...

  20. EMBRYONIC VASCULAR DISRUPTION ADVERSE OUTCOMES: LINKING HIGH THROUGHPUT SIGNALING SIGNATURES WITH FUNCTIONAL CONSEQUENCES

    EPA Science Inventory

    Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...

  1. Accounting For Uncertainty in The Application Of High Throughput Datasets

    EPA Science Inventory

    The use of high throughput screening (HTS) datasets will need to adequately account for uncertainties in the data generation process and propagate these uncertainties through to ultimate use. Uncertainty arises at multiple levels in the construction of predictors using in vitro ...

  2. Advances in High-Throughput Speed, Low-Latency Communication for Embedded Instrumentation (7th Annual SFAF Meeting, 2012)

    ScienceCinema

    Jordan, Scott

    2018-01-24

    Scott Jordan on "Advances in high-throughput speed, low-latency communication for embedded instrumentation" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  3. Inter-Individual Variability in High-Throughput Risk Prioritization of Environmental Chemicals (Sot)

    EPA Science Inventory

    We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have...

  4. Inter-individual variability in high-throughput risk prioritization of environmental chemicals (IVIVE)

    EPA Science Inventory

    We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which hav...

  5. Development of a thyroperoxidase inhibition assay for high-throughput screening

    EPA Science Inventory

    High-throughput screening (HTPS) assays to detect inhibitors of thyroperoxidase (TPO), the enzymatic catalyst for thyroid hormone (TH) synthesis, are not currently available. Herein we describe the development of a HTPS TPO inhibition assay. Rat thyroid microsomes and a fluores...

  6. High-throughput screening, predictive modeling and computational embryology - Abstract

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  7. Evaluating and Refining High Throughput Tools for Toxicokinetics

    EPA Science Inventory

    This poster summarizes efforts of the Chemical Safety for Sustainability's Rapid Exposure and Dosimetry (RED) team to facilitate the development and refinement of toxicokinetics (TK) tools to be used in conjunction with the high throughput toxicity testing data generated as a par...

  8. Picking Cell Lines for High-Throughput Transcriptomic Toxicity Screening (SOT)

    EPA Science Inventory

    High throughput, whole genome transcriptomic profiling is a promising approach to comprehensively evaluate chemicals for potential biological effects. To be useful for in vitro toxicity screening, gene expression must be quantified in a set of representative cell types that captu...

  9. 20180312 - Uncertainty and Variability in High-Throughput Toxicokinetics for Risk Prioritization (SOT)

    EPA Science Inventory

    Streamlined approaches that use in vitro experimental data to predict chemical toxicokinetics (TK) are increasingly being used to perform risk-based prioritization based upon dosimetric adjustment of high-throughput screening (HTS) data across thousands of chemicals. However, ass...

  10. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    PubMed Central

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver a good performance and could tolerate the addition of inorganic salts and many nutrient components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
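
    Converting the microplate readout into an adenosine concentration is typically done through a standard curve; the sketch below (a linear numpy fit with invented absorbance values, given only as an illustration of that step) inverts such a calibration for a sample well.

        # Toy standard-curve quantification for an ADA/indophenol adenosine assay.
        import numpy as np

        standards_mM = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # adenosine standards, mM
        absorbance   = np.array([0.05, 0.21, 0.38, 0.72, 1.40])  # hypothetical indophenol readings

        slope, intercept = np.polyfit(standards_mM, absorbance, 1)

        def adenosine_mM(a_sample):
            # Invert the linear calibration to convert a well's absorbance to concentration.
            return (a_sample - intercept) / slope

        print(f"sample absorbance 0.55 -> {adenosine_mM(0.55):.2f} mM adenosine")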

  11. Is Virtual Reality Ready for Prime Time in the Medical Space? A Randomized Control Trial of Pediatric Virtual Reality for Acute Procedural Pain Management.

    PubMed

    Gold, Jeffrey I; Mahrer, Nicole E

    2018-04-01

    To conduct a randomized control trial to evaluate the feasibility and efficacy of virtual reality (VR) compared with standard of care (SOC) for reducing pain, anxiety, and improving satisfaction associated with blood draw in children ages 10-21 years. In total, 143 triads (patients, their caregiver, and the phlebotomist) were recruited in outpatient phlebotomy at a pediatric hospital and randomized to receive either VR or SOC when undergoing routine blood draw. Patients and caregivers completed preprocedural and postprocedural standardized measures of pain, anxiety, and satisfaction, and phlebotomists reported about the patient's experience during the procedure. Findings showed that VR significantly reduced acute procedural pain and anxiety compared with SOC. A significant interaction between patient-reported anxiety sensitivity and treatment condition indicated that patients undergoing routine blood draw benefit more from VR intervention when they are more fearful of physiological sensations related to anxiety. Patients and caregivers in the VR condition reported high levels of satisfaction with the procedure. VR is feasible, tolerated, and well-liked by patients, caregivers, and phlebotomists alike for routine blood draw. Given the immersive and engaging nature of the VR experience, VR has the capacity to act as a preventive intervention transforming the blood draw experience into a less distressing, potentially pain-free routine medical procedure, particularly for pediatric patients with high anxiety sensitivity. VR holds promise to reduce negative health outcomes for children and reduce distress in caregivers, while facilitating increased satisfaction and throughput in hectic outpatient phlebotomy clinics.

  12. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    PubMed

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amount of data which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expression, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp).
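
    For comparison, the kind of merge-and-filter task HTDP exposes through its GUI can also be expressed as a short script; the pandas sketch below (hypothetical file and column names, not part of HTDP itself) filters a variant table against an external gene list and merges in values from a second table.

        # Illustrative tab-delimited filter/merge of the kind HTDP performs via its GUI.
        import pandas as pd

        variants = pd.read_csv("variants.tsv", sep="\t")      # e.g. columns CHROM, POS, GENE, IMPACT
        expression = pd.read_csv("expression.tsv", sep="\t")  # e.g. columns GENE, LOG2FC

        # External criteria file: one gene symbol per line.
        with open("genes_of_interest.txt") as handle:
            genes = {line.strip() for line in handle if line.strip()}

        filtered = variants[variants["GENE"].isin(genes) & (variants["IMPACT"] == "HIGH")]
        filtered.merge(expression, on="GENE", how="left").to_csv(
            "filtered_merged.tsv", sep="\t", index=False)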

  13. High-Throughput Tabular Data Processor – Platform independent graphical tool for processing large data sets

    PubMed Central

    Bałut, Magdalena; Buckley, Patrick G.; Ochocka, J. Renata; Bartoszewski, Rafał; Crossman, David K.; Messiaen, Ludwine M.; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amount of data which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expression, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp). PMID:29432475

  14. Evaluation of commercial DNA and RNA extraction methods for high-throughput sequencing of FFPE samples.

    PubMed

    Kresse, Stine H; Namløs, Heidi M; Lorenz, Susanne; Berner, Jeanne-Marie; Myklebost, Ola; Bjerkehagen, Bodil; Meza-Zepeda, Leonardo A

    2018-01-01

    Nucleic acid material of adequate quality is crucial for successful high-throughput sequencing (HTS) analysis. DNA and RNA isolated from archival FFPE material are frequently degraded and not readily amplifiable due to chemical damage introduced during fixation. To identify optimal nucleic acid extraction kits, DNA and RNA quantity, quality and performance in HTS applications were evaluated. DNA and RNA were isolated from five sarcoma archival FFPE blocks, using eight extraction protocols from seven kits from three different commercial vendors. For DNA extraction, the truXTRAC FFPE DNA kit from Covaris gave higher yields and better amplifiable DNA, but all protocols gave comparable HTS library yields using Agilent SureSelect XT and performed well in downstream variant calling. For RNA extraction, all protocols gave comparable yields and amplifiable RNA. However, for fusion gene detection using the Archer FusionPlex Sarcoma Assay, the truXTRAC FFPE RNA kit from Covaris and Agencourt FormaPure kit from Beckman Coulter showed the highest percentage of unique read-pairs, providing higher complexity of HTS data and more frequent detection of recurrent fusion genes. truXTRAC simultaneous DNA and RNA extraction gave similar outputs as individual protocols. These findings show that although successful HTS libraries could be generated in most cases, the different protocols gave variable quantity and quality for FFPE nucleic acid extraction. Selecting the optimal procedure is highly valuable and may generate results in borderline quality specimens.

  15. A comparison of high-throughput techniques for assaying circadian rhythms in plants.

    PubMed

    Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony

    2015-01-01

    Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.

  16. Turning tumor-promoting copper into an anti-cancer weapon via high-throughput chemistry.

    PubMed

    Wang, F; Jiao, P; Qi, M; Frezza, M; Dou, Q P; Yan, B

    2010-01-01

    Copper is an essential element for multiple biological processes. Its concentration is elevated to a very high level in cancer tissues, where it promotes cancer development through processes such as angiogenesis. Organic chelators of copper can passively reduce cellular copper and thereby serve as inhibitors of angiogenesis. However, they can also actively attack cellular targets such as the proteasome, which plays a critical role in cancer development and survival. The discovery of such molecules initially relied on step-by-step synthesis followed by biological assays. Today, high-throughput chemistry and high-throughput screening have significantly expedited the discovery of copper-binding molecules, turning "cancer-promoting" copper into anti-cancer agents.

  17. HTP-OligoDesigner: An Online Primer Design Tool for High-Throughput Gene Cloning and Site-Directed Mutagenesis.

    PubMed

    Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor

    2016-01-01

    Following the burgeoning of genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain a large number of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-prone step, particularly when working with hundreds of targets, automation of the primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis, together with a Tm calculator for quick queries.
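
    The record does not describe the Tm calculation used by HTP-OligoDesigner; purely as a hedged illustration of the kind of quick-query estimate such a tool might offer, the Python sketch below applies the Wallace rule for short oligos and a simple GC-content formula for longer ones (the function name and the length cutoff are assumptions, not the HTP-OligoDesigner algorithm).

    ```python
    def melting_temp(primer: str) -> float:
        """Approximate primer melting temperature (deg C).

        Wallace rule (2 degC per A/T, 4 degC per G/C) for short oligos,
        simple GC-content formula for longer ones. Illustrative only;
        not the algorithm used by HTP-OligoDesigner.
        """
        seq = primer.upper()
        a, t = seq.count("A"), seq.count("T")
        g, c = seq.count("G"), seq.count("C")
        if len(seq) < 14:                       # Wallace rule for short primers
            return 2 * (a + t) + 4 * (g + c)
        # GC-content based estimate for longer primers
        return 64.9 + 41 * (g + c - 16.4) / len(seq)

    print(round(melting_temp("ATGGCTAGCTAGGCTA"), 1))
    ```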

  18. High-throughput continuous hydrothermal synthesis of nanomaterials (part II): unveiling the as-prepared CexZryYzO2-δ phase diagram.

    PubMed

    Quesada-Cabrera, Raul; Weng, Xiaole; Hyett, Geoff; Clark, Robin J H; Wang, Xue Z; Darr, Jawwad A

    2013-09-09

    High-throughput continuous hydrothermal flow synthesis was used to manufacture 66 unique nanostructured oxide samples in the Ce-Zr-Y-O system. This synthesis approach resulted in a significant increase in throughput compared to that of conventional batch or continuous hydrothermal synthesis methods. The as-prepared library samples were placed into a wellplate for both automated high-throughput powder X-ray diffraction and Raman spectroscopy data collection, which allowed comprehensive structural characterization and phase mapping. The data suggested that a continuous cubic-like phase field connects all three Ce-Zr-O, Ce-Y-O, and Y-Zr-O binary systems together with a smooth and steady transition between the structures of neighboring compositions. The continuous hydrothermal process led to as-prepared crystallite sizes in the range of 2-7 nm (as determined by using the Scherrer equation).
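
    The crystallite sizes quoted above were obtained from the Scherrer equation; for reference, its standard form (with the commonly used shape factor K ≈ 0.9, β the instrument-corrected peak full width at half maximum in radians, λ the X-ray wavelength and θ the Bragg angle) is:

    ```latex
    D = \frac{K\,\lambda}{\beta\,\cos\theta}
    ```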

  19. State of the Art High-Throughput Approaches to Genotoxicity: Flow Micronucleus, Ames II, GreenScreen and Comet

    EPA Pesticide Factsheets

    State of the Art High-Throughput Approaches to Genotoxicity: Flow Micronucleus, Ames II, GreenScreen and Comet (presented by Dr. Marilyn J. Aardema, Chief Scientific Advisor, Toxicology; Dr. Leon Stankowski; et al.) (6/28/2012)

  20. Fun with High Throughput Toxicokinetics (CalEPA webinar)

    EPA Science Inventory

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  1. Incorporating Human Dosimetry and Exposure into High-Throughput In Vitro Toxicity Screening

    EPA Science Inventory

    Many chemicals in commerce today have undergone limited or no safety testing. To reduce the number of untested chemicals and prioritize limited testing resources, several governmental programs are using high-throughput in vitro screens for assessing chemical effects across multip...

  2. Environmental Impact on Vascular Development Predicted by High Throughput Screening

    EPA Science Inventory

    Understanding health risks to embryonic development from exposure to environmental chemicals is a significant challenge given the diverse chemical landscape and paucity of data for most of these compounds. High throughput screening (HTS) in EPA’s ToxCast™ project provides vast d...

  3. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    EPA Science Inventory

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  4. AOPs and Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making

    EPA Science Inventory

    As high throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will b...

  5. Human variability in high-throughput risk prioritization of environmental chemicals (Texas AM U. webinar)

    EPA Science Inventory

    We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which hav...

  6. HTTK: R Package for High-Throughput Toxicokinetics

    EPA Science Inventory

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...
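
    The httk package itself is implemented in R and is not reproduced here; purely as an illustration of the simplest calculation such toxicokinetic tools perform, the Python sketch below gives the textbook steady-state plasma concentration for a one-compartment model under constant oral dosing (all parameter values and the helper name are hypothetical).

    ```python
    def css_one_compartment(dose_mg_per_kg_day: float,
                            clearance_l_per_h_per_kg: float,
                            fraction_absorbed: float = 1.0) -> float:
        """Steady-state plasma concentration (mg/L) for a one-compartment model.

        Css = absorbed dose rate / clearance; a standard textbook relation,
        not the httk implementation.
        """
        dose_rate_mg_per_kg_h = dose_mg_per_kg_day * fraction_absorbed / 24.0
        return dose_rate_mg_per_kg_h / clearance_l_per_h_per_kg

    # Hypothetical chemical: 1 mg/kg/day dose, clearance 0.05 L/h/kg
    print(round(css_one_compartment(1.0, 0.05), 3), "mg/L")
    ```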

  7. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    EPA Science Inventory

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...
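
    tcpl is an R package, so the following is not its API; it is only a minimal Python sketch of the concentration-response (Hill) fitting step that such screening pipelines perform, using made-up assay values.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, top, ac50, slope):
        """Hill concentration-response curve (response rises from 0 to `top`)."""
        return top / (1.0 + (ac50 / conc) ** slope)

    # Hypothetical concentration series (uM) and normalized responses
    conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
    resp = np.array([2.0, 5.0, 12.0, 35.0, 70.0, 92.0, 98.0])

    params, _ = curve_fit(hill, conc, resp, p0=[100.0, 5.0, 1.0])
    top, ac50, slope = params
    print(f"top={top:.1f}  AC50={ac50:.2f} uM  slope={slope:.2f}")
    ```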

  8. High-throughput screening, predictive modeling and computational embryology

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  9. In Vitro Toxicity Screening Technique for Volatile Substances Using Flow-Through System#

    EPA Science Inventory

    In 2007 the National Research Council envisioned the need for inexpensive, high-throughput, cell-based toxicity testing methods relevant to human health. High-throughput screening (HTS) in vitro approaches have addressed these problems by using robotics. However, the cha...

  10. Integration of Dosimetry, Exposure and High-Throughput Screening Data in Chemical Toxicity Assessment

    EPA Science Inventory

    High-throughput in vitro toxicity screening can provide an efficient way to identify potential biological targets for chemicals. However, relying on nominal assay concentrations may misrepresent potential in vivo effects of these chemicals due to differences in bioavailability, c...

  11. High-Throughput Toxicokinetics (HTTK) R package (CompTox CoP presentation)

    EPA Science Inventory

    Toxicokinetics (TK) provides a bridge between HTS and HTE by predicting tissue concentrations due to exposure, but traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to determine range of effic...

  12. A review of the theory, methods and recent applications of high-throughput single-cell droplet microfluidics

    NASA Astrophysics Data System (ADS)

    Lagus, Todd P.; Edd, Jon F.

    2013-03-01

    Most cell biology experiments are performed in bulk cell suspensions where cell secretions become diluted and mixed in a contiguous sample. Confinement of single cells to small, picoliter-sized droplets within a continuous phase of oil provides chemical isolation of each cell, creating individual microreactors where rare cell qualities are highlighted and otherwise undetectable signals can be concentrated to measurable levels. Recent work in microfluidics has yielded methods for the encapsulation of cells in aqueous droplets and hydrogels at kilohertz rates, creating the potential for millions of parallel single-cell experiments. However, commercial applications of high-throughput microdroplet generation and downstream sensing and actuation methods are still emerging for cells. Using fluorescence-activated cell sorting (FACS) as a benchmark for commercially available high-throughput screening, this focused review discusses the fluid physics of droplet formation, methods for cell encapsulation in liquids and hydrogels, sensors and actuators and notable biological applications of high-throughput single-cell droplet microfluidics.

  13. High-Throughput Cloning and Expression Library Creation for Functional Proteomics

    PubMed Central

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-01-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047

  14. High-Throughput Lectin Microarray-Based Analysis of Live Cell Surface Glycosylation

    PubMed Central

    Li, Yu; Tao, Sheng-ce; Zhu, Heng; Schneck, Jonathan P.

    2011-01-01

    Lectins, plant-derived glycan-binding proteins, have long been used to detect glycans on cell surfaces. However, the techniques used to characterize serum or cells have largely been limited to mass spectrometry, blots, flow cytometry, and immunohistochemistry. While these lectin-based approaches are well established and they can discriminate a limited number of sugar isomers by concurrently using a limited number of lectins, they are not amenable for adaptation to a high-throughput platform. Fortunately, given the commercial availability of lectins with a variety of glycan specificities, lectins can be printed on a glass substrate in a microarray format to profile accessible cell-surface glycans. This method is an inviting alternative for analysis of a broad range of glycans in a high-throughput fashion and has been demonstrated to be a feasible method of identifying binding-accessible cell surface glycosylation on living cells. The current unit presents a lectin-based microarray approach for analyzing cell surface glycosylation in a high-throughput fashion. PMID:21400689

  15. Next-generation sequencing coupled with a cell-free display technology for high-throughput production of reliable interactome data

    PubMed Central

    Fujimori, Shigeo; Hirai, Naoya; Ohashi, Hiroyuki; Masuoka, Kazuyo; Nishikimi, Akihiko; Fukui, Yoshinori; Washio, Takanori; Oshikubo, Tomohiro; Yamashita, Tatsuhiro; Miyamoto-Sato, Etsuko

    2012-01-01

    Next-generation sequencing (NGS) has been applied to various kinds of omics studies, resulting in many biological and medical discoveries. However, high-throughput protein-protein interactome datasets derived from detection by sequencing are scarce, because protein-protein interaction analysis requires many cell manipulations to examine the interactions. The low reliability of the high-throughput data is also a problem. Here, we describe a cell-free display technology combined with NGS that can improve both the coverage and reliability of interactome datasets. The completely cell-free method gives a high-throughput and a large detection space, testing the interactions without using clones. The quantitative information provided by NGS reduces the number of false positives. The method is suitable for the in vitro detection of proteins that interact not only with the bait protein, but also with DNA, RNA and chemical compounds. Thus, it could become a universal approach for exploring the large space of protein sequences and interactome networks. PMID:23056904

  16. Advanced Virus Detection Technologies Interest Group (AVDTIG): Efforts on High Throughput Sequencing (HTS) for Virus Detection.

    PubMed

    Khan, Arifa S; Vacante, Dominick A; Cassart, Jean-Pol; Ng, Siemon H S; Lambert, Christophe; Charlebois, Robert L; King, Kathryn E

    Several nucleic-acid based technologies have recently emerged with capabilities for broad virus detection. One of these, high throughput sequencing, has the potential for novel virus detection because this method does not depend upon prior viral sequence knowledge. However, the use of high throughput sequencing for testing biologicals poses greater challenges as compared to other newly introduced tests due to its technical complexities and big data bioinformatics. Thus, the Advanced Virus Detection Technologies Users Group was formed as a joint effort by regulatory and industry scientists to facilitate discussions and provide a forum for sharing data and experiences using advanced new virus detection technologies, with a focus on high throughput sequencing technologies. The group was initiated as a task force that was coordinated by the Parenteral Drug Association and subsequently became the Advanced Virus Detection Technologies Interest Group to continue efforts for using new technologies for detection of adventitious viruses with broader participation, including international government agencies, academia, and technology service providers. © PDA, Inc. 2016.

  17. The application of the high throughput sequencing technology in the transposable elements.

    PubMed

    Liu, Zhen; Xu, Jian-hong

    2015-09-01

    High throughput sequencing technology has dramatically improved the efficiency of DNA sequencing, and decreased the costs to a great extent. Meanwhile, this technology usually has advantages of better specificity, higher sensitivity and accuracy. Therefore, it has been applied to the research on genetic variations, transcriptomics and epigenomics. Recently, this technology has been widely employed in the studies of transposable elements and has achieved fruitful results. In this review, we summarize the application of high throughput sequencing technology in the fields of transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfers as well as transposon tagging. We also briefly introduce the major common sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we envision the developing trends of high throughput sequencing technology, especially the third generation sequencing technology, and its application in transposon studies in the future, hopefully providing a comprehensive understanding and reference for related scientific researchers.

  18. Near-common-path interferometer for imaging Fourier-transform spectroscopy in wide-field microscopy

    PubMed Central

    Wadduwage, Dushan N.; Singh, Vijay Raj; Choi, Heejin; Yaqoob, Zahid; Heemskerk, Hans; Matsudaira, Paul; So, Peter T. C.

    2017-01-01

    Imaging Fourier-transform spectroscopy (IFTS) is a powerful method for biological hyperspectral analysis based on various imaging modalities, such as fluorescence or Raman. Since the measurements are taken in the Fourier space of the spectrum, it can also take advantage of compressed sensing strategies. IFTS has been readily implemented in high-throughput, high-content microscope systems based on wide-field imaging modalities. However, there are limitations in existing wide-field IFTS designs. Non-common-path approaches are less phase-stable. Alternatively, designs based on the common-path Sagnac interferometer are stable, but incompatible with high-throughput imaging. They require exhaustive sequential scanning over large interferometric path delays, making compressive strategic data acquisition impossible. In this paper, we present a novel phase-stable, near-common-path interferometer enabling high-throughput hyperspectral imaging based on strategic data acquisition. Our results suggest that this approach can improve throughput over those of many other wide-field spectral techniques by more than an order of magnitude without compromising phase stability. PMID:29392168

  19. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    PubMed

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  20. Graph-based signal integration for high-throughput phenotyping

    PubMed Central

    2012-01-01

    Background Electronic Health Records aggregated in Clinical Data Warehouses (CDWs) promise to revolutionize Comparative Effectiveness Research and suggest new avenues of research. However, the effectiveness of CDWs is diminished by the lack of properly labeled data. We present a novel approach that integrates knowledge from the CDW, the biomedical literature, and the Unified Medical Language System (UMLS) to perform high-throughput phenotyping. In this paper, we automatically construct a graphical knowledge model and then use it to phenotype breast cancer patients. We compare the performance of this approach to using MetaMap when labeling records. Results MetaMap's overall accuracy at identifying breast cancer patients was 51.1% (n=428); recall=85.4%, precision=26.2%, and F1=40.1%. Our unsupervised graph-based high-throughput phenotyping had accuracy of 84.1%; recall=46.3%, precision=61.2%, and F1=52.8%. Conclusions We conclude that our approach is a promising alternative for unsupervised high-throughput phenotyping. PMID:23320851
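
    For reference, the precision, recall and F1 figures quoted above are related by the standard definitions (e.g. for MetaMap, 2 × 0.262 × 0.854 / (0.262 + 0.854) ≈ 0.401):

    ```latex
    \mathrm{precision} = \frac{TP}{TP+FP}, \qquad
    \mathrm{recall} = \frac{TP}{TP+FN}, \qquad
    F_1 = \frac{2\,\mathrm{precision}\cdot\mathrm{recall}}{\mathrm{precision}+\mathrm{recall}}
    ```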

  1. High-throughput cloning and expression library creation for functional proteomics.

    PubMed

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-05-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible, and reliable cloning systems. These collections of ORF clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial, we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This tutorial is part of the International Proteomics Tutorial Programme (IPTP12). © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Genome-scale deletion screening of human long non-coding RNAs using a paired-guide RNA CRISPR library

    PubMed Central

    Zhu, Shiyou; Li, Wei; Liu, Jingze; Chen, Chen-Hao; Liao, Qi; Xu, Ping; Xu, Han; Xiao, Tengfei; Cao, Zhongzheng; Peng, Jingyu; Yuan, Pengfei; Brown, Myles; Liu, Xiaole Shirley; Wei, Wensheng

    2017-01-01

    CRISPR/Cas9 screens have been widely adopted to analyse coding gene functions, but high throughput screening of non-coding elements using this method is more challenging, because indels caused by a single cut in non-coding regions are unlikely to produce a functional knockout. A high-throughput method to produce deletions of non-coding DNA is needed. Herein, we report a high throughput genomic deletion strategy to screen for functional long non-coding RNAs (lncRNAs) that is based on a lentiviral paired-guide RNA (pgRNA) library. Applying our screening method, we identified 51 lncRNAs that can positively or negatively regulate human cancer cell growth. We individually validated 9 lncRNAs using CRISPR/Cas9-mediated genomic deletion and functional rescue, CRISPR activation or inhibition, and gene expression profiling. Our high-throughput pgRNA genome deletion method should enable rapid identification of functional mammalian non-coding elements. PMID:27798563

  3. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection

    PubMed Central

    Choudhry, Priya

    2016-01-01

    Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods, in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays. PMID:26848849
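
    The published tools are an ImageJ macro and a CellProfiler pipeline; the Python sketch below only illustrates the same edge-detection, fill and label idea using scikit-image (the file name and the 50-pixel size cutoff are hypothetical, and this is not the authors' code).

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage import io, filters, measure

    # Load a grayscale colony image (hypothetical file name)
    image = io.imread("colony_plate.png", as_gray=True)

    # Edge detection followed by thresholding of the edge magnitude
    edges = filters.sobel(image)
    mask = edges > filters.threshold_otsu(edges)

    # Close colony outlines into filled objects and label them
    filled = ndi.binary_fill_holes(mask)
    labels = measure.label(filled)

    # Discard tiny objects (hypothetical 50-pixel cutoff) and count the rest
    colonies = [r for r in measure.regionprops(labels) if r.area >= 50]
    print(f"{len(colonies)} colonies detected")
    for r in colonies[:5]:
        print(f"  area={r.area} px  centroid={tuple(round(c, 1) for c in r.centroid)}")
    ```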

  4. High-throughput determination of structural phase diagram and constituent phases using GRENDEL

    NASA Astrophysics Data System (ADS)

    Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.

    2015-11-01

    Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. To convert this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically, to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data, as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. The Sunburst radial tree map is also demonstrated as a tool to visualize material structure-property relationships found through graph-based analysis.

  5. High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics

    NASA Astrophysics Data System (ADS)

    Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine

    2016-06-01

    Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10^4-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with a low cost, space and time footprint and should bring enormous benefits for improving the viability of biotechnological processes.

  6. Development of a high-throughput crystal structure-determination platform for JAK1 using a novel metal-chelator soaking system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caspers, Nicole L.; Han, Seungil; Rajamohan, Francis

    2016-10-27

    Crystals of phosphorylated JAK1 kinase domain were initially generated in complex with nucleotide (ADP) and magnesium. The tightly bound Mg2+-ADP at the ATP-binding site proved recalcitrant to ligand displacement. Addition of a molar excess of EDTA helped to dislodge the divalent metal ion, promoting the release of ADP and allowing facile exchange with ATP-competitive small-molecule ligands. Many kinases require the presence of a stabilizing ligand in the ATP site for crystallization. This procedure could be useful for developing co-crystallization systems with an exchangeable ligand to enable structure-based drug design of other protein kinases.

  7. In Situ Hi-C Library Preparation for Plants to Study Their Three-Dimensional Chromatin Interactions on a Genome-Wide Scale.

    PubMed

    Liu, Chang

    2017-01-01

    The spatial organization of the genome in the nucleus is critical for many cellular processes. It has been broadly accepted that the packing of chromatin inside the nucleus is not random, but structured at several hierarchical levels. The Hi-C method combines Chromatin Conformation Capture and high-throughput sequencing, which allows interrogating genome-wide chromatin interactions. Depending on the sequencing depth, chromatin packing patterns derived from Hi-C experiments can be viewed on a chromosomal scale or at a local genic level. Here, I describe a protocol of plant in situ Hi-C library preparation, which covers procedures starting from tissue fixation to library amplification.

  8. Topography and refractometry of nanostructures using spatial light interference microscopy.

    PubMed

    Wang, Zhuo; Chun, Ik Su; Li, Xiuling; Ong, Zhun-Yong; Pop, Eric; Millet, Larry; Gillette, Martha; Popescu, Gabriel

    2010-01-15

    Spatial light interference microscopy (SLIM) is a novel method developed in our laboratory that provides quantitative phase images of transparent structures with a 0.3 nm spatial and 0.03 nm temporal accuracy owing to the white light illumination and its common path interferometric geometry. We exploit these features and demonstrate SLIM's ability to perform topography at a single atomic layer in graphene. Further, using a decoupling procedure that we developed for cylindrical structures, we extract the axially averaged refractive index of semiconductor nanotubes and a neurite of a live hippocampal neuron in culture. We believe that this study will set the basis for novel high-throughput topography and refractometry of man-made and biological nanostructures.

  9. Anticancer drug discovery and pharmaceutical chemistry: a history.

    PubMed

    Braña, Miguel F; Sánchez-Migallón, Ana

    2006-10-01

    There are several procedures for the chemical discovery and design of new drugs from the point of view of pharmaceutical or medicinal chemistry. They range from classical methods to the very newest, such as molecular modeling or high-throughput screening. In this review, we will consider some historical approaches based on the screening of natural products, chance discoveries, the systematic screening of new chemical entities and serendipity. Another group comprises rational design, as in the case of metabolic pathways and conformation versus configuration, and, finally, a brief description of newly available targets is given. In each approach, the structures of some examples of clinical interest will be shown.

  10. A simple dual online ultra-high pressure liquid chromatography system (sDO-UHPLC) for high throughput proteome analysis.

    PubMed

    Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won

    2015-08-21

    We report a new and simple design of a fully automated dual-online ultra-high pressure liquid chromatography system. The system employs only two nano-volume switching valves (a two-position four port valve and a two-position ten port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features that include high duty cycle, back flushing sample injection for fast and narrow zone sample injection, online desalting, high separation resolution and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high throughput deep proteome profiling experiments but also in high throughput MRM experiments.

  11. Optimization and high-throughput screening of antimicrobial peptides.

    PubMed

    Blondelle, Sylvie E; Lohner, Karl

    2010-01-01

    While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries, combined with easy and less expensive access to new technologies, has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple-target assays or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope for discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving peptide selectivity.

  12. Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.

    PubMed

    Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen

    2018-07-20

    Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals which attracts a lot of attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) and the cell-killing capacity of highly cytotoxic small molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods are scarcely used so far in ADC development. In this work an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented by use of a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by utilization of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established including feedback to the process. For conjugate characterization, a high-throughput compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate the efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development efforts and material demands and shorten development timelines for antibody-drug conjugates. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. High-throughput combinatorial chemical bath deposition: The case of doping Cu (In, Ga) Se film with antimony

    NASA Astrophysics Data System (ADS)

    Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong

    2018-01-01

    Conventional wet-process methods for designing and preparing thin films are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin-film preparation based on chemical bath deposition (CBD). The method is well suited to preparing combinatorial material libraries that have low decomposition temperatures and high water or oxygen sensitivity at relatively high temperature. To test this system, a Cu(In, Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% antimony (Sb) was taken as an example to evaluate systematically the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS films. Combined with energy-dispersive spectrometry (EDS), X-ray photoelectron spectroscopy (XPS), automated X-ray diffraction (XRD) for rapid screening and localized electrochemical impedance spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can identify the composition with the optimal grain orientation, microstructure and electrical properties by accurately monitoring the doping content and material composition. Based on the characterization results, an Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. Beyond the CIGS films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin-film material systems.

  14. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    PubMed

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible, compared with next-generation sequencing or ChIP-based methods. However, the PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by the multiplex primer interactions. The detection throughput cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which showed that ME-qPCR sensitivity is higher than that of the original qPCR. The absolute limit of detection for ME-qPCR could achieve levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract: For the first-step amplification, four primers (A, B, C, and D) have been added into the reaction volume. In this manner, four kinds of amplicons have been generated. All of these four amplicons could be regarded as the target of second-step PCR. For the second-step amplification, three parallels have been taken for the final evaluation. After the second evaluation, the final amplification curves and melting curves have been achieved.

  15. High-speed two-dimensional laser scanner based on Bragg gratings stored in photothermorefractive glass.

    PubMed

    Yaqoob, Zahid; Arain, Muzammil A; Riza, Nabeel A

    2003-09-10

    A high-speed free-space wavelength-multiplexed optical scanner with high-speed wavelength selection coupled with narrowband volume Bragg gratings stored in photothermorefractive (PTR) glass is reported. The proposed scanner with no moving parts has a modular design with a wide angular scan range, accurate beam pointing, low scanner insertion loss, and two-dimensional beam scan capabilities. We present a complete analysis and design procedure for storing multiple tilted Bragg-grating structures in a single PTR glass volume (for normal incidence) in an optimal fashion. Because the scanner design is modular, many PTR glass volumes (each having multiple tilted Bragg-grating structures) can be stacked together, providing an efficient throughput with operations in both the visible and the infrared (IR) regions. A proof-of-concept experimental study is conducted with four Bragg gratings in independent PTR glass plates, and both visible and IR region scanner operations are demonstrated.

  16. The Development of Protein Microarrays and Their Applications in DNA-Protein and Protein-Protein Interaction Analyses of Arabidopsis Transcription Factors

    PubMed Central

    Gong, Wei; He, Kun; Covington, Mike; Dinesh-Kumar, S. P.; Snyder, Michael; Harmer, Stacey L.; Zhu, Yu-Xian; Deng, Xing Wang

    2009-01-01

    We used our collection of Arabidopsis transcription factor (TF) ORFeome clones to construct protein microarrays containing as many as 802 TF proteins. These protein microarrays were used for both protein-DNA and protein-protein interaction analyses. For protein-DNA interaction studies, we examined AP2/ERF family TFs and their cognate cis-elements. By careful comparison of the DNA-binding specificity of 13 TFs on the protein microarray with previous non-microarray data, we showed that protein microarrays provide an efficient and high throughput tool for genome-wide analysis of TF-DNA interactions. This microarray protein-DNA interaction analysis allowed us to derive a comprehensive view of DNA-binding profiles of AP2/ERF family proteins in Arabidopsis. It also revealed four TFs that bound the EE (evening element) and had the expected phased gene expression under clock-regulation, thus providing a basis for further functional analysis of their roles in clock regulation of gene expression. We also developed procedures for detecting protein interactions using this TF protein microarray and discovered four novel partners that interact with HY5, which can be validated by yeast two-hybrid assays. Thus, plant TF protein microarrays offer an attractive high-throughput alternative to traditional techniques for TF functional characterization on a global scale. PMID:19802365

  17. Differential gene and transcript expression analysis of RNA-seq experiments with TopHat and Cufflinks

    PubMed Central

    Trapnell, Cole; Roberts, Adam; Goff, Loyal; Pertea, Geo; Kim, Daehwan; Kelley, David R; Pimentel, Harold; Salzberg, Steven L; Rinn, John L; Pachter, Lior

    2012-01-01

    Recent advances in high-throughput cDNA sequencing (RNA-seq) can reveal new genes and splice variants and quantify expression genome-wide in a single assay. The volume and complexity of data from RNA-seq experiments necessitate scalable, fast and mathematically principled analysis software. TopHat and Cufflinks are free, open-source software tools for gene discovery and comprehensive expression analysis of high-throughput mRNA sequencing (RNA-seq) data. Together, they allow biologists to identify new genes and new splice variants of known ones, as well as compare gene and transcript expression under two or more conditions. This protocol describes in detail how to use TopHat and Cufflinks to perform such analyses. It also covers several accessory tools and utilities that aid in managing data, including CummeRbund, a tool for visualizing RNA-seq analysis results. Although the procedure assumes basic informatics skills, these tools assume little to no background with RNA-seq analysis and are meant for novices and experts alike. The protocol begins with raw sequencing reads and produces a transcriptome assembly, lists of differentially expressed and regulated genes and transcripts, and publication-quality visualizations of analysis results. The protocol's execution time depends on the volume of transcriptome sequencing data and available computing resources but takes less than 1 d of computer time for typical experiments and ~1 h of hands-on time. PMID:22383036
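
    Cufflinks reports transcript abundance as FPKM (fragments per kilobase of transcript per million mapped fragments); for orientation, the usual definition for transcript i with q_i mapped fragments, length l_i in bases, and N total mapped fragments is:

    ```latex
    \mathrm{FPKM}_i \;=\; \frac{q_i}{\left(l_i/10^{3}\right)\left(N/10^{6}\right)} \;=\; \frac{q_i \cdot 10^{9}}{l_i\,N}
    ```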

  18. High-Throughput Nuclear Magnetic Resonance Metabolomic Footprinting for Tissue Engineering

    PubMed Central

    Seagle, Christopher; Christie, Megan A.; Winnike, Jason H.; McClelland, Randall E.; Ludlow, John W.; O'Connell, Thomas M.; Gamcsik, Michael P.

    2008-01-01

    Abstract We report a high-throughput (HTP) nuclear magnetic resonance (NMR) method for analysis of media components and a metabolic schematic to help easily interpret the data. Spin-lattice relaxation values and concentrations were measured for 19 components and 2 internal referencing agents in pure and 2-day conditioned, hormonally defined media from a 3-dimensional (3D) multicoaxial human bioartificial liver (BAL). The 1H NMR spectral signal-to-noise ratio is 21 for 0.16 mM alanine in medium and is obtained in 12 min using a 400 MHz NMR spectrometer. For comparison, 2D gel cultures and 3D multicoaxial BALs were batch cultured, with medium changed every day for 15 days after inoculation with human liver cells in Matrigel–collagen type 1 gels. Glutamine consumption was higher by day 8 in the BAL than in 2D culture; lactate production was lower through the 15-day culture period. Alanine was the primary amino acid produced and tracked with lactate or urea production. Glucose and pyruvate consumption were similar in the BAL and 2D cultures. NMR analysis permits quality assurance of the bioreactor by identifying contaminants. Ethanol was observed because of a bioreactor membrane “wetting” procedure. A biochemical scheme is presented illustrating bioreactor metabolomic footprint results and demonstrating how this can be translated to modify bioreactor operational parameters or quality assurance issues. PMID:18544027

  19. Evaluation of Flow-Injection Tandem Mass Spectrometry for Rapid and High-Throughput Quantitative Determination of B-Vitamins in Nutritional Supplements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhandari, Deepak; Van Berkel, Gary J

    2012-01-01

    The use of flow-injection electrospray ionization tandem mass spectrometry for rapid and high-throughput mass spectral analysis of selected B-vitamins, viz. B1, B2, B3, B5, and B6, in nutritional formulations was demonstrated. A simple and rapid (~5 min) in-tube sample preparation was performed by adding extraction solvent to a powdered sample aliquot followed by agitation, centrifugation, and filtration to recover an extract for analysis. Automated flow injection introduced 1 µL of the extracts directly into the mass spectrometer ion source without chromatographic separation. Sample-to-sample analysis time was 60 s, representing a significant improvement over conventional liquid chromatography approaches, which typically require 25-45 min and often require more significant sample preparation procedures. Quantitative capabilities of the flow-injection analysis were tested using the method of standard additions and NIST standard reference material (SRM 3280) multivitamin/multielement tablets. The quantity determined for each B-vitamin in SRM 3280 was within the statistical range provided for the respective certified values. The same sample preparation and analysis approach was also applied to two different commercial vitamin supplement tablets and proved to be successful in the quantification of the selected B-vitamins as evidenced by agreement with the label values and the results obtained using isotope dilution liquid chromatography/mass spectrometry.

  20. Detection of co-eluted peptides using database search methods

    PubMed Central

    Alves, Gelio; Ogurtsov, Aleksey Y; Kwok, Siwei; Wu, Wells W; Wang, Guanghui; Shen, Rong-Fong; Yu, Yi-Kuo

    2008-01-01

    Background Current experimental techniques, especially those applying liquid chromatography mass spectrometry, have made high-throughput proteomic studies possible. The increase in throughput, however, also raises concerns about the accuracy of identification or quantification. Most experimental procedures select only a few of the most intense parent ions in a given MS scan, each to be fragmented (MS2) separately, and most other minor co-eluted peptides that have similar chromatographic retention times are ignored and their information lost. Results We have computationally investigated the possibility of enhancing the information retrieval during a given LC/MS experiment by selecting the two or three most intense parent ions for simultaneous fragmentation. A set of spectra is created by superimposing a number of MS2 spectra, each of which can be identified with high confidence by all search methods tested, to mimic the spectra of co-eluted peptides. The generated convoluted spectra were used to evaluate the capability of several database search methods – SEQUEST, Mascot, X!Tandem, OMSSA, and RAId_DbS – in identifying true peptides from superimposed spectra of co-eluted peptides. We show that, using these simulated spectra, all of the database search methods eventually gain in the number of true peptides identified when using the compound spectra of co-eluted peptides. Open peer review Reviewed by Vlad Petyuk (nominated by Arcady Mushegian), King Jordan and Shamil Sunyaev. For the full reviews, please go to the Reviewers' comments section. PMID:18597684

  1. Molecular strain typing of Brucella abortus isolates from Italy by two VNTR allele sizing technologies.

    PubMed

    De Santis, Riccardo; Ancora, Massimo; De Massis, Fabrizio; Ciammaruconi, Andrea; Zilli, Katiuscia; Di Giannatale, Elisabetta; Pittiglio, Valentina; Fillo, Silvia; Lista, Florigio

    2013-10-01

    Brucellosis, one of the most important re-emerging zoonoses in many countries, is caused by bacteria belonging to the genus Brucella. These bacteria also represent potential biological warfare agents, and the identification of species and biovars of field strains may be crucial for tracing back the source of infection, allowing naturally occurring outbreaks to be discriminated from bioterrorist events. In recent years, multiple-locus variable-number tandem repeat analysis (MLVA) has been proposed as a complement to the classical biotyping methods and has been applied to genotyping large collections of Brucella spp. At present, MLVA band profiles may be resolved by automated or manual procedures. Lab-on-a-chip technology represents a valid alternative to standard genotyping techniques (such as agarose gel electrophoresis) and has previously been used for Brucella genotyping. Recently, a new high-throughput genotyping analysis system based on capillary gel electrophoresis, the QIAxcel, has been described. The aim of the study was to evaluate the ability of two DNA sizing instruments, the QIAxcel System and the LabChip GX, to correctly call alleles at the sixteen loci of one frequently used MLVA assay for Brucella genotyping. The results confirmed that these technologies represent a meaningful advancement in high-throughput Brucella genotyping. Considering the accuracy required to confidently discriminate loci, the QIAxcel shows a better ability to measure VNTR allele sizes than the LabChip GX.
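
    Allele calling in MLVA reduces to converting a measured amplicon size into a repeat copy number; the Python sketch below shows that arithmetic, with entirely hypothetical locus parameters (flanking-region size, repeat-unit length and instrument readings).

    ```python
    def call_vntr_allele(fragment_bp: float, flank_bp: int, repeat_bp: int) -> int:
        """Convert a measured amplicon size into a tandem-repeat copy number.

        fragment_bp : amplicon size from capillary electrophoresis (bp)
        flank_bp    : combined size of the non-repeat flanking regions (bp)
        repeat_bp   : size of one tandem-repeat unit (bp)
        """
        return round((fragment_bp - flank_bp) / repeat_bp)

    # Hypothetical locus: 150 bp of flanking sequence, 8 bp repeat unit
    measured_sizes = [198.4, 214.7, 230.2]   # hypothetical instrument readings
    for size in measured_sizes:
        print(size, "->", call_vntr_allele(size, flank_bp=150, repeat_bp=8), "repeats")
    ```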

  2. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data, and facilitate data-sharing. Software which enables both, management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  3. High Throughput Transcriptomics @ USEPA (Toxicology ...

    EPA Pesticide Factsheets

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  4. Mobile element biology – new possibilities with high-throughput sequencing

    PubMed Central

    Xing, Jinchuan; Witherspoon, David J.; Jorde, Lynn B.

    2014-01-01

    Mobile elements compose more than half of the human genome, but until recently their large-scale detection was time-consuming and challenging. With the development of new high-throughput sequencing technologies, the complete spectrum of mobile element variation in humans can now be identified and analyzed. Thousands of new mobile element insertions have been discovered, yielding new insights into mobile element biology, evolution, and genomic variation. We review several high-throughput methods, with an emphasis on techniques that specifically target mobile element insertions in humans, and we highlight recent applications of these methods in evolutionary studies and in the analysis of somatic alterations in human cancers. PMID:23312846

  5. Advances in high throughput DNA sequence data compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz

    2016-06-01

    Advances in high throughput sequencing technologies and reduction in cost of sequencing have led to exponential growth in high throughput DNA sequence data. This growth has posed challenges such as storage, retrieval, and transmission of sequencing data. Data compression is used to cope with these challenges. Various methods have been developed to compress genomic and sequencing data. In this article, we present a comprehensive review of compression methods for genome and reads compression. Algorithms are categorized as referential or reference free. Experimental results and comparative analysis of various methods for data compression are presented. Finally, key challenges and research directions in DNA sequence data compression are highlighted.
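
    As a toy illustration of reference-free sequence compression (far simpler than the methods reviewed above), the Python sketch below packs an A/C/G/T read into two bits per base and restores it; all names are illustrative.

    ```python
    CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
    BASE = "ACGT"

    def pack_dna(seq: str) -> bytes:
        """Pack an A/C/G/T string into 2 bits per base (4 bases per byte)."""
        out = bytearray()
        for i in range(0, len(seq), 4):
            group = seq[i:i + 4]
            byte = 0
            for base in group:
                byte = (byte << 2) | CODE[base]
            byte <<= 2 * (4 - len(group))   # left-align a trailing partial group
            out.append(byte)
        return bytes(out)

    def unpack_dna(data: bytes, length: int) -> str:
        """Inverse of pack_dna; `length` is the original number of bases."""
        bases = []
        for byte in data:
            for shift in (6, 4, 2, 0):
                bases.append(BASE[(byte >> shift) & 0b11])
        return "".join(bases[:length])

    seq = "ACGTACGTTGCA"
    packed = pack_dna(seq)
    assert unpack_dna(packed, len(seq)) == seq
    print(f"{len(seq)} bases -> {len(packed)} bytes")
    ```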

  6. Accelerating Virtual High-Throughput Ligand Docking: current technology and case study on a petascale supercomputer.

    PubMed

    Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome

    2014-04-25

    In this paper we give the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one-million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the predocking file preparation and postdocking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
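
    The MPI-parallel AutoDock4 build described above is available from the authors' site; purely as a sketch of the task-parallel pattern it relies on (each rank docking a disjoint subset of the ligand library), a minimal mpi4py example is shown below. The ligand list and the dock_one_ligand helper are hypothetical stand-ins, not part of the published code.

    ```python
    from mpi4py import MPI

    def dock_one_ligand(ligand_file: str) -> float:
        """Hypothetical stand-in for a single docking run; returns a mock score."""
        return float(len(ligand_file))  # placeholder computation

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Hypothetical ligand library; in practice this would be thousands of files
    ligands = [f"ligand_{i:06d}.pdbqt" for i in range(1000)]

    # Static round-robin partition: rank r docks ligands r, r+size, r+2*size, ...
    my_scores = [(name, dock_one_ligand(name)) for name in ligands[rank::size]]

    # Gather all per-rank results on rank 0 and report the best (lowest) score
    all_scores = comm.gather(my_scores, root=0)
    if rank == 0:
        flat = [item for chunk in all_scores for item in chunk]
        best = min(flat, key=lambda t: t[1])
        print(f"{len(flat)} ligands docked on {size} ranks; best: {best[0]}")
    ```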

  7. LOCATE: a mouse protein subcellular localization database

    PubMed Central

    Fink, J. Lynn; Aturaliya, Rajith N.; Davis, Melissa J.; Zhang, Fasheng; Hanson, Kelly; Teasdale, Melvena S.; Kai, Chikatoshi; Kawai, Jun; Carninci, Piero; Hayashizaki, Yoshihide; Teasdale, Rohan D.

    2006-01-01

    We present here LOCATE, a curated, web-accessible database that houses data describing the membrane organization and subcellular localization of proteins from the FANTOM3 Isoform Protein Sequence set. Membrane organization is predicted by the high-throughput, computational pipeline MemO. The subcellular locations of selected proteins from this set were determined by a high-throughput, immunofluorescence-based assay and by manually reviewing >1700 peer-reviewed publications. LOCATE represents the first effort to catalogue the experimentally verified subcellular location and membrane organization of mammalian proteins using a high-throughput approach and provides localization data for ∼40% of the mouse proteome. It is available at . PMID:16381849

  8. In vivo monitoring of cellular energy metabolism using SoNar, a highly responsive sensor for NAD(+)/NADH redox state.

    PubMed

    Zhao, Yuzheng; Wang, Aoxue; Zou, Yejun; Su, Ni; Loscalzo, Joseph; Yang, Yi

    2016-08-01

    NADH and its oxidized form NAD(+) have a central role in energy metabolism, and their concentrations are often considered to be among the most important readouts of metabolic state. Here, we present a detailed protocol to image and monitor NAD(+)/NADH redox state in living cells and in vivo using a highly responsive, genetically encoded fluorescent sensor known as SoNar (sensor of NAD(H) redox). The chimeric SoNar protein was initially developed by inserting circularly permuted yellow fluorescent protein (cpYFP) into the NADH-binding domain of Rex protein from Thermus aquaticus (T-Rex). It functions by binding to either NAD(+) or NADH, thus inducing protein conformational changes that affect its fluorescent properties. We first describe steps for how to establish SoNar-expressing cells, and then discuss how to use the system to quantify the intracellular redox state. This approach is sensitive, accurate, simple and able to report subtle perturbations of various pathways of energy metabolism in real time. We also detail the application of SoNar to high-throughput chemical screening of candidate compounds targeting cell metabolism in a microplate-reader-based assay, along with in vivo fluorescence imaging of tumor xenografts expressing SoNar in mice. Typically, the approximate time frame for fluorescence imaging of SoNar is 30 min for living cells and 60 min for living mice. For high-throughput chemical screening in a 384-well-plate assay, the whole procedure generally takes no longer than 60 min to assess the effects of 380 compounds on cell metabolism.
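    For the microplate-reader screening step, the readout is a ratio of two excitation channels, and plate-level assay quality is commonly summarized with a Z'-factor. The snippet below is a generic illustration with synthetic numbers; the channel choice, control layout, and well counts are our assumptions, not part of the published protocol.

```python
# Generic ratiometric plate-reader scoring: compute the two-channel ratio,
# check assay quality with the Z'-factor, and express test wells as percent
# effect relative to the control window. All data here are simulated.
import numpy as np

def sonar_ratio(f_ex1: np.ndarray, f_ex2: np.ndarray) -> np.ndarray:
    """Ratio of the two excitation channels (the sensor's readout)."""
    return f_ex1 / f_ex2

def z_prime(pos: np.ndarray, neg: np.ndarray) -> float:
    """Standard Z'-factor for assay quality (Zhang et al., 1999)."""
    return 1.0 - 3.0 * (pos.std() + neg.std()) / abs(pos.mean() - neg.mean())

rng = np.random.default_rng(0)
# Simulated controls: vehicle (DMSO) wells vs. wells treated with a compound
# that shifts the NAD+/NADH ratio (purely synthetic numbers).
vehicle = sonar_ratio(rng.normal(1.0, 0.03, 32), rng.normal(1.0, 0.03, 32))
control = sonar_ratio(rng.normal(1.6, 0.05, 32), rng.normal(1.0, 0.03, 32))
print(f"Z' = {z_prime(control, vehicle):.2f}")

# Percent effect of each test well relative to the control window.
test = sonar_ratio(rng.normal(1.3, 0.05, 320), rng.normal(1.0, 0.03, 320))
effect = 100 * (test - vehicle.mean()) / (control.mean() - vehicle.mean())
```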

  9. An economical and effective high-throughput DNA extraction protocol for molecular marker analysis in honey bees

    USDA-ARS?s Scientific Manuscript database

    Extraction of DNA from tissue samples can be expensive both in time and monetary resources and can often require handling and disposal of hazardous chemicals. We have developed a high throughput protocol for extracting DNA from honey bees that is of a high enough quality and quantity to enable hundr...

  10. Applications of high throughput (combinatorial) methodologies to electronic, magnetic, optical, and energy-related materials

    NASA Astrophysics Data System (ADS)

    Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.

    2013-06-01

    High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.

  11. 20171015 - Capabilities and Evaluation of the US EPA’s HTTK (High Throughput Toxicokinetics) R package (ISES)

    EPA Science Inventory

    Toxicokinetics (TK) provides a bridge between toxicity and exposure assessment by predicting tissue concentrations due to exposure, however traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to...

  12. High-throughput dietary exposure predictions for chemical migrants from food contact substances for use in chemical prioritization

    EPA Science Inventory

    Under the ExpoCast program, United States Environmental Protection Agency (EPA) researchers have developed a high-throughput (HT) framework for estimating aggregate exposures to chemicals from multiple pathways to support rapid prioritization of chemicals. Here, we present method...

  13. Environmental surveillance and monitoring. The next frontiers for high-throughput toxicology

    EPA Science Inventory

    High throughput toxicity testing (HTT) technologies along with the world-wide web are revolutionizing both generation and access to data regarding the bioactivities that chemicals can elicit when they interact with specific proteins, genes, or other targets in the body of an orga...

  14. High-Throughput Models for Exposure-Based Chemical Prioritization in the ExpoCast Project

    EPA Science Inventory

    The United States Environmental Protection Agency (U.S. EPA) must characterize potential risks to human health and the environment associated with manufacture and use of thousands of chemicals. High-throughput screening (HTS) for biological activity allows the ToxCast research pr...

  15. High-Throughput Exposure Potential Prioritization for ToxCast Chemicals

    EPA Science Inventory

    The U.S. EPA must consider lists of hundreds to thousands of chemicals when prioritizing research resources in order to identify risk to human populations and the environment. High-throughput assays to identify biological activity in vitro have allowed the ToxCastTM program to i...

  16. Use of High-Throughput Testing and Approaches for Evaluating Chemical Risk-Relevance to Humans

    EPA Science Inventory

    ToxCast is profiling the bioactivity of thousands of chemicals based on high-throughput screening (HTS) and computational models that integrate knowledge of biological systems and in vivo toxicities. Many of these assays probe signaling pathways and cellular processes critical to...

  17. High-Throughput Simulation of Environmental Chemical Fate for Exposure Prioritization

    EPA Science Inventory

    The U.S. EPA must consider lists of hundreds to thousands of chemicals when allocating resources to identify risk in human populations and the environment. High-throughput screening assays to characterize biological activity in vitro have allowed the ToxCastTM program to identify...

  18. New Challenges of the Computation of Multiple Sequence Alignments in the High-Throughput Era (2010 JGI/ANL HPC Workshop)

    ScienceCinema

    Notredame, Cedric

    2018-05-02

    Cedric Notredame from the Centre for Genomic Regulation gives a presentation on New Challenges of the Computation of Multiple Sequence Alignments in the High-Throughput Era at the JGI/Argonne HPC Workshop on January 26, 2010.

  19. Molecular characterization of a novel Nucleorhabdovirus from black currant identified by high-throughput sequencing

    USDA-ARS?s Scientific Manuscript database

    Contigs with sequence similarities to several nucleorhabdoviruses were identified by high-throughput sequencing analysis from a black currant (Ribes nigrum L.) cultivar. The complete genomic sequence of this new nucleorhabdovirus is 14,432 nucleotides. Its genomic organization is typical of nucleorh...

  20. Estimating Toxicity Pathway Activating Doses for High Throughput Chemical Risk Assessments

    EPA Science Inventory

    Estimating a Toxicity Pathway Activating Dose (TPAD) from in vitro assays as an analog to a reference dose (RfD) derived from in vivo toxicity tests would facilitate high throughput risk assessments of thousands of data-poor environmental chemicals. Estimating a TPAD requires def...

  1. Incorporating Population Variability and Susceptible Subpopulations into Dosimetry for High-Throughput Toxicity Testing

    EPA Science Inventory

    Momentum is growing worldwide to use in vitro high-throughput screening (HTS) to evaluate human health effects of chemicals. However, the integration of dosimetry into HTS assays and incorporation of population variability will be essential before its application in a risk assess...

  2. Retrofit Strategies for Incorporating Xenobiotic Metabolism into High Throughput Screening Assays (EMGS)

    EPA Science Inventory

    The US EPA’s ToxCast program is designed to assess chemical perturbations of molecular and cellular endpoints using a variety of high-throughput screening (HTS) assays. However, existing HTS assays have limited or no xenobiotic metabolism which could lead to a mischaracterization...

  3. Defining the taxonomic domain of applicability for mammalian-based high-throughput screening assays

    EPA Science Inventory

    Cell-based high throughput screening (HTS) technologies are becoming mainstream in chemical safety evaluations. The US Environmental Protection Agency (EPA) Toxicity Forecaster (ToxCastTM) and the multi-agency Tox21 Programs have been at the forefront in advancing this science, m...

  4. High Throughput Plasmid Sequencing with Illumina and CLC Bio (Seventh Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting 2012)

    ScienceCinema

    Athavale, Ajay

    2018-01-04

    Ajay Athavale (Monsanto) presents "High Throughput Plasmid Sequencing with Illumina and CLC Bio" at the 7th Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting held in June, 2012 in Santa Fe, NM.

  5. The Impact of Data Fragmentation on High-Throughput Clinical Phenotyping

    ERIC Educational Resources Information Center

    Wei, Weiqi

    2012-01-01

    Subject selection is essential and has become the rate-limiting step for harvesting knowledge to advance healthcare through clinical research. Present manual approaches inhibit researchers from conducting deep and broad studies and drawing confident conclusions. High-throughput clinical phenotyping (HTCP), a recently proposed approach, leverages…

  6. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    PubMed

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical for studying cancer pathology and drug response. Although 3D cancer sphere culture can be performed in low-adherence dishes or well plates, unregulated cell aggregation may skew the results. In contrast, microfluidic 3D culture allows precise control of the cell microenvironment and provides throughput that is higher by orders of magnitude. In this chapter, we examine engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  7. Droplet microfluidic technology for single-cell high-throughput screening.

    PubMed

    Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L

    2009-08-25

    We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at very high throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically coded droplet library enabling the identification of droplet composition during the assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together, our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications, including high-throughput single-cell analyses, combinatorial screening, and the facilitation of small-sample analyses.

  8. Microfluidic guillotine for single-cell wound repair studies

    NASA Astrophysics Data System (ADS)

    Blauch, Lucas R.; Gai, Ya; Khor, Jian Wei; Sood, Pranidhi; Marshall, Wallace F.; Tang, Sindy K. Y.

    2017-07-01

    Wound repair is a key feature distinguishing living from nonliving matter. Single cells are increasingly recognized to be capable of healing wounds. The lack of reproducible, high-throughput wounding methods has hindered single-cell wound repair studies. This work describes a microfluidic guillotine for bisecting single Stentor coeruleus cells in a continuous-flow manner. Stentor is used as a model due to its robust repair capacity and the ability to perform gene knockdown in a high-throughput manner. Local cutting dynamics reveals two regimes under which cells are bisected, one at low viscous stress where cells are cut with small membrane ruptures and high viability and one at high viscous stress where cells are cut with extended membrane ruptures and decreased viability. A cutting throughput up to 64 cells per minute—more than 200 times faster than current methods—is achieved. The method allows the generation of more than 100 cells in a synchronized stage of their repair process. This capacity, combined with high-throughput gene knockdown in Stentor, enables time-course mechanistic studies impossible with current wounding methods.

  9. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    PubMed

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, picking relevant hits from such screens and generating testable hypotheses often requires training in bioinformatics and the skills to perform database mining efficiently. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
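    The core cross-referencing step can be pictured as set intersection between the user's gene list and each curated hit list, optionally scored for enrichment. The sketch below is illustrative only (toy datasets, hypothetical names), not the CrossCheck implementation.

```python
# Cross-reference a user gene list against published screen hit lists and
# rank the datasets by hypergeometric overlap enrichment.
from math import comb

def hypergeom_pval(overlap, list_size, dataset_size, universe):
    """P(X >= overlap) when drawing list_size genes from a universe that
    contains dataset_size hits (hypergeometric upper tail)."""
    return sum(
        comb(dataset_size, k) * comb(universe - dataset_size, list_size - k)
        for k in range(overlap, min(list_size, dataset_size) + 1)
    ) / comb(universe, list_size)

published = {                     # toy stand-ins for curated screen datasets
    "RNAi_screen_A": {"TP53", "ATM", "CHEK2", "BRCA1"},
    "CRISPR_screen_B": {"KRAS", "EGFR", "BRAF"},
}
user_genes = {"TP53", "BRCA1", "MYC"}
universe = 20000                  # approximate protein-coding gene count

rows = []
for name, hits in published.items():
    shared = user_genes & hits
    p = hypergeom_pval(len(shared), len(user_genes), len(hits), universe)
    rows.append((p, name, sorted(shared)))
for p, name, shared in sorted(rows):
    print(f"{name}: overlap={shared}, p={p:.2e}")
```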

  10. Ion channel drug discovery and research: the automated Nano-Patch-Clamp technology.

    PubMed

    Brueggemann, A; George, M; Klau, M; Beckler, M; Steindl, J; Behrends, J C; Fertig, N

    2004-01-01

    Unlike the genomics revolution, which was largely enabled by a single technological advance (high-throughput sequencing), rapid advancement in proteomics will require a broader effort to increase the throughput of a number of key tools for the functional analysis of different types of proteins. In the case of ion channels, a class of (membrane) proteins of great physiological importance and potential as drug targets, the lack of adequate assay technologies is felt particularly strongly. The available, indirect, high-throughput screening methods for ion channels clearly generate insufficient information. The best technology for studying ion channel function and screening for compound interaction is the patch clamp technique, but patch clamping suffers from low throughput, which is not acceptable for drug screening. A first step towards a solution is presented here. The nano patch clamp technology, which is based on a planar, microstructured glass chip, enables automatic whole-cell patch clamp measurements. The Port-a-Patch is an automated electrophysiology workstation that uses planar patch clamp chips. This approach enables high-quality, high-content evaluation of ion channels and compounds on a one-cell-at-a-time basis. The automation of the patch process presented here and its scalability to an array format are the prerequisites for higher-throughput electrophysiology instruments.

  11. Design and evaluation of Continuous Descent Approach as a fuel-saving procedure

    NASA Astrophysics Data System (ADS)

    Jin, Li

    Continuous Descent Approach (CDA), one of the key concepts of the Next Generation Air Transportation System (NextGen), is a fuel-economical procedure, but it requires increased separation to accommodate spacing uncertainties among arriving aircraft. This negative impact is often overlooked when benefits are estimated. Although considerable research has been devoted to estimating the potential fuel savings of CDA, few studies have attempted to explain the fuel savings observed in field tests from an analytical point of view. This research gives insight into why CDA saves fuel, and a number of design guidelines for CDA procedures are derived. The analytical relationship between speed, altitude, and time-cumulative fuel consumption is derived from the Base of Aircraft Data (BADA) Total Energy Model. Theoretical analysis implies that the speed profile has an impact on fuel consumption in the terminal area at least as substantial as that of the vertical profile. In addition, CDA is not intrinsically a fuel-saving procedure: whether it saves fuel depends on whether the speed profile is properly designed. Based on this model, the potential fuel savings due to CDA at San Francisco International Airport were estimated, and the accuracy of the estimation was analyzed. Uncertainties in the fuel estimate arise primarily from the modeled CDA procedure and from inaccuracies in BADA. This thesis also investigates the fuel savings due to CDAs under high-traffic conditions, counting not only the savings from optimal vertical profiles but also the extra fuel burned because of the increased separations. The simulated CDA traffic is based on radar track data and is deconflicted by a scheduling algorithm that minimizes delays. Delays are absorbed by speed changes and path stretching, reflecting the air traffic control actions that CDAs entail. Fuel-burn statistics computed with the BADA Total Energy Model show that CDAs save 171.87 kg per arrival on average, but this figure is reduced by delay absorption. The savings diminish as arrival demand increases and can even become negative under large delays. The throughput analysis demonstrated that the impact of CDA on airport capacity is insignificant and tolerable. Atlanta International Airport was used as the testbed for the sensitivity analysis, and the New York Metroplex was used as the testbed for the throughput analysis.
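    For reference, the energy balance that underlies BADA's Total Energy Model, through which the speed and altitude profiles drive fuel burn, can be written as follows. This is a sketch of that model family in standard BADA-style notation, not a quotation of the thesis's derivation.

```latex
% Total energy balance: excess thrust power goes into climb and acceleration;
% fuel flow follows thrust via a speed-dependent specific fuel consumption,
% and cumulative fuel burn is the time integral of that flow.
\begin{align*}
  (T - D)\,V_{\mathrm{TAS}}
    &= m g_0 \frac{\mathrm{d}h}{\mathrm{d}t}
     + m V_{\mathrm{TAS}} \frac{\mathrm{d}V_{\mathrm{TAS}}}{\mathrm{d}t},\\
  \dot{m}_f &= \eta(V_{\mathrm{TAS}})\,T,
  \qquad F(t) = \int_0^{t}\dot{m}_f(\tau)\,\mathrm{d}\tau .
\end{align*}
```

Holding the vertical profile fixed, a poorly chosen speed profile raises the thrust required to satisfy the energy balance and hence the integrated fuel burn, which is the sense in which the abstract argues that CDA is not intrinsically fuel-saving.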

  12. High-Reflectivity Coatings for a Vacuum Ultraviolet Spectropolarimeter

    NASA Astrophysics Data System (ADS)

    Narukage, Noriyuki; Kubo, Masahito; Ishikawa, Ryohko; Ishikawa, Shin-nosuke; Katsukawa, Yukio; Kobiki, Toshihiko; Giono, Gabriel; Kano, Ryouhei; Bando, Takamasa; Tsuneta, Saku; Auchère, Frédéric; Kobayashi, Ken; Winebarger, Amy; McCandless, Jim; Chen, Jianrong; Choi, Joanne

    2017-03-01

    Precise polarization measurements in the vacuum ultraviolet (VUV) region are expected to be a new tool for inferring the magnetic fields in the upper atmosphere of the Sun. High-reflectivity coatings are key elements for achieving high-throughput optics for precise polarization measurements. We fabricated three types of high-reflectivity coatings for a solar spectropolarimeter in the hydrogen Lyman-α (Lyα; 121.567 nm) region and evaluated their performance. The first, a high-reflectivity mirror coating, offers a reflectivity of more than 80% in Lyα optics. The second is a reflective narrow-band filter coating that has a peak reflectivity of 57% in Lyα, whereas its reflectivity in the visible light range is lower than 1/10 of the peak reflectivity (~5% on average). This coating can be used to easily realize a visible-light rejection system, which is indispensable for a solar telescope, while maintaining high throughput in the Lyα line. The third is a high-efficiency reflective polarizing coating that almost exclusively reflects an s-polarized beam at its Brewster angle of 68° with a reflectivity of 55%. This coating achieves both high polarizing power and high throughput. These coatings contributed to the high-throughput solar VUV spectropolarimeter called the Chromospheric Lyman-Alpha SpectroPolarimeter (CLASP), which was launched on 3 September 2015.

  13. A simple and sensitive high-throughput GFP screening in woody and herbaceous plants.

    PubMed

    Hily, Jean-Michel; Liu, Zongrang

    2009-03-01

    Green fluorescent protein (GFP) has been used widely as a powerful bioluminescent reporter, but its visualization by existing methods in tissues or whole plants and its utilization for high-throughput screening remains challenging in many species. Here, we report a fluorescence image analyzer-based method for GFP detection and its utility for high-throughput screening of transformed plants. Of three detection methods tested, the Typhoon fluorescence scanner was able to detect GFP fluorescence in all Arabidopsis thaliana tissues and apple leaves, while regular fluorescence microscopy detected it only in Arabidopsis flowers and siliques but barely in the leaves of either Arabidopsis or apple. The hand-held UV illumination method failed in all tissues of both species. Additionally, the Typhoon imager was able to detect GFP fluorescence in both green and non-green tissues of Arabidopsis seedlings as well as in imbibed seeds, qualifying it as a high-throughput screening tool, which was further demonstrated by screening the seedlings of primary transformed T(0) seeds. Of the 30,000 germinating Arabidopsis seedlings screened, at least 69 GFP-positive lines were identified, accounting for an approximately 0.23% transformation efficiency. About 14,000 seedlings grown in 16 Petri plates could be screened within an hour, making the screening process significantly more efficient and robust than any other existing high-throughput screening method for transgenic plants.

  14. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    PubMed

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques that facilitate high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for the other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications, including buffer development, strain selection, fermentation process development, and whole-bioprocess integration. © 2017 American Institute of Chemical Engineers. Biotechnol. Prog., 34:130-140, 2018.

  15. A High-throughput Assay for mRNA Silencing in Primary Cortical Neurons in vitro with Oligonucleotide Therapeutics.

    PubMed

    Alterman, Julia F; Coles, Andrew H; Hall, Lauren M; Aronin, Neil; Khvorova, Anastasia; Didiot, Marie-Cécile

    2017-08-20

    Primary neurons represent an ideal cellular system for the identification of therapeutic oligonucleotides for the treatment of neurodegenerative diseases. However, due to the sensitive nature of primary cells, the transfection of small interfering RNAs (siRNA) using classical methods is laborious and often shows low efficiency. Recent progress in oligonucleotide chemistry has enabled the development of stabilized and hydrophobically modified small interfering RNAs (hsiRNAs). This new class of oligonucleotide therapeutics shows extremely efficient self-delivery properties and supports potent and durable effects in vitro and in vivo. We have developed a high-throughput in vitro assay to identify and test hsiRNAs in primary neuronal cultures. To simply, rapidly, and accurately quantify the mRNA silencing of hundreds of hsiRNAs, we use the QuantiGene 2.0 quantitative gene expression assay. This high-throughput, 96-well plate-based assay can quantify mRNA levels directly from sample lysate. Here, we describe a method to prepare short-term cultures of mouse primary cortical neurons in a 96-well plate format for high-throughput testing of oligonucleotide therapeutics. This method supports the testing of hsiRNA libraries and the identification of potential therapeutics within just two weeks. We detail the methodology of our high-throughput assay workflow, from primary neuron preparation to data analysis. This method can help identify oligonucleotide therapeutics for the treatment of various neurological diseases.

  16. High-throughput screening with nanoimprinting 3D culture for efficient drug development by mimicking the tumor environment.

    PubMed

    Yoshii, Yukie; Furukawa, Takako; Waki, Atsuo; Okuyama, Hiroaki; Inoue, Masahiro; Itoh, Manabu; Zhang, Ming-Rong; Wakizaka, Hidekatsu; Sogawa, Chizuru; Kiyono, Yasushi; Yoshii, Hiroshi; Fujibayashi, Yasuhisa; Saga, Tsuneo

    2015-05-01

    Anti-cancer drug development typically utilizes high-throughput screening with two-dimensional (2D) cell culture. However, 2D culture induces cellular characteristics different from tumors in vivo, resulting in inefficient drug development. Here, we report an innovative high-throughput screening system using nanoimprinting 3D culture to simulate in vivo conditions, thereby facilitating efficient drug development. We demonstrated that cell line-based nanoimprinting 3D screening can more efficiently select drugs that effectively inhibit cancer growth in vivo as compared to 2D culture. Metabolic responses after treatment were assessed using positron emission tomography (PET) probes and revealed similar characteristics between the 3D spheroids and in vivo tumors. Further, we developed an advanced method to adopt cancer cells from patient tumor tissues for high-throughput drug screening with nanoimprinting 3D culture, which we termed the Cancer tissue-Originated Uniformed Spheroid Assay (COUSA). This system identified drugs that were effective in xenografts of the original patient tumors. Nanoimprinting 3D spheroids showed low permeability and the formation of hypoxic regions inside, similar to in vivo tumors. Collectively, nanoimprinting 3D culture provides an easy-to-handle, high-throughput drug screening system that allows efficient drug development by mimicking the tumor environment. The COUSA system could be a useful platform for drug development with patient cancer cells. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy

    PubMed Central

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-01-01

    Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications, including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers seeking to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS) technology. In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high-sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronics researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with a comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for the high-resolution, time-sensitive, and high-throughput biomolecular analysis required in a variety of life science applications, including drug discovery. PMID:27294925

  18. Movement of Fuel Ashore: Storage, Capacity, Throughput, and Distribution Analysis

    DTIC Science & Technology

    2015-12-01

    ...procedures, and force composition. Such alterations represent an acceptance of operational risk to buy down the foundational risk that the logistics network...

  19. Break-up of droplets in a concentrated emulsion flowing through a narrow constriction

    NASA Astrophysics Data System (ADS)

    Kim, Minkyu; Rosenfeld, Liat; Tang, Sindy; Tang Lab Team

    2014-11-01

    Droplet microfluidics has enabled a wide range of high-throughput screening applications. Compared with other technologies, such as robotic screening, droplet microfluidics offers roughly 1000 times higher throughput, which makes it one of the most promising platforms for ultrahigh-throughput screening applications. Few studies have considered the throughput of the droplet interrogation process, however. In this research, we show that the probability of break-up increases with increasing flow rate, entrance angle to the constriction, and size of the drops. Since single drops do not break at the highest flow rate used in the system, break-ups arise primarily from interactions between highly packed droplets close to each other. Moreover, the probabilistic nature of the break-up process arises from stochastic variations in the packing configuration. Our results can be used to calculate the maximum throughput of the serial interrogation process. For 40 pL drops, the highest throughput with less than 1% droplet break-up was measured to be approximately 7,000 drops per second. In addition, the results are useful for understanding the behavior of concentrated emulsions in applications such as mobility control in enhanced oil recovery.

  20. Environmental surveillance and monitoring the next frontier for pathway-based high throughput screening

    EPA Science Inventory

    In response to a proposed vision and strategy for toxicity testing in the 21st century nascent high throughput toxicology (HTT) programs have tested thousands of chemicals in hundreds of pathway-based biological assays. Although, to date, use of HTT data for safety assessment of ...

  1. Molecular characterization of a novel Luteovirus from peach identified by high-throughput sequencing

    USDA-ARS?s Scientific Manuscript database

    Contigs with sequence homologies to Cherry-associated luteovirus were identified by high-throughput sequencing analysis of two peach accessions undergoing quarantine testing. The complete genomic sequences of the two isolates of this virus are 5,819 and 5,814 nucleotides. Their genome organization i...

  2. 20180311 - Differential Gene Expression and Concentration-Response Modeling Workflow for High-Throughput Transcriptomic (HTTr) Data: Results From MCF7 Cells (SOT)

    EPA Science Inventory

    Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...

  3. 20171024 - Capabilities and Evaluation of the US EPA’s HTTK (High Throughput Toxicokinetics) R package (Webinar Presentation to European Chemical Agency

    EPA Science Inventory

    Toxicokinetics (TK) provides a bridge between toxicity and exposure assessment by predicting tissue concentrations due to exposure. However traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to...

  4. Identifying Toxicity Pathways with ToxCast High-Throughput Screening and Applications to Predicting Developmental Toxicity

    EPA Science Inventory

    Results from rodent and non-rodent prenatal developmental toxicity tests for over 300 chemicals have been curated into the relational database ToxRefDB. These same chemicals have been run in concentration-response format through over 500 high-throughput screening assays assessin...

  5. Discovery of viruses and virus-like pathogens in pistachio using high-throughput sequencing

    USDA-ARS?s Scientific Manuscript database

    Pistachio (Pistacia vera L.) trees from the National Clonal Germplasm Repository (NCGR) and orchards in California were surveyed for viruses and virus-like agents by high-throughput sequencing (HTS). Analyses of 60 trees including clonal UCB-1 hybrid rootstock (P. atlantica × P. integerrima) identif...

  6. SeqAPASS to evaluate conservation of high-throughput screening targets across non-mammalian species

    EPA Science Inventory

    Cell-based high-throughput screening (HTS) and computational technologies are being applied as tools for toxicity testing in the 21st century. The U.S. Environmental Protection Agency (EPA) embraced these technologies and created the ToxCast Program in 2007, which has served as a...

  7. 20180312 - Applying a High-Throughput PBTK Model for IVIVE (SOT)

    EPA Science Inventory

    The ability to link in vitro and in vivo toxicity enables the use of high-throughput in vitro assays as an alternative to resource intensive animal studies. Toxicokinetics (TK) should help describe this link, but prior work found weak correlation when using a TK model for in vitr...

  8. High-throughput screening of chemical effects on steroidogenesis using H295R human adrenocortical carcinoma cells

    EPA Science Inventory

    Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples...

  9. High-Throughput Simulation of Environmental Chemical Fate for Exposure Prioritization (Annual Meeting of ISES)

    EPA Science Inventory

    The U.S. EPA must consider thousands of chemicals when allocating resources to assess risk in human populations and the environment. High-throughput screening assays to characterize biological activity in vitro are being implemented in the ToxCastTM program to rapidly characteri...

  10. Integration of chemical-specific exposure and pharmacokinetic information with the chemical-agnostic AOP framework to support high throughput risk assessment

    EPA Science Inventory

    Application of the Adverse Outcome Pathway (AOP) framework and high throughput toxicity testing in chemical-specific risk assessment requires reconciliation of chemical concentrations sufficient to trigger a molecular initiating event measured in vitro and at the relevant target ...

  11. Evaluation of food-relevant chemicals in the ToxCast high-throughput screening program

    EPA Science Inventory

    There are thousands of chemicals that are directly added to or come in contact with food, many of which have undergone little to no toxicological evaluation. The ToxCast high-throughput screening (HTS) program has evaluated over 1,800 chemicals in concentration-response across ~8...

  12. Application of Physiologically-Based Pharmacokinetic/Pharmacodynamic Model for Interpretation of High-throughput Screening Assay for Thyroperoxidase Inhibition

    EPA Science Inventory

    In vitro based assays are used to identify potential endocrine disrupting chemicals. Thyroperoxidase (TPO), an enzyme essential for thyroid hormone (TH) synthesis, is a target site for disruption of the thyroid axis for which a high-throughput screening (HTPS) assay has recently ...

  13. Incorporating High-Throughput Exposure Predictions with Dosimetry-Adjusted In Vitro Bioactivity to Inform Chemical Toxicity Testing

    EPA Science Inventory

    We previously integrated dosimetry and exposure with high-throughput screening (HTS) to enhance the utility of ToxCast™ HTS data by translating in vitro bioactivity concentrations to oral equivalent doses (OEDs) required to achieve these levels internally. These OEDs were compare...

  14. An Evaluation of 25 Selected ToxCast Chemicals in Medium-Throughput Assays to Detect Genotoxicity

    EPA Science Inventory

    ABSTRACTToxCast is a multi-year effort to develop a cost-effective approach for the US EPA to prioritize chemicals for toxicity testing. Initial evaluation of more than 500 high-throughput (HT) microwell-based assays without metabolic activation showed that most lacked high speci...

  15. High Throughput Assays for Exposure Science (NIEHS OHAT Staff Meeting presentation)

    EPA Science Inventory

    High throughput screening (HTS) data that characterize chemically induced biological activity have been generated for thousands of chemicals by the US interagency Tox21 and the US EPA ToxCast programs. In many cases there are no data available for comparing bioactivity from HTS w...

  16. Differential Gene Expression and Concentration-Response Modeling Workflow for High-Throughput Transcriptomic (HTTr) Data: Results From MCF7 Cells

    EPA Science Inventory

    Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...

  17. Differentiating pathway-specific from nonspecific effects in high-throughput toxicity data: A foundation for prioritizing adverse outcome pathway development

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s ToxCast program has screened thousands of chemicals for biological activity, primarily using high-throughput in vitro bioassays. Adverse outcome pathways (AOPs) offer a means to link pathway-specific biological activities with potential ...

  18. “httk”: EPA’s Tool for High Throughput Toxicokinetics (CompTox CoP)

    EPA Science Inventory

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concentr...

  19. Tiered High-Throughput Screening Approach to Identify Thyroperoxidase Inhibitors within the ToxCast Phase I and II Chemical Libraries

    EPA Science Inventory

    High-throughput screening (HTS) for potential thyroid–disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limi...

  20. Perspectives on Validation of High-Throughput Assays Supporting 21st Century Toxicity Testing

    EPA Science Inventory

    In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. ...

  1. Evaluating the Impact of Uncertainties in Clearance and Exposure When Prioritizing Chemicals Screened in High-Throughput Assays

    EPA Science Inventory

    The toxicity-testing paradigm has evolved to include high-throughput (HT) methods for addressing the increasing need to screen hundreds to thousands of chemicals rapidly. Approaches that involve in vitro screening assays, in silico predictions of exposure concentrations, and phar...

  2. Predictive Model of Rat Reproductive Toxicity from ToxCast High Throughput Screening

    EPA Science Inventory

    The EPA ToxCast research program uses high throughput screening for bioactivity profiling and predicting the toxicity of large numbers of chemicals. ToxCast Phase‐I tested 309 well‐characterized chemicals in over 500 assays for a wide range of molecular targets and cellular respo...

  3. Applying a High-Throughput PBTK Model for IVIVE

    EPA Science Inventory

    The ability to link in vitro and in vivo toxicity enables the use of high-throughput in vitro assays as an alternative to resource intensive animal studies. Toxicokinetics (TK) should help describe this link, but prior work found weak correlation when using a TK model for in vitr...

  4. High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT)

    USDA-ARS?s Scientific Manuscript database

    Implementation of molecular methods in hop breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. Diversity Arrays Technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of...

  5. Neural Progenitor Cells as Models for High-Throughput Screens of Developmental Neurotoxicity: State of the Science

    EPA Science Inventory

    In vitro, high-throughput approaches have been widely recommended as an approach to screen chemicals for the potential to cause developmental neurotoxicity and prioritize them for additional testing. The choice of cellular models for such an approach will have important ramificat...

  6. High-throughput exposure modeling to support prioritization of chemicals in personal care products

    EPA Science Inventory

    We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or ...

  7. Sensitivity of neuroprogenitor cells to chemical-induced apoptosis using a multiplexed assay suitable for high-throughput screening*

    EPA Science Inventory

    AbstractHigh-throughput methods are useful for rapidly screening large numbers of chemicals for biological activity, including the perturbation of pathways that may lead to adverse cellular effects. In vitro assays for the key events of neurodevelopment, including apoptosis, may ...

  8. Using Alternative Approaches to Prioritize Testing for the Universe of Chemicals with Potential for Human Exposure (WC9)

    EPA Science Inventory

    One use of alternative methods is to target animal use at only those chemicals and tests that are absolutely necessary. We discuss prioritization of testing based on high-throughput screening assays (HTS), QSAR modeling, high-throughput toxicokinetics (HTTK), and exposure modelin...

  9. Integrated Model of Chemical Perturbations of a Biological PathwayUsing 18 In Vitro High Throughput Screening Assays for the Estrogen Receptor

    EPA Science Inventory

    We demonstrate a computational network model that integrates 18 in vitro, high-throughput screening assays measuring estrogen receptor (ER) binding, dimerization, chromatin binding, transcriptional activation and ER-dependent cell proliferation. The network model uses activity pa...

  10. Emory University: High-Throughput Protein-Protein Interaction Analysis for Hippo Pathway Profiling | Office of Cancer Genomics

    Cancer.gov

    The CTD2 Center at Emory University used high-throughput protein-protein interaction (PPI) mapping for Hippo signaling pathway profiling to rapidly unveil promising PPIs as potential therapeutic targets and advance functional understanding of signaling circuitry in cells.

  11. Harnessing High-Throughput Monitoring Methods to Strengthen 21st Century Risk-Based Evaluations (SETAC Presentation)

    EPA Science Inventory

    Over the past ten years, the US government has invested in high-throughput (HT) methods to screen chemicals for biological activity. Under the interagency Tox21 consortium and the US Environmental Protection Agency’s (EPA) ToxCast™ program, thousands of chemicals have...

  12. Genomics tools available for unravelling mechanisms underlying agronomical traits in strawberry with more to come

    USDA-ARS?s Scientific Manuscript database

    In the last few years, high-throughput genomics promised to bridge the gap between plant physiology and plant sciences. In addition, high-throughput genotyping technologies facilitate marker-based selection for better performing genotypes. In strawberry, Fragaria vesca was the first reference sequen...

  13. High-throughput genotoxicity assay identifies antioxidants as inducers of DNA damage response and cell death

    EPA Science Inventory

    Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATADS protein levels increase post-transcriptionally following exposure to a variety of DNA damaging agents. Here we report a novel quantitative high-throughput ATAD5-Iuciferase assay that can moni...

  14. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

    The last several years have seen the rapid adoption of high-resolution MS (HRMS) for bioanalytical support of high-throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft-spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high-throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of the large data file sizes typical of HRMS analyses.

  15. Automated crystallographic system for high-throughput protein structure determination.

    PubMed

    Brunzelle, Joseph S; Shafaee, Padram; Yang, Xiaojing; Weigand, Steve; Ren, Zhong; Anderson, Wayne F

    2003-07-01

    High-throughput structural genomic efforts require software that is highly automated, distributive and requires minimal user intervention to determine protein structures. Preliminary experiments were set up to test whether automated scripts could utilize a minimum set of input parameters and produce a set of initial protein coordinates. From this starting point, a highly distributive system was developed that could determine macromolecular structures at a high throughput rate, warehouse and harvest the associated data. The system uses a web interface to obtain input data and display results. It utilizes a relational database to store the initial data needed to start the structure-determination process as well as generated data. A distributive program interface administers the crystallographic programs which determine protein structures. Using a test set of 19 protein targets, 79% were determined automatically.
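    The warehouse-and-dispatch pattern described above can be pictured as a relational table of targets that distributed workers claim and update. The sketch below is only an illustration of that pattern (SQLite, with hypothetical table and column names), not the authors' system.

```python
# Illustrative job store for a distributed structure-determination pipeline:
# targets live in a relational table; each worker claims one, runs the
# crystallography programs (stubbed here), and writes the result back.
import sqlite3

def init_db(path="targets.db"):
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS targets (
                       id INTEGER PRIMARY KEY,
                       name TEXT,
                       status TEXT DEFAULT 'queued',
                       result TEXT)""")
    con.commit()
    return con

def claim_next(con):
    """Claim one queued target (simplified; a real system must guard races)."""
    with con:
        row = con.execute("SELECT id, name FROM targets "
                          "WHERE status = 'queued' LIMIT 1").fetchone()
        if row is None:
            return None
        con.execute("UPDATE targets SET status = 'running' WHERE id = ?", (row[0],))
    return row

def process(con, target_id, name):
    result = f"initial coordinates generated for {name}"  # stand-in for the real pipeline
    with con:
        con.execute("UPDATE targets SET status = 'done', result = ? WHERE id = ?",
                    (result, target_id))

if __name__ == "__main__":
    con = init_db(":memory:")
    con.execute("INSERT INTO targets (name) VALUES ('demo_target')")
    con.commit()
    job = claim_next(con)
    if job is not None:
        process(con, *job)
        print(con.execute("SELECT name, status, result FROM targets").fetchall())
```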

  16. A high-throughput, multi-channel photon-counting detector with picosecond timing

    NASA Astrophysics Data System (ADS)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  17. Direct assembling methodologies for high-throughput bioscreening

    PubMed Central

    Rodríguez-Dévora, Jorge I.; Shi, Zhi-dong; Xu, Tao

    2012-01-01

    Over the last few decades, high-throughput (HT) bioscreening, a technique that allows rapid screening of biochemical compound libraries against biological targets, has been widely used in drug discovery, stem cell research, development of new biomaterials, and genomics research. To achieve these ambitions, scaffold-free (or direct) assembly of biological entities of interest has become critical. Appropriate assembling methodologies are required to build an efficient HT bioscreening platform. The development of contact and non-contact assembling systems as a practical solution has been driven by a variety of essential attributes of the bioscreening system, such as miniaturization, high throughput, and high precision. The present article reviews recent progress on these assembling technologies utilized for the construction of HT bioscreening platforms. PMID:22021162

  18. High throughput light absorber discovery, Part 2: Establishing structure–band gap energy relationships

    DOE PAGES

    Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan; ...

    2016-09-23

    Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high-order composition spaces. High-throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with a direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high-throughput spectroscopy data, providing an automated platform for identifying new optical materials.
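    The automated Tauc analysis mentioned here amounts to fitting the linear region of (αhν)^(1/n) versus photon energy and extrapolating to zero absorption. The snippet below is a hedged illustration on a synthetic direct-gap spectrum; the exponent, fitting window, and data are our assumptions, not details of the published workflow.

```python
# Hedged sketch of an automated Tauc analysis: build (alpha*h*nu)^(1/n) vs.
# photon energy, fit the linear edge region, and take the x-intercept as the
# band gap estimate. n = 1/2 corresponds to a direct allowed transition.
import numpy as np

def tauc_band_gap(energy_eV, alpha, n=0.5, fit_window=(2.8, 3.2)):
    y = (alpha * energy_eV) ** (1.0 / n)
    mask = (energy_eV >= fit_window[0]) & (energy_eV <= fit_window[1])
    slope, intercept = np.polyfit(energy_eV[mask], y[mask], 1)
    return -intercept / slope          # x-intercept of the linear fit

# Synthetic direct-gap absorption edge near 2.7 eV, plus a little noise.
rng = np.random.default_rng(1)
E = np.linspace(1.5, 3.5, 400)
alpha = np.sqrt(np.clip(E - 2.7, 0.0, None)) / E + 1e-3 * rng.random(E.size)
print(f"estimated band gap ~ {tauc_band_gap(E, alpha):.2f} eV")
```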

  19. High throughput light absorber discovery, Part 2: Establishing structure–band gap energy relationships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan

    Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high-order composition spaces. High-throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with a direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high-throughput spectroscopy data, providing an automated platform for identifying new optical materials.

  20. High-Throughput Assay Optimization and Statistical Interpolation of Rubella-Specific Neutralizing Antibody Titers

    PubMed Central

    Lambert, Nathaniel D.; Pankratz, V. Shane; Larrabee, Beth R.; Ogee-Nwankwo, Adaeze; Chen, Min-hsin; Icenogle, Joseph P.

    2014-01-01

    Rubella remains a social and economic burden due to the high incidence of congenital rubella syndrome (CRS) in some countries. For this reason, an accurate and efficient high-throughput measure of antibody response to vaccination is an important tool. In order to measure rubella-specific neutralizing antibodies in a large cohort of vaccinated individuals, a high-throughput immunocolorimetric system was developed. Statistical interpolation models were applied to the resulting titers to refine quantitative estimates of neutralizing antibody titers relative to the assayed neutralizing antibody dilutions. This assay, including the statistical methods developed, can be used to assess the neutralizing humoral immune response to rubella virus and may be adaptable for assessing the response to other viral vaccines and infectious agents. PMID:24391140
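    One simple way to interpolate a neutralizing titer between assayed dilutions is to find the pair of dilutions bracketing 50% neutralization and interpolate on log2(dilution). The sketch below is purely illustrative, with made-up readings; it is not the statistical interpolation model used in the study.

```python
# Interpolate the ND50 titer from a two-fold dilution series by linear
# interpolation on log2(dilution) between the bracketing dilutions.
import numpy as np

def nd50(dilutions, pct_neutralization, cutoff=50.0):
    """Return the interpolated dilution giving `cutoff` % neutralization."""
    d = np.log2(np.asarray(dilutions, dtype=float))
    y = np.asarray(pct_neutralization, dtype=float)
    for i in range(len(y) - 1):
        lo, hi = sorted((y[i], y[i + 1]))
        if lo <= cutoff <= hi:
            frac = (cutoff - y[i]) / (y[i + 1] - y[i])
            return 2 ** (d[i] + frac * (d[i + 1] - d[i]))
    raise ValueError("cutoff not bracketed by the dilution series")

# Example: two-fold dilutions from 1:8 to 1:1024 and their % neutralization.
dil = [8, 16, 32, 64, 128, 256, 512, 1024]
neut = [98, 95, 90, 72, 48, 25, 10, 4]
print(f"ND50 titer ~ 1:{nd50(dil, neut):.0f}")
```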
