Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.
Bühler, Jonas; von Lieres, Eric; Huber, Gregor J
2018-01-01
Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase in sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule, depending on the required statistical reliability of data acquired by future experiments.
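As an illustration of the rotating measurement scheme described above, the following minimal sketch (not from the paper; the slot length, number of plants and total duration are assumed values) builds an interleaved schedule in which each plant's time series contains temporal gaps while a single detector stays fully occupied:

```python
# Sketch: build an interleaved (rotating) measurement schedule for several
# plants sharing a single detector, so each plant's time series has gaps.
# All parameters (slot length, number of plants, total duration) are
# illustrative assumptions, not values from the paper.

def rotating_schedule(n_plants, slot_min=10, total_min=120):
    """Return {plant_id: [(start, end), ...]} measurement windows in minutes."""
    schedule = {p: [] for p in range(n_plants)}
    t = 0
    p = 0
    while t + slot_min <= total_min:
        schedule[p].append((t, t + slot_min))
        t += slot_min
        p = (p + 1) % n_plants          # rotate to the next plant
    return schedule

if __name__ == "__main__":
    sched = rotating_schedule(n_plants=4)
    for plant, windows in sched.items():
        gaps = len(windows) - 1
        print(f"plant {plant}: {windows} ({gaps} gaps)")
```

The schedules produced this way would then be ranked by the parameter-estimate uncertainties obtained when fitting the mechanistic transport model to the gapped data.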
Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S
2015-11-01
High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput remains a major issue and automation is essential. Throughput is limited both in terms of sample preparation and in the analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with internationally harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOPs) for robotic instruments, avoid data transcription errors during processing, and automate the analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper are intended to bridge current technological gaps and enhance the potential application of DCA for dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.
High-throughput diagnosis of potato cyst nematodes in soil samples.
Reid, Alex; Evans, Fiona; Mulholland, Vincent; Cole, Yvonne; Pickup, Jon
2015-01-01
Potato cyst nematode (PCN) is a damaging soilborne pest of potatoes which can cause major crop losses. In 2010, a new European Union directive (2007/33/EC) on the control of PCN came into force. Under the new directive, seed potatoes can only be planted on land which has been found to be free from PCN infestation following an official soil test. A major consequence of the new directive was the introduction of a new harmonized soil sampling rate resulting in a threefold increase in the number of samples requiring testing. To manage this increase with the same staffing resources, we have replaced the traditional diagnostic methods. A system has been developed for the processing of soil samples, extraction of DNA from float material, and detection of PCN by high-throughput real-time PCR. Approximately 17,000 samples are analyzed each year using this method. This chapter describes the high-throughput processes for the production of float material from soil samples, DNA extraction from the entire float, and subsequent detection and identification of PCN within these samples.
D. Lee Taylor; Michael G. Booth; Jack W. McFarland; Ian C. Herriott; Niall J. Lennon; Chad Nusbaum; Thomas G. Marr
2008-01-01
High throughput sequencing methods are widely used in analyses of microbial diversity but are generally applied to small numbers of samples, which precludes characterization of patterns of microbial diversity across space and time. We have designed a primer-tagging approach that allows pooling and subsequent sorting of numerous samples, which is directed to...
A high-throughput microRNA expression profiling system.
Guo, Yanwen; Mastriano, Stephen; Lu, Jun
2014-01-01
As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low cost, high throughput, high detection specificity, and the flexibility to profile both small and large numbers of samples, this protocol can be adapted to a wide range of laboratory settings.
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (R_dark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of R_dark is essential for agronomic and ecological studies. However, methods currently used to measure R_dark in plant tissues are typically low throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry, to determine accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of R_dark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of R_dark in 138 genotypes of wheat; and (2) quantification of rarely-assessed whole-plant R_dark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute R_dark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided similar values of R_dark to the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of R_dark on multiple samples simultaneously, irrespective of plant or tissue type.
Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F
2010-06-01
The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits is sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.
Jasmine, Farzana; Shinkle, Justin; Sabarinathan, Mekala; Ahsan, Habibul; Pierce, Brandon L; Kibriya, Muhammad G
2018-03-12
Relative telomere length (RTL) is a potential biomarker of aging and risk for chronic disease. Previously, we developed a probe-based RTL assay on the Luminex platform, where probes for the telomere (T) and a reference gene (R) for a given DNA sample were tested in a single well. Here, we describe a method of pooling multiple samples in one well to increase throughput and cost-effectiveness. We used four different microbeads for the same T-probe and four different microbeads for the same R-probe. Each pair of probe sets was hybridized to DNA in separate plates and then pooled in a single plate for all the subsequent steps. We used DNA samples from 60 independent individuals and repeated the assay in multiple batches to test its precision. The precision was good to excellent, with an intraclass correlation coefficient (ICC) of 0.908 (95% CI 0.856-0.942). More than 67% of the variation in the RTL could be explained by sample-to-sample variation; less than 0.1% of the variation was due to batch-to-batch variation and 0.3% was explained by bead-to-bead variation. We increased the throughput of the RTL Luminex assay from 60 to 240 samples per run. The new assay was validated against the original Luminex assay without pooling (r = 0.79, P = 1.44 × 10⁻¹⁵). In an independent set of samples (n = 550), the new assay showed a negative correlation of RTL with age (r = -0.41), a result providing external validation for the method. We describe a novel high-throughput pooled-sample multiplex Luminex assay for RTL with good to excellent precision suitable for large-scale studies. © 2018 Wiley Periodicals, Inc.
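For context, a one-way random-effects ICC of the kind reported above can be computed as in the following sketch; the data are simulated stand-ins for repeated RTL measurements, and the paper's exact variance-component model may differ:

```python
import numpy as np

def icc_oneway(x):
    """
    One-way random-effects ICC(1,1) for a subjects-by-repeats matrix.
    x: 2D array, rows = subjects (DNA samples), columns = repeat measurements
       (e.g. batches).  Purely illustrative; the paper's exact model may differ.
    """
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Example with simulated RTL measurements: 60 samples, 3 repeat batches.
rng = np.random.default_rng(0)
true_rtl = rng.normal(1.0, 0.2, size=(60, 1))
measurements = true_rtl + rng.normal(0.0, 0.05, size=(60, 3))
print(round(icc_oneway(measurements), 3))
```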
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNally, N.; Liu, Xiang Yang; Choudary, P.V.
1997-01-01
The authors describe a microplate-based high-throughput procedure for rapid assay of the enzyme activities of nitrate reductase and nitrite reductase, using extremely small volumes of reagents. The new procedure offers the advantages of rapidity, small sample size (nanoliter volumes), low cost, and a dramatic increase in the number of samples that can be analyzed simultaneously. Additional advantages can be gained by using microplate reader application software packages that permit assigning a group type to the wells, recording the data in exportable data files, and using either kinetic or endpoint reading modes. The assay can also be used independently for detecting nitrite residues/contamination in environmental/food samples. 10 refs., 2 figs.
High-throughput discovery of rare human nucleotide polymorphisms by Ecotilling
Till, Bradley J.; Zerr, Troy; Bowers, Elisabeth; Greene, Elizabeth A.; Comai, Luca; Henikoff, Steven
2006-01-01
Human individuals differ from one another at only ∼0.1% of nucleotide positions, but these single nucleotide differences account for most heritable phenotypic variation. Large-scale efforts to discover and genotype human variation have been limited to common polymorphisms. However, these efforts overlook rare nucleotide changes that may contribute to phenotypic diversity and genetic disorders, including cancer. Thus, there is an increasing need for high-throughput methods to robustly detect rare nucleotide differences. Toward this end, we have adapted the mismatch discovery method known as Ecotilling for the discovery of human single nucleotide polymorphisms. To increase throughput and reduce costs, we developed a universal primer strategy and implemented algorithms for automated band detection. Ecotilling was validated by screening 90 human DNA samples for nucleotide changes in 5 gene targets and by comparing results to public resequencing data. To increase throughput for discovery of rare alleles, we pooled samples 8-fold and found Ecotilling to be efficient relative to resequencing, with a false negative rate of 5% and a false discovery rate of 4%. We identified 28 new rare alleles, including some that are predicted to damage protein function. The detection of rare damaging mutations has implications for models of human disease. PMID:16893952
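Reading the reported error rates with the standard definitions (which the abstract does not spell out, so this is an assumed interpretation):

```latex
\[
\mathrm{FNR} = \frac{FN}{FN + TP} \approx 5\%,
\qquad
\mathrm{FDR} = \frac{FP}{FP + TP} \approx 4\%,
\]
% where TP, FP and FN count rare variants confirmed, falsely called and missed
% by Ecotilling relative to the public resequencing reference data.
```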
Forreryd, Andy; Johansson, Henrik; Albrekt, Ann-Sofie; Lindstedt, Malin
2014-05-16
Allergic contact dermatitis (ACD) develops upon exposure to certain chemical compounds termed skin sensitizers. To reduce the occurrence of skin sensitizers, chemicals are regularly screened for their capacity to induce sensitization. The recently developed Genomic Allergen Rapid Detection (GARD) assay is an in vitro alternative to animal testing for identification of skin sensitizers, classifying chemicals by evaluating transcriptional levels of a genomic biomarker signature. During assay development and biomarker identification, genome-wide expression analysis was applied using microarrays covering approximately 30,000 transcripts. However, the microarray platform suffers from drawbacks in terms of low sample throughput, high cost per sample and time-consuming protocols, and is a limiting factor for adaptation of GARD into a routine assay for screening of potential sensitizers. With the aim of simplifying assay procedures, improving technical parameters and increasing sample throughput, we assessed the performance of three high-throughput gene expression platforms--nCounter®, BioMark HD™ and OpenArray®--and correlated their performance metrics against our previously generated microarray data. We measured the levels of 30 transcripts from the GARD biomarker signature across 48 samples. Detection sensitivity, reproducibility, correlations and overall structure of gene expression measurements were compared across platforms. Gene expression data from all of the evaluated platforms could be used to classify most of the sensitizers from non-sensitizers in the GARD assay. Results also showed high data quality and acceptable reproducibility for all platforms but only medium to poor correlations of expression measurements across platforms. In addition, the evaluated platforms were superior to the microarray platform in terms of cost efficiency, simplicity of protocols and sample throughput. We evaluated the performance of three non-array-based platforms using a limited set of transcripts from the GARD biomarker signature. We demonstrated that it was possible to achieve acceptable discriminatory power in terms of separation between sensitizers and non-sensitizers in the GARD assay while reducing assay costs, simplifying assay procedures and increasing sample throughput by using an alternative platform, providing a first step towards the goal of preparing GARD for formal validation and adapting the assay for industrial screening of potential sensitizers.
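A minimal sketch of the kind of cross-platform comparison described above, computing per-transcript Pearson correlations between two expression matrices (30 transcripts by 48 samples); the matrices here are simulated stand-ins, not GARD data, and the platform names are only labels:

```python
import numpy as np

def per_gene_correlations(platform_a, platform_b):
    """
    Pearson correlation per transcript between two expression matrices of
    shape (n_transcripts, n_samples), e.g. 30 biomarker transcripts by 48
    samples measured on two platforms.
    """
    a = platform_a - platform_a.mean(axis=1, keepdims=True)
    b = platform_b - platform_b.mean(axis=1, keepdims=True)
    num = (a * b).sum(axis=1)
    den = np.sqrt((a ** 2).sum(axis=1) * (b ** 2).sum(axis=1))
    return num / den

# Simulated data: both platforms measure the same underlying expression with
# different noise levels (values are illustrative, not from the study).
rng = np.random.default_rng(1)
truth = rng.normal(size=(30, 48))
microarray = truth + rng.normal(scale=0.3, size=truth.shape)
alt_platform = truth + rng.normal(scale=0.6, size=truth.shape)
r = per_gene_correlations(microarray, alt_platform)
print(f"median cross-platform r: {np.median(r):.2f}")
```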
Quesada-Cabrera, Raul; Weng, Xiaole; Hyett, Geoff; Clark, Robin J H; Wang, Xue Z; Darr, Jawwad A
2013-09-09
High-throughput continuous hydrothermal flow synthesis was used to manufacture 66 unique nanostructured oxide samples in the Ce-Zr-Y-O system. This synthesis approach resulted in a significant increase in throughput compared to that of conventional batch or continuous hydrothermal synthesis methods. The as-prepared library samples were placed into a wellplate for both automated high-throughput powder X-ray diffraction and Raman spectroscopy data collection, which allowed comprehensive structural characterization and phase mapping. The data suggested that a continuous cubic-like phase field connects all three Ce-Zr-O, Ce-Y-O, and Y-Zr-O binary systems together with a smooth and steady transition between the structures of neighboring compositions. The continuous hydrothermal process led to as-prepared crystallite sizes in the range of 2-7 nm (as determined by using the Scherrer equation).
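For reference, the Scherrer estimate mentioned above takes the form sketched below; the shape factor K ≈ 0.9, the Cu K-alpha wavelength and the example peak values are illustrative choices, not numbers from the paper:

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """
    Crystallite size (nm) from the Scherrer equation
    tau = K * lambda / (beta * cos(theta)),
    where beta is the peak FWHM in radians (instrumental broadening ignored).
    """
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

# A broad peak (FWHM ~ 2 degrees at 2-theta ~ 29 degrees) gives a size of a few nm,
# consistent with the 2-7 nm range quoted in the abstract.
print(round(scherrer_size(fwhm_deg=2.0, two_theta_deg=29.0), 1), "nm")
```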
Biomek 3000: the workhorse in an automated accredited forensic genetic laboratory.
Stangegaard, Michael; Meijer, Per-Johan; Børsting, Claus; Hansen, Anders J; Morling, Niels
2012-10-01
We have implemented and validated automated protocols for a wide range of processes such as sample preparation, PCR setup, and capillary electrophoresis setup using small, simple, and inexpensive automated liquid handlers. The flexibility and ease of programming enable the Biomek 3000 to be used in many parts of the laboratory process in a modern forensic genetics laboratory with low to medium sample throughput. In conclusion, we demonstrated that sample processing for accredited forensic genetic DNA typing can be implemented on small automated liquid handlers, leading to the reduction of manual work as well as increased quality and throughput.
NASA Astrophysics Data System (ADS)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi
2010-06-01
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
Integrated crystal mounting and alignment system for high-throughput biological crystallography
Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William F.; Yegian, Derek T.; Earnest, Thomas N.; Jaklevich, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.
2007-09-25
A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.
A high-throughput method for GMO multi-detection using a microfluidic dynamic array.
Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J
2014-02-01
The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organisms (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growing number of GMOs that are potentially present in an individual sample. The present work reports the results of an innovative approach to the DNA-based analysis of genetically modified crops: the use of a microfluidic dynamic array as a high-throughput multi-detection system. In order to evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed in the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested gave the expected results. The power of low-level detection was assessed, and elements present at concentrations as low as 0.06% were successfully detected. The approach proposed in this work presents the Fluidigm system as a suitable and promising platform for GMO multi-detection.
Comprehensive Analysis of Immunological Synapse Phenotypes Using Supported Lipid Bilayers.
Valvo, Salvatore; Mayya, Viveka; Seraia, Elena; Afrose, Jehan; Novak-Kotzer, Hila; Ebner, Daniel; Dustin, Michael L
2017-01-01
Supported lipid bilayers (SLB) formed on glass substrates have been a useful tool for the study of immune cell signaling since the early 1980s. The mobility of lipid-anchored proteins in the system, first described for antibodies binding to synthetic phospholipid head groups, allows for the measurement of two-dimensional binding reactions and signaling processes in a single imaging plane over time or for fixed samples. The fragility of SLB and the challenges of building and validating individual substrates limit most experimenters to ~10 samples per day, perhaps increasing this a few-fold when examining fixed samples. Successful experiments might then require further days to fully analyze. We present methods for automation of many steps in SLB formation, imaging in 96-well glass bottom plates, and analysis that enables a >100-fold increase in throughput for fixed samples and wide-field fluorescence. This increased throughput will allow better coverage of relevant parameters and more comprehensive analysis of aspects of the immunological synapse that are well reconstituted by SLB.
An improved high-throughput lipid extraction method for the analysis of human brain lipids.
Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett
2013-03-01
We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.
Tiersch, Terrence R.; Yang, Huiping; Hu, E.
2011-01-01
With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies have been done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs for high-throughput technology for model fishes, proposes approaches for technical development, and outlines future directions of this approach. PMID:21440666
Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen
2015-04-15
High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and reduces sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well format to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, as sample consumption scales with resin volume and throughput with the number of experiments per microplate, these processes are limited in the costs and time they can save. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and to resin volumes as small as 0.1 μL, is introduced. An HTS batch isotherm process is described, utilizing this new method in combination with optical sample volume quantification for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher-quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
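A minimal sketch of isotherm parameter estimation with bootstrap confidence bounds, in the spirit of the workflow above; the single-component Langmuir form, the concentration grid and the simulated batch data are assumptions, not the authors' screening data or model:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k):
    """Single-component Langmuir isotherm: q = q_max * c / (k + c)."""
    return q_max * c / (k + c)

rng = np.random.default_rng(2)
c = np.linspace(0.1, 10.0, 12)                                 # liquid-phase conc., assumed units
q_obs = langmuir(c, 50.0, 1.5) + rng.normal(0, 1.0, c.size)    # simulated batch uptake data

popt, _ = curve_fit(langmuir, c, q_obs, p0=[40.0, 1.0])

# Residual bootstrap for confidence bounds on (q_max, k).
resid = q_obs - langmuir(c, *popt)
boot = []
for _ in range(500):
    q_star = langmuir(c, *popt) + rng.choice(resid, size=resid.size, replace=True)
    p_star, _ = curve_fit(langmuir, c, q_star, p0=popt)
    boot.append(p_star)
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print("q_max 95% CI:", (round(lo[0], 1), round(hi[0], 1)))
print("k     95% CI:", (round(lo[1], 2), round(hi[1], 2)))
```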
Novel method for the high-throughput processing of slides for the comet assay
Karbaschi, Mahsa; Cooke, Marcus S.
2014-01-01
Single cell gel electrophoresis (the comet assay) continues to gain popularity as a means of assessing DNA damage. However, the assay's low sample throughput and laborious sample workup procedure are limiting factors to its application. “Scoring”, or individually determining DNA damage levels in 50 cells per treatment, is time-consuming, but with the advent of high-throughput scoring, the limitation is now the ability to process significant numbers of comet slides. We have developed a novel method by which multiple slides may be manipulated and undergo electrophoresis in batches of 25 rather than individually, and which, importantly, retains the use of standard microscope comet slides, the assay convention. This decreases assay time by 60%, and benefits from an electrophoresis tank with a substantially smaller footprint and more uniform orientation of gels during electrophoresis. Our high-throughput variant of the comet assay greatly increases the number of samples analysed and decreases assay time, the number of individual slide manipulations, reagent requirements and the risk of damage to slides. The compact nature of the electrophoresis tank is of particular benefit to laboratories where bench space is at a premium. This novel approach is a significant advance on the current comet assay procedure. PMID:25425241
Multiplexed mass cytometry profiling of cellular states perturbed by small-molecule regulators
Bodenmiller, Bernd; Zunder, Eli R.; Finck, Rachel; Chen, Tiffany J.; Savig, Erica S.; Bruggner, Robert V.; Simonds, Erin F.; Bendall, Sean C.; Sachs, Karen; Krutzik, Peter O.; Nolan, Garry P.
2013-01-01
The ability to comprehensively explore the impact of bio-active molecules on human samples at the single-cell level can provide great insight for biomedical research. Mass cytometry enables quantitative single-cell analysis with deep dimensionality, but currently lacks high-throughput capability. Here we report a method termed mass-tag cellular barcoding (MCB) that increases mass cytometry throughput by sample multiplexing. 96-well format MCB was used to characterize human peripheral blood mononuclear cell (PBMC) signaling dynamics, cell-to-cell communication, the signaling variability between 8 donors, and to define the impact of 27 inhibitors on this system. For each compound, 14 phosphorylation sites were measured in 14 PBMC types, resulting in 18,816 quantified phosphorylation levels from each multiplexed sample. This high-dimensional systems-level inquiry allowed analysis across cell-type and signaling space, reclassified inhibitors, and revealed off-target effects. MCB enables high-content, high-throughput screening, with potential applications for drug discovery, pre-clinical testing, and mechanistic investigation of human disease. PMID:22902532
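The quoted figure of 18,816 measurements presumably follows from the plate layout described above (this decomposition is an inference, not stated explicitly in the abstract):

```latex
\[
14 \ \text{phospho-sites} \times 14 \ \text{PBMC types} \times 96 \ \text{barcoded wells}
= 18\,816 \ \text{phosphorylation levels per multiplexed sample.}
\]
```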
A high-throughput semi-automated preparation for filtered synaptoneurosomes.
Murphy, Kathryn M; Balsor, Justin; Beshara, Simon; Siu, Caitlin; Pinto, Joshua G A
2014-09-30
Synaptoneurosomes have become an important tool for studying synaptic proteins. The filtered synaptoneurosomes preparation originally developed by Hollingsworth et al. (1985) is widely used and is an easy method to prepare synaptoneurosomes. The hand processing steps in that preparation, however, are labor intensive and have become a bottleneck for current proteomic studies using synaptoneurosomes. For this reason, we developed new steps for tissue homogenization and filtration that transform the preparation of synaptoneurosomes to a high-throughput, semi-automated process. We implemented a standardized protocol with easy to follow steps for homogenizing multiple samples simultaneously using a FastPrep tissue homogenizer (MP Biomedicals, LLC) and then filtering all of the samples in centrifugal filter units (EMD Millipore, Corp). The new steps dramatically reduce the time to prepare synaptoneurosomes from hours to minutes, increase sample recovery, and nearly double enrichment for synaptic proteins. These steps are also compatible with biosafety requirements for working with pathogen infected brain tissue. The new high-throughput semi-automated steps to prepare synaptoneurosomes are timely technical advances for studies of low abundance synaptic proteins in valuable tissue samples. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.
2013-03-01
Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beam lines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.
High-throughput sequencing: a failure mode analysis.
Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A
2005-01-04
Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.
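A minimal sketch of the kind of failure-mode tally such an analysis implies; the category labels and example records below are hypothetical, not the paper's classification of its 9,216 ESTs:

```python
from collections import Counter

# Hypothetical failure categories: labels are illustrative only.
PROCESS = {"empty_well", "tracking_error", "reagent_failure"}
TEMPLATE = {"short_insert", "hairpin", "poly_a_slippage"}

def summarize(outcomes):
    """outcomes: iterable of per-well status strings ('pass' or a failure label)."""
    counts = Counter(outcomes)
    total = sum(counts.values())
    failed = total - counts.get("pass", 0)
    process = sum(v for k, v in counts.items() if k in PROCESS)
    template = sum(v for k, v in counts.items() if k in TEMPLATE)
    return {
        "failure_rate": failed / total,
        "process_related": process / total,
        "template_related": template / total,
    }

example = ["pass"] * 90 + ["empty_well"] * 4 + ["hairpin"] * 6
print(summarize(example))
```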
Erickson, Heidi S
2012-09-28
The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.
Huang, Dejian; Ou, Boxin; Hampsch-Woodill, Maureen; Flanagan, Judith A; Prior, Ronald L
2002-07-31
The oxygen radical absorbance capacity (ORAC) assay has been widely accepted as a standard tool to measure the antioxidant activity in the nutraceutical, pharmaceutical, and food industries. However, the ORAC assay has been criticized for a lack of accessibility due to the unavailability of the COBAS FARA II analyzer, an instrument discontinued by the manufacturer. In addition, the manual sample preparation is time-consuming and labor-intensive. The objective of this study was to develop a high-throughput instrument platform that can fully automate the ORAC assay procedure. The new instrument platform consists of a robotic eight-channel liquid handling system and a microplate fluorescence reader. By using the high-throughput platform, the efficiency of the assay is improved with at least a 10-fold increase in sample throughput over the current procedure. The mean of intra- and interday CVs was
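To illustrate the quantity the automated platform reads out, the following sketch computes a net area under the fluorescence-decay curve, a common ORAC convention; the decay curves are simulated and the exact calculation used on the robotic platform is not specified in the abstract:

```python
import numpy as np

def _auc(y):
    """Trapezoidal area under y, assuming readings at equal time intervals."""
    y = np.asarray(y, dtype=float)
    return float(np.sum((y[1:] + y[:-1]) / 2.0))

def orac_net_auc(sample, blank):
    """
    Net area under the fluorescence-decay curve, each curve normalised to its
    initial reading.  The net AUC would then be converted to Trolox
    equivalents via a standard curve (not shown here).
    """
    s = np.asarray(sample, dtype=float)
    b = np.asarray(blank, dtype=float)
    return _auc(s / s[0]) - _auc(b / b[0])

# Simulated decay curves: the antioxidant-protected well decays more slowly,
# so its net AUC relative to the blank is larger.
t = np.arange(0, 60, 2.0)                 # minutes, illustrative
protected = np.exp(-t / 40.0)
blank = np.exp(-t / 10.0)
print(round(orac_net_auc(protected, blank), 2))
```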
Liu, Chao; Xue, Chundong; Chen, Xiaodong; Shan, Lei; Tian, Yu; Hu, Guoqing
2015-06-16
Viscoelasticity-induced particle migration has recently received increasing attention due to its ability to obtain high-quality focusing over a wide range of flow rates. However, its application has been limited to the low-throughput regime since particles can defocus as the flow rate increases. By using an engineered carrier medium with constant, low viscosity and strong elasticity, sample flow rates are increased to an order of magnitude higher than those in existing studies. Utilizing differential focusing of particles of different sizes, we present sheathless particle/cell separation in simple straight microchannels that possess excellent parallelizability for further throughput enhancement. The present method can be implemented over a wide range of particle/cell sizes and flow rates. We successfully separate small particles from larger particles, MCF-7 cells from red blood cells (RBCs), and Escherichia coli (E. coli) bacteria from RBCs in different straight microchannels. The proposed method could broaden the applications of viscoelastic microfluidic devices to particle/cell separation due to the enhanced sample throughput and simple channel design.
Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A
2017-08-01
Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of <12%. Using isobaric tags for relative and absolute quantitation (iTRAQ) 4-plex, >4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.
USDA-ARS's Scientific Manuscript database
Next generation fungal amplicon sequencing is being used with increasing frequency to study fungal diversity in various ecosystems; however, the influence of sample preparation on the characterization of fungal community is poorly understood. We investigated the effects of four procedural modificati...
Optimisation of wavelength modulated Raman spectroscopy: towards high throughput cell screening.
Praveen, Bavishna B; Mazilu, Michael; Marchington, Robert F; Herrington, C Simon; Riches, Andrew; Dholakia, Kishan
2013-01-01
In the field of biomedicine, Raman spectroscopy is a powerful technique to discriminate between normal and cancerous cells. However, the strong background signal from the sample and the instrumentation affects the efficiency of this discrimination technique. Wavelength Modulated Raman Spectroscopy (WMRS) may suppress the background from the Raman spectra. In this study, we demonstrate a systematic approach for optimizing the various parameters of WMRS to achieve a reduction in the acquisition time for potential applications such as higher-throughput cell screening. The signal-to-noise ratio (SNR) of the Raman bands depends on the modulation amplitude, time constant and total acquisition time. It was observed that the sampling rate does not influence the SNR of the Raman bands if three or more wavelengths are sampled. With these optimised WMRS parameters, we increased the throughput in the binary classification of normal human urothelial cells and bladder cancer cells by reducing the total acquisition time to 6 s, which is significantly lower than the acquisition times previously required to discriminate between similar cell types.
Combinatorial and high-throughput approaches in polymer science
NASA Astrophysics Data System (ADS)
Zhang, Huiqi; Hoogenboom, Richard; Meier, Michael A. R.; Schubert, Ulrich S.
2005-01-01
Combinatorial and high-throughput approaches have become topics of great interest in the last decade due to their potential ability to significantly increase research productivity. Recent years have witnessed a rapid extension of these approaches in many areas of the discovery of new materials, including pharmaceuticals, inorganic materials, catalysts and polymers. This paper mainly highlights our progress in polymer research by using an automated parallel synthesizer, microwave synthesizer and ink-jet printer. The equipment and methodologies in our experiments, the high-throughput experimentation of different polymerizations (such as atom transfer radical polymerization, cationic ring-opening polymerization and emulsion polymerization) and the automated matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) sample preparation are described.
Oono, Ryoko
2017-01-01
High-throughput sequencing technology has helped microbial community ecologists explore ecological and evolutionary patterns at unprecedented scales. The benefits of a large sample size still typically outweigh that of greater sequencing depths per sample for accurate estimations of ecological inferences. However, excluding or not sequencing rare taxa may mislead the answers to the questions 'how and why are communities different?' This study evaluates the confidence intervals of ecological inferences from high-throughput sequencing data of foliar fungal endophytes as case studies through a range of sampling efforts, sequencing depths, and taxonomic resolutions to understand how technical and analytical practices may affect our interpretations. Increasing sampling size reliably decreased confidence intervals across multiple community comparisons. However, the effects of sequencing depths on confidence intervals depended on how rare taxa influenced the dissimilarity estimates among communities and did not significantly decrease confidence intervals for all community comparisons. A comparison of simulated communities under random drift suggests that sequencing depths are important in estimating dissimilarities between microbial communities under neutral selective processes. Confidence interval analyses reveal important biases as well as biological trends in microbial community studies that otherwise may be ignored when communities are only compared for statistically significant differences.
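A minimal sketch of how confidence intervals for a community dissimilarity can be obtained by resampling counts to a fixed sequencing depth, in the spirit of the analysis above; the communities, depths and the choice of Bray-Curtis dissimilarity are simulated stand-ins, not the endophyte data:

```python
import numpy as np

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two taxon count vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.abs(x - y).sum() / (x + y).sum()

def bootstrap_ci(counts_a, counts_b, depth, n_boot=1000, seed=0):
    """Resample each community to a fixed depth; return a 95% interval."""
    rng = np.random.default_rng(seed)
    pa = counts_a / counts_a.sum()
    pb = counts_b / counts_b.sum()
    stats = [bray_curtis(rng.multinomial(depth, pa), rng.multinomial(depth, pb))
             for _ in range(n_boot)]
    return np.percentile(stats, [2.5, 97.5])

rng = np.random.default_rng(1)
a = rng.integers(0, 50, size=200).astype(float)      # 200 OTUs, community A
b = a + rng.integers(0, 20, size=200)                # community B, similar but not identical
for depth in (500, 5000, 50000):
    lo, hi = bootstrap_ci(a, b, depth)
    print(f"depth {depth:>6}: 95% CI = ({lo:.3f}, {hi:.3f})")
```

Running the sketch shows the interval narrowing as depth increases, which mirrors the paper's point that the benefit of deeper sequencing depends on how strongly rare taxa drive the dissimilarity estimate.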
An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films
NASA Astrophysics Data System (ADS)
Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander
Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn, and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.
Strategies for high-throughput focused-beam ptychography
Jacobsen, Chris; Deng, Junjing; Nashed, Youssef
2017-08-08
X-ray ptychography is being utilized for a wide range of imaging experiments with a resolution beyond the limit of the X-ray optics used. Introducing a parameter for the ptychographic resolution gain G_p (the ratio of the beam size over the achieved pixel size in the reconstructed image), strategies for data sampling and for increasing imaging throughput when the specimen is at the focus of an X-ray beam are considered. As a result, the tradeoffs between large and small illumination spots are examined.
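Restating the definition given above in equation form (the numerical example is illustrative only):

```latex
\[
G_p \;=\; \frac{\text{illumination (beam) size}}{\text{reconstructed pixel size}},
\qquad\text{e.g. } \frac{50\ \text{nm}}{5\ \text{nm}} = 10 .
\]
```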
Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E
2011-12-02
The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments.
High-throughput SRCD using multi-well plates and its applications
NASA Astrophysics Data System (ADS)
Hussain, Rohanah; Jávorfi, Tamás; Rudd, Timothy R.; Siligardi, Giuliano
2016-12-01
The sample compartment for high-throughput synchrotron radiation circular dichroism (HT-SRCD) has been developed to satisfy an increased demand for protein characterisation, in terms of folding and binding interaction properties, not only in the traditional field of structural biology but also in the growing research area of materials science, with the potential to reduce measurement time by 80%. As the understanding of protein behaviour in different solvent environments has increased dramatically, and as novel functions are engineered, from recombinant proteins modified to harvest solar energy to metabolomics-guided approaches for remediating heavy-metal and organic-molecule pollution, there is a need to characterise these systems rapidly.
Large-Scale Biomonitoring of Remote and Threatened Ecosystems via High-Throughput Sequencing
Gibson, Joel F.; Shokralla, Shadi; Curry, Colin; Baird, Donald J.; Monk, Wendy A.; King, Ian; Hajibabaei, Mehrdad
2015-01-01
Biodiversity metrics are critical for assessment and monitoring of ecosystems threatened by anthropogenic stressors. Existing sorting and identification methods are too expensive and labour-intensive to be scaled up to meet management needs. Alternately, a high-throughput DNA sequencing approach could be used to determine biodiversity metrics from bulk environmental samples collected as part of a large-scale biomonitoring program. Here we show that both morphological and DNA sequence-based analyses are suitable for recovery of individual taxonomic richness, estimation of proportional abundance, and calculation of biodiversity metrics using a set of 24 benthic samples collected in the Peace-Athabasca Delta region of Canada. The high-throughput sequencing approach was able to recover all metrics with a higher degree of taxonomic resolution than morphological analysis. The reduced cost and increased capacity of DNA sequence-based approaches will finally allow environmental monitoring programs to operate at the geographical and temporal scale required by industrial and regulatory end-users. PMID:26488407
Beeman, Katrin; Baumgärtner, Jens; Laubenheimer, Manuel; Hergesell, Karlheinz; Hoffmann, Martin; Pehl, Ulrich; Fischer, Frank; Pieck, Jan-Carsten
2017-12-01
Mass spectrometry (MS) is known for its label-free detection of substrates and products from a variety of enzyme reactions. Recent hardware improvements have increased interest in the use of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS for high-throughput drug discovery. Despite interest in this technology, several challenges remain and must be overcome before MALDI-MS can be integrated as an automated "in-line reader" for high-throughput drug discovery. Two such hurdles include in situ sample processing and deposition, as well as integration of MALDI-MS for enzymatic screening assays that usually contain high levels of MS-incompatible components. Here we adapt our c-MET kinase assay to optimize for MALDI-MS compatibility and test its feasibility for compound screening. The pros and cons of the Echo (Labcyte) as a transfer system for in situ MALDI-MS sample preparation are discussed. We demonstrate that this method generates robust data in a 1536-grid format. We use the MALDI-MS to directly measure the ratio of c-MET substrate and phosphorylated product to acquire IC50 curves and demonstrate that the pharmacology is unaffected. The resulting IC50 values correlate well between the common label-based capillary electrophoresis and the label-free MALDI-MS detection method. We predict that label-free MALDI-MS-based high-throughput screening will become increasingly important and more widely used for drug discovery.
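As an illustration of the readout described above, the sketch below converts a MALDI-measured substrate/product pair into percent conversion and fits a four-parameter logistic curve to estimate an IC50. It is a minimal sketch: the concentrations, responses, and function names are invented for illustration and are not from the paper.

import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    # Four-parameter logistic dose-response curve
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Illustrative data: inhibitor concentrations (uM) and MALDI-derived
# percent conversion = product / (product + substrate) * 100
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
pct_conv = np.array([92.0, 88.0, 75.0, 52.0, 28.0, 12.0, 6.0])

params, _ = curve_fit(four_pl, conc, pct_conv, p0=[5.0, 95.0, 0.3, 1.0])
print("Estimated IC50 (uM):", params[2])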
A recently developed, commercially available, open-air, surface sampling ion source for mass spectrometers provides individual analyses in several seconds. To realize its full throughput potential, an autosampler and field sample carrier were designed and built. The autosampler ...
Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas
2014-10-21
In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R(2) ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R(2) of 0.998) and precision (RSD of 9%). In conclusion, we present the proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.
Oran, Paul E.; Trenchevska, Olgica; Nedelkov, Dobrin; Borges, Chad R.; Schaab, Matthew R.; Rehder, Douglas S.; Jarvis, Jason W.; Sherma, Nisha D.; Shen, Luhui; Krastins, Bryan; Lopez, Mary F.; Schwenke, Dawn C.; Reaven, Peter D.; Nelson, Randall W.
2014-01-01
Insulin-like growth factor 1 (IGF1) is an important biomarker for the management of growth hormone disorders. Recently there has been rising interest in deploying mass spectrometric (MS) methods of detection for measuring IGF1. However, widespread clinical adoption of any MS-based IGF1 assay will require increased throughput and speed to justify the costs of analyses, and robust industrial platforms that are reproducible across laboratories. Presented here is an MS-based quantitative IGF1 assay with performance rating of >1,000 samples/day, and a capability of quantifying IGF1 point mutations and posttranslational modifications. The throughput of the IGF1 mass spectrometric immunoassay (MSIA) benefited from a simplified sample preparation step, IGF1 immunocapture in a tip format, and high-throughput MALDI-TOF MS analysis. The Limit of Detection and Limit of Quantification of the resulting assay were 1.5 μg/L and 5 μg/L, respectively, with intra- and inter-assay precision CVs of less than 10%, and good linearity and recovery characteristics. The IGF1 MSIA was benchmarked against commercially available IGF1 ELISA via Bland-Altman method comparison test, resulting in a slight positive bias of 16%. The IGF1 MSIA was employed in an optimized parallel workflow utilizing two pipetting robots and MALDI-TOF-MS instruments synced into one-hour phases of sample preparation, extraction and MSIA pipette tip elution, MS data collection, and data processing. Using this workflow, high-throughput IGF1 quantification of 1,054 human samples was achieved in approximately 9 hours. This rate of assaying is a significant improvement over existing MS-based IGF1 assays, and is on par with that of the enzyme-based immunoassays. Furthermore, a mutation was detected in ∼1% of the samples (SNP: rs17884626, creating an A→T substitution at position 67 of the IGF1), demonstrating the capability of IGF1 MSIA to detect point mutations and posttranslational modifications. PMID:24664114
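A minimal sketch of the Bland-Altman comparison mentioned above, using invented paired IGF1 values; it does not reproduce the reported 16% bias, it only shows how such a bias and limits of agreement are typically computed.

import numpy as np

# Illustrative paired IGF1 measurements (ug/L) from the two methods
msia = np.array([110.0, 150.0, 200.0, 95.0, 310.0])
elisa = np.array([100.0, 130.0, 180.0, 85.0, 270.0])

diff = msia - elisa
mean = (msia + elisa) / 2.0
bias = diff.mean()                      # mean difference (absolute bias)
loa = (bias - 1.96 * diff.std(ddof=1),  # 95% limits of agreement
       bias + 1.96 * diff.std(ddof=1))
pct_bias = 100.0 * (diff / mean).mean() # relative bias, in percent
print(bias, loa, pct_bias)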
Farhoud, Murtada H; Wessels, Hans J C T; Wevers, Ron A; van Engelen, Baziel G; van den Heuvel, Lambert P; Smeitink, Jan A
2005-01-01
In 2D-based comparative proteomics of scarce samples, such as limited patient material, established methods for prefractionation and subsequent use of different narrow range IPG strips to increase overall resolution are difficult to apply. Also, a high number of samples, a prerequisite for drawing meaningful conclusions when pathological and control samples are considered, will increase the associated amount of work almost exponentially. Here, we introduce a novel, effective, and economic method designed to obtain maximum 2D resolution while maintaining the high throughput necessary to perform large-scale comparative proteomics studies. The method is based on connecting different IPG strips serially head-to-tail so that a complete line of different IPG strips with sequential pH regions can be focused in the same experiment. We show that when 3 IPG strips (covering together the pH range of 3-11) are connected head-to-tail an optimal resolution is achieved along the whole pH range. Sample consumption, time required, and associated costs are reduced by almost 70%, and the workload is reduced significantly.
Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco
2016-02-09
Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher-throughput devices for small-scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliable measurement of large sets of samples within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid handling robot to provide reproducible high-throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red, with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96-well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids, avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability as well as experimental throughput, while simultaneously reducing the needed hands-on time to a third. Thereby, the presented protocol meets the demands for the analysis of samples generated by the upcoming generation of devices for higher-throughput phototrophic cultivation and thereby contributes to boosting the time efficiency of setting up algal lipid production processes.
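The gravimetric calibration step described above amounts to regressing extractively determined lipid content against Nile red fluorescence and then applying that regression to unknowns; the sketch below uses illustrative numbers, not the authors' data.

import numpy as np

# Illustrative calibration: gravimetric lipid content (g/L) vs. Nile red
# fluorescence (arbitrary units) measured on the same cultures
lipid_g_per_l = np.array([0.1, 0.3, 0.6, 1.0, 1.5])
fluorescence = np.array([1200.0, 3500.0, 7100.0, 11800.0, 17600.0])

slope, intercept = np.polyfit(fluorescence, lipid_g_per_l, 1)

def lipid_from_fluorescence(f):
    # Absolute lipid estimate for an unknown sample from its fluorescence
    return slope * f + intercept

print(lipid_from_fluorescence(9000.0))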
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orton, Daniel J.; Tfaily, Malak M.; Moore, Ronald J.
To better understand disease conditions and environmental perturbations, multi-omic studies (i.e. proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover as long as washing buffers specific to each molecular analysis were utilized.
Performance Evaluation of the Sysmex CS-5100 Automated Coagulation Analyzer.
Chen, Liming; Chen, Yu
2015-01-01
Coagulation testing is widely applied clinically, and laboratories increasingly demand automated coagulation analyzers with short turnaround times and high throughput. The purpose of this study was to evaluate the performance of the Sysmex CS-5100 automated coagulation analyzer for routine use in a clinical laboratory. The prothrombin time (PT), international normalized ratio (INR), activated partial thromboplastin time (APTT), fibrinogen (Fbg), and D-dimer were compared between the Sysmex CS-5100 and Sysmex CA-7000 analyzers, and the imprecision, comparison, throughput, STAT function, and performance for abnormal samples were measured in each. The within-run and between-run coefficients of variation (CV) for the PT, APTT, INR, and D-dimer analyses showed excellent results in both the normal and pathologic ranges. Results from the Sysmex CS-5100 and Sysmex CA-7000 were highly correlated. The throughput of the Sysmex CS-5100 was faster than that of the Sysmex CA-7000. There was no interference by total bilirubin or triglyceride concentrations in the Sysmex CS-5100 analyzer. We demonstrated that the Sysmex CS-5100 performs with satisfactory imprecision and is well suited for coagulation analysis in laboratories processing large sample numbers and icteric and lipemic samples.
NASA Astrophysics Data System (ADS)
McLagan, David S.; Huang, Haiyong; Lei, Ying D.; Wania, Frank; Mitchell, Carl P. J.
2017-07-01
Analysis of high sulphur-containing samples for total mercury content using automated thermal decomposition, amalgamation, and atomic absorption spectroscopy instruments (USEPA Method 7473) leads to rapid and costly SO2 poisoning of catalysts. In an effort to overcome this issue, we tested whether the addition of powdered sodium carbonate (Na2CO3) to the catalyst and/or directly on top of sample material increases throughput of sulphur-impregnated (8-15 wt%) activated carbon samples per catalyst tube. Adding 5 g of Na2CO3 to the catalyst alone only marginally increases the functional lifetime of the catalyst (31 ± 4 g of activated carbon analyzed per catalyst tube) in relation to unaltered catalyst of the AMA254 total mercury analyzer (17 ± 4 g of activated carbon). Adding ≈ 0.2 g of Na2CO3 to samples substantially increases (81 ± 17 g of activated carbon) catalyst life over the unaltered catalyst. The greatest improvement is achieved by adding Na2CO3 to both catalyst and samples (200 ± 70 g of activated carbon), which significantly increases catalyst performance over all other treatments and enables an order of magnitude greater sample throughput than the unaltered samples and catalyst. It is likely that Na2CO3 efficiently sequesters SO2, even at high furnace temperatures to produce Na2SO4 and CO2, largely negating the poisonous impact of SO2 on the catalyst material. Increased corrosion of nickel sampling boats resulting from this methodological variation is easily resolved by substituting quartz boats. Overall, this variation enables an efficient and significantly more affordable means of employing automated atomic absorption spectrometry instruments for total mercury analysis of high-sulphur matrices.
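The sequestration chemistry suggested by the authors (Na2CO3 consuming SO2 to give Na2SO4 and CO2) can be summarized by the overall reaction below; the oxygen term is added here only to balance the equation and is an assumption, not a mechanism demonstrated in the paper.

\mathrm{Na_2CO_3 + SO_2 + \tfrac{1}{2}\,O_2 \;\longrightarrow\; Na_2SO_4 + CO_2}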
A Mathematical Approach for Compiling and Optimizing Hardware Implementations of DSP Transforms
2010-08-01
[Figure data removed: plots of throughput (billion samples per second) and performance (Gflop/s) versus area (slices) for a 64-point floating-point DFT on a Xilinx Virtex-6 FPGA.]
A continuous high-throughput bioparticle sorter based on 3D traveling-wave dielectrophoresis.
Cheng, I-Fang; Froude, Victoria E; Zhu, Yingxi; Chang, Hsueh-Chia; Chang, Hsien-Chang
2009-11-21
We present a high throughput (maximum flow rate approximately 10 microl/min or linear velocity approximately 3 mm/s) continuous bio-particle sorter based on 3D traveling-wave dielectrophoresis (twDEP) at an optimum AC frequency of 500 kHz. The high throughput sorting is achieved with a sustained twDEP particle force normal to the continuous through-flow, which is applied over the entire chip by a single 3D electrode array. The design allows continuous fractionation of micron-sized particles into different downstream sub-channels based on differences in their twDEP mobility on both sides of the cross-over. Conventional DEP is integrated upstream to focus the particles into a single levitated queue to allow twDEP sorting by mobility difference and to minimize sedimentation and field-induced lysis. The 3D electrode array design minimizes the offsetting effect of nDEP (negative DEP with particle force towards regions with weak fields) on twDEP such that both forces increase monotonically with voltage to further increase the throughput. Effective focusing and separation of red blood cells from debris-filled heterogeneous samples are demonstrated, as well as size-based separation of poly-dispersed liposome suspensions into two distinct bands at 2.3 to 4.6 microm and 1.5 to 2.7 microm, at the highest throughput recorded in hand-held chips of 6 microl/min.
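For orientation, the dielectrophoretic forces exploited here are conventionally written in terms of the Clausius-Mossotti factor K(ω); the expressions below are the standard textbook forms for a spherical particle of radius r in a medium of permittivity ε_m, not equations taken from this paper.

F_{\mathrm{DEP}} = 2\pi\varepsilon_m r^3\,\operatorname{Re}[K(\omega)]\,\nabla|E|^2,
\qquad
F_{\mathrm{twDEP}} \propto \operatorname{Im}[K(\omega)],
\qquad
K(\omega) = \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}{\varepsilon_p^{*} + 2\varepsilon_m^{*}}

Conventional DEP thus acts through the real part of K(ω), while the traveling-wave component used for continuous sorting scales with its imaginary part, which is why particles with different dielectric properties acquire different twDEP mobilities.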
Microreactor Cells for High-Throughput X-ray Absorption Spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beesley, Angela; Tsapatsaris, Nikolaos; Weiher, Norbert
2007-01-19
High-throughput experimentation has been applied to X-ray Absorption spectroscopy as a novel route for increasing research productivity in the catalysis community. Suitable instrumentation has been developed for the rapid determination of the local structure in the metal component of precursors for supported catalysts. An automated analytical workflow was implemented that is much faster than traditional individual spectrum analysis. It allows the generation of structural data in quasi-real time. We describe initial results obtained from the automated high throughput (HT) data reduction and analysis of a sample library implemented through the 96 well-plate industrial standard. The results show that a fullymore » automated HT-XAS technology based on existing industry standards is feasible and useful for the rapid elucidation of geometric and electronic structure of materials.« less
Edwards, Bonnie; Lesnick, John; Wang, Jing; Tang, Nga; Peters, Carl
2016-02-01
Epigenetics continues to emerge as an important target class for drug discovery and cancer research. As programs scale to evaluate many new targets related to epigenetic expression, new tools and techniques are required to enable efficient and reproducible high-throughput epigenetic screening. Assay miniaturization increases screening throughput and reduces operating costs. Echo liquid handlers can transfer compounds, samples, reagents, and beads in submicroliter volumes to high-density assay formats using only acoustic energy-no contact or tips required. This eliminates tip costs and reduces the risk of reagent carryover. In this study, we demonstrate the miniaturization of a methyltransferase assay using Echo liquid handlers and two different assay technologies: AlphaLISA from PerkinElmer and EPIgeneous HTRF from Cisbio. © 2015 Society for Laboratory Automation and Screening.
Metabolomic technologies are increasingly being applied to study biological questions in a range of different settings from clinical through to environmental. As with other high-throughput technologies, such as those used in transcriptomics and proteomics, metabolomics continues...
Narcise, Cristine Ingrid S; Coo, Lilibeth Dlc; Del Mundo, Florian R
2005-12-15
A flow injection-column preconcentration-hydride generation atomic absorption spectrophotometric (FI-column-HGAAS) method was developed for determining µg/l levels of As(III) and As(V) in water samples, with simultaneous preconcentration and speciation. The speciation scheme involved determining As(V) at neutral pH and As(III+V) at pH 12, with As(III) obtained by difference. The enrichment factor (EF) increased with increase in sample loading volume from 2.5 to 10 ml, and for preconcentration using the chloride-form anion exchange column, EFs ranged from 5 to 48 for As(V) and 4 to 24 for As(III+V), with corresponding detection limits of 0.03-0.3 and 0.07-0.3 µg/l. Linear concentration range (LCR) also varied with sample loading volume, and for a 5-ml sample was 0.3-5 and 0.2-8 µg/l for As(V) and As(III+V), respectively. Sample throughput, which decreased with increase in sample volume, was 8-17 samples/h. For the hydroxide-form column, the EFs for 2.5-10 ml samples were 3-23 for As(V) and 2-15 for As(III+V), with corresponding detection limits of 0.07-0.4 and 0.1-0.5 µg/l. The LCR for a 5-ml sample was 0.3-10 µg/l for As(V) and 0.2-20 µg/l for As(III+V). Sample throughput was 10-20 samples/h. The developed method has been effectively applied to tap water and mineral water samples, with recoveries ranging from 90 to 102% for 5-ml samples passed through the two columns.
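The speciation-by-difference scheme and the enrichment factor can be expressed in a few lines; the values below are illustrative, and the enrichment factor is written here as a ratio of calibration sensitivities, which is one common convention and may differ from the paper's exact definition.

# Speciation by difference: As(V) is measured at neutral pH, total
# inorganic As(III+V) at pH 12, and As(III) follows by subtraction.
as_v = 1.8          # ug/L, illustrative
as_total = 2.5      # ug/L As(III+V), illustrative
as_iii = as_total - as_v

# Enrichment factor of the preconcentration step, taken here as the
# ratio of calibration slopes with and without the column.
slope_with_column = 0.048
slope_direct = 0.004
enrichment_factor = slope_with_column / slope_direct
print(as_iii, enrichment_factor)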
Selecting the most appropriate time points to profile in high-throughput studies
Kleyman, Michael; Sefer, Emre; Nicola, Teodora; Espinoza, Celia; Chhabra, Divya; Hagood, James S; Kaminski, Naftali; Ambalavanan, Namasivayam; Bar-Joseph, Ziv
2017-01-01
Biological systems are increasingly being studied by high throughput profiling of molecular data over time. Determining the set of time points to sample in studies that profile several different types of molecular data is still challenging. Here we present the Time Point Selection (TPS) method that solves this combinatorial problem in a principled and practical way. TPS utilizes expression data from a small set of genes sampled at a high rate. As we show by applying TPS to study mouse lung development, the points selected by TPS can be used to reconstruct an accurate representation for the expression values of the non selected points. Further, even though the selection is only based on gene expression, these points are also appropriate for representing a much larger set of protein, miRNA and DNA methylation changes over time. TPS can thus serve as a key design strategy for high throughput time series experiments. Supporting Website: www.sb.cs.cmu.edu/TPS DOI: http://dx.doi.org/10.7554/eLife.18541.001 PMID:28124972
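To make the combinatorial problem concrete, the sketch below greedily selects time points so that linear interpolation through the chosen points best reconstructs densely sampled pilot gene profiles. This is a deliberately simplified illustration of the idea, not the authors' TPS algorithm, and all names and data are invented.

import numpy as np

def select_time_points(times, profiles, k):
    """Greedy sketch: choose k time points so that linear interpolation
    through the chosen points best reconstructs densely sampled pilot
    profiles (rows of `profiles`, columns aligned with `times`)."""
    chosen = [0, len(times) - 1]          # always keep the endpoints
    while len(chosen) < k:
        best_err, best_idx = np.inf, None
        for cand in range(len(times)):
            if cand in chosen:
                continue
            trial = sorted(chosen + [cand])
            recon = np.array([np.interp(times, times[trial], p[trial])
                              for p in profiles])
            err = np.mean((recon - profiles) ** 2)
            if err < best_err:
                best_err, best_idx = err, cand
        chosen.append(best_idx)
    return sorted(chosen)

# Illustrative use with synthetic pilot profiles
t = np.linspace(0, 10, 21)
pilot = np.vstack([np.sin(t), np.exp(-t / 3.0), t / 10.0])
print(select_time_points(t, pilot, k=6))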
Modified Pressure System for Imaging Egg Cracks
USDA-ARS's Scientific Manuscript database
One aspect of grading table eggs is shell checks or cracks. Currently, USDA voluntary regulations require that humans grade a representative sample of all eggs processed. However, as processing plants and packing facilities continue to increase their volume and throughput, human graders are having ...
High-Throughput Industrial Coatings Research at The Dow Chemical Company.
Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T
2016-09-12
At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., prepare, process, and evaluate samples in parallel), and miniaturization (i.e., reduce sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, surface defects, among others have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.
Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime
NASA Astrophysics Data System (ADS)
Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie
2017-09-01
Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
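The lifetime readout underlying such an assay reduces to fitting a decay model to a photon-arrival histogram; the sketch below fits a mono-exponential to synthetic TCSPC data and ignores instrument-response convolution, so it is an illustration rather than the authors' analysis pipeline.

import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, tau, background):
    # Mono-exponential fluorescence decay plus constant background
    return amplitude * np.exp(-t / tau) + background

# Synthetic TCSPC histogram: 4 ns lifetime with Poisson counting noise
t = np.linspace(0, 25, 256)              # ns
rng = np.random.default_rng(0)
counts = rng.poisson(decay(t, 1000.0, 4.0, 5.0))

params, _ = curve_fit(decay, t, counts, p0=[800.0, 3.0, 1.0])
print("Fitted lifetime (ns):", params[1])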
High-Throughput Analysis and Automation for Glycomics Studies.
Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also expanding interest in the use of glycomics, for example in Genome Wide Association Studies, to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.
Next generation platforms for high-throughput biodosimetry
Repin, Mikhail; Turner, Helen C.; Garty, Guy; Brenner, David J.
2014-01-01
Here the general concept of the combined use of plates and tubes in racks compatible with the American National Standards Institute/the Society for Laboratory Automation and Screening microplate formats as the next generation platforms for increasing the throughput of biodosimetry assays was described. These platforms can be used at different stages of biodosimetry assays starting from blood collection into microtubes organised in standardised racks and ending with the cytogenetic analysis of samples in standardised multiwell and multichannel plates. Robotically friendly platforms can be used for different biodosimetry assays in minimally equipped laboratories and on cost-effective automated universal biotech systems. PMID:24837249
Micropatterned comet assay enables high throughput and sensitive DNA damage quantification
Ge, Jing; Chow, Danielle N.; Fessler, Jessica L.; Weingeist, David M.; Wood, David K.; Engelward, Bevin P.
2015-01-01
The single cell gel electrophoresis assay, also known as the comet assay, is a versatile method for measuring many classes of DNA damage, including base damage, abasic sites, single strand breaks and double strand breaks. However, limited throughput and difficulties with reproducibility have limited its utility, particularly for clinical and epidemiological studies. To address these limitations, we created a microarray comet assay. The use of a micrometer scale array of cells increases the number of analysable comets per square centimetre and enables automated imaging and analysis. In addition, the platform is compatible with standard 24- and 96-well plate formats. Here, we have assessed the consistency and sensitivity of the microarray comet assay. We showed that the linear detection range for H2O2-induced DNA damage in human lymphoblastoid cells is between 30 and 100 μM, and that within this range, inter-sample coefficient of variance was between 5 and 10%. Importantly, only 20 comets were required to detect a statistically significant induction of DNA damage for doses within the linear range. We also evaluated sample-to-sample and experiment-to-experiment variation and found that for both conditions, the coefficient of variation was lower than what has been reported for the traditional comet assay. Finally, we also show that the assay can be performed using a 4× objective (rather than the standard 10× objective for the traditional assay). This adjustment combined with the microarray format makes it possible to capture more than 50 analysable comets in a single image, which can then be automatically analysed using in-house software. Overall, throughput is increased more than 100-fold compared to the traditional assay. Together, the results presented here demonstrate key advances in comet assay technology that improve the throughput, sensitivity, and robustness, thus enabling larger scale clinical and epidemiological studies. PMID:25527723
Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E
2014-01-01
Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reaction (qPCR) has become the gold standard for quantifying gene expression. Microfluidic, next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
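One way to write such an input-quantity-normalized, efficiency-corrected quantity is shown below; this is an illustrative formulation consistent with the description above, not necessarily the authors' exact equation.

Q_{\mathrm{sample}} = \frac{E^{\,Cq_{\mathrm{ref}} - Cq_{\mathrm{sample}}}}{N_{\mathrm{cells}}}

where E is the amplification efficiency of the assay (E = 2 for perfect doubling), Cq_ref is the quantification cycle of the universal reference cDNA run alongside the samples, and N_cells is the number of input cells; dividing by N_cells ties the result to the input quantity rather than to a control gene.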
Wetmore, Kelly M.; Price, Morgan N.; Waters, Robert J.; Lamson, Jacob S.; He, Jennifer; Hoover, Cindi A.; Blow, Matthew J.; Bristow, James; Butland, Gareth
2015-01-01
Transposon mutagenesis with next-generation sequencing (TnSeq) is a powerful approach to annotate gene function in bacteria, but existing protocols for TnSeq require laborious preparation of every sample before sequencing. Thus, the existing protocols are not amenable to the throughput necessary to identify phenotypes and functions for the majority of genes in diverse bacteria. Here, we present a method, random bar code transposon-site sequencing (RB-TnSeq), which increases the throughput of mutant fitness profiling by incorporating random DNA bar codes into Tn5 and mariner transposons and by using bar code sequencing (BarSeq) to assay mutant fitness. RB-TnSeq can be used with any transposon, and TnSeq is performed once per organism instead of once per sample. Each BarSeq assay requires only a simple PCR, and 48 to 96 samples can be sequenced on one lane of an Illumina HiSeq system. We demonstrate the reproducibility and biological significance of RB-TnSeq with Escherichia coli, Phaeobacter inhibens, Pseudomonas stutzeri, Shewanella amazonensis, and Shewanella oneidensis. To demonstrate the increased throughput of RB-TnSeq, we performed 387 successful genome-wide mutant fitness assays representing 130 different bacterium-carbon source combinations and identified 5,196 genes with significant phenotypes across the five bacteria. In P. inhibens, we used our mutant fitness data to identify genes important for the utilization of diverse carbon substrates, including a putative d-mannose isomerase that is required for mannitol catabolism. RB-TnSeq will enable the cost-effective functional annotation of diverse bacteria using mutant fitness profiling. PMID:25968644
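A minimal sketch of the BarSeq fitness readout described above: per-strain fitness as a depth-normalized log2 ratio of barcode counts between a condition sample and its control. The pseudocount and normalization choices here are illustrative, not the exact procedure from the paper.

import numpy as np

def strain_fitness(counts_condition, counts_control, pseudocount=0.5):
    """Per-strain fitness as the normalized log2 ratio of barcode counts
    in the condition sample versus the control/time-zero sample."""
    cond = np.asarray(counts_condition, dtype=float) + pseudocount
    ctrl = np.asarray(counts_control, dtype=float) + pseudocount
    cond /= cond.sum()   # correct for sequencing depth
    ctrl /= ctrl.sum()
    return np.log2(cond / ctrl)

# Illustrative barcode count vectors for four insertion strains
print(strain_fitness([120, 5, 300, 40], [100, 90, 280, 35]))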
Ganna, Andrea; Lee, Donghwan; Ingelsson, Erik; Pawitan, Yudi
2015-07-01
It is common and advised practice in biomedical research to validate experimental or observational findings in a population different from the one where the findings were initially assessed. This practice increases the generalizability of the results and decreases the likelihood of reporting false-positive findings. Validation becomes critical when dealing with high-throughput experiments, where the large number of tests increases the chance to observe false-positive results. In this article, we review common approaches to determine statistical thresholds for validation and describe the factors influencing the proportion of significant findings from a 'training' sample that are replicated in a 'validation' sample. We refer to this proportion as rediscovery rate (RDR). In high-throughput studies, the RDR is a function of false-positive rate and power in both the training and validation samples. We illustrate the application of the RDR using simulated data and real data examples from metabolomics experiments. We further describe an online tool to calculate the RDR using t-statistics. We foresee two main applications. First, if the validation study has not yet been collected, the RDR can be used to decide the optimal combination between the proportion of findings taken to validation and the size of the validation study. Secondly, if a validation study has already been done, the RDR estimated using the training data can be compared with the observed RDR from the validation data; hence, the success of the validation study can be assessed. © The Author 2014. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
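The rediscovery rate described above can be illustrated with a small simulation: tests selected in a training sample are checked for nominal significance in an independent validation sample, and the replicated fraction is the RDR. All parameters below (effect size, thresholds, sample sizes) are invented for illustration and do not come from the paper or its online tool.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_tests, n_train, n_valid = 5000, 50, 50
effect = np.where(rng.random(n_tests) < 0.05, 0.8, 0.0)  # 5% true effects

def pvalues(n):
    # One-sample t-test per feature against a zero mean
    x = rng.normal(effect, 1.0, size=(n, n_tests))
    t = x.mean(0) / (x.std(0, ddof=1) / np.sqrt(n))
    return 2 * stats.t.sf(np.abs(t), df=n - 1)

p_train, p_valid = pvalues(n_train), pvalues(n_valid)
selected = p_train < 1e-4                 # findings taken to validation
replicated = selected & (p_valid < 0.05)  # replicate at the nominal level
rdr = replicated.sum() / max(selected.sum(), 1)
print("Rediscovery rate:", rdr)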
Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean
2016-10-01
To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measuring of multiple samples and conditions, thus the current automated cell counter that uses single disposable counting slides is not practical for high-throughput screening assays. In the recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing directly to Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows for 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required for the single sample-based automated cell counter. In addition, this method can improve the efficiency for high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
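The AO/PI readout reduces to simple arithmetic on the two stain counts per well; the sketch below shows that calculation with invented counts and imaged volume, and is not the instrument's software.

def concentration_and_viability(ao_count, pi_count, imaged_volume_ul):
    """AO stains all nucleated cells, PI only cells with compromised
    membranes; live cells = AO-positive minus PI-positive."""
    live = ao_count - pi_count
    concentration = ao_count / (imaged_volume_ul * 1e-3)  # cells/mL
    viability = 100.0 * live / ao_count if ao_count else 0.0
    return concentration, viability

print(concentration_and_viability(ao_count=5240, pi_count=310,
                                  imaged_volume_ul=2.5))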
Operational Evaluation of the Rapid Viability PCR Method for ...
This research work has a significant impact on the use of the RV-PCR method to analyze post-decontamination environmental samples during an anthrax event. The method has shown 98% agreement with the traditional culture-based method. With such a success, this method, upon validation, will significantly increase the laboratory throughput/capacity to analyze a large number of anthrax event samples in a relatively short time.
A rapid enzymatic assay for high-throughput screening of adenosine-producing strains
Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei
2015-01-01
Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay for adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3, and the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutrient components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
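Because ADA releases one NH3 per adenosine deaminated, an ammonium standard curve read by the indophenol colour reaction converts absorbance directly into adenosine concentration; the sketch below shows that conversion with illustrative calibration values, not the paper's data.

import numpy as np

# Illustrative ammonium standard curve for the indophenol readout
std_nh3_mM = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
std_abs = np.array([0.02, 0.11, 0.20, 0.39, 0.77])
slope, intercept = np.polyfit(std_nh3_mM, std_abs, 1)

def adenosine_mM(sample_abs):
    # 1:1 stoichiometry: mol NH3 released = mol adenosine deaminated
    return (sample_abs - intercept) / slope

print(adenosine_mM(0.33))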
Mass Transfer Testing of a 12.5-cm Rotor Centrifugal Contactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. H. Meikrantz; T. G. Garn; J. D. Law
2008-09-01
TRUEX mass transfer tests were performed using a single-stage, commercially available 12.5 cm centrifugal contactor and stable cerium (Ce) and europium (Eu). Test conditions included throughputs ranging from 2.5 to 15 Lpm and rotor speeds of 1750 and 2250 rpm. Ce and Eu extraction forward distribution coefficients ranged from 13 to 19. The first and second stage strip back distributions were 0.5 to 1.4 and 0.002 to 0.004, respectively, throughout the dynamic test conditions studied. Visual carryover of aqueous entrainment in all organic phase samples was estimated at < 0.1%, and organic carryover into all aqueous phase samples was about ten times less. Mass transfer efficiencies of ≥ 98% for both Ce and Eu in the extraction section were obtained over the entire range of test conditions. The first strip stage mass transfer efficiencies ranged from 75 to 93%, trending higher with increasing throughput. Second stage mass transfer was greater than 99% in all cases. Increasing the rotor speed from 1750 to 2250 rpm had no significant effect on efficiency for all throughputs tested.
High throughput nonparametric probability density estimation.
Farmer, Jenny; Jacobs, Donald
2018-01-01
In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
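The scaled quantile residual diagnostic mentioned above can be sketched as follows: sorted samples mapped through a trial CDF should sit near their expected uniform order statistics. The sqrt(N) scaling used here is one plausible convention for keeping fluctuations sample-size invariant; the paper's exact scaling may differ.

import numpy as np

def scaled_quantile_residuals(samples, trial_cdf):
    """Map sorted samples through a trial CDF; under a correct estimate
    the k-th value should lie near its expected uniform order statistic
    k/(N+1)."""
    u = np.sort(trial_cdf(np.asarray(samples)))
    n = len(u)
    expected = np.arange(1, n + 1) / (n + 1.0)
    return np.sqrt(n) * (u - expected)

# Illustrative check: standard-normal data against the true normal CDF
from scipy.stats import norm
rng = np.random.default_rng(2)
print(scaled_quantile_residuals(rng.normal(size=200), norm.cdf)[:5])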
[Weighted gene co-expression network analysis in biomedicine research].
Liu, Wei; Li, Li; Ye, Hua; Tu, Wei
2017-11-25
High-throughput biological technologies are now widely applied in biology and medicine, allowing scientists to monitor thousands of parameters simultaneously in a given sample. However, it is still an enormous challenge to mine useful information from high-throughput data. The emergence of network biology provides deeper insights into complex bio-systems and reveals the modularity of tissue/cellular networks. Correlation networks are increasingly used in bioinformatics applications. The weighted gene co-expression network analysis (WGCNA) tool can detect clusters of highly correlated genes. Therefore, we systematically review the application of WGCNA in the study of disease diagnosis, pathogenesis and other related fields. First, we introduce the principle, workflow, advantages and disadvantages of WGCNA. Second, we present the application of WGCNA in disease, physiology, drug, evolution and genome annotation studies. Then, we describe the application of WGCNA in newly developed high-throughput methods. We hope this review will help to promote the application of WGCNA in biomedical research.
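For readers unfamiliar with WGCNA, its core step is soft-thresholding a correlation matrix into a weighted adjacency; for an unsigned network this is typically

a_{ij} = \bigl|\operatorname{cor}(x_i, x_j)\bigr|^{\beta}

with the power β chosen so that the resulting network is approximately scale-free (a signed variant uses ((1 + cor)/2)^β); modules of highly correlated genes are then detected by hierarchical clustering of the derived topological overlap.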
Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; ...
2016-02-24
The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.
Integrated Multi-process Microfluidic Systems for Automating Analysis
Yang, Weichun; Woolley, Adam T.
2010-01-01
Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343
High-throughput methods for characterizing the mechanical properties of coatings
NASA Astrophysics Data System (ADS)
Siripirom, Chavanin
The characterization of mechanical properties in a combinatorial and high-throughput workflow has been a bottleneck that reduced the speed of the materials development process. High-throughput characterization of the mechanical properties was applied in this research in order to reduce the amount of sample handling and to accelerate the output. A puncture tester was designed and built to evaluate the toughness of materials using an innovative template design coupled with automation. The test is in the form of a circular free-film indentation. A single template contains 12 samples which are tested in a rapid serial approach. Next, the operational principles of a novel parallel dynamic mechanical-thermal analysis instrument were analyzed in detail for potential sources of errors. The test uses a model of a circular bilayer fixed-edge plate deformation. A total of 96 samples can be analyzed simultaneously which provides a tremendous increase in efficiency compared with a conventional dynamic test. The modulus values determined by the system had considerable variation. The errors were observed and improvements to the system were made. A finite element analysis was used to analyze the accuracy given by the closed-form solution with respect to testing geometries, such as thicknesses of the samples. A good control of the thickness of the sample was proven to be crucial to the accuracy and precision of the output. Then, the attempt to correlate the high-throughput experiments and conventional coating testing methods was made. Automated nanoindentation in dynamic mode was found to provide information on the near-surface modulus and could potentially correlate with the pendulum hardness test using the loss tangent component. Lastly, surface characterization of stratified siloxane-polyurethane coatings was carried out with X-ray photoelectron spectroscopy, Rutherford backscattering spectroscopy, transmission electron microscopy, and nanoindentation. The siloxane component segregates to the surface during curing. The distribution of siloxane as a function of thickness into the sample showed differences depending on the formulation parameters. The coatings which had higher siloxane content near the surface were those coatings found to perform well in field tests.
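For orientation, the closed-form solution referred to above is of the classical clamped (fixed-edge) circular plate type; for a homogeneous plate of radius a and thickness h under a central point load P, the central deflection is

w_{\max} = \frac{P a^2}{16 \pi D}, \qquad D = \frac{E h^3}{12\,(1 - \nu^2)}

where E is the modulus and ν the Poisson ratio. These are textbook expressions given here only as a reference point: the instrument's bilayer model replaces D with an effective composite stiffness, and because h enters cubically, tight control of sample thickness dominates the accuracy of the extracted modulus, consistent with the finding above.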
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
Monolithic methacrylate packed 96-tips for high throughput bioanalysis.
Altun, Zeki; Skoglund, Christina; Abdel-Rehim, Mohamed
2010-04-16
In the pharmaceutical industry the growing number of samples to be analyzed requires high-throughput, fully automated analytical techniques. Commonly used sample-preparation methods are solid-phase extraction (SPE), liquid-liquid extraction (LLE) and protein precipitation. In this paper we discuss a new sample-preparation technique based on SPE for high-throughput drug extraction developed and used by our group. This method uses a monolithic methacrylate polymer as the packing sorbent for a 96-tip robotic device; with this device a 96-well plate can be handled in 2-4 min. The key aspect of the monolithic phase is that the material offers both good binding capacity and low back-pressure compared with, e.g., silica phases. The paper presents the successful application of monolithic 96-tips with LC-MS/MS to the sample preparation of busulphan, roscovitine, metoprolol, pindolol and local anaesthetics from human plasma samples and cyclophosphamide from mouse blood samples. Copyright 2009 Elsevier B.V. All rights reserved.
Coles, Andrew H.; Osborn, Maire F.; Alterman, Julia F.; Turanov, Anton A.; Godinho, Bruno M.D.C.; Kennington, Lori; Chase, Kathryn; Aronin, Neil
2016-01-01
Preclinical development of RNA interference (RNAi)-based therapeutics requires a rapid, accurate, and robust method of simultaneously quantifying mRNA knockdown in hundreds of samples. The most well-established method to achieve this is quantitative real-time polymerase chain reaction (qRT-PCR), a labor-intensive methodology that requires sample purification, which increases the potential to introduce additional bias. Here, we show that the QuantiGene® branched DNA (bDNA) assay, linked to a 96-well Qiagen TissueLyser II, is a quick and reproducible alternative to qRT-PCR for quantitative analysis of mRNA expression in vivo directly from tissue biopsies. The bDNA assay is a high-throughput, plate-based, luminescence technique capable of directly measuring mRNA levels in tissue lysates derived from various biological samples. We have performed a systematic evaluation of this technique for in vivo detection of RNAi-based silencing and show that data of similar quality are obtained from purified RNA and from tissue lysates. In general, we observe low intra- and inter-animal variability (around 10% for control samples) and high intermediate precision, which allows minimization of sample size for evaluation of oligonucleotide efficacy in vivo. PMID:26595721
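The abstract quotes intra- and inter-animal variability of around 10% but does not show the calculation. A minimal sketch of one way such coefficients of variation could be computed from bDNA luminescence readings is given below; the array layout and the numbers are hypothetical.

import numpy as np

# hypothetical readings: rows = animals, columns = technical replicates (relative light units)
rlu = np.array([
    [1020.0, 980.0, 1005.0],
    [1100.0, 1080.0, 1120.0],
    [950.0, 990.0, 970.0],
])

# intra-animal CV: spread of replicates within each animal, averaged over animals
intra_cv = np.mean(rlu.std(axis=1, ddof=1) / rlu.mean(axis=1)) * 100

# inter-animal CV: spread of per-animal means across animals
means = rlu.mean(axis=1)
inter_cv = means.std(ddof=1) / means.mean() * 100

print(f"intra-animal CV: {intra_cv:.1f}%  inter-animal CV: {inter_cv:.1f}%")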
Song, Zewei; Schlatter, Dan; Kennedy, Peter; Kinkel, Linda L.; Kistler, H. Corby; Nguyen, Nhu; Bates, Scott T.
2015-01-01
Next generation fungal amplicon sequencing is being used with increasing frequency to study fungal diversity in various ecosystems; however, the influence of sample preparation on the characterization of fungal community is poorly understood. We investigated the effects of four procedural modifications to library preparation for high-throughput sequencing (HTS). The following treatments were considered: 1) the amount of soil used in DNA extraction, 2) the inclusion of additional steps (freeze/thaw cycles, sonication, or hot water bath incubation) in the extraction procedure, 3) the amount of DNA template used in PCR, and 4) the effect of sample pooling, either physically or computationally. Soils from two different ecosystems in Minnesota, USA, one prairie and one forest site, were used to assess the generality of our results. The first three treatments did not significantly influence observed fungal OTU richness or community structure at either site. Physical pooling captured more OTU richness compared to individual samples, but total OTU richness at each site was highest when individual samples were computationally combined. We conclude that standard extraction kit protocols are well optimized for fungal HTS surveys, but because sample pooling can significantly influence OTU richness estimates, it is important to carefully consider the study aims when planning sampling procedures. PMID:25974078
Round, Adam; Felisaz, Franck; Fodinger, Lukas; Gobbo, Alexandre; Huet, Julien; Villard, Cyril; Blanchet, Clement E.; Pernot, Petra; McSweeney, Sean; Roessle, Manfred; Svergun, Dmitri I.; Cipriani, Florent
2015-01-01
Small-angle X-ray scattering (SAXS) of macromolecules in solution is in increasing demand by an ever more diverse research community, both academic and industrial. To better serve user needs, and to allow automated and high-throughput operation, a sample changer (BioSAXS Sample Changer) that is able to perform unattended measurements of up to several hundred samples per day has been developed. The Sample Changer is able to handle and expose sample volumes of down to 5 µl with a measurement/cleaning cycle of under 1 min. The samples are stored in standard 96-well plates and the data are collected in a vacuum-mounted capillary with automated positioning of the solution in the X-ray beam. Fast and efficient capillary cleaning avoids cross-contamination and ensures reproducibility of the measurements. Independent temperature control for the well storage and for the measurement capillary allows the samples to be kept cool while still collecting data at physiological temperatures. The Sample Changer has been installed at three major third-generation synchrotrons: on the BM29 beamline at the European Synchrotron Radiation Facility (ESRF), the P12 beamline at the PETRA-III synchrotron (EMBL@PETRA-III) and the I22/B21 beamlines at Diamond Light Source, with the latter being the first commercial unit supplied by Bruker ASC. PMID:25615861
USDA-ARS?s Scientific Manuscript database
The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...
High-throughput sample adaptive offset hardware architecture for high-efficiency video coding
NASA Astrophysics Data System (ADS)
Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin
2018-03-01
A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the High Efficiency Video Coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for the rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture achieves up to 48% higher throughput than prior work. The proposed architecture reaches an operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
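The simplified bitrate estimation itself is not specified in the abstract. The sketch below illustrates the general shape of an SAO mode decision based on a Lagrangian rate-distortion cost, using the widely used fast distortion estimate (count·offset² − 2·offset·diffSum); the bit-cost model (|offset| + 1) is a placeholder assumption, not the method proposed in the paper.

def sao_rd_cost(n_pix, diff_sum, offset, est_bits, lam):
    """Rate-distortion cost of one SAO offset candidate for one pixel category.

    n_pix    -- number of pixels assigned to the category
    diff_sum -- sum of (original - reconstructed) over those pixels
    offset   -- candidate SAO offset value
    est_bits -- estimated bits to signal the offset (simplified placeholder model)
    lam      -- Lagrange multiplier
    """
    delta_distortion = n_pix * offset * offset - 2 * offset * diff_sum
    return delta_distortion + lam * est_bits

def best_offset(n_pix, diff_sum, lam, search_range=7):
    # exhaustive search over the allowed offset range with a placeholder bit cost of |o| + 1
    candidates = range(-search_range, search_range + 1)
    return min(candidates, key=lambda o: sao_rd_cost(n_pix, diff_sum, o, abs(o) + 1, lam))

print(best_offset(n_pix=64, diff_sum=180, lam=12.0))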
Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping
2017-05-01
To develop an efficient, cost-effective screening process for improving production of glucoamylase in Aspergillus niger, the cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased sample throughput compared with traditional flask cultivation. A close negative correlation was found between glucoamylase activity and the pH of the fermentation broth, and on this basis a novel high-throughput analysis method using Methyl Orange was developed. When compared with the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we obtained a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.
Improved radiation dose efficiency in solution SAXS using a sheath flow sample environment
Kirby, Nigel; Cowieson, Nathan; Hawley, Adrian M.; Mudie, Stephen T.; McGillivray, Duncan J.; Kusel, Michael; Samardzic-Boban, Vesna; Ryan, Timothy M.
2016-01-01
Radiation damage is a major limitation to synchrotron small-angle X-ray scattering analysis of biomacromolecules. Flowing the sample during exposure helps to reduce the problem, but its effectiveness in the laminar-flow regime is limited by slow flow velocity at the walls of sample cells. To overcome this limitation, the coflow method was developed, where the sample flows through the centre of its cell surrounded by a flow of matched buffer. The method permits an order-of-magnitude increase of X-ray incident flux before sample damage, improves measurement statistics and maintains low sample concentration limits. The method also efficiently handles sample volumes of a few microlitres, can increase sample throughput, is intrinsically resistant to capillary fouling by sample and is suited to static samples and size-exclusion chromatography applications. The method unlocks further potential of third-generation synchrotron beamlines to facilitate new and challenging applications in solution scattering. PMID:27917826
NASA Astrophysics Data System (ADS)
Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.
2012-12-01
Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to measure the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while limiting the labor requirement to 2-3 days per batch of samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clair, Geremy; Piehowski, Paul D.; Nicola, Teodora
Global proteomics approaches allow characterization of whole tissue lysates to an impressive depth. However, it is now increasingly recognized that to better understand the complexity of multicellular organisms, global protein profiling of specific, spatially defined regions/substructures of tissues (i.e. spatially resolved proteomics) is essential. Laser capture microdissection (LCM) enables microscopic isolation of defined regions of tissues, preserving crucial spatial information. However, current proteomics workflows entail several manual sample preparation steps and are challenged by the microscopic, mass-limited samples generated by LCM, which impacts measurement robustness, quantification, and throughput. Here, we coupled LCM with a fully automated sample preparation workflow that, with a single manual step, allows protein extraction, tryptic digestion, peptide cleanup and LC-MS/MS analysis of proteomes from microdissected tissues. Benchmarking against the current state of the art in ultrasensitive global proteomic analysis, our approach demonstrated significant improvements in quantification and throughput. Using our LCM-SNaPP proteomics approach, we characterized, to a depth of more than 3,400 proteins, the ontogeny of protein changes during normal lung development in laser capture microdissected alveolar tissue containing ~4,000 cells per sample. Importantly, the data revealed quantitative changes for 350 low-abundance transcription factors and signaling molecules, confirming earlier transcript-level observations and defining seven modules of coordinated transcription factor/signaling molecule expression patterns. This suggests that a complex network of temporal regulatory control directs normal lung development, with epigenetic regulation fine-tuning prenatal developmental processes. Our LCM-proteomics approach facilitates efficient, spatially resolved, ultrasensitive global proteomics analyses in high throughput that will be enabling for several clinical and biological applications.
High-throughput receptor-based assay for the detection of spirolides by chemiluminescence.
Rodríguez, Laura P; Vilariño, Natalia; Molgó, Jordi; Aráoz, Rómulo; Botana, Luis M
2013-12-01
The spirolides are marine toxins that belong to a new class of macrocyclic imines produced by dinoflagellates. In this study a previously described solid-phase receptor-based assay for the detection of spirolides was optimized for high-throughput screening and prevalidated. This method is based on the competition between 13-desmethyl spirolide C and biotin-α-bungarotoxin immobilized on a streptavidin-coated surface, for binding to nicotinic acetylcholine receptors. In this inhibition assay the amount of nAChR bound to the well surface is quantified using a specific antibody, followed by a second anti-mouse IgG antibody labeled with horseradish peroxidase (HRP). The assay protocol was optimized for 384-well microplates, which allowed a reduction of the amount of reagents per sample and an increase of the number of samples per plate versus previously published receptor-based assays. The sensitivity of the assay for 13-desmethyl spirolide C ranged from 5 to 150 ng mL(-1). The performance of the assay in scallop extracts was adequate, with an estimated detection limit for 13-desmethyl spirolide C of 50 μg kg(-1) of shellfish meat. The recovery rate of 13-desmethyl spirolide C for spiked samples with this assay was 80% and the inter-assay coefficient of variation was 8%. This 384-well microplate, chemiluminescence method can be used as a high-throughput screening assay to detect 13-desmethyl spirolide C in shellfish meat in order to reduce the number of samples to be processed through bioassays or analytical methods. Copyright © 2013 Elsevier Ltd. All rights reserved.
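The abstract does not state the calibration model behind the 5-150 ng mL(-1) working range. A minimal sketch, assuming a standard four-parameter logistic fit of the chemiluminescence signal against 13-desmethyl spirolide C concentration, is given below; the concentrations and signal values are hypothetical.

import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ic50, hill):
    # four-parameter logistic, the usual shape for a competitive inhibition assay
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

conc = np.array([5, 10, 25, 50, 100, 150], dtype=float)    # ng/mL, hypothetical standards
signal = np.array([95, 88, 70, 48, 27, 18], dtype=float)   # relative luminescence, hypothetical

params, _ = curve_fit(four_pl, conc, signal, p0=[10.0, 100.0, 40.0, 1.0])
print("fitted midpoint (ng/mL):", params[2])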
USDA-ARS?s Scientific Manuscript database
The effect of refrigeration on bacterial communities within raw and pasteurized buffalo milk was studied using high-throughput sequencing. High quality samples of raw buffalo milk were obtained from five dairy farms in the Guangxi province of China. A sample of each milk was pasteurized, and both r...
Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.
Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin
2016-02-01
High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and are limited by low throughput, with only one sample examined at a time. This work reports an infrared-based optical calorimetry method, together with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, the method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). It further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolution of the advanced infrared camera, this method has the potential to transform the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
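The theoretical foundation mentioned above is not reproduced in the abstract. As a rough illustration of the time-domain idea for phase-change samples, the sketch below locates a melting plateau in a temperature-time trace recorded by the camera for one sample; the plateau temperature serves as the melting point, and the plateau duration is the quantity that, after calibration against a reference of known latent heat, would yield the latent heat of fusion. The slope threshold and the calibration relation are assumptions for illustration only.

import numpy as np

def melting_plateau(t, temp, slope_thresh=0.05):
    """Estimate (melting temperature, plateau duration) from a heating trace temp(t).

    The plateau is approximated as the points where the heating rate |dT/dt|
    falls below slope_thresh (K/s, a hypothetical tuning value).
    """
    dTdt = np.gradient(temp, t)
    flat = np.abs(dTdt) < slope_thresh
    if not flat.any():
        return None, 0.0
    t_melt = float(np.median(temp[flat]))
    duration = float(t[flat].max() - t[flat].min())
    return t_melt, duration

# After calibration against a reference material of known latent heat (hypothetical k_cal):
# latent_heat ≈ k_cal * duration / sample_mass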
High-throughput Titration of Luciferase-expressing Recombinant Viruses
Garcia, Vanessa; Krishnan, Ramya; Davis, Colin; Batenchuk, Cory; Le Boeuf, Fabrice; Abdelbary, Hesham; Diallo, Jean-Simon
2014-01-01
Standard plaque assays to determine infectious viral titers can be time consuming, are not amenable to a high volume of samples, and cannot be done with viruses that do not form plaques. As an alternative to plaque assays, we have developed a high-throughput titration method that allows for the simultaneous titration of a high volume of samples in a single day. This approach involves infection of the samples with a Firefly luciferase tagged virus, transfer of the infected samples onto an appropriate permissive cell line, subsequent addition of luciferin, reading of plates in order to obtain luminescence readings, and finally the conversion from luminescence to viral titers. The assessment of cytotoxicity using a metabolic viability dye can be easily incorporated in the workflow in parallel and provide valuable information in the context of a drug screen. This technique provides a reliable, high-throughput method to determine viral titers as an alternative to a standard plaque assay. PMID:25285536
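The conversion from luminescence readings to viral titers is described only in outline above. A minimal sketch, assuming a log-log standard curve built from wells of known titer, is shown below; the calibration numbers are hypothetical.

import numpy as np

# hypothetical standard curve: known titers (PFU/mL) and their luminescence readings
std_titer = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
std_lum = np.array([2.1e2, 1.9e3, 2.2e4, 2.0e5, 2.3e6])

# linear fit in log-log space: log10(titer) = m * log10(luminescence) + b
m, b = np.polyfit(np.log10(std_lum), np.log10(std_titer), 1)

def lum_to_titer(lum):
    """Convert a luminescence reading into an estimated titer (PFU/mL)."""
    return 10 ** (m * np.log10(lum) + b)

print(f"{lum_to_titer(5.0e4):.2e} PFU/mL")   # estimated titer for an unknown sample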
Sergeant, Martin J.; Constantinidou, Chrystala; Cogan, Tristan; Penn, Charles W.; Pallen, Mark J.
2012-01-01
The analysis of 16S-rDNA sequences to assess the bacterial community composition of a sample is a widely used technique that has increased with the advent of high throughput sequencing. Although considerable effort has been devoted to identifying the most informative region of the 16S gene and the optimal informatics procedures to process the data, little attention has been paid to the PCR step, in particular annealing temperature and primer length. To address this, amplicons derived from 16S-rDNA were generated from chicken caecal content DNA using different annealing temperatures, primers and different DNA extraction procedures. The amplicons were pyrosequenced to determine the optimal protocols for capture of maximum bacterial diversity from a chicken caecal sample. Even at very low annealing temperatures there was little effect on the community structure, although the abundance of some OTUs such as Bifidobacterium increased. Using shorter primers did not reveal any novel OTUs but did change the community profile obtained. Mechanical disruption of the sample by bead beating had a significant effect on the results obtained, as did repeated freezing and thawing. In conclusion, existing primers and standard annealing temperatures captured as much diversity as lower annealing temperatures and shorter primers. PMID:22666455
Fernandes, Richard; Carey, Conn; Hynes, James; Papkovsky, Dmitri
2013-01-01
The importance of food safety has resulted in a demand for a more rapid, high-throughput method for total viable count (TVC). The industry standard for TVC determination (ISO 4833:2003) is widely used but presents users with some drawbacks. The method is materials- and labor-intensive, requiring multiple agar plates per sample. More importantly, the method is slow, with 72 h typically required for a definitive result. Luxcel Biosciences has developed the GreenLight Model 960, a microtiter plate-based assay providing a rapid high-throughput method of aerobic bacterial load assessment through analysis of microbial oxygen consumption. Results are generated in 1-12 h, depending on microbial load. The mix and measure procedure allows rapid detection of microbial oxygen consumption and equates oxygen consumption to microbial load (CFU/g), providing a simple, sensitive means of assessing the microbial contamination levels in foods (1). As bacteria in the test sample grow and respire, they deplete O2, which is detected as an increase in the GreenLight probe signal above the baseline level (2). The time required to reach this increase in signal can be used to calculate the CFU/g of the original sample, based on a predetermined calibration. The higher the initial microbial load, the earlier this threshold is reached (1).
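The calibration that equates oxygen-consumption threshold times to microbial load is only summarized above. A minimal sketch, assuming a linear relation between log10(CFU/g) and the time needed to reach the signal threshold, is given below; all calibration values are hypothetical.

import numpy as np

# hypothetical calibration: threshold times (h) for samples of known microbial load
tt_cal = np.array([2.0, 4.5, 7.0, 9.5, 12.0])        # hours to reach the probe-signal threshold
log_cfu_cal = np.array([7.0, 6.0, 5.0, 4.0, 3.0])    # log10(CFU/g) of the same samples

slope, intercept = np.polyfit(tt_cal, log_cfu_cal, 1)  # load falls as the threshold comes later

def threshold_time_to_cfu(tt_hours):
    """Estimate CFU/g of the original sample from its threshold time."""
    return 10 ** (slope * tt_hours + intercept)

print(f"{threshold_time_to_cfu(5.0):.2e} CFU/g")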
NASA Astrophysics Data System (ADS)
McDonald, S. A.; Marone, F.; Hintermüller, C.; Bensadoun, J.-C.; Aebischer, P.; Stampanoni, M.
2009-09-01
The use of conventional absorption based X-ray microtomography can become limited for samples showing only very weak absorption contrast. However, a wide range of samples studied in biology and materials science can produce significant phase shifts of the X-ray beam, and thus the use of the phase signal can provide substantially increased contrast and therefore new and otherwise inaccessible information. The application of two approaches for high-throughput, high-resolution X-ray phase contrast tomography, both available on the TOMCAT beamline of the SLS, is illustrated. Differential Phase Contrast (DPC) imaging uses a grating interferometer and a phase-stepping technique. It has been integrated into the beamline environment on TOMCAT in terms of the fast acquisition and reconstruction of data and the availability to scan samples within an aqueous environment. The second phase contrast approach is a modified transfer of intensity approach that can yield the 3D distribution of the phase (refractive index) of a weakly absorbing object from a single tomographic dataset. These methods are being used for the evaluation of cell integrity in 3D, with the specific aim of following and analyzing progressive cell degeneration to increase knowledge of the mechanistic events of neurodegenerative disorders such as Parkinson's disease.
Zweigenbaum, J; Henion, J
2000-06-01
The high-throughput determination of small molecules in biological matrixes has become an important part of drug discovery. This work shows that increased-throughput LC/MS/MS techniques can be used for the analysis of selected estrogen receptor modulators in human plasma, where more than 2000 samples may be analyzed in a 24-h period. The compounds used to demonstrate the high-throughput methodology include tamoxifen, raloxifene, 4-hydroxytamoxifen, nafoxidine, and idoxifene. Tamoxifen and raloxifene are used in both breast cancer therapy and osteoporosis and have shown prophylactic potential for reducing the risk of breast cancer. The described strategy provides LC/MS/MS separation and quantitation for each of the five test articles in control human plasma. The method includes sample preparation employing liquid-liquid extraction in the 96-well format, an LC separation of the five compounds in less than 30 s, and selected reaction monitoring detection from low nanogram to microgram per milliliter levels. Precision and accuracy are determined where each 96-well plate is considered a typical "tray" with calibration standards and quality control (QC) samples dispersed throughout the plate. A concept is introduced in which 24 96-well plates analyzed in one day are considered a "grand tray", and the method is cross-validated with standards placed only at the beginning of the first plate and the end of the last plate. Using idoxifene-d5 as an internal standard, the results obtained for idoxifene and tamoxifen satisfy current bioanalytical method validation criteria on two separate days where 2112 and 2304 samples were run, respectively. Method validation included 24-h autosampler stability and one freeze-thaw cycle stability for the extracts. Idoxifene showed acceptable results, with accuracy ranging from 0.3% for the high QC to 15.4% for the low QC and precision of 3.6%-13.9% relative standard deviation. Tamoxifen showed accuracy ranging from 1.6% to 13.8% and precision from 7.8% to 15.2%. The linear dynamic range for these compounds was 3 orders of magnitude. The limit of quantification was 5 and 50 ng/mL for tamoxifen and idoxifene, respectively. The other compounds in this study generally satisfy the more relaxed bioanalytical acceptance criteria for modern drug discovery. It is suggested that the quantification levels reported in this high-throughput analysis example are adequate for many drug discovery and related early pharmaceutical studies.
Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.
O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
Saieg, Mauro Ajaj; Geddie, William R; Boerner, Scott L; Bailey, Denis; Crump, Michael; da Cunha Santos, Gilda
2013-01-01
BACKGROUND: Numerous genomic abnormalities in B-cell non-Hodgkin lymphomas (NHLs) have been revealed by novel high-throughput technologies, including recurrent mutations in EZH2 (enhancer of zeste homolog 2) and CD79B (B cell antigen receptor complex-associated protein beta chain) genes. This study sought to determine the evolution of the mutational status of EZH2 and CD79B over time in different samples from the same patient in a cohort of B-cell NHLs, through use of a customized multiplex mutation assay. METHODS: DNA that was extracted from cytological material stored on FTA cards as well as from additional specimens, including archived frozen and formalin-fixed histological specimens, archived stained smears, and cytospin preparations, were submitted to a multiplex mutation assay specifically designed for the detection of point mutations involving EZH2 and CD79B, using MassARRAY spectrometry followed by Sanger sequencing. RESULTS: All 121 samples from 80 B-cell NHL cases were successfully analyzed. Mutations in EZH2 (Y646) and CD79B (Y196) were detected in 13.2% and 8% of the samples, respectively, almost exclusively in follicular lymphomas and diffuse large B-cell lymphomas. In one-third of the positive cases, a wild type was detected in a different sample from the same patient during follow-up. CONCLUSIONS: Testing multiple minimal tissue samples using a high-throughput multiplex platform exponentially increases tissue availability for molecular analysis and might facilitate future studies of tumor progression and the related molecular events. Mutational status of EZH2 and CD79B may vary in B-cell NHL samples over time and support the concept that individualized therapy should be based on molecular findings at the time of treatment, rather than on results obtained from previous specimens. Cancer (Cancer Cytopathol) 2013;121:377–386. © 2013 American Cancer Society. PMID:23361872
Towards practical time-of-flight secondary ion mass spectrometry lignocellulolytic enzyme assays
2013-01-01
Background Time-of-Flight Secondary Ion Mass Spectrometry (ToF-SIMS) is a surface sensitive mass spectrometry technique with potential strengths as a method for detecting enzymatic activity on solid materials. In particular, ToF-SIMS has been applied to detect the enzymatic degradation of woody lignocellulose. Proof-of-principle experiments previously demonstrated the detection of both lignin-degrading and cellulose-degrading enzymes on solvent-extracted hardwood and softwood. However, these preliminary experiments suffered from low sample throughput and were restricted to samples which had been solvent-extracted in order to minimize the potential for mass interferences between low molecular weight extractive compounds and polymeric lignocellulose components. Results The present work introduces a new, higher-throughput method for processing powdered wood samples for ToF-SIMS, meanwhile exploring likely sources of sample contamination. Multivariate analysis (MVA) including Principal Component Analysis (PCA) and Multivariate Curve Resolution (MCR) was regularly used to check for sample contamination as well as to detect extractives and enzyme activity. New data also demonstrates successful ToF-SIMS analysis of unextracted samples, placing an emphasis on identifying the low-mass secondary ion peaks related to extractives, revealing how extractives change previously established peak ratios used to describe enzyme activity, and elucidating peak intensity patterns for better detection of cellulase activity in the presence of extractives. The sensitivity of ToF-SIMS to a range of cellulase doses is also shown, along with preliminary experiments augmenting the cellulase cocktail with other proteins. Conclusions These new procedures increase the throughput of sample preparation for ToF-SIMS analysis of lignocellulose and expand the applications of the method to include unextracted lignocellulose. These are important steps towards the practical use of ToF-SIMS as a tool to screen for changes in plant composition, whether the transformation of the lignocellulose is achieved through enzyme application, plant mutagenesis, or other treatments. PMID:24034438
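The multivariate analysis used to flag contamination and detect enzyme activity is named but not illustrated above. A minimal sketch of the PCA step on a samples-by-peaks intensity matrix is given below; the data are simulated and the preprocessing (autoscaling) is an assumption, not necessarily the authors' choice.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# hypothetical matrix: rows = powdered wood samples, columns = selected secondary-ion peak intensities
rng = np.random.default_rng(0)
peaks = rng.random((12, 40))

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(peaks))
print(scores[:3])   # PC1/PC2 scores, inspected for enzyme-treatment, extractive or contamination trends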
A Formal Messaging Notation for Alaskan Aviation Data
NASA Technical Reports Server (NTRS)
Rios, Joseph L.
2015-01-01
Data exchange is an increasingly important aspect of the National Airspace System. While many data communication channels have become more capable of sending and receiving data at higher throughput rates, there is still a need to use communication channels with limited throughput efficiently. The limitation can be based on technological issues, financial considerations, or both. This paper provides a complete description of several important aviation weather data types in Abstract Syntax Notation format. By doing so, data providers can take advantage of Abstract Syntax Notation's ability to encode data in a highly compressed format. When data such as pilot weather reports, surface weather observations, and various weather predictions are compressed in such a manner, throughput-limited communication channels can be used efficiently. This paper provides details on the Abstract Syntax Notation One (ASN.1) implementation for Alaskan aviation data and demonstrates its use on real-world aviation weather data samples, as Alaska has sparse terrestrial data infrastructure and data are often sent via relatively costly satellite channels.
Novel method for high-throughput colony PCR screening in nanoliter-reactors
Walser, Marcel; Pellaux, Rene; Meyer, Andreas; Bechtold, Matthias; Vanderschuren, Herve; Reinhardt, Richard; Magyar, Joseph; Panke, Sven; Held, Martin
2009-01-01
We introduce a technology for the rapid identification and sequencing of conserved DNA elements employing a novel suspension array based on nanoliter (nl)-reactors made from alginate. The reactors have a volume of 35 nl and serve as reaction compartments during monoseptic growth of microbial library clones, colony lysis, thermocycling and screening for sequence motifs via semi-quantitative fluorescence analyses. nl-Reactors were kept in suspension during all high-throughput steps, which allowed the protocol to be performed in a highly space-effective fashion and at negligible expense for consumables and reagents. As a first application, 11 high-quality microsatellites for polymorphism studies in cassava were isolated and sequenced out of a library of 20 000 clones in 2 days. The technology is widely scalable and we envision that throughputs for nl-reactor-based screenings can be increased to 100 000 or more samples per day, thereby efficiently complementing protocols based on established deep-sequencing technologies. PMID:19282448
A high throughput mechanical screening device for cartilage tissue engineering.
Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L
2014-06-27
Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both the compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing individual samples. High-throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high-throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.
A transmission imaging spectrograph and microfabricated channel system for DNA analysis.
Simpson, J W; Ruiz-Martinez, M C; Mulhern, G T; Berka, J; Latimer, D R; Ball, J A; Rothberg, J M; Went, G T
2000-01-01
In this paper we present the development of a DNA analysis system using a microfabricated channel device and a novel transmission imaging spectrograph which can be efficiently incorporated into a high throughput genomics facility for both sizing and sequencing of DNA fragments. The device contains 48 channels etched on a glass substrate. The channels are sealed with a flat glass plate which also provides a series of apertures for sample loading and contact with buffer reservoirs. Samples can be easily loaded in volumes up to 640 nL without band broadening because of an efficient electrokinetic stacking at the electrophoresis channel entrance. The system uses a dual laser excitation source and a highly sensitive charge-coupled device (CCD) detector allowing for simultaneous detection of many fluorescent dyes. The sieving matrices for the separation of single-stranded DNA fragments are polymerized in situ in denaturing buffer systems. Examples of separation of single-stranded DNA fragments up to 500 bases in length are shown, including accurate sizing of GeneCalling fragments, and sequencing samples prepared with a reduced amount of dye terminators. An increase in sample throughput has been achieved by color multiplexing.
Automated high-throughput protein purification using an ÄKTApurifier and a CETAC autosampler.
Yoo, Daniel; Provchy, Justin; Park, Cynthia; Schulz, Craig; Walker, Kenneth
2014-05-30
As the pace of drug discovery accelerates there is an increased focus on screening larger numbers of protein therapeutic candidates to identify those that are functionally superior and to assess manufacturability earlier in the process. Although there have been advances toward high throughput (HT) cloning and expression, protein purification is still an area where improvements can be made to conventional techniques. Current methodologies for purification often involve a tradeoff between HT automation or capacity and quality. We present an ÄKTA combined with an autosampler, the ÄKTA-AS, which has the capability of purifying up to 240 samples in two chromatographic dimensions without the need for user intervention. The ÄKTA-AS has been shown to be reliable with sample volumes between 0.5 mL and 100 mL, and the innovative use of a uniquely configured loading valve ensures reliability by efficiently removing air from the system as well as preventing sample cross contamination. Incorporation of a sample pump flush minimizes sample loss and enables recoveries ranging from the low tens of micrograms to milligram quantities of protein. In addition, when used in an affinity capture-buffer exchange format the final samples are formulated in a buffer compatible with most assays without requirement of additional downstream processing. The system is designed to capture samples in 96-well microplate format allowing for seamless integration of downstream HT analytic processes such as microfluidic or HPLC analysis. Most notably, there is minimal operator intervention to operate this system, thereby increasing efficiency, sample consistency and reducing the risk of human error. Copyright © 2014 Elsevier B.V. All rights reserved.
Low-dose fixed-target serial synchrotron crystallography.
Owen, Robin L; Axford, Danny; Sherrell, Darren A; Kuo, Anling; Ernst, Oliver P; Schulz, Eike C; Miller, R J Dwayne; Mueller-Werkmeister, Henrike M
2017-04-01
The development of serial crystallography has been driven by the sample requirements imposed by X-ray free-electron lasers. Serial techniques are now being exploited at synchrotrons. Using a fixed-target approach to high-throughput serial sampling, it is demonstrated that high-quality data can be collected from myoglobin crystals, allowing room-temperature, low-dose structure determination. The combination of fixed-target arrays and a fast, accurate translation system allows high-throughput serial data collection at high hit rates and with low sample consumption.
A high-throughput core sampling device for the evaluation of maize stalk composition
2012-01-01
Background A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen maintaining the composition of the biomass sample for relatively long periods of time. Conclusions The CSD has been demonstrated to successfully produce homogeneous stalk core samples in a repeatable manner with a throughput substantially superior to the currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameter evaluated, it is expected that the CSD will have utility for other bioenergy crops as well. PMID:22548834
High-throughput quantitative analysis by desorption electrospray ionization mass spectrometry.
Manicke, Nicholas E; Kistler, Thomas; Ifa, Demian R; Cooks, R Graham; Ouyang, Zheng
2009-02-01
A newly developed high-throughput desorption electrospray ionization (DESI) source was characterized in terms of its performance in quantitative analysis. A 96-sample array, containing pharmaceuticals in various matrices, was analyzed in a single run with a total analysis time of 3 min. These solution-phase samples were examined from a hydrophobic PTFE ink printed on glass. The quantitative accuracy, precision, and limit of detection (LOD) were characterized. Chemical background-free samples of propranolol (PRN) with PRN-d(7) as internal standard (IS) and carbamazepine (CBZ) with CBZ-d(10) as IS were examined. So were two other sample sets consisting of PRN/PRN-d(7) at varying concentration in a biological milieu of 10% urine or porcine brain total lipid extract, total lipid concentration 250 ng/microL. The background-free samples, examined in a total analysis time of 1.5 s/sample, showed good quantitative accuracy and precision, with a relative error (RE) and relative standard deviation (RSD) generally less than 3% and 5%, respectively. The samples in urine and the lipid extract required a longer analysis time (2.5 s/sample) and showed RSD values of around 10% for the samples in urine and 4% for the lipid extract samples and RE values of less than 3% for both sets. The LOD for PRN and CBZ when analyzed without chemical background was 10 and 30 fmol, respectively. The LOD of PRN increased to 400 fmol analyzed in 10% urine, and 200 fmol when analyzed in the brain lipid extract.
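The relative error and relative standard deviation quoted above follow standard definitions; a short sketch with hypothetical replicate values makes the calculation explicit.

import numpy as np

def accuracy_precision(measured, nominal):
    """Relative error (%) of the mean and relative standard deviation (%) of replicates."""
    measured = np.asarray(measured, dtype=float)
    re = (measured.mean() - nominal) / nominal * 100.0
    rsd = measured.std(ddof=1) / measured.mean() * 100.0
    return re, rsd

# hypothetical replicate DESI-MS results (ng/mL) for a 100 ng/mL propranolol sample
print(accuracy_precision([98.5, 101.2, 99.7, 102.3, 100.4], 100.0))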
Capers, Patrice L.; Brown, Andrew W.; Dawson, John A.; Allison, David B.
2015-01-01
Background: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of addressing large meta-research questions, but possibly at the cost of accuracy. Objective: To evaluate the use of double sampling combined with multiple imputation (DS + MI) to address meta-research questions, using as an example the adherence of PubMed entries to two simple consolidated standards of reporting trials guidelines for titles and abstracts. Methods: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT, human, abstract available, and English language (n = 322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. Results: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title = 1.00, abstract = 0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS + MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050-1.174 vs. DS + MI 1.082-1.151). As evidence of improved accuracy, the DS + MI coefficient estimates were closer to the RHITLO estimates than were those from the large-sample RLOTHI. Conclusion: Our results support our hypothesis that DS + MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of literature. PMID:25988135
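The DS + MI procedure is described above in statistical terms only. The sketch below is a simplified, self-contained illustration of the idea on simulated data: the low-rigor rating is available for every entry, the high-rigor rating only for a random subsample, and the missing high-rigor ratings are multiply imputed from a logistic model before fitting the analysis model. Simple averaging of point estimates stands in for full Rubin's-rules pooling, and every variable name and value here is hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# simulated large sample: low-rigor/high-throughput (RLOTHI) rating plus a covariate
n = 5000
year = rng.integers(0, 10, n)                  # years since the start of the window
rlothi = rng.integers(0, 2, n)                 # heuristic compliance rating (0/1)
rhitlo = np.where(rng.random(n) < 0.9, rlothi, 1 - rlothi).astype(float)  # "true" human rating

rhitlo_obs = rhitlo.copy()
sub = rng.choice(n, 500, replace=False)        # double-sampling subsample gets the human rating
mask = np.ones(n, dtype=bool)
mask[sub] = False
rhitlo_obs[mask] = np.nan                      # RHITLO is missing outside the subsample

X = np.column_stack([rlothi, year])

# imputation model fit on the subsample, then used to predict the missing human ratings
imp_model = LogisticRegression().fit(X[sub], rhitlo[sub].astype(int))
p_missing = imp_model.predict_proba(X[mask])[:, 1]

# multiple imputation: draw M completed datasets, fit the analysis model, average the estimates
M, coefs = 20, []
for _ in range(M):
    y = rhitlo_obs.copy()
    y[mask] = rng.random(mask.sum()) < p_missing
    fit = LogisticRegression().fit(year.reshape(-1, 1), y.astype(int))
    coefs.append(fit.coef_[0, 0])
print("pooled log-odds change in compliance per year:", np.mean(coefs))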
Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz
2018-01-18
In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
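The protein-binding comparison with rapid equilibrium dialysis is reported above without the underlying relation. Assuming the ultra-thin coated fiber samples only the freely dissolved fraction without measurably depleting it, the bound fraction follows from the free and total concentrations as sketched below; the numbers are hypothetical.

def percent_protein_binding(c_free, c_total):
    """Protein binding (%) from free and total analyte concentrations in the biofluid."""
    return (1.0 - c_free / c_total) * 100.0

# hypothetical, diclofenac-like case: 2 ng/mL free out of 400 ng/mL total
print(percent_protein_binding(2.0, 400.0))   # 99.5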
Gentle, fast and effective crystal soaking by acoustic dispensing
Ng, Jia Tsing; Talon, Romain; Nekrosiute, Karolina; Krojer, Tobias; Douangamath, Alice; Brandao-Neto, Jose; Pearce, Nicholas M.; von Delft, Frank
2017-01-01
The steady expansion in the capacity of modern beamlines for high-throughput data collection, enabled by increasing X-ray brightness, capacity of robotics and detector speeds, has pushed the bottleneck upstream towards sample preparation. Even in ligand-binding studies using crystal soaking, the experiment best able to exploit beamline capacity, a primary limitation is the need for gentle and nontrivial soaking regimens such as stepwise concentration increases, even for robust and well characterized crystals. Here, the use of acoustic droplet ejection for the soaking of protein crystals with small molecules is described, and it is shown that it is both gentle on crystals and allows very high throughput, with 1000 unique soaks easily performed in under 10 min. In addition to having very low compound consumption (tens of nanolitres per sample), the positional precision of acoustic droplet ejection enables the targeted placement of the compound/solvent away from crystals and towards drop edges, allowing gradual diffusion of solvent across the drop. This ensures both an improvement in the reproducibility of X-ray diffraction and increased solvent tolerance of the crystals, thus enabling higher effective compound-soaking concentrations. The technique is detailed here with examples from the protein target JMJD2D, a histone lysine demethylase with roles in cancer and the focus of active structure-based drug-design efforts. PMID:28291760
High-throughput and automated SAXS/USAXS experiment for industrial use at BL19B2 in SPring-8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osaka, Keiichi, E-mail: k-osaka@spring8.or.jp; Inoue, Daisuke; Sato, Masugu
A highly automated system combining a sample transfer robot with a focused SR beam has been established for small-angle and ultra-small-angle X-ray scattering (SAXS/USAXS) measurement at BL19B2 for industrial use of SPring-8. A high-throughput data collection system is realized by means of an X-ray beam of high photon flux density concentrated by a cylindrical mirror and a two-dimensional pixel detector, PILATUS-2M. For SAXS measurement, high-quality data can be obtained within 1 minute per exposure using this system. The sample transfer robot has a capacity of 90 samples with a large variety of shapes. The fusion of the high-throughput and robotic systems has enhanced the usability of the SAXS/USAXS capability for industrial applications.
Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won
2015-08-21
We report a new and simple design of a fully automated dual-online ultra-high pressure liquid chromatography system. The system employs only two nano-volume switching valves (a two-position four port valve and a two-position ten port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features that include high duty cycle, back flushing sample injection for fast and narrow zone sample injection, online desalting, high separation resolution and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high throughput deep proteome profiling experiments but also in high throughput MRM experiments.
Li, B; Chan, E C Y
2003-01-01
We present an approach to customize the sample submission process for high-throughput purification (HTP) of combinatorial parallel libraries using preparative liquid chromatography electrospray ionization mass spectrometry. In this study, Visual Basic and Visual Basic for Applications programs were developed using Microsoft Visual Basic 6 and Microsoft Excel 2000, respectively, and applied to the seamless electronic submission and handling of data for HTP. Functions were incorporated into these programs so that medicinal chemists can perform on-line verification of purification status and on-line retrieval of post-purification data. The application of these user-friendly and cost-effective programs in our HTP technology has greatly increased our work efficiency by reducing paperwork and manual manipulation of data.
High pressure inertial focusing for separating and concentrating bacteria at high throughput
NASA Astrophysics Data System (ADS)
Cruz, J.; Hooshmand Zadeh, S.; Graells, T.; Andersson, M.; Malmström, J.; Wu, Z. G.; Hjort, K.
2017-08-01
Inertial focusing is a promising microfluidic technology for concentration and separation of particles by size. However, the required pressure rises sharply as particle size decreases. Theory and experimental results for larger particles were used to scale down the phenomenon and find the conditions that focus 1 µm particles. High-pressure experiments in robust glass chips were used to demonstrate the alignment. We show how the technique works for 1 µm spherical polystyrene particles and for Escherichia coli, without harming the bacteria at 50 µl min-1. The potential to focus bacteria, simplicity of use and high throughput make this technology interesting for healthcare applications, where concentration and purification of a sample may be required as an initial step.
Next generation platforms for high-throughput biodosimetry.
Repin, Mikhail; Turner, Helen C; Garty, Guy; Brenner, David J
2014-06-01
Here, the general concept of the combined use of plates and tubes in racks compatible with the American National Standards Institute/Society for Laboratory Automation and Screening microplate formats as next-generation platforms for increasing the throughput of biodosimetry assays is described. These platforms can be used at different stages of biodosimetry assays, starting from blood collection into microtubes organised in standardised racks and ending with the cytogenetic analysis of samples in standardised multiwell and multichannel plates. Robotically friendly platforms can be used for different biodosimetry assays in minimally equipped laboratories and on cost-effective automated universal biotech systems. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Yang, Jing-Hua; Shao, Jing; Wang, Hou-Yu; Dong, Jing-Yu; Fan, Liu-Yin; Cao, Cheng-Xi; Xu, Yu-Quan
2012-09-01
Herein, a simple novel free-flow electrophoresis (FFE) method was developed by introducing an organic solvent into the electrolyte system, increasing the solute solubility and the sample throughput. As a proof of concept, phenazine-1-carboxylic acid (PCA) from Pseudomonas sp. M18 was selected as a model solute to demonstrate the feasibility of the novel FFE method, owing to its poor solubility in aqueous media. In the developed method, the organic solvent was added not only to the sample buffer, to improve the solubility of the solute, but also to the background buffer, to construct a uniform aqueous-organic environment. The percentage and type of organic solvent as well as the pH value of the background buffer were investigated for the purification of PCA in the FFE device, with analysis via CE. The experiments revealed that the percentage and type of organic solvent exerted the major influence on the purification of PCA. Under the optimized conditions (30 mM phosphate buffer in 60:40 (v/v) water-methanol at an apparent pH 7.0, 3.26 mL/min background flux, 10-min residence time of injected sample, and 400 V), PCA could be continuously purified from its impurities. The flux of sample injection was 10.05 μL/min, and the recovery was up to 93.7%. An 11.9-fold improvement in throughput was found with a carrier buffer containing 40% (v/v) methanol, compared with the pure aqueous phase. The developed procedure is of evident significance for the purification of weakly polar solutes via FFE. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A New High-Throughput Approach to Genotype Ancient Human Gastrointestinal Parasites.
Côté, Nathalie M L; Daligault, Julien; Pruvost, Mélanie; Bennett, E Andrew; Gorgé, Olivier; Guimaraes, Silvia; Capelli, Nicolas; Le Bailly, Matthieu; Geigl, Eva-Maria; Grange, Thierry
2016-01-01
Human gastrointestinal parasites are good indicators for hygienic conditions and health status of past and present individuals and communities. While microscopic analysis of eggs in sediments of archeological sites often allows their taxonomic identification, this method is rarely effective at the species level, and requires both the survival of intact eggs and their proper identification. Genotyping via PCR-based approaches has the potential to achieve a precise species-level taxonomic determination. However, so far it has mostly been applied to individual eggs isolated from archeological samples. To increase the throughput and taxonomic accuracy, as well as reduce costs of genotyping methods, we adapted a PCR-based approach coupled with next-generation sequencing to perform precise taxonomic identification of parasitic helminths directly from archeological sediments. Our study of twenty-five 100 to 7,200 year-old archeological samples proved this to be a powerful, reliable and efficient approach for species determination even in the absence of preserved eggs, either as a stand-alone method or as a complement to microscopic studies.
Application of an industrial robot to nuclear pharmacy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viola, J.
1994-12-31
Increased patient throughput and lengthened P.E.T. scan protocols have increased the radiation dose received by P.E.T. technologists. Automated methods of tracer infusion and blood sampling have been introduced to reduce direct contact with the radioisotopes, but significant radiation exposure still exists during the receipt and dispensing of the patient dose. To address this situation the authors have developed an automated robotic system which performs these tasks, thus limiting the physical contact between operator and radioisotope.
Large-scale human skin lipidomics by quantitative, high-throughput shotgun mass spectrometry.
Sadowski, Tomasz; Klose, Christian; Gerl, Mathias J; Wójcik-Maciejewicz, Anna; Herzog, Ronny; Simons, Kai; Reich, Adam; Surma, Michal A
2017-03-07
The lipid composition of human skin is essential for its function; however, the simultaneous quantification of a wide range of stratum corneum (SC) and sebaceous lipids is not trivial. We developed and validated a quantitative high-throughput shotgun mass spectrometry-based platform for lipid analysis of tape-stripped SC skin samples. It features coverage of 16 lipid classes, total quantification to the level of individual lipid molecules, high reproducibility, and high-throughput capabilities. With this method we conducted a large lipidomic survey of 268 human SC samples, in which we investigated the relationship between sampling depth and lipid composition, lipidome variability in samples from 14 different sampling sites on the human body, and, finally, the impact of age and sex on lipidome variability in 104 healthy subjects. We found sebaceous lipids to constitute an abundant component of the SC lipidome as they diffuse into the topmost SC layers forming a gradient. Lipidomic variability with respect to sampling depth, site and subject is considerable, and mainly attributable to sebaceous lipids, while stratum corneum lipids vary less. This stresses the importance of sampling design and the role of sebaceous lipids in skin studies.
Restructuring of the Aquatic Bacterial Community by Hydric Dynamics Associated with Superstorm Sandy
Ulrich, Nikea; Rosenberger, Abigail; Brislawn, Colin; Wright, Justin; Kessler, Collin; Toole, David; Solomon, Caroline; Strutt, Steven; McClure, Erin
2016-01-01
Bacterial community composition and longitudinal fluctuations were monitored in a riverine system during and after Superstorm Sandy to better characterize inter- and intracommunity responses to the disturbance associated with a 100-year storm event. High-throughput sequencing of the 16S rRNA gene was used to assess microbial community structure within water samples from Muddy Creek Run, a second-order stream in Huntingdon, PA, at 12 different time points during the storm event (29 October to 3 November 2012) and under seasonally matched baseline conditions. High-throughput sequencing of the 16S rRNA gene was used to track changes in bacterial community structure and divergence during and after Superstorm Sandy. Bacterial community dynamics were correlated to measured physicochemical parameters and fecal indicator bacteria (FIB) concentrations. Bioinformatics analyses of 2.1 million 16S rRNA gene sequences revealed a significant increase in bacterial diversity in samples taken during peak discharge of the storm. Beta-diversity analyses revealed longitudinal shifts in the bacterial community structure. Successional changes were observed, in which Betaproteobacteria and Gammaproteobacteria decreased in 16S rRNA gene relative abundance, while the relative abundance of members of the Firmicutes increased. Furthermore, 16S rRNA gene sequences matching pathogenic bacteria, including strains of Legionella, Campylobacter, Arcobacter, and Helicobacter, as well as bacteria of fecal origin (e.g., Bacteroides), exhibited an increase in abundance after peak discharge of the storm. This study revealed a significant restructuring of in-stream bacterial community structure associated with hydric dynamics of a storm event. IMPORTANCE In order to better understand the microbial risks associated with freshwater environments during a storm event, a more comprehensive understanding of the variations in aquatic bacterial diversity is warranted. This study investigated the bacterial communities during and after Superstorm Sandy to provide fine time point resolution of dynamic changes in bacterial composition. This study adds to the current literature by revealing the variation in bacterial community structure during the course of a storm. This study employed high-throughput DNA sequencing, which generated a deep analysis of inter- and intracommunity responses during a significant storm event. This study has highlighted the utility of applying high-throughput sequencing for water quality monitoring purposes, as this approach enabled a more comprehensive investigation of the bacterial community structure. Altogether, these data suggest a drastic restructuring of the stream bacterial community during a storm event and highlight the potential of high-throughput sequencing approaches for assessing the microbiological quality of our environment. PMID:27060115
Silva, Thalita G; de Araujo, William R; Muñoz, Rodrigo A A; Richter, Eduardo M; Santana, Mário H P; Coltro, Wendell K T; Paixão, Thiago R L C
2016-05-17
We report the development of a simple, portable, low-cost, high-throughput visual colorimetric paper-based analytical device for the detection of procaine in seized cocaine samples. The interference of most common cutting agents found in cocaine samples was verified, and a novel electrochemical approach was used for sample pretreatment in order to increase the selectivity. Under the optimized experimental conditions, a linear analytical curve was obtained for procaine concentrations ranging from 5 to 60 μmol L(-1), with a detection limit of 0.9 μmol L(-1). The accuracy of the proposed method was evaluated using seized cocaine samples and an addition and recovery protocol.
Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie
2013-09-06
Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
Carter, Melissa D.; Crow, Brian S.; Pantazides, Brooke G.; Watson, Caroline M.; deCastro, B. Rey; Thomas, Jerry D.; Blake, Thomas A.; Johnson, Rudolph C.
2017-01-01
A high-throughput prioritization method was developed for use with a validated confirmatory method detecting organophosphorus nerve agent exposure by immunomagnetic separation-HPLC-MS/MS. A ballistic gradient was incorporated into this analytical method in order to profile unadducted butyrylcholinesterase (BChE) in clinical samples. With a Z′-factor (Zhang et al., 1999) of 0.88 ± 0.01 (SD) for control analytes and a Z-factor of 0.25 ± 0.06 (SD) for serum samples, the assay is rated an "excellent assay" for the synthetic peptide controls used and a "doable assay" when used to prioritize clinical samples. Hits, defined as samples containing BChE Ser-198 adducts or no BChE present, were analyzed in a confirmatory method for identification and quantitation of the BChE adduct, if present. The ability to prioritize samples by highest exposure for confirmatory analysis is of particular importance in an exposure to cholinesterase inhibitors such as organophosphorus nerve agents, where a large number of clinical samples may be collected. In an initial blind screen, 67 out of 70 samples were accurately identified, giving an assay accuracy of 96% with no false negatives. The method is the first to provide a high-throughput prioritization assay for profiling adduction of Ser-198 BChE in clinical samples. PMID:23954929
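The screening-window metric quoted above is the Z′/Z-factor of Zhang et al. (1999), Z = 1 − 3(σ_pos + σ_neg)/|µ_pos − µ_neg|. The minimal sketch below implements that calculation; the control readings are synthetic values chosen only to land near the reported Z′ of 0.88, not data from the paper.

```python
# Screening-window coefficient of Zhang et al. (1999): Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# The control readings below are synthetic values chosen to land near the reported Z' of 0.88.
import statistics as st

def z_factor(positive, negative):
    return 1 - 3 * (st.stdev(positive) + st.stdev(negative)) / abs(st.mean(positive) - st.mean(negative))

pos = [102, 98, 100, 103, 97]     # synthetic positive-control signal
neg = [11, 9, 10, 11.5, 8.5]      # synthetic negative-control signal
print(f"Z' = {z_factor(pos, neg):.2f}")   # 0.87; values >= 0.5 are rated "excellent" in this framework
```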
Sample-Clock Phase-Control Feedback
NASA Technical Reports Server (NTRS)
Quirk, Kevin J.; Gin, Jonathan W.; Nguyen, Danh H.; Nguyen, Huy
2012-01-01
To demodulate a communication signal, a receiver must recover and synchronize to the symbol timing of a received waveform. In a system that utilizes digital sampling, the fidelity of synchronization is limited by the time between the symbol boundary and closest sample time location. To reduce this error, one typically uses a sample clock in excess of the symbol rate in order to provide multiple samples per symbol, thereby lowering the error limit to a fraction of a symbol time. For systems with a large modulation bandwidth, the required sample clock rate is prohibitive due to current technological barriers and processing complexity. With precise control of the phase of the sample clock, one can sample the received signal at times arbitrarily close to the symbol boundary, thus obviating the need, from a synchronization perspective, for multiple samples per symbol. Sample-clock phase-control feedback was developed for use in the demodulation of an optical communication signal, where multi-GHz modulation bandwidths would require prohibitively large sample clock frequencies for rates in excess of the symbol rate. A custom mixed-signal (RF/digital) offset phase-locked loop circuit was developed to control the phase of the 6.4-GHz clock that samples the photon-counting detector output. The offset phase-locked loop is driven by a feedback mechanism that continuously corrects for variation in the symbol time due to motion between the transmitter and receiver as well as oscillator instability. This innovation will allow significant improvements in receiver throughput; for example, the throughput of a pulse-position modulation (PPM) with 16 slots can increase from 188 Mb/s to 1.5 Gb/s.
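As a rough illustration of the throughput claim above, the sketch below relates PPM data rate to the slot count, the sample clock, and the number of samples required per slot. The model (rate = log2(slots) · f_sample / (samples_per_slot · slots)) and the assumed 8x oversampling in the baseline case are simplifying assumptions, not taken from the article; they only reproduce the order of magnitude of the quoted 188 Mb/s and 1.5 Gb/s figures.

```python
# Rough model of the throughput claim above: data rate = log2(slots) * f_sample /
# (samples_per_slot * slots). The 8x-oversampling baseline is an assumption used only to
# reproduce the order of magnitude of the quoted 188 Mb/s and 1.5 Gb/s figures.
import math

def ppm_data_rate(slots, f_sample_hz, samples_per_slot):
    """Approximate PPM data rate when every slot must contain samples_per_slot samples."""
    bits_per_symbol = math.log2(slots)
    symbol_rate = f_sample_hz / (samples_per_slot * slots)
    return bits_per_symbol * symbol_rate

f_clk = 6.4e9   # sample clock quoted in the abstract
print(ppm_data_rate(16, f_clk, 8) / 1e6, "Mb/s")   # ~200 Mb/s, comparable to the quoted 188 Mb/s
print(ppm_data_rate(16, f_clk, 1) / 1e9, "Gb/s")   # ~1.6 Gb/s, comparable to the quoted 1.5 Gb/s
```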
Verdirame, Maria; Veneziano, Maria; Alfieri, Anna; Di Marco, Annalise; Monteagudo, Edith; Bonelli, Fabio
2010-03-11
Turbulent Flow Chromatography (TFC) is a powerful approach for on-line extraction in bioanalytical studies. It improves sensitivity and reduces sample preparation time, two factors that are of primary importance in drug discovery. In this paper the application of the ARIA system to the analytical support of in vivo pharmacokinetics (PK) and in vitro drug metabolism studies is described, with an emphasis on high-throughput optimization. For PK studies, a comparison between acetonitrile plasma protein precipitation (APPP) and TFC was carried out. Our optimized TFC methodology gave better S/N ratios and a lower limit of quantification (LOQ) than conventional procedures. A robust and high-throughput analytical method to support hepatocyte metabolic stability screening of new chemical entities was developed by hyphenation of TFC with mass spectrometry. An in-loop dilution injection procedure was implemented to overcome one of the main issues when using TFC, namely the early elution of hydrophilic compounds, which results in low recoveries. A comparison between off-line solid phase extraction (SPE) and TFC was also carried out, and recovery, sensitivity (LOQ), matrix effect and robustness were evaluated. The use of two parallel columns in the configuration of the system provided a further increase in throughput. Copyright 2009 Elsevier B.V. All rights reserved.
Software Voting in Asynchronous NMR (N-Modular Redundancy) Computer Structures.
1983-05-06
added reliability is exchanged for increased system cost and decreased throughput. Some applications require extremely reliable systems, so the only...not the other way around. Although no systems provide abstract voting yet, as more applications are written for NMR systems, the programmers are going...throughput goes down, the overhead goes up. Mathematically: Overhead = Nonredundant Throughput - Actual Throughput (1). In this section, the actual throughput
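The fragment above defines overhead as the difference between nonredundant and actual throughput (Eq. 1). The sketch below pairs that bookkeeping with the majority-vote step at the heart of N-modular redundancy; the throughput figures are hypothetical placeholders, not measurements from the report.

```python
# Minimal sketch of N-modular-redundancy (NMR) software voting together with the overhead
# bookkeeping quoted above: Overhead = Nonredundant Throughput - Actual Throughput (Eq. 1).
# The throughput figures are hypothetical placeholders, not measurements from the report.
from collections import Counter

def majority_vote(results):
    """Value reported by a majority of redundant modules, or None if there is no majority."""
    value, count = Counter(results).most_common(1)[0]
    return value if count > len(results) / 2 else None

print(majority_vote([42, 42, 41]))   # 42: triple-modular redundancy masks one faulty module

nonredundant_throughput = 1000.0     # operations/s without redundancy (hypothetical)
actual_throughput = 850.0            # operations/s with voting and synchronization (hypothetical)
overhead = nonredundant_throughput - actual_throughput
print(f"Overhead: {overhead} operations/s")
```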
High-Throughput Quantitative Lipidomics Analysis of Nonesterified Fatty Acids in Human Plasma.
Christinat, Nicolas; Morin-Rivron, Delphine; Masoodi, Mojgan
2016-07-01
We present a high-throughput, nontargeted lipidomics approach using liquid chromatography coupled to high-resolution mass spectrometry for quantitative analysis of nonesterified fatty acids. We applied this method to screen a wide range of fatty acids from medium-chain to very long-chain (8 to 24 carbon atoms) in human plasma samples. The method enables us to chromatographically separate branched-chain species from their straight-chain isomers as well as separate biologically important ω-3 and ω-6 polyunsaturated fatty acids. We used 51 fatty acid species to demonstrate the quantitative capability of this method with quantification limits in the nanomolar range; however, this method is not limited only to these fatty acid species. High-throughput sample preparation was developed and carried out on a robotic platform that allows extraction of 96 samples simultaneously within 3 h. This high-throughput platform was used to assess the influence of different types of human plasma collection and preparation on the nonesterified fatty acid profile of healthy donors. Use of the anticoagulants EDTA and heparin has been compared with simple clotting, and only limited changes have been detected in most nonesterified fatty acid concentrations.
Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao
2016-04-01
The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
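A quick back-of-the-envelope view of the reported figures: 64 parallel channels delivering 7.68 Gbps in total corresponds to 120 Mb/s per channel, and the sketch below estimates how many channels a given target rate would need under the stated assumption of linear scaling. The linear-scaling helper is an illustration, not a claim from the paper.

```python
# Throughput bookkeeping for the generator described above. The per-channel figure follows
# directly from the quoted numbers; the helper assumes the output scales linearly with the
# number of parallel channels, which is an illustrative simplification.
import math

total_throughput_bps = 7.68e9       # total output rate quoted in the abstract
n_units = 64                        # number of generation channels quoted in the abstract
per_unit_bps = total_throughput_bps / n_units
print(per_unit_bps / 1e6, "Mb/s per channel")   # 120.0 Mb/s

def units_needed(target_bps):
    """Channels required to hit a target rate, assuming linear scaling (illustrative)."""
    return math.ceil(target_bps / per_unit_bps)

print(units_needed(1e9))            # about 9 channels for a 1 Gb/s requirement
```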
High-throughput DNA extraction of forensic adhesive tapes.
Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes
2016-09-01
Tape-lifting has since its introduction in the early 2000's become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
Adapting the γ-H2AX assay for automated processing in human lymphocytes. 1. Technological aspects.
Turner, Helen C; Brenner, David J; Chen, Youhua; Bertucci, Antonella; Zhang, Jian; Wang, Hongliang; Lyulko, Oleksandra V; Xu, Yanping; Shuryak, Igor; Schaefer, Julia; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y Lawrence; Amundson, Sally A; Garty, Guy
2011-03-01
The immunofluorescence-based detection of γ-H2AX is a reliable and sensitive method for quantitatively measuring DNA double-strand breaks (DSBs) in irradiated samples. Since H2AX phosphorylation is highly linear with radiation dose, this well-established biomarker is in current use in radiation biodosimetry. At the Center for High-Throughput Minimally Invasive Radiation Biodosimetry, we have developed a fully automated high-throughput system, the RABIT (Rapid Automated Biodosimetry Tool), that can be used to measure γ-H2AX yields from fingerstick-derived samples of blood. The RABIT workstation has been designed to fully automate the γ-H2AX immunocytochemical protocol, from the isolation of human blood lymphocytes in heparin-coated PVC capillaries to the immunolabeling of γ-H2AX protein and image acquisition to determine fluorescence yield. High throughput is achieved through the use of purpose-built robotics, lymphocyte handling in 96-well filter-bottomed plates, and high-speed imaging. The goal of the present study was to optimize and validate the performance of the RABIT system for the reproducible and quantitative detection of γ-H2AX total fluorescence in lymphocytes in a multiwell format. Validation of our biodosimetry platform was achieved by the linear detection of a dose-dependent increase in γ-H2AX fluorescence in peripheral blood samples irradiated ex vivo with γ rays over the range 0 to 8 Gy. This study demonstrates for the first time the optimization and use of our robotically based biodosimetry workstation to successfully quantify γ-H2AX total fluorescence in irradiated peripheral lymphocytes.
Dunning, F. Mark; Ruge, Daniel R.; Piazza, Timothy M.; Stanker, Larry H.; Zeytin, Füsûn N.
2012-01-01
Rapid, high-throughput assays that detect and quantify botulinum neurotoxin (BoNT) activity in diverse matrices are required for environmental, clinical, pharmaceutical, and food testing. The current standard, the mouse bioassay, is sensitive but is low in throughput and precision. In this study, we present three biochemical assays for the detection and quantification of BoNT serotype A, B, and F proteolytic activities in complex matrices that offer picomolar to femtomolar sensitivity with small assay volumes and total assay times of less than 24 h. These assays consist of magnetic beads conjugated with BoNT serotype-specific antibodies that are used to purify BoNT from complex matrices before the quantification of bound BoNT proteolytic activity using the previously described BoTest reporter substrates. The matrices tested include human serum, whole milk, carrot juice, and baby food, as well as buffers containing common pharmaceutical excipients. The limits of detection were below 1 pM for BoNT/A and BoNT/F and below 10 pM for BoNT/B in most tested matrices using 200-μl samples and as low as 10 fM for BoNT/A with an increased sample volume. Together, these data describe rapid, robust, and high-throughput assays for BoNT detection that are compatible with a wide range of matrices. PMID:22923410
Wu, Jiapeng; Hong, Yiguo; Guan, Fengjie; Wang, Yan; Tan, Yehui; Yue, Weizhong; Wu, Meilin; Bin, Liying; Wang, Jiaping; Wen, Jiali
2016-02-01
The well-known zinc-cadmium reduction method is frequently used for the determination of nitrate. However, this method is seldom applied in field research on nitrate because it is time consuming and demands large sample volumes. Here, we report a modified zinc-cadmium reduction method (MZCRM) for measurement of nitrate at natural-abundance levels in both seawater and freshwater. The main improvements of the MZCRM include the use of small-volume disposable tubes for the reaction, a vortex apparatus for shaking to increase the reduction rate, and a microplate reader for high-throughput spectrophotometric measurements. To account for salt effects, two salinity ranges (5-10 psu and 20-35 psu) were set up for more accurate determination of nitrate under low- and high-salinity conditions, respectively. Under optimized experimental conditions, the reduction rates stabilized at 72% and 63% at salinities of 5 and 20 psu, respectively. The detection limit for nitrate was 0.5 μM, and the response was linear up to 100 μM (RSD 4.8%). Assays of environmental samples demonstrated that the MZCRM was consistent with the conventional zinc-cadmium reduction method. Overall, this modified method greatly improved the accuracy and efficiency of operations and enables rapid, low-cost, high-throughput determination of nitrate in field analyses.
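The sketch below shows one plausible way such microplate readings could be turned into nitrate concentrations: a linear absorbance calibration followed by a correction for the salinity-dependent reduction efficiency reported above (72% at 5 psu, 63% at 20 psu). The calibration slope and intercept are invented for illustration and are not values from the paper.

```python
# Hedged sketch of converting a microplate absorbance reading into a nitrate concentration
# for this type of assay: linear calibration against nitrite standards, then correction for
# the salinity-dependent reduction efficiency reported above (72% at 5 psu, 63% at 20 psu).
# The calibration slope and intercept are invented placeholders, not values from the paper.
def nitrate_uM(absorbance, salinity_psu, slope=0.012, intercept=0.005):
    """slope: absorbance per uM nitrite (hypothetical); intercept: blank absorbance (hypothetical)."""
    reduction_rate = 0.72 if salinity_psu <= 10 else 0.63
    nitrite_equivalent_uM = (absorbance - intercept) / slope
    return nitrite_equivalent_uM / reduction_rate

print(f"{nitrate_uM(0.30, salinity_psu=30):.1f} uM")   # example reading from a high-salinity sample
```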
High-throughput sequencing of forensic genetic samples using punches of FTA cards with buccal swabs.
Kampmann, Marie-Louise; Buchard, Anders; Børsting, Claus; Morling, Niels
2016-01-01
Here, we demonstrate that punches from buccal swab samples preserved on FTA cards can be used for high-throughput DNA sequencing, also known as massively parallel sequencing (MPS). We typed 44 reference samples with the HID-Ion AmpliSeq Identity Panel using washed 1.2 mm punches from FTA cards with buccal swabs and compared the results with those obtained with DNA extracted using the EZ1 DNA Investigator Kit. Concordant profiles were obtained for all samples. Our protocol includes simple punch, wash, and PCR steps, reducing cost and hands-on time in the laboratory. Furthermore, it facilitates automation of DNA sequencing.
Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis.
Tu, Jing; Ge, Qinyu; Wang, Shengqin; Wang, Lei; Sun, Beili; Yang, Qi; Bai, Yunfei; Lu, Zuhong
2012-01-25
Multiplexing has become the major limitation of next-generation sequencing (NGS) when applied to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously analyzed by combining 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated, and about 64% of them were assigned to both barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured for the two clinical groups. The strong correlation between different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be analyzed in parallel, so that high-throughput sequencing economically meets the needs of samples with low individual sequencing-throughput demands.
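To make the combinatorial idea concrete, the sketch below builds the 4 x 8 = 32 forward/reverse barcode pairs and assigns a read to a library only when both barcodes are recognized (reads failing this check correspond to the roughly 36% left unassigned above). The barcode sequences and read layout are made up for illustration.

```python
# Hedged sketch of the pair-barcode idea: 4 forward x 8 reverse barcodes address 4 * 8 = 32
# libraries, and a read is assigned only when both barcodes are recognized. The barcode
# sequences and the read layout (forward at the start, reverse at the end) are illustrative
# assumptions, not the sequences used in the study.
from itertools import product

forward = ["ACGT", "TGCA", "GATC", "CTAG"]              # 4 hypothetical forward barcodes
reverse = ["AAAA", "CCCC", "GGGG", "TTTT",
           "ACAC", "CACA", "GTGT", "TGTG"]              # 8 hypothetical reverse barcodes

pair_to_library = {pair: i for i, pair in enumerate(product(forward, reverse))}
print(len(pair_to_library))                             # 32 addressable libraries

def demultiplex(read):
    """Return the library index if both barcodes match, else None (read left unassigned)."""
    return pair_to_library.get((read[:4], read[-4:]))

print(demultiplex("ACGT" + "NNNNNNNNNN" + "GGGG"))      # 2: the library addressed by (ACGT, GGGG)
print(demultiplex("ACGT" + "NNNNNNNNNN" + "NNNN"))      # None: reverse barcode not recognized
```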
NASA Technical Reports Server (NTRS)
Jandebeur, T. S.
1980-01-01
The effect of sample concentration on throughput and resolution in a modified continuous particle electrophoresis (CPE) system with flow in an upward direction is investigated. Maximum resolution is achieved at concentrations ranging from 2 × 10^8 cells/ml to 8 × 10^8 cells/ml. The widest peak separation is at 2 × 10^8 cells/ml; however, the sharpest peaks and least overlap between cell populations are at 8 × 10^8 cells/ml. Apparently as a result of improved electrophoresis cell performance due to coating the chamber with bovine serum albumin, changing the electrode membranes and rinse, and lowering buffer temperatures, sedimentation effects associated with higher concentrations are diminished. Throughput as measured by recovery of fixed cells is diminished at the concentrations judged most likely to yield satisfactory resolution. The tradeoff appears to be improved recovery/throughput at the expense of resolution.
High-throughput microcoil NMR of compound libraries using zero-dispersion segmented flow analysis.
Kautz, Roger A; Goetzinger, Wolfgang K; Karger, Barry L
2005-01-01
An automated system for loading samples into a microcoil NMR probe has been developed using segmented flow analysis. This approach enhanced 2-fold the throughput of the published direct injection and flow injection methods, improved sample utilization 3-fold, and was applicable to high-field NMR facilities with long transfer lines between the sample handler and NMR magnet. Sample volumes of 2 microL (10-30 mM, approximately 10 microg) were drawn from a 96-well microtiter plate by a sample handler, then pumped to a 0.5-microL microcoil NMR probe as a queue of closely spaced "plugs" separated by an immiscible fluorocarbon fluid. Individual sample plugs were detected by their NMR signal and automatically positioned for stopped-flow data acquisition. The sample in the NMR coil could be changed within 35 s by advancing the queue. The fluorocarbon liquid wetted the wall of the Teflon transfer line, preventing the DMSO samples from contacting the capillary wall and thus reducing sample losses to below 5% after passage through the 3-m transfer line. With a wash plug of solvent between samples, sample-to-sample carryover was <1%. Significantly, the samples did not disperse into the carrier liquid during loading or during acquisitions of several days for trace analysis. For automated high-throughput analysis using a 16-second acquisition time, spectra were recorded at a rate of 1.5 min/sample and total deuterated solvent consumption was <0.5 mL (1 US dollar) per 96-well plate.
Uplink Downlink Rate Balancing and Throughput Scaling in FDD Massive MIMO Systems
NASA Astrophysics Data System (ADS)
Bergel, Itsik; Perets, Yona; Shamai, Shlomo
2016-05-01
In this work we extend the concept of uplink-downlink rate balancing to frequency division duplex (FDD) massive MIMO systems. We consider a base station with a large number of antennas serving many single-antenna users. We first show that any unused capacity in the uplink can be traded off for higher throughput in the downlink in a system that uses either dirty paper (DP) coding or linear zero-forcing (ZF) precoding. We then also study the scaling of the system throughput with the number of antennas in the cases of linear beamforming (BF) precoding, ZF precoding, and DP coding. We show that the downlink throughput is proportional to the logarithm of the number of antennas. While this logarithmic scaling is lower than the linear scaling of the rate in the uplink, it can still bring significant throughput gains. For example, we demonstrate through analysis and simulation that increasing the number of antennas from 4 to 128 will increase the throughput by more than a factor of 5. We also show that a logarithmic scaling of downlink throughput as a function of the number of receive antennas can be achieved even when the number of transmit antennas only increases logarithmically with the number of receive antennas.
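As a numerical illustration of the logarithmic scaling and the quoted 4-to-128-antenna example, the sketch below evaluates a simplified single-user beamforming rate of log2(1 + M·ρ); both this model and the per-antenna SNR ρ = 0.2 are illustrative assumptions rather than the paper's analysis.

```python
# Numerical illustration of the logarithmic downlink scaling using a simplified single-user
# beamforming rate, log2(1 + M * rho); this model and the per-antenna SNR rho = 0.2 are
# illustrative assumptions, not the paper's analysis.
import math

def bf_rate(m_antennas, rho=0.2):
    return math.log2(1 + m_antennas * rho)

gain = bf_rate(128) / bf_rate(4)
print(f"Rate gain from 4 to 128 antennas: {gain:.1f}x")   # about 5.6x, in line with the quoted example
```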
Enhancing Bottom-up and Top-down Proteomic Measurements with Ion Mobility Separations
Baker, Erin Shammel; Burnum-Johnson, Kristin E.; Ibrahim, Yehia M.; ...
2015-07-03
Proteomic measurements with greater throughput, sensitivity and additional structural information enable more in-depth characterization of complex mixtures and give targeted studies higher confidence. While liquid chromatography coupled with mass spectrometry (LC-MS) measurements have provided information on thousands of proteins in different sample types, the addition of another rapid separation stage providing structural information has many benefits for analyses. Technical advances in ion funnels and multiplexing have enabled ion mobility separations to be easily and effectively coupled with LC-MS proteomics to enhance the information content of measurements. Herein, we report on applications illustrating the increased sensitivity, throughput and structural information gained by utilizing IMS-MS and LC-IMS-MS measurements for both bottom-up and top-down proteomics.
Colour-barcoded magnetic microparticles for multiplexed bioassays.
Lee, Howon; Kim, Junhoi; Kim, Hyoki; Kim, Jiyun; Kwon, Sunghoon
2010-09-01
Encoded particles have a demonstrated value for multiplexed high-throughput bioassays such as drug discovery and clinical diagnostics. In diverse samples, the ability to use a large number of distinct identification codes on assay particles is important to increase throughput. Proper handling schemes are also needed to readout these codes on free-floating probe microparticles. Here we create vivid, free-floating structural coloured particles with multi-axis rotational control using a colour-tunable magnetic material and a new printing method. Our colour-barcoded magnetic microparticles offer a coding capacity easily into the billions with distinct magnetic handling capabilities including active positioning for code readouts and active stirring for improved reaction kinetics in microscale environments. A DNA hybridization assay is done using the colour-barcoded magnetic microparticles to demonstrate multiplexing capabilities.
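The "coding capacity easily into the billions" claim is easiest to see as simple combinatorics: with C distinguishable structural colours at each of N code positions on a particle, C^N distinct codes are possible. The specific numbers below (8 colours, 10 positions) are illustrative and not taken from the paper.

```python
# Illustrative combinatorics only (the particles' actual encoding scheme is not reproduced
# here): with C distinguishable structural colours at each of N code positions on a particle,
# the number of distinct codes is C**N, which reaches the billions for modest C and N.
def coding_capacity(n_colours, n_positions):
    return n_colours ** n_positions

print(coding_capacity(8, 10))   # 8 colours x 10 positions -> 1,073,741,824 (~1.1 billion) codes
```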
Investigation Of In-Line Monitoring Options At H Canyon/HB Line For Plutonium Oxide Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sexton, L.
2015-10-14
H Canyon and HB Line have a production goal of 1 MT per year of plutonium oxide feedstock for the MOX facility by FY17 (AFS-2 mission). In order to meet this goal, steps will need to be taken to improve processing efficiency. One concept for achieving this goal is to implement in-line process monitoring at key measurement points within the facilities. In-line monitoring during operations has the potential to increase throughput and efficiency while reducing costs associated with laboratory sample analysis. In the work reported here, we mapped the plutonium oxide process, identified key measurement points, investigated alternate technologies that could be used for in-line analysis, and initiated a throughput benefit analysis.
Gold nanoparticles for high-throughput genotyping of long-range haplotypes
NASA Astrophysics Data System (ADS)
Chen, Peng; Pan, Dun; Fan, Chunhai; Chen, Jianhua; Huang, Ke; Wang, Dongfang; Zhang, Honglu; Li, You; Feng, Guoyin; Liang, Peiji; He, Lin; Shi, Yongyong
2011-10-01
Completion of the Human Genome Project and the HapMap Project has led to increasing demands for mapping complex traits in humans to understand the aetiology of diseases. Identifying variations in the DNA sequence, which affect how we develop disease and respond to pathogens and drugs, is important for this purpose, but it is difficult to identify these variations in large sample sets. Here we show that through a combination of capillary sequencing and polymerase chain reaction assisted by gold nanoparticles, it is possible to identify several DNA variations that are associated with age-related macular degeneration and psoriasis on significant regions of human genomic DNA. Our method is accurate and promising for large-scale and high-throughput genetic analysis of susceptibility towards disease and drug resistance.
A high-throughput two channel discrete wavelet transform architecture for the JPEG2000 standard
NASA Astrophysics Data System (ADS)
Badakhshannoory, Hossein; Hashemi, Mahmoud R.; Aminlou, Alireza; Fatemi, Omid
2005-07-01
The Discrete Wavelet Transform (DWT) is increasingly recognized in image and video compression standards, as indicated by its use in JPEG2000. The lifting scheme algorithm is an alternative DWT implementation that has a lower computational complexity and reduced resource requirement. In the JPEG2000 standard, two lifting-scheme-based filter banks are introduced: the 5/3 and the 9/7. In this paper, a high-throughput, two-channel DWT architecture for both of the JPEG2000 DWT filters is presented. The proposed pipelined architecture has two separate input channels that process the incoming samples simultaneously with minimum memory requirement for each channel. The architecture has been implemented in VHDL and synthesized on a Xilinx Virtex2 XCV1000. The proposed architecture applies the DWT to a 2K by 1K image at 33 fps with a 75 MHz clock frequency. This performance is achieved with 70% fewer resources than two independent single-channel modules. The high throughput and reduced resource requirement make this architecture a proper choice for real-time applications such as Digital Cinema.
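For reference, the reversible 5/3 filter bank mentioned above reduces to two integer lifting steps per decomposition level. The sketch below is a plain 1-D software version with simple symmetric boundary extension, intended only to show the arithmetic that such hardware pipelines; it is not a model of the proposed two-channel architecture.

```python
# Plain 1-D software sketch of the reversible JPEG2000 5/3 lifting steps (predict + update),
# with simple symmetric boundary extension. It illustrates the arithmetic only and is not a
# model of the two-channel hardware architecture described above.
def dwt53_1d(x):
    """One level of the 5/3 lifting DWT for an even-length list of integers."""
    n = len(x)
    assert n % 2 == 0 and n >= 4
    # Predict step: detail (high-pass) coefficients
    d = []
    for i in range(n // 2):
        left = x[2 * i]
        right = x[2 * i + 2] if 2 * i + 2 < n else x[2 * i]   # symmetric extension at the right edge
        d.append(x[2 * i + 1] - (left + right) // 2)
    # Update step: approximation (low-pass) coefficients
    s = []
    for i in range(n // 2):
        d_left = d[i - 1] if i > 0 else d[0]                  # symmetric extension at the left edge
        s.append(x[2 * i] + (d_left + d[i] + 2) // 4)
    return s, d

lowpass, highpass = dwt53_1d([10, 12, 14, 20, 18, 16, 10, 8])
print(lowpass, highpass)   # [10, 15, 20, 10] [0, 4, 2, -2]
```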
Targeted Capture and High-Throughput Sequencing Using Molecular Inversion Probes (MIPs).
Cantsilieris, Stuart; Stessman, Holly A; Shendure, Jay; Eichler, Evan E
2017-01-01
Molecular inversion probes (MIPs) in combination with massively parallel DNA sequencing represent a versatile, yet economical tool for targeted sequencing of genomic DNA. Several thousand genomic targets can be selectively captured using long oligonucleotides containing unique targeting arms and universal linkers. The ability to append sequencing adaptors and sample-specific barcodes allows large-scale pooling and subsequent high-throughput sequencing at relatively low cost per sample. Here, we describe a "wet bench" protocol detailing the capture and subsequent sequencing of >2000 genomic targets from 192 samples, representative of a single lane on the Illumina HiSeq 2000 platform.
Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.
Yang, Darren; Wong, Wesley P
2018-01-01
We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.
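The force applied in a centrifuge force microscope is commonly estimated from the buoyancy-corrected centripetal force on the bead, F = V(ρ_bead − ρ_fluid)ω²r. The sketch below evaluates this for an illustrative bead size, rotor radius and spin speed; these numbers are assumptions, not parameters from the protocol.

```python
# Hedged estimate of the force a centrifuge force microscope applies to a tethered bead,
# F = V * (rho_bead - rho_fluid) * omega^2 * r. Bead size, densities, rotor radius and spin
# speed below are illustrative assumptions, not parameters from the protocol.
import math

def cfm_force_pN(bead_radius_m, rho_bead, rho_fluid, rpm, rotor_radius_m):
    volume = 4.0 / 3.0 * math.pi * bead_radius_m ** 3
    omega = 2.0 * math.pi * rpm / 60.0
    return volume * (rho_bead - rho_fluid) * omega ** 2 * rotor_radius_m * 1e12

# 3-um polystyrene bead (1050 kg/m^3) in water (1000 kg/m^3), 10 cm rotor radius, 3000 rpm:
print(f"{cfm_force_pN(1.5e-6, 1050, 1000, 3000, 0.10):.1f} pN")   # ~7 pN, a single-molecule-scale force
```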
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples...
BEAMS Lab: Novel approaches to finding a balance between throughput and sensitivity
NASA Astrophysics Data System (ADS)
Liberman, Rosa G.; Skipper, Paul L.; Prakash, Chandra; Shaffer, Christopher L.; Flarakos, Jimmy; Tannenbaum, Steven R.
2007-06-01
Development of 14C AMS has long pursued the twin goals of maximizing both sensitivity and precision in the interest, among others, of optimizing radiocarbon dating. Application of AMS to biomedical research is less constrained with respect to sensitivity requirements, but more demanding of high throughput. This work presents some technical and conceptual developments in sample processing and analytical instrumentation designed to streamline the process of extracting quantitative data from the various types of samples encountered in analytical biochemistry.
Zhang, Shu-Xin; Peng, Rong; Jiang, Ran; Chai, Xin-Sheng; Barnes, Donald G
2018-02-23
This paper reports on a high-throughput headspace gas chromatographic method (HS-GC) for the determination of nitrite content in water sample, based on GC measurement of cyclohexene produced from the reaction between nitrite and cyclamate in a closed vial. The method has a relative standard deviation of <3.5%; The differences between the results of the nitrite measurements obtained by this method and those of a reference method were less than 5.8% and the recoveries of the method were in the range of 94.8-102% (for a spiked nitrite content range from 0.002 to 0.03 mg/L). The limit of detection of the method was 0.46 μg L -1 . Due to an overlapping mode in the headspace auto-sampler system, the method can provide an automated and high-throughput nitrite analysis for the surface water samples. In short, the present HS-GC method is simple, accurate, and sensitive, and it is very suitable to be used in the batch sample testing. Copyright © 2018 Elsevier B.V. All rights reserved.
Automated sample area definition for high-throughput microscopy.
Zeder, M; Ellrott, A; Amann, R
2011-04-01
High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications are based on imaging geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others need to cope with inhomogeneously located samples on glass slides. The analysis of microbial communities in aquatic systems by sample filtration on membrane filters followed by multiple fluorescent staining, or the investigation of tissue sections are examples. Therefore, we developed a strategy for flexible and fast definition of sample locations by the acquisition of whole slide overview images and automated sample recognition by image analysis. Our approach was tested on different microscopes and the computer programs are freely available (http://www.technobiology.ch). Copyright © 2011 International Society for Advancement of Cytometry.
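As a rough idea of what automated sample recognition on an overview image can look like, the sketch below thresholds the image and keeps large contours as candidate sample areas. It is a generic OpenCV example under assumed file names and tuning values, not the authors' published algorithm (which is available at the URL above).

```python
# Generic sketch (not the authors' published code, which is available at the URL above) of
# recognizing sample areas in a whole-slide overview image by Otsu thresholding and contour
# filtering. Requires OpenCV; "overview.png" and the area threshold are placeholder assumptions.
import cv2

img = cv2.imread("overview.png")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (11, 11), 0)
_, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    if cv2.contourArea(c) > 50000:                  # keep only large regions (tuning assumption)
        x, y, w, h = cv2.boundingRect(c)
        print(f"candidate sample area at x={x}, y={y}, w={w}, h={h}")
```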
High-throughput screening for combinatorial thin-film library of thermoelectric materials.
Watanabe, Masaki; Kita, Takuji; Fukumura, Tomoteru; Ohtomo, Akira; Ueno, Kazunori; Kawasaki, Masashi
2008-01-01
A high-throughput method has been developed to evaluate the Seebeck coefficient and electrical resistivity of combinatorial thin-film libraries of thermoelectric materials from room temperature to 673 K. Thin-film samples several millimeters in size were deposited on an integrated Al2O3 substrate with embedded lead wires and local heaters for measurement of the thermopower under a controlled temperature gradient. An infrared camera was used for real-time observation of the temperature difference Delta T between two electrical contacts on the sample to obtain the Seebeck coefficient. The Seebeck coefficient and electrical resistivity of constantan thin films were shown to be almost identical to standard data for bulk constantan. High-throughput screening was demonstrated for a thermoelectric Mg-Si-Ge combinatorial library.
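The two screened quantities reduce to simple relations: the Seebeck coefficient S = −ΔV/ΔT from the thermoelectric voltage across the imaged temperature difference, and the resistivity ρ = R·A/L from the film geometry. The sketch below evaluates both for made-up example readings; the geometry and values are illustrative, not from the library described above.

```python
# Minimal sketch of the two screened quantities: Seebeck coefficient S = -dV/dT and
# resistivity rho = R * A / L. The film geometry and the example readings are illustrative
# assumptions, not data from the combinatorial library described above.
def seebeck_uV_per_K(delta_v_volts, delta_t_kelvin):
    return -delta_v_volts / delta_t_kelvin * 1e6

def resistivity_ohm_m(resistance_ohm, width_m, thickness_m, length_m):
    return resistance_ohm * (width_m * thickness_m) / length_m

print(seebeck_uV_per_K(0.20e-3, 5.0))                 # -40.0 uV/K for 0.20 mV across a 5 K difference
print(resistivity_ohm_m(50.0, 2e-3, 500e-9, 5e-3))    # 1e-05 ohm*m for a 50-ohm, 5-mm-long track
```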
Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hura, Greg L.; Menon, Angeli L.; Hammel, Michal
2009-07-20
We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes formore » 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.« less
Russo, Marina; Dugo, Paola; Marzocco, Stefania; Inferrera, Veronica; Mondello, Luigi
2015-12-01
Important objectives of a preparative high-performance liquid chromatography process are purity of the isolated products, yield, and throughput. The multidimensional preparative liquid chromatography method used in this work was developed mainly to increase throughput; moreover, purity and yield are increased thanks to the automated collection of molecules based on the intensity of the signal generated by the mass spectrometer detector, so that only a specific product is targeted. In only a few analyses in both the first and second dimensions, this preparative system allowed the isolation of eight pure compounds, present at very different concentrations in the original sample, with high purity (>95%) and yield, which shows how efficient and versatile the system is. The pure molecules were used to validate the analytical method and to test the anti-inflammatory and antiproliferative potential of flavonoids. The simultaneous presence of all the flavonoids in bergamot juice increases the anti-inflammatory effect with respect to any single compound alone. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Alvarez, Guillermo Dufort Y; Favaro, Federico; Lecumberry, Federico; Martin, Alvaro; Oliver, Juan P; Oreggioni, Julian; Ramirez, Ignacio; Seroussi, Gadiel; Steinfeld, Leonardo
2018-02-01
This work presents a wireless multichannel electroencephalogram (EEG) recording system featuring lossless and near-lossless compression of the digitized EEG signal. Two novel, low-complexity, efficient compression algorithms were developed and tested in a low-power platform. The algorithms were tested on six public EEG databases, comparing favorably with the best compression rates reported to date in the literature. In its lossless mode, the platform is capable of encoding and transmitting 59-channel EEG signals, sampled at 500 Hz and 16 bits per sample, at a current consumption of 337 µA per channel; this comes with a guarantee that the decompressed signal is identical to the sampled one. The near-lossless mode allows for significant energy savings and/or higher throughputs in exchange for a small guaranteed maximum per-sample distortion in the recovered signal. Finally, we address the tradeoff between computation cost and transmission savings by evaluating three alternatives: sending raw data, or encoding with one of two compression algorithms that differ in complexity and compression performance. We observe that the higher the throughput (number of channels and sampling rate) the larger the benefits obtained from compression.
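The "guaranteed maximum per-sample distortion" property can be illustrated with a generic predictive coder in which the prediction residual is quantized with step 2δ+1, bounding the reconstruction error by ±δ (δ = 0 gives the lossless mode). This is a textbook-style sketch under that assumption, not the authors' algorithms.

```python
# Textbook-style sketch of the near-lossless idea: a previous-sample predictor whose residual
# is quantized with step 2*delta + 1, so the reconstruction error never exceeds +/- delta per
# sample (delta = 0 gives the lossless mode). This is a generic scheme, not the authors' algorithms.
def near_lossless_encode(samples, delta):
    codes, prev = [], 0
    for x in samples:
        residual = x - prev
        q = (abs(residual) + delta) // (2 * delta + 1) * (1 if residual >= 0 else -1)
        codes.append(q)
        prev += q * (2 * delta + 1)     # track the decoder-side reconstruction
    return codes

def near_lossless_decode(codes, delta):
    out, prev = [], 0
    for q in codes:
        prev += q * (2 * delta + 1)
        out.append(prev)
    return out

eeg = [0, 3, 7, 6, 2, -4, -9, -5]       # toy sample values
rec = near_lossless_decode(near_lossless_encode(eeg, 1), 1)
print(all(abs(a - b) <= 1 for a, b in zip(eeg, rec)))   # True: error bounded by delta = 1
```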
Pan, Yuchen; Sackmann, Eric K; Wypisniak, Karolina; Hornsby, Michael; Datwani, Sammy S; Herr, Amy E
2016-12-23
High-quality immunoreagents enhance the performance and reproducibility of immunoassays and, in turn, the quality of both biological and clinical measurements. High quality recombinant immunoreagents are generated using antibody-phage display. One metric of antibody quality - the binding affinity - is quantified through the dissociation constant (KD) of each recombinant antibody and the target antigen. To characterize the KD of recombinant antibodies and target antigen, we introduce affinity electrophoretic mobility shift assays (EMSAs) in a high-throughput format suitable for small volume samples. A microfluidic card comprised of free-standing polyacrylamide gel (fsPAG) separation lanes supports 384 concurrent EMSAs in 30 s using a single power source. Sample is dispensed onto the microfluidic EMSA card by acoustic droplet ejection (ADE), which reduces EMSA variability compared to sample dispensing using manual or pin tools. The KD for each of a six-member fragment antigen-binding fragment library is reported using ~25-fold less sample mass and ~5-fold less time than conventional heterogeneous assays. Given the form factor and performance of this micro- and mesofluidic workflow, we have developed a sample-sparing, high-throughput, solution-phase alternative for biomolecular affinity characterization.
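Once the shifted and unshifted band intensities give a fraction bound at each titration point, KD is typically obtained by fitting a simple binding isotherm, fraction bound = [L]/(KD + [L]). The sketch below fits synthetic data under that standard assumption; it is not the authors' analysis code.

```python
# Standard binding-isotherm fit used to extract a dissociation constant from titration data:
# fraction bound = [L] / (KD + [L]), assuming the titrated species is in large excess.
# The concentrations and "measured" fractions below are synthetic, not data from the paper.
import numpy as np
from scipy.optimize import curve_fit

def fraction_bound(conc_nM, kd_nM):
    return conc_nM / (kd_nM + conc_nM)

conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)                    # nM titration series
measured = fraction_bound(conc, 25.0) + np.random.normal(0, 0.02, conc.size)    # true KD = 25 nM + noise

(kd_fit,), _ = curve_fit(fraction_bound, conc, measured, p0=[10.0])
print(f"Fitted KD ~ {kd_fit:.1f} nM")
```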
Surveying the repair of ancient DNA from bones via high-throughput sequencing.
Mouttham, Nathalie; Klunk, Jennifer; Kuch, Melanie; Fourney, Ron; Poinar, Hendrik
2015-07-01
DNA damage in the form of abasic sites, chemically altered nucleotides, and strand fragmentation is the foremost limitation in obtaining genetic information from many ancient samples. Upon cell death, DNA continues to endure various chemical attacks such as hydrolysis and oxidation, but repair pathways found in vivo no longer operate. By incubating degraded DNA with specific enzyme combinations adopted from these pathways, it is possible to reverse some of the post-mortem nucleic acid damage prior to downstream analyses such as library preparation, targeted enrichment, and high-throughput sequencing. Here, we evaluate the performance of two available repair protocols on previously characterized DNA extracts from four mammoths. Both methods use endonucleases and glycosylases along with a DNA polymerase-ligase combination. PreCR Repair Mix increases the number of molecules converted to sequencing libraries, leading to an increase in endogenous content and a decrease in cytosine-to-thymine transitions due to cytosine deamination. However, the effects of Nelson Repair Mix on repair of DNA damage remain inconclusive.
A Robotic Platform for Quantitative High-Throughput Screening
Michael, Sam; Auld, Douglas; Klumpp, Carleen; Jadhav, Ajit; Zheng, Wei; Thorne, Natasha; Austin, Christopher P.; Inglese, James
2008-01-01
High-throughput screening (HTS) is increasingly being adopted in academic institutions, where the decoupling of screening and drug development has led to unique challenges, as well as novel uses of instrumentation, assay formulations, and software tools. Advances in technology have made automated unattended screening in the 1,536-well plate format broadly accessible and have further facilitated the exploration of new technologies and approaches to screening. A case in point is our recently developed quantitative HTS (qHTS) paradigm, which tests each library compound at multiple concentrations to construct concentration-response curves (CRCs) generating a comprehensive data set for each assay. The practical implementation of qHTS for cell-based and biochemical assays across libraries of > 100,000 compounds (e.g., between 700,000 and 2,000,000 sample wells tested) requires maximal efficiency and miniaturization and the ability to easily accommodate many different assay formats and screening protocols. Here, we describe the design and utilization of a fully integrated and automated screening system for qHTS at the National Institutes of Health's Chemical Genomics Center. We report system productivity, reliability, and flexibility, as well as modifications made to increase throughput, add additional capabilities, and address limitations. The combination of this system and qHTS has led to the generation of over 6 million CRCs from > 120 assays in the last 3 years and is a technology that can be widely implemented to increase efficiency of screening and lead generation. PMID:19035846
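The defining analysis step of qHTS is fitting a concentration-response curve to every library compound. The sketch below fits a four-parameter Hill (log-logistic) equation to a synthetic seven-point titration; the functional form is the standard one used for CRCs, while the data and starting values are invented for illustration.

```python
# The core qHTS analysis step: fit a concentration-response curve to each library compound.
# Below, a four-parameter Hill (log-logistic) equation is fitted to a synthetic seven-point
# titration; the functional form is standard, while the data and start values are invented.
import numpy as np
from scipy.optimize import curve_fit

def hill(log_conc, bottom, top, log_ec50, slope):
    return bottom + (top - bottom) / (1.0 + 10.0 ** ((log_ec50 - log_conc) * slope))

log_conc = np.linspace(-9, -4, 7)                                    # 1 nM to 100 uM in log10(M)
resp = hill(log_conc, 0, 100, -6.0, 1.2) + np.random.normal(0, 3, log_conc.size)

popt, _ = curve_fit(hill, log_conc, resp, p0=[0, 100, -6.0, 1.0])
print(f"Fitted log10(EC50) ~ {popt[2]:.2f}, Hill slope ~ {popt[3]:.2f}")
```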
Müller, Marco; Wasmer, Katharina; Vetter, Walter
2018-06-29
Countercurrent chromatography (CCC) is an all liquid based separation technique typically used for the isolation and purification of natural compounds. The simplicity of the method makes it easy to scale up CCC separations from analytical to preparative and even industrial scale. However, scale-up of CCC separations requires two different instruments with varying coil dimensions. Here we developed two variants of the CCC multiple injection mode as an alternative to increase the throughput and enhance productivity of a CCC separation when using only one instrument. The concept is based on the parallel injection of samples at different points in the CCC column system and the simultaneous separation using one pump only. The wiring of the CCC setup was modified by the insertion of a 6-port selection valve, multiple T-pieces and sample loops. Furthermore, the introduction of storage sample loops enabled the CCC system to be used with repeated injection cycles. Setup and advantages of both multiple injection modes were shown by the isolation of the furan fatty acid 11-(3,4-dimethyl-5-pentylfuran-2-yl)-undecanoic acid (11D5-EE) from an ethyl ester oil rich in 4,7,10,13,16,19-docosahexaenoic acid (DHA-EE). 11D5-EE was enriched in one step from 1.9% to 99% purity. The solvent consumption per isolated amount of analyte could be reduced by ∼40% compared to increased throughput CCC and by ∼5% in the repeated multiple injection mode which also facilitated the isolation of the major compound (DHA-EE) in the sample. Copyright © 2018 Elsevier B.V. All rights reserved.
Nemes, Peter; Hoover, William J; Keire, David A
2013-08-06
Sensors with high chemical specificity and enhanced sample throughput are vital to screening food products and medical devices for chemical or biochemical contaminants that may pose a threat to public health. For example, the rapid detection of oversulfated chondroitin sulfate (OSCS) in heparin could prevent reoccurrence of heparin adulteration that caused hundreds of severe adverse events including deaths worldwide in 2007-2008. Here, rapid pyrolysis is integrated with direct analysis in real time (DART) mass spectrometry to rapidly screen major glycosaminoglycans, including heparin, chondroitin sulfate A, dermatan sulfate, and OSCS. The results demonstrate that, compared to traditional liquid chromatography-based analyses, pyrolysis mass spectrometry achieved at least 250-fold higher sample throughput and was compatible with samples volume-limited to about 300 nL. Pyrolysis yielded an abundance of fragment ions (e.g., 150 different m/z species), many of which were specific to the parent compound. Using multivariate and statistical data analysis models, these data enabled facile differentiation of the glycosaminoglycans with high throughput. After method development was completed, authentically contaminated samples obtained during the heparin crisis by the FDA were analyzed in a blinded manner for OSCS contamination. The lower limit of differentiation and detection were 0.1% (w/w) OSCS in heparin and 100 ng/μL (20 ng) OSCS in water, respectively. For quantitative purposes the linear dynamic range spanned approximately 3 orders of magnitude. Moreover, this chemical readout was successfully employed to find clues in the manufacturing history of the heparin samples that can be used for surveillance purposes. The presented technology and data analysis protocols are anticipated to be readily adaptable to other chemical and biochemical agents and volume-limited samples.
High-Throughput Density Measurement Using Magnetic Levitation.
Ge, Shencheng; Wang, Yunzhe; Deshler, Nicolas J; Preston, Daniel J; Whitesides, George M
2018-06-20
This work describes the development of an integrated analytical system that enables high-throughput density measurements of diamagnetic particles (including cells) using magnetic levitation (MagLev), 96-well plates, and a flatbed scanner. MagLev is a simple and useful technique with which to carry out density-based analysis and separation of a broad range of diamagnetic materials with different physical forms (e.g., liquids, solids, gels, pastes, gums, etc.); one major limitation, however, is the capacity to perform high-throughput density measurements. This work addresses this limitation by (i) re-engineering the shape of the magnetic fields so that the MagLev system is compatible with 96-well plates, and (ii) integrating a flatbed scanner (and simple optical components) to carry out imaging of the samples that levitate in the system. The resulting system is compatible with both biological samples (human erythrocytes) and nonbiological samples (simple liquids and solids, such as 3-chlorotoluene, cholesterol crystals, glass beads, copper powder, and polymer beads). The high-throughput capacity of this integrated MagLev system will enable new applications in chemistry (e.g., analysis and separation of materials) and biochemistry (e.g., cellular responses under environmental stresses) in a simple and label-free format on the basis of a universal property of all matter, i.e., density.
2018-01-01
Effect-directed analysis (EDA) is a commonly used approach for effect-based identification of endocrine disruptive chemicals in complex (environmental) mixtures. However, for routine toxicity assessment of, for example, water samples, current EDA approaches are considered time-consuming and laborious. We achieved faster EDA and identification by downscaling of sensitive cell-based hormone reporter gene assays and increasing fractionation resolution to allow testing of smaller fractions with reduced complexity. The high-resolution EDA approach is demonstrated by analysis of four environmental passive sampler extracts. Downscaling of the assays to a 384-well format allowed analysis of 64 fractions in triplicate (or 192 fractions without technical replicates) without affecting sensitivity compared to the standard 96-well format. Through a parallel exposure method, agonistic and antagonistic androgen and estrogen receptor activity could be measured in a single experiment following a single fractionation. From 16 selected candidate compounds, identified through nontargeted analysis, 13 could be confirmed chemically and 10 were found to be biologically active, of which the most potent nonsteroidal estrogens were identified as oxybenzone and piperine. The increased fractionation resolution and the higher throughput that downscaling provides allow for future application in routine high-resolution screening of large numbers of samples in order to accelerate identification of (emerging) endocrine disruptors. PMID:29547277
2015-01-01
A hybrid microchip/capillary electrophoresis (CE) system was developed to allow unbiased and lossless sample loading and high-throughput repeated injections. This new hybrid CE system consists of a poly(dimethylsiloxane) (PDMS) microchip sample injector featuring a pneumatic microvalve that separates a sample introduction channel from a short sample loading channel, and a fused-silica capillary separation column that connects seamlessly to the sample loading channel. The sample introduction channel is pressurized such that when the pneumatic microvalve opens briefly, a variable-volume sample plug is introduced into the loading channel. A high voltage for CE separation is continuously applied across the loading channel and the fused-silica capillary separation column. Analytes are rapidly separated in the fused-silica capillary, and following separation, high-sensitivity MS detection is accomplished via a sheathless CE/ESI-MS interface. The performance evaluation of the complete CE/ESI-MS platform demonstrated that reproducible sample injection with well controlled sample plug volumes could be achieved by using the PDMS microchip injector. The absence of band broadening from microchip to capillary indicated a minimum dead volume at the junction. The capabilities of the new CE/ESI-MS platform in performing high-throughput and quantitative sample analyses were demonstrated by the repeated sample injection without interrupting an ongoing separation and a linear dependence of the total analyte ion abundance on the sample plug volume using a mixture of peptide standards. The separation efficiency of the new platform was also evaluated systematically at different sample injection times, flow rates, and CE separation voltages. PMID:24865952
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly, Ryan T.; Wang, Chenchen; Rausch, Sarah J.
2014-07-01
A hybrid microchip/capillary CE system was developed to allow unbiased and lossless sample loading and high throughput repeated injections. This new hybrid CE system consists of a polydimethylsiloxane (PDMS) microchip sample injector featuring a pneumatic microvalve that separates a sample introduction channel from a short sample loading channel and a fused silica capillary separation column that connects seamlessly to the sample loading channel. The sample introduction channel is pressurized such that when the pneumatic microvalve opens briefly, a variable-volume sample plug is introduced into the loading channel. A high voltage for CE separation is continuously applied across the loading channel and the fused silica capillary separation column. Analytes are rapidly separated in the fused silica capillary with high resolution. High sensitivity MS detection after CE separation is accomplished via a sheathless CE/ESI-MS interface. The performance evaluation of the complete CE/ESI-MS platform demonstrated that reproducible sample injection with well controlled sample plug volumes could be achieved by using the PDMS microchip injector. The absence of band broadening from microchip to capillary indicated a minimum dead volume at the junction. The capabilities of the new CE/ESI-MS platform in performing high throughput and quantitative sample analyses were demonstrated by the repeated sample injection without interrupting an ongoing separation and a good linear dependence of the total analyte ion abundance on the sample plug volume using a mixture of peptide standards. The separation efficiency of the new platform was also evaluated systematically at different sample injection times, flow rates and CE separation voltages.
Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C
2016-01-01
Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for studies of biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput for such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort required to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, has helped attract new faculty hires, and has led to exciting publications. © 2016 Elsevier Inc. All rights reserved.
Throughput Benefit Assessment for Tactical Runway Configuration Management (TRCM)
NASA Technical Reports Server (NTRS)
Phojanamongkolkij, Nipa; Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Fenbert, James W.
2014-01-01
The System-Oriented Runway Management (SORM) concept is a collection of needed capabilities focused on a more efficient use of runways while considering all of the factors that affect runway use. Tactical Runway Configuration Management (TRCM), one of the SORM capabilities, provides runway configuration and runway usage recommendations, monitoring the active runway configuration for suitability given existing factors, based on a 90 minute planning horizon. This study evaluates the throughput benefits using a representative sample of today's traffic volumes at three airports: Memphis International Airport (MEM), Dallas-Fort Worth International Airport (DFW), and John F. Kennedy International Airport (JFK). Based on this initial assessment, there are statistically significant throughput benefits for both arrivals and departures at MEM, with an average of 4% for arrivals and 6% for departures. For DFW, there is a statistically significant benefit for arrivals, with an average of 3%. Although there is an average of 1% benefit observed for departures, it is not statistically significant. For JFK, there is a 12% benefit for arrivals, but a 2% penalty for departures. The results obtained are for current traffic volumes and should show greater benefit as future demand increases. This paper also proposes some potential TRCM algorithm improvements for future research. A continuing research plan is being developed to implement these improvements and to re-assess the throughput benefit for today's and future projected traffic volumes.
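The study's exact statistical procedure is not described in this summary; as a loosely hedged sketch of one way such a throughput benefit could be tested, the example below runs a paired t-test on made-up hourly arrival counts with and without TRCM (all values and the choice of test are assumptions for illustration):

```python
# Sketch: paired comparison of arrival throughput with vs. without TRCM.
# Counts are made up; the actual study's data and test may differ.
from scipy import stats

baseline_arrivals_per_hr = [52, 48, 55, 50, 47, 53, 51, 49]
trcm_arrivals_per_hr = [54, 50, 58, 52, 48, 55, 53, 51]

t_stat, p_value = stats.ttest_rel(trcm_arrivals_per_hr, baseline_arrivals_per_hr)
gains = [t - b for t, b in zip(trcm_arrivals_per_hr, baseline_arrivals_per_hr)]
mean_gain = sum(gains) / len(gains)
print(f"mean gain = {mean_gain:.1f} arrivals/h, paired t = {t_stat:.2f}, p = {p_value:.4f}")
```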
Murlidhar, Vasudha; Zeinali, Mina; Grabauskiene, Svetlana; Ghannad-Rezaie, Mostafa; Wicha, Max S; Simeone, Diane M; Ramnath, Nithya; Reddy, Rishindra M; Nagrath, Sunitha
2014-12-10
Circulating tumor cells (CTCs) are believed to play an important role in metastasis, a process responsible for the majority of cancer-related deaths. However, their rarity in the bloodstream makes microfluidic isolation complex and time-consuming. Additionally, the low processing speeds can be a hindrance to obtaining higher yields of CTCs, limiting their potential use as biomarkers for early diagnosis. Here, a high throughput microfluidic technology, the OncoBean Chip, is reported. It employs radial flow that introduces a varying shear profile across the device, enabling efficient cell capture by affinity at high flow rates. The recovery from whole blood is validated with cancer cell lines H1650 and MCF7, achieving a mean efficiency >80% at a throughput of 10 mL h(-1), in contrast to the flow rate of 1 mL h(-1) typically reported with other microfluidic devices. Cells are recovered with a viability rate of 93% at these high speeds, increasing the ability to use captured CTCs for downstream analysis. Broad clinical application is demonstrated using comparable flow rates with blood specimens obtained from breast, pancreatic, and lung cancer patients. Comparable CTC numbers are recovered in all the samples at the two flow rates, demonstrating the ability of the technology to perform at high throughputs. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Wei; Shabbir, Faizan; Gong, Chao
2015-04-13
We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.
Liu, Gary W; Livesay, Brynn R; Kacherovsky, Nataly A; Cieslewicz, Maryelise; Lutz, Emi; Waalkes, Adam; Jensen, Michael C; Salipante, Stephen J; Pun, Suzie H
2015-08-19
Peptide ligands are used to increase the specificity of drug carriers to their target cells and to facilitate intracellular delivery. One method to identify such peptide ligands, phage display, enables high-throughput screening of peptide libraries for ligands binding to therapeutic targets of interest. However, conventional methods for identifying target binders in a library by Sanger sequencing are low-throughput, labor-intensive, and provide a limited perspective (<0.01%) of the complete sequence space. Moreover, the small sample space can be dominated by nonspecific, preferentially amplifying "parasitic sequences" and plastic-binding sequences, which may lead to the identification of false positives or exclude the identification of target-binding sequences. To overcome these challenges, we employed next-generation Illumina sequencing to couple high-throughput screening and high-throughput sequencing, enabling more comprehensive access to the phage display library sequence space. In this work, we define the hallmarks of binding sequences in next-generation sequencing data, and develop a method that identifies several target-binding phage clones for murine, alternatively activated M2 macrophages with a high (100%) success rate: sequences and binding motifs were reproducibly present across biological replicates; binding motifs were identified across multiple unique sequences; and an unselected, amplified library accurately filtered out parasitic sequences. In addition, we validate the Multiple Em for Motif Elicitation tool as an efficient and principled means of discovering binding sequences.
Bergander, Tryggve; Nilsson-Välimaa, Kristina; Oberg, Katarina; Lacki, Karol M
2008-01-01
Steadily increasing demand for more efficient and more affordable biomolecule-based therapies puts a significant burden on biopharma companies to reduce the cost of R&D activities associated with introduction of a new drug to the market. Reducing the time required to develop a purification process would be one option to address the high cost issue. The reduction in time can be accomplished if more efficient methods/tools are available for process development work, including high-throughput techniques. This paper addresses the transition from traditional column-based process development to a modern high-throughput approach utilizing microtiter filter plates filled with a well-defined volume of chromatography resin. The approach is based on implementing the well-known batch uptake principle into microtiter plate geometry. Two variants of the proposed approach, allowing for either qualitative or quantitative estimation of dynamic binding capacity as a function of residence time, are described. Examples of quantitative estimation of dynamic binding capacities of human polyclonal IgG on MabSelect SuRe and of qualitative estimation of dynamic binding capacity of amyloglucosidase on a prototype of Capto DEAE weak ion exchanger are given. The proposed high-throughput method for determination of dynamic binding capacity significantly reduces time and sample consumption as compared to a traditional method utilizing packed chromatography columns without sacrificing the accuracy of data obtained.
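A minimal sketch of the batch-uptake mass balance that underlies such filter-plate capacity estimates; converting this to a dynamic binding capacity at a given residence time involves further steps not shown here, and all numbers below are hypothetical:

```python
# Sketch: static binding capacity from one batch-uptake filter-plate well.
# q = (C0 - C_final) * V_liquid / V_resin  -- a simple mass balance on the well.
c0_mg_per_ml = 2.0        # feed IgG concentration
c_final_mg_per_ml = 0.35  # supernatant concentration after incubation (e.g., by A280)
v_liquid_ml = 0.30        # liquid volume dispensed into the well
v_resin_ml = 0.010        # defined resin volume in the well

q_mg_per_ml_resin = (c0_mg_per_ml - c_final_mg_per_ml) * v_liquid_ml / v_resin_ml
print(f"Static binding capacity ≈ {q_mg_per_ml_resin:.0f} mg IgG per mL resin")
```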
Christiansen, Anders; Kringelum, Jens V; Hansen, Christian S; Bøgh, Katrine L; Sullivan, Eric; Patel, Jigar; Rigby, Neil M; Eiwegger, Thomas; Szépfalusi, Zsolt; de Masi, Federico; Nielsen, Morten; Lund, Ole; Dufva, Martin
2015-08-06
Phage display is a prominent screening technique with a multitude of applications including therapeutic antibody development and mapping of antigen epitopes. In this study, phages were selected based on their interaction with patient serum and exhaustively characterised by high-throughput sequencing. A bioinformatics approach was developed in order to identify peptide motifs of interest based on clustering and contrasting to control samples. Comparison of patient and control samples confirmed a major issue in phage display, namely the selection of unspecific peptides. The potential of the bioinformatic approach was demonstrated by identifying epitopes of a prominent peanut allergen, Ara h 1, in sera from patients with severe peanut allergy. The identified epitopes were confirmed by high-density peptide micro-arrays. The present study demonstrates that high-throughput sequencing can empower phage display by (i) enabling the analysis of complex biological samples, (ii) circumventing the traditional laborious picking and functional testing of individual phage clones and (iii) reducing the number of selection rounds.
Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W
2017-09-19
Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.
Jiang, Fan; Fu, Wei; Clarke, Anthony R; Schutze, Mark Kurt; Susanto, Agus; Zhu, Shuifang; Li, Zhihong
2016-11-01
Invasive species can be detrimental to a nation's ecology, economy and human health. Rapid and accurate diagnostics are critical to limit the establishment and spread of exotic organisms. The increasing rate of biological invasions relative to the taxonomic expertise available generates a demand for high-throughput, DNA-based diagnostic methods for identification. We designed species-specific qPCR primer and probe combinations for 27 economically important Tephritidae species in six genera (Anastrepha, Bactrocera, Carpomya, Ceratitis, Dacus and Rhagoletis) based on 935 COI DNA barcode haplotypes from 181 fruit fly species publicly available in BOLD, and then tested the specificity of each primer pair and probe through qPCR of 35 of those species. We then developed a standardization reaction system for detecting the 27 target species based on a microfluidic dynamic array and also applied the method to identify unknown immature samples from port interceptions and field monitoring. This method led to specific and simultaneous detection of all 27 species in 7.5 h, using only 0.2 μL of reaction system in each reaction chamber. The approach successfully discriminated among species within complexes that had genetic similarities of up to 98.48%, and it identified all immature samples consistently with the subsequent morphological examination of adults reared from larvae of the same cohorts. We present an accurate, rapid and high-throughput innovative approach for detecting fruit flies of quarantine concern. This new method has broad potential to become an international standard for plant quarantine and invasive species detection. © 2016 John Wiley & Sons Ltd.
Adaptive data rate capacity of meteor-burst communications
NASA Astrophysics Data System (ADS)
Larsen, J. D.; Melville, S. W.; Mawrey, R. S.
The use of adaptive data rates in the meteor-burst communications environment is investigated. Measured results obtained from a number of meteor links are presented and compared with previous theoretical predictions. The contribution of various meteor trail families to throughput capacity are also investigated. The results show that the use of adaptive data rates can significantly increase the throughput capacity of meteor-burst communication systems. The greatest rate of increase in throughput with increase in operating rate is found at low operating rates. This finding has been confirmed for a variety of links and days. Reasonable correspondence is obtained between the predicted modified overdense model and the observed results. Overdense trails, in particular two trail types within the overdense family, are shown to dominate adaptive data throughput.
Von Dreele, Robert B.; D'Amico, Kevin
2006-10-31
A process is provided for the high throughput screening of binding of ligands to macromolecules using high resolution powder diffraction data including producing a first sample slurry of a selected polycrystalline macromolecule material and a solvent, producing a second sample slurry of a selected polycrystalline macromolecule material, one or more ligands and the solvent, obtaining a high resolution powder diffraction pattern on each of said first sample slurry and the second sample slurry, and, comparing the high resolution powder diffraction pattern of the first sample slurry and the high resolution powder diffraction pattern of the second sample slurry whereby a difference in the high resolution powder diffraction patterns of the first sample slurry and the second sample slurry provides a positive indication for the formation of a complex between the selected polycrystalline macromolecule material and at least one of the one or more ligands.
DeForge, L E; Billeci, K L; Kramer, S M
2000-11-01
Given the increasing incidence of methicillin resistant Staphylococcus aureus (MRSA) and the recent emergence of MRSA with a reduced susceptibility to vancomycin, alternative approaches to the treatment of infection are of increasing relevance. The purpose of these studies was to evaluate the effect of IFN-gamma on the ability of white blood cells to kill S. aureus and to develop a simpler, higher throughput bacterial killing assay. Using a methicillin sensitive clinical isolate of S. aureus, a clinical isolate of MRSA, and a commercially available strain of MRSA, studies were conducted using a killing assay in which the bacteria were added directly into whole blood. The viability of the bacteria in samples harvested at various time points was then evaluated both by the classic CFU assay and by a new assay using alamarBlue. In the latter method, serially diluted samples and a standard curve containing known concentrations of bacteria were placed on 96-well plates, and alamarBlue was added. Fluorescence readings were taken, and the viability of the bacteria in the samples was calculated using the standard curve. The results of these studies demonstrated that the CFU and alamarBlue methods yielded equivalent detection of bacteria diluted in buffer. For samples incubated in whole blood, however, the alamarBlue method tended to yield lower viabilities than the CFU method due to the emergence of a slower growing subpopulation of S. aureus upon incubation in the blood matrix. A significant increase in bacterial killing was observed upon pretreatment of whole blood for 24 h with 5 or 25 ng/ml IFN-gamma. This increase in killing was detected equivalently by the CFU and alamarBlue methods. In summary, these studies describe a method that allows for the higher throughput analysis of the effects of immunomodulators on bacterial killing.
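A hedged sketch of the standard-curve step described above: fluorescence readings from wells containing known bacterial concentrations are fit and then inverted to estimate the viable count in unknown samples. The readings, the log-linear calibration form, and the helper name are assumptions for illustration only.

```python
# Sketch: estimate viable bacteria from alamarBlue fluorescence via a standard curve.
# Assumes fluorescence is roughly linear in log10(CFU/mL) over the working range.
import numpy as np

std_cfu_per_ml = np.array([1e4, 1e5, 1e6, 1e7, 1e8])       # known standards
std_fluorescence = np.array([120, 410, 980, 1650, 2300])   # hypothetical readings

slope, intercept = np.polyfit(np.log10(std_cfu_per_ml), std_fluorescence, 1)

def cfu_from_fluorescence(signal):
    """Invert the fitted standard curve to estimate CFU/mL for a sample reading."""
    return 10 ** ((signal - intercept) / slope)

for sample_signal in (750, 1900):
    print(f"signal {sample_signal} -> ~{cfu_from_fluorescence(sample_signal):.2e} CFU/mL")
```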
White, David T; Eroglu, Arife Unal; Wang, Guohua; Zhang, Liyun; Sengupta, Sumitra; Ding, Ding; Rajpurohit, Surendra K; Walker, Steven L; Ji, Hongkai; Qian, Jiang; Mumm, Jeff S
2017-01-01
The zebrafish has emerged as an important model for whole-organism small-molecule screening. However, most zebrafish-based chemical screens have achieved only mid-throughput rates. Here we describe a versatile whole-organism drug discovery platform that can achieve true high-throughput screening (HTS) capacities. This system combines our automated reporter quantification in vivo (ARQiv) system with customized robotics, and is termed ‘ARQiv-HTS’. We detail the process of establishing and implementing ARQiv-HTS: (i) assay design and optimization, (ii) calculation of sample size and hit criteria, (iii) large-scale egg production, (iv) automated compound titration, (v) dispensing of embryos into microtiter plates, and (vi) reporter quantification. We also outline what we see as best practice strategies for leveraging the power of ARQiv-HTS for zebrafish-based drug discovery, and address technical challenges of applying zebrafish to large-scale chemical screens. Finally, we provide a detailed protocol for a recently completed inaugural ARQiv-HTS effort, which involved the identification of compounds that elevate insulin reporter activity. Compounds that increased the number of insulin-producing pancreatic beta cells represent potential new therapeutics for diabetic patients. For this effort, individual screening sessions took 1 week to conclude, and sessions were performed iteratively approximately every other day to increase throughput. At the conclusion of the screen, more than a half million drug-treated larvae had been evaluated. Beyond this initial example, however, the ARQiv-HTS platform is adaptable to almost any reporter-based assay designed to evaluate the effects of chemical compounds in living small-animal models. ARQiv-HTS thus enables large-scale whole-organism drug discovery for a variety of model species and from numerous disease-oriented perspectives. PMID:27831568
High-throughput methods for electron crystallography.
Stokes, David L; Ubarretxena-Belandia, Iban; Gonen, Tamir; Engel, Andreas
2013-01-01
Membrane proteins play a tremendously important role in cell physiology and serve as a target for an increasing number of drugs. Structural information is key to understanding their function and for developing new strategies for combating disease. However, the complex physical chemistry associated with membrane proteins has made them more difficult to study than their soluble cousins. Electron crystallography has historically been a successful method for solving membrane protein structures and has the advantage of providing a native lipid environment for these proteins. Specifically, when membrane proteins form two-dimensional arrays within a lipid bilayer, electron microscopy can be used to collect images and diffraction and the corresponding data can be combined to produce a three-dimensional reconstruction, which under favorable conditions can extend to atomic resolution. As in X-ray crystallography, the quality of the structures is very much dependent on the order and size of the crystals. However, unlike X-ray crystallography, high-throughput methods for screening crystallization trials for electron crystallography are not in general use. In this chapter, we describe two alternative methods for high-throughput screening of membrane protein crystallization within the lipid bilayer. The first method relies on the conventional use of dialysis for removing detergent and thus reconstituting the bilayer; an array of dialysis wells in the standard 96-well format allows the use of a liquid-handling robot and greatly increases throughput. The second method relies on titration of cyclodextrin as a chelating agent for detergent; a specialized pipetting robot has been designed not only to add cyclodextrin in a systematic way, but to use light scattering to monitor the reconstitution process. In addition, the use of liquid-handling robots for making negatively stained grids and methods for automatically imaging samples in the electron microscope are described.
Schönberg, Anna; Theunert, Christoph; Li, Mingkun; Stoneking, Mark; Nasidze, Ivan
2011-09-01
To investigate the demographic history of human populations from the Caucasus and surrounding regions, we used high-throughput sequencing to generate 147 complete mtDNA genome sequences from random samples of individuals from three groups from the Caucasus (Armenians, Azeri and Georgians), and one group each from Iran and Turkey. Overall diversity is very high, with 144 different sequences that fall into 97 different haplogroups found among the 147 individuals. Bayesian skyline plots (BSPs) of population size change through time show a population expansion around 40-50 kya, followed by a constant population size, and then another expansion around 15-18 kya for the groups from the Caucasus and Iran. The BSP for Turkey differs the most from the others, with an increase from 35 to 50 kya followed by a prolonged period of constant population size, and no indication of a second period of growth. An approximate Bayesian computation approach was used to estimate divergence times between each pair of populations; the oldest divergence times were between Turkey and the other four groups from the South Caucasus and Iran (~400-600 generations), while the divergence time of the three Caucasus groups from each other was comparable to their divergence time from Iran (average of ~360 generations). These results illustrate the value of random sampling of complete mtDNA genome sequences that can be obtained with high-throughput sequencing platforms.
Implementation of context independent code on a new array processor: The Super-65
NASA Technical Reports Server (NTRS)
Colbert, R. O.; Bowhill, S. A.
1981-01-01
The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65, but within bounds the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.
High Throughput Sequence Analysis for Disease Resistance in Maize
USDA-ARS's Scientific Manuscript database
Preliminary results of a computational analysis of high throughput sequencing data from Zea mays and the fungus Aspergillus are reported. The Illumina Genome Analyzer was used to sequence RNA samples from two strains of Z. mays (Va35 and Mp313) collected over a time course as well as several specie...
USDA-ARS's Scientific Manuscript database
Bermuda grass samples were examined by transmission electron microscopy and 28-30 nm spherical virus particles were observed. Total RNA from these plants was subjected to high throughput sequencing (HTS). The nearly full genome sequence of a previously uncharacterized Panicovirus was identified from...
A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers
Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.
2016-01-01
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
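A minimal sketch, with hypothetical peak areas and calibration slopes, of how monomer yields could be computed once the arylglycerol standards have been run; the response factors and sample values below are assumptions, not data from the method itself:

```python
# Sketch: convert GC peak areas to lignin monomer yields (hypothetical numbers).
# Calibration slopes (area per µg) would come from the arylglycerol monomer standards.
biomass_mg = 1.5  # biomass weighed into the assay (the method uses 1-2 mg)

calibration_area_per_ug = {"H": 1250.0, "G": 1180.0, "S": 1100.0}  # assumed slopes
sample_peak_area = {"H": 3.1e3, "G": 9.4e4, "S": 6.2e4}            # assumed areas

yields_ug_per_mg = {
    monomer: sample_peak_area[monomer] / calibration_area_per_ug[monomer] / biomass_mg
    for monomer in calibration_area_per_ug
}
for monomer, y in yields_ug_per_mg.items():
    print(f"{monomer}: {y:.1f} µg per mg biomass")
print(f"S/G ratio ≈ {yields_ug_per_mg['S'] / yields_ug_per_mg['G']:.2f}")
```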
Li, Xiao-jun; Yi, Eugene C; Kemp, Christopher J; Zhang, Hui; Aebersold, Ruedi
2005-09-01
There is an increasing interest in the quantitative proteomic measurement of the protein contents of substantially similar biological samples, e.g. for the analysis of cellular response to perturbations over time or for the discovery of protein biomarkers from clinical samples. Technical limitations of current proteomic platforms such as limited reproducibility and low throughput make this a challenging task. A new LC-MS-based platform is able to generate complex peptide patterns from the analysis of proteolyzed protein samples at high throughput and represents a promising approach for quantitative proteomics. A crucial component of the LC-MS approach is the accurate evaluation of the abundance of detected peptides over many samples and the identification of peptide features that can stratify samples with respect to their genetic, physiological, or environmental origins. We present here a new software suite, SpecArray, that generates a peptide versus sample array from a set of LC-MS data. A peptide array stores the relative abundance of thousands of peptide features in many samples and is in a format identical to that of a gene expression microarray. A peptide array can be subjected to an unsupervised clustering analysis to stratify samples or to a discriminant analysis to identify discriminatory peptide features. We applied the SpecArray to analyze two sets of LC-MS data: one was from four repeat LC-MS analyses of the same glycopeptide sample, and another was from LC-MS analysis of serum samples of five male and five female mice. We demonstrate through these two study cases that the SpecArray software suite can serve as an effective software platform in the LC-MS approach for quantitative proteomics.
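A hedged sketch of the unsupervised clustering step described for such a peptide-versus-sample array; random numbers stand in for real LC-MS feature abundances, and the distance metric and linkage choices are illustrative rather than those of the SpecArray suite:

```python
# Sketch: hierarchical clustering of samples in a peptide-by-sample abundance matrix.
# The matrix is random and stands in for real LC-MS peptide feature abundances.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n_peptides, n_samples = 500, 10
abundance = rng.lognormal(mean=5.0, sigma=1.0, size=(n_peptides, n_samples))
abundance[:50, 5:] *= 3.0  # pretend 50 peptides are elevated in samples 5-9

# Cluster samples (columns) on log-transformed profiles using correlation distance.
profiles = np.log2(abundance).T
tree = linkage(pdist(profiles, metric="correlation"), method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print("sample cluster labels:", labels)
```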
High-throughput quantification of hydroxyproline for determination of collagen.
Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan
2011-10-15
An accurate and high-throughput assay for collagen is essential for collagen research and development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The time required for sample preparation using acid hydrolysis and neutralization prior to assay is what limits the current method for determining hydroxyproline. This work describes the conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline. Copyright © 2011 Elsevier Inc. All rights reserved.
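As a hedged illustration of the final conversion step only (the hydrolysis and colorimetric readout are not shown), collagen is commonly estimated by multiplying the measured hydroxyproline by a tissue-dependent factor; the factor and all sample values below are assumptions for the example:

```python
# Sketch: estimate collagen from a hydroxyproline measurement (hypothetical numbers).
# Collagen is often approximated as hydroxyproline x a tissue-dependent factor;
# values near 7-8 are commonly used for mammalian collagen, and 7.5 here is an assumption.
hydroxyproline_ug = 12.4   # from the colorimetric assay, per hydrolysed sample
sample_dry_mass_mg = 2.0   # mass of tissue hydrolysed
conversion_factor = 7.5    # assumed hydroxyproline-to-collagen factor

collagen_ug = hydroxyproline_ug * conversion_factor
collagen_pct = collagen_ug / (sample_dry_mass_mg * 1000.0) * 100.0
print(f"Estimated collagen: {collagen_ug:.0f} µg ({collagen_pct:.1f}% of dry mass)")
```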
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.
2016-09-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS is presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with high sample throughput rapid analysis.
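A hedged sketch of one common data-reduction step for MC-ICP-MS Sr isotope measurements, not necessarily the exact reduction implemented in the Neptune workflow referred to above: the measured 87Sr/86Sr is corrected for instrumental mass bias with the exponential law, normalizing to a canonical 86Sr/88Sr of 0.1194 (measured values below are hypothetical, and interference corrections are omitted):

```python
# Sketch: exponential-law mass-bias correction for an 87Sr/86Sr measurement.
# Assumes Rb and Kr interferences have already been handled; values are hypothetical.
import math

measured_86_88 = 0.1172   # measured 86Sr/88Sr
measured_87_86 = 0.71580  # measured 87Sr/86Sr
true_86_88 = 0.1194       # canonical normalization value
m86, m87, m88 = 85.9092607, 86.9088775, 87.9056122  # isotope masses (u)

# Fractionation exponent from the normalizing ratio, then applied to 87/86.
beta = math.log(true_86_88 / measured_86_88) / math.log(m86 / m88)
corrected_87_86 = measured_87_86 * (m87 / m86) ** beta
print(f"mass-bias-corrected 87Sr/86Sr ≈ {corrected_87_86:.5f}")
```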
Filter Paper-based Nucleic Acid Storage in High-throughput Solid Tumor Genotyping.
Stachler, Matthew; Jia, Yonghui; Sharaf, Nematullah; Wade, Jacqueline; Longtine, Janina; Garcia, Elizabeth; Sholl, Lynette M
2015-01-01
Molecular testing of tumors from formalin-fixed paraffin-embedded (FFPE) tissue blocks is central to clinical practice; however, it requires histology support and increases test turnaround time. Prospective fresh frozen tissue collection requires special handling, additional storage space, and may not be feasible for small specimens. Filter paper-based collection of tumor DNA reduces the need for histology support, requires little storage space, and preserves high-quality nucleic acid. We investigated the performance of tumor smears on filter paper in solid tumor genotyping, as compared with paired FFPE samples. Whatman FTA Micro Card (FTA preps) smears were prepared from 21 fresh tumor samples. A corresponding cytology smear was used to assess tumor cellularity and necrosis. DNA was isolated from FTA preps and FFPE core samples using automated methods and quantified using SYBR green dsDNA detection. Samples were genotyped for 471 mutations on a mass spectrometry-based platform (Sequenom). DNA concentrations from FTA preps and FFPE correlated for untreated carcinomas but not for mesenchymal tumors (Spearman ρ=0.39 and ρ=-0.1, respectively). Average DNA concentrations were lower from FTA preps as compared with FFPE, but DNA quality was higher with less fragmentation. Seventy-six percent of FTA preps and 86% of FFPE samples generated adequate DNA for genotyping. FTA preps tended to perform poorly for collection of DNA from pretreated carcinomas and mesenchymal neoplasms. Of the 16 paired DNA samples that were genotyped, 15 (94%) gave entirely concordant results. Filter paper-based sample preservation is a feasible alternative to FFPE for use in automated, high-throughput genotyping of carcinomas.
Abal-Fabeiro, J L; Maside, X; Llovo, J; Bello, X; Torres, M; Treviño, M; Moldes, L; Muñoz, A; Carracedo, A; Bartolomé, C
2014-04-01
The epidemiological study of human cryptosporidiosis requires the characterization of species and subtypes involved in human disease in large sample collections. Molecular genotyping is costly and time-consuming, making the implementation of low-cost, highly efficient technologies increasingly necessary. Here, we designed a protocol based on MALDI-TOF mass spectrometry for the high-throughput genotyping of a panel of 55 single nucleotide variants (SNVs) selected as markers for the identification of common gp60 subtypes of four Cryptosporidium species that infect humans. The method was applied to a panel of 608 human and 63 bovine isolates and the results were compared with control samples typed by Sanger sequencing. The method allowed the identification of species in 610 specimens (90·9%) and gp60 subtype in 605 (90·2%). It displayed excellent performance, with sensitivity and specificity values of 87·3 and 98·0%, respectively. Up to nine genotypes from four different Cryptosporidium species (C. hominis, C. parvum, C. meleagridis and C. felis) were detected in humans; the most common ones were C. hominis subtype Ib, and C. parvum IIa (61·3 and 28·3%, respectively). 96·5% of the bovine samples were typed as IIa. The method performs as well as the widely used Sanger sequencing and is more cost-effective and less time consuming.
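A minimal sketch of the performance metrics quoted above, computed from a confusion matrix of genotyping calls against reference (Sanger-typed) results; the counts are hypothetical and merely chosen to reproduce the reported rates:

```python
# Sketch: sensitivity and specificity from genotyping calls vs. reference typing.
# Counts are hypothetical, chosen to reproduce ~87.3% sensitivity and 98.0% specificity.
true_positives = 96    # variant present and correctly called
false_negatives = 14   # variant present but missed
true_negatives = 490   # variant absent and correctly not called
false_positives = 10   # variant absent but called

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```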
NASA Astrophysics Data System (ADS)
Rotem, Asaf; Garraway, Levi; Su, Mei-Ju; Basu, Anindita; Regev, Aviv; Struhl, Kevin
2017-02-01
Three-dimensional growth conditions reflect the natural environment of cancer cells and are crucial for drug screens. We developed a 3D assay for cellular transformation that involves growth in low attachment (GILA) conditions and is strongly correlated with the 50-year-old benchmark assay, soft agar. Using GILA, we performed high-throughput screens for drugs and genes that selectively inhibit or increase transformation, but not proliferation. This phenotypic approach is complementary to our genetic approach, which utilizes single-cell RNA-sequencing of a patient sample to identify putative oncogenes that confer sensitivity to drugs designed to specifically inhibit the identified oncoprotein. Currently, a major challenge in our field is the limited number of cells that can be extracted from a biopsy. Small patient-derived samples are hard to test in traditional multiwell plates, so it is helpful to minimize the culture area and the experimental system. We designed a microfluidic device suited to limited cell numbers and perform the assay using image analysis. We aim to test drugs on tumor cells outside the patient's body and to recommend an ideal treatment tailored to the individual. This device will help to minimize biopsy-sampling volumes and interventions in the patient's tumor.
Bautista-de Los Santos, Quyen Melina; Schroeder, Joanna L; Blakemore, Oliver; Moses, Jonathan; Haffey, Mark; Sloan, William; Pinto, Ameet J
2016-03-01
High-throughput and deep DNA sequencing, particularly amplicon sequencing, is being increasingly utilized to reveal spatial and temporal dynamics of bacterial communities in drinking water systems. Whilst the sampling and methodological biases associated with PCR and sequencing have been studied in other environments, they have not been quantified for drinking water. These biases are likely to have the greatest effect on the ability to characterize subtle spatio-temporal patterns influenced by process/environmental conditions. In such cases, intra-sample variability may swamp any underlying small, systematic variation. To evaluate this, we undertook a study with replication at multiple levels including sampling sites, sample collection, PCR amplification, and high throughput sequencing of 16S rRNA amplicons. The variability inherent to the PCR amplification and sequencing steps is significant enough to mask differences between bacterial communities from replicate samples. This was largely driven by greater variability in detection of rare bacteria (relative abundance <0.01%) across PCR/sequencing replicates as compared to replicate samples. Despite this, we captured significant changes in bacterial community over diurnal time-scales and find that the extent and pattern of diurnal changes is specific to each sampling location. Further, we find diurnal changes in bacterial community arise due to differences in the presence/absence of the low abundance bacteria and changes in the relative abundance of dominant bacteria. Finally, we show that bacterial community composition is significantly different across sampling sites for time-periods during which there are typically rapid changes in water use. This suggests hydraulic changes (driven by changes in water demand) contribute to shaping the bacterial community in bulk drinking water over diurnal time-scales. Copyright © 2015 Elsevier Ltd. All rights reserved.
High-resolution, high-throughput imaging with a multibeam scanning electron microscope.
Eberle, A L; Mikula, S; Schalek, R; Lichtman, J; Knothe Tate, M L; Zeidler, D
2015-08-01
Electron-electron interactions and detector bandwidth limit the maximal imaging speed of single-beam scanning electron microscopes. We use multiple electron beams in a single column and detect secondary electrons in parallel to increase the imaging speed by close to two orders of magnitude and demonstrate imaging for a variety of samples ranging from biological brain tissue to semiconductor wafers. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
Under-sampling in a Multiple-Channel Laser Vibrometry System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corey, Jordan
2007-03-01
Laser vibrometry is a technique used to detect vibrations on objects using the interference of coherent light with itself. Most vibrometry systems process only one target location at a time, but processing multiple locations simultaneously provides improved detection capabilities. Traditional laser vibrometry systems employ oversampling to sample the incoming modulated-light signal; however, as the number of channels increases in these systems, certain issues arise such as higher computational cost, excessive heat, increased power requirements, and increased component cost. This thesis describes a novel approach to laser vibrometry that utilizes undersampling to control the undesirable issues associated with over-sampled systems. Undersampling allows significantly fewer samples to represent the modulated-light signals, which offers several advantages in the overall system design. These advantages include an improvement in thermal efficiency, lower processing requirements, and a higher immunity to the relative intensity noise inherent in laser vibrometry applications. A unique feature of this implementation is the use of a parallel architecture to increase the overall system throughput. This parallelism is realized using a hierarchical multi-channel architecture based on off-the-shelf programmable logic devices (PLDs).
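A hedged sketch of the bandpass (under)sampling idea behind this approach: a narrowband modulated carrier well above the sampling rate is sampled far below the carrier's Nyquist rate, and the signal aliases to a predictable lower frequency from which the modulation can still be recovered. The frequencies below are illustrative and are not the thesis's actual system parameters.

```python
# Sketch: bandpass undersampling of a modulated optical-heterodyne carrier.
# A narrowband signal at fc is sampled at fs << 2*fc; it aliases to a predictable
# intermediate frequency, preserving the (slow) phase modulation of interest.
import numpy as np

fc = 40.5e6    # carrier frequency, Hz (illustrative)
fs = 5.0e6     # sampling rate, deliberately far below 2*fc
f_mod = 2.0e3  # slow phase modulation standing in for the target vibration

n = np.arange(4096)
t = n / fs
signal = np.cos(2 * np.pi * fc * t + 0.5 * np.sin(2 * np.pi * f_mod * t))

# Predicted alias location: distance from fc to the nearest integer multiple of fs.
f_alias_pred = abs(fc - round(fc / fs) * fs)

spectrum = np.abs(np.fft.rfft(signal * np.hanning(n.size)))
f_axis = np.fft.rfftfreq(n.size, d=1 / fs)
f_alias_obs = f_axis[np.argmax(spectrum)]
print(f"predicted alias {f_alias_pred / 1e3:.0f} kHz, observed peak {f_alias_obs / 1e3:.0f} kHz")
```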
Moret, Sabrina; Scolaro, Marianna; Barp, Laura; Purcaro, Giorgia; Conte, Lanfranco S
2016-04-01
A high throughput, high-sensitivity procedure, involving simultaneous microwave-assisted saponification (MAS) and extraction of the unsaponifiable fraction, followed by on-line liquid chromatography (LC)-gas chromatography (GC), has been optimised for rapid and efficient extraction and analytical determination of mineral oil saturated hydrocarbons (MOSH) and mineral oil aromatic hydrocarbons (MOAH) in cereal-based products of different composition. MAS has the advantage of eliminating fat before LC-GC analysis, allowing an increase in the amount of sample extract injected, and hence in sensitivity. The proposed method gave practically quantitative recoveries and good repeatability. Among the different cereal-based products analysed (dry semolina and egg pasta, bread, biscuits, and cakes), egg pasta packed in direct contact with recycled paperboard had on average the highest total MOSH level (15.9 mg kg(-1)), followed by cakes (10.4 mg kg(-1)) and bread (7.5 mg kg(-1)). About 50% of the pasta and bread samples and 20% of the biscuits and cake samples had detectable MOAH amounts. The highest concentrations were found in an egg pasta in direct contact with recycled paperboard (3.6 mg kg(-1)) and in a milk bread (3.6 mg kg(-1)). Copyright © 2015 Elsevier Ltd. All rights reserved.
Junqueira, João R C; de Araujo, William R; Salles, Maiara O; Paixão, Thiago R L C
2013-01-30
A simple and fast electrochemical method for quantitative analysis of picric acid explosive (nitro-explosive) based on its electrochemical reduction at copper surfaces is reported. To achieve a higher sample throughput, the electrochemical sensor was adapted in a flow injection system. Under optimal experimental conditions, the peak current response increases linearly with picric acid concentration over the range of 20-300 μmol L(-1). The repeatability of the electrode response in the flow injection analysis (FIA) configuration was evaluated as 3% (n=10), and the detection limit of the method was estimated to be 6.0 μmol L(-1) (S/N=3). The sample throughput under optimised conditions was estimated to be 550 samples h(-1). Peroxide explosives like triacetone triperoxide (TATP) and hexamethylene triperoxide diamine (HMTD) were tested as potential interfering substances for the proposed method, and no significant interference by these explosives was noticed. The proposed method has interesting analytical parameters, environmental applications, and low cost compared with other electroanalytical methods that have been reported for the quantification of picric acid. Additionally, the possibility to develop an in situ device for the detection of picric acid using a disposable sensor was evaluated. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.
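A minimal sketch of how a detection limit like the one quoted above can be estimated from a flow-injection calibration, taking the LOD as three times the blank noise divided by the calibration slope; the peak currents and noise value are hypothetical stand-ins:

```python
# Sketch: linear FIA calibration and a 3*sigma/slope detection-limit estimate.
# Peak currents and blank noise are hypothetical stand-ins for the picric acid signal.
import numpy as np

conc_umol_L = np.array([20, 50, 100, 150, 200, 250, 300], dtype=float)
peak_current_uA = np.array([0.41, 1.02, 2.05, 3.02, 4.10, 5.08, 6.11])
blank_noise_uA = 0.041  # standard deviation of the blank signal (assumed)

slope, intercept = np.polyfit(conc_umol_L, peak_current_uA, 1)
lod_umol_L = 3 * blank_noise_uA / slope
print(f"slope = {slope:.4f} µA per µmol/L, LOD ≈ {lod_umol_L:.1f} µmol/L")
```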
A device for high-throughput monitoring of degradation in soft tissue samples.
Tzeranis, D S; Panagiotopoulos, I; Gkouma, S; Kanakaris, G; Georgiou, N; Vaindirlis, N; Vasileiou, G; Neidlin, M; Gkousioudi, A; Spitas, V; Macheras, G A; Alexopoulos, L G
2018-06-06
This work describes the design and validation of a novel device, the High-Throughput Degradation Monitoring Device (HDD), for monitoring the degradation of 24 soft tissue samples over incubation periods of several days inside a cell culture incubator. The device quantifies sample degradation by monitoring its deformation induced by a static gravity load. Initial instrument design and experimental protocol development focused on quantifying cartilage degeneration. Characterization of measurement errors, caused mainly by thermal transients and by translating the instrument sensor, demonstrated that HDD can quantify sample degradation with <6 μm precision and <10 μm temperature-induced errors. HDD capabilities were evaluated in a pilot study that monitored the degradation of fresh ex vivo human cartilage samples by collagenase solutions over three days. HDD could robustly resolve the effects of collagenase concentrations as small as 0.5 mg/ml. Careful sample preparation resulted in measurements that did not suffer from donor-to-donor variation (coefficient of variance <70%). Due to its unique combination of sample throughput, measurement precision, temporal sampling and experimental versatility, HDD provides a novel biomechanics-based experimental platform for quantifying the effects of proteins (cytokines, growth factors, enzymes, antibodies) or small molecules on the degradation of soft tissues or tissue engineering constructs. Thereby, HDD can complement established tools and in vitro models in important applications including drug screening and biomaterial development. Copyright © 2018 Elsevier Ltd. All rights reserved.
USDA-ARS's Scientific Manuscript database
Extraction of DNA from tissue samples can be expensive both in time and monetary resources and can often require handling and disposal of hazardous chemicals. We have developed a high throughput protocol for extracting DNA from honey bees that is of a high enough quality and quantity to enable hundr...
USDA-ARS's Scientific Manuscript database
The Axiom® IStraw90 SNP (single nucleotide polymorphism) array was developed to enable high-throughput genotyping in allo-octoploid cultivated strawberry (Fragaria ×ananassa). However, high cost ($80-105 per sample) limits throughput for certain applications. On average the IStraw90 has yielded 50% ...
Daily, Neil J.; Du, Zhong-Wei
2017-01-01
Electrophysiology of excitable cells, including muscle cells and neurons, has been measured by making direct contact with a single cell using a micropipette electrode. To increase assay throughput, optical devices such as microscopes and microplate readers have been used to analyze the electrophysiology of multiple cells. We have established a high-throughput (HTP) analysis of action potentials (APs) in highly enriched motor neurons and cardiomyocytes (CMs) that are differentiated from human induced pluripotent stem cells (iPSCs). A multichannel electric field stimulation (EFS) device made it possible to electrically stimulate cells and measure dynamic changes in the APs of excitable cells ultra-rapidly (>100 data points per second) by imaging entire 96-well plates. We found that the activities of both neurons and CMs and their responses to EFS and chemicals are readily discerned by our fluorescence imaging-based HTP phenotyping assay. The latest generation of calcium (Ca2+) indicator dyes, FLIPR Calcium 6 and Cal-520, combined with the HTP device enables physiological analysis of human iPSC-derived samples, highlighting its potential application for understanding disease mechanisms and discovering new therapeutic treatments. PMID:28525289
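As an illustration of the per-well trace analysis such an imaging-based assay implies, the following sketch counts calcium transients in a synthetic fluorescence trace sampled at roughly 100 points per second; the trace and the thresholds are invented for the example and are not the authors' pipeline.

```python
# Minimal sketch: estimate a beating/firing rate from one well's fluorescence
# trace sampled at ~100 Hz. Synthetic trace; not the published analysis.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                   # samples per second
t = np.arange(0, 10, 1 / fs)
trace = 1.0 + 0.5 * np.clip(np.sin(2 * np.pi * 1.5 * t), 0, None) ** 4
trace += 0.02 * np.random.default_rng(0).normal(size=t.size)

# dF/F0 normalisation with a crude baseline (10th percentile of the trace)
f0 = np.percentile(trace, 10)
dff = (trace - f0) / f0

# Count calcium transients: peaks above a prominence threshold,
# separated by at least 200 ms.
peaks, _ = find_peaks(dff, prominence=0.2, distance=int(0.2 * fs))
rate_hz = len(peaks) / (t[-1] - t[0])
print(f"{len(peaks)} transients, ~{rate_hz:.2f} Hz")
```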
High-Throughput Light Sheet Microscopy for the Automated Live Imaging of Larval Zebrafish
NASA Astrophysics Data System (ADS)
Baker, Ryan; Logan, Savannah; Dudley, Christopher; Parthasarathy, Raghuveer
The zebrafish is a model organism with a variety of useful properties; it is small and optically transparent, it reproduces quickly, it is a vertebrate, and there are a large variety of transgenic animals available. Because of these properties, the zebrafish is well suited to study using a variety of optical technologies including light sheet fluorescence microscopy (LSFM), which provides high-resolution three-dimensional imaging over large fields of view. Research progress, however, is often not limited by optical techniques but instead by the number of samples one can examine over the course of an experiment, which in the case of light sheet imaging has so far been severely limited. Here we present an integrated fluidic circuit and microscope which provides rapid, automated imaging of zebrafish using several imaging modes, including LSFM, Hyperspectral Imaging, and Differential Interference Contrast Microscopy. Using this system, we show that we can increase our imaging throughput by a factor of 10 compared to previous techniques. We also show preliminary results visualizing zebrafish immune response, which is sensitive to gut microbiota composition, and which shows a strong variability between individuals that highlights the utility of high throughput imaging. National Science Foundation, Award No. DBI-1427957.
George, Iniga S; Fennell, Anne Y; Haynes, Paul A
2015-09-01
Protein sample preparation optimisation is critical for establishing reproducible high throughput proteomic analysis. In this study, two different fractionation sample preparation techniques (in-gel digestion and in-solution digestion) for shotgun proteomics were used to quantitatively compare proteins identified in Vitis riparia leaf samples. The total number of proteins and peptides identified were compared between filter aided sample preparation (FASP) coupled with gas phase fractionation (GPF) and SDS-PAGE methods. There was a 24% increase in the total number of reproducibly identified proteins when FASP-GPF was used. FASP-GPF is more reproducible, less expensive and a better method than SDS-PAGE for shotgun proteomics of grapevine samples as it significantly increases protein identification across biological replicates. Total peptide and protein information from the two fractionation techniques is available in PRIDE with the identifier PXD001399 (http://proteomecentral.proteomexchange.org/dataset/PXD001399). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ultrasensitive SERS Flow Detector Using Hydrodynamic Focusing
Negri, Pierre; Jacobs, Kevin T.; Dada, Oluwatosin O.; Schultz, Zachary D.
2013-01-01
Label-free, chemical specific detection in flow is important for high throughput characterization of analytes in applications such as flow injection analysis, electrophoresis, and chromatography. We have developed a surface-enhanced Raman scattering (SERS) flow detector capable of ultrasensitive optical detection on the millisecond time scale. The device employs hydrodynamic focusing to improve SERS detection in a flow channel where a sheath flow confines analyte molecules eluted from a fused silica capillary over a planar SERS-active substrate. Increased analyte interactions with the SERS substrate significantly improve detection sensitivity. The performance of this flow detector was investigated using a combination of finite element simulations, fluorescence imaging, and Raman experiments. Computational fluid dynamics based on finite element analysis was used to optimize the flow conditions. The modeling indicates that a number of factors, such as the capillary dimensions and the ratio of the sheath flow to analyte flow rates, are critical for obtaining optimal results. Sample confinement resulting from the flow dynamics was confirmed using wide-field fluorescence imaging of rhodamine 6G (R6G). Raman experiments at different sheath flow rates showed increased sensitivity compared with the modeling predictions, suggesting increased adsorption. Using 50-millisecond acquisitions, a sheath flow rate of 180 μL/min, and a sample flow rate of 5 μL/min, a linear dynamic range from nanomolar to micromolar concentrations of R6G with an LOD of 1 nM is observed. At low analyte concentrations, rapid analyte desorption is observed, enabling repeated and high-throughput SERS detection. The flow detector offers substantial advantages over conventional SERS-based assays such as minimal sample volumes and high detection efficiency. PMID:24074461
Ogawa, Shoujiro; Kittaka, Hiroki; Nakata, Akiho; Komatsu, Kenji; Sugiura, Takahiro; Satoh, Mamoru; Nomura, Fumio; Higashi, Tatsuya
2017-03-20
The plasma/serum concentration of 25-hydroxyvitamin D3 [25(OH)D3] is a diagnostic index for vitamin D deficiency/insufficiency, which is associated with a wide range of diseases, such as rickets, cancer and diabetes. We have reported that derivatization with 4-(4-dimethylaminophenyl)-1,2,4-triazoline-3,5-dione (DAPTAD) works well in the liquid chromatography/electrospray ionization-tandem mass spectrometry (LC/ESI-MS/MS) assay of serum/plasma 25(OH)D3 for enhancing the sensitivity and the separation from a potent interfering metabolite, 3-epi-25-hydroxyvitamin D3 [3-epi-25(OH)D3]. However, enhancing the analysis throughput remains an issue in the LC/ESI-MS/MS assay of 25(OH)D3. The most obvious restriction on LC/MS/MS throughput is the chromatographic run time. In this study, we developed an enhanced-throughput method for the determination of plasma 25(OH)D3 by LC/ESI-MS/MS combined with derivatization using the triplex (2H0-, 2H3- and 2H6-) DAPTAD isotopologues. After separate derivatization with 1 of the 3 different isotopologues, the 3 samples were combined and injected together into the LC/ESI-MS/MS. Based on the mass differences between the isotopologues, the derivatized 25(OH)D3 in the 3 different samples was quantified within a single run. The developed method tripled the hourly analysis throughput without sacrificing assay performance, i.e., ease of pretreatment of the plasma sample (only deproteinization), limit of quantification (1.0 ng/mL when a 5 μL plasma sample was used), precision (intra-assay RSD≤5.9% and inter-assay RSD≤5.5%), accuracy (98.7-102.2%), matrix effects, and capability of separation from the interfering metabolite, 3-epi-25(OH)D3. The multiplexing of samples by isotopologue derivatization was applied to the analysis of plasma samples from healthy subjects, and the developed method was proven to have satisfactory applicability. Copyright © 2016 Elsevier B.V. All rights reserved.
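The channel separation rests on simple isotope arithmetic: each deuterium adds roughly 1.006 Da relative to protium, so the three derivatized forms of the same analyte appear offset by about 3 and 6 Da. The relation below is generic background; the exact precursor and product ions used in the assay are not quoted in the abstract.

```latex
% Mass offsets that separate the isotopologue channels within one LC/MS/MS run
% (generic arithmetic, not assay-specific m/z values).
\[
\Delta m \;=\; n\,\bigl(m_{^2\mathrm{H}} - m_{^1\mathrm{H}}\bigr)
 \;\approx\; n \times 1.00628\ \mathrm{Da},
\qquad
\Delta m(n{=}3) \approx 3.019\ \mathrm{Da},\quad
\Delta m(n{=}6) \approx 6.038\ \mathrm{Da}.
\]
```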
Patel, Rajesh; Tsan, Alison; Sumiyoshi, Teiko; Fu, Ling; Desai, Rupal; Schoenbrunner, Nancy; Myers, Thomas W.; Bauer, Keith; Smith, Edward; Raja, Rajiv
2014-01-01
Molecular profiling of tumor tissue to detect alterations, such as oncogenic mutations, plays a vital role in determining treatment options in oncology. Hence, there is an increasing need for a robust and high-throughput technology to detect oncogenic hotspot mutations. Although commercial assays are available to detect genetic alterations in single genes, only a limited amount of tissue is often available from patients, requiring multiplexing to allow for simultaneous detection of mutations in many genes using low DNA input. Even though next-generation sequencing (NGS) platforms provide powerful tools for this purpose, they face challenges such as high cost, large DNA input requirement, complex data analysis, and long turnaround times, limiting their use in clinical settings. We report the development of the next-generation mutation multi-analyte panel (MUT-MAP), a high-throughput microfluidic panel for detecting 120 somatic mutations across eleven genes of therapeutic interest (AKT1, BRAF, EGFR, FGFR3, FLT3, HRAS, KIT, KRAS, MET, NRAS, and PIK3CA) using allele-specific PCR (AS-PCR) and Taqman technology. This mutation panel requires as little as 2 ng of high quality DNA from fresh frozen or 100 ng of DNA from formalin-fixed paraffin-embedded (FFPE) tissues. Mutation calls, including an automated data analysis process, have been implemented to run 88 samples per day. Validation of this platform using plasmids showed robust signal and low cross-reactivity in all of the newly added assays, and mutation calls in cell line samples were found to be consistent with the Catalogue of Somatic Mutations in Cancer (COSMIC) database, allowing for direct comparison of our platform to Sanger sequencing. High correlation with NGS when compared to the SuraSeq500 panel run on the Ion Torrent platform in an FFPE dilution experiment showed assay sensitivity down to 0.45%. This multiplexed mutation panel is a valuable tool for high-throughput biomarker discovery in personalized medicine and cancer drug development. PMID:24658394
Song, Xu-Hong; Wang, Yu; Li, Long-Yun; Tan, Jun
2017-04-01
The Illumina HiSeq 2500 high-throughput sequencing platform was used to study bacterial richness and diversity, soil enzyme activities and nutrients in unplanted soil and in root-rot and healthy rhizosphere soil of Coptis chinensis, in order to explore the mechanism of root-rot in C. chinensis. The high-throughput sequencing results showed that artificial cultivation affected bacterial community richness and diversity. Bacterial community richness in healthy and diseased rhizosphere soil was significantly lower than in unplanted soil (P<0.05), and bacterial diversity declined. Bacterial community richness in root-rot rhizosphere soil was significantly higher than in healthy and unplanted soil, while its diversity was significantly lower than that of unplanted soil (P<0.05). Analysis of soil nutrients and enzyme activities showed that pH, available phosphorus and urease activity decreased, while sucrase activity increased significantly (P<0.05). The contents of organic carbon and alkali-hydrolysable nitrogen, and the catalase and urease activities, in root-rot soil samples were significantly lower than in healthy soil samples (P<0.05), whereas the contents of available phosphorus and available potassium were significantly higher in root-rot samples than in healthy soil samples (P<0.05). Comprehensive analysis indicated that artificial cultivation reduced bacterial community richness and diversity, and that the significantly decreased richness and diversity may be a cause of the root-rot. Meanwhile, the decrease in carbon content and catalase activity may be another cause of root-rot in C. chinensis produced in Shizhu, Chongqing province. Copyright© by the Chinese Pharmaceutical Association.
Towards High-Throughput, Simultaneous Characterization of Thermal and Thermoelectric Properties
NASA Astrophysics Data System (ADS)
Miers, Collier Stephen
The extension of thermoelectric generators to more general markets requires that the devices be affordable and practical (low $/Watt) to implement. A key challenge in this pursuit is the quick and accurate characterization of thermoelectric materials, which will allow researchers to tune and modify the material properties quickly. The goal of this thesis is to design and fabricate a high-throughput characterization system for the simultaneous characterization of thermal, electrical, and thermoelectric properties for device-scale material samples. The measurement methodology presented in this thesis combines a custom designed measurement system created specifically for high-throughput testing with a novel device structure that permits simultaneous characterization of the material properties. The measurement system is based upon the 3ω method for thermal conductivity measurements, with the addition of electrodes and voltage probes to measure the electrical conductivity and Seebeck coefficient. A device designed and optimized to permit the rapid characterization of thermoelectric materials is also presented. This structure is optimized to ensure 1D heat transfer within the sample, thus permitting rapid data analysis and fitting using a MATLAB script. Verification of the thermal portion of the system is presented using fused silica and sapphire materials for benchmarking. The fused silica samples yielded a thermal conductivity of 1.21 W/(m K), while a thermal conductivity of 31.2 W/(m K) was measured for the sapphire samples. The device and measurement system designed and developed in this thesis provide insight and serve as a foundation for the development of high throughput, simultaneous measurement platforms.
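For readers unfamiliar with the thermal measurement mentioned above, the slope form of the 3ω relation for a metal line heater (Cahill's analysis) is reproduced below as background; the thesis's own fitting script may use a more complete expression.

```latex
% Slope form of the 3-omega relation for a line heater of length l dissipating
% power P; quoted as context, not as the exact expression fitted in the thesis.
\[
\kappa \;=\; \frac{P\,\ln\!\left(\omega_{2}/\omega_{1}\right)}
                  {2\pi\, l\,\bigl(\Delta T_{1}-\Delta T_{2}\bigr)}
\;=\; -\,\frac{P}{2\pi l}\,
      \left(\frac{\mathrm{d}\,\Delta T}{\mathrm{d}\ln\omega}\right)^{-1},
\]
where $\Delta T_{i}$ is the in-phase temperature oscillation amplitude measured
at heater frequency $\omega_{i}$.
```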
Gray, Nicola; Lewis, Matthew R; Plumb, Robert S; Wilson, Ian D; Nicholson, Jeremy K
2015-06-05
A new generation of metabolic phenotyping centers are being created to meet the increasing demands of personalized healthcare, and this has resulted in a major requirement for economical, high-throughput metabonomic analysis by liquid chromatography-mass spectrometry (LC-MS). Meeting these new demands represents an emerging bioanalytical problem that must be solved if metabolic phenotyping is to be successfully applied to large clinical and epidemiological sample sets. Ultraperformance (UP)LC-MS-based metabolic phenotyping, based on 2.1 mm i.d. LC columns, enables comprehensive metabolic phenotyping but, when employed for the analysis of thousands of samples, results in high solvent usage. The use of UPLC-MS employing 1 mm i.d. columns for metabolic phenotyping rather than the conventional 2.1 mm i.d. methodology shows that the resulting optimized microbore method provided equivalent or superior performance in terms of peak capacity, sensitivity, and robustness. On average, we also observed, when using the microbore scale separation, an increase in response of 2-3 fold over that obtained with the standard 2.1 mm scale method. When applied to the analysis of human urine, the 1 mm scale method showed no decline in performance over the course of 1000 analyses, illustrating that microbore UPLC-MS represents a viable alternative to conventional 2.1 mm i.d. formats for routine large-scale metabolic profiling studies while also resulting in a 75% reduction in solvent usage. The modest increase in sensitivity provided by this methodology also offers the potential to either reduce sample consumption or increase the number of metabolite features detected with confidence due to the increased signal-to-noise ratios obtained. Implementation of this miniaturized UPLC-MS method of metabolic phenotyping results in clear analytical, economic, and environmental benefits for large-scale metabolic profiling studies with similar or improved analytical performance compared to conventional UPLC-MS.
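The solvent saving follows directly from column geometry: at matched linear velocity the volumetric flow scales with the column cross-section. The arithmetic below is a back-of-the-envelope check consistent with the figures reported above, not an additional result.

```latex
% Why a 1 mm i.d. column cuts solvent roughly 75%: flow scales with d^2 at
% equal linear velocity.
\[
\frac{F_{1.0\,\mathrm{mm}}}{F_{2.1\,\mathrm{mm}}}
 = \left(\frac{1.0}{2.1}\right)^{2} \approx 0.23
\quad\Longrightarrow\quad
\text{solvent use drops by} \approx 77\%,
\]
consistent with the reported $\sim$75\% reduction; the same area ratio
($\approx 4.4\times$) bounds the theoretically achievable concentration gain
for a fixed injected mass, in line with the observed 2--3 fold increase.
```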
Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules
Panzeri, Francesco
2017-01-01
We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions. PMID:28419142
Reiser, Vladimír; Smith, Ryan C; Xue, Jiyan; Kurtz, Marc M; Liu, Rong; Legrand, Cheryl; He, Xuanmin; Yu, Xiang; Wong, Peggy; Hinchcliffe, John S; Tanen, Michael R; Lazar, Gloria; Zieba, Renata; Ichetovkin, Marina; Chen, Zhu; O'Neill, Edward A; Tanaka, Wesley K; Marton, Matthew J; Liao, Jason; Morris, Mark; Hailman, Eric; Tokiwa, George Y; Plump, Andrew S
2011-11-01
With expanding biomarker discovery efforts and increasing costs of drug development, it is critical to maximize the value of mass-limited clinical samples. The main limitation of available methods is the inability to isolate and analyze, from a single sample, molecules requiring incompatible extraction methods. Thus, we developed a novel semiautomated method for tissue processing and tissue milling and division (TMAD). We used a SilverHawk atherectomy catheter to collect atherosclerotic plaques from patients requiring peripheral atherectomy. Tissue preservation by flash freezing was compared with immersion in RNAlater®, and tissue grinding by traditional mortar and pestle was compared with TMAD. Comparators were protein, RNA, and lipid yield and quality. Reproducibility of analyte yield from aliquots of the same tissue sample processed by TMAD was also measured. The quantity and quality of biomarkers extracted from tissue prepared by TMAD were at least as good as those extracted from tissue stored and prepared by traditional means. TMAD enabled parallel analysis of gene expression (quantitative reverse-transcription PCR, microarray), protein composition (ELISA), and lipid content (biochemical assay) from as little as 20 mg of tissue. The mean correlation was r = 0.97 in molecular composition (RNA, protein, or lipid) between aliquots of individual samples generated by TMAD. We also demonstrated that it is feasible to use TMAD in a large-scale clinical study setting. The TMAD methodology described here enables semiautomated, high-throughput sampling of small amounts of heterogeneous tissue specimens by multiple analytical techniques with generally improved quality of recovered biomolecules.
Miescher Schwenninger, S; Freimüller Leischtfeld, S; Gantenbein-Demarchi, C
2016-11-01
Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a powerful biotyping tool increasingly used for high-throughput identification of clinical microbial isolates; however, in food fermentation research this approach is still not well established. This study examines the microbial biodiversity of cocoa bean fermentation based on the isolation of micro-organisms in cocoa-producing regions, followed by MALDI-TOF MS in Switzerland. A preceding 6-week storage test to mimic lengthy transport of microbial samples from cocoa-producing regions to Switzerland was performed with strains of Lactobacillus plantarum, Acetobacter pasteurianus and Saccharomyces cerevisiae. Weekly MALDI-TOF MS analysis was able to successfully identify microbiota to the species level after storing live cultures on slant agar at mild temperatures (7°C) and/or in 75% aqueous ethanol at differing temperatures (-20, 7 and 30°C). The efficacy of this method was confirmed by on-site recording of the microbial biodiversity in cocoa bean fermentation in Bolivia and Brazil, with a total of 1126 randomly selected isolates. MALDI-TOF MS analyses revealed known dominant cocoa bean fermentation species with Lact. plantarum and Lactobacillus fermentum in the lactic acid bacteria taxon, Hanseniaspora opuntiae and S. cerevisiae in the yeast taxon, and Acet. pasteurianus, Acetobacter fabarum, Acetobacter ghanensis and Acetobacter senegalensis in the acetic acid bacteria taxon. Microbial identification with MALDI-TOF MS has increased the number of samples that can be analysed in a given time, a prerequisite for high-throughput methods. This method is already widely used for the identification of clinical microbial isolates, whereas in food fermentation research, including cocoa bean fermentation, microbiota are mostly identified by time-consuming, biochemical-based phenotyping and molecular approaches. This study presents the use of MALDI-TOF MS for characterizing the microbial biodiversity of cocoa bean fermentation. The feasibility of MALDI-TOF MS identification of cocoa-specific microbiota has been shown with samples collected during on-site studies in two countries of origin, Bolivia and Brazil. © 2016 The Society for Applied Microbiology.
Screening of Compounds Toxicity against Human Monocytic cell line-THP-1 by Flow Cytometry
Pick, Neora; Cameron, Scott; Arad, Dorit
2004-01-01
The worldwide rapid increase in bacterial resistance to numerous antibiotics requires on-going development of new drugs to enter the market. As the development of new antibiotics is lengthy and costly, early monitoring of a compound's toxicity is essential in the development of novel agents. Our interest is in a rapid, simple, high throughput screening method to assess cytotoxicity induced by potential agents. The primary site of infection of some intracellular pathogens, such as Mycobacterium tuberculosis, is human alveolar macrophages. Thus, evaluation of candidate drugs for macrophage toxicity is crucial. Protocols for high throughput drug toxicity screening of macrophages using flow cytometry are lacking in the literature. For this application we modified a preexisting technique, propidium iodide (PI) exclusion staining, and utilized it for rapid toxicity tests. Samples were prepared in 96 well plates and analyzed by flow cytometry, which allowed for rapid, inexpensive and precise assessment of compound toxicity associated with cell death. PMID:15472722
Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav
2015-07-01
Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present the calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations. Copyright © 2015 Elsevier Inc. All rights reserved.
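For context, the activation parameters mentioned above follow from the standard Eyring/Arrhenius treatment of computed rate constants; the relations below are textbook forms quoted as background, not Qgui output.

```latex
% Eyring-plot relations linking computed rate constants at several
% temperatures to the activation enthalpy and entropy.
\[
\ln\!\frac{k}{T} \;=\; -\frac{\Delta H^{\ddagger}}{R}\cdot\frac{1}{T}
 \;+\; \frac{\Delta S^{\ddagger}}{R} \;+\; \ln\!\frac{k_{\mathrm B}}{h},
\qquad
\Delta G^{\ddagger}(T) \;=\; \Delta H^{\ddagger} - T\,\Delta S^{\ddagger},
\]
so the slope of $\ln(k/T)$ versus $1/T$ gives $-\Delta H^{\ddagger}/R$ and the
intercept gives $\Delta S^{\ddagger}/R + \ln(k_{\mathrm B}/h)$.
```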
Library Design-Facilitated High-Throughput Sequencing of Synthetic Peptide Libraries.
Vinogradov, Alexander A; Gates, Zachary P; Zhang, Chi; Quartararo, Anthony J; Halloran, Kathryn H; Pentelute, Bradley L
2017-11-13
A methodology to achieve high-throughput de novo sequencing of synthetic peptide mixtures is reported. The approach leverages shotgun nanoliquid chromatography coupled with tandem mass spectrometry-based de novo sequencing of library mixtures (up to 2000 peptides) as well as automated data analysis protocols to filter away incorrect assignments, noise, and synthetic side-products. For increasing the confidence in the sequencing results, mass spectrometry-friendly library designs were developed that enabled unambiguous decoding of up to 600 peptide sequences per hour while maintaining greater than 85% sequence identification rates in most cases. The reliability of the reported decoding strategy was additionally confirmed by matching fragmentation spectra for select authentic peptides identified from library sequencing samples. The methods reported here are directly applicable to screening techniques that yield mixtures of active compounds, including particle sorting of one-bead one-compound libraries and affinity enrichment of synthetic library mixtures performed in solution.
Report for the NGFA-5 project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaing, C; Jackson, P; Thissen, J
The objective of this project is to provide DHS a comprehensive evaluation of the current genomic technologies including genotyping, TaqMan PCR, multiple locus variable tandem repeat analysis (MLVA), microarray and high-throughput DNA sequencing in the analysis of biothreat agents from complex environmental samples. To effectively compare the sensitivity and specificity of the different genomic technologies, we used SNP TaqMan PCR, MLVA, microarray and high-throughput Illumina and 454 sequencing to test various strains from B. anthracis, B. thuringiensis, BioWatch aerosol filter extracts or soil samples that were spiked with B. anthracis, and samples that were previously collected during DHS and EPA environmental release exercises that were known to contain B. thuringiensis spores. The results of all the samples against the various assays are discussed in this report.
Bladergroen, Marco R.; van der Burgt, Yuri E. M.
2015-01-01
For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS-acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply for both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE) including affinity enrichment strategies have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071
Ptolemy, Adam S; Britz-McKibbin, Philip
2006-02-17
New strategies for integrating sample pretreatment with chemical analysis under a single format are required for rapid, sensitive and enantioselective analyses of low abundance metabolites in complex biological samples. Capillary electrophoresis (CE) offers a unique environment for controlling analyte/reagent band dispersion and electromigration properties using discontinuous electrolyte systems. Recent work in our laboratory towards developing a high-throughput CE platform for low abundance metabolites via on-line sample preconcentration with chemical derivatization (SPCD) is primarily examined in this review, as there have been surprisingly few strategies reported in the literature to date. In-capillary sample preconcentration serves to enhance concentration sensitivity via electrokinetic focusing of long sample injection volumes for lower detection limits, whereas chemical derivatization by zone passing is used to expand detectability and selectivity, notably for enantiomeric resolution of metabolites lacking intrinsic chromophores using nanolitre volumes of reagent. Together, on-line SPCD-CE can provide over a 100-fold improvement in concentration sensitivity, shorter total analysis times, reduced sample handling and improved reliability for a variety of amino acid and amino sugar metabolites, which is also amenable to automated high-throughput screening. This review will highlight basic method development and optimization parameters relevant to SPCD-CE, including applications to bacterial metabolite flux and biomarker analyses. Insight into the mechanism of analyte focusing and labeling by SPCD-CE is also discussed, as well as future directions for continued research.
Advanced overlay: sampling and modeling for optimized run-to-run control
NASA Astrophysics Data System (ADS)
Subramany, Lokesh; Chung, WoongJae; Samudrala, Pavan; Gao, Haiyong; Aung, Nyan; Gomez, Juan Manuel; Gutjahr, Karsten; Park, DongSuk; Snow, Patrick; Garcia-Medina, Miguel; Yap, Lipkong; Demirer, Onur Nihat; Pierson, Bill; Robinson, John C.
2016-03-01
In recent years overlay (OVL) control schemes have become more complicated in order to meet the ever shrinking margins of advanced technology nodes. As a result, this brings up new challenges to be addressed for effective run-to-run OVL control. This work addresses two of these challenges by new advanced analysis techniques: (1) sampling optimization for run-to-run control and (2) bias-variance trade-off in modeling. The first challenge in a high order OVL control strategy is to optimize the number of measurements and the locations on the wafer, so that the "sample plan" of measurements provides high quality information about the OVL signature on the wafer with acceptable metrology throughput. We solve this trade-off between accuracy and throughput by using a smart sampling scheme which utilizes various design-based and data-based metrics to increase model accuracy and reduce model uncertainty while avoiding wafer to wafer and within wafer measurement noise caused by metrology, scanner or process. This sort of sampling scheme, combined with an advanced field by field extrapolated modeling algorithm, helps to maximize model stability and minimize on product overlay (OPO). Second, the use of higher order overlay models means more degrees of freedom, which enables increased capability to correct for complicated overlay signatures, but also increases sensitivity to process or metrology induced noise. This is also known as the bias-variance trade-off. A high order model that minimizes the bias between the modeled and raw overlay signature on a single wafer will also have a higher variation from wafer to wafer or lot to lot, that is unless an advanced modeling approach is used. In this paper, we characterize the bias-variance trade-off to find the optimal scheme. The sampling and modeling solutions proposed in this study are validated by advanced process control (APC) simulations to estimate run-to-run performance, lot-to-lot and wafer-to-wafer model term monitoring to estimate stability and ultimately high volume manufacturing tests to monitor OPO by densely measured OVL data.
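The bias-variance argument above can be made concrete with a toy calculation: the hedged sketch below fits synthetic one-dimensional wafer signatures with polynomials of increasing order and reports the residual (a bias proxy) against the wafer-to-wafer spread of the fitted coefficients (a variance proxy). Real overlay models are two-dimensional field/wafer models, so this only illustrates the trade-off and is not the authors' scheme.

```python
# Toy illustration of the bias-variance trade-off in overlay modelling.
# Synthetic 1-D radial signatures; production OVL models are 2-D.
import numpy as np

rng = np.random.default_rng(1)
r = np.linspace(-1, 1, 41)                       # normalised wafer radius
true_signature = 3.0 * r - 1.5 * r**3            # nm, assumed wafer-level shape

def fit_wafers(order, n_wafers=25, noise_nm=0.5):
    residuals, coeffs = [], []
    for _ in range(n_wafers):
        y = true_signature + rng.normal(0, noise_nm, r.size)   # metrology noise
        c = np.polyfit(r, y, order)
        residuals.append(np.std(y - np.polyval(c, r)))          # bias proxy
        coeffs.append(c)
    return np.mean(residuals), np.std(np.array(coeffs), axis=0).mean()

for order in (1, 3, 5, 9):
    bias, var = fit_wafers(order)
    print(f"order {order}: mean residual {bias:.2f} nm, coeff spread {var:.2f}")
```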
Short-read, high-throughput sequencing technology for STR genotyping
Bornman, Daniel M.; Hester, Mark E.; Schuetter, Jared M.; Kasoji, Manjula D.; Minard-Smith, Angela; Barden, Curt A.; Nelson, Scott C.; Godbold, Gene D.; Baker, Christine H.; Yang, Boyu; Walther, Jacquelyn E.; Tornes, Ivan E.; Yan, Pearlly S.; Rodriguez, Benjamin; Bundschuh, Ralf; Dickens, Michael L.; Young, Brian A.; Faith, Seth A.
2013-01-01
DNA-based methods for human identification principally rely upon genotyping of short tandem repeat (STR) loci. Electrophoretic-based techniques for variable-length classification of STRs are universally utilized, but are limited in that they have relatively low throughput and do not yield nucleotide sequence information. High-throughput sequencing technology may provide a more powerful instrument for human identification, but is not currently validated for forensic casework. Here, we present a systematic method to perform high-throughput genotyping analysis of the Combined DNA Index System (CODIS) STR loci using short-read (150 bp) massively parallel sequencing technology. Open source reference alignment tools were optimized to evaluate PCR-amplified STR loci using a custom designed STR genome reference. Evaluation of this approach demonstrated that the 13 CODIS STR loci and amelogenin (AMEL) locus could be accurately called from individual and mixture samples. Sensitivity analysis showed that as few as 18,500 reads, aligned to an in silico referenced genome, were required to genotype an individual (>99% confidence) for the CODIS loci. The power of this technology was further demonstrated by identification of variant alleles containing single nucleotide polymorphisms (SNPs) and the development of quantitative measurements (reads) for resolving mixed samples. PMID:25621315
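As an illustration of the read-counting step that follows alignment in this kind of sequence-based STR genotyping, the sketch below tallies reads per candidate allele and calls the top one or two alleles above a minimum read fraction. The allele labels, counts and threshold are invented for the example; they are not the study's validated parameters.

```python
# Sketch of allele calling from per-read allele assignments at one STR locus.
# Alignment to a custom STR reference is assumed to have happened upstream.
from collections import Counter

def call_str_genotype(allele_hits, min_fraction=0.2):
    """allele_hits: iterable of allele labels, one per aligned read."""
    counts = Counter(allele_hits)
    total = sum(counts.values())
    calls = [a for a, n in counts.most_common(2) if n / total >= min_fraction]
    return sorted(calls) if len(calls) == 2 else calls * 2   # homozygote if one allele

reads = ["D3S1358_15"] * 480 + ["D3S1358_16"] * 455 + ["D3S1358_14"] * 12  # stutter
print(call_str_genotype(reads))   # ['D3S1358_15', 'D3S1358_16']
```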
Akeroyd, Michiel; Olsthoorn, Maurien; Gerritsma, Jort; Gutker-Vermaas, Diana; Ekkelkamp, Laurens; van Rij, Tjeerd; Klaassen, Paul; Plugge, Wim; Smit, Ed; Strupat, Kerstin; Wenzel, Thibaut; van Tilborg, Marcel; van der Hoeven, Rob
2013-03-10
In the discovery of new enzymes, genomic and cDNA expression libraries containing thousands of differential clones are generated to obtain biodiversity. These libraries need to be screened for the activity of interest. Removing so-called empty and redundant clones significantly reduces the size of these expression libraries and therefore speeds up new enzyme discovery. Here, we present a sensitive, generic workflow for high throughput screening of successful microbial protein over-expression in microtiter plates containing a complex matrix based on mass spectrometry techniques. MALDI-LTQ-Orbitrap screening followed by principal component analysis and peptide mass fingerprinting was developed to obtain a throughput of ∼12,000 samples per week. Alternatively, a UHPLC-MS(2) approach including MS(2) protein identification was developed for microorganisms with a complex protein secretome, with a throughput of ∼2000 samples per week. TCA-induced protein precipitation enhanced by addition of bovine serum albumin is used for protein purification prior to MS detection. We show that this generic workflow can effectively reduce large expression libraries from fungi and bacteria to their minimal size by detection of successful protein over-expression using MS. Copyright © 2012 Elsevier B.V. All rights reserved.
David, Frank; Tienpont, Bart; Devos, Christophe; Lerch, Oliver; Sandra, Pat
2013-10-25
Laboratories focusing on residue analysis in food are continuously seeking to increase sample throughput by minimizing sample preparation. Generic sample extraction methods such as QuEChERS lack selectivity and consequently extracts are not free from non-volatile material that contaminates the analytical system. Co-extracted matrix constituents interfere with target analytes, even if highly sensitive and selective GC-MS/MS is used. A number of GC approaches are described that can be used to increase laboratory productivity. These techniques include automated inlet liner exchange and column backflushing for preservation of the performance of the analytical system and heart-cutting two-dimensional GC for increasing sensitivity and selectivity. The application of these tools is illustrated by the analysis of pesticides in vegetables and fruits, PCBs in milk powder and coplanar PCBs in fish. It is demonstrated that considerable increase in productivity can be achieved by decreasing instrument down-time, while analytical performance is equal or better compared to conventional trace contaminant analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Development of a Kinetic Assay for Late Endosome Movement.
Esner, Milan; Meyenhofer, Felix; Kuhn, Michael; Thomas, Melissa; Kalaidzidis, Yannis; Bickle, Marc
2014-08-01
Automated imaging screens are performed mostly on fixed and stained samples to simplify the workflow and increase throughput. Some processes, such as the movement of cells and organelles or measuring membrane integrity and potential, can be measured only in living cells. Developing such assays to screen large compound or RNAi collections is challenging in many respects. Here, we develop a live-cell high-content assay for tracking endocytic organelles in medium throughput. We evaluate the added value of measuring kinetic parameters compared with measuring static parameters alone. We screened 2000 compounds in U-2 OS cells expressing Lamp1-GFP to label late endosomes. All hits have phenotypes in both static and kinetic parameters. However, we show that the kinetic parameters enable better discrimination of the mechanisms of action. Most of the compounds cause a decrease of motility of endosomes, but we identify several compounds that increase endosomal motility. In summary, we show that kinetic data help to better discriminate phenotypes and thereby obtain more subtle phenotypic clustering. © 2014 Society for Laboratory Automation and Screening.
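A minimal example of the kinetic read-out discussed above is given below: mean frame-to-frame speed of tracked Lamp1-GFP spots. The track format and numbers are hypothetical, and the spot detection and linking steps are assumed to have been done upstream.

```python
# Sketch of a motility parameter from organelle tracks: mean speed per track.
# Tracks are lists of (t, x, y) points in seconds and micrometres (invented data).
import numpy as np

def mean_track_speed(track):
    t, x, y = (np.asarray(c, dtype=float) for c in zip(*track))
    steps = np.hypot(np.diff(x), np.diff(y))        # µm travelled per frame interval
    return np.sum(steps) / (t[-1] - t[0])           # µm / s

tracks = [
    [(0.0, 1.0, 1.0), (0.5, 1.3, 1.1), (1.0, 1.8, 1.4)],
    [(0.0, 5.0, 5.0), (0.5, 5.0, 5.1), (1.0, 5.1, 5.1)],
]
speeds = [mean_track_speed(tr) for tr in tracks]
print(f"median endosome speed: {np.median(speeds):.2f} µm/s")
```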
Sanz, Jose Luis; Rojas, Patricia; Morato, Ana; Mendez, Lara; Ballesteros, Mercedes; González-Fernández, Cristina
2017-02-01
Microalgae biomasses are considered promising feedstocks for biofuel and methane production. Two continuously stirred tank reactors (CSTRs), fed with fresh (CSTR-C) and heat pre-treated (CSTR-T) Chlorella biomass, were run in parallel in order to determine methane production. The methane yield was 1.5 times higher in CSTR-T than in CSTR-C. Aiming to understand the roles of the microorganisms within the reactors, the sludge used as an inoculum (I), plus raw (CSTR-C) and heat pre-treated (CSTR-T) samples, were analyzed by high-throughput pyrosequencing. The bacterial communities were dominated by Proteobacteria, Bacteroidetes, Chloroflexi and Firmicutes. Spirochaetae and Actinobacteria were only detected in sample I. Proteobacteria, mainly Alphaproteobacteria, were by far the dominant phylum within the CSTR-C bioreactor. Many of the sequences retrieved were related to bacteria present in activated sludge treatment plants, and they were absent after thermal pre-treatment. Most of the sequences affiliated with the Bacteroidetes were related to uncultured groups. Anaerolineaceae was the sole family of the Chloroflexi phylum found. All of the genera of the Firmicutes phylum identified carried out macromolecule hydrolysis and by-product fermentation. The proteolytic bacteria were prevalent over the saccharolytic microbes. The percentage of proteolytic genera increased from the inoculum to the CSTR-T sample in parallel with an increase in available protein owing to the high protein content of Chlorella. Relating the taxa identified by high-throughput sequencing to their functional roles remains a future challenge. Copyright © 2016 Elsevier Ltd. All rights reserved.
Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S.
2016-01-01
Motivation: Modern high-throughput biotechnologies such as microarray are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples are assayed, hence the classical ‘large p, small n’ problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Results: Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the ‘large p, small n’ problem. Availability and implementation: Our method is implemented in R package IPBT, which is freely available from https://github.com/benliemory/IPBT. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26519502
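A generic sketch of the underlying idea, building an informative prior from historical data and shrinking new per-gene variance estimates toward it, is shown below. It uses a moment-matched inverse-gamma prior and a conjugate update with n-1 degrees of freedom; this mirrors the general strategy only and is not the IPBT implementation.

```python
# Empirical-Bayes sketch: informative prior for gene-wise variances from
# historical data, then shrinkage of new sample variances toward it.
import numpy as np

def fit_inverse_gamma_prior(historical_vars):
    """Moment-match an inverse-gamma(a, b) prior to historical gene variances."""
    m, v = np.mean(historical_vars), np.var(historical_vars)
    a = m**2 / v + 2.0               # shape
    b = m * (a - 1.0)                # scale
    return a, b

def posterior_variance(sample_var, n, a, b):
    """Posterior mean of a gene's variance given n new observations."""
    return (b + 0.5 * (n - 1) * sample_var) / (a + 0.5 * (n - 1) - 1.0)

rng = np.random.default_rng(0)
hist = rng.gamma(shape=2.0, scale=0.5, size=5000)          # stand-in historical variances
a, b = fit_inverse_gamma_prior(hist)
print(posterior_variance(sample_var=3.0, n=4, a=a, b=b))   # shrunk toward the prior mean
```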
A comparison of high-throughput techniques for assaying circadian rhythms in plants.
Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony
2015-01-01
Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.
Quality Control for Ambient Sampling of PCDD/PCDF from Open Combustion Sources
Both long duration (> 6 h) and high temperature (up to 139 °C) sampling efforts were conducted using ambient air sampling methods to determine if either high volume throughput or higher than ambient sampling temperatures resulted in loss of target polychlorinated dibenzodioxins/d...
On-chip polarimetry for high-throughput screening of nanoliter and smaller sample volumes
NASA Technical Reports Server (NTRS)
Bachmann, Brian O. (Inventor); Bornhop, Darryl J. (Inventor); Dotson, Stephen (Inventor)
2012-01-01
A polarimetry technique for measuring optical activity that is particularly suited for high throughput screening employs a chip or substrate (22) having one or more microfluidic channels (26) formed therein. A polarized laser beam (14) is directed onto optically active samples that are disposed in the channels. The incident laser beam interacts with the optically active molecules in the sample, which slightly alter the polarization of the laser beam as it passes multiple times through the sample. Interference fringe patterns (28) are generated by the interaction of the laser beam with the sample and the channel walls. A photodetector (34) is positioned to receive the interference fringe patterns and generate an output signal that is input to a computer or other analyzer (38) for analyzing the signal and determining the rotation of plane polarized light by optically active material in the channel from polarization rotation calculations.
Fluorescent Cell Barcoding for Multiplex Flow Cytometry
Krutzik, Peter O.; Clutter, Matthew R.; Trejo, Angelica; Nolan, Garry P.
2011-01-01
Fluorescent Cell Barcoding (FCB) enables high throughput, i.e. high content flow cytometry by multiplexing samples prior to staining and acquisition on the cytometer. Individual cell samples are barcoded, or labeled, with unique signatures of fluorescent dyes so that they can be mixed together, stained, and analyzed as a single sample. By mixing samples prior to staining, antibody consumption is typically reduced 10 to 100-fold. In addition, data robustness is increased through the combination of control and treated samples, which minimizes pipetting error, staining variation, and the need for normalization. Finally, speed of acquisition is enhanced, enabling large profiling experiments to be run with standard cytometer hardware. In this unit, we outline the steps necessary to apply the FCB method to cell lines as well as primary peripheral blood samples. Important technical considerations such as choice of barcoding dyes, concentrations, labeling buffers, compensation, and software analysis are discussed. PMID:21207359
Bullard, K M; Hietpas, P B; Ewing, A G
1998-01-01
Polymerase chain reaction (PCR) amplified short tandem repeat (STR) samples from the HUMVWF locus have been analyzed using a unique sample introduction and separation technique. A single capillary is used to transfer samples onto an ultrathin slab gel (57 μm thick). This ultrathin nondenaturing polyacrylamide gel is used to separate the amplified fragments, and laser-induced fluorescence with ethidium bromide is used for detection. The feasibility of performing STR analysis using this system has been investigated by examining the reproducibility for repeated samples. Reproducibility is examined by comparing the migration of the 14 and 17 HUMVWF alleles on three consecutive separations on the ultrathin slab gel. Using one locus, separations match in migration time with the two alleles 42 s apart for each of the three consecutive separations. This technique shows potential to increase sample throughput in STR analysis techniques, although separation resolution still needs to be improved.
High-throughput automated microfluidic sample preparation for accurate microbial genomics
Kim, Soohong; De Jonghe, Joachim; Kulesa, Anthony B.; Feldman, David; Vatanen, Tommi; Bhattacharyya, Roby P.; Berdy, Brittany; Gomez, James; Nolan, Jill; Epstein, Slava; Blainey, Paul C.
2017-01-01
Low-cost shotgun DNA sequencing is transforming the microbial sciences. Sequencing instruments are so effective that sample preparation is now the key limiting factor. Here, we introduce a microfluidic sample preparation platform that integrates the key steps in cells to sequence library sample preparation for up to 96 samples and reduces DNA input requirements 100-fold while maintaining or improving data quality. The general-purpose microarchitecture we demonstrate supports workflows with arbitrary numbers of reaction and clean-up or capture steps. By reducing the sample quantity requirements, we enabled low-input (∼10,000 cells) whole-genome shotgun (WGS) sequencing of Mycobacterium tuberculosis and soil micro-colonies with superior results. We also leveraged the enhanced throughput to sequence ∼400 clinical Pseudomonas aeruginosa libraries and demonstrate excellent single-nucleotide polymorphism detection performance that explained phenotypically observed antibiotic resistance. Fully-integrated lab-on-chip sample preparation overcomes technical barriers to enable broader deployment of genomics across many basic research and translational applications. PMID:28128213
Young, Susan M; Curry, Mark S; Ransom, John T; Ballesteros, Juan A; Prossnitz, Eric R; Sklar, Larry A; Edwards, Bruce S
2004-03-01
HyperCyt is an automated sample handling system for flow cytometry that uses air bubbles to separate samples sequentially introduced from multiwell plates by an autosampler. In a previously documented HyperCyt configuration, air bubble separated compounds in one sample line and a continuous stream of cells in another are mixed in-line for serial flow cytometric cell response analysis. To expand capabilities for high-throughput bioactive compound screening, the authors investigated using this system configuration in combination with automated cell sorting. Peptide ligands were sampled from a 96-well plate, mixed in-line with fluo-4-loaded, formyl peptide receptor-transfected U937 cells, and screened at a rate of 3 peptide reactions per minute with approximately 10,000 cells analyzed per reaction. Cell Ca(2+) responses were detected to as little as 10(-11) M peptide with no detectable carryover between samples at up to 10(-7) M peptide. After expansion in culture, cells sort-purified from the 10% highest responders exhibited enhanced sensitivity and more sustained responses to peptide. Thus, a highly responsive cell subset was isolated under high-throughput mixing and sorting conditions in which response detection capability spanned a 1000-fold range of peptide concentration. With single-cell readout systems for protein expression libraries, this technology offers the promise of screening millions of discrete compound interactions per day.
Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.
Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras
2016-04-01
There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.
Lyon, Elaine; Laver, Thomas; Yu, Ping; Jama, Mohamed; Young, Keith; Zoccoli, Michael; Marlowe, Natalia
2010-01-01
Population screening has been proposed for Fragile X syndrome to identify premutation carrier females and affected newborns. We developed a PCR-based assay capable of quickly detecting the presence or absence of an expanded FMR1 allele with high sensitivity and specificity. This assay combines a triplet repeat primed PCR with high-throughput automated capillary electrophoresis. We evaluated assay performance using archived samples sent for Fragile X diagnostic testing representing a range of Fragile X CGG-repeat expansions. Two hundred five previously genotyped samples were tested with the new assay. Data were analyzed for the presence of a trinucleotide “ladder” extending beyond 55 repeats, which was set as a cut-off to identify expanded FMR1 alleles. We identified expanded FMR1 alleles in 132 samples (59 premutation, 71 full mutation, 2 mosaics) and normal FMR1 alleles in 73 samples. We found 100% concordance with previous results from PCR and Southern blot analyses. In addition, we show feasibility of using this assay with DNA extracted from dried-blood spots. Using a single PCR combined with high-throughput fragment analysis on the automated capillary electrophoresis instrument, we developed a rapid and reproducible PCR-based laboratory assay that meets many of the requirements for a first-tier test for population screening. PMID:20431035
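The screening decision rule described above can be written in a few lines: convert detected fragment sizes to repeat counts and flag any sample whose ladder extends past 55 CGG repeats. The flanking length used in the conversion below is an assumed, illustrative value; a real assay calibrates it against sized controls.

```python
# Sketch of the expanded-allele screen from TP-PCR fragment sizes.
FLANKING_BP = 70          # assumed combined primer/flank length (illustrative only)
REPEAT_BP = 3             # CGG unit
CUTOFF_REPEATS = 55       # premutation threshold from the abstract

def repeats_from_fragment(size_bp):
    return (size_bp - FLANKING_BP) / REPEAT_BP

def is_expanded(fragment_sizes_bp):
    """fragment_sizes_bp: detected peak sizes from capillary electrophoresis."""
    return any(repeats_from_fragment(s) > CUTOFF_REPEATS for s in fragment_sizes_bp)

normal = [82 + 3 * i for i in range(25)]      # ladder stops near 29 repeats
carrier = [82 + 3 * i for i in range(80)]     # ladder runs past 55 repeats
print(is_expanded(normal), is_expanded(carrier))   # False True
```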
Chen, Ming-Luan; Suo, Li-Li; Gao, Qiang; Feng, Yu-Qi
2011-08-01
Using magnetite/silica/poly(methacrylic acid-co-ethylene glycol dimethacrylate) (Fe(3)O(4)/SiO(2)/poly(MAA-co-EDMA)) magnetic microspheres, a rapid and high-throughput magnetic solid-phase extraction coupled with capillary zone electrophoresis (MSPE-CZE) method was developed for the determination of illegal drugs (ketamine, amphetamines, opiates, and metabolites). The MSPE of target analytes could be completed within 2 min, and the eight target analytes could be baseline separated within 15 min by CZE with 30 mM phosphate buffer solution (PBS, pH 2.0) containing 15% v/v ACN as background electrolyte. Furthermore, hydrodynamic injection with field-amplified sample stacking (FASS) was employed to enhance the sensitivity of this MSPE-CZE method. Under such optimal conditions, the limits of detection for the eight target analytes ranged from 0.015 to 0.105 μg/mL. The application feasibility of MSPE-CZE in illegal drug monitoring was demonstrated by analyzing urine samples, and the recoveries of target drugs for spiked samples ranged from 85.4 to 110.1%. The method reproducibility was tested by evaluating the intra- and interday precisions, and relative standard deviations of <10.3 and 12.4%, respectively, were obtained. To increase the throughput of the analysis, a home-made MSPE array that has potential application to the treatment of 96 samples simultaneously was used. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High-throughput imaging of heterogeneous cell organelles with an X-ray laser (CXIDB ID 25)
Hantke, Max F.
2014-11-17
Preprocessed detector images that were used for the paper "High-throughput imaging of heterogeneous cell organelles with an X-ray laser". The CXI file contains the entire recorded data - including both hits and blanks. It also includes down-sampled images and LCLS machine parameters. Additionally, the Cheetah configuration file is attached that was used to create the pre-processed data.
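Since CXI files are HDF5 containers, the deposited data can be inspected and loaded with h5py along the lines of the sketch below; the dataset paths and the per-frame hit flag name are assumptions and should be checked against the file's actual layout.

```python
# Hedged sketch for browsing a CXI deposition and pulling out the hit frames.
# The dataset keys below are assumed, not taken from CXIDB ID 25 itself.
import h5py
import numpy as np

def list_datasets(path):
    with h5py.File(path, "r") as f:
        f.visititems(lambda name, obj: print(name, obj.shape)
                     if isinstance(obj, h5py.Dataset) else None)

def load_hits(path, data_key="entry_1/data_1/data", hit_key="entry_1/result_1/hit"):
    with h5py.File(path, "r") as f:
        hits = np.asarray(f[hit_key], dtype=bool)      # assumed per-frame hit flag
        idx = np.flatnonzero(hits)                     # increasing frame indices
        frames = f[data_key][idx, ...]                 # only the hit frames
    return frames

# list_datasets("cxidb_25.cxi")   # discover the real layout before loading
```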
Xiao, Yongli; Sheng, Zong-Mei; Taubenberger, Jeffery K.
2015-01-01
The vast majority of surgical biopsy and post-mortem tissue samples are formalin-fixed and paraffin-embedded (FFPE), but this process leads to RNA degradation that limits gene expression analysis. As an example, the viral RNA genome of the 1918 pandemic influenza A virus was previously determined in a 9-year effort by overlapping RT-PCR from post-mortem samples. Using the protocols described here, the full genome of the 1918 virus at high coverage was determined in one high-throughput sequencing run of a cDNA library derived from total RNA of a 1918 FFPE sample after duplex-specific nuclease treatments. This basic methodological approach should assist in the analysis of FFPE tissue samples isolated over the past century from a variety of infectious diseases. PMID:26344216
Palumbo, J D; O'Keeffe, T L; Fidelibus, M W
2016-12-01
Identification of populations of Aspergillus section Nigri species in environmental samples using traditional methods is laborious and impractical for large numbers of samples. We developed species-specific primers and probes for quantitative droplet digital PCR (ddPCR) to improve sample throughput and simultaneously detect multiple species in each sample. The ddPCR method was used to distinguish Aspergillus niger, Aspergillus welwitschiae, Aspergillus tubingensis and Aspergillus carbonarius in mixed samples of total DNA. Relative abundance of each species measured by ddPCR agreed with input ratios of template DNAs. Soil samples were collected at six time points over two growing seasons from two raisin vineyards in Fresno County, California. Aspergillus section Nigri strains were detected in these soils in the range of 10²-10⁵ CFU g⁻¹. Relative abundance of each species varied widely among samples, but in 52 of 60 samples, A. niger was the most abundant species, ranging from 38 to 88% of the total population. In combination with total plate counts, this ddPCR method provides a high-throughput method for describing population dynamics of important potential mycotoxin-producing species in environmental samples. This is the first study to demonstrate the utility of ddPCR as a means to quantify species of Aspergillus section Nigri in soil. This method eliminates the need for isolation and sequence identification of individual fungal isolates, and allows for greater throughput in measuring relative population sizes of important (i.e. mycotoxigenic) Aspergillus species within a population of morphologically indistinguishable species. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
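Relative abundance in a multiplexed ddPCR readout is simply each species' copy count divided by the summed counts across the panel. A minimal sketch under assumed copy numbers (the values are illustrative, not data from the study):

```python
# Relative abundance of Aspergillus section Nigri species from assumed
# ddPCR copy counts (copies per reaction); values are illustrative only.
ddpcr_copies = {
    "A. niger": 4200.0,
    "A. welwitschiae": 900.0,
    "A. tubingensis": 650.0,
    "A. carbonarius": 250.0,
}

total = sum(ddpcr_copies.values())
relative_abundance = {sp: copies / total for sp, copies in ddpcr_copies.items()}

for species, fraction in sorted(relative_abundance.items(), key=lambda kv: -kv[1]):
    print(f"{species}: {fraction:.1%}")
```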
Coherent imaging at the diffraction limit
Thibault, Pierre; Guizar-Sicairos, Manuel; Menzel, Andreas
2014-01-01
X-ray ptychography, a scanning coherent diffractive imaging technique, holds promise for imaging with dose-limited resolution and sensitivity. If the foreseen increase of coherent flux by orders of magnitude can be matched by additional technological and analytical advances, ptychography may approach imaging speeds familiar from full-field methods while retaining its inherently quantitative nature and metrological versatility. Beyond promises of high throughput, spectroscopic applications in three dimensions become feasible, as do measurements of sample dynamics through time-resolved imaging or careful characterization of decoherence effects. PMID:25177990
Polonchuk, Liudmila
2014-01-01
Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered the profiling of large compound series in early drug development. Fortunately, automation has revolutionized experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using planar patch-clamp technology offered only moderate throughput, several second-generation automated platforms recently launched by various companies have a significantly increased ability to form high numbers of high-resistance seals. Among them is SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch(®) 96 allows throughput to be substantially increased without compromising data quality. This chapter describes the features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.
Break-up of droplets in a concentrated emulsion flowing through a narrow constriction
NASA Astrophysics Data System (ADS)
Kim, Minkyu; Rosenfeld, Liat; Tang, Sindy; Tang Lab Team
2014-11-01
Droplet microfluidics has enabled a wide range of high throughput screening applications. Compared with other technologies such as robotic screening technology, droplet microfluidics has 1000 times higher throughput, which makes the technology one of the most promising platforms for the ultrahigh throughput screening applications. Few studies have considered the throughput of the droplet interrogation process, however. In this research, we show that the probability of break-up increases with increasing flow rate, entrance angle to the constriction, and size of the drops. Since single drops do not break at the highest flow rate used in the system, break-ups occur primarily from the interactions between highly packed droplets close to each other. Moreover, the probabilistic nature of the break-up process arises from the stochastic variations in the packing configuration. Our results can be used to calculate the maximum throughput of the serial interrogation process. For 40 pL-drops, the highest throughput with less than 1% droplet break-up was measured to be approximately 7,000 drops per second. In addition, the results are useful for understanding the behavior of concentrated emulsions in applications such as mobility control in enhanced oil recovery.
Prasad, Satendra; Wouters, Eloy R; Dunyach, Jean-Jacques
2015-08-18
Ion sampling from an electrospray ionization (ESI) source was improved by increasing the gas conductance of the MS inlet by 4.3-fold. Converting the gas throughput (Q) into a sensitivity improvement was dependent on ion desolvation and handling of the gas load. Desolvation was addressed by using a novel slot-shaped inlet that exhibited desolvation properties identical to the 0.58 mm i.d. capillary. An assay tailored for "small molecules" at a high chromatographic flow rate (500 μL/min) yielded a compound-dependent 6.5- to 14-fold signal gain, while analysis at a nano chromatographic flow rate (300 nL/min) showed a 2- to 3.5-fold improvement for doubly charged peptides. The improvement exceeding Q (4.3-fold) at the high chromatographic flow rate was explained by superior sampling of the spatially dispersed ion spray when using the slot-shaped capillary. Sensitivity improvement across a wide range of chromatographic flow rates confirmed no compromise in ion desolvation with the increase in Q. The slot shape of the capillary also reduced the overflow of gas from the foreline region into the mass analyzer. By doubling the roughing pump capacity and operating the electrodynamic ion funnel (EDIF) at ∼4 Torr, a single pumping stage was sufficient to handle the gas load. The transport of solvent clusters from the LC effluent into the mass analyzer was prevented by a "wavy shaped" transfer quadrupole and was compared with a benchmark approach that delivered ions orthogonally into a differentially pumped dual EDIF at comparable gas Q.
Application of a Permethrin Immunosorbent Assay Method to Residential Soil and Dust Samples
A low-cost, high throughput bioanalytical screening method was developed for monitoring cis/trans-permethrin in dust and soil samples. The method consisted of a simple sample preparation procedure [sonication with dichloromethane followed by a solvent exchange into methanol:wate...
Clinical application of high throughput molecular screening techniques for pharmacogenomics
Wiita, Arun P; Schrijver, Iris
2011-01-01
Genetic analysis is one of the fastest-growing areas of clinical diagnostics. Fortunately, as our knowledge of clinically relevant genetic variants rapidly expands, so does our ability to detect these variants in patient samples. Increasing demand for genetic information may necessitate the use of high throughput diagnostic methods as part of clinically validated testing. Here we provide a general overview of our current and near-future abilities to perform large-scale genetic testing in the clinical laboratory. First we review in detail molecular methods used for high throughput mutation detection, including techniques able to monitor thousands of genetic variants for a single patient or to genotype a single genetic variant for thousands of patients simultaneously. These methods are analyzed in the context of pharmacogenomic testing in the clinical laboratories, with a focus on tests that are currently validated as well as those that hold strong promise for widespread clinical application in the near future. We further discuss the unique economic and clinical challenges posed by pharmacogenomic markers. Our ability to detect genetic variants frequently outstrips our ability to accurately interpret them in a clinical context, carrying implications both for test development and introduction into patient management algorithms. These complexities must be taken into account prior to the introduction of any pharmacogenomic biomarker into routine clinical testing. PMID:23226057
Shih, Tsung-Ting; Hsieh, Cheng-Chuan; Luo, Yu-Ting; Su, Yi-An; Chen, Ping-Hung; Chuang, Yu-Chen; Sun, Yuh-Chang
2016-04-15
Herein, a hyphenated system combining a high-throughput solid-phase extraction (htSPE) microchip with inductively coupled plasma-mass spectrometry (ICP-MS) for rapid determination of trace heavy metals was developed. Rather than performing multiple analyses in parallel for the enhancement of analytical throughput, we improved the processing speed for individual samples by increasing the operation flow rate during SPE procedures. To this end, an innovative device combining a micromixer and a multi-channeled extraction unit was designed. Furthermore, a programmable valve manifold was used to interface the developed microchip and ICP-MS instrumentation in order to fully automate the system, leading to a dramatic reduction in operation time and human error. Under the optimized operation conditions for the established system, detection limits of 1.64-42.54 ng L(-1) for the analyte ions were achieved. Validation procedures demonstrated that the developed method could be satisfactorily applied to the determination of trace heavy metals in natural water. Each analysis could be readily accomplished within just 186 s using the established system. This represents, to the best of our knowledge, an unprecedented speed for the analysis of trace heavy metal ions. Copyright © 2016 Elsevier B.V. All rights reserved.
Wilhide, Calvin; Hayes, John R; Farah, J Ramsay
2008-08-01
Participation rates are often viewed by vendors and employer-based disease management (DM) services as an important benchmark of successful program implementation. Although participation is commonly understood to vary widely between and within employer groups, little is known about the role of incentives on rates of participation and graduation from DM programs. This study examined the use of incentives, employer characteristics, and perceptions of employee-employer communication on participation and program throughput. The relationship between incentive use and rates of participation and throughput among 87 employer groups from the 2004 company portfolio was assessed using existing account information. Detailed information on the highest and lowest third of the sample was obtained through interviews with account representatives. Wilcoxon, chi-square, and regression analyses were used to examine the influence of employer characteristics and incentive factors on enrollee participation rates and program completion. Fifty-two percent of the accounts offered incentives for participation. From 1% to 23% of the eligible employees enrolled in and completed the DM program. Incentives had a direct impact on participation, with amounts greater than $50 the most effective. Participation increased with communication tools including e-mail, high-blast (repeated) communications, and health fairs. Results suggest that cash incentives and communication play a significant role in rates of participation and program completion.
Fishing on chips: up-and-coming technological advances in analysis of zebrafish and Xenopus embryos.
Zhu, Feng; Skommer, Joanna; Huang, Yushi; Akagi, Jin; Adams, Dany; Levin, Michael; Hall, Chris J; Crosier, Philip S; Wlodkowic, Donald
2014-11-01
Biotests performed on small vertebrate model organisms provide significant investigative advantages as compared with bioassays that employ cell lines, isolated primary cells, or tissue samples. The main advantage offered by whole-organism approaches is that the effects under study occur in the context of an intact physiological milieu, with all its intercellular and multisystem interactions. The gap between high-throughput cell-based in vitro assays and low-throughput, disproportionately expensive and ethically controversial mammalian in vivo tests can be closed by small model organisms such as zebrafish or Xenopus. The optical transparency of their tissues, the ease of genetic manipulation, and straightforward husbandry explain the growing popularity of these model organisms. Nevertheless, despite the potential for miniaturization, automation and subsequent increase in throughput of experimental setups, the manipulation, dispensing and analysis of living fish and frog embryos remain labor-intensive. Recently, a new generation of miniaturized chip-based devices has been developed for zebrafish and Xenopus embryo on-chip culture and experimentation. In this work, we review the critical developments in the field of Lab-on-a-Chip devices designed to alleviate the limits of traditional platforms for studies on zebrafish and clawed frog embryos and larvae. © 2014 International Society for Advancement of Cytometry.
Howard, Dougal P; Marchand, Peter; McCafferty, Liam; Carmalt, Claire J; Parkin, Ivan P; Darr, Jawwad A
2017-04-10
High-throughput continuous hydrothermal flow synthesis was used to generate a library of aluminum- and gallium-codoped zinc oxide nanoparticles of specific atomic ratios. Resistivities of the materials were determined by Hall effect measurements on heat-treated pressed discs and the results collated into a conductivity-composition map. Optimal resistivities of ∼9 × 10⁻³ Ω cm were reproducibly achieved for several samples, for example, codoped ZnO with 2 at% Ga and 1 at% Al. The optimum sample on balance of performance and cost was deemed to be ZnO codoped with 3 at% Al and 1 at% Ga.
Vinner, Lasse; Mourier, Tobias; Friis-Nielsen, Jens; Gniadecki, Robert; Dybkaer, Karen; Rosenberg, Jacob; Langhoff, Jill Levin; Cruz, David Flores Santa; Fonager, Jannik; Izarzugaza, Jose M G; Gupta, Ramneek; Sicheritz-Ponten, Thomas; Brunak, Søren; Willerslev, Eske; Nielsen, Lars Peter; Hansen, Anders Johannes
2015-08-19
Although nearly one fifth of all human cancers have an infectious aetiology, the causes for the majority of cancers remain unexplained. Despite the enormous data output from high-throughput shotgun sequencing, viral DNA in a clinical sample typically constitutes a proportion of host DNA that is too small to be detected. Sequence variation among virus genomes complicates application of sequence-specific, and highly sensitive, PCR methods. Therefore, we aimed to develop and characterize a method that permits sensitive detection of sequences despite considerable variation. We demonstrate that our low-stringency in-solution hybridization method enables detection of <100 viral copies. Furthermore, distantly related proviral sequences may be enriched by orders of magnitude, enabling discovery of hitherto unknown viral sequences by high-throughput sequencing. The sensitivity was sufficient to detect retroviral sequences in clinical samples. We used this method to conduct an investigation for novel retrovirus in samples from three cancer types. In accordance with recent studies our investigation revealed no retroviral infections in human B-cell lymphoma cells, cutaneous T-cell lymphoma or colorectal cancer biopsies. Nonetheless, our generally applicable method makes sensitive detection possible and permits sequencing of distantly related sequences from complex material.
OpenMSI Arrayed Analysis Tools v2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowen, Benjamin; Ruebel, Oliver; de Rond, Tristan
2017-02-07
Mass spectrometry imaging (MSI) enables high-resolution spatial mapping of biomolecules in samples and is a valuable tool for the analysis of tissues from plants and animals, microbial interactions, high-throughput screening, drug metabolism, and a host of other applications. This is accomplished by desorbing molecules from the surface at spatially defined locations, using a laser or ion beam. These ions are analyzed by a mass spectrometer and collected into an MSI 'image', a dataset containing unique mass spectra from the sampled spatial locations. MSI is used in a diverse and increasing number of biological applications. The OpenMSI Arrayed Analysis Tool (OMAAT) is a new software method that addresses the challenges of analyzing spatially defined samples in large MSI datasets by providing support for automatic sample position optimization and ion selection.
Xu, Xiaohui Sophia; Rose, Anne; Demers, Roger; Eley, Timothy; Ryan, John; Stouffer, Bruce; Cojocaru, Laura; Arnold, Mark
2014-01-01
The determination of drug-protein binding is important in the pharmaceutical development process because of the impact of protein binding on both the pharmacokinetics and pharmacodynamics of drugs. Equilibrium dialysis is the preferred method to measure the free drug fraction because it is considered to be more accurate. The throughput of equilibrium dialysis has recently been improved by implementing a 96-well format plate. Results/methodology: This manuscript illustrates the successful application of a 96-well rapid equilibrium dialysis (RED) device in the determination of atazanavir plasma-protein binding. This RED method of measuring free fraction was successfully validated and then applied to the analysis of clinical plasma samples taken from HIV-infected pregnant women administered atazanavir. Combined with LC-MS/MS detection, the 96-well format equilibrium dialysis device was suitable for measuring the free and bound concentration of pharmaceutical molecules in a high-throughput mode.
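In an equilibrium dialysis experiment the unbound fraction follows directly from the buffer-side and plasma-side concentrations at equilibrium. A generic sketch of that calculation (the concentration values are hypothetical placeholders, not atazanavir data):

```python
# Unbound fraction from a rapid equilibrium dialysis (RED) experiment.
# Concentrations are illustrative placeholders, e.g. ng/mL from LC-MS/MS.

def fraction_unbound(c_buffer, c_plasma):
    """fu = free (buffer-side) concentration / total (plasma-side) concentration."""
    return c_buffer / c_plasma

c_buffer = 35.0     # assumed free concentration in the buffer chamber
c_plasma = 420.0    # assumed total concentration in the plasma chamber

fu = fraction_unbound(c_buffer, c_plasma)
print(f"fraction unbound = {fu:.3f}, percent bound = {100 * (1 - fu):.1f}%")
```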
High-speed fixed-target serial virus crystallography
Roedig, Philip; Ginn, Helen M.; Pakendorf, Tim; ...
2017-06-19
Here, we report a method for serial X-ray crystallography at X-ray free-electron lasers (XFELs), which allows for full use of the current 120-Hz repetition rate of the Linear Coherent Light Source (LCLS). Using a micropatterned silicon chip in combination with the high-speed Roadrunner goniometer for sample delivery, we were able to determine the crystal structures of the picornavirus bovine enterovirus 2 (BEV2) and the cytoplasmic polyhedrosis virus type 18 polyhedrin, with total data collection times of less than 14 and 10 min, respectively. Our method requires only micrograms of sample and should therefore broaden the applicability of serial femtosecond crystallography to challenging projects for which only limited sample amounts are available. By synchronizing the sample exchange to the XFEL repetition rate, our method allows for the most efficient use of the limited beam time available at XFELs and should enable a substantial increase in sample throughput at these facilities.
High-speed fixed-target serial virus crystallography
Roedig, Philip; Ginn, Helen M.; Pakendorf, Tim; Sutton, Geoff; Harlos, Karl; Walter, Thomas S.; Meyer, Jan; Fischer, Pontus; Duman, Ramona; Vartiainen, Ismo; Reime, Bernd; Warmer, Martin; Brewster, Aaron S.; Young, Iris D.; Michels-Clark, Tara; Sauter, Nicholas K.; Kotecha, Abhay; Kelly, James; Rowlands, David J.; Sikorsky, Marcin; Nelson, Silke; Damiani, Daniel S.; Alonso-Mori, Roberto; Ren, Jingshan; Fry, Elizabeth E.; David, Christian; Stuart, David I.; Wagner, Armin; Meents, Alke
2017-01-01
We report a method for serial X-ray crystallography at X-ray free electron lasers (XFELs), which allows for full use of the current 120 Hz repetition rate of the Linear Coherent Light Source (LCLS). Using a micro-patterned silicon chip in combination with the high-speed Roadrunner goniometer for sample delivery we were able to determine the crystal structures of a picornavirus, bovine enterovirus 2 (BEV2), and the cytoplasmic polyhedrosis virus type 18 polyhedrin. Total data collection times were less than 14 and 10 minutes, respectively. Our method requires only micrograms of sample and will therefore broaden the applicability of serial femtosecond crystallography to challenging projects for which only limited sample amounts are available. By synchronizing the sample exchange to the XFEL repetition rate, our method allows for the most efficient use of the limited beamtime available at XFELs and should enable a substantial increase in sample throughput at these facilities. PMID:28628129
High Throughput Determination of Tetramine in Drinking ...
The sampling and analytical procedure (SAP) presented herein describes a method for the high-throughput determination of tetramethylene disulfotetramine in drinking water by solid-phase extraction and isotope dilution gas chromatography/mass spectrometry. This method, which will be included in the SAM, is expected to provide the Water Laboratory Alliance, as part of EPA’s Environmental Response Laboratory Network, with a more reliable and faster means of analyte collection and measurement.
SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis.
Tanabata, Takanari; Shibaya, Taeko; Hori, Kiyosumi; Ebana, Kaworu; Yano, Masahiro
2012-12-01
Seed shape and size are among the most important agronomic traits because they affect yield and market price. To obtain accurate seed size data, a large number of measurements are needed because there is little difference in size among seeds from one plant. To promote genetic analysis and selection for seed shape in plant breeding, efficient, reliable, high-throughput seed phenotyping methods are required. We developed SmartGrain software for high-throughput measurement of seed shape. This software uses a new image analysis method to reduce the time taken in the preparation of seeds and in image capture. Outlines of seeds are automatically recognized from digital images, and several shape parameters, such as seed length, width, area, and perimeter length, are calculated. To validate the software, we performed a quantitative trait locus (QTL) analysis for rice (Oryza sativa) seed shape using backcrossed inbred lines derived from a cross between japonica cultivars Koshihikari and Nipponbare, which showed small differences in seed shape. SmartGrain removed areas of awns and pedicels automatically, and several QTLs were detected for six shape parameters. The allelic effect of a QTL for seed length detected on chromosome 11 was confirmed in advanced backcross progeny; the cv Nipponbare allele increased seed length and, thus, seed weight. High-throughput measurement with SmartGrain reduced sampling error and made it possible to distinguish between lines with small differences in seed shape. SmartGrain could accurately recognize seed not only of rice but also of several other species, including Arabidopsis (Arabidopsis thaliana). The software is free to researchers.
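The shape parameters reported by software of this kind (area, perimeter, length, width) can all be derived from a binary seed mask. The sketch below computes rough versions of them from a toy NumPy mask; it is a simplified illustration, not SmartGrain's algorithm, and the bounding-box length/width is only a crude stand-in for the fitted seed axes.

```python
import numpy as np

# Toy binary mask of a single "seed" (True = seed pixel, False = background).
mask = np.zeros((12, 20), dtype=bool)
mask[3:9, 4:16] = True  # a simple rectangular blob standing in for a seed

area = int(mask.sum())  # pixel count

# Perimeter: seed pixels with at least one 4-connected background neighbour.
padded = np.pad(mask, 1)
interior = (
    padded[:-2, 1:-1] & padded[2:, 1:-1] & padded[1:-1, :-2] & padded[1:-1, 2:]
)
perimeter = int((mask & ~interior).sum())

# Crude length/width from the bounding box of the seed pixels.
rows, cols = np.nonzero(mask)
length = int(cols.max() - cols.min() + 1)
width = int(rows.max() - rows.min() + 1)

print(f"area={area} px, perimeter={perimeter} px, length={length} px, width={width} px")
```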
Micro-patterned agarose gel devices for single-cell high-throughput microscopy of E. coli cells.
Priest, David G; Tanaka, Nobuyuki; Tanaka, Yo; Taniguchi, Yuichi
2017-12-21
High-throughput microscopy of bacterial cells has elucidated fundamental cellular processes including cellular heterogeneity and cell division homeostasis. Polydimethylsiloxane (PDMS)-based microfluidic devices provide advantages including precise positioning of cells and throughput, however device fabrication is time-consuming and requires specialised skills. Agarose pads are a popular alternative, however cells often clump together, which hinders single-cell quantitation. Here, we imprint agarose pads with micro-patterned 'capsules' to trap individual cells and 'lines' to direct cellular growth outwards in a straight line. We implement this micro-patterning into multi-pad devices called CapsuleHotel and LineHotel for high-throughput imaging. CapsuleHotel provides ~65,000 capsule structures per mm² that isolate individual Escherichia coli cells. In contrast, LineHotel provides ~300 line structures per mm that direct growth of micro-colonies. With CapsuleHotel, a quantitative single-cell dataset of ~10,000 cells across 24 samples can be acquired and analysed in under 1 hour. LineHotel allows tracking growth of >10 micro-colonies across 24 samples simultaneously for up to 4 generations. These easy-to-use devices can be provided in kit format, and will accelerate discoveries in diverse fields ranging from microbiology to systems and synthetic biology.
Evaluation of High-Throughput Chemical Exposure Models ...
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer parent chemical exposures from biomonitoring measurements and forward models to predict multi-pathway exposures from chemical use information and/or residential media concentrations. Here, both forward and reverse modeling methods are used to characterize the relationship between matched near-field environmental (air and dust) and biomarker measurements. Indoor air, house dust, and urine samples from a sample of 120 females (aged 60 to 80 years) were analyzed. In the measured data, 78% of the residential media measurements (across 80 chemicals) and 54% of the urine measurements (across 21 chemicals) were censored, i.e. below the limit of quantification (LOQ). Because of the degree of censoring, we applied a Bayesian approach to impute censored values for 69 chemicals having at least 15% of measurements above LOQ. This resulted in 10 chemicals (5 phthalates, 5 pesticides) with matched air, dust, and urine metabolite measurements. The population medians of indoor air and dust concentrations were compared to population median exposures inferred from urine metabolites concentrations using a high-throughput reverse-dosimetry approach. Median air and dust concentrations were found to be correl
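The censored-data handling described above amounts to treating each below-LOQ measurement as a draw from the left tail of an assumed concentration distribution rather than as zero or LOQ/2. The sketch below imputes left-censored values from a lognormal truncated at the LOQ; the distribution parameters are fixed assumptions here for illustration, whereas the study fits them within a Bayesian model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Assumed (not fitted) lognormal parameters for a chemical's concentration.
mu, sigma = np.log(0.5), 1.0   # log-scale mean and standard deviation
loq = 0.2                      # limit of quantification, same units as the data

def impute_censored(n, mu, sigma, loq):
    """Draw n values from the lognormal distribution truncated above at loq."""
    p_loq = norm.cdf((np.log(loq) - mu) / sigma)   # probability mass below the LOQ
    u = rng.uniform(0.0, p_loq, size=n)            # uniform over that left tail
    return np.exp(mu + sigma * norm.ppf(u))

print(impute_censored(5, mu, sigma, loq))  # all imputed values fall below 0.2
```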
Chen, LiQin; Wang, Hui; Xu, Zhen; Zhang, QiuYue; Liu, Jia; Shen, Jun; Zhang, WanQi
2018-08-03
In the present study, we developed a simple and high-throughput solid phase extraction (SPE) procedure for selective extraction of catecholamines (CAs) in urine samples. The SPE adsorbents were electrospun composite fibers functionalized with 4-carboxybenzo-18-crown-6 ether modified XAD resin and polystyrene, which were packed into 96-well columns and used for high-throughput selective extraction of CAs in healthy human urine samples. Moreover, the extraction efficiency of packed-fiber SPE (PFSPE) was examined by high performance liquid chromatography coupled with fluorescence detector. The parameters affecting the extraction efficiency and impurity removal efficiency were optimized, and good linearity ranging from 0.5 to 400 ng/mL was obtained with a low limit of detection (LOD, 0.2-0.5 ng/mL) and a good repeatability (2.7%-3.7%, n = 6). The extraction recoveries of three CAs ranged from 70.5% to 119.5%. Furthermore, stable and reliable results obtained by the fluorescence detector were superior to those obtained by the electrochemical detector. Collectively, PFSPE coupled with 96-well columns was a simple, rapid, selective, high-throughput and cost-efficient method, and the proposed method could be applied in clinical chemistry. Copyright © 2018 Elsevier B.V. All rights reserved.
High-throughput screening based on label-free detection of small molecule microarrays
NASA Astrophysics Data System (ADS)
Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong
2017-02-01
Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput platform for preliminary drug screening based on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this platform, the OI-RD scanner serves as a label-free detection instrument that can monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening through the immobilization of tens of thousands of different compounds on a single phenyl-isocyanate-functionalized glass slide. Using this platform, we sequentially screened five target proteins (purified target proteins or cell lysates containing the target protein) in a high-throughput, label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by follow-up functional assays. Compared with traditional high-throughput screening assays, the platform offers many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening, giving it great potential as a complementary screening platform in the field of drug discovery.
Karmaus, Agnes L; Toole, Colleen M; Filer, Dayne L; Lewis, Kenneth C; Martin, Matthew T
2016-04-01
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2060 chemical samples on steroidogenesis via high-performance liquid chromatography followed by tandem mass spectrometry quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a 3 stage screening strategy. The first stage established the maximum tolerated concentration (MTC; ≥ 70% viability) per sample. The second stage quantified changes in hormone levels at the MTC whereas the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were prestimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2060 chemical samples evaluated, 524 samples were selected for 6-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into 5 distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A distinct pattern was observed between imidazole and triazole fungicides suggesting potentially distinct mechanisms of action. From a chemical testing and prioritization perspective, this assay platform provides a robust model for high-throughput screening of chemicals for effects on steroidogenesis. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
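The second-stage selection criterion, significantly altering at least 4 of the 10 measured hormones at the MTC, is a simple per-sample filter. A minimal sketch with made-up hit flags rather than study data:

```python
# Select chemical samples for concentration-response follow-up when they
# significantly alter >= 4 of the 10 measured hormones at the MTC.
# Hit flags below are illustrative, not study results.

MIN_HORMONES_ALTERED = 4

mtc_hits = {
    "sample_A": {"progesterone", "cortisol", "testosterone", "estradiol", "DHEA"},
    "sample_B": {"cortisol"},
    "sample_C": {"testosterone", "estradiol", "androstenedione", "estrone"},
}

selected_for_cr = [
    sample for sample, altered in mtc_hits.items()
    if len(altered) >= MIN_HORMONES_ALTERED
]
print(selected_for_cr)  # ['sample_A', 'sample_C']
```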
A Microfluidic Platform for High-Throughput Multiplexed Protein Quantitation
Volpetti, Francesca; Garcia-Cordero, Jose; Maerkl, Sebastian J.
2015-01-01
We present a high-throughput microfluidic platform capable of quantitating up to 384 biomarkers in 4 distinct samples by immunoassay. The microfluidic device contains 384 unit cells, which can be individually programmed with pairs of capture and detection antibody. Samples are quantitated in each unit cell by four independent MITOMI detection areas, allowing four samples to be analyzed in parallel for a total of 1,536 assays per device. We show that the device can be pre-assembled and stored for weeks at elevated temperature and we performed proof-of-concept experiments simultaneously quantitating IL-6, IL-1β, TNF-α, PSA, and GFP. Finally, we show that the platform can be used to identify functional antibody combinations by screening 64 antibody combinations requiring up to 384 unique assays per device. PMID:25680117
Wang, H; Wu, Y; Zhao, Y; Sun, W; Ding, L; Guo, B; Chen, B
2012-08-01
Desorption corona beam ionisation (DCBI), a relatively novel ambient mass spectrometry (MS) technique, was utilised to screen for illicit additives in weight-loss foods. Five commonly abused chemicals - fenfluramine, N-di-desmethyl sibutramine, N-mono-desmethyl sibutramine, sibutramine and phenolphthalein - were detected with the proposed DCBI-MS method. Fast single-sample and high-throughput analysis was demonstrated. Semi-quantification was accomplished based on peak areas in the ion chromatograms. Four illicit additives were identified and semi-quantified in commercial samples. As no tedious sample pre-treatment was required, in contrast to conventional HPLC methods, high-throughput analysis was achieved with DCBI. The results demonstrate that DCBI-MS is a powerful tool for the rapid screening of illicit additives in weight-loss dietary supplements.
Shahini, Mehdi; Yeow, John T W
2011-08-12
We report on the enhancement of electrical cell lysis using carbon nanotubes (CNTs). Electrical cell lysis systems are widely utilized in microchips as they are well suited to integration into lab-on-a-chip devices. However, cell lysis based on electrical mechanisms has high voltage requirements. Here, we demonstrate that by incorporating CNTs into microfluidic electrolysis systems, the required voltage for lysis is reduced by half and the lysis throughput at low voltages is improved by ten times, compared to non-CNT microchips. In our experiment, E. coli cells are lysed while passing through an electric field in a microchannel. Based on the lightning rod effect, the electric field strengthened at the tip of the CNTs enhances cell lysis at lower voltage and higher throughput. This approach enables easy integration of cell lysis with other on-chip high-throughput sample-preparation processes.
Xiong, Zheng; He, Yinyan; Hattrick-Simpers, Jason R; Hu, Jianjun
2017-03-13
The creation of composition-processing-structure relationships currently represents a key bottleneck in data analysis for high-throughput experimental (HTE) materials studies. Here we propose an automated phase-diagram attribution algorithm for HTE data analysis that uses a graph-based segmentation algorithm and Delaunay tessellation to create a crystal phase diagram from high-throughput libraries of X-ray diffraction (XRD) patterns. We also propose sample-pair-based objective evaluation measures for the phase-diagram prediction problem. Our approach was validated using 278 diffraction patterns from a Fe-Ga-Pd composition-spread sample, achieving a prediction precision of 0.934 and a Matthews Correlation Coefficient of 0.823. The algorithm was then applied to the open Ni-Mn-Al thin-film composition-spread sample to obtain the first predicted phase-diagram mapping for that sample.
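Sample-pair evaluation of this kind asks, for every pair of composition-spread samples, whether the predicted labels agree with the ground truth on "same phase region vs different phase region"; precision and the Matthews Correlation Coefficient then follow from the pair-level confusion counts. A rough sketch of that scoring with synthetic labels (not the Fe-Ga-Pd data):

```python
from itertools import combinations
from math import sqrt

def pairwise_scores(true_labels, pred_labels):
    """Precision and Matthews Correlation Coefficient over all sample pairs,
    where a 'positive' pair is one predicted to share a phase region."""
    tp = fp = tn = fn = 0
    for i, j in combinations(range(len(true_labels)), 2):
        same_true = true_labels[i] == true_labels[j]
        same_pred = pred_labels[i] == pred_labels[j]
        if same_pred and same_true:
            tp += 1
        elif same_pred and not same_true:
            fp += 1
        elif not same_pred and same_true:
            fn += 1
        else:
            tn += 1
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return precision, mcc

# Synthetic example: 6 samples, 2 phase regions, one misassigned sample.
true_phase = ["alpha", "alpha", "alpha", "beta", "beta", "beta"]
pred_phase = ["alpha", "alpha", "beta", "beta", "beta", "beta"]
print(pairwise_scores(true_phase, pred_phase))
```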
Throughput increase by adjustment of the BARC drying time with coat track process
NASA Astrophysics Data System (ADS)
Brakensiek, Nickolas L.; Long, Ryan
2005-05-01
Throughput of a coater module within the coater track is related to the solvent evaporation rate from the material being coated. The evaporation rate is controlled by the spin dynamics of the wafer and the airflow dynamics over the wafer. Balancing these effects is the key to achieving very uniform coatings across a flat unpatterned wafer. As today's coat tracks are pushed to higher throughputs to match the scanner, the coat module throughput must be increased as well. For chemical manufacturers, the evaporation rate of the material depends on the solvent used. One measure of relative evaporation rates is to compare solvent flash points: the lower the flash point, the quicker the solvent will evaporate. It is possible to formulate products with these volatile solvents, although at a price. Shipping and manufacturing a more flammable product increases the chance of fire, thereby increasing insurance premiums. Also, the end user of these chemicals will have to take extra precautions in the fab and in storage of these more flammable chemicals. An alternative coat process is possible which would allow higher throughput in a distinct coat module without sacrificing safety. A tradeoff is required for this process, namely a more complicated coat process and a higher-viscosity chemical. The coat process exploits the fact that the evaporation rate depends on the spin dynamics of the wafer by using a series of spin speeds that first set the thickness of the material, followed by a high spin speed to remove the residual solvent. This new process can yield a throughput of over 150 wafers per hour (wph) given two coat modules. The thickness uniformity of less than 2 nm (3 sigma) is still excellent, while drying times shorter than 10 seconds allow the 150 wph throughput target to be met.
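The throughput figures quoted here follow from simple cycle-time arithmetic: with parallel coat modules, the track-limited rate is the per-module rate times the number of modules. A back-of-the-envelope sketch (the cycle-time value is an assumption chosen to reproduce the 150 wph figure, not a published process time):

```python
# Track throughput limited by the coat step, assuming fully parallel modules.
def coat_track_throughput(cycle_time_s, n_modules):
    """Wafers per hour achievable when n_modules coat modules run in parallel."""
    return 3600.0 / cycle_time_s * n_modules

# Assumed 48 s total coat-module cycle (dispense, spin, dry, wafer handling):
print(coat_track_throughput(cycle_time_s=48.0, n_modules=2))  # 150.0 wph
```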
Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.
Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S
1994-01-01
The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.
Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S
2016-03-01
Modern high-throughput biotechnologies such as microarrays are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples is assayed, hence the classical 'large p, small n' problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods, including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the 'large p, small n' problem. Our method is implemented in the R package IPBT, which is freely available from https://github.com/benliemory/IPBT. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
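The core idea, building per-gene informative priors from a historical repository and shrinking estimates from a small new experiment toward them, can be sketched as a precision-weighted (normal-normal) update. This is a generic empirical-Bayes illustration with simulated data, not the IPBT model itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Historical expression data: many past arrays (rows) for a handful of genes.
historical = rng.normal(loc=[5.0, 7.0, 6.0], scale=0.8, size=(200, 3))

# New experiment: only a few samples per gene ("large p, small n").
new_data = rng.normal(loc=[5.5, 7.0, 6.8], scale=0.8, size=(4, 3))

# Informative prior per gene derived from the historical repository.
prior_mean = historical.mean(axis=0)
prior_var = historical.var(axis=0, ddof=1)

# Likelihood summary from the new, small sample.
n = new_data.shape[0]
sample_mean = new_data.mean(axis=0)
sample_var = new_data.var(axis=0, ddof=1)

# Normal-normal posterior mean: precision-weighted blend of prior and data.
prior_prec = 1.0 / prior_var
data_prec = n / sample_var
posterior_mean = (prior_prec * prior_mean + data_prec * sample_mean) / (prior_prec + data_prec)

print("prior mean     :", np.round(prior_mean, 2))
print("sample mean    :", np.round(sample_mean, 2))
print("posterior mean :", np.round(posterior_mean, 2))
```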
Xu, Like; Ouyang, Weiying; Qian, Yanyun; Su, Chao; Su, Jianqiang; Chen, Hong
2016-06-01
Antibiotic resistance genes (ARGs) are present in surface water and often cannot be completely eliminated by drinking water treatment plants (DWTPs). Improper elimination of the ARG-harboring microorganisms contaminates the water supply and could lead to animal and human disease. Therefore, it is of utmost importance to determine the most effective ways by which DWTPs can eliminate ARGs. Here, we tested water samples from two DWTPs and their distribution systems and detected the presence of 285 ARGs, 8 transposases, and intI-1 by utilizing high-throughput qPCR. The prevalence of ARGs differed in the two DWTPs, one of which employed conventional water treatments while the other had advanced treatment processes. The relative abundance of ARGs increased significantly after treatment with biological activated carbon (BAC), raising the number of detected ARGs from 76 to 150. Furthermore, the final chlorination step enhanced the relative abundance of ARGs in the finished water generated from both DWTPs. The total enrichment of ARGs varied from 6.4- to 109.2-fold in tap water compared to finished water, among which beta-lactam resistance genes displayed the highest enrichment. Six transposase genes were detected in tap water samples, with the transposase gene TnpA-04 showing the greatest enrichment (up to 124.9-fold). We observed significant positive correlations between ARGs and mobile genetic elements (MGEs) across the distribution systems, indicating that transposases and intI-1 may contribute to antibiotic resistance in drinking water. To our knowledge, this is the first study to investigate the diversity and abundance of ARGs in drinking water treatment systems utilizing high-throughput qPCR techniques in China. Copyright © 2016 Elsevier Ltd. All rights reserved.
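The enrichment figures reported here are ratios of relative abundances between sampling points (tap water versus finished water). A minimal sketch of that calculation with invented relative-abundance values:

```python
# Fold enrichment of ARGs in tap water relative to finished water,
# computed from relative abundances (e.g. ARG copies per 16S rRNA gene copy).
# The numbers are illustrative, not measurements from the study.

relative_abundance = {
    #  gene       (finished_water, tap_water)
    "blaTEM":      (2.0e-5,        1.8e-3),
    "tetA":        (5.0e-5,        4.0e-4),
    "sul1":        (1.0e-4,        9.0e-4),
}

for gene, (finished, tap) in relative_abundance.items():
    print(f"{gene}: {tap / finished:.1f}-fold enrichment in tap water")
```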
Characterization and screening of IgG binding to the neonatal Fc receptor
Neuber, Tobias; Frese, Katrin; Jaehrling, Jan; Jäger, Sebastian; Daubert, Daniela; Felderer, Karin; Linnemann, Mechthild; Höhne, Anne; Kaden, Stefan; Kölln, Johanna; Tiller, Thomas; Brocks, Bodo; Ostendorp, Ralf; Pabst, Stefan
2014-01-01
The neonatal Fc receptor (FcRn) protects immunoglobulin G (IgG) from degradation and increases the serum half-life of IgG, thereby contributing to a higher concentration of IgG in the serum. Because altered FcRn binding may result in a reduced or prolonged half-life of IgG molecules, it is advisable to characterize Fc receptor binding of therapeutic antibody lead candidates prior to the start of pre-clinical and clinical studies. In this study, we characterized the interactions between FcRn of different species (human, cynomolgus monkey, mouse and rat) and nine IgG molecules from different species and isotypes with common variable heavy (VH) and variable light chain (VL) domains. Binding was analyzed at acidic and neutral pH using surface plasmon resonance (SPR) and biolayer interferometry (BLI). Furthermore, we transferred the well-accepted, but low throughput SPR-based method for FcRn binding characterization to the BLI-based Octet platform to enable a higher sample throughput allowing the characterization of FcRn binding already during early drug discovery phase. We showed that the BLI-based approach is fit-for-purpose and capable of discriminating between IgG molecules with significant differences in FcRn binding affinities. Using this high-throughput approach we investigated FcRn binding of 36 IgG molecules that represented all VH/VL region combinations available in the fully human, recombinant antibody library Ylanthia®. Our results clearly showed normal FcRn binding profiles for all samples. Hence, the variations among the framework parts, complementarity-determining region (CDR) 1 and CDR2 of the fragment antigen binding (Fab) domain did not significantly change FcRn binding. PMID:24802048
Optimization of throughput in semipreparative chiral liquid chromatography using stacked injection.
Taheri, Mohammadreza; Fotovati, Mohsen; Hosseini, Seyed-Kiumars; Ghassempour, Alireza
2017-10-01
An interesting mode of chromatography for the preparation of pure enantiomers from pure samples is stacked injection, a pseudocontinuous procedure. Maximum throughput and minimal production costs can be achieved by using the total chiral column length in this mode of chromatography. When sample loading is maximized, touching bands of the two enantiomers are often obtained automatically. Conventional equations show a direct correlation between touching-band loadability and the selectivity factor of the two enantiomers. The important question for one who wants to obtain the highest throughput is "How should the different factors, including selectivity, resolution, run time, and sample loading, be optimized to save time without losing touching-band resolution?" To answer this question, tramadol and propranolol were separated on cellulose 3,5-dimethylphenyl carbamate as two pure racemic mixtures with low and high solubility in the mobile phase, respectively. The mobile phase consisted of n-hexane with an alcohol modifier and diethylamine as the additive. A response surface methodology based on a central composite design was used to optimize the separation factors against the main responses. In line with the properties of stacked injection, two processes were investigated for maximizing throughput: one with a poorly soluble and another with a highly soluble racemic mixture. For each case, different optimization possibilities were inspected. Resolution was found to be a crucial response for separations of this kind. Peak area and run time are two critical parameters in the optimization of stacked injection for binary mixtures with low solubility in the mobile phase. © 2017 Wiley Periodicals, Inc.
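A central composite design of the kind used here combines two-level factorial points, axial points at ±α, and replicated centre points. The sketch below enumerates such a design in coded units for three factors; the factor names and α value are placeholders, not the published design.

```python
from itertools import product

def central_composite_design(n_factors, alpha=1.682, n_center=6):
    """Coded-unit design points for a rotatable central composite design."""
    factorial = list(product((-1.0, 1.0), repeat=n_factors))
    axial = []
    for i in range(n_factors):
        for sign in (-alpha, alpha):
            point = [0.0] * n_factors
            point[i] = sign
            axial.append(tuple(point))
    center = [(0.0,) * n_factors] * n_center
    return factorial + axial + center

# Hypothetical factors: alcohol modifier %, diethylamine %, injection volume.
design = central_composite_design(n_factors=3)
print(len(design), "runs")  # 8 factorial + 6 axial + 6 center = 20 runs
for run in design:
    print(run)
```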
Optimizing the MAC Protocol in Localization Systems Based on IEEE 802.15.4 Networks.
Pérez-Solano, Juan J; Claver, Jose M; Ezpeleta, Santiago
2017-07-06
Radio frequency signals are commonly used in the development of indoor localization systems. The infrastructure of these systems includes some beacons placed at known positions that exchange radio packets with users to be located. When the system is implemented using wireless sensor networks, the wireless transceivers integrated in the network motes are usually based on the IEEE 802.15.4 standard. But, the CSMA-CA, which is the basis for the medium access protocols in this category of communication systems, is not suitable when several users want to exchange bursts of radio packets with the same beacon to acquire the radio signal strength indicator (RSSI) values needed in the location process. Therefore, new protocols are necessary to avoid the packet collisions that appear when multiple users try to communicate with the same beacons. On the other hand, the RSSI sampling process should be carried out very quickly because some systems cannot tolerate a large delay in the location process. This is even more important when the RSSI sampling process includes measures with different signal power levels or frequency channels. The principal objective of this work is to speed up the RSSI sampling process in indoor localization systems. To achieve this objective, the main contribution is the proposal of a new MAC protocol that eliminates the medium access contention periods and decreases the number of packet collisions to accelerate the RSSI collection process. Moreover, the protocol increases the overall network throughput taking advantage of the frequency channel diversity. The presented results show the suitability of this protocol for reducing the RSSI gathering delay and increasing the network throughput in simulated and real environments.
Khoo, Bee Luan; Warkiani, Majid Ebrahimi; Tan, Daniel Shao-Weng; Bhagat, Ali Asgar S; Irwin, Darryl; Lau, Dawn Pingxi; Lim, Alvin S T; Lim, Kiat Hon; Krisna, Sai Sakktee; Lim, Wan-Teck; Yap, Yoon Sim; Lee, Soo Chin; Soo, Ross A; Han, Jongyoon; Lim, Chwee Teck
2014-01-01
Circulating tumor cells (CTCs) are cancer cells that can be isolated via liquid biopsy from blood and can be phenotypically and genetically characterized to provide critical information for guiding cancer treatment. Current analysis of CTCs is hindered by the throughput, selectivity and specificity of devices or assays used in CTC detection and isolation. Here, we enriched and characterized putative CTCs from blood samples of patients with advanced-stage metastatic breast and lung cancers using a novel multiplexed spiral microfluidic chip. This system detected putative CTCs with high sensitivity (100%, n = 56) (breast cancer samples: 12-1275 CTCs/ml; lung cancer samples: 10-1535 CTCs/ml) rapidly from clinically relevant blood volumes (7.5 ml in under 5 min). Blood samples were completely separated into plasma, CTC and PBMC fractions, and each fraction was characterized with immunophenotyping (pan-cytokeratin/CD45, CD44/CD24, EpCAM), fluorescence in-situ hybridization (FISH) (EML4-ALK) or targeted somatic mutation analysis. We used an ultra-sensitive mass spectrometry-based system to highlight the presence of an EGFR-activating mutation in both isolated CTCs and plasma cell-free DNA (cf-DNA), and demonstrate concordance with the original tumor-biopsy samples. We have clinically validated our multiplexed microfluidic chip for the ultrahigh-throughput, low-cost and label-free enrichment of CTCs. Retrieved cells were unlabeled and viable, enabling potential propagation and real-time downstream analysis using next-generation sequencing (NGS) or proteomic analysis.
Jiang, Yutao; Hascall, Daniel; Li, Delia; Pease, Joseph H
2015-09-11
In this paper, we introduce a high throughput LCMS/UV/CAD/CLND system that improves upon previously reported systems by increasing both the quantitation accuracy and the range of compounds amenable to testing, in particular, low molecular weight "fragment" compounds. This system consists of a charged aerosol detector (CAD) and chemiluminescent nitrogen detector (CLND) added to a LCMS/UV system. Our results show that the addition of CAD and CLND to LCMS/UV is more reliable for concentration determination for a wider range of compounds than either detector alone. Our setup also allows for the parallel analysis of each sample by all four detectors and so does not significantly increase run time per sample. Copyright © 2015 Elsevier B.V. All rights reserved.
Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli
2010-01-01
The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as scale-down models for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale-down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(and-6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO2, HCO3-, and the buffer system (HEPES), we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the required %CO2 from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work shed light on mini-bioreactor scale-down model construction and paved the way for cell culture process development to improve productivity or product quality using high throughput systems. Copyright 2009 American Institute of Chemical Engineers
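As a rough illustration of the kind of equilibrium calculation the abstract describes, the sketch below applies the Henderson-Hasselbalch relation for the CO2/bicarbonate pair alone. The apparent pKa (6.1), CO2 solubility (0.03 mM/mmHg) and pressure terms are typical textbook values, and the HEPES contribution included in the authors' full model is deliberately ignored, so the numbers are indicative only.

```python
# Simplified CO2/bicarbonate equilibrium (Henderson-Hasselbalch), ignoring the
# HEPES term of the authors' full model. Constants are typical values at 37 °C
# and are assumptions, not fitted parameters from the paper.
PKA_CO2 = 6.1          # apparent pKa of the CO2/HCO3- pair
S_CO2 = 0.03           # CO2 solubility, mM per mmHg
P_TOTAL = 760.0        # ambient pressure, mmHg
P_H2O = 47.0           # water vapour pressure at 37 °C, mmHg

def required_co2_percent(bicarbonate_mM, target_pH):
    """%CO2 in the incubator atmosphere needed to hold target_pH
    for a given medium bicarbonate concentration."""
    pco2 = bicarbonate_mM / (S_CO2 * 10 ** (target_pH - PKA_CO2))
    return 100.0 * pco2 / (P_TOTAL - P_H2O)

def required_bicarbonate_mM(co2_percent, target_pH):
    """Bicarbonate concentration needed to hold target_pH at a given %CO2."""
    pco2 = co2_percent / 100.0 * (P_TOTAL - P_H2O)
    return S_CO2 * pco2 * 10 ** (target_pH - PKA_CO2)

if __name__ == "__main__":
    print(round(required_co2_percent(24.0, 7.4), 1))    # ~5.6 %CO2 for 24 mM HCO3-
    print(round(required_bicarbonate_mM(5.0, 7.2), 1))  # ~13.5 mM HCO3- at 5% CO2
```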
Fridén, Markus; Ducrozet, Frederic; Middleton, Brian; Antonsson, Madeleine; Bredberg, Ulf; Hammarlund-Udenaes, Margareta
2009-06-01
New, more efficient methods of estimating unbound drug concentrations in the central nervous system (CNS) combine the amount of drug in whole brain tissue samples measured by conventional methods with in vitro estimates of the unbound brain volume of distribution (V(u,brain)). Although the brain slice method is the most reliable in vitro method for measuring V(u,brain), it has not previously been adapted for the needs of drug discovery research. The aim of this study was to increase the throughput and optimize the experimental conditions of this method. Equilibrium of drug between the buffer and the brain slice within the 4 to 5 h of incubation is a fundamental requirement. However, it is difficult to meet this requirement for many of the extensively binding, lipophilic compounds in drug discovery programs. In this study, the dimensions of the incubation vessel and mode of stirring influenced the equilibration time, as did the amount of brain tissue per unit of buffer volume. The use of cassette experiments for investigating V(u,brain) in a linear drug concentration range increased the throughput of the method. The V(u,brain) for the model compounds ranged from 4 to 3000 ml·g brain⁻¹, and the sources of variability are discussed. The optimized setup of the brain slice method allows precise, robust estimation of V(u,brain) for drugs with diverse properties, including highly lipophilic compounds. This is a critical step forward for the implementation of relevant measurements of CNS exposure in the drug discovery setting.
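For orientation, the sketch below shows the basic arithmetic commonly used with the brain slice method: the unbound volume of distribution is the ratio of the amount of drug in the slice to the unbound buffer concentration at equilibrium. The adherent-buffer correction and the example numbers are illustrative assumptions, not values from this study.

```python
def v_u_brain(amount_in_slice_nmol_per_g, buffer_conc_uM,
              adherent_buffer_ml_per_g=0.0):
    """Unbound brain volume of distribution, ml per g brain.

    amount_in_slice_nmol_per_g : drug measured in the slice (nmol/g brain)
    buffer_conc_uM             : unbound drug concentration in buffer at
                                 equilibrium (µM = nmol/ml)
    adherent_buffer_ml_per_g   : optional correction for the buffer film
                                 adhering to the slice (illustrative; 0 here)
    """
    corrected = amount_in_slice_nmol_per_g - adherent_buffer_ml_per_g * buffer_conc_uM
    return corrected / buffer_conc_uM

# Example with made-up numbers: 150 nmol/g in the slice at 1.5 µM in buffer
# gives V_u,brain = 100 ml/g, i.e. extensive non-specific binding.
print(v_u_brain(150.0, 1.5))  # 100.0
```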
Song, Jiao; Liu, Xuejun; Wu, Jiejun; Meehan, Michael J; Blevitt, Jonathan M; Dorrestein, Pieter C; Milla, Marcos E
2013-02-15
We have developed an ultra-performance liquid chromatography-multiple reaction monitoring/mass spectrometry (UPLC-MRM/MS)-based, high-content, high-throughput platform that enables simultaneous profiling of multiple lipids produced ex vivo in human whole blood (HWB) on treatment with calcium ionophore and its modulation with pharmacological agents. HWB samples were processed in a 96-well plate format compatible with high-throughput sample processing instrumentation. We employed a scheduled MRM (sMRM) method, with a triple-quadrupole mass spectrometer coupled to a UPLC system, to measure absolute amounts of 122 distinct eicosanoids using deuterated internal standards. In a 6.5-min run, we resolved and detected with high sensitivity (lower limit of quantification in the range of 0.4-460 pg) all targeted analytes from a very small HWB sample (2.5 μl). Approximately 90% of the analytes exhibited a dynamic range exceeding 1000. We also developed a tailored software package that dramatically sped up the overall data quantification and analysis process with superior consistency and accuracy. Matrix effects from HWB and precision of the calibration curve were evaluated using this newly developed automation tool. This platform was successfully applied to the global quantification of changes on all 122 eicosanoids in HWB samples from healthy donors in response to calcium ionophore stimulation. Copyright © 2012 Elsevier Inc. All rights reserved.
Hyperspectral imaging using the single-pixel Fourier transform technique
NASA Astrophysics Data System (ADS)
Jin, Senlin; Hui, Wangwei; Wang, Yunlong; Huang, Kaicheng; Shi, Qiushuai; Ying, Cuifeng; Liu, Dongqi; Ye, Qing; Zhou, Wenyuan; Tian, Jianguo
2017-03-01
Hyperspectral imaging technology is playing an increasingly important role in the fields of food analysis, medicine and biotechnology. To improve operating speed and increase light throughput within a compact instrument, a Fourier transform hyperspectral imaging system based on a single-pixel technique is proposed in this study. Compared with current imaging spectrometry approaches, the proposed system offers a wider spectral range (400-1100 nm), better spectral resolution (1 nm) and requires less measurement data (a sampling rate of 6.25%). The performance of this system was verified by its application to the non-destructive testing of potatoes.
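The data-reduction idea behind such systems can be illustrated numerically: only a small fraction of Fourier coefficients is measured and the signal is recovered from them. The toy sketch below keeps 6.25% of the Fourier coefficients of a synthetic spectrum and reconstructs it by zero-filled inverse FFT; it is a conceptual illustration of partial Fourier sampling, not a model of the authors' optical system.

```python
import numpy as np

# Toy illustration of partial Fourier sampling (6.25% of coefficients kept),
# not a simulation of the single-pixel instrument itself.
n = 1024
wavelength = np.linspace(400, 1100, n)            # nm, spectral axis
spectrum = (np.exp(-((wavelength - 650) / 30) ** 2)
            + 0.5 * np.exp(-((wavelength - 950) / 60) ** 2))

coeffs = np.fft.rfft(spectrum)
keep = int(0.0625 * n)                            # 6.25% sampling rate
truncated = np.zeros_like(coeffs)
truncated[:keep] = coeffs[:keep]                  # "measure" low frequencies only
reconstruction = np.fft.irfft(truncated, n)

rel_error = np.linalg.norm(reconstruction - spectrum) / np.linalg.norm(spectrum)
print(f"relative reconstruction error: {rel_error:.3f}")
```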
Téllez-Sosa, Juan; Rodríguez, Mario Henry; Gómez-Barreto, Rosa E.; Valdovinos-Torres, Humberto; Hidalgo, Ana Cecilia; Cruz-Hervert, Pablo; Luna, René Santos; Carrillo-Valenzo, Erik; Ramos, Celso; García-García, Lourdes; Martínez-Barnetche, Jesús
2013-01-01
Background Influenza viruses display a high mutation rate and complex evolutionary patterns. Next-generation sequencing (NGS) has been widely used for qualitative and semi-quantitative assessment of genetic diversity in complex biological samples. The “deep sequencing” approach, enabled by the enormous throughput of current NGS platforms, allows the identification of rare genetic viral variants in targeted genetic regions, but is usually limited to a small number of samples. Methodology and Principal Findings We designed a proof-of-principle study to test whether redistributing sequencing throughput from a high depth-small sample number towards a low depth-large sample number approach is feasible and contributes to influenza epidemiological surveillance. Using 454-Roche sequencing, we sequenced, at a rather low depth, a 307 bp amplicon of the neuraminidase gene of the Influenza A(H1N1) pandemic (A(H1N1)pdm) virus from cDNA amplicons pooled in 48 barcoded libraries obtained from nasal swab samples of infected patients (n = 299) taken during the May to November 2009 pandemic period in Mexico. This approach revealed that during the transition from the first (May-July) to the second wave (September-November) of the pandemic, the initial genetic variants were replaced by variants carrying the N248D mutation in the NA gene; it also enabled the establishment of temporal and geographic associations with genetic diversity and the identification of mutations associated with oseltamivir resistance. Conclusions NGS sequencing of a short amplicon from the NA gene at low sequencing depth allowed genetic screening of a large number of samples, providing insights into viral genetic diversity dynamics and the identification of genetic variants associated with oseltamivir resistance. Further research is needed to explain the observed replacement of the genetic variants seen during the second wave. As sequencing throughput rises and library multiplexing and automation improve, we foresee that the approach presented here can be scaled up for global genetic surveillance of influenza and other infectious diseases. PMID:23843978
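The depth-versus-sample-number trade-off discussed here can be made concrete with simple binomial arithmetic: the chance of seeing a variant present at frequency f at least k times in d reads. The sketch below uses illustrative numbers, not the study's actual error model or thresholds.

```python
from math import comb

def detection_probability(freq, depth, min_reads=3):
    """P(variant observed in at least `min_reads` reads) for a variant at
    frequency `freq` under simple binomial sampling of `depth` reads."""
    p_below = sum(comb(depth, k) * freq**k * (1 - freq)**(depth - k)
                  for k in range(min_reads))
    return 1.0 - p_below

# Illustrative numbers: a 5% variant at modest versus deep coverage.
print(round(detection_probability(0.05, 100), 3))    # ~0.88 at 100x
print(round(detection_probability(0.05, 1000), 3))   # ~1.0 at 1000x
```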
Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang
2017-04-01
Among all high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or ChIP-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex, owing to limitations imposed by multiplex primer interactions. This detection throughput cannot meet the demands of high-throughput applications such as SNP or gene expression analysis. Therefore, in our study, we developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which showed that the sensitivity of ME-qPCR is higher than that of the original qPCR. The absolute limit of detection of ME-qPCR was as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression analysis of food or agricultural samples. Graphical abstract: In the first-step amplification, four primers (A, B, C, and D) are added to the reaction volume, generating four kinds of amplicons. All four amplicons can be regarded as targets of the second-step PCR. In the second-step amplification, three parallel reactions are run for the final evaluation, from which the final amplification curves and melting curves are obtained.
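For readers unfamiliar with how Ct values map onto copy numbers, the sketch below shows standard qPCR standard-curve arithmetic (slope, amplification efficiency, and copy-number interpolation). The slope and intercept are made-up illustrative values, not parameters reported for ME-qPCR.

```python
# Generic qPCR standard-curve arithmetic; slope/intercept are illustrative,
# not parameters from the ME-qPCR study.
SLOPE = -3.32        # Ct per log10(copies); -3.32 corresponds to ~100% efficiency
INTERCEPT = 38.0     # Ct at 1 copy

def amplification_efficiency(slope=SLOPE):
    """E = 10^(-1/slope) - 1; 1.0 means a perfect doubling per cycle."""
    return 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct, slope=SLOPE, intercept=INTERCEPT):
    """Interpolate copy number from a measured Ct on the standard curve."""
    return 10 ** ((ct - intercept) / slope)

print(round(amplification_efficiency(), 3))   # ~1.0
print(round(copies_from_ct(31.4)))            # ~97 copies
```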
Santas, Jonathan; Guzmán, Yeimmy J; Guardiola, Francesc; Rafecas, Magdalena; Bou, Ricard
2014-11-01
A fluorometric method for the determination of hydroperoxides (HP) in edible oils and fats using the reagent diphenyl-1-pyrenylphosphine (DPPP) was developed and validated. Two solvent media containing 100% butanol or a mixture of chloroform/methanol (2:1, v/v) can be used to solubilise lipid samples. Regardless of the solvent used to solubilise the sample, the DPPP method was precise, accurate, sensitive and easy to perform. The HP content of 43 oil and fat samples was determined and the results were compared with those obtained by means of the AOCS Official Method for the determination of peroxide value (PV) and the ferrous oxidation-xylenol orange (FOX) method. The proposed method not only correlates well with the PV and FOX methods, but also presents some advantages such as requiring low sample and solvent amounts and being suitable for high-throughput sample analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
USDA-ARS's Scientific Manuscript database
Recent developments in high-throughput sequencing technology have made low-cost sequencing an attractive approach for many genome analysis tasks. Increasing read lengths, improving quality and the production of increasingly larger numbers of usable sequences per instrument-run continue to make whole...
Hupert, Mateusz L; Jackson, Joshua M; Wang, Hong; Witek, Małgorzata A; Kamande, Joyce; Milowsky, Matthew I; Whang, Young E; Soper, Steven A
2014-10-01
Microsystem-based technologies are providing new opportunities in the area of in vitro diagnostics due to their ability to provide process automation enabling point-of-care operation. As an example, microsystems used for the isolation and analysis of circulating tumor cells (CTCs) from complex, heterogeneous samples in an automated fashion with improved recoveries and selectivity are providing new opportunities for this important biomarker. Unfortunately, many of the existing microfluidic systems lack the throughput capabilities and/or are too expensive to manufacture to warrant their widespread use in clinical testing scenarios. Here, we describe a disposable, all-polymer, microfluidic system for the high-throughput (HT) isolation of CTCs directly from whole blood inputs. The device employs an array of high aspect ratio (HAR), parallel, sinusoidal microchannels (25 µm × 150 µm; W × D; AR = 6.0) with walls covalently decorated with anti-EpCAM antibodies to provide affinity-based isolation of CTCs. Channel width, which is similar to an average CTC diameter (12-25 µm), plays a critical role in maximizing the probability of cell/wall interactions and allows for achieving high CTC recovery. The extended channel depth allows for increased throughput at the optimized flow velocity (2 mm/s in a microchannel), maximizes cell recovery, and prevents clogging of the microfluidic channels during blood processing. Fluidic addressing of the microchannel array with a minimal device footprint is provided by large cross-sectional area feed and exit channels positioned orthogonally to the network of sinusoidal capillary channels (so-called Z-geometry). Computational modeling was used to confirm uniform addressing of the channels in the isolation bed. Devices with various numbers of parallel microchannels ranging from 50 to 320 have been successfully constructed. Cyclic olefin copolymer (COC) was chosen as the substrate material due to its superior properties during UV-activation of the HAR microchannel surfaces prior to antibody attachment. Operation of the HT-CTC device has been validated by isolation of CTCs directly from blood obtained from patients with metastatic prostate cancer. High CTC sample purities (low numbers of contaminating white blood cells, WBCs) allowed for direct lysis and molecular profiling of isolated CTCs.
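A quick estimate of the volumetric throughput implied by the quoted geometry can be obtained by multiplying channel cross-section, linear velocity and channel count. The calculation below treats 2 mm/s as the mean velocity in every channel, which is a simplifying assumption for illustration.

```python
# Back-of-envelope volumetric throughput from the quoted channel geometry,
# assuming 2 mm/s is the mean linear velocity in every parallel channel.
width_m, depth_m, velocity_m_s = 25e-6, 150e-6, 2e-3

def throughput_ml_per_h(n_channels):
    q_per_channel = width_m * depth_m * velocity_m_s    # m^3/s
    return n_channels * q_per_channel * 1e6 * 3600      # m^3/s -> mL/h

print(round(throughput_ml_per_h(50), 2))    # ~1.35 mL/h
print(round(throughput_ml_per_h(320), 2))   # ~8.64 mL/h
```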
Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L
2013-06-01
To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.
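To give a flavour of what script-driven acquisition of this kind looks like, the sketch below queues a list of samples and experiments in plain Python. The helper callbacks (change_sample, lock_and_shim, acquire) are placeholders invented for illustration; they are not NMRbot functions or TopSpin API calls, and the on-the-fly scan adjustment rule is an assumption.

```python
# Generic acquisition-queue sketch. change_sample/lock_and_shim/acquire are
# hypothetical placeholders, NOT NMRbot or TopSpin commands.
from dataclasses import dataclass

@dataclass
class Experiment:
    name: str          # e.g. "1D 1H" or "1H-15N HSQC"
    scans: int

def run_queue(samples, experiments, change_sample, lock_and_shim, acquire):
    """Run every experiment on every sample, re-acquiring with more scans
    if the previous spectrum looked weak (crude illustrative rule)."""
    for sample in samples:
        change_sample(sample)
        lock_and_shim(sample)
        for exp in experiments:
            snr = acquire(sample, exp)
            if snr < 10:                       # arbitrary quality threshold
                acquire(sample, Experiment(exp.name, exp.scans * 4))

if __name__ == "__main__":
    def fake_acquire(sample, exp):
        print("acquire", sample, exp.name, exp.scans)
        return 12.0                            # pretend signal-to-noise ratio
    run_queue(
        samples=["metabolite_A", "ligand_titration_3"],
        experiments=[Experiment("1D 1H", 16)],
        change_sample=lambda s: print("insert", s),
        lock_and_shim=lambda s: print("lock/shim", s),
        acquire=fake_acquire,
    )
```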
An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.
Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao
2016-09-01
The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed that auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
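As context for what plate-level correction involves, the sketch below applies a standard per-plate robust z-score (median/MAD) after row and column median polish, a common textbook way to flatten systematic plate effects. This is a generic normalization for illustration, not the automated QC pipeline described in the abstract.

```python
import numpy as np

def median_polish_residuals(plate, n_iter=2):
    """Remove row and column medians (a simple B-score-style correction)."""
    resid = plate.astype(float).copy()
    for _ in range(n_iter):
        resid -= np.median(resid, axis=1, keepdims=True)   # row effects
        resid -= np.median(resid, axis=0, keepdims=True)   # column effects
    return resid

def robust_zscore(plate):
    """Per-plate robust z-score using median and MAD."""
    med = np.median(plate)
    mad = 1.4826 * np.median(np.abs(plate - med))
    return (plate - med) / mad

# Synthetic 8x12 plate with a systematic first-column artifact.
rng = np.random.default_rng(0)
plate = rng.normal(100, 5, size=(8, 12))
plate[:, 0] += 30                                  # column-wise systematic error
corrected = robust_zscore(median_polish_residuals(plate))
print(np.abs(corrected[:, 0]).mean() < 2)          # artifact largely removed
```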
High-throughput microscopy must re-invent the microscope rather than speed up its functions
Oheim, M
2007-01-01
Knowledge gained from the revolutions in genomics and proteomics has helped to identify many of the key molecules involved in cellular signalling. Researchers, both in academia and in the pharmaceutical industry, now screen, at a sub-cellular level, where and when these proteins interact. Fluorescence imaging and molecular labelling combine to provide a powerful tool for real-time functional biochemistry with molecular resolution. However, these approaches have traditionally been work-intensive, have required trained personnel, and have suffered from low throughput due to sample preparation, loading and handling. The need for speeding up microscopy is apparent from the tremendous complexity of cellular signalling pathways, the inherent biological variability, as well as the possibility that the same molecule plays different roles in different sub-cellular compartments. Research institutes and companies have teamed up to develop imaging cytometers of ever-increasing complexity. However, to truly go high-speed, sub-cellular imaging must free itself from the rigid framework of current microscopes. PMID:17603553
Diroma, Maria Angela; Santorsola, Mariangela; Guttà, Cristiano; Gasparre, Giuseppe; Picardi, Ernesto; Pesole, Graziano; Attimonelli, Marcella
2014-01-01
Motivation: The increasing availability of mitochondria-targeted and off-target sequencing data in whole-exome and whole-genome sequencing studies (WXS and WGS) has raised the demand for effective pipelines to accurately measure heteroplasmy and to easily recognize the most functionally important mitochondrial variants among a huge number of candidates. For this purpose, we developed MToolBox, a highly automated pipeline to reconstruct and analyze human mitochondrial DNA from high-throughput sequencing data. Results: MToolBox implements an effective computational strategy for mitochondrial genome assembly and haplogroup assignment, also including a prioritization analysis of detected variants. MToolBox provides a Variant Call Format file featuring, for the first time, allele-specific heteroplasmy and annotation files with prioritized variants. MToolBox was tested on simulated samples and applied to 1000 Genomes WXS datasets. Availability and implementation: The MToolBox package is available at https://sourceforge.net/projects/mtoolbox/. Contact: marcella.attimonelli@uniba.it Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25028726
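For context, allele-specific heteroplasmy is usually expressed as the fraction of reads supporting the alternative allele at a position. The sketch below shows that arithmetic with a crude coverage filter; the threshold is an illustrative assumption, not MToolBox's actual filtering logic.

```python
def heteroplasmy_fraction(alt_reads, ref_reads, min_coverage=10):
    """Heteroplasmic fraction = alt / (alt + ref); None if coverage is too low.
    The coverage cut-off is illustrative, not MToolBox's filter."""
    depth = alt_reads + ref_reads
    if depth < min_coverage:
        return None
    return alt_reads / depth

print(heteroplasmy_fraction(37, 463))   # 0.074 -> ~7.4% heteroplasmy
print(heteroplasmy_fraction(2, 3))      # None (insufficient coverage)
```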
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2004-09-01
To increase the security and throughput of ISO container traffic through international terminals, more technology must be applied to the problem. A transnational central archive of inspection records that can be accessed by national agencies as ISO containers approach their borders is discussed. The intent is to improve the throughput and security of the cargo inspection process. A review of currently available digital media archiving technologies is presented, along with their possible application to the tracking of international ISO container shipments. Specific image formats employed by current x-ray inspection systems are discussed. Sample x-ray data from systems in use today that could be entered into such a system are shown. Data from other inspection technologies are shown to be easily integrated, as is the creation of database records suitable for interfacing with other computer systems. Overall system performance requirements are discussed in terms of security, response time and capacity. Suggestions for pilot projects based on existing border inspection processes are also made.
Single-cell genomic profiling of acute myeloid leukemia for clinical use: A pilot study
Yan, Benedict; Hu, Yongli; Ban, Kenneth H.K.; Tiang, Zenia; Ng, Christopher; Lee, Joanne; Tan, Wilson; Chiu, Lily; Tan, Tin Wee; Seah, Elaine; Ng, Chin Hin; Chng, Wee-Joo; Foo, Roger
2017-01-01
Although bulk high-throughput genomic profiling studies have led to a significant increase in the understanding of cancer biology, there is increasing awareness that bulk profiling approaches do not completely elucidate tumor heterogeneity. Single-cell genomic profiling enables the distinction of tumor heterogeneity, and may improve clinical diagnosis through the identification and characterization of putative subclonal populations. In the present study, the challenges associated with a single-cell genomics profiling workflow for clinical diagnostics were investigated. Single-cell RNA-sequencing (RNA-seq) was performed on 20 cells from an acute myeloid leukemia bone marrow sample. Putative blasts were identified based on their gene expression profiles and principal component analysis was performed to identify outlier cells. Variant calling was performed on the single-cell RNA-seq data. The present pilot study demonstrates a proof of concept for clinical single-cell genomic profiling. The recognized limitations include significant stochastic RNA loss and the relatively low throughput of the current proposed platform. Although the results of the present study are promising, further technological advances and protocol optimization are necessary for single-cell genomic profiling to be clinically viable. PMID:28454300
NASA Astrophysics Data System (ADS)
Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.
2013-06-01
High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.
Kim, Joseph; Flick, Jeanette; Reimer, Michael T; Rodila, Ramona; Wang, Perry G; Zhang, Jun; Ji, Qin C; El-Shourbagy, Tawakol A
2007-11-01
An effective DPP-IV inhibitor, 2-(4-((2-(2S,5R)-2-Cyano-5-ethynyl-1-pyrrolidinyl)-2-oxoethylamino)-4-methyl-1-piperidinyl)-4-pyridinecarboxylic acid (ABT-279), is an investigational drug candidate under development at Abbott Laboratories for the potential treatment of type 2 diabetes. In order to support the development of ABT-279, multiple analytical methods for an accurate, precise and selective concentration determination of ABT-279 in different matrices were developed and validated in accordance with the US Food and Drug Administration Guidance on Bioanalytical Method Validation. The analytical method for ABT-279 in dog plasma was validated in parallel to other validations for ABT-279 determination in different matrices. In order to shorten the sample preparation time and increase method precision, an automated multi-channel liquid handler was used to perform high-throughput protein precipitation and all other liquid transfers. The separation was performed through a Waters YMC ODS-AQ column (2.0 × 150 mm, 5 µm, 120 Å) with a mobile phase of 20 mM ammonium acetate in 20% acetonitrile at a flow rate of 0.3 mL/min. Data collection started at 2.2 min and continued for 2.0 min. The validated linear dynamic range in dog plasma was between 3.05 and 2033.64 ng/mL using a 50 µL sample volume. The r² coefficient of determination achieved over three consecutive runs was between 0.998625 and 0.999085. The mean bias was between -4.1 and 4.3% for all calibration standards, including the lower limit of quantitation. The mean bias was between -8.0 and 0.4% for the quality control samples. The precision, expressed as a coefficient of variation (CV), was ≤4.1% for all levels of quality control samples. The validation results demonstrated that the high-throughput method was accurate, precise and selective for the determination of ABT-279 in dog plasma. The validated method was also employed to support two toxicology studies. The passing rate was 100% for all 49 runs from one validation study and two toxicology studies. Copyright (c) 2007 John Wiley & Sons, Ltd.
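The accuracy and precision figures quoted here (mean bias, CV) come from simple summary statistics over QC replicates. The sketch below shows that arithmetic on made-up replicate values; the numbers are not data from the ABT-279 validation.

```python
from statistics import mean, stdev

def bias_and_cv(measured, nominal):
    """Mean %bias and %CV for a set of QC replicates at one nominal level."""
    m = mean(measured)
    bias_pct = 100.0 * (m - nominal) / nominal
    cv_pct = 100.0 * stdev(measured) / m
    return bias_pct, cv_pct

# Made-up low-QC replicates (ng/mL) around a 9.0 ng/mL nominal concentration.
qc_low = [8.7, 9.1, 8.9, 9.3, 8.8, 9.0]
bias, cv = bias_and_cv(qc_low, 9.0)
print(f"bias = {bias:+.1f}%, CV = {cv:.1f}%")   # bias ≈ -0.4%, CV ≈ 2.4%
```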
Flores, Shahida; Sun, Jie; King, Jonathan; Budowle, Bruce
2014-05-01
The GlobalFiler™ Express PCR Amplification Kit uses 6-dye fluorescent chemistry to enable multiplexing of 21 autosomal STRs, 1 Y-STR, 1 Y-indel and the sex-determining marker amelogenin. The kit is specifically designed for processing reference DNA samples in a high throughput manner. Validation studies were conducted to assess the performance and define the limitations of this direct amplification kit for typing blood and buccal reference DNA samples on various punchable collection media. Studies included thermal cycling sensitivity, reproducibility, precision, sensitivity of detection, minimum detection threshold, system contamination, stochastic threshold and concordance. Results showed that optimal amplification and injection parameters for a 1.2 mm punch from blood and buccal samples were 27 and 28 cycles, respectively, combined with a 12 s injection on an ABI 3500xL Genetic Analyzer. Minimum detection thresholds were set at 100 and 120 RFU for 27 and 28 cycles, respectively, and it was suggested that data from positive amplification controls provided a better threshold representation. Stochastic thresholds were set at 250 and 400 RFU for 27 and 28 cycles, respectively, as stochastic effects increased with cycle number. The minimum amount of input DNA resulting in a full profile was 0.5 ng; however, the optimum range determined was 2.5-10 ng. Profile quality from the GlobalFiler™ Express Kit and the previously validated AmpFlSTR® Identifiler® Direct Kit was comparable. The validation data support that reliable DNA typing results from reference DNA samples can be obtained using the GlobalFiler™ Express PCR Amplification Kit. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
High-throughput microtitre plate-based assay for DNA topoisomerases.
Taylor, James A; Burton, Nicolas P; Maxwell, Anthony
2012-01-01
We have developed a rapid, high-throughput assay for measuring the catalytic activity (DNA supercoiling or relaxation) of DNA topoisomerases. The assay utilizes intermolecular triplex formation between an immobilized triplex-forming oligo (TFO) and a triplex-forming region inserted into the plasmid substrate (pNO1), and capitalizes on the observation that supercoiled DNA forms triplexes more readily than relaxed DNA. Thus, supercoiled DNA is preferentially retained by the TFO under triplex-forming conditions while relaxed DNA can be washed away. Due to its high speed of sample analysis and reduced sample handling over conventional gel-based techniques, this assay can be used to screen chemical libraries for novel inhibitors of topoisomerases.
Wright, Justin; Kirchner, Veronica; Bernard, William; Ulrich, Nikea; McLimans, Christopher; Campa, Maria F.; Hazen, Terry; Macbeth, Tamzen; Marabello, David; McDermott, Jacob; Mackelprang, Rachel; Roth, Kimberly; Lamendella, Regina
2017-01-01
The uncontrolled release of the industrial solvent methylene chloride, also known as dichloromethane (DCM), has resulted in widespread groundwater contamination in the United States. Here we investigate the role of groundwater bacterial communities in the natural attenuation of DCM at an undisclosed manufacturing site in New Jersey. This study investigates the bacterial community structure of groundwater samples differentially contaminated with DCM to better understand the biodegradation potential of these autochthonous bacterial communities. Bacterial community analysis was completed using high-throughput sequencing of the 16S rRNA gene of groundwater samples (n = 26) with DCM contamination ranging from 0.89 to 9,800,000 μg/L. Significant DCM concentration-driven shifts in overall bacterial community structure were identified between samples, including an increase in the abundance of Firmicutes within the most contaminated samples. Across all samples, a total of 6,134 unique operational taxonomic units (OTUs) were identified, with 16 taxa having strong correlations with increased DCM concentration. Putative DCM degraders such as Pseudomonas, Dehalobacterium and Desulfovibrio were present within groundwater across all levels of DCM contamination. Interestingly, each of these taxa dominated a specific DCM contamination range. Potential DCM-degrading lineages not yet specifically reported as DCM degraders, such as Desulfosporosinus, thrived within the most heavily contaminated groundwater samples. Co-occurrence network analysis revealed that aerobic and anaerobic bacterial taxa with DCM-degrading potential were present at the study site. Our 16S rRNA gene survey serves as the first in situ bacterial community assessment of contaminated groundwater harboring DCM concentrations ranging over seven orders of magnitude. Diversity analyses revealed known as well as potentially novel DCM-degrading taxa within defined DCM concentration ranges, indicating niche-specific responses of these autochthonous populations. Altogether, our findings suggest that monitored natural attenuation is an appropriate remediation strategy for DCM contamination, and that high-throughput sequencing technologies are a robust method for assessing the potential role of biodegrading bacterial assemblages in the apparent reduction of DCM concentrations in environmental scenarios. PMID:29213257
Are Uncultivated Bacteria Really Uncultivable?
Puspita, Indun Dewi; Kamagata, Yoichi; Tanaka, Michiko; Asano, Kozo; Nakatsu, Cindy H.
2012-01-01
Many strategies have been used to increase the number of bacterial cells that can be grown from environmental samples, but cultivation efficiency remains a challenge for microbial ecologists. The difficulties of cultivating a fraction of the bacteria in environmental samples can be classified into two non-exclusive categories. The first comprises bacterial taxa with no cultivated representatives, for which the laboratory conditions necessary for growth have yet to be identified. The other comprises cells in a non-dividing state (also known as dormant or viable but not culturable cells) that require the removal or addition of certain factors to re-initiate growth. A number of strategies, ranging from simple to high-throughput techniques, that have been used to increase the cultivation efficiency of environmental samples are reviewed. Some of the underlying mechanisms that contribute to the success of these cultivation strategies are described. Overall, this review emphasizes the need for researchers to first understand the factors that are hindering cultivation in order to identify the best strategies to improve cultivation efficiency. PMID:23059723
NASA Astrophysics Data System (ADS)
Yu, Hao Yun; Liu, Chun-Hung; Shen, Yu Tian; Lee, Hsuan-Ping; Tsai, Kuen Yu
2014-03-01
Line edge roughness (LER), which influences the electrical performance of circuit components, is a key challenge for electron-beam lithography (EBL) due to the continuous scaling of technology feature sizes. Controlling LER within an acceptable tolerance that satisfies International Technology Roadmap for Semiconductors requirements while achieving high throughput has become a challenging issue. Although lower dosage and more sensitive resists can be used to improve throughput, they result in serious LER-related problems because of the increased relative fluctuation in the incident positions of electrons. Directed self-assembly (DSA) is a promising technique for relaxing LER-related pattern fidelity (PF) requirements because of its self-healing ability, which may benefit throughput. To quantify the potential throughput improvement in EBL from introducing DSA for post-healing, rigorous numerical methods are proposed to simultaneously maximize throughput by adjusting the writing parameters of EBL systems subject to relaxed LER-related PF requirements. A fast, continuous model for parameter sweeping and a hybrid model for more accurate patterning prediction are employed for the patterning simulation. The trade-off between throughput and DSA self-healing ability is investigated. Preliminary results indicate that significant throughput improvements are achievable under certain process conditions.
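The throughput-versus-LER tension can be sketched with a commonly used shot-noise approximation in which the stochastic LER contribution scales roughly as dose^(-1/2) and write time scales roughly linearly with dose. Under those simplifying assumptions (which are not the rigorous models used in this study), relaxing the LER tolerance by a given factor translates into a quadratic dose and write-time saving.

```python
# Shot-noise back-of-envelope: LER_stochastic ~ dose**-0.5, write time ~ dose.
# These scalings are common simplifications, not the paper's rigorous models.

def dose_scale_for_relaxed_ler(ler_relaxation_factor):
    """If DSA healing tolerates LER larger by this factor, the dose can drop
    by the square of the factor (and write time roughly with it)."""
    return 1.0 / ler_relaxation_factor ** 2

for factor in (1.0, 1.2, 1.5, 2.0):
    print(f"LER tolerance x{factor:.1f} -> relative dose "
          f"{dose_scale_for_relaxed_ler(factor):.2f}")
# x1.5 relaxation -> ~0.44x dose, i.e. roughly 2.25x faster writing
```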
NASA Astrophysics Data System (ADS)
Hatzenbuhler, Chelsea; Kelly, John R.; Martinson, John; Okum, Sara; Pilgrim, Erik
2017-04-01
High-throughput DNA metabarcoding has gained recognition as a potentially powerful tool for biomonitoring, including early detection of aquatic invasive species (AIS). DNA-based techniques are advancing, but our understanding of the limits to detection for metabarcoding complex samples is inadequate. For detecting AIS at an early stage of invasion, when the species is rare, accuracy at low detection limits is key. To evaluate the utility of metabarcoding in future fish community monitoring programs, we conducted several experiments to determine the sensitivity and accuracy of routine metabarcoding methods. Experimental mixes used larval fish tissue from multiple “common” species spiked with varying proportions of tissue from an additional “rare” species. Pyrosequencing of the genetic marker COI (cytochrome c oxidase subunit I) and subsequent sequence data analysis provided experimental evidence of low-level detection of the target “rare” species at biomass percentages as low as 0.02% of total sample biomass. Limits to detection varied interspecifically and were susceptible to amplification bias. Moreover, results showed that some data processing methods can skew sequence-based biodiversity measurements away from the corresponding relative biomass abundances and increase false absences. We suggest caution in interpreting presence/absence and relative abundance in larval fish assemblages until metabarcoding methods are optimized for accuracy and precision.
Yang, Seung Hak; Lim, Joung Soo; Khan, Modabber Ahmed; Kim, Bong Soo; Choi, Dong Yoon; Lee, Eun Young; Ahn, Hee Kwon
2015-01-01
The leachate generated by the decomposition of animal carcass has been implicated as an environmental contaminant surrounding the burial site. High-throughput nucleotide sequencing was conducted to investigate the bacterial communities in leachates from the decomposition of pig carcasses. We acquired 51,230 reads from six different samples (1, 2, 3, 4, 6 and 14 week-old carcasses) and found that sequences representing the phylum Firmicutes predominated. The diversity of bacterial 16S rRNA gene sequences in the leachate was the highest at 6 weeks, in contrast to those at 2 and 14 weeks. The relative abundance of Firmicutes was reduced, while the proportion of Bacteroidetes and Proteobacteria increased from 3–6 weeks. The representation of phyla was restored after 14 weeks. However, the community structures between the samples taken at 1–2 and 14 weeks differed at the bacterial classification level. The trend in pH was similar to the changes seen in bacterial communities, indicating that the pH of the leachate could be related to the shift in the microbial community. The results indicate that the composition of bacterial communities in leachates of decomposing pig carcasses shifted continuously during the study period and might be influenced by the burial site. PMID:26500442
The cost of large numbers of hypothesis tests on power, effect size and sample size.
Lazzeroni, L C; Ray, A
2012-01-01
Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
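The quoted 13% and 70% figures follow from standard normal-approximation sample-size formulas with a Bonferroni-adjusted significance threshold, where required n scales with (z_{alpha/(2m)} + z_{power})^2. The sketch below reproduces that arithmetic; it is a simple approximation of the paper's framework using only a Bonferroni correction.

```python
from statistics import NormalDist

def sample_size_ratio(m_new, m_ref, alpha=0.05, power=0.80):
    """Ratio of required sample sizes when the number of tests grows from
    m_ref to m_new, using a Bonferroni-adjusted two-sided threshold and the
    normal-approximation formula n ~ (z_alpha + z_power)**2."""
    z = NormalDist().inv_cdf
    z_power = z(power)
    z_new = z(1 - alpha / (2 * m_new))
    z_ref = z(1 - alpha / (2 * m_ref))
    return ((z_new + z_power) / (z_ref + z_power)) ** 2

print(f"{(sample_size_ratio(1e7, 1e6) - 1) * 100:.0f}% larger n "
      "for 10 million vs 1 million tests")   # ~13%
print(f"{(sample_size_ratio(10, 1) - 1) * 100:.0f}% larger n "
      "for 10 tests vs a single test")       # ~70%
```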
High Resolution Separations and Improved Ion Production and Transmission in Metabolomics
Metz, Thomas O.; Page, Jason S.; Baker, Erin S.; Tang, Keqi; Ding, Jie; Shen, Yufeng; Smith, Richard D.
2008-01-01
The goal of metabolomics analyses is the detection and quantitation of as many sample components as reasonably possible in order to identify compounds or “features” that can be used to characterize the samples under study. When utilizing electrospray ionization to produce ions for analysis by mass spectrometry (MS), it is important that metabolome sample constituents be efficiently separated prior to ion production, in order to minimize ionization suppression and thereby extend the dynamic range of the measurement, as well as the coverage of the metabolome. Similarly, optimization of the MS inlet and interface can lead to increased measurement sensitivity. This perspective review will focus on the role of high resolution liquid chromatography (LC) separations in conjunction with improved ion production and transmission for LC-MS-based metabolomics. Additional emphasis will be placed on the compromise between metabolome coverage and sample analysis throughput. PMID:19255623
Yi, Ming; Zhao, Yongmei; Jia, Li; He, Mei; Kebebew, Electron; Stephens, Robert M.
2014-01-01
To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We used an Illumina exome-seq dataset as a benchmark, with two validation scenarios (family pedigree information and SNP array data for the same samples) permitting global high-throughput cross-validation, to evaluate the quality of SNP calls derived from several popular variant discovery tools from both the open-source and commercial communities, using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data, which provides insight into the accuracy of SNP calling in a way not previously possible; earlier comparison studies assessed only the concordance of these tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools, with a focus on exome-seq, and that can be used to compare any forthcoming tools of interest. PMID:24831545
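Mendelian-inheritance checking of the kind used for validation here reduces, at each biallelic site, to asking whether one child allele can be inherited from the mother and the other from the father. The sketch below implements that check for a trio; the genotype encoding and example calls are illustrative.

```python
def mendelian_consistent(child, mother, father):
    """True if the child's genotype at a biallelic site can be explained by
    one allele from each parent. Genotypes are two-allele tuples, e.g. ("A", "G")."""
    c1, c2 = child
    return (c1 in mother and c2 in father) or (c2 in mother and c1 in father)

def mendelian_error_rate(trio_calls):
    """Fraction of sites violating Mendelian inheritance in a list of
    (child, mother, father) genotype tuples."""
    errors = sum(not mendelian_consistent(c, m, f) for c, m, f in trio_calls)
    return errors / len(trio_calls)

calls = [(("A", "G"), ("A", "A"), ("G", "G")),   # consistent
         (("T", "T"), ("C", "C"), ("C", "T"))]   # violation: mother carries no T
print(mendelian_error_rate(calls))               # 0.5
```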
Marques, Sara S.; Magalhães, Luís M.; Tóth, Ildikó V.; Segundo, Marcela A.
2014-01-01
Total antioxidant capacity (TAC) assays are recognized as instrumental for establishing the antioxidant status of biological samples; however, varying experimental conditions lead to conclusions that may not be transposable to other settings. After selection of the complexing agent, reagent addition order, and buffer type and concentration, copper reducing assays were adapted to a high-throughput scheme and validated using the model biological antioxidant compounds ascorbic acid, Trolox (a soluble analogue of vitamin E), uric acid and glutathione. A critical comparison was made based on real samples, including the NIST-909c certified human serum sample and five study samples. The validated method provided a linear range up to 100 µM Trolox (limit of detection 2.3 µM; limit of quantification 7.7 µM), with recovery results above 85% and precision <5%. The validated method, with its increased sensitivity, is a sound choice for the assessment of TAC in serum samples. PMID:24968275
High-throughput hyperpolarized 13C metabolic investigations using a multi-channel acquisition system
NASA Astrophysics Data System (ADS)
Lee, Jaehyuk; Ramirez, Marc S.; Walker, Christopher M.; Chen, Yunyun; Yi, Stacey; Sandulache, Vlad C.; Lai, Stephen Y.; Bankson, James A.
2015-11-01
Magnetic resonance imaging and spectroscopy of hyperpolarized (HP) compounds such as [1-13C]-pyruvate have shown tremendous potential for offering new insight into disease and response to therapy. New applications of this technology in clinical research and care will require extensive validation in cells and animal models, a process that may be limited by the high cost and modest throughput associated with dynamic nuclear polarization. The relatively wide spectral separation between [1-13C]-pyruvate and its chemical endpoints in vivo is conducive to simultaneous multi-sample measurements, even in the presence of a suboptimal global shim. Multi-channel acquisitions could reduce costs and accelerate experiments by allowing acquisition from multiple independent samples following a single dissolution. Unfortunately, many existing preclinical MRI systems are equipped with only a single channel for broadband acquisitions. In this work, we examine the feasibility of this concept using a broadband multi-channel digital receiver extension and detector arrays that allow concurrent measurement of dynamic spectroscopic data from ex vivo enzyme phantoms, in vitro anaplastic thyroid carcinoma cells, and in vivo in tumor-bearing mice. Throughput and the cost of consumables were improved by up to a factor of four. These preliminary results demonstrate the potential for efficient multi-sample studies employing hyperpolarized agents.
Fast-mode duplex qPCR for BCR-ABL1 molecular monitoring: innovation, automation, and harmonization.
Gerrard, Gareth; Mudge, Katherine; Foskett, Pierre; Stevens, David; Alikian, Mary; White, Helen E; Cross, Nicholas C P; Apperley, Jane; Foroni, Letizia
2012-07-01
Reverse transcription quantitative polymerase chain reaction (RT-qPCR) is currently the most sensitive tool available for the routine monitoring of disease level in patients undergoing treatment for BCR-ABL1-associated malignancies. Considerable effort has been invested at both the local and international levels to standardise the methodology and reporting criteria used to assess this critical metric. In an effort to accommodate the demands of increasing sample throughput and greater standardization, we adapted the current best-practice guidelines to encompass automation platforms and improved multiplex RT-qPCR technology.
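For orientation, BCR-ABL1 molecular monitoring is typically reported as a percentage ratio of BCR-ABL1 to control-gene transcripts, converted to the International Scale (IS) with a laboratory-specific conversion factor and optionally expressed as a molecular-response (MR) log reduction. The sketch below shows that arithmetic; the conversion factor and copy numbers are illustrative, not values from this study.

```python
import math

def bcr_abl1_is(bcr_abl1_copies, control_copies, conversion_factor):
    """%BCR-ABL1 on the International Scale (control gene e.g. ABL1).
    conversion_factor is laboratory-specific; the value used below is illustrative."""
    ratio_pct = 100.0 * bcr_abl1_copies / control_copies
    return ratio_pct * conversion_factor

def molecular_response(is_pct):
    """MR value: log10 reduction from the standardized 100% baseline."""
    return math.log10(100.0 / is_pct)

is_pct = bcr_abl1_is(bcr_abl1_copies=52, control_copies=68000,
                     conversion_factor=1.3)
print(f"{is_pct:.3f}% IS, MR{molecular_response(is_pct):.1f}")  # ~0.099% IS, MR3.0
```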
Lee, Ju Yeon; Kim, Jin Young; Cheon, Mi Hee; Park, Gun Wook; Ahn, Yeong Hee; Moon, Myeong Hee; Yoo, Jong Shin
2014-02-26
A rapid, simple, and reproducible MRM-based validation method for serological glycoprotein biomarkers in clinical use was developed by targeting the nonglycosylated tryptic peptides adjacent to N-glycosylation sites. Since changes in protein glycosylation are known to be associated with a variety of diseases, glycoproteins have been major targets in biomarker discovery. We previously found that nonglycosylated tryptic peptides adjacent to N-glycosylation sites differed in concentration between normal and hepatocellular carcinoma (HCC) plasma due to differences in the steric hindrance of the glycan moiety in N-glycoproteins to tryptic digestion (Lee et al., 2011). To increase the feasibility and applicability of clinical validation of biomarker candidates (nonglycosylated tryptic peptides), we developed a method to effectively monitor nonglycosylated tryptic peptides from a large number of plasma samples and to reduce the total analysis time while maximizing the effect of steric hindrance by the glycans during digestion of glycoproteins. The AUC values of the targeted nonglycosylated tryptic peptides were excellent (0.955 for GQYCYELDEK, 0.880 for FEDGVLDPDYPR and 0.907 for TEDTIFLR), indicating that these could be effective biomarkers for hepatocellular carcinoma. This method provides the throughput required to validate glycoprotein biomarkers, as well as quantitative accuracy for human plasma analysis, and should be amenable to clinical use. Difficulties in verifying and validating putative protein biomarkers are often caused by the complex sample preparation procedures required to determine their concentrations in a large number of plasma samples. To address these difficulties, we developed MRM-based protein biomarker assays that greatly reduce complex, time-consuming, and less reproducible sample pretreatment steps in plasma for clinical implementation. First, we used undepleted human plasma samples without any enrichment procedures. Using nanoLC/MS/MS, we targeted nonglycosylated tryptic peptides adjacent to N-linked glycosylation sites in N-linked glycoprotein biomarkers, which could be detected in human plasma samples without depleting highly abundant proteins. Second, human plasma proteins were digested with trypsin without reduction and alkylation procedures to minimize sample preparation. Third, trypsin digestion times were shortened so as to obtain reproducible results while maximizing the steric hindrance effect of the glycans during enzyme digestion. Finally, this rapid and simple sample preparation method was applied to validate targeted nonglycosylated tryptic peptides as liver cancer biomarker candidates for diagnosis in 40 normal and 41 hepatocellular carcinoma (HCC) human plasma samples. This strategy provided the throughput required to monitor protein biomarkers, as well as quantitative accuracy in human plasma analysis. From biomarker discovery to clinical implementation, our method will provide a biomarker study platform that is suitable for clinical deployment and can be applied to high-throughput approaches. Copyright © 2014 Elsevier B.V. All rights reserved.
MIPHENO: Data normalization for high throughput metabolic analysis.
High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...
Alhusban, Ala A; Gaudry, Adam J; Breadmore, Michael C; Gueven, Nuri; Guijt, Rosanne M
2014-01-03
Cell culture has replaced many in vivo studies because of ethical and regulatory measures as well as the possibility of increased throughput. Analytical assays to determine (bio)chemical changes are often based on end-point measurements rather than on a series of sequential determinations. The purpose of this work is to develop an analytical system for monitoring cell culture based on sequential injection-capillary electrophoresis (SI-CE) with capacitively coupled contactless conductivity detection (C4D). The system was applied for monitoring lactate production, an important metabolic indicator, during mammalian cell culture. Using a background electrolyte consisting of 25 mM tris(hydroxymethyl)aminomethane and 35 mM cyclohexyl-2-aminoethanesulfonic acid with 0.02% poly(ethyleneimine) (PEI) at pH 8.65 and a multilayer polymer-coated capillary, lactate could be resolved from other compounds present in the media, with a relative standard deviation of 0.07% for intraday electrophoretic mobility and an analysis time of less than 10 min. Using the human embryonic kidney cell line HEK293, lactate concentrations in the cell culture medium were measured every 20 min over 3 days, requiring only 8.73 μL of sample per run. Combining simplicity, portability, automation, high sample throughput, low limits of detection, low sample consumption and the ability to up- and outscale, this new methodology represents a promising technique for near real-time monitoring of chemical changes in diverse cell culture applications. Copyright © 2013 Elsevier B.V. All rights reserved.
The salivary microbiome for differentiating individuals: proof of principle.
Leake, Sarah L; Pagni, Marco; Falquet, Laurent; Taroni, Franco; Greub, Gilbert
2016-06-01
Human identification has played a prominent role in forensic science for the past two decades. Identification based on unique genetic traits is driving the field. However, this may have limitations, for instance, for twins. Moreover, high-throughput sequencing techniques are now available and may provide a high amount of data likely useful in forensic science. This study investigates the potential for bacteria found in the salivary microbiome to be used to differentiate individuals. Two different targets (16S rRNA and rpoB) were chosen to maximise coverage of the salivary microbiome and when combined, they increase the power of differentiation (identification). Paired-end Illumina high-throughput sequencing was used to analyse the bacterial composition of saliva from two different people at four different time points (t = 0 and t = 28 days and then one year later at t = 0 and t = 28 days). Five major phyla dominate the samples: Firmicutes, Proteobacteria, Actinobacteria, Bacteroidetes and Fusobacteria. Streptococcus, a Firmicutes, is one of the most abundant aerobic genera found in saliva and targeting Streptococcus rpoB has enabled a deeper characterisation of the different streptococci species, which cannot be differentiated using 16S rRNA alone. We have observed that samples from the same person group together regardless of time of sampling. The results indicate that it is possible to distinguish two people using the bacterial microbiota present in their saliva. Copyright © 2016 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Noyes, Ben F.; Mokaberi, Babak; Oh, Jong Hun; Kim, Hyun Sik; Sung, Jun Ha; Kea, Marc
2016-03-01
One of the keys to successful mass production of sub-20nm nodes in the semiconductor industry is the development of an overlay correction strategy that can meet specifications, reduce the number of layers that require dedicated chuck overlay, and minimize measurement time. Three important aspects of this strategy are: correction per exposure (CPE), integrated metrology (IM), and the prioritization of automated correction over manual subrecipes. The first and third aspects are accomplished through an APC system that uses measurements from production lots to generate CPE corrections that are dynamically applied to future lots. The drawback of this method is that production overlay sampling must be extremely high in order to provide the system with enough data to generate CPE. That drawback makes IM particularly difficult because of the throughput impact that can be created on expensive bottleneck photolithography process tools. The goal is to realize the cycle time and feedback benefits of IM coupled with the enhanced overlay correction capability of automated CPE without impacting process tool throughput. This paper will discuss the development of a system that sends measured data with reduced sampling via an optimized layout to the exposure tool's computational modelling platform to predict and create "upsampled" overlay data in a customizable output layout that is compatible with the fab user CPE APC system. The result is dynamic CPE without the burden of extensive measurement time, which leads to increased utilization of IM.
The high throughput virtual slit enables compact, inexpensive Raman spectral imagers
NASA Astrophysics Data System (ADS)
Gooding, Edward; Deutsch, Erik R.; Huehnerhoff, Joseph; Hajian, Arsen R.
2018-02-01
Raman spectral imaging is increasingly becoming the tool of choice for field-based applications such as threat, narcotics and hazmat detection; air, soil and water quality monitoring; and material ID. Conventional fiber-coupled point source Raman spectrometers effectively interrogate a small sample area and identify bulk samples via spectral library matching. However, these devices are very slow at mapping over macroscopic areas. In addition, the spatial averaging performed by instruments that collect binned spectra, particularly when used in combination with orbital raster scanning, tends to dilute the spectra of trace particles in a mixture. Our design, employing free space line illumination combined with area imaging, reveals both the spectral and spatial content of heterogeneous mixtures. This approach is well suited to applications such as detecting trace particles of explosives and narcotics in fingerprints. The patented High Throughput Virtual Slit (HTVS) is an innovative optical design that enables compact, inexpensive handheld Raman spectral imagers. HTVS-based instruments achieve significantly higher spectral resolution than can be obtained with conventional designs of the same size. Alternatively, they can be used to build instruments with resolution comparable to large spectrometers, but substantially smaller size, weight and unit cost, all while maintaining high sensitivity. When used in combination with laser line imaging, this design eliminates sample photobleaching and unwanted photochemistry while greatly enhancing mapping speed, all with high selectivity and sensitivity. We will present spectral image data and discuss applications that are made possible by low cost HTVS-enabled instruments.
Pre-amplification in the context of high-throughput qPCR gene expression experiment.
Korenková, Vlasta; Scott, Justin; Novosadová, Vendula; Jindřichová, Marie; Langerová, Lucie; Švec, David; Šídová, Monika; Sjöback, Robert
2015-03-11
With the introduction of the first high-throughput qPCR instrument on the market it became possible to perform thousands of reactions in a single run, compared to the previous hundreds. In a high-throughput reaction, only limited volumes of highly concentrated cDNA or DNA samples can be added. This constraint can be addressed by pre-amplification, which has become a part of the high-throughput experimental workflow. Here, we focused our attention on the limits of the specific target pre-amplification reaction and propose the optimal, general setup for a gene expression experiment using the BioMark instrument (Fluidigm). To evaluate different pre-amplification factors, the following conditions were combined: four human blood samples from healthy donors and five transcripts with high to low expression levels; each cDNA sample was pre-amplified with four cycle numbers (15, 18, 21, and 24) and at five concentrations (equivalent to 0.078 ng, 0.32 ng, 1.25 ng, 5 ng, and 20 ng of total RNA). Factors identified as critical for the success of cDNA pre-amplification were the number of pre-amplification cycles, the total RNA concentration, and the type of gene. The selected pre-amplification reactions were further tested for optimal Cq distribution in a BioMark Array. The following concentrations combined with pre-amplification cycles were optimal for good quality samples: 20 ng of total RNA with 15 cycles of pre-amplification, 20x and 40x diluted; and 5 ng and 20 ng of total RNA with 18 cycles of pre-amplification, both 20x and 40x diluted. We set upper limits for the bulk gene expression experiment using the gene expression Dynamic Array and provided an easy-to-obtain tool for measuring pre-amplification success. We also showed that the variability introduced by pre-amplification into the experimental workflow of reverse transcription-qPCR is lower than the variability caused by the reverse transcription step.
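To make the size of this factorial design explicit, a minimal sketch is shown below; the donor and transcript labels are hypothetical placeholders, while the cycle numbers, RNA inputs and the combinations reported as optimal are taken from the abstract.

from itertools import product

# Factorial pre-amplification design: 4 donors x 5 transcripts x 4 cycle numbers x 5 RNA inputs.
donors       = ["donor_1", "donor_2", "donor_3", "donor_4"]                # hypothetical labels
transcripts  = ["gene_A", "gene_B", "gene_C", "gene_D", "gene_E"]          # hypothetical labels
cycles       = [15, 18, 21, 24]
rna_input_ng = [0.078, 0.32, 1.25, 5, 20]

conditions = list(product(donors, transcripts, cycles, rna_input_ng))
print(len(conditions), "donor/transcript/cycle/input combinations")        # 400

# Combinations reported as optimal for good-quality samples (each then diluted 20x and 40x):
optimal = [(20, 15), (5, 18), (20, 18)]   # (ng total RNA, pre-amplification cycles)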
Chen, Zhidan; Coy, Stephen L; Pannkuk, Evan L; Laiakis, Evagelia C; Fornace, Albert J; Vouros, Paul
2018-05-07
High-throughput methods to assess radiation exposure are a priority due to concerns that include nuclear power accidents, the spread of nuclear weapon capability, and the risk of terrorist attacks. Metabolomics, the assessment of small molecules in an easily accessible sample, is the most recent method to be applied for the identification of biomarkers of the biological radiation response with a useful dose-response profile. Profiling for biomarker identification is frequently done using an LC-MS platform which has limited throughput due to the time-consuming nature of chromatography. We present here a chromatography-free simplified method for quantitative analysis of seven metabolites in urine with radiation dose-response using urine samples provided from the Pannkuk et al. (2015) study of long-term (7-day) radiation response in nonhuman primates (NHP). The stable isotope dilution (SID) analytical method consists of sample preparation by strong cation exchange-solid phase extraction (SCX-SPE) to remove interferences and concentrate the metabolites of interest, followed by differential mobility spectrometry (DMS) ion filtration to select the ion of interest and reduce chemical background, followed by mass spectrometry (overall SID-SPE-DMS-MS). Since no chromatography is used, calibration curves were prepared rapidly, in under 2 h (including SPE) for six simultaneously analyzed radiation biomarkers. The seventh, creatinine, was measured separately after 2500× dilution. Creatinine plays a dual role, measuring kidney glomerular filtration rate (GFR), and indicating kidney damage at high doses. The current quantitative method using SID-SPE-DMS-MS provides throughput which is 7.5 to 30 times higher than that of LC-MS and provides a path to pre-clinical radiation dose estimation. Graphical Abstract.
NASA Astrophysics Data System (ADS)
Chen, Zhidan; Coy, Stephen L.; Pannkuk, Evan L.; Laiakis, Evagelia C.; Fornace, Albert J.; Vouros, Paul
2018-05-01
High-throughput methods to assess radiation exposure are a priority due to concerns that include nuclear power accidents, the spread of nuclear weapon capability, and the risk of terrorist attacks. Metabolomics, the assessment of small molecules in an easily accessible sample, is the most recent method to be applied for the identification of biomarkers of the biological radiation response with a useful dose-response profile. Profiling for biomarker identification is frequently done using an LC-MS platform which has limited throughput due to the time-consuming nature of chromatography. We present here a chromatography-free simplified method for quantitative analysis of seven metabolites in urine with radiation dose-response using urine samples provided from the Pannkuk et al. (2015) study of long-term (7-day) radiation response in nonhuman primates (NHP). The stable isotope dilution (SID) analytical method consists of sample preparation by strong cation exchange-solid phase extraction (SCX-SPE) to remove interferences and concentrate the metabolites of interest, followed by differential mobility spectrometry (DMS) ion filtration to select the ion of interest and reduce chemical background, followed by mass spectrometry (overall SID-SPE-DMS-MS). Since no chromatography is used, calibration curves were prepared rapidly, in under 2 h (including SPE) for six simultaneously analyzed radiation biomarkers. The seventh, creatinine, was measured separately after 2500× dilution. Creatinine plays a dual role, measuring kidney glomerular filtration rate (GFR), and indicating kidney damage at high doses. The current quantitative method using SID-SPE-DMS-MS provides throughput which is 7.5 to 30 times higher than that of LC-MS and provides a path to pre-clinical radiation dose estimation.
Optimal Time Advance In Terminal Area Arrivals: Throughput vs. Fuel Savings
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.; Swenson, Harry N.; Haskell, William B.; Rakas, Jasenka
2011-01-01
The current operational practice in scheduling air traffic arriving at an airport is to adjust flight schedules by delay, i.e. a postponement of an aircraft's arrival at a scheduled location, to safely manage the FAA-mandated separation constraints between aircraft. To meet the observed and forecast growth in traffic demand, however, the practice of time advance (speeding up an aircraft toward a scheduled location) is envisioned for future operations as a practice additional to delay. Time advance has two potential advantages. The first is the capability to minimize, or at least reduce, the excess separation (the distances between pairs of aircraft immediately in-trail) and thereby to increase the throughput of the arriving traffic. The second is to reduce the total traffic delay when the traffic sample is below saturation density. A cost associated with time advance is the fuel expenditure required by an aircraft to speed up. We present an optimal control model of air traffic arriving in a terminal area and solve it using the Pontryagin Maximum Principle. The admissible controls allow time advance, as well as delay, some of the way. The cost function reflects the trade-off between minimizing two competing objectives: excess separation (negatively correlated with throughput) and fuel burn. A number of instances are solved using three different methods, to demonstrate consistency of solutions.
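A schematic form of this trade-off, written only to make the structure of the objective concrete (the symbols and weights below are illustrative, not the authors' exact formulation), is

\min_{u}\;\int_{0}^{T}\Big[\,\alpha\sum_{i}\big(d_{i}(t)-d_{\min}\big)\;+\;\beta\sum_{i}f_{i}\big(u_{i}(t)\big)\Big]\,dt,

where d_i(t) is the in-trail separation of aircraft pair i, d_min the mandated minimum, u_i(t) the speed control (u_i > 0 a time advance, u_i < 0 a delay), f_i the fuel cost of deviating from nominal speed, and alpha, beta the weights; the Pontryagin Maximum Principle then supplies the necessary conditions for a minimizing control.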
Perera, Rushini S.; Ding, Xavier C.; Tully, Frank; Oliver, James; Bright, Nigel; Bell, David; Chiodini, Peter L.; Gonzalez, Iveth J.; Polley, Spencer D.
2017-01-01
Background: Accurate and efficient detection of sub-microscopic malaria infections is crucial for enabling rapid treatment and interruption of transmission. Commercially available malaria LAMP kits have excellent diagnostic performance, though throughput is limited by the need to prepare samples individually. Here, we evaluate the clinical performance of a newly developed high throughput (HTP) sample processing system for use in conjunction with the Eiken malaria LAMP kit. Methods: The HTP system utilised dried blood spots (DBS) and liquid whole blood (WB), with parallel processing of 94 samples per run. The system was evaluated using 699 samples of known infection status pre-determined by gold standard nested PCR. Results: The sensitivity and specificity of WB-HTP-LAMP were 98.6% (95% CI, 95.7–100) and 99.7% (95% CI, 99.2–100), respectively; the sensitivity of DBS-HTP-LAMP was 97.1% (95% CI, 93.1–100) and its specificity was 100%, against PCR. At parasite densities greater than or equal to 2 parasites/μL, WB and DBS HTP-LAMP showed 100% sensitivity and specificity against PCR. At densities less than 2 p/μL, WB-HTP-LAMP sensitivity was 88.9% (95% CI, 77.1–100) and specificity was 99.7% (95% CI, 99.2–100); the sensitivity and specificity of DBS-HTP-LAMP were 77.8% (95% CI, 54.3–99.5) and 100%, respectively. Conclusions: The HTP-LAMP system is a highly sensitive diagnostic test, with the potential to allow large scale population screening in malaria elimination campaigns. PMID:28166235
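For readers recomputing figures like those above from raw 2x2 counts, a minimal sketch of sensitivity and specificity with normal-approximation 95% confidence intervals (the counts in the example are hypothetical, not the study's data, and published studies may use exact binomial intervals instead):

import math

def sens_spec_ci(tp, fn, tn, fp, z=1.96):
    """Sensitivity, specificity and Wald 95% CIs from 2x2 counts against a reference test."""
    def prop_ci(k, n):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)
    return {"sensitivity": prop_ci(tp, tp + fn),
            "specificity": prop_ci(tn, tn + fp)}

# Hypothetical counts for illustration only (PCR taken as the reference standard):
print(sens_spec_ci(tp=68, fn=1, tn=628, fp=2))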
Mason, Annaliese S; Zhang, Jing; Tollenaere, Reece; Vasquez Teuber, Paula; Dalton-Morgan, Jessica; Hu, Liyong; Yan, Guijun; Edwards, David; Redden, Robert; Batley, Jacqueline
2015-09-01
Germplasm collections provide an extremely valuable resource for breeders and researchers. However, misclassification of accessions by species often hinders the effective use of these collections. We propose that use of high-throughput genotyping tools can provide a fast, efficient and cost-effective way of confirming species in germplasm collections, as well as providing valuable genetic diversity data. We genotyped 180 Brassicaceae samples sourced from the Australian Grains Genebank across the recently released Illumina Infinium Brassica 60K SNP array. Of these, 76 were provided on the basis of suspected misclassification and another 104 were sourced independently from the germplasm collection. Presence of the A- and C-genomes combined with principal components analysis clearly separated Brassica rapa, B. oleracea, B. napus, B. carinata and B. juncea samples into distinct species groups. Several lines were further validated using chromosome counts. Overall, 18% of samples (32/180) were misclassified on the basis of species. Within these 180 samples, 23/76 (30%) supplied on the basis of suspected misclassification were misclassified, and 9/105 (9%) of the samples randomly sourced from the Australian Grains Genebank were misclassified. Surprisingly, several individuals were also found to be the product of interspecific hybridization events. The SNP (single nucleotide polymorphism) array proved effective at confirming species, and provided useful information related to genetic diversity. As similar genomic resources become available for different crops, high-throughput molecular genotyping will offer an efficient and cost-effective method to screen germplasm collections worldwide, facilitating more effective use of these valuable resources by breeders and researchers. © 2015 John Wiley & Sons Ltd.
3D imaging of optically cleared tissue using a simplified CLARITY method and on-chip microscopy
Zhang, Yibo; Shin, Yoonjung; Sung, Kevin; Yang, Sam; Chen, Harrison; Wang, Hongda; Teng, Da; Rivenson, Yair; Kulkarni, Rajan P.; Ozcan, Aydogan
2017-01-01
High-throughput sectioning and optical imaging of tissue samples using traditional immunohistochemical techniques can be costly and inaccessible in resource-limited areas. We demonstrate three-dimensional (3D) imaging and phenotyping in optically transparent tissue using lens-free holographic on-chip microscopy as a low-cost, simple, and high-throughput alternative to conventional approaches. The tissue sample is passively cleared using a simplified CLARITY method and stained using 3,3′-diaminobenzidine to target cells of interest, enabling bright-field optical imaging and 3D sectioning of thick samples. The lens-free computational microscope uses pixel super-resolution and multi-height phase recovery algorithms to digitally refocus throughout the cleared tissue and obtain a 3D stack of complex-valued images of the sample, containing both phase and amplitude information. We optimized the tissue-clearing and imaging system by finding the optimal illumination wavelength, tissue thickness, sample preparation parameters, and the number of heights of the lens-free image acquisition and implemented a sparsity-based denoising algorithm to maximize the imaging volume and minimize the amount of the acquired data while also preserving the contrast-to-noise ratio of the reconstructed images. As a proof of concept, we achieved 3D imaging of neurons in a 200-μm-thick cleared mouse brain tissue over a wide field of view of 20.5 mm2. The lens-free microscope also achieved more than an order-of-magnitude reduction in raw data compared to a conventional scanning optical microscope imaging the same sample volume. Being low cost, simple, high-throughput, and data-efficient, we believe that this CLARITY-enabled computational tissue imaging technique could find numerous applications in biomedical diagnosis and research in low-resource settings. PMID:28819645
Xiang, Chengxiang; Haber, Joel; Marcin, Martin; Mitrovic, Slobodan; Jin, Jian; Gregoire, John M
2014-03-10
Combinatorial synthesis and screening of light absorbers are critical to material discoveries for photovoltaic and photoelectrochemical applications. One of the most effective ways to evaluate the energy-conversion properties of a semiconducting light absorber is to form an asymmetric junction and investigate the photogeneration, transport and recombination processes at the semiconductor interface. This standard photoelectrochemical measurement is readily made on a semiconductor sample with a back-side metallic contact (working electrode) and front-side solution contact. In a typical combinatorial material library, each sample shares a common back contact, requiring novel instrumentation to provide spatially resolved and thus sample-resolved measurements. We developed a multiplexing counter electrode with a thin layer assembly, in which a rectifying semiconductor/liquid junction was formed and the short-circuit photocurrent was measured under chopped illumination for each sample in a material library. The multiplexing counter electrode assembly demonstrated a photocurrent sensitivity of sub-10 μA cm-2 with an external quantum yield sensitivity of 0.5% for each semiconductor sample under a monochromatic ultraviolet illumination source. The combination of cell architecture and multiplexing allows high-throughput modes of operation, including both fast-serial and parallel measurements. To demonstrate the performance of the instrument, the external quantum yields of 1819 different compositions from a pseudoquaternary metal oxide library, (Fe-Zn-Sn-Ti)Ox, at 385 nm were collected in scanning serial mode with a throughput as fast as 1 s per sample. Preliminary screening results identified a promising ternary composition region centered at Fe0.894Sn0.103Ti0.0034Ox, with an external quantum yield of 6.7% at 385 nm.
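The external quantum yield screened here is essentially the ratio of collected electrons to incident photons at the illumination wavelength; a minimal sketch of that conversion (the photocurrent and irradiance values in the example are illustrative, not measurements from the study):

# External quantum yield (EQY) from short-circuit photocurrent density and monochromatic
# irradiance: EQY = (J_ph / e) / (P_opt * lambda / (h * c)).
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
E = 1.602e-19   # elementary charge, C

def external_quantum_yield(j_ph_a_cm2, irradiance_w_cm2, wavelength_nm=385.0):
    photon_flux = irradiance_w_cm2 * (wavelength_nm * 1e-9) / (H * C)  # photons s^-1 cm^-2
    electron_flux = j_ph_a_cm2 / E                                     # electrons s^-1 cm^-2
    return electron_flux / photon_flux

# Illustrative values only: 50 uA/cm^2 photocurrent under 2.5 mW/cm^2 illumination at 385 nm.
print(f"EQY ~ {external_quantum_yield(50e-6, 2.5e-3):.1%}")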
Gómez-Ríos, Germán Augusto; Liu, Chang; Tascon, Marcos; Reyes-Garcés, Nathaly; Arnold, Don W; Covey, Thomas R; Pawliszyn, Janusz
2017-04-04
In recent years, the direct coupling of solid phase microextraction (SPME) and mass spectrometry (MS) has shown its great potential to improve limits of quantitation, accelerate analysis throughput, and diminish potential matrix effects when compared to direct injection to MS. In this study, we introduce the open port probe (OPP) as a robust interface to couple biocompatible SPME (Bio-SPME) fibers to MS systems for direct electrospray ionization. The presented design consisted of minimal alterations to the front-end of the instrument and provided better sensitivity, simplicity, speed, wider compound coverage, and high throughput in comparison to the LC-MS based approach. Quantitative determination of clenbuterol, fentanyl, and buprenorphine was successfully achieved in human urine. Despite the use of short extraction/desorption times (5 min/5 s), limits of quantitation below the minimum required performance levels (MRPL) set by the World Anti-Doping Agency (WADA) were obtained with good accuracy (≥90%) and linearity (R2 > 0.99) over the range evaluated for all analytes using sample volumes of 300 μL. In-line technologies such as multiple reaction monitoring with multistage fragmentation (MRM3) and differential mobility spectrometry (DMS) were used to enhance the selectivity of the method without compromising analysis speed. On the basis of calculations, once coupled to high throughput, this method can potentially yield preparation times as low as 15 s per sample based on the 96-well plate format. Our results demonstrated that Bio-SPME-OPP-MS efficiently integrates sampling/sample cleanup and atmospheric pressure ionization, making it an advantageous configuration for several bioanalytical applications, including doping in sports, in vivo tissue sampling, and therapeutic drug monitoring.
NASA Astrophysics Data System (ADS)
Lagus, Todd P.; Edd, Jon F.
2013-03-01
Most cell biology experiments are performed in bulk cell suspensions where cell secretions become diluted and mixed in a contiguous sample. Confinement of single cells to small, picoliter-sized droplets within a continuous phase of oil provides chemical isolation of each cell, creating individual microreactors where rare cell qualities are highlighted and otherwise undetectable signals can be concentrated to measurable levels. Recent work in microfluidics has yielded methods for the encapsulation of cells in aqueous droplets and hydrogels at kilohertz rates, creating the potential for millions of parallel single-cell experiments. However, commercial applications of high-throughput microdroplet generation and downstream sensing and actuation methods are still emerging for cells. Using fluorescence-activated cell sorting (FACS) as a benchmark for commercially available high-throughput screening, this focused review discusses the fluid physics of droplet formation, methods for cell encapsulation in liquids and hydrogels, sensors and actuators and notable biological applications of high-throughput single-cell droplet microfluidics.
Draveling, C; Ren, L; Haney, P; Zeisse, D; Qoronfleh, M W
2001-07-01
The revolution in genomics and proteomics is having a profound impact on drug discovery. Today's protein scientist demands a faster, easier, more reliable way to purify proteins. A new high-capacity, high-throughput technology has been developed at Perbio Sciences for affinity protein purification. This technology utilizes selected chromatography media that are dehydrated to form uniform aggregates. The SwellGel aggregates will instantly rehydrate upon addition of the protein sample, allowing purification and direct performance of multiple assays in a variety of formats. SwellGel technology has greater stability and is easier to handle than standard wet chromatography resins. The microplate format of this technology provides high-capacity, high-throughput features, recovering milligram quantities of protein suitable for high-throughput screening or biophysical/structural studies. Data will be presented applying SwellGel technology to recombinant 6x His-tagged protein and glutathione-S-transferase (GST) fusion protein purification. Copyright 2001 Academic Press.
TCP Throughput Profiles Using Measurements over Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata
Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
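The ramp-up/sustainment picture can be illustrated with a toy two-phase model (a sketch under our own simplifying assumptions, not the authors' model): as long as the slow-start ramp is short relative to the transfer, mean throughput decreases only slowly with RTT, which is the concave behaviour described above.

import math

def mean_throughput_gbps(rtt_ms, peak_gbps=9.5, transfer_gbit=800.0, init_window_gbit=0.001):
    """Toy two-phase TCP model: exponential slow-start ramp, then sustained peak rate."""
    rtt = rtt_ms / 1e3
    bdp = peak_gbps * rtt                                      # bandwidth-delay product, Gbit
    n_rtts = max(0.0, math.log2(bdp / init_window_gbit))       # RTTs to ramp cwnd up to the BDP
    ramp_time = n_rtts * rtt
    ramp_data = min(transfer_gbit, max(0.0, bdp - init_window_gbit))  # ~ data sent while ramping
    sustain_time = max(0.0, transfer_gbit - ramp_data) / peak_gbps
    return transfer_gbit / (ramp_time + sustain_time)

for rtt in (1, 10, 50, 100, 200, 366):
    print(f"RTT {rtt:>3} ms -> ~{mean_throughput_gbps(rtt):.2f} Gbps")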
AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.
As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...
Shimada, Tsutomu; Kelly, Joan; LaMarr, William A; van Vlies, Naomi; Yasuda, Eriko; Mason, Robert W.; Mackenzie, William; Kubaski, Francyne; Giugliani, Roberto; Chinen, Yasutsugu; Yamaguchi, Seiji; Suzuki, Yasuyuki; Orii, Kenji E.; Fukao, Toshiyuki; Orii, Tadao; Tomatsu, Shunji
2014-01-01
Mucopolysaccharidoses (MPS) are caused by deficiency of one of a group of specific lysosomal enzymes, resulting in excessive accumulation of glycosaminoglycans (GAGs). We previously developed GAG assay methods using liquid chromatography tandem mass spectrometry (LC-MS/MS); however, it takes 4–5 min per sample for analysis. For the large numbers of samples in a screening program, a more rapid process is desirable. The automated high-throughput mass spectrometry (HT-MS/MS) system (RapidFire) integrates a solid phase extraction robot to concentrate and desalt samples prior to introduction into the MS/MS without chromatographic separation; thereby allowing each sample to be processed within ten seconds (enabling screening of more than one million samples per year). The aim of this study was to develop a higher throughput system to assay heparan sulfate (HS) using HT-MS/MS, and to compare its reproducibility, sensitivity and specificity with conventional LC-MS/MS. HS levels were measured in blood (plasma and serum) from control subjects and patients with MPS II, III, or IV and in dried blood spots (DBS) from newborn controls and patients with MPS I, II, or III. Results obtained from HT-MS/MS showed 1) that there was a strong correlation of levels of disaccharides derived from HS in blood, between those calculated using conventional LC-MS/MS and HT-MS/MS, 2) that levels of HS in blood were significantly elevated in patients with MPS II and III, but not in MPS IVA, 3) that the level of HS in patients with a severe form of MPS II was higher than that in an attenuated form, 4) that reduction of blood HS level was observed in MPS II patients treated with enzyme replacement therapy or hematopoietic stem cell transplantation, and 5) that levels of HS in newborn DBS were elevated in patients with MPS I, II or III, compared to control newborns. In conclusion, HT-MS/MS provides much higher throughput than LC-MS/MS-based methods with similar sensitivity and specificity in an HS assay, indicating that HT-MS/MS may be feasible for diagnosis, monitoring, and newborn screening of MPS. PMID:25092413
Abdiche, Yasmina Noubia; Miles, Adam; Eckman, Josh; Foletti, Davide; Van Blarcom, Thomas J.; Yeung, Yik Andy; Pons, Jaume; Rajpal, Arvind
2014-01-01
Here, we demonstrate how array-based label-free biosensors can be applied to the multiplexed interaction analysis of large panels of analyte/ligand pairs, such as the epitope binning of monoclonal antibodies (mAbs). In this application, the larger the number of mAbs that are analyzed for cross-blocking in a pairwise and combinatorial manner against their specific antigen, the higher the probability of discriminating their epitopes. Since cross-blocking of two mAbs is necessary but not sufficient for them to bind an identical epitope, high-resolution epitope binning analysis determined by high-throughput experiments can enable the identification of mAbs with similar but unique epitopes. We demonstrate that a mAb's epitope and functional activity are correlated, thereby strengthening the relevance of epitope binning data to the discovery of therapeutic mAbs. We evaluated two state-of-the-art label-free biosensors that enable the parallel analysis of 96 unique analyte/ligand interactions and nearly ten thousand total interactions per unattended run. The IBIS-MX96 is a microarray-based surface plasmon resonance imager (SPRi) integrated with continuous flow microspotting technology whereas the Octet-HTX is equipped with disposable fiber optic sensors that use biolayer interferometry (BLI) detection. We compared their throughput, versatility, ease of sample preparation, and sample consumption in the context of epitope binning assays. We conclude that the main advantages of the SPRi technology are its exceptionally low sample consumption, facile sample preparation, and unparalleled unattended throughput. In contrast, the BLI technology is highly flexible because it allows for the simultaneous interaction analysis of 96 independent analyte/ligand pairs, ad hoc sensor replacement and on-line reloading of an analyte- or ligand-array. Thus, the complementary use of these two platforms can expedite applications that are relevant to the discovery of therapeutic mAbs, depending upon the sample availability, and the number and diversity of the interactions being studied. PMID:24651868
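As an illustration of how pairwise cross-blocking data are turned into epitope bins, the sketch below groups mAbs with identical blocking profiles; the four-antibody matrix is hypothetical, and grouping by identical profile is one common convention rather than the exact procedure used on these platforms.

from collections import defaultdict

# Symmetric cross-blocking matrix: blocks[i][j] is True if mAb i blocks mAb j
# binding to the antigen (hypothetical data for four mAbs).
mabs = ["mAb1", "mAb2", "mAb3", "mAb4"]
blocks = {
    "mAb1": {"mAb1": True,  "mAb2": True,  "mAb3": False, "mAb4": False},
    "mAb2": {"mAb1": True,  "mAb2": True,  "mAb3": False, "mAb4": False},
    "mAb3": {"mAb1": False, "mAb2": False, "mAb3": True,  "mAb4": True},
    "mAb4": {"mAb1": False, "mAb2": False, "mAb3": True,  "mAb4": True},
}

# Bin mAbs whose blocking profiles across the whole panel are identical.
bins = defaultdict(list)
for mab in mabs:
    profile = tuple(blocks[mab][other] for other in mabs)
    bins[profile].append(mab)

for i, members in enumerate(bins.values(), 1):
    print(f"epitope bin {i}: {members}")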
Evaluation of sequencing approaches for high-throughput ...
Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. We present the evaluation of three toxicogenomics platforms for potential application to high-throughput screening: 1. TempO-Seq utilizing custom designed paired probes per gene; 2. Targeted sequencing (TSQ) utilizing Illumina’s TruSeq RNA Access Library Prep Kit containing tiled exon-specific probe sets; 3. Low coverage whole transcriptome sequencing (LSQ) using Illumina’s TruSeq Stranded mRNA Kit. Each platform was required to cover the ~20,000 genes of the full transcriptome, operate directly with cell lysates, and be automatable with 384-well plates. Technical reproducibility was assessed using MAQC control RNA samples A and B, while functional utility for chemical screening was evaluated using six treatments at a single concentration after 6 hr in MCF7 breast cancer cells: 10 µM chlorpromazine, 10 µM ciclopriox, 10 µM genistein, 100 nM sirolimus, 1 µM tanespimycin, and 1 µM trichostatin A. All RNA samples and chemical treatments were run with 5 technical replicates. The three platforms achieved different read depths, with the TempO-Seq having ~34M mapped reads per sample, while TSQ and LSQ averaged 20M and 11M aligned reads per sample, respectively. Inter-replicate correlation averaged ≥0.95 for raw log2 expression values i
Morris, Ulrika; Ding, Xavier C.; Jovel, Irina; Msellem, Mwinyi I.; Bergman, Daniel; Islam, Atiqul; Ali, Abdullah S.; Polley, Spencer; Gonzalez, Iveth J.; Mårtensson, Andreas; Björkman, Anders
2017-01-01
Background: New field applicable diagnostic tools are needed for highly sensitive detection of residual malaria infections in pre-elimination settings. Field performance of a high throughput DNA extraction system for loop mediated isothermal amplification (HTP-LAMP) was therefore evaluated for detecting malaria parasites among asymptomatic individuals in Zanzibar. Methods: HTP-LAMP performance was evaluated against real-time PCR on 3008 paired blood samples collected on filter papers in a community-based survey in 2015. Results: The PCR and HTP-LAMP determined malaria prevalences were 1.6% (95%CI 1.3–2.4) and 0.7% (95%CI 0.4–1.1), respectively. The sensitivity of HTP-LAMP compared to PCR was 40.8% (CI95% 27.0–55.8) and the specificity was 99.9% (CI95% 99.8–100). For the PCR positive samples, there was no statistically significant difference between the geometric mean parasite densities among the HTP-LAMP positive (2.5 p/μL, range 0.2–770) and HTP-LAMP negative (1.4 p/μL, range 0.1–7) samples (p = 0.088). Two lab technicians analysed up to 282 samples per day and the HTP-LAMP method was experienced as user friendly. Conclusions: Although field applicable, this high throughput format of LAMP as used here was not sensitive enough to be recommended for detection of asymptomatic low-density infections in areas like Zanzibar, approaching malaria elimination. PMID:28095434
Aydin-Schmidt, Berit; Morris, Ulrika; Ding, Xavier C; Jovel, Irina; Msellem, Mwinyi I; Bergman, Daniel; Islam, Atiqul; Ali, Abdullah S; Polley, Spencer; Gonzalez, Iveth J; Mårtensson, Andreas; Björkman, Anders
2017-01-01
New field applicable diagnostic tools are needed for highly sensitive detection of residual malaria infections in pre-elimination settings. Field performance of a high throughput DNA extraction system for loop mediated isothermal amplification (HTP-LAMP) was therefore evaluated for detecting malaria parasites among asymptomatic individuals in Zanzibar. HTP-LAMP performance was evaluated against real-time PCR on 3008 paired blood samples collected on filter papers in a community-based survey in 2015. The PCR and HTP-LAMP determined malaria prevalences were 1.6% (95%CI 1.3-2.4) and 0.7% (95%CI 0.4-1.1), respectively. The sensitivity of HTP-LAMP compared to PCR was 40.8% (CI95% 27.0-55.8) and the specificity was 99.9% (CI95% 99.8-100). For the PCR positive samples, there was no statistically significant difference between the geometric mean parasite densities among the HTP-LAMP positive (2.5 p/μL, range 0.2-770) and HTP-LAMP negative (1.4 p/μL, range 0.1-7) samples (p = 0.088). Two lab technicians analysed up to 282 samples per day and the HTP-LAMP method was experienced as user friendly. Although field applicable, this high throughput format of LAMP as used here was not sensitive enough to be recommended for detection of asymptomatic low-density infections in areas like Zanzibar, approaching malaria elimination.
Wen Lin; Asko Noormets; John S. King; Ge Sun; Steve McNulty; Jean-Christophe Domec; Lucas Cernusak
2017-01-01
Stable isotope ratios (δ13C and δ18O) of tree-ring α-cellulose are important tools in paleoclimatology, ecology, plant physiology and genetics. The Multiple Sample Isolation System for Solids (MSISS) was a major advance in the tree-ring α-cellulose extraction methods, offering greater throughput and reduced labor input compared to traditional alternatives. However, the...
A high throughput spectral image microscopy system
NASA Astrophysics Data System (ADS)
Gesley, M.; Puri, R.
2018-01-01
A high throughput spectral image microscopy system is configured for rapid detection of rare cells in large populations. To surpass flow cytometry rates and avoid the use of fluorophore tags, a system architecture integrates sample mechanical handling, signal processors, and optics in a non-confocal version of light absorption and scattering spectroscopic microscopy. Spectral images with native contrast do not require the use of exogenous stains to render cells with submicron resolution. Structure may be characterized without restriction to cell clusters of differentiation.
ECONOMICS OF SAMPLE COMPOSITING AS A SCREENING TOOL IN GROUND WATER QUALITY MONITORING
Recent advances in high throughput/automated compositing with robotics/field-screening methods offer seldom-tapped opportunities for achieving cost-reduction in ground water quality monitoring programs. n economic framework is presented in this paper for the evaluation of sample ...
NASA Astrophysics Data System (ADS)
Kudoh, Eisuke; Ito, Haruki; Wang, Zhisen; Adachi, Fumiyuki
In mobile communication systems, high-speed packet data services are in demand. In high-speed data transmission, throughput degrades severely due to inter-path interference (IPI). Recently, we proposed a random transmit power control (TPC) to increase the uplink throughput of DS-CDMA packet mobile communications. In this paper, we apply IPI cancellation in addition to the random TPC. We derive the numerical expression for the received signal-to-interference plus noise power ratio (SINR) and introduce an IPI cancellation factor. We also derive the numerical expression for the system throughput when IPI is ideally cancelled, for comparison with the system throughput evaluated numerically by the Monte Carlo method. Then we evaluate, by Monte Carlo numerical computation, the combined effect of random TPC and IPI cancellation on the uplink throughput of DS-CDMA packet mobile communications.
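A schematic of how an IPI cancellation factor can enter the per-path SINR (the notation is illustrative and not reproduced from the paper): with P_l the received power of the desired user's path l, I_MAI the multiple-access interference, sigma^2 the noise power, and phi in [0, 1] the residual-IPI factor (phi = 0 for ideal cancellation, phi = 1 for no cancellation),

\mathrm{SINR}_{l} \;=\; \frac{P_{l}}{\varphi \sum_{l' \neq l} P_{l'} \;+\; I_{\mathrm{MAI}} \;+\; \sigma^{2}},

so that stronger IPI cancellation (smaller phi) raises the post-Rake SINR and hence the achievable packet throughput, while the random TPC shapes the distribution of the P_l across users.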
The planning and establishment of a sample preparation laboratory for drug discovery
Dufresne, Claude
2000-01-01
Nature has always been a productive source of new drugs. With the advent of high-throughput screening, it has now become possible to rapidly screen large sample collections. In addition to seeking greater diversity from natural product sources (micro-organisms, plants, etc.), fractionation of the crude extracts prior to screening is becoming a more important part of our efforts. As sample preparation protocols become more involved, automation can help to achieve and maintain a desired sample throughput. To address the needs of our screening program, two robotic systems were designed. The first system processes crude extracts all the way to 96-well plates, containing solutions suitable for screening in biological and biochemical assays. The system can dissolve crude extracts, fractionate them on solid-phase extraction cartridges, dry and weigh each fraction, re-dissolve them to a known concentration, and prepare mother plates. The second system replicates mother plates into a number of daughter plates. PMID:18924691
NASA Astrophysics Data System (ADS)
Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun
2016-03-01
Signal transduction events, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation and are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single complex" (mMAPS) approach we proposed can reduce the processing time and sample volume because this system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS, including an integrated microfluidic device, automated stage, and electrical relay, for high-throughput clinical screening. Based on this result, we estimate that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical way to analyze tissue samples in a clinical setting.
Zaromb, Solomon
2004-07-13
Air is sampled at a rate in excess of 100 L/min, preferably at 200-300 L/min, so as to collect therefrom a substantial fraction, i.e., at least 20%, preferably 60-100%, of airborne particulates. A substance of interest (analyte), such as lead, is rapidly solubilized from the collected particulates into a sample of liquid extractant, and the concentration of the analyte in the extractant sample is determined. The high-rate air sampling and particulate collection may be effected with a high-throughput filter cartridge or with a recently developed portable high-throughput liquid-absorption air sampler. Rapid solubilization of lead is achieved by a liquid extractant comprising 0.1-1 M of acetic acid or acetate, preferably at a pH of 5 or less and preferably with inclusion of 1-10% of hydrogen peroxide. Rapid determination of the lead content in the liquid extractant may be effected with a colorimetric or an electroanalytical analyzer.
Kellogg, Elizabeth H; Leaver-Fay, Andrew; Baker, David
2011-03-01
The prediction of changes in protein stability and structure resulting from single amino acid substitutions is both a fundamental test of macromolecular modeling methodology and an important current problem as high throughput sequencing reveals sequence polymorphisms at an increasing rate. In principle, given the structure of a wild-type protein and a point mutation whose effects are to be predicted, an accurate method should recapitulate both the structural changes and the change in the folding-free energy. Here, we explore the performance of protocols which sample an increasing diversity of conformations. We find that surprisingly similar performances in predicting changes in stability are achieved using protocols that involve very different amounts of conformational sampling, provided that the resolution of the force field is matched to the resolution of the sampling method. Methods involving backbone sampling can in some cases closely recapitulate the structural changes accompanying mutations but not surprisingly tend to do more harm than good in cases where structural changes are negligible. Analysis of the outliers in the stability change calculations suggests areas needing particular improvement; these include the balance between desolvation and the formation of favorable buried polar interactions, and unfolded state modeling. Copyright © 2010 Wiley-Liss, Inc.
Zimmerlin, Alfred; Kiffe, Michael
2013-01-01
New enabling MS technologies have made it possible to elucidate metabolic pathways present in ex vivo (blood, bile and/or urine) or in vitro (liver microsomes, hepatocytes and/or S9) samples. When investigating samples from high-throughput assays, the challenge the user now faces is to extract the appropriate information and compile it so that it is understandable to all. Medicinal chemists may then design the next generation of (better) drug candidates, combining the needs for potency and metabolic stability with their synthetic creativity. This review focuses on the comparison of these enabling MS technologies and the IT tools developed for their interpretation.
Allele quantification using molecular inversion probes (MIP)
Wang, Yuker; Moorhead, Martin; Karlin-Neumann, George; Falkowski, Matthew; Chen, Chunnuan; Siddiqui, Farooq; Davis, Ronald W.; Willis, Thomas D.; Faham, Malek
2005-01-01
Detection of genomic copy number changes has been an important research area, especially in cancer. Several high-throughput technologies have been developed to detect these changes. Features that are important for the utility of technologies assessing copy number changes include the ability to interrogate regions of interest at the desired density as well as the ability to differentiate the two homologs. In addition, assessing formaldehyde fixed and paraffin embedded (FFPE) samples allows the utilization of the vast majority of cancer samples. To address these points, we demonstrate the application of molecular inversion probe (MIP) technology to the study of copy number. MIP is a high-throughput genotyping technology capable of interrogating >20 000 single nucleotide polymorphisms in the same tube. We have shown the ability of MIP at this multiplex level to provide copy number measurements while obtaining the allele information. In addition, we have demonstrated a proof of principle for copy number analysis in FFPE samples. PMID:16314297
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gentry, T.; Schadt, C.; Zhou, J.
Microarray technology has the unparalleled potential to simultaneously determine the dynamics and/or activities of most, if not all, of the microbial populations in complex environments such as soils and sediments. Researchers have developed several types of arrays that characterize the microbial populations in these samples based on their phylogenetic relatedness or functional genomic content. Several recent studies have used these microarrays to investigate ecological issues; however, most have only analyzed a limited number of samples, with relatively few experiments utilizing the full high-throughput potential of microarray analysis. This is due in part to the unique analytical challenges that these samples present with regard to sensitivity, specificity, quantitation, and data analysis. This review discusses specific applications of microarrays to microbial ecology research along with some of the latest studies addressing the difficulties encountered during analysis of complex microbial communities within environmental samples. With continued development, microarray technology may ultimately achieve its potential for comprehensive, high-throughput characterization of microbial populations in near real-time.
Kondrashova, Olga; Love, Clare J.; Lunke, Sebastian; Hsu, Arthur L.; Waring, Paul M.; Taylor, Graham R.
2015-01-01
Whilst next generation sequencing can report point mutations in fixed tissue tumour samples reliably, the accurate determination of copy number is more challenging. The conventional Multiplex Ligation-dependent Probe Amplification (MLPA) assay is an effective tool for measurement of gene dosage, but is restricted to around 50 targets due to size resolution of the MLPA probes. By switching from a size-resolved format to a sequence-resolved format, we developed a scalable, high-throughput, quantitative assay. MLPA-seq is capable of detecting deletions, duplications, and amplifications in as little as 5 ng of genomic DNA, including from formalin-fixed paraffin-embedded (FFPE) tumour samples. We show that this method can detect BRCA1, BRCA2, ERBB2 and CCNE1 copy number changes in DNA extracted from snap-frozen and FFPE tumour tissue, with 100% sensitivity and >99.5% specificity. PMID:26569395
An inexpensive autosampler for a DART/TOFMS provides mass spectra from analytes absorbed on 76 cotton swab, wipe samples in 7.5 min. A field sample carrier simplifies sample collection and provides swabs nearly ready for analysis to the lab. Applications of the high throughput pr...
Screening for endocrine-disrupting chemicals (EDCs) requires sensitive, scalable assays. Current high-throughput screening (HTPS) approaches for estrogenic and androgenic activity yield rapid results, but many are not sensitive to physiological hormone concentrations, suggesting ...
Streamlined approaches that use in vitro experimental data to predict chemical toxicokinetics (TK) are increasingly being used to perform risk-based prioritization based upon dosimetric adjustment of high-throughput screening (HTS) data across thousands of chemicals. However, ass...
Niu, Chenqi; Xu, Yuancong; Zhang, Chao; Zhu, Pengyu; Huang, Kunlun; Luo, Yunbo; Xu, Wentao
2018-05-01
As genetically modified (GM) technology develops and genetically modified organisms (GMOs) become more available, GMOs face increasing regulations and pressure to adhere to strict labeling guidelines. A singleplex detection method cannot perform the high-throughput analysis necessary for optimal GMO detection. Combining the advantages of multiplex detection and droplet digital polymerase chain reaction (ddPCR), a single universal primer-multiplex-ddPCR (SUP-M-ddPCR) strategy was proposed for accurate broad-spectrum screening and quantification. The SUP increases efficiency of the primers in PCR and plays an important role in establishing a high-throughput, multiplex detection method. Emerging ddPCR technology has been used for accurate quantification of nucleic acid molecules without a standard curve. Using maize as a reference point, four heterologous sequences ( 35S, NOS, NPTII, and PAT) were selected to evaluate the feasibility and applicability of this strategy. Surprisingly, these four genes cover more than 93% of the transgenic maize lines and serve as preliminary screening sequences. All screening probes were labeled with FAM fluorescence, which allows the signals from the samples with GMO content and those without to be easily differentiated. This fiveplex screening method is a new development in GMO screening. Utilizing an optimal amplification assay, the specificity, limit of detection (LOD), and limit of quantitation (LOQ) were validated. The LOD and LOQ of this GMO screening method were 0.1% and 0.01%, respectively, with a relative standard deviation (RSD) < 25%. This method could serve as an important tool for the detection of GM maize from different processed, commercially available products. Further, this screening method could be applied to other fields that require reliable and sensitive detection of DNA targets.
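The "quantification without a standard curve" property of ddPCR rests on Poisson statistics of droplet occupancy; a minimal sketch of that calculation (the droplet counts and the ~0.85 nL droplet volume below are illustrative assumptions, not values from the study):

import math

def ddpcr_copies_per_ul(positive_droplets, total_droplets, droplet_volume_nl=0.85):
    """Estimate target concentration from the fraction of positive droplets (Poisson model)."""
    p = positive_droplets / total_droplets
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # copies per microliter of reaction

# Illustrative droplet counts; droplet volume is platform-dependent (assumed ~0.85 nL here).
print(f"~{ddpcr_copies_per_ul(2400, 18000):.0f} copies/uL")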
Carmona, Santiago J.; Nielsen, Morten; Schafer-Nielsen, Claus; Mucci, Juan; Altcheh, Jaime; Balouz, Virginia; Tekiel, Valeria; Frasch, Alberto C.; Campetella, Oscar; Buscaglia, Carlos A.; Agüero, Fernán
2015-01-01
Complete characterization of antibody specificities associated to natural infections is expected to provide a rich source of serologic biomarkers with potential applications in molecular diagnosis, follow-up of chemotherapeutic treatments, and prioritization of targets for vaccine development. Here, we developed a highly-multiplexed platform based on next-generation high-density peptide microarrays to map these specificities in Chagas Disease, an exemplar of a human infectious disease caused by the protozoan Trypanosoma cruzi. We designed a high-density peptide microarray containing more than 175,000 overlapping 15mer peptides derived from T. cruzi proteins. Peptides were synthesized in situ on microarray slides, spanning the complete length of 457 parasite proteins with fully overlapped 15mers (1 residue shift). Screening of these slides with antibodies purified from infected patients and healthy donors demonstrated both a high technical reproducibility as well as epitope mapping consistency when compared with earlier low-throughput technologies. Using a conservative signal threshold to classify positive (reactive) peptides we identified 2,031 disease-specific peptides and 97 novel parasite antigens, effectively doubling the number of known antigens and providing a 10-fold increase in the number of fine mapped antigenic determinants for this disease. Finally, further analysis of the chip data showed that optimizing the amount of sequence overlap of displayed peptides can increase the protein space covered in a single chip by at least ∼threefold without sacrificing sensitivity. In conclusion, we show the power of high-density peptide chips for the discovery of pathogen-specific linear B-cell epitopes from clinical samples, thus setting the stage for high-throughput biomarker discovery screenings and proteome-wide studies of immune responses against pathogens. PMID:25922409
Carmona, Santiago J; Nielsen, Morten; Schafer-Nielsen, Claus; Mucci, Juan; Altcheh, Jaime; Balouz, Virginia; Tekiel, Valeria; Frasch, Alberto C; Campetella, Oscar; Buscaglia, Carlos A; Agüero, Fernán
2015-07-01
Complete characterization of antibody specificities associated to natural infections is expected to provide a rich source of serologic biomarkers with potential applications in molecular diagnosis, follow-up of chemotherapeutic treatments, and prioritization of targets for vaccine development. Here, we developed a highly-multiplexed platform based on next-generation high-density peptide microarrays to map these specificities in Chagas Disease, an exemplar of a human infectious disease caused by the protozoan Trypanosoma cruzi. We designed a high-density peptide microarray containing more than 175,000 overlapping 15 mer peptides derived from T. cruzi proteins. Peptides were synthesized in situ on microarray slides, spanning the complete length of 457 parasite proteins with fully overlapped 15 mers (1 residue shift). Screening of these slides with antibodies purified from infected patients and healthy donors demonstrated both a high technical reproducibility as well as epitope mapping consistency when compared with earlier low-throughput technologies. Using a conservative signal threshold to classify positive (reactive) peptides we identified 2,031 disease-specific peptides and 97 novel parasite antigens, effectively doubling the number of known antigens and providing a 10-fold increase in the number of fine mapped antigenic determinants for this disease. Finally, further analysis of the chip data showed that optimizing the amount of sequence overlap of displayed peptides can increase the protein space covered in a single chip by at least ∼ threefold without sacrificing sensitivity. In conclusion, we show the power of high-density peptide chips for the discovery of pathogen-specific linear B-cell epitopes from clinical samples, thus setting the stage for high-throughput biomarker discovery screenings and proteome-wide studies of immune responses against pathogens. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
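The tiling scheme used on these chips (fully overlapped 15 mers with a 1-residue shift) is simple to reproduce; a minimal sketch with a made-up sequence:

def tile_peptides(protein_seq, length=15, shift=1):
    """All overlapping peptides of a given length, stepping one residue at a time."""
    return [protein_seq[i:i + length]
            for i in range(0, len(protein_seq) - length + 1, shift)]

# Hypothetical 25-residue fragment; a real protein of length N yields N - 14 peptides.
fragment = "MKTAYIAKQRQISFVKSHFSRQLEE"
peptides = tile_peptides(fragment)
print(len(peptides), "peptides, e.g.", peptides[:2])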
Cummings, Kevin J.; Warnick, Lorin D.; Schukken, Ynte H.; Siler, Julie D.; Gröhn, Yrjo T.; Davis, Margaret A.; Besser, Tom E.; Wiedmann, Martin
2011-01-01
Data generated using different antimicrobial testing methods often have to be combined, but the equivalence of such results is difficult to assess. Here we compared two commonly used antimicrobial susceptibility testing methods, automated microbroth dilution and agar disk diffusion, for 8 common drugs, using 222 Salmonella isolates of serotypes Newport, Typhimurium, and 4,5,12:i-, which had been isolated from clinical salmonellosis cases among cattle and humans. Isolate classification corresponded well between tests, with 95% overall category agreement. Test results were significantly negatively correlated, and Spearman's correlation coefficients ranged from −0.98 to −0.38. Using Cox's proportional hazards model we determined that for most drugs, a 1 mm increase in zone diameter resulted in an estimated 20%–40% increase in the hazard of growth inhibition. However, additional parameters such as isolation year or serotype often impacted the hazard of growth inhibition as well. Comparison of economical feasibility showed that agar disk diffusion is clearly more cost-effective if the average sample throughput is small but that both methods are comparable at high sample throughput. In conclusion, for the Salmonella serotypes and antimicrobial drugs analyzed here, antimicrobial susceptibility data generated based on either test are qualitatively very comparable, and the current published break points for both methods are in excellent agreement. Economic feasibility clearly depends on the specific laboratory settings, and disk diffusion might be an attractive alternative for certain applications such as surveillance studies. PMID:21877930
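The negative rank correlation between the two methods follows from the inverse relationship between inhibition-zone diameter and MIC; a small sketch of that comparison with hypothetical paired measurements (scipy is assumed to be available):

from scipy.stats import spearmanr

# Hypothetical paired results for one drug: larger zones should track lower MICs.
zone_mm   = [6, 10, 14, 18, 22, 26, 30]     # disk diffusion zone diameters
mic_ug_ml = [64, 32, 16, 8, 2, 1, 0.5]      # microbroth dilution MICs

rho, p_value = spearmanr(zone_mm, mic_ug_ml)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")  # strongly negative, as in the study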
High throughput single cell counting in droplet-based microfluidics.
Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie
2017-05-02
Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at a high-throughput, used to characterize cell encapsulation and cell viability during incubation in droplets.
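Droplet-based counting of this kind leans on Poisson loading statistics; a minimal sketch of estimating mean occupancy and cell concentration from the observed fraction of empty droplets (an illustration under standard Poisson assumptions, not the authors' exact pipeline):

import math

def occupancy_from_empty_fraction(empty_fraction):
    """Mean cells per droplet (lambda) assuming Poisson loading: P(0) = exp(-lambda)."""
    return -math.log(empty_fraction)

def cells_per_ml(empty_fraction, droplet_volume_pl):
    lam = occupancy_from_empty_fraction(empty_fraction)
    return lam / (droplet_volume_pl * 1e-9)   # 1 pL = 1e-9 mL

# Illustrative numbers: 90% empty droplets of 20 pL each.
lam = occupancy_from_empty_fraction(0.90)
print(f"lambda ~ {lam:.3f} cells/droplet, ~{cells_per_ml(0.90, 20):.2e} cells/mL")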
Jackson, Colin R.; Tyler, Heather L.; Millar, Justin J.
2013-01-01
Much of the nutrient cycling and carbon processing in natural environments occurs through the activity of extracellular enzymes released by microorganisms. Thus, measurement of the activity of these extracellular enzymes can give insights into the rates of ecosystem level processes, such as organic matter decomposition or nitrogen and phosphorus mineralization. Assays of extracellular enzyme activity in environmental samples typically involve exposing the samples to artificial colorimetric or fluorometric substrates and tracking the rate of substrate hydrolysis. Here we describe microplate based methods for these procedures that allow the analysis of large numbers of samples within a short time frame. Samples are allowed to react with artificial substrates within 96-well microplates or deep well microplate blocks, and enzyme activity is subsequently determined by absorption or fluorescence of the resulting end product using a typical microplate reader or fluorometer. Such high throughput procedures not only facilitate comparisons between spatially separate sites or ecosystems, but also substantially reduce the cost of such assays by reducing overall reagent volumes needed per sample. PMID:24121617
Jackson, Colin R; Tyler, Heather L; Millar, Justin J
2013-10-01
Much of the nutrient cycling and carbon processing in natural environments occurs through the activity of extracellular enzymes released by microorganisms. Thus, measurement of the activity of these extracellular enzymes can give insights into the rates of ecosystem level processes, such as organic matter decomposition or nitrogen and phosphorus mineralization. Assays of extracellular enzyme activity in environmental samples typically involve exposing the samples to artificial colorimetric or fluorometric substrates and tracking the rate of substrate hydrolysis. Here we describe microplate based methods for these procedures that allow the analysis of large numbers of samples within a short time frame. Samples are allowed to react with artificial substrates within 96-well microplates or deep well microplate blocks, and enzyme activity is subsequently determined by absorption or fluorescence of the resulting end product using a typical microplate reader or fluorometer. Such high throughput procedures not only facilitate comparisons between spatially separate sites or ecosystems, but also substantially reduce the cost of such assays by reducing overall reagent volumes needed per sample.
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step, covering the isolation and concentration of desired components from complex matrices, and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS formats (on- and off-line), sorbents, experimental protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Li, Fumin; Wang, Jun; Jenkins, Rand
2016-05-01
There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, M; Bronk, L; Guan, F
Purpose: To investigate the biologic effects of scanned protons by evenly sampling dose-averaged LET (LETd) values. Methods: Our previous high-throughput clonogenic study demonstrated a distinct relationship between RBE and LETd. However, our initial experimental design resulted in over-sampling the low LETd values in the plateau region of the Bragg curve while under-sampling in the region proximal to the Bragg peak as well as the high LETd values in the distal edge of the Bragg curve. To further examine the relationship between RBE and LETd, we refined the experimental design to more evenly sample proton LETd values from 1 to 20 keV/µm by optimizing the thicknesses of the irradiation jig steps. We used the clonogenic survival as the biological endpoint for the H460 lung cancer cell line cultured in 96-well plates (12 columns by 8 rows). In the irradiation, the 8 wells in each column received a uniform dose-LETd pair. The dose-LETd pairs of the 12 different columns were sampled along the Bragg curve of 81.4 MeV scanned protons. Five peak dose levels from 1.5 Gy to 7.5 Gy were delivered with an increment of 1.5 Gy in the preliminary test. Two 96-well plates were irradiated simultaneously to decrease the statistical uncertainties. Results: In the proximal region, for LETd = 5 keV/µm and 8 keV/µm, we did not observe any distinct differential biologic effects between the survival curves. At the Bragg peak (LETd = 9.5 keV/µm) and in the distal edge, irradiation with increasing LET values resulted in decreasing cell survival. Conclusion: The survival curves from the new experimental design support our previous findings that below 10 keV/µm, the LET effect in cell kill is obscured, but above 10 keV/µm, the biologic effects increase with LETd. Funding Support: U19 CA021239-35 and R21 CA187484-01.
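Clonogenic dose-survival data of this kind are commonly summarized with the linear-quadratic model; the abstract does not state which model the authors fitted, so the sketch below is only a generic illustration of extracting survival-curve parameters from dose and surviving-fraction pairs, using invented values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Linear-quadratic survival model: SF(D) = exp(-(alpha*D + beta*D**2))
def lq(dose, alpha, beta):
    return np.exp(-(alpha * dose + beta * dose**2))

# Hypothetical clonogenic data (dose in Gy, surviving fraction); not from the study
dose = np.array([0.0, 1.5, 3.0, 4.5, 6.0, 7.5])
sf   = np.array([1.0, 0.62, 0.33, 0.15, 0.06, 0.02])

(alpha, beta), _ = curve_fit(lq, dose, sf, p0=(0.2, 0.02))
print(f"alpha = {alpha:.3f} /Gy, beta = {beta:.4f} /Gy^2, "
      f"alpha/beta = {alpha / beta:.1f} Gy")
```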
Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba
2014-11-01
Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.
Drummond, A; Rodrigo, A G
2000-12-01
Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
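sUPGMA itself is not available in standard libraries, but its parent algorithm, UPGMA, corresponds to average-linkage hierarchical clustering on a pairwise distance matrix. A minimal sketch of that baseline step is shown below with a made-up distance matrix; the serial-sample correction described in the abstract is not included.

```python
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage

# Hypothetical pairwise genetic distances between four serially sampled sequences
labels = ["t0_seqA", "t0_seqB", "t1_seqC", "t2_seqD"]
dist = np.array([[0.00, 0.02, 0.05, 0.08],
                 [0.02, 0.00, 0.05, 0.08],
                 [0.05, 0.05, 0.00, 0.06],
                 [0.08, 0.08, 0.06, 0.00]])

# UPGMA = average-linkage clustering on the condensed distance matrix
tree = linkage(squareform(dist), method="average")
print(tree)   # merge order and heights (ultrametric under a molecular clock)
```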
Characterizing the bioactivity of complex environmental samples using high throughput toxicology
Bioassays can be employed to evaluate the integrated effects of complex mixtures of both known and unidentified contaminants present in environmental samples. However, such methods have typically focused on one or a few bioactivities despite the fact that the chemicals in a mixtu...
Utilizing high throughput bioassays to characterize the bioactivity of complex environmental samples
Bioassays can be employed to evaluate the integrated effects of complex mixtures of both known and unidentified contaminants present in environmental samples. However, such methods have typically focused on one or a few bioactivities despite the fact that the chemicals in a mixtu...
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.
Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister
2014-05-01
The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.
Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin
2006-10-13
We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. The sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting, once the procedure has been started and series of samples keep being produced. Four analysts are required to ensure proper performances of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested over more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency and long-term stability regarding the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.
A compressive-sensing Fourier-transform on-chip Raman spectrometer
NASA Astrophysics Data System (ADS)
Podmore, Hugh; Scott, Alan; Lee, Regina
2018-02-01
We demonstrate a novel compressive sensing Fourier-transform spectrometer (FTS) for snapshot Raman spectroscopy in a compact format. The on-chip FTS consists of a set of planar-waveguide Mach-Zehnder interferometers (MZIs) arrayed on a photonic chip, effecting a discrete Fourier-transform of the input spectrum. Incoherence between the sampling domain (time) and the spectral domain (frequency) permits compressive sensing retrieval using undersampled interferograms for sparse spectra such as Raman emission. In our fabricated device we retain our chosen bandwidth and resolution while reducing the number of MZIs, i.e., the size of the interferogram, to 1/4th critical sampling. This architecture simultaneously reduces chip footprint and concentrates the interferogram in fewer pixels to improve the signal-to-noise ratio. Our device collects interferogram samples simultaneously; therefore, a time-gated detector may be used to separate Raman peaks from sample fluorescence. A challenge for FTS waveguide spectrometers is to achieve multi-aperture high throughput broadband coupling to a large number of single-mode waveguides. A multi-aperture design allows one to increase the bandwidth and spectral resolution without sacrificing optical throughput. In this device, multi-aperture coupling is achieved using an array of microlenses bonded to the surface of the chip, and aligned with a grid of vertically illuminated waveguide apertures. The microlens array accepts a collimated beam with near 100% fill-factor, and the resulting spherical wavefronts are coupled into the single-mode waveguides using 45° mirrors etched into the waveguide layer via focused ion-beam (FIB). The interferogram from the waveguide outputs is imaged using a CCD, and inverted via l1-norm minimization to correctly retrieve a sparse input spectrum.
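The l1-norm inversion step can be illustrated with a small iterative soft-thresholding (ISTA) loop that recovers a sparse spectrum from an undersampled set of interferogram samples. The measurement model below (random cosine rows standing in for the MZI transfer functions) and all parameter values are illustrative assumptions, not the device's actual forward model.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 3          # spectral bins, interferogram samples (1/4 of n), sparse peaks

# Sparse "Raman" spectrum: a few isolated peaks
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(0.5, 1.0, k)

# Undersampled cosine measurement matrix (toy stand-in for the MZI OPD sampling)
opd = rng.choice(n, m, replace=False)
A = np.cos(2 * np.pi * np.outer(opd, np.arange(n)) / n) / np.sqrt(m)
y = A @ x_true                 # measured interferogram samples

# ISTA: minimize 0.5*||y - A x||^2 + lam*||x||_1
lam, L = 0.01, np.linalg.norm(A, 2) ** 2   # L = Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    z = x - A.T @ (A @ x - y) / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold

print("recovered peaks:", sorted(np.argsort(x)[-k:]),
      "true peaks:", sorted(np.nonzero(x_true)[0]))
```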
Automated High-Throughput Permethylation for Glycosylation Analysis of Biologics Using MALDI-TOF-MS.
Shubhakar, Archana; Kozak, Radoslaw P; Reiding, Karli R; Royle, Louise; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
2016-09-06
Monitoring glycoprotein therapeutics for changes in glycosylation throughout the drug's life cycle is vital, as glycans significantly modulate the stability, biological activity, serum half-life, safety, and immunogenicity. Biopharma companies are increasingly adopting Quality by Design (QbD) frameworks for measuring, optimizing, and controlling drug glycosylation. Permethylation of glycans prior to analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a valuable tool for glycan characterization and for screening of large numbers of samples in QbD drug realization. However, the existing protocols for manual permethylation and liquid-liquid extraction (LLE) steps are labor intensive and are thus not practical for high-throughput (HT) studies. Here we present a glycan permethylation protocol, based on 96-well microplates, that has been developed into a kit suitable for HT work. The workflow is largely automated using a liquid handling robot and includes N-glycan release, enrichment of N-glycans, permethylation, and LLE. The kit has been validated according to industry analytical performance guidelines and applied to characterize biopharmaceutical samples, including IgG4 monoclonal antibodies (mAbs) and recombinant human erythropoietin (rhEPO). The HT permethylation enabled glycan characterization and relative quantitation with minimal side reactions: the MALDI-TOF-MS profiles obtained were in good agreement with hydrophilic liquid interaction chromatography (HILIC) and ultrahigh performance liquid chromatography (UHPLC) data. Automated permethylation and extraction of 96 glycan samples was achieved in less than 5 h and automated data acquisition on MALDI-TOF-MS took on average less than 1 min per sample. This automated and HT glycan preparation and permethylation proved to be convenient, fast, and reliable and can be applied for drug glycan profiling and clinical glycan biomarker studies.
Lehotay, Steven J; Lightfield, Alan R
2018-01-01
The way to maximize scope of analysis, sample throughput, and laboratory efficiency in the monitoring of veterinary drug residues in food animals is to determine as many analytes as possible as fast as possible in as few methods as possible. Capital and overhead expenses are also reduced by using fewer instruments in the overall monitoring scheme. Traditionally, the highly polar aminoglycoside antibiotics require different chromatographic conditions from other classes of drugs, but in this work, we demonstrate that an ion-pairing reagent (sodium 1-heptanesulfonate) added to the combined final extracts from two sample preparation methods attains good separation of 174 targeted drugs, including 9 aminoglycosides, in the same 10.5-min ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis. The full method was validated in bovine kidney, liver, and muscle tissues according to US regulatory protocols, and 137-146 (79-84%) of the drugs gave between 70 and 120% average recoveries with ≤ 25% RSDs in the different types of tissues spiked at 0.5, 1, and 2 times the regulatory levels of interest (10-1000 ng/g depending on the drug). This method increases sample throughput and the possible number of drugs monitored in the US National Residue Program, and requires only one UHPLC-MS/MS method and instrument for analysis rather than two by the previous scheme. Graphical abstract Outline of the streamlined approach to monitor 174 veterinary drugs, including aminoglycosides, in bovine tissues by combining two extracts of the same sample with an ion-pairing reagent for analysis by UHPLC-MS/MS.
NASA Astrophysics Data System (ADS)
Campbell, T. L.; Geller, J. B.; Heller, P.; Ruiz, G.; Chang, A.; McCann, L.; Ceballos, L.; Marraffini, M.; Ashton, G.; Larson, K.; Havard, S.; Meagher, K.; Wheelock, M.; Drake, C.; Rhett, G.
2016-02-01
The Ballast Water Management Act, the Marine Invasive Species Act, and the Coastal Ecosystem Protection Act require the California Department of Fish and Wildlife to monitor and evaluate the extent of biological invasions in the state's marine and estuarine waters. This has been performed statewide, using a variety of methodologies. Conventional sample collection and processing are laborious, slow and costly, and may require considerable taxonomic expertise and detailed, time-consuming microscopic study of multiple specimens. These factors limit the volume of biomass that can be searched for introduced species. New technologies continue to reduce the cost and increase the throughput of genetic analyses, which become efficient alternatives to traditional morphological analysis for identification, monitoring and surveillance of marine invasive species. Using next-generation sequencing of mitochondrial Cytochrome c oxidase subunit I (COI) and nuclear large subunit ribosomal RNA (LSU), we analyzed over 15,000 individual marine invertebrates collected in Californian waters. We have created sequence databases of California native and non-native species to assist in molecular identification and surveillance in North American waters. Metagenetics, the next-generation sequencing of environmental samples with comparison to DNA sequence databases, is a faster and cost-effective alternative to individual sample analysis. We have sequenced DNA from biomass collected from whole settlement plates and plankton in California harbors, and used our introduced species database to create species lists. We can combine these species lists for individual marinas with collected environmental data, such as temperature, salinity, and dissolved oxygen to understand the ecology of marine invasions. Here we discuss high throughput sampling, sequencing, and COASTLINE, our data analysis answer to challenges working with hundreds of millions of sequencing reads from tens of thousands of specimens.
Doyle, Conor J; Gleeson, David; O'Toole, Paul W; Cotter, Paul D
2017-01-15
In pasture-based systems, changes in dairy herd habitat due to seasonality result in the exposure of animals to different environmental niches. These niches contain distinct microbial communities that may be transferred to raw milk, with potentially important food quality and safety implications for milk producers. It is postulated that the extent to which these microorganisms are transferred could be limited by the inclusion of a teat preparation step prior to milking. High-throughput sequencing on a variety of microbial niches on farms was used to study the patterns of microbial movement through the dairy production chain and, in the process, to investigate the impact of seasonal housing and the inclusion/exclusion of a teat preparation regime on the raw milk microbiota from the same herd over two sampling periods, i.e., indoor and outdoor. Beta diversity and network analyses showed that environmental and milk microbiotas separated depending on whether they were sourced from an indoor or outdoor environment. Within these respective habitats, similarities between the milk microbiota and that of teat swab samples and, to a lesser extent, fecal samples were apparent. Indeed, SourceTracker identified the teat surface as the most significant source of contamination, with herd feces being the next most prevalent source of contamination. In milk from cows grazing outdoors, teat prep significantly increased the numbers of total bacteria present. In summary, sequence-based microbiota analysis identified possible sources of raw milk contamination and highlighted the influence of environment and farm management practices on the raw milk microbiota. The composition of the raw milk microbiota is an important consideration from both a spoilage perspective and a food safety perspective and has implications for milk targeted for direct consumption and for downstream processing. Factors that influence contamination have been examined previously, primarily through the use of culture-based techniques. We describe here the extensive application of high-throughput DNA sequencing technologies to study the relationship between the milk production environment and the raw milk microbiota. The results show that the environment in which the herd was kept was the primary driver of the composition of the milk microbiota. Copyright © 2016 American Society for Microbiology.
Doyle, Conor J.; Gleeson, David; O'Toole, Paul W.
2016-01-01
ABSTRACT In pasture-based systems, changes in dairy herd habitat due to seasonality result in the exposure of animals to different environmental niches. These niches contain distinct microbial communities that may be transferred to raw milk, with potentially important food quality and safety implications for milk producers. It is postulated that the extent to which these microorganisms are transferred could be limited by the inclusion of a teat preparation step prior to milking. High-throughput sequencing on a variety of microbial niches on farms was used to study the patterns of microbial movement through the dairy production chain and, in the process, to investigate the impact of seasonal housing and the inclusion/exclusion of a teat preparation regime on the raw milk microbiota from the same herd over two sampling periods, i.e., indoor and outdoor. Beta diversity and network analyses showed that environmental and milk microbiotas separated depending on whether they were sourced from an indoor or outdoor environment. Within these respective habitats, similarities between the milk microbiota and that of teat swab samples and, to a lesser extent, fecal samples were apparent. Indeed, SourceTracker identified the teat surface as the most significant source of contamination, with herd feces being the next most prevalent source of contamination. In milk from cows grazing outdoors, teat prep significantly increased the numbers of total bacteria present. In summary, sequence-based microbiota analysis identified possible sources of raw milk contamination and highlighted the influence of environment and farm management practices on the raw milk microbiota. IMPORTANCE The composition of the raw milk microbiota is an important consideration from both a spoilage perspective and a food safety perspective and has implications for milk targeted for direct consumption and for downstream processing. Factors that influence contamination have been examined previously, primarily through the use of culture-based techniques. We describe here the extensive application of high-throughput DNA sequencing technologies to study the relationship between the milk production environment and the raw milk microbiota. The results show that the environment in which the herd was kept was the primary driver of the composition of the milk microbiota. PMID:27815277
2014-01-01
Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312
Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang
2014-03-05
RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
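The parallel processing described for eRNA is, at its core, per-sample fan-out across CPU cores. The sketch below shows only that generic pattern with Python's multiprocessing; the processing function, input directory, and worker count are placeholders and are not part of eRNA itself.

```python
from multiprocessing import Pool
from pathlib import Path

def process_sample(fastq: Path) -> str:
    """Placeholder for one sample's pipeline step (trimming, alignment, counting)."""
    n_reads = sum(1 for _ in fastq.open()) // 4   # FASTQ stores 4 lines per read
    return f"{fastq.name}: {n_reads} reads"

if __name__ == "__main__":
    samples = sorted(Path("fastq_dir").glob("*.fastq"))   # hypothetical input directory
    with Pool(processes=4) as pool:                        # one worker per core
        for result in pool.imap_unordered(process_sample, samples):
            print(result)
```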
High-throughput determination of biochemical oxygen demand (BOD) by a microplate-based biosensor.
Pang, Hei-Leung; Kwok, Nga-Yan; Chan, Pak-Ho; Yeung, Chi-Hung; Lo, Waihung; Wong, Kwok-Yin
2007-06-01
The use of the conventional 5-day biochemical oxygen demand (BOD5) method in BOD determination is greatly hampered by its time-consuming sampling procedure and its technical difficulty in the handling of a large pool of wastewater samples. Thus, it is highly desirable to develop a fast and high-throughput biosensor for BOD measurements. This paper describes the construction of a microplate-based biosensor consisting of an organically modified silica (ORMOSIL) oxygen sensing film for high-throughput determination of BOD in wastewater. The ORMOSIL oxygen sensing film was prepared by reacting tetramethoxysilane with dimethyldimethoxysilane in the presence of the oxygen-sensitive dye tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) chloride. The silica composite formed a homogeneous, crack-free oxygen sensing film on polystyrene microtiter plates with high stability, and the embedded ruthenium dye interacted with the dissolved oxygen in wastewater according to the Stern-Volmer relation. The bacterium Stenotrophomonas maltophilia was loaded into the ORMOSIL/PVA composite (deposited on the top of the oxygen sensing film) and used to metabolize the organic compounds in wastewater. This BOD biosensor was found to be able to determine the BOD values of wastewater samples within 20 min by monitoring the dissolved oxygen concentrations. Moreover, the BOD values determined by the BOD biosensor were in good agreement with those obtained by the conventional BOD5 method.
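The Stern-Volmer relation mentioned above maps the measured luminescence quenching to dissolved-oxygen concentration, which is then tracked over the incubation to score BOD. A minimal sketch follows; the Stern-Volmer constant and intensity values are invented for illustration.

```python
# Stern-Volmer quenching: I0 / I = 1 + Ksv * [O2]  =>  [O2] = (I0 / I - 1) / Ksv
K_SV = 0.35          # hypothetical Stern-Volmer constant, 1/(mg/L)
I0 = 1000.0          # luminescence intensity in fully deoxygenated water

def dissolved_oxygen(intensity: float) -> float:
    """Dissolved O2 (mg/L) inferred from the ruthenium-dye emission intensity."""
    return (I0 / intensity - 1.0) / K_SV

# Oxygen depletion during the 20-min microbial incubation drives the BOD estimate
for i in (1000.0, 700.0, 500.0):
    print(f"I = {i:6.1f}  ->  [O2] = {dissolved_oxygen(i):.2f} mg/L")
```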
Segat, Ludovica; Padovan, Lara; Doc, Darja; Petix, Vincenzo; Morgutti, Marcello; Crovella, Sergio; Ricci, Giuseppe
2012-12-01
We describe a real-time polymerase chain reaction (PCR) protocol based on the fluorescent molecule SYBR Green chemistry, for a low- to medium-throughput analysis of Y-chromosome microdeletions, optimized according to the European guidelines and aimed at making the protocol faster, avoiding post-PCR processing, and simplifying the results interpretation. We screened 156 men from the Assisted Reproduction Unit, Department of Obstetrics and Gynecology, Institute for Maternal and Child Health IRCCS Burlo Garofolo (Trieste, Italy), 150 not presenting Y-chromosome microdeletion, and 6 with microdeletions in different azoospermic factor (AZF) regions. For each sample, the Zinc finger Y-chromosomal protein (ZFY), sex-determining region Y (SRY), sY84, sY86, sY127, sY134, sY254, and sY255 loci were analyzed by performing one reaction for each locus. AZF microdeletions were successfully detected in six individuals, confirming the results obtained with commercial kits. Our real-time PCR protocol proved to be a rapid, safe, and relatively cheap method that was suitable for a low- to medium-throughput diagnosis of Y-chromosome microdeletion, which allows an analysis of approximately 10 samples (with the addition of positive and negative controls) in a 96-well plate format, or approximately 46 samples in a 384-well plate for all markers simultaneously, in less than 2 h without the need of post-PCR manipulation.
NASA Astrophysics Data System (ADS)
Hayasaki, Yoshio
2017-02-01
Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points will be required for fabricating actual structures at the millimeter scale, and therefore, the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method offers advantages such as high throughput, high light use efficiency, and variable, instantaneous, and 3D patterning. Furthermore, the use of an SLM makes it possible to correct unknown imperfections of the optical system and inhomogeneity in a sample using in-system optimization of the CGH. In addition, the CGH can adaptively compensate for dynamic, unpredictable mechanical movements, air and liquid disturbances, and shape variation and deformation of the target sample, as well as provide adaptive wavefront control under environmental changes. Therefore, it is a powerful tool for the processing of biological cells and tissues, because they have free-form, variable, and deformable structures. In this paper, we present the principle and the experimental setup of holographic femtosecond laser processing, and an effective way of processing biological samples. We demonstrate femtosecond laser processing of biological materials and the resulting processing properties.
Evaluation of a High Throughput Starch Analysis Optimised for Wood
Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco
2014-01-01
Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 21 µg and 40 mg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863
NASA Astrophysics Data System (ADS)
Cannon, M. V.; Hester, J.; Shalkhauser, A.; Chan, E. R.; Logue, K.; Small, S. T.; Serre, D.
2016-03-01
Analysis of environmental DNA (eDNA) enables the detection of species of interest from water and soil samples, typically using species-specific PCR. Here, we describe a method to characterize the biodiversity of a given environment by amplifying eDNA using primer pairs targeting a wide range of taxa and high-throughput sequencing for species identification. We tested this approach on 91 water samples of 40 mL collected along the Cuyahoga River (Ohio, USA). We amplified eDNA using 12 primer pairs targeting mammals, fish, amphibians, birds, bryophytes, arthropods, copepods, plants and several microorganism taxa and sequenced all PCR products simultaneously by high-throughput sequencing. Overall, we identified DNA sequences from 15 species of fish, 17 species of mammals, 8 species of birds, 15 species of arthropods, one turtle and one salamander. Interestingly, in addition to aquatic and semi-aquatic animals, we identified DNA from terrestrial species that live near the Cuyahoga River. We also identified DNA from one Asian carp species invasive to the Great Lakes but that had not been previously reported in the Cuyahoga River. Our study shows that analysis of eDNA extracted from small water samples using wide-range PCR amplification combined with high-throughput sequencing can provide a broad perspective on biological diversity.
Hou, Weiguo; Wang, Shang; Briggs, Brandon R; Li, Gaoyuan; Xie, Wei; Dong, Hailiang
2018-01-01
Myocyanophages, a group of viruses infecting cyanobacteria, are abundant and play important roles in elemental cycling. Here we investigated the particle-associated viral communities retained on 0.2 μm filters and in sediment samples (representing ancient cyanophage communities) from four ocean and three lake locations, using high-throughput sequencing and a newly designed primer pair targeting a gene fragment (∼145-bp in length) encoding the cyanophage gp23 major capsid protein (MCP). Diverse viral communities were detected in all samples. The fragments of 142-, 145-, and 148-bp in length were most abundant in the amplicons, and most sequences (>92%) belonged to cyanophages. Additionally, different sequencing depths resulted in different diversity estimates of the viral community. Operational taxonomic units obtained from deep sequencing of the MCP gene covered the majority of those obtained from shallow sequencing, suggesting that deep sequencing exhibited a more complete picture of cyanophage community than shallow sequencing. Our results also revealed a wide geographic distribution of marine myocyanophages, i.e., higher dissimilarities of the myocyanophage communities corresponded with the larger distances between the sampling sites. Collectively, this study suggests that the newly designed primer pair can be effectively used to study the community and diversity of myocyanophage from different environments, and the high-throughput sequencing represents a good method to understand viral diversity.
Continuous flow real-time PCR device using multi-channel fluorescence excitation and detection.
Hatch, Andrew C; Ray, Tathagata; Lintecum, Kelly; Youngbull, Cody
2014-02-07
High throughput automation is greatly enhanced using techniques that employ conveyor belt strategies with uninterrupted streams of flow. We have developed a 'conveyor belt' analog for high throughput real-time quantitative Polymerase Chain Reaction (qPCR) using droplet emulsion technology. We developed a low power, portable device that employs LED and fiber optic fluorescence excitation in conjunction with a continuous flow thermal cycler to achieve multi-channel fluorescence detection for real-time fluorescence measurements. Continuously streaming fluid plugs or droplets pass through tubing wrapped around a two-temperature zone thermal block with each wrap of tubing fluorescently coupled to a 64-channel multi-anode PMT. This work demonstrates real-time qPCR of 0.1-10 μL droplets or fluid plugs over a concentration range spanning 7 orders of magnitude, from 1 × 10^1 to 1 × 10^7. The real-time qPCR analysis allows dynamic range quantification as high as 1 × 10^7 copies per 10 μL reaction, with PCR efficiencies within the range of 90-110% based on serial dilution assays and a limit of detection of 10 copies per reaction. The combined functionality of continuous flow, low power thermal cycling, high throughput sample processing, and real-time qPCR improves the rates at which biological or environmental samples can be continuously sampled and analyzed.
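PCR efficiencies of 90-110% are conventionally derived from the slope of a standard curve of quantification cycle (Cq) against log10 of template copies. A minimal sketch of that calculation is shown below; the dilution-series values are invented and the calculation is the standard textbook one, not anything specific to this device.

```python
import numpy as np

# Hypothetical 10-fold dilution series: copies per reaction and measured Cq
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6, 1e7])
cq     = np.array([33.1, 29.8, 26.4, 23.1, 19.8, 16.5, 13.2])

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0    # 100% efficiency <=> slope of about -3.32
print(f"slope = {slope:.2f}, efficiency = {efficiency * 100:.0f}%")
```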
Hou, Weiguo; Wang, Shang; Briggs, Brandon R.; Li, Gaoyuan; Xie, Wei; Dong, Hailiang
2018-01-01
Myocyanophages, a group of viruses infecting cyanobacteria, are abundant and play important roles in elemental cycling. Here we investigated the particle-associated viral communities retained on 0.2 μm filters and in sediment samples (representing ancient cyanophage communities) from four ocean and three lake locations, using high-throughput sequencing and a newly designed primer pair targeting a gene fragment (∼145-bp in length) encoding the cyanophage gp23 major capsid protein (MCP). Diverse viral communities were detected in all samples. The fragments of 142-, 145-, and 148-bp in length were most abundant in the amplicons, and most sequences (>92%) belonged to cyanophages. Additionally, different sequencing depths resulted in different diversity estimates of the viral community. Operational taxonomic units obtained from deep sequencing of the MCP gene covered the majority of those obtained from shallow sequencing, suggesting that deep sequencing exhibited a more complete picture of cyanophage community than shallow sequencing. Our results also revealed a wide geographic distribution of marine myocyanophages, i.e., higher dissimilarities of the myocyanophage communities corresponded with the larger distances between the sampling sites. Collectively, this study suggests that the newly designed primer pair can be effectively used to study the community and diversity of myocyanophage from different environments, and the high-throughput sequencing represents a good method to understand viral diversity.
Cannon, M. V.; Hester, J.; Shalkhauser, A.; Chan, E. R.; Logue, K.; Small, S. T.; Serre, D.
2016-01-01
Analysis of environmental DNA (eDNA) enables the detection of species of interest from water and soil samples, typically using species-specific PCR. Here, we describe a method to characterize the biodiversity of a given environment by amplifying eDNA using primer pairs targeting a wide range of taxa and high-throughput sequencing for species identification. We tested this approach on 91 water samples of 40 mL collected along the Cuyahoga River (Ohio, USA). We amplified eDNA using 12 primer pairs targeting mammals, fish, amphibians, birds, bryophytes, arthropods, copepods, plants and several microorganism taxa and sequenced all PCR products simultaneously by high-throughput sequencing. Overall, we identified DNA sequences from 15 species of fish, 17 species of mammals, 8 species of birds, 15 species of arthropods, one turtle and one salamander. Interestingly, in addition to aquatic and semi-aquatic animals, we identified DNA from terrestrial species that live near the Cuyahoga River. We also identified DNA from one Asian carp species invasive to the Great Lakes but that had not been previously reported in the Cuyahoga River. Our study shows that analysis of eDNA extracted from small water samples using wide-range PCR amplification combined with high-throughput sequencing can provide a broad perspective on biological diversity. PMID:26965911
The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio).
Usui, Takuji; Noble, Daniel W A; O'Dea, Rose E; Fangmeier, Melissa L; Lagisz, Malgorzata; Hesselson, Daniel; Nakagawa, Shinichi
2018-01-01
Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34-0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes.
The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio)
Usui, Takuji; Noble, Daniel W.A.; O’Dea, Rose E.; Fangmeier, Melissa L.; Lagisz, Malgorzata; Hesselson, Daniel
2018-01-01
Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34–0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes. PMID:29372124
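The repeatability estimate quoted here (ICC ≈ 0.34–0.41) is the fraction of total variance in maximum swimming speed attributable to differences among fish rather than among repeated trials. A one-way ANOVA-style sketch of that calculation follows, on invented repeated-trial data that will not reproduce the paper's values.

```python
import numpy as np

# Hypothetical maximum swimming speeds (cm/s): rows = fish, columns = repeated trials
speeds = np.array([[34.1, 36.0, 33.5],
                   [41.2, 39.8, 42.5],
                   [28.7, 30.1, 29.0],
                   [37.5, 35.9, 38.2]])
n_fish, n_trials = speeds.shape

grand_mean = speeds.mean()
ms_between = n_trials * ((speeds.mean(axis=1) - grand_mean) ** 2).sum() / (n_fish - 1)
ms_within = ((speeds - speeds.mean(axis=1, keepdims=True)) ** 2).sum() / (n_fish * (n_trials - 1))

# One-way ICC(1): between-individual variance over total variance
icc = (ms_between - ms_within) / (ms_between + (n_trials - 1) * ms_within)
print(f"ICC(1) = {icc:.2f}")
```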
Ultra high-throughput nucleic acid sequencing as a tool for virus discovery in the turkey gut.
USDA-ARS?s Scientific Manuscript database
Recently, the use of the next generation of nucleic acid sequencing technology (i.e., 454 pyrosequencing, as developed by Roche/454 Life Sciences) has allowed an in-depth look at the uncultivated microorganisms present in complex environmental samples, including samples with agricultural importance....
Tsiliyannis, Christos Aristeides
2013-09-01
Hazardous waste incinerators (HWIs) differ substantially from thermal power facilities, since instead of maximizing energy production with the minimum amount of fuel, they aim at maximizing throughput. Variations in quantity or composition of received waste loads may significantly diminish HWI throughput (the decisive profit factor) from its nominal design value. A novel formulation of combustion balance is presented, based on linear operators, which isolates the wastefeed vector from the invariant combustion stoichiometry kernel. Explicit expressions for the throughput are obtained, in terms of incinerator temperature, fluegas heat recuperation ratio and design parameters, for an arbitrary number of wastes, based on fundamental principles (mass and enthalpy balances). The impact of waste variations, of recuperation ratio and of furnace temperature is explicitly determined. It is shown that in the presence of waste uncertainty, the throughput may be a decreasing or increasing function of incinerator temperature and recuperation ratio, depending on the sign of a dimensionless parameter related only to the uncertain wastes. The dimensionless parameter is proposed as a sharp a priori waste 'fingerprint', determining the necessary increase or decrease of manipulated variables (recuperation ratio, excess air, auxiliary fuel feed rate, auxiliary air flow) in order to balance the HWI and maximize throughput under uncertainty in received wastes. A 10-step procedure is proposed for direct application subject to process capacity constraints. The results may be useful for efficient HWI operation and for preparing hazardous waste blends. Copyright © 2013 Elsevier Ltd. All rights reserved.
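As a toy illustration of the enthalpy-balance reasoning only (not the paper's linear-operator formulation), the waste throughput sustainable at a target furnace temperature can be read off a simple heat balance between the heat released by the waste and auxiliary fuel and the heat carried away by the flue gas, part of which is recuperated. All property values below are invented, and the fuel's own flue gas is neglected for simplicity.

```python
# Toy enthalpy balance for a hazardous-waste incinerator (illustrative numbers only).
LHV_waste = 6.0e6       # J/kg, lower heating value of a low-calorific aqueous waste blend
LHV_fuel = 42.0e6       # J/kg, auxiliary fuel
cp_gas = 1.2e3          # J/(kg*K), flue-gas specific heat
gas_per_kg_waste = 8.0  # kg flue gas produced per kg waste (stoichiometry + excess air)
T_furnace, T_in = 1100.0, 25.0   # degC
r = 0.3                 # fraction of flue-gas heat recuperated back into the furnace

def sustainable_throughput(fuel_rate_kg_s: float) -> float:
    """kg/s of waste that keeps the furnace at T_furnace for a given auxiliary fuel rate."""
    # Net heat demand per kg waste: raise its flue gas to T_furnace, minus recuperated heat
    demand_per_kg = gas_per_kg_waste * cp_gas * (T_furnace - T_in) * (1.0 - r)
    surplus_per_kg = LHV_waste - demand_per_kg   # net heat each kg of waste contributes
    if surplus_per_kg >= 0:
        return float("inf")   # waste is self-sustaining; capacity limits apply instead
    return fuel_rate_kg_s * LHV_fuel / (-surplus_per_kg)

print(f"{sustainable_throughput(0.05):.2f} kg/s of waste sustained by 0.05 kg/s auxiliary fuel")
```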
Saieg, Mauro Ajaj; Geddie, William R; Boerner, Scott L; Liu, Ni; Tsao, Ming; Zhang, Tong; Kamel-Reid, Suzanne; da Cunha Santos, Gilda
2012-06-25
Novel high-throughput molecular technologies have made the collection and storage of cells and small tissue specimens a critical issue. The FTA card provides an alternative to cryopreservation for biobanking fresh unfixed cells. The current study compared the quality and integrity of the DNA obtained from 2 types of FTA cards (Classic and Elute) using 2 different extraction protocols ("Classic" and "Elute") and assessed the feasibility of performing multiplex mutational screening using fine-needle aspiration (FNA) biopsy samples. Residual material from 42 FNA biopsies was collected in the cards (21 Classic and 21 Elute cards). DNA was extracted using the Classic protocol for Classic cards and both protocols for Elute cards. Polymerase chain reaction for p53 (1.5 kilobase) and CARD11 (500 base pair) was performed to assess DNA integrity. Successful p53 amplification was achieved in 95.2% of the samples from the Classic cards and in 80.9% of the samples from the Elute cards using the Classic protocol and 28.5% using the Elute protocol (P = .001). All samples (both cards) could be amplified for CARD11. There was no significant difference in the DNA concentration or 260/280 purity ratio when the 2 types of cards were compared. Five samples were also successfully analyzed by multiplex MassARRAY spectrometry, with a mutation in KRAS found in 1 case. High molecular weight DNA was extracted from the cards in sufficient amounts and quality to perform high-throughput multiplex mutation assays. The results of the current study also suggest that FTA Classic cards preserve better DNA integrity for molecular applications compared with the FTA Elute cards. Copyright © 2012 American Cancer Society.
External evaluation of the Dimension Vista 1500® intelligent lab system.
Bruneel, Arnaud; Dehoux, Monique; Barnier, Anne; Boutten, Anne
2012-09-01
The Dimension Vista® analyzer combines four technologies (photometry, nephelometry, V-LYTE® integrated multisensor potentiometry, and LOCI® chemiluminescence) into one high-throughput system. We assessed the analytical performance of assays routinely performed in our emergency laboratory, according to the VALTEC protocol, as well as practicability. Precision was good for most parameters. The analytical domain was large and suitable for undiluted analysis in most clinical settings encountered in our hospital. Data were comparable and correlated with our routine analyzers (Roche Modular DP®, Abbott AXSYM®, Siemens Dimension® RxL, and BN ProSpec®). Performance of the nephelometric and LOCI modules was excellent. Functional sensitivities of high-sensitivity C-reactive protein and cardiac troponin I were 0.165 mg/l and 0.03 ng/ml, respectively (coefficient of variation, CV < 10%). The influence of interfering substances (i.e., hemoglobin, bilirubin, or lipids) was moderate, and Dimension Vista® specifically alerted for interference according to HIL (hemolysis, icterus, lipemia) indices. Good instrument performance and full functionality (no reagent or sample carryover in the conditions evaluated, effective sample-volume detection, and clot detection) were confirmed. Simulated routine testing demonstrated excellent practicability, throughput, ease of use of software and security. Performance and practicability of Dimension Vista® are highly suitable for both routine and emergency use. Since no volume detection, and thus no warning, is available on limited sample racks, pediatric samples require special attention to the Siemens protocol to be analyzed under secure conditions. Our experience in routine practice is also discussed, i.e., the impact of daily workload, "manual" steps resulting from dilutions and pediatric samples, maintenance, and flex hydration on instrument performance, throughput and turnaround time. © 2012 Wiley Periodicals, Inc.
Fuzzy Logic-based expert system for evaluating cake quality of freeze-dried formulations.
Trnka, Hjalte; Wu, Jian X; Van De Weert, Marco; Grohganz, Holger; Rantanen, Jukka
2013-12-01
Freeze-drying of peptide and protein-based pharmaceuticals is an increasingly important field of research. The diverse nature of these compounds, limited understanding of excipient functionality, and difficult-to-analyze quality attributes together with the increasing importance of the biosimilarity concept complicate the development phase of safe and cost-effective drug products. To streamline the development phase and to make high-throughput formulation screening possible, efficient solutions for analyzing critical quality attributes such as cake quality with minimal material consumption are needed. The aim of this study was to develop a fuzzy logic system based on image analysis (IA) for analyzing cake quality. Freeze-dried samples with different visual quality attributes were prepared in well plates. Imaging solutions together with image analytical routines were developed for extracting critical visual features such as the degree of cake collapse, glassiness, and color uniformity. On the basis of the IA outputs, a fuzzy logic system for analysis of these freeze-dried cakes was constructed. After this development phase, the system was tested with a new screening well plate. The developed fuzzy logic-based system was found to give comparable quality scores with visual evaluation, making high-throughput classification of cake quality possible. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
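As a rough sketch of how image-analysis outputs can be mapped to a cake-quality score with fuzzy logic: membership shapes, the rule set, and the score values below are invented for illustration and are not those of the published system.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cake_quality(collapse_pct: float, glassiness_pct: float) -> float:
    """Map two image-analysis features (0-100%) to a 0-10 quality score."""
    # Fuzzify the inputs
    collapse_low, collapse_high = tri(collapse_pct, -1, 0, 40), tri(collapse_pct, 20, 100, 101)
    glassy_low, glassy_high = tri(glassiness_pct, -1, 0, 50), tri(glassiness_pct, 30, 100, 101)

    # Rule base (AND = min); each rule fires toward a crisp output score
    rules = [
        (min(collapse_low, glassy_low), 9.0),    # elegant, intact cake -> high score
        (min(collapse_low, glassy_high), 5.0),   # intact but glassy -> medium
        (collapse_high, 1.0),                    # collapsed -> poor regardless of gloss
    ]
    # Weighted-average (Sugeno-style) defuzzification
    num = sum(w * score for w, score in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(cake_quality(collapse_pct=10, glassiness_pct=15))   # mostly intact, matte
print(cake_quality(collapse_pct=80, glassiness_pct=60))   # heavily collapsed
```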
Laurin, Nancy; DeMoors, Anick; Frégeau, Chantal
2012-09-01
Direct amplification of STR loci from biological samples collected on FTA cards without prior DNA purification was evaluated using Identifiler Direct and PowerPlex 16 HS in conjunction with the use of a high throughput Applied Biosystems 3730 DNA Analyzer. In order to reduce the overall sample processing cost, reduced PCR volumes combined with various FTA disk sizes were tested. Optimized STR profiles were obtained using a 0.53 mm disk size in 10 μL PCR volume for both STR systems. These protocols proved effective in generating high quality profiles on the 3730 DNA Analyzer from both blood and buccal FTA samples. Reproducibility, concordance, robustness, sample stability and profile quality were assessed using a collection of blood and buccal samples on FTA cards from volunteer donors as well as from convicted offenders. The new developed protocols offer enhanced throughput capability and cost effectiveness without compromising the robustness and quality of the STR profiles obtained. These results support the use of these protocols for processing convicted offender samples submitted to the National DNA Data Bank of Canada. Similar protocols could be applied to the processing of casework reference samples or in paternity or family relationship testing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Rapid processing of 85Kr/Kr ratios using Atom Trap Trace Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zappala, J. C.; Bailey, K.; Mueller, P.
In this paper, we report a methodology for measuring 85Kr/Kr isotopic abundances using Atom Trap Trace Analysis (ATTA) that increases sample measurement throughput by over an order of magnitude to six samples per 24 h. The noble gas isotope 85Kr (half-life = 10.7 years) is a useful tracer for young groundwater in the age range of 5–50 years. ATTA, an efficient and selective laser-based atom counting method, has recently been applied to 85Kr/Kr isotopic abundance measurements, requiring 5–10 μL of krypton gas at STP extracted from 50 to 100 L of water. Previously, a single such measurement required 48 h. In conclusion, our new method demonstrates that we can measure 85Kr/Kr ratios with 3–5% relative uncertainty every 4 h, on average, with the same sample requirements.
Rapid processing of 85Kr/Kr ratios using Atom Trap Trace Analysis
Zappala, J. C.; Bailey, K.; Mueller, P.; ...
2017-03-11
Surveillance of wild birds for avian influenza virus.
Hoye, Bethany J; Munster, Vincent J; Nishiura, Hiroshi; Klaassen, Marcel; Fouchier, Ron A M
2010-12-01
Recent demand for increased understanding of avian influenza virus in its natural hosts, together with the development of high-throughput diagnostics, has heralded a new era in wildlife disease surveillance. However, survey design, sampling, and interpretation in the context of host populations still present major challenges. We critically reviewed current surveillance to distill a series of considerations pertinent to avian influenza virus surveillance in wild birds, including consideration of what, when, where, and how many to sample in the context of survey objectives. Recognizing that wildlife disease surveillance is logistically and financially constrained, we discuss pragmatic alternatives for achieving probability-based sampling schemes that capture this host-pathogen system. We recommend hypothesis-driven surveillance through standardized, local surveys that are, in turn, strategically compiled over broad geographic areas. Rethinking the use of existing surveillance infrastructure can thereby greatly enhance our global understanding of avian influenza and other zoonotic diseases.
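On the "how many to sample" question, the standard detection-based sample-size calculation is often the starting point; the sketch below applies it under the usual simplifying assumptions (random sampling from a large population, perfectly sensitive test), with placeholder prevalence and confidence values rather than figures from the review.

```python
import math

def samples_for_detection(prevalence, confidence=0.95):
    """Minimum number of birds to sample in order to detect at least one infected
    individual with the given confidence, assuming random sampling from a large
    population and a perfectly sensitive test (illustrative values only)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# e.g. virus circulating at 1% prevalence, 95% confidence of detecting it
print(samples_for_detection(prevalence=0.01))  # about 299 birds
```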
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Shuyu; Yu, Shifeng; Siedler, Michael
This study presented a power-compensated MEMS differential scanning calorimeter (DSC) for protein stability characterization. In this microfabricated sensor, PDMS (polydimethylsiloxane) and polyimide were used to construct the adiabatic chamber (1 μL), and temperature-sensitive vanadium oxide was used as the thermistor material. A power compensation system was implemented to maintain the sample and reference at the same temperature. The resolution study and step-response characterization indicated the high sensitivity (6 V/W) and low noise level (60 μK) of the device. The test with IgG1 antibody (mAb1) samples showed clear phase transitions, and the data were confirmed to be reasonable by comparison with results from a commercial DSC. Finally, this device used only ~1 μL of sample and could complete the scanning process in 4 min, significantly increasing the throughput of biomolecular thermodynamics studies such as those used in drug formulation.
A power compensated differential scanning calorimeter for protein stability characterization
Wang, Shuyu; Yu, Shifeng; Siedler, Michael; ...
2017-10-07
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
NASA Astrophysics Data System (ADS)
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.
Li, Xingnan; Franke, Adrian A.
2015-01-01
An affordable and fast liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed for the accurate and precise determination of global DNA methylation levels in peripheral blood. Global DNA methylation extent was expressed as the ratio of methylated 2′-deoxycytidine (5MedC) to 2′-deoxyguanosine (dG), which were obtained after DNA extraction and hydrolysis and determined by positive electrospray LC–ESI-MS/MS. The cost-effective internal standards 15N3-dC and 15N5-dG were incorporated for the accurate quantification of 5MedC and dG, respectively. The desired nucleoside analytes were separated and eluted by LC within 2.5 min on a reversed-phase column, with a limit of detection of 1.4 femtomoles on column for 5MedC. Sample preparation in 96-well format significantly increased the assay throughput, and filtration was found to be a necessary step to assure precision. Precision was assessed by repeated analysis of four DNA QC samples over 12 days, with mean intra- and inter-day CVs of 6% and 11%, respectively. Accuracy was evaluated by comparison with a previously reported method, showing a mean CV of 4% for the 5 subjects analyzed. Furthermore, application of the assay on a benchtop orbitrap LC-MS in exact-mass full-scan mode showed sensitivity comparable to tandem LC-MS using multiple reaction monitoring. PMID:21843675
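A minimal sketch of the kind of ratio calculation implied by this design, assuming each analyte is quantified against its own labeled internal standard; the peak areas, spiked amounts, and unit response factors below are made-up placeholders, not values from the paper.

```python
def analyte_amount(area_analyte, area_is, amount_is, response_factor=1.0):
    """Internal-standard quantitation: amount from the analyte/IS peak-area ratio.
    The response factor would come from a calibration curve (assumed = 1 here)."""
    return (area_analyte / area_is) * amount_is / response_factor

def pct_global_methylation(area_5medc, area_dc_is, area_dg, area_dg_is,
                           amount_dc_is, amount_dg_is):
    """Express global DNA methylation as 100 * 5MedC / dG."""
    medc = analyte_amount(area_5medc, area_dc_is, amount_dc_is)
    dg = analyte_amount(area_dg, area_dg_is, amount_dg_is)
    return 100.0 * medc / dg

# Hypothetical peak areas and spiked IS amounts (arbitrary units)
print(round(pct_global_methylation(1.2e4, 5.0e4, 9.0e5, 8.0e5,
                                   amount_dc_is=10.0, amount_dg_is=100.0), 2))
```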
Increasing throughput of multiplexed electrical bus in pipe-lined architecture
Asaad, Sameh; Brezzo, Bernard V; Kapur, Mohit
2014-05-27
Techniques are disclosed for increasing the throughput of a multiplexed electrical bus by exploiting available pipeline stages of a computer or other system. For example, a method for increasing a throughput of an electrical bus that connects at least two devices in a system comprises introducing at least one signal hold stage in a signal-receiving one of the two devices, such that a maximum frequency at which the two devices are operated is not limited by a number of cycles of an operating frequency of the electrical bus needed for a signal to propagate from a signal-transmitting one of the two devices to the signal-receiving one of the two devices. Preferably, the signal hold stage introduced in the signal-receiving one of the two devices is a pipeline stage re-allocated from the signal-transmitting one of the two devices.
L-edge spectroscopy of dilute, radiation-sensitive systems using a transition-edge-sensor array
NASA Astrophysics Data System (ADS)
Titus, Charles J.; Baker, Michael L.; Lee, Sang Jun; Cho, Hsiao-Mei; Doriese, William B.; Fowler, Joseph W.; Gaffney, Kelly; Gard, Johnathon D.; Hilton, Gene C.; Kenney, Chris; Knight, Jason; Li, Dale; Marks, Ronald; Minitti, Michael P.; Morgan, Kelsey M.; O'Neil, Galen C.; Reintsema, Carl D.; Schmidt, Daniel R.; Sokaras, Dimosthenis; Swetz, Daniel S.; Ullom, Joel N.; Weng, Tsu-Chien; Williams, Christopher; Young, Betty A.; Irwin, Kent D.; Solomon, Edward I.; Nordlund, Dennis
2017-12-01
We present X-ray absorption spectroscopy and resonant inelastic X-ray scattering (RIXS) measurements on the iron L-edge of 0.5 mM aqueous ferricyanide. These measurements demonstrate the ability of high-throughput transition-edge-sensor (TES) spectrometers to access the rich soft X-ray (100-2000 eV) spectroscopy regime for dilute and radiation-sensitive samples. Our low-concentration data are in agreement with high-concentration measurements recorded by grating spectrometers. These results show that soft-X-ray RIXS spectroscopy acquired by high-throughput TES spectrometers can be used to study the local electronic structure of dilute metal-centered complexes relevant to biology, chemistry, and catalysis. In particular, TES spectrometers have a unique ability to characterize frozen solutions of radiation- and temperature-sensitive samples.
Case management redesign in an urban facility.
Almaden, Stefany; Freshman, Brenda; Quaye, Beverly
2011-01-01
To explore strategies for improving patient throughput and to redesign case management processes to facilitate level-of-care transitions and safe discharges. Large Urban Medical Center in South Los Angeles County, with 384 licensed beds, that serves poor, underserved communities. Both qualitative and quantitative methods were applied. Combined theoretical frameworks were used for needs assessment, intervention strategies, and change management. Observations, interviews, surveys, and database extraction methods were used. The sample consisted of case management staff members as well as staff from nursing, social work, and the emergency department. Postintervention measures indicated improvement in reimbursements for services, reduction in length of stay, increased productivity, improved patients' access to care, and avoidance of unnecessary readmissions or emergency department visits. Effective change management strategies must consider multiple factors that influence daily operations and service delivery. Creating accountability by using performance measures associated with patient transitions is highlighted by the case study results. The authors developed a process model to assist in identifying and tracking outcome measures related to patient throughput, front-end assessments, and effective patient care transitions. This model can be used in future research to further investigate best case management practices.
High Throughput Multispectral Image Processing with Applications in Food Science.
Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John
2015-01-01
Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème, and table olives, so as to increase objectivity, data reproducibility, and low-cost information extraction and to enable faster quality assessment, without human intervention. The outcome of image processing is then propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian mixture models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. In our evaluation we demonstrate its efficiency and robustness compared with currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
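For readers unfamiliar with the core step, the following is a minimal sketch of Gaussian-mixture segmentation of per-pixel spectra on synthetic data; the published pipeline additionally performs unsupervised spectral band selection, which is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for a multispectral food image: H x W pixels, B spectral bands
rng = np.random.default_rng(0)
H, W, B = 64, 64, 18
image = rng.normal(size=(H, W, B))
image[16:48, 16:48, :] += 2.0          # a spectrally distinct "region of interest"

# Fit a two-component Gaussian mixture to the per-pixel spectra and label each
# pixel with its most likely component (unsupervised segmentation)
pixels = image.reshape(-1, B)
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(pixels).reshape(H, W)

print(labels.shape, np.bincount(labels.ravel()))
```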
2015-01-01
Layer-by-layer (LbL) assembly is a powerful tool with increasing real world applications in energy, biomaterials, active surfaces, and membranes; however, the current state of the art requires individual sample construction using large quantities of material. Here we describe a technique using capillary flow within a microfluidic device to drive high-throughput assembly of LbL film libraries. This capillary flow layer-by-layer (CF-LbL) method significantly reduces material waste, improves quality control, and expands the potential applications of LbL into new research spaces. The method can be operated as a simple lab benchtop apparatus or combined with liquid-handling robotics to extend the library size. Here we describe and demonstrate the technique and establish its ability to recreate and expand on the known literature for film growth and morphology. We use the same platform to assay biological properties such as cell adhesion and proliferation and ultimately provide an example of the use of this approach to identify LbL films for surface-based DNA transfection of commonly used cell types. PMID:24836460
Automation in biological crystallization.
Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen
2014-06-01
Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.
Automation in biological crystallization
Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen
2014-01-01
Rapid determination of enantiomeric excess: a focus on optical approaches.
Leung, Diana; Kang, Sung Ok; Anslyn, Eric V
2012-01-07
High-throughput screening (HTS) methods are becoming increasingly essential in discovering chiral catalysts or auxiliaries for asymmetric transformations due to the advent of parallel synthesis and combinatorial chemistry. Both parallel synthesis and combinatorial chemistry can lead to the exploration of a range of structural candidates and reaction conditions as a means to obtain the highest enantiomeric excess (ee) of a desired transformation. One current bottleneck in these approaches to asymmetric reactions is the determination of ee, which has led researchers to explore a wide range of HTS techniques. To be truly high-throughput, it has been proposed that a technique that can analyse a thousand or more samples per day is needed. Many of the current approaches to this goal are based on optical methods because they allow for a rapid determination of ee due to quick data collection and their parallel analysis capabilities. In this critical review these techniques are reviewed with a discussion of their respective advantages and drawbacks, and with a contrast to chromatographic methods (180 references). This journal is © The Royal Society of Chemistry 2012
Jiang, Guangli; Liu, Leibo; Zhu, Wenping; Yin, Shouyi; Wei, Shaojun
2015-09-04
This paper proposes a real-time feature extraction VLSI architecture for high-resolution images based on the accelerated KAZE algorithm. Firstly, a new system architecture is proposed. It increases the system throughput, provides flexibility in image resolution, and offers trade-offs between speed and scaling robustness. The architecture consists of a two-dimensional pipeline array that fully utilizes computational similarities in octaves. Secondly, a substructure (block-serial discrete-time cellular neural network) that can realize a nonlinear filter is proposed. This structure decreases the memory demand by removing data dependencies. Thirdly, a hardware-friendly descriptor is introduced in order to overcome the hardware design bottleneck through the polar sample pattern; a simplified method to realize rotation invariance is also presented. Finally, the proposed architecture is designed in TSMC 65 nm CMOS technology. The experimental results show a performance of 127 fps at full HD resolution and a 200 MHz clock frequency. The peak performance reaches 181 GOPS, and the throughput is double that of other state-of-the-art architectures.
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2007-10-01
A simple model is presented of a possible inspection regimen applied to each leg of a cargo container's journey between its point of origin and destination. Several candidate modalities are proposed to be used at multiple remote locations to act as a pre-screen inspection as the target approaches a perimeter and as the primary inspection modality at the portal. Information from multiple data sets is fused to optimize the costs and performance of a network of such inspection systems. A series of image processing algorithms is presented that automatically process X-ray images of containerized cargo. The goal of this processing is to locate the container in a real-time stream of traffic traversing a portal without impeding the flow of commerce. Such processing may facilitate the inclusion of unmanned/unattended inspection systems in such a network. Several samples of the processing applied to data collected from deployed systems are included. Simulated data from a notional cargo inspection system with multiple sensor modalities and advanced data fusion algorithms are also included to show the potential increased detection and throughput performance of such a configuration.
Asif, Muhammad; Guo, Xiangzhou; Zhang, Jing; Miao, Jungang
2018-04-17
Digital cross-correlation is central to many applications, including but not limited to Digital Image Processing, Satellite Navigation, and Remote Sensing. With recent advancements in digital technology, the computational demands of such applications have increased enormously. In this paper we present a high-throughput digital cross-correlator, capable of processing a 1-bit digitized stream at a rate of up to 2 GHz simultaneously on 64 channels, i.e., approximately 4 trillion correlation and accumulation operations per second. In order to achieve higher throughput, we have focused on frequency-based partitioning of our design and tried to minimize and localize high-frequency operations. This correlator is designed for a Passive Millimeter Wave Imager intended for the detection of contraband items concealed on the human body. The goals are to increase the system bandwidth, achieve video-rate imaging, improve sensitivity, and reduce the size. The design methodology is detailed in subsequent sections, elaborating on the techniques that enable high throughput. The design is verified for a Xilinx Kintex UltraScale device in simulation, and the implementation results are given in terms of device utilization and power consumption estimates. Our results show considerable improvements in throughput compared with our baseline design, while the correlator successfully meets the functional requirements.
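A software model of the underlying 1-bit correlate-and-accumulate operation is sketched below (agreement counted via an XNOR-style comparison); it illustrates the arithmetic only and does not reproduce the paper's FPGA pipeline or channel partitioning.

```python
import numpy as np

def one_bit_correlate(a_bits, b_bits, max_lag):
    """For each lag, count agreeing minus disagreeing samples between two 0/1
    streams; for 1-bit quantized signals this is proportional to the correlation."""
    n = len(a_bits) - max_lag
    corr = np.empty(max_lag + 1, dtype=np.int64)
    for lag in range(max_lag + 1):
        agree = np.count_nonzero(a_bits[:n] == b_bits[lag:lag + n])  # XNOR + popcount
        corr[lag] = 2 * agree - n
    return corr

# Two noisy 1-bit channels observing the same signal with a 3-sample delay
rng = np.random.default_rng(1)
common = rng.normal(size=100_000)
a = (common + rng.normal(size=common.size) > 0).astype(np.uint8)
b = (np.roll(common, 3) + rng.normal(size=common.size) > 0).astype(np.uint8)
print(one_bit_correlate(a, b, max_lag=8).argmax())  # correlation peak near lag 3
```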
Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...
Perspectives on Validation of High-Throughput Assays Supporting 21st Century Toxicity Testing
In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. ...
The toxicity-testing paradigm has evolved to include high-throughput (HT) methods for addressing the increasing need to screen hundreds to thousands of chemicals rapidly. Approaches that involve in vitro screening assays, in silico predictions of exposure concentrations, and phar...
Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...
Bekeschus, Sander; Schmidt, Anke; Kramer, Axel; Metelmann, Hans-Robert; Adler, Frank; von Woedtke, Thomas; Niessner, Felix; Weltmann, Klaus-Dieter; Wende, Kristian
2018-05-01
Promising cold physical plasma sources have been developed in the field of plasma medicine. An important prerequisite to their clinical use is a lack of genotoxic effects in cells. During optimization of one or even several plasma sources for a specific application, large numbers of samples need to be analyzed. There are soft and easy-to-assess markers for genotoxic stress, such as phosphorylation of histone H2AX (γH2AX), but only a few tests are accredited by the OECD with regard to mutagenicity detection. The micronucleus (MN) assay is among them but often requires manual counting of many thousands of cells per sample under the microscope. A high-throughput MN assay is presented using image flow cytometry and image analysis software. A human lymphocyte cell line was treated with plasma generated under ten different feed gas conditions corresponding to distinct reactive species patterns, which were investigated for their genotoxic potential. Several million cells were automatically analyzed by an MN quantification strategy outlined in detail in this work. Our data demonstrate the absence of newly formed MN under any feed gas condition using the atmospheric pressure plasma jet kINPen. As a positive control, ionizing radiation gave a significant 5-fold increase in micronucleus frequency. Thus, this assay is suitable to assess the genotoxic potential of large sample sets of cells exposed to chemical or physical agents, including plasmas, in an efficient, reliable, and semiautomated manner. Environ. Mol. Mutagen. 59:268-277, 2018. © 2018 Wiley Periodicals, Inc.
Sul, Woo Jun; Cole, James R.; Jesus, Ederson da C.; Wang, Qiong; Farris, Ryan J.; Fish, Jordan A.; Tiedje, James M.
2011-01-01
High-throughput sequencing of 16S rRNA genes has increased our understanding of microbial community structure, but even higher-throughput methods at the Illumina scale now allow the creation of much larger datasets, with more samples and orders of magnitude more sequences, that swamp current analytic methods. We developed a method capable of handling these larger datasets on the basis of assignment of sequences into an existing taxonomy using a supervised learning approach (taxonomy-supervised analysis). We compared this method with a commonly used clustering approach based on sequence similarity (taxonomy-unsupervised analysis). We sampled 211 different bacterial communities from various habitats and obtained ∼1.3 million 16S rRNA sequences spanning the V4 hypervariable region by pyrosequencing. Both methodologies gave similar ecological conclusions in that β-diversity measures calculated using these two types of matrices were significantly correlated with each other, as were the ordination configurations and hierarchical clustering dendrograms. In addition, our taxonomy-supervised analyses were also highly correlated with phylogenetic methods, such as UniFrac. The taxonomy-supervised analysis has the advantages that it is not limited by the exhaustive computation required for the alignment and clustering necessary for the taxonomy-unsupervised analysis, is more tolerant of sequencing errors, and allows comparisons when sequences are from different regions of the 16S rRNA gene. With the tremendous expansion in 16S rRNA data acquisition underway, the taxonomy-supervised approach offers the potential to provide more rapid and extensive community comparisons across habitats and samples. PMID:21873204
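The downstream comparison can be pictured as follows: once each read has been assigned to a taxon by a supervised classifier, a taxon-by-sample abundance matrix is built and β-diversity is computed from it. The toy read assignments and the Bray-Curtis measure below are illustrative choices, not the paper's exact pipeline.

```python
import numpy as np
from collections import Counter

# Per-read taxonomic assignments from some supervised classifier (contents are made up)
assignments = {
    "soil_1": ["Bacillus", "Bacillus", "Pseudomonas", "Streptomyces"],
    "soil_2": ["Bacillus", "Pseudomonas", "Pseudomonas", "Pseudomonas"],
    "gut_1":  ["Bacteroides", "Bacteroides", "Faecalibacterium", "Bacillus"],
}

taxa = sorted({t for reads in assignments.values() for t in reads})
samples = list(assignments)

# Taxon-by-sample relative-abundance matrix (the "taxonomy-supervised" matrix)
counts = np.array([[Counter(assignments[s])[t] for t in taxa] for s in samples], float)
rel = counts / counts.sum(axis=1, keepdims=True)

def bray_curtis(x, y):
    return np.abs(x - y).sum() / (x + y).sum()

# Pairwise beta diversity between samples
for i in range(len(samples)):
    for j in range(i + 1, len(samples)):
        print(samples[i], samples[j], round(bray_curtis(rel[i], rel[j]), 2))
```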
High-throughput NIR spectroscopic (NIRS) detection of microplastics in soil.
Paul, Andrea; Wander, Lukas; Becker, Roland; Goedecke, Caroline; Braun, Ulrike
2018-05-12
The increasing pollution of terrestrial and aquatic ecosystems with plastic debris leads to the accumulation of microscopic plastic particles in amounts that are still unknown. To monitor the degree of contamination, analytical methods that help to quantify microplastics (MP) are urgently needed. Currently, laboriously purified materials enriched on filters are investigated by micro-infrared and/or micro-Raman spectroscopy. Although yielding precise results, these techniques are time consuming and are restricted to the analysis of a small part of the sample, on the order of a few micrograms. To overcome these problems, we tested a macroscopically dimensioned near-infrared (NIR) process-spectroscopic method in combination with chemometrics. For calibration, artificial MP/soil mixtures containing defined ratios of polyethylene, polyethylene terephthalate, polypropylene, and polystyrene with diameters < 125 μm were prepared and measured by a process FT-NIR spectrometer equipped with a fiber-optic reflection probe. The resulting spectra were processed by chemometric models including support vector machine regression (SVR) and partial least squares discriminant analysis (PLS-DA). Validation of the models with MP mixtures, MP-free soils, and real-world samples, e.g., fermenter residue, suggests reliable detection and a possible classification of MP at levels above 0.5 to 1.0 mass%, depending on the polymer. The benefit of the combined NIRS-chemometric approach lies in the rapid assessment of whether soil contains MP, without any chemical pretreatment. The method can be used with larger sample volumes, even allows for online prediction, and thus meets the demands of a high-throughput method.
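To make the chemometric step concrete, here is a hedged sketch of SVR quantification and a PLS-DA-style classification on synthetic "spectra"; the real models were calibrated on measured FT-NIR spectra of the MP/soil mixtures, and the band shape, noise level, and threshold below are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR

# Synthetic stand-in for FT-NIR spectra of soil/MP mixtures:
# 120 samples x 500 wavelength channels, MP content 0-3 mass%
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)
mp_content = rng.uniform(0.0, 3.0, size=120)
band = np.exp(-((x - 0.6) ** 2) / 0.002)                     # fake polymer absorption band
spectra = mp_content[:, None] * band + rng.normal(scale=0.05, size=(120, 500))

# SVR for quantification of MP mass%
svr = SVR(kernel="rbf", C=10.0).fit(spectra[:100], mp_content[:100])
print("SVR:", svr.predict(spectra[100:103]).round(2), "true:", mp_content[100:103].round(2))

# PLS regression against a binary "contaminated above 0.5 mass%" label (PLS-DA style)
y = (mp_content > 0.5).astype(float)
pls = PLSRegression(n_components=5).fit(spectra[:100], y[:100])
print("PLS-DA:", (pls.predict(spectra[100:103]).ravel() > 0.5).astype(int), y[100:103].astype(int))
```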
High-Throughput Quantitation of Neonicotinoids in Lyophilized Surface Water by LC-APCI-MS/MS.
Morrison, Lucas M; Renaud, Justin B; Sabourin, Lyne; Sumarah, Mark W; Yeung, Ken K C; Lapen, David R
2018-05-21
Background: Neonicotinoids are among the most widely used insecticides. Recently, there has been concern associated with unintended adverse effects on honeybees and aquatic invertebrates at low parts-per-trillion levels. Objective: There is a need for LC-MS/MS methods that are capable of high-throughput measurements of the most widely used neonicotinoids at environmentally relevant concentrations in surface water. Methods: This method allows for quantitation of acetamiprid, clothianidin, imidacloprid, dinotefuran, nitenpyram, thiacloprid, and thiamethoxam in surface water. Deuterated internal standards are added to 20 mL environmental samples, which are concentrated by lyophilization and reconstituted with methanol followed by acetonitrile. Results: A large variation in mean recovery efficiencies across the five different surface water sampling sites within this study was observed, ranging from 45 to 74%. This demonstrated the need for labelled internal standards to compensate for these differences. Atmospheric pressure chemical ionization (APCI) performed better than electrospray ionization (ESI), with limited matrix suppression, achieving 71-110% of the laboratory fortified blank signal. Neonicotinoids were resolved on a C18 column using a 5 min LC method, with method quantitation limits (MQLs) ranging between 0.93 and 4.88 ng/L. Conclusions: This method enables cost-effective, accurate, and reproducible monitoring of these pesticides in the aquatic environment. Highlights: Lyophilization is used for high-throughput concentration of neonicotinoids in surface water. Variation in matrix effects between samples was greatly reduced by using APCI compared with ESI. Clothianidin and thiamethoxam were detected in all samples, with levels ranging from below the method quantitation limit to 65 ng/L.
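A hedged sketch of the quantitation step this implies: the analyte/deuterated-IS peak-area ratio is converted to a concentration via a calibration curve. The calibration points and peak areas below are invented for illustration.

```python
import numpy as np

# Calibration: analyte / deuterated-IS peak-area ratio vs spiked concentration (ng/L)
cal_conc  = np.array([0.0, 1.0, 5.0, 10.0, 25.0, 50.0])       # made-up levels
cal_ratio = np.array([0.00, 0.021, 0.10, 0.205, 0.51, 1.02])  # made-up responses
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

def neonic_conc(area_analyte, area_is):
    """Concentration (ng/L) from the analyte/IS peak-area ratio; because the IS is
    carried through lyophilization and reconstitution, recovery differences between
    waters largely cancel out."""
    return (area_analyte / area_is - intercept) / slope

print(round(neonic_conc(area_analyte=3.1e4, area_is=4.8e4), 1))  # ng/L
```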
Problems in processing Rheinische Braunkohle (soft coal) (in German)
DOE Office of Scientific and Technical Information (OSTI.GOV)
von Hartmann, G.B.
At Wesseling, difficulties were encountered with the hydrogenation of Rhine brown coal. The hydrogenation reaction was proceeding too rapidly at 600 atm pressure under relatively low temperature and throughput conditions. This caused a build-up of "caviar" deposits containing ash and asphalts. This flocculation of asphalt seemed to arise because the rapid reaction produced a liquid medium unable to hold the heavy asphalt particles in suspension. A stronger paraffinic character of the oil was also a result. To obtain practical, problem-free yields, throughput had to be increased (from 0.4 kg/liter/h to more than 0.5), and temperature had to be increased (from 24.0 MV to 24.8 MV). Further, a considerable increase in sludge recycling was recommended. The Wesseling plant was unable to increase the temperature and throughput. However, more sludge was recycled, producing a paste better able to hold higher-molecular-weight particles in suspension. If this did not solve the "caviar" deposit problems, further recommendations were suggested, including the addition of more heavy oil.
A Robust Framework for Microbial Archaeology
Warinner, Christina; Herbig, Alexander; Mann, Allison; Yates, James A. Fellows; Weiß, Clemens L.; Burbano, Hernán A.; Orlando, Ludovic; Krause, Johannes
2017-01-01
Microbial archaeology is flourishing in the era of high-throughput sequencing, revealing the agents behind devastating historical plagues, identifying the cryptic movements of pathogens in prehistory, and reconstructing the ancestral microbiota of humans. Here, we introduce the fundamental concepts and theoretical framework of the discipline, then discuss applied methodologies for pathogen identification and microbiome characterization from archaeological samples. We give special attention to the process of identifying, validating, and authenticating ancient microbes using high-throughput DNA sequencing data. Finally, we outline standards and precautions to guide future research in the field. PMID:28460196
Three applications of backscatter x-ray imaging technology to homeland defense
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2005-05-01
A brief review of backscatter x-ray imaging and a description of three systems currently applying it to homeland defense missions (BodySearch, ZBV and ZBP). These missions include detection of concealed weapons, explosives and contraband on personnel, in vehicles and large cargo containers. An overview of the x-ray imaging subsystems is provided as well as sample images from each system. Key features such as x-ray safety, throughput and detection are discussed. Recent trends in operational modes are described that facilitate 100% inspection at high throughput chokepoints.
Droplet microfluidic technology for single-cell high-throughput screening.
Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L
2009-08-25
We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at very high throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically coded droplet library enabling identification of each droplet's composition during the assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together, our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications, including high-throughput single-cell analyses, combinatorial screening, and facilitating small sample analyses.
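Single-cell encapsulation in droplets is commonly reasoned about with a Poisson loading model; the short sketch below uses that standard model with an illustrative mean occupancy, not parameters reported in the paper.

```python
import math

def droplet_occupancy(k, mean_cells_per_droplet):
    """Poisson probability that a droplet contains exactly k cells."""
    lam = mean_cells_per_droplet
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 0.3   # dilute loading chosen to favour single-cell droplets (illustrative)
print("empty:",    round(droplet_occupancy(0, lam), 3))
print("single:",   round(droplet_occupancy(1, lam), 3))
print("multiple:", round(1 - droplet_occupancy(0, lam) - droplet_occupancy(1, lam), 3))
```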
Handheld Fluorescence Microscopy based Flow Analyzer.
Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva
2016-03-01
Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive, and time-consuming procedure. The article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating the high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speed, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
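As a hedged illustration of computational deblurring for motion along the flow direction, the sketch below applies a frequency-domain Wiener filter with a linear-motion point-spread function to a synthetic bead image; the authors' actual deblurring algorithm and parameters are not specified here.

```python
import numpy as np

def motion_psf(length, shape):
    """Horizontal linear-motion point-spread function, zero-padded to the image shape."""
    psf = np.zeros(shape)
    psf[0, :length] = 1.0 / length
    return psf

def wiener_deblur(blurred, psf, k=0.01):
    """Frequency-domain Wiener filter H*/(|H|^2 + k); k stands in for the
    (unknown) noise-to-signal power ratio."""
    H = np.fft.fft2(psf)
    G = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(np.conj(H) / (np.abs(H) ** 2 + k) * G))

# Synthetic "bead" blurred by motion along the flow direction, plus a little noise
rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[30:34, 30:34] = 1.0
psf = motion_psf(length=9, shape=img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
restored = wiener_deblur(blurred + rng.normal(scale=0.01, size=img.shape), psf)
print(round(float(blurred.max()), 2), round(float(restored.max()), 2))  # contrast recovered
```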
A Memory Efficient Network Encryption Scheme
NASA Astrophysics Data System (ADS)
El-Fotouh, Mohamed Abo; Diepold, Klaus
In this paper, we study two encryption schemes widely used in network applications. Shortcomings were found in both: each scheme either consumes more memory to gain high throughput or uses low memory at the cost of low throughput. As the number of internet users increases each day, the need has arisen for a scheme that has low memory requirements and at the same time possesses high speed. We used the SSM model [1] to construct an encryption scheme based on the AES. The proposed scheme possesses high throughput together with low memory requirements.
An analysis of the development of port operation in Da Nang Port, Vietnam
NASA Astrophysics Data System (ADS)
Nguyen, T. D. H.; Cools, M.
2018-04-01
This paper presents the current operating status of Da Nang Port, Vietnam, in the period 2012-2016. Port operations showed positive changes, reflected in a significant increase in total throughput, especially containerized cargo volumes. Classical decomposition techniques are used to find the trend-cycle and seasonal components of monthly throughput flows. Appropriate predictive models for the different kinds of throughput are proposed. Finally, a development strategy towards containerization, together with investment policies for facilities, equipment, and infrastructure, is suggested based on the predictive results.
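A minimal sketch of such a classical decomposition on a synthetic monthly throughput series (the real Da Nang Port figures are not reproduced here); statsmodels' seasonal_decompose implements the moving-average-based classical method.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly container throughput (TEU), 2012-2016: trend + seasonality + noise
rng = np.random.default_rng(0)
months = pd.date_range("2012-01", periods=60, freq="MS")
trend = np.linspace(15_000, 30_000, 60)
season = 2_000 * np.sin(2 * np.pi * (np.arange(60) % 12) / 12.0)
teu = pd.Series(trend + season + rng.normal(scale=800, size=60), index=months)

# Classical decomposition into trend-cycle, seasonal, and residual components
result = seasonal_decompose(teu, model="additive", period=12)
print(result.trend.dropna().round(0).iloc[[0, -1]])   # estimated trend-cycle endpoints
print(result.seasonal.iloc[:12].round(0))             # one year of seasonal effects
```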
Recent advances in quantitative high throughput and high content data analysis.
Moutsatsos, Ioannis K; Parker, Christian N
2016-01-01
High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
Mobli, Mehdi; Stern, Alan S.; Bermel, Wolfgang; King, Glenn F.; Hoch, Jeffrey C.
2010-01-01
One of the stiffest challenges in structural studies of proteins using NMR is the assignment of sidechain resonances. Typically, a panel of lengthy 3D experiments is acquired in order to establish connectivities and resolve ambiguities due to overlap. We demonstrate that these experiments can be replaced by a single 4D experiment that is time-efficient, yields excellent resolution, and captures unique carbon-proton connectivity information. The approach is made practical by the use of non-uniform sampling in the three indirect time dimensions and maximum entropy reconstruction of the corresponding 3D frequency spectrum. This 4D method will facilitate automated resonance assignment procedures and it should be particularly beneficial for increasing throughput in NMR-based structural genomics initiatives. PMID:20299257
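For intuition, a hedged sketch of generating a non-uniform sampling schedule over the three indirect dimensions is shown below; the simple uniform-random selection here is a placeholder, not the sampling scheme actually used in the study.

```python
import numpy as np

def random_nus_schedule(grid_sizes, fraction, seed=0):
    """Pick a random subset of points on the full indirect-dimension grid.

    grid_sizes -- maximum increments in each indirect dimension, e.g. (64, 32, 32)
    fraction   -- fraction of the full grid actually sampled, e.g. 0.02 for 2%
    """
    rng = np.random.default_rng(seed)
    axes = [np.arange(n) for n in grid_sizes]
    full = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, len(grid_sizes))
    n_keep = max(1, int(fraction * len(full)))
    keep = np.sort(rng.choice(len(full), size=n_keep, replace=False))
    return full[keep]

schedule = random_nus_schedule(grid_sizes=(64, 32, 32), fraction=0.02)
print(len(schedule), "of", 64 * 32 * 32, "grid points sampled")
print(schedule[:5])
```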
Toward reliable and repeatable automated STEM-EDS metrology with high throughput
NASA Astrophysics Data System (ADS)
Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii
2018-03-01
New materials and designs with complex 3D architectures in logic and memory devices have increased the complexity of S/TEM metrology. In this paper, we report a newly developed, automated, scanning transmission electron microscopy (STEM) based, energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and present how electron exposure dose impacts EDS metrology reproducibility, either due to poor signal-to-noise ratio (SNR) at low dose or due to sample modification at high dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process in terms of both throughput and metrology reliability.
A high-throughput assay for DNA topoisomerases and other enzymes, based on DNA triplex formation.
Burrell, Matthew R; Burton, Nicolas P; Maxwell, Anthony
2010-01-01
We have developed a rapid, high-throughput assay for measuring the catalytic activity (DNA supercoiling or relaxation) of topoisomerase enzymes that is also capable of monitoring the activity of other enzymes that alter the topology of DNA. The assay utilises intermolecular triplex formation to resolve supercoiled and relaxed forms of DNA, the principle being the greater efficiency of a negatively supercoiled plasmid to form an intermolecular triplex with an immobilised oligonucleotide than the relaxed form. The assay provides a number of advantages over the standard gel-based methods, including greater speed of analysis, reduced sample handling, better quantitation and improved reliability and accuracy of output data. The assay is performed in microtitre plates and can be adapted to high-throughput screening of libraries of potential inhibitors of topoisomerases including bacterial DNA gyrase.
Wetmore, Kelly M.; Price, Morgan N.; Waters, Robert J.; ...
2015-05-12
Transposon mutagenesis with next-generation sequencing (TnSeq) is a powerful approach to annotate gene function in bacteria, but existing protocols for TnSeq require laborious preparation of every sample before sequencing. Thus, the existing protocols are not amenable to the throughput necessary to identify phenotypes and functions for the majority of genes in diverse bacteria. Here, we present a method, random bar code transposon-site sequencing (RB-TnSeq), which increases the throughput of mutant fitness profiling by incorporating random DNA bar codes into Tn5 and mariner transposons and by using bar code sequencing (BarSeq) to assay mutant fitness. RB-TnSeq can be used with any transposon, and TnSeq is performed once per organism instead of once per sample. Each BarSeq assay requires only a simple PCR, and 48 to 96 samples can be sequenced on one lane of an Illumina HiSeq system. We demonstrate the reproducibility and biological significance of RB-TnSeq with Escherichia coli, Phaeobacter inhibens, Pseudomonas stutzeri, Shewanella amazonensis, and Shewanella oneidensis. To demonstrate the increased throughput of RB-TnSeq, we performed 387 successful genome-wide mutant fitness assays representing 130 different bacterium-carbon source combinations and identified 5,196 genes with significant phenotypes across the five bacteria. In P. inhibens, we used our mutant fitness data to identify genes important for the utilization of diverse carbon substrates, including a putative D-mannose isomerase that is required for mannitol catabolism. RB-TnSeq will enable the cost-effective functional annotation of diverse bacteria using mutant fitness profiling. A large challenge in microbiology is the functional assessment of the millions of uncharacterized genes identified by genome sequencing. Transposon mutagenesis coupled to next-generation sequencing (TnSeq) is a powerful approach to assign phenotypes and functions to genes. However, the current strategies for TnSeq are too laborious to be applied to hundreds of experimental conditions across multiple bacteria. Here, we describe an approach, random bar code transposon-site sequencing (RB-TnSeq), which greatly simplifies the measurement of gene fitness by using bar code sequencing (BarSeq) to monitor the abundance of mutants. We performed 387 genome-wide fitness assays across five bacteria and identified phenotypes for over 5,000 genes. RB-TnSeq can be applied to diverse bacteria and is a powerful tool to annotate uncharacterized genes using phenotype data.
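In spirit, the BarSeq readout reduces to counting each bar code before and after growth and converting the change into a fitness value; the toy count table and normalization below are a simplified stand-in for the published fitness pipeline.

```python
import numpy as np
import pandas as pd

# Bar-code read counts at the start (t0) and after growth in a condition (made-up numbers)
counts = pd.DataFrame({
    "gene": ["geneA", "geneA", "geneB", "geneB", "geneC"],
    "t0":   [520,      480,     300,     310,     450],
    "cond": [260,      240,     610,     580,     455],
})

# Per-strain fitness: log2 ratio of condition counts to t0 counts, centred on the median
pseudo = 1
counts["strain_fitness"] = np.log2((counts["cond"] + pseudo) / (counts["t0"] + pseudo))
counts["strain_fitness"] -= counts["strain_fitness"].median()

# Gene fitness: average over the bar-coded insertion strains in that gene
print(counts.groupby("gene")["strain_fitness"].mean().round(2))
```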
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetmore, Kelly M.; Price, Morgan N.; Waters, Robert J.
Improving tritium exposure reconstructions using accelerator mass spectrometry
Hunt, J. R.; Vogel, J. S.; Knezovich, J. P.
2010-01-01
Direct measurement of tritium atoms by accelerator mass spectrometry (AMS) enables rapid low-activity tritium measurements from milligram-sized samples and permits greater ease of sample collection, faster throughput, and increased spatial and/or temporal resolution. Because existing methodologies for quantifying tritium have some significant limitations, the development of tritium AMS has allowed improvements in reconstructing tritium exposure concentrations from environmental measurements and provides an important additional tool in assessing the temporal and spatial distribution of chronic exposure. Tritium exposure reconstructions using AMS were previously demonstrated for a tree growing on known levels of tritiated water and for trees exposed to atmospheric releases of tritiated water vapor. In these analyses, tritium levels were measured from milligram-sized samples with sample preparation times of a few days. Hundreds of samples were analyzed within a few months of sample collection and resulted in the reconstruction of spatial and temporal exposure from tritium releases. Although the current quantification limit of tritium AMS is not adequate to determine natural environmental variations in tritium concentrations, it is expected to be sufficient for studies assessing possible health effects from chronic environmental tritium exposure. PMID:14735274
Puchyr, R F; Bass, D A; Gajewski, R; Calvin, M; Marquardt, W; Urek, K; Druyan, M E; Quig, D
1998-06-01
The preparation of hair for the determination of elements is a critical component of the analysis procedure. Open-beaker, closed-vessel microwave, and flow-through microwave digestion are methods that have been used for sample preparation and are discussed. A new digestion method for use with inductively coupled plasma-mass spectrometry (ICP-MS) has been developed. The method uses 0.2 g of hair and 3 mL of concentrated nitric acid in an atmospheric pressure-low-temperature microwave digestion (APLTMD) system. This preparation method is useful for handling a large number of samples per day and may be adapted to hair sample weights ranging from 0.08 to 0.3 g. After digestion, samples are analyzed by ICP-MS to determine the concentration of Li, Be, B, Na, Mg, Al, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Ge, As, Se, Rb, Sr, Zr, Mo, Pd, Ag, Cd, Sn, Sb, I, Cs, Ba, Pt, Au, Hg, Tl, Pb, Bi, Th, and U. Benefits of the APLTMD include reduced contamination and sample handling, and increased precision, reliability, and sample throughput.
Bereman, Michael S.; Egertson, Jarrett D.; MacCoss, Michael J.
2012-01-01
Filter-aided sample preparation (FASP) and a new sample preparation method using a modified commercial SDS removal spin column are quantitatively compared in terms of their performance for shotgun proteomic experiments in three complex proteomic samples: a Saccharomyces cerevisiae lysate (insoluble fraction), a Caenorhabditis elegans lysate (soluble fraction), and a human embryonic kidney cell line (HEK293T). The characteristics and total number of peptides and proteins identified are compared between the two procedures. The SDS spin column procedure affords a conservative 4-fold improvement in throughput, is more reproducible, is less expensive (i.e., requires fewer materials), and identifies between 30 and 107% more peptides at q ≤ 0.01 than the FASP procedure. The peptides identified by the SDS spin column are more hydrophobic than species identified by the FASP procedure, as indicated by the distribution of GRAVY scores. Ultimately, these improvements translate into as much as a 50% increase in protein identifications with 2 or more peptides. PMID:21656683
Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S
2016-03-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
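Quantification in such a qPCR assay typically runs a sample's Cq against a standard curve of known DNA amounts; the sketch below shows that arithmetic with an invented standard curve, not the assay's actual calibration data.

```python
import numpy as np

# Standard curve: Cq values for known amounts of target DNA (fg); numbers are made up
log10_fg = np.array([4.0, 3.0, 2.0, 1.0, 0.0])
cq       = np.array([22.1, 25.5, 28.9, 32.3, 35.7])
slope, intercept = np.polyfit(log10_fg, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0        # ~1.0 corresponds to 100% amplification efficiency

def fg_from_cq(sample_cq):
    """Interpolate fungal DNA quantity (fg) from a sample Cq using the standard curve."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"amplification efficiency ≈ {efficiency:.2f}")
print(f"sample at Cq 30.0 ≈ {fg_from_cq(30.0):.1f} fg")
```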
Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David
2016-01-01
Charles, Isabel; Sinclair, Ian; Addison, Daniel H
2014-04-01
A new approach to the storage, processing, and interrogation of the quality data for screening samples has improved analytical throughput and confidence and enhanced the opportunities for learning from the accumulating records. The approach has entailed the design, development, and implementation of a database-oriented system, capturing information from the liquid chromatography-mass spectrometry capabilities used for assessing the integrity of samples in AstraZeneca's screening collection. A Web application has been developed to enable the visualization and interactive annotation of the analytical data, monitor the current sample queue, and report the throughput rate. Sample purity and identity are certified automatically on the chromatographic peaks of interest if predetermined thresholds are reached on key parameters. Using information extracted in parallel from the compound registration and container inventory databases, the chromatographic and spectroscopic profiles for each vessel are linked to the sample structures and storage histories. A search engine facilitates the direct comparison of results for multiple vessels of the same or similar compounds, for single vessels analyzed at different time points, or for vessels related by their origin or process flow. Access to this network of information has provided a deeper understanding of the multiple factors contributing to sample quality assurance.
NASA Astrophysics Data System (ADS)
Paiè, Petra; Bassi, Andrea; Bragheri, Francesca; Osellame, Roberto
2017-02-01
Selective plane illumination microscopy (SPIM) is an optical sectioning technique that allows imaging of biological samples at high spatio-temporal resolution. Standard SPIM devices require dedicated set-ups, complex sample preparation and accurate system alignment, thus limiting the automation of the technique, its accessibility and throughput. We present a millimeter-scaled optofluidic device that incorporates selective plane illumination and fully automatic sample delivery and scanning. To this end an integrated cylindrical lens and a three-dimensional fluidic network were fabricated by femtosecond laser micromachining into a single glass chip. This device can upgrade any standard fluorescence microscope to a SPIM system. We used SPIM on a CHIP to automatically scan biological samples under a conventional microscope, without the need of any motorized stage: tissue spheroids expressing fluorescent proteins were flowed in the microchannel at constant speed and their sections were acquired while passing through the light sheet. We demonstrate high-throughput imaging of the entire sample volume (with a rate of 30 samples/min), segmentation and quantification in thick (100-300 μm diameter) cellular spheroids. This optofluidic device gives access to SPIM analyses to non-expert end-users, opening the way to automatic and fast screening of a high number of samples at subcellular resolution.
Fernández, María del Mar Ramírez; Wille, Sarah M R; Samyn, Nele; Wood, Michelle; López-Rivadulla, Manuel; De Boeck, Gert
2009-01-01
An automated online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of amphetamines in blood and urine was developed and validated. Chromatographic separation was achieved on a Nucleodur Sphinx RP column with an LC gradient (a mixture of 10 mM ammonium formate buffer and acetonitrile), ensuring the elution of amphetamine, methamphetamine, MDMA, MDA, MDEA, PMA, and ephedrine within 11 min. The method was fully validated, according to international guidelines, using only 100 and 50 µL of blood and urine, respectively. The method showed an excellent intra- and interassay precision (relative standard deviation < 11.2% and bias < 13%) for two external quality control samples (QC) for both matrices and three and two 'in house' QCs for blood and urine, respectively. Responses were linear over the investigated range (r² > 0.99, 2.5-400 µg/L for blood and 25-1000 µg/L for urine). Limits of quantification were determined to be 2.5 and 25 µg/L for blood and urine, respectively. Limits of detection ranged from 0.05 to 0.5 µg/L for blood and 0.25 to 2.5 µg/L for urine, depending on the compound. Furthermore, the analytes and the processed samples were demonstrated to be stable (in the autosampler for at least 72 h and after three freeze/thaw cycles), and no disturbing matrix effects were observed for all compounds. Moreover, no carryover was observed after the analysis of high concentration samples (15,000 µg/L). The method was subsequently applied to authentic blood and urine samples obtained from forensic cases, which covered a broad range of concentrations. The validation results and actual sample analyses demonstrated that this method is rugged, precise, accurate, and well-suited for routine analysis as more than 72 samples are analyzed non-stop in 24 h with minimum sample handling. The combination of the high-throughput online SPE and the well-known sensitivity and selectivity assured by MS-MS resulted in the elimination of the bottleneck associated with the sample preparation requirements and provided increased sensitivity, accuracy, and precision.
de Muinck, Eric J; Trosvik, Pål; Gilfillan, Gregor D; Hov, Johannes R; Sundaram, Arvind Y M
2017-07-06
Advances in sequencing technologies and bioinformatics have made the analysis of microbial communities almost routine. Nonetheless, the need remains to improve on the techniques used for gathering such data, including increasing throughput while lowering cost and benchmarking the techniques so that potential sources of bias can be better characterized. We present a triple-index amplicon sequencing strategy to sequence large numbers of samples at significantly lower cost and in a shorter timeframe compared to existing methods. The design employs a two-stage PCR protocol, incorporating three barcodes to each sample, with the possibility to add a fourth index. It also includes heterogeneity spacers to overcome low complexity issues faced when sequencing amplicons on Illumina platforms. The library preparation method was extensively benchmarked through analysis of a mock community in order to assess biases introduced by sample indexing, number of PCR cycles, and template concentration. We further evaluated the method through re-sequencing of a standardized environmental sample. Finally, we evaluated our protocol on a set of fecal samples from a small cohort of healthy adults, demonstrating good performance in a realistic experimental setting. Between-sample variation was mainly related to batch effects, such as DNA extraction, while sample indexing was also a significant source of bias. PCR cycle number strongly influenced chimera formation and affected relative abundance estimates of species with high GC content. Libraries were sequenced using the Illumina HiSeq and MiSeq platforms to demonstrate that this protocol is highly scalable to sequence thousands of samples at a very low cost. Here, we provide the most comprehensive study of performance and bias inherent to a 16S rRNA gene amplicon sequencing method to date. Triple-indexing greatly reduces the number of long custom DNA oligos required for library preparation, while the inclusion of variable length heterogeneity spacers minimizes the need for PhiX spike-in. This design results in a significant cost reduction of highly multiplexed amplicon sequencing. The biases we characterize highlight the need for highly standardized protocols. Reassuringly, we find that the biological signal is a far stronger structuring factor than the various sources of bias.
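To illustrate the arithmetic behind the oligo savings (with hypothetical index-set sizes, not the authors' actual design), combinatorial indexing multiplies the sizes of the index sets while the number of indexed oligos only grows additively:

```python
def samples_addressable(index_set_sizes):
    """Number of unique samples addressable by combinatorial indexing."""
    n = 1
    for size in index_set_sizes:
        n *= size
    return n

def oligos_required(index_set_sizes):
    """Indexed oligos needed when barcodes are combined combinatorially
    (assumes one oligo per barcode in each set)."""
    return sum(index_set_sizes)

# Hypothetical design: 16 inner barcodes x 12 i5 indexes x 12 i7 indexes.
sets = [16, 12, 12]
print(samples_addressable(sets), "samples from", oligos_required(sets), "indexed oligos")
# A single-index design would instead need 2304 sample-specific long oligos.
```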
Monolithic amorphous silicon modules on continuous polymer substrate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimmer, D.P.
This report examines manufacturing monolithic amorphous silicon modules on a continuous polymer substrate. Module production costs can be reduced by increasing module performance, expanding production, and improving and modifying production processes. Material costs can be reduced by developing processes that use a 1-mil polyimide substrate and multilayers of low-cost material for the front encapsulant. Research to speed up a-Si and ZnO deposition rates is needed to improve throughputs. To keep throughput rates compatible with depositions, multibeam fiber optic delivery systems for laser scribing can be used. However, mechanical scribing systems promise even higher throughputs. Tandem cells and production experience can increase device efficiency and stability. Two alternative manufacturing processes are described: (1) wet etching and sheet handling and (2) wet etching and roll-to-roll fabrication.
Zhang, Xinglei; Liu, Yan; Zhang, Jinghua; Hu, Zhong; Hu, Bin; Ding, Liying; Jia, Li; Chen, Huanwen
2011-09-15
High throughput analysis of sunscreen agents present in cream cosmetics has been demonstrated, typically 2 samples per minute, using neutral desorption extractive electrospray ionization mass spectrometry (ND-EESI-MS) without sample pretreatment. For targeted compounds such as 4-aminobenzoic acid and oxybenzone, the ND-EESI-MS method provided linear signal responses in the range of 1-100 ppb. Limits of detection (LOD) of the method were estimated at sub-ppb levels for the analytes tested. Reasonable relative standard deviation (RSD = 8.4-16.0%) was obtained as a result of 10 independent measurements for commercial cosmetics samples spiked with each individual sunscreen agent at 1-10 ppb. Acceptable recoveries were achieved in the range of 87-116% for direct analysis of commercial cream cosmetic samples. The experimental data demonstrate that ND-EESI-MS is a useful tool for high throughput screening of sunscreen agents in highly viscous cream cosmetic products, with the capability to obtain quantitative information on the analytes. Copyright © 2011 Elsevier B.V. All rights reserved.
Shibata, Kazuhiro; Itoh, Masayoshi; Aizawa, Katsunori; Nagaoka, Sumiharu; Sasaki, Nobuya; Carninci, Piero; Konno, Hideaki; Akiyama, Junichi; Nishi, Katsuo; Kitsunai, Tokuji; Tashiro, Hideo; Itoh, Mari; Sumi, Noriko; Ishii, Yoshiyuki; Nakamura, Shin; Hazama, Makoto; Nishine, Tsutomu; Harada, Akira; Yamamoto, Rintaro; Matsumoto, Hiroyuki; Sakaguchi, Sumito; Ikegami, Takashi; Kashiwagi, Katsuya; Fujiwake, Syuji; Inoue, Kouji; Togawa, Yoshiyuki; Izawa, Masaki; Ohara, Eiji; Watahiki, Masanori; Yoneda, Yuko; Ishikawa, Tomokazu; Ozawa, Kaori; Tanaka, Takumi; Matsuura, Shuji; Kawai, Jun; Okazaki, Yasushi; Muramatsu, Masami; Inoue, Yorinao; Kira, Akira; Hayashizaki, Yoshihide
2000-01-01
The RIKEN high-throughput 384-format sequencing pipeline (RISA system) including a 384-multicapillary sequencer (the so-called RISA sequencer) was developed for the RIKEN mouse encyclopedia project. The RISA system consists of colony picking, template preparation, sequencing reaction, and the sequencing process. A novel high-throughput 384-format capillary sequencer system (RISA sequencer system) was developed for the sequencing process. This system consists of a 384-multicapillary auto sequencer (RISA sequencer), a 384-multicapillary array assembler (CAS), and a 384-multicapillary casting device. The RISA sequencer can simultaneously analyze 384 independent sequencing products. The optical system is a scanning system chosen after careful comparison with an image detection system for the simultaneous detection of the 384-capillary array. This scanning system can be used with any fluorescent-labeled sequencing reaction (chain termination reaction), including transcriptional sequencing based on RNA polymerase, which was originally developed by us, and cycle sequencing based on thermostable DNA polymerase. For long-read sequencing, 380 out of 384 sequences (99.2%) were successfully analyzed and the average read length, with more than 99% accuracy, was 654.4 bp. A single RISA sequencer can analyze 216 kb with >99% accuracy in 2.7 h (90 kb/h). For short-read sequencing to cluster the 3′ end and 5′ end sequencing by reading 350 bp, 384 samples can be analyzed in 1.5 h. We have also developed a RISA inoculator, RISA filtrator and densitometer, RISA plasmid preparator which can handle throughput of 40,000 samples in 17.5 h, and a high-throughput RISA thermal cycler which has four 384-well sites. The combination of these technologies allowed us to construct the RISA system consisting of 16 RISA sequencers, which can process 50,000 DNA samples per day. One haploid genome shotgun sequence of a higher organism, such as human, mouse, rat, domestic animals, and plants, can be revealed by seven RISA systems within one month. PMID:11076861
Drosten, C.; Seifried, E.; Roth, W. K.
2001-01-01
Screening of blood donors for human immunodeficiency virus type 1 (HIV-1) infection by PCR permits the earlier diagnosis of HIV-1 infection compared with that by serologic assays. We have established a high-throughput reverse transcription (RT)-PCR assay based on 5′-nuclease PCR. By in-tube detection of HIV-1 RNA with a fluorogenic probe, the 5′-nuclease PCR technology (TaqMan PCR) eliminates the risk of carryover contamination, a major problem in PCR testing. We outline the development and evaluation of the PCR assay from a technical point of view. A one-step RT-PCR that targets the gag genes of all known HIV-1 group M isolates was developed. An internal control RNA detectable with a heterologous 5′-nuclease probe was derived from the viral target cDNA and was packaged into MS2 coliphages (Armored RNA). Because the RNA was protected against digestion with RNase, it could be spiked into patient plasma to control the complete sample preparation and amplification process. The assay detected 831 HIV-1 type B genome equivalents per ml of native plasma (95% confidence interval [CI], 759 to 936 HIV-1 B genome equivalents per ml) with a ≥95% probability of a positive result, as determined by probit regression analysis. A detection limit of 1,195 genome equivalents per ml of (individual) donor plasma (95% CI, 1,014 to 1,470 genome equivalents per ml of plasma pooled from individuals) was achieved when 96 samples were pooled and enriched by centrifugation. Up to 4,000 plasma samples per PCR run were tested in a 3-month trial period. Although data from the present pilot feasibility study will have to be complemented by a large clinical validation study, the assay is a promising approach to the high-throughput screening of blood donors and is the first noncommercial test for high-throughput screening for HIV-1. PMID:11724836
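A minimal sketch of the probit-regression step used to derive a 95% detection limit is shown below; the replicate data, coefficients, and use of statsmodels are illustrative assumptions, not the authors' analysis.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

# Hypothetical replicate results: log10(genome equivalents/mL) and hit (1) / miss (0).
log_conc = np.repeat(np.log10([125, 250, 500, 1000, 2000]), 8)
rng = np.random.default_rng(0)
true_p = norm.cdf(-9.0 + 3.2 * log_conc)          # assumed underlying dose-response
detected = rng.binomial(1, true_p)

# Probit model: P(detected) = Phi(b0 + b1 * log10(concentration))
X = sm.add_constant(log_conc)
result = sm.Probit(detected, X).fit(disp=False)
b0, b1 = result.params

# Concentration with >= 95% probability of a positive result.
log_c95 = (norm.ppf(0.95) - b0) / b1
print(f"95% detection limit ~ {10 ** log_c95:.0f} genome equivalents/mL")
```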
Rames, Matthew; Yu, Yadong; Ren, Gang
2014-08-15
Structural determination of proteins is rather challenging for proteins with molecular masses between 40 and 200 kDa. Considering that more than half of natural proteins have a molecular mass between 40 and 200 kDa, a robust and high-throughput method with nanometer resolution capability is needed. Negative staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach that has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts on proteins, especially with lipoproteins, which often present as rouleaux artifacts. By using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation conditions were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution, high-contrast images are even favorable for 3D reconstruction of an individual protein (a single object, no averaging), such as a 160 kDa antibody, through electron tomography. Moreover, OpNS can be a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples. Considering that cryo-EM rarely succeeds in imaging proteins smaller than 200 kDa, and that no cryo-EM study has yet involved screening over one hundred sample conditions, it is fair to call OpNS a high-throughput method for studying small proteins. Hopefully the OpNS protocol presented here can be a useful tool to push the boundaries of EM and accelerate EM studies into small protein structure, dynamics and mechanisms.
Zhang, Guang Lan; Keskin, Derin B.; Lin, Hsin-Nan; Lin, Hong Huang; DeLuca, David S.; Leppanen, Scott; Milford, Edgar L.; Reinherz, Ellis L.; Brusic, Vladimir
2014-01-01
Human leukocyte antigens (HLA) are important biomarkers because multiple diseases, drug toxicity, and vaccine responses reveal strong HLA associations. Current clinical HLA typing is an elimination process requiring serial testing. We present an alternative in situ synthesized DNA-based microarray method that contains hundreds of thousands of probes representing a complete overlapping set covering 1,610 clinically relevant HLA class I alleles accompanied by computational tools for assigning HLA type to 4-digit resolution. Our proof-of-concept experiment included 21 blood samples, 18 cell lines, and multiple controls. The method is accurate, robust, and amenable to automation. Typing errors were restricted to homozygous samples or those with very closely related alleles from the same locus, but readily resolved by targeted DNA sequencing validation of flagged samples. High-throughput HLA typing technologies that are effective, yet inexpensive, can be used to analyze the world’s populations, benefiting both global public health and personalized health care. PMID:25505899
Nuclear Magnetic Resonance Spectroscopy-Based Identification of Yeast.
Himmelreich, Uwe; Sorrell, Tania C; Daniel, Heide-Marie
2017-01-01
Rapid and robust high-throughput identification of environmental, industrial, or clinical yeast isolates is important whenever relatively large numbers of samples need to be processed in a cost-efficient way. Nuclear magnetic resonance (NMR) spectroscopy generates complex data based on metabolite profiles, chemical composition and possibly on medium consumption, which can not only be used for the assessment of metabolic pathways but also for accurate identification of yeast down to the subspecies level. Initial results on NMR-based yeast identification were comparable with conventional and DNA-based identification. Potential advantages of NMR spectroscopy in mycological laboratories include not only accurate identification but also the potential of automated sample delivery, automated analysis using computer-based methods, rapid turnaround time, high throughput, and low running costs. We describe here the sample preparation, data acquisition and analysis for NMR-based yeast identification. In addition, a roadmap for the development of classification strategies is given that will result in the acquisition of a database and analysis algorithms for yeast identification in different environments.
Ovaskainen, Otso; Schigel, Dmitry; Ali-Kovero, Heini; Auvinen, Petri; Paulin, Lars; Nordén, Björn; Nordén, Jenni
2013-01-01
Before the recent revolution in molecular biology, field studies on fungal communities were mostly confined to fruit bodies, whereas mycelial interactions were studied in the laboratory. Here we combine high-throughput sequencing with a fruit body inventory to study simultaneously mycelial and fruit body occurrences in a community of fungi inhabiting dead wood of Norway spruce. We studied mycelial occurrence by extracting DNA from wood samples followed by 454-sequencing of the ITS1 and ITS2 regions and an automated procedure for species identification. In total, we detected 198 species as mycelia and 137 species as fruit bodies. The correlation between mycelial and fruit body occurrences was high for the majority of the species, suggesting that high-throughput sequencing can successfully characterize the dominating fungal communities, despite possible biases related to sampling, PCR, sequencing and molecular identification. We used the fruit body and molecular data to test hypothesized links between life history and population dynamic parameters. We show that the species that have on average a high mycelial abundance also have a high fruiting rate and produce large fruit bodies, leading to a positive feedback loop in their population dynamics. Earlier studies have shown that species with specialized resource requirements are rarely seen fruiting, for which reason they are often classified as red-listed. We show with the help of high-throughput sequencing that some of these species are more abundant as mycelium in wood than what could be expected from their occurrence as fruit bodies. PMID:23575372
Noyes, Aaron; Huffman, Ben; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Sunasara, Khurram; Mukhopadhyay, Tarit
2015-08-01
The biotech industry is under increasing pressure to decrease both time to market and development costs. Simultaneously, regulators are expecting increased process understanding. High throughput process development (HTPD) employs small volumes, parallel processing, and high throughput analytics to reduce development costs and speed the development of novel therapeutics. As such, HTPD is increasingly viewed as integral to improving developmental productivity and deepening process understanding. Particle conditioning steps such as precipitation and flocculation may be used to aid the recovery and purification of biological products. In this first part of two articles, we describe an ultra scale-down (USD) system for high throughput particle conditioning (HTPC) composed of off-the-shelf components. The apparatus comprises a temperature-controlled microplate with magnetically driven stirrers and is integrated with a Tecan liquid handling robot. With this system, 96 individual reaction conditions can be evaluated in parallel, including downstream centrifugal clarification. A comprehensive suite of high throughput analytics enables measurement of product titer, product quality, impurity clearance, clarification efficiency, and particle characterization. HTPC at the 1 mL scale was evaluated with fermentation broth containing a vaccine polysaccharide. The response profile was compared with the pilot-scale performance of a non-geometrically similar, 3 L reactor. An engineering characterization of the reactors and scale-up context examines theoretical considerations for comparing this USD system with larger scale stirred reactors. In the second paper, we will explore application of this system to industrially relevant vaccines and test different scale-up heuristics. © 2015 Wiley Periodicals, Inc.
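To make the scale-up comparison concrete, the sketch below evaluates two criteria commonly used when matching a microwell mimic to a stirred tank, power per unit volume and impeller tip speed; the geometries, speeds, and power numbers are illustrative assumptions, not values from the paper.

```python
import math

def power_per_volume(Np, rho, N, D, V):
    """Ungassed power input per volume for a stirred vessel: P/V = Np*rho*N^3*D^5 / V
    (Np: power number, rho: density kg/m^3, N: rev/s, D: impeller diameter m, V: m^3)."""
    return Np * rho * N**3 * D**5 / V

def tip_speed(N, D):
    """Impeller tip speed: pi * N * D (m/s)."""
    return math.pi * N * D

# Hypothetical 1 mL microwell vs. 3 L reactor (all values illustrative only).
print(power_per_volume(Np=2.0, rho=1000, N=20.0, D=0.004, V=1e-6))   # W/m^3, 1 mL scale
print(power_per_volume(Np=1.5, rho=1000, N=5.0,  D=0.06,  V=3e-3))   # W/m^3, 3 L scale
print(tip_speed(20.0, 0.004), tip_speed(5.0, 0.06))                  # m/s at each scale
```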
Enhanced sensitivity to glucocorticoids in cytarabine-resistant AML.
Malani, D; Murumägi, A; Yadav, B; Kontro, M; Eldfors, S; Kumar, A; Karjalainen, R; Majumder, M M; Ojamies, P; Pemovska, T; Wennerberg, K; Heckman, C; Porkka, K; Wolf, M; Aittokallio, T; Kallioniemi, O
2017-05-01
We sought to identify drugs that could counteract cytarabine resistance in acute myeloid leukemia (AML) by generating eight resistant variants from MOLM-13 and SHI-1 AML cell lines by long-term drug treatment. These cells were compared with 66 ex vivo chemorefractory samples from cytarabine-treated AML patients. The models and patient cells were subjected to genomic and transcriptomic profiling and high-throughput testing with 250 emerging and clinical oncology compounds. Genomic profiling uncovered deletion of the deoxycytidine kinase (DCK) gene in both MOLM-13- and SHI-1-derived cytarabine-resistant variants and in an AML patient sample. Cytarabine-resistant SHI-1 variants and a subset of chemorefractory AML patient samples showed increased sensitivity to glucocorticoids that are often used in treatment of lymphoid leukemia but not AML. Paired samples taken from AML patients before treatment and at relapse also showed acquisition of glucocorticoid sensitivity. Enhanced glucocorticoid sensitivity was only seen in AML patient samples that were negative for the FLT3 mutation (P=0.0006). Our study shows that development of cytarabine resistance is associated with increased sensitivity to glucocorticoids in a subset of AML, suggesting a new therapeutic strategy that should be explored in a clinical trial of chemorefractory AML patients carrying wild-type FLT3.
High-speed atomic force microscopy and peak force tapping control
NASA Astrophysics Data System (ADS)
Hu, Shuiqing; Mininni, Lars; Hu, Yan; Erina, Natalia; Kindt, Johannes; Su, Chanmin
2012-03-01
The ITRS Roadmap requires defect size measurement below 10 nanometers and challenging classifications for both blank and patterned wafers and masks. The atomic force microscope (AFM) is capable of providing 3D metrology at sub-nanometer accuracy but has long suffered from low throughput and the limitation of slow topography imaging without chemical information. This presentation focuses on two disruptive technology developments, namely high-speed AFM and quantitative nanomechanical mapping, which enable high-throughput measurement with the capability of identifying components through concurrent physical property imaging. The high-speed AFM technology has allowed the imaging speed to increase by 10-100 times without loss of data quality. Such improvement enables the speed of defect review on a wafer to increase from a few defects per hour to nearly 100 defects an hour, approaching the requirements of the ITRS Roadmap. Another technology development, Peak Force Tapping, substantially simplified the closed-loop system response, leading to self-optimization for the most challenging sample groups to generate expert-quality data. More importantly, AFM also simultaneously provides a series of mechanical property maps with nanometer spatial resolution during defect review. These nanomechanical maps (including elastic modulus, hardness, and surface adhesion) provide complementary information for elemental analysis, differentiate defect materials by their physical properties, and assist defect classification beyond topographic measurements. This paper will explain the key enabling technologies, namely high-speed tip-scanning AFM using an innovative flexure design and control algorithm. Another critical element is AFM control using Peak Force Tapping, in which the instantaneous tip-sample interaction force is measured and used to derive a full suite of physical properties at each imaging pixel. We will provide examples of defect review data on different wafers and media disks. A similar AFM-based defect review capability was also applied to EUV masks.
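As background on how an elastic modulus map is typically derived from per-pixel force curves, the retract portion of each curve is commonly fitted with the DMT contact model; this is a standard contact-mechanics relation, not a statement of the vendor's exact implementation.

```latex
% DMT contact model commonly used to extract a reduced modulus E* from each
% per-pixel force-deformation curve: F_adh is the adhesion force, R the tip radius,
% d the sample deformation; nu and E are Poisson ratios and Young's moduli.
F - F_{\mathrm{adh}} = \tfrac{4}{3}\, E^{*} \sqrt{R}\, d^{3/2},
\qquad
\frac{1}{E^{*}} = \frac{1-\nu_{s}^{2}}{E_{s}} + \frac{1-\nu_{\mathrm{tip}}^{2}}{E_{\mathrm{tip}}}
```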
Preparation of Protein Samples for NMR Structure, Function, and Small Molecule Screening Studies
Acton, Thomas B.; Xiao, Rong; Anderson, Stephen; Aramini, James; Buchwald, William A.; Ciccosanti, Colleen; Conover, Ken; Everett, John; Hamilton, Keith; Huang, Yuanpeng Janet; Janjua, Haleema; Kornhaber, Gregory; Lau, Jessica; Lee, Dong Yup; Liu, Gaohua; Maglaqui, Melissa; Ma, Lichung; Mao, Lei; Patel, Dayaban; Rossi, Paolo; Sahdev, Seema; Shastry, Ritu; Swapna, G.V.T.; Tang, Yeufeng; Tong, Saichiu; Wang, Dongyan; Wang, Huang; Zhao, Li; Montelione, Gaetano T.
2014-01-01
In this chapter, we concentrate on the production of high quality protein samples for NMR studies. In particular, we provide an in-depth description of recent advances in the production of NMR samples and their synergistic use with recent advancements in NMR hardware. We describe the protein production platform of the Northeast Structural Genomics Consortium, and outline our high-throughput strategies for producing high quality protein samples for nuclear magnetic resonance (NMR) studies. Our strategy is based on the cloning, expression and purification of 6X-His-tagged proteins using T7-based Escherichia coli systems and isotope enrichment in minimal media. We describe 96-well ligation-independent cloning and analytical expression systems, parallel preparative scale fermentation, and high-throughput purification protocols. The 6X-His affinity tag allows for a similar two-step purification procedure implemented in a parallel high-throughput fashion that routinely results in purity levels sufficient for NMR studies (> 97% homogeneity). Using this platform, the protein open reading frames of over 17,500 different targeted proteins (or domains) have been cloned as over 28,000 constructs. Nearly 5,000 of these proteins have been purified to homogeneity in tens of milligram quantities (see Summary Statistics, http://nesg.org/statistics.html), resulting in more than 950 new protein structures, including more than 400 NMR structures, deposited in the Protein Data Bank. The Northeast Structural Genomics Consortium pipeline has been effective in producing protein samples of both prokaryotic and eukaryotic origin. Although this paper describes our entire pipeline for producing isotope-enriched protein samples, it focuses on the major updates introduced during the last 5 years (Phase 2 of the National Institute of General Medical Sciences Protein Structure Initiative). Our advanced automated and/or parallel cloning, expression, purification, and biophysical screening technologies are suitable for implementation in a large individual laboratory or by a small group of collaborating investigators for structural biology, functional proteomics, ligand screening and structural genomics research. PMID:21371586
Zeming, Kerwin Kwek; Salafi, Thoriq; Chen, Chia-Hung; Zhang, Yong
2016-01-01
The deterministic lateral displacement (DLD) method for particle separation in microfluidic devices has been extensively used in recent years due to its high resolution and robust separation. DLD has shown versatility across a wide spectrum of applications, sorting microparticles ranging from parasites and blood cells to bacteria and DNA. The DLD model is designed for spherical particles, so efficient separation of blood cells is challenging due to their non-uniform shape and size. Moreover, separation in the sub-micron regime requires the gap size of DLD systems to be reduced, which exponentially increases the device resistance, resulting in greatly reduced throughput. This paper shows how a simple application of asymmetric DLD gap sizes, obtained by changing the ratio of the lateral gap (GL) to the downstream gap (GD), enables efficient separation of RBCs without greatly restricting throughput. This method reduces the need for challenging fabrication of DLD pillars and provides new insight into the current DLD model. The separation shows an increase in DLD critical diameter resolution (separating smaller particles) and increased selectivity for non-spherical RBCs. The RBCs separate better compared to the standard DLD model with symmetrical gap sizes. This method can be applied to separate non-spherical bacteria or sub-micron particles to enhance throughput and DLD resolution. PMID:26961061
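For orientation, the critical diameter of a conventional symmetric-gap DLD array is often estimated with Davis' empirical correlation; the sketch below applies it with illustrative numbers, noting that the asymmetric lateral/downstream gaps studied here fall outside this symmetric-gap model.

```python
def dld_critical_diameter(gap_um, row_shift_fraction):
    """Davis' empirical correlation for the standard (symmetric-gap) DLD model:
    Dc = 1.4 * G * epsilon**0.48, with G the gap and epsilon the row-shift fraction."""
    return 1.4 * gap_um * row_shift_fraction ** 0.48

# Illustrative values only: a 4-um gap with a 0.1 row-shift fraction.
print(dld_critical_diameter(gap_um=4.0, row_shift_fraction=0.1))  # ~1.85 um
```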
PCR cycles above routine numbers do not compromise high-throughput DNA barcoding results.
Vierna, J; Doña, J; Vizcaíno, A; Serrano, D; Jovani, R
2017-10-01
High-throughput DNA barcoding has become essential in ecology and evolution, but some technical questions still remain. Increasing the number of PCR cycles above the routine 20-30 cycles is a common practice when working with old-type specimens, which provide small amounts of DNA, or when facing annealing issues with the primers. However, increasing the number of cycles can raise the number of artificial mutations due to polymerase errors. In this work, we sequenced 20 COI libraries on the Illumina MiSeq platform. Libraries were prepared with 40, 45, 50, 55, and 60 PCR cycles from four individuals belonging to four species of four genera of cephalopods. We found no relationship between the number of PCR cycles and the number of mutations despite using a nonproofreading polymerase. Moreover, even when using a high number of PCR cycles, the resulting number of mutations was low enough not to be an issue in the context of high-throughput DNA barcoding (but may still remain an issue in DNA metabarcoding due to chimera formation). We conclude that the common practice of increasing the number of PCR cycles should not negatively impact the outcome of a high-throughput DNA barcoding study in terms of the occurrence of point mutations.
NASA Astrophysics Data System (ADS)
House, B. M.; Norris, R. D.
2017-12-01
The Early Eocene Climatic Optimum (EECO) around 50 Ma was a sustained period of extreme global warmth with ocean bottom water temperatures of up to 12 °C. The marine biologic response to such climatic extremes is unclear, however, in part because proxies that integrate ecosystem-wide productivity signals are scarce. While the accumulation of marine barite (BaSO4) is one such proxy, its applicability has remained limited due to the difficulty in reliably quantifying barite. Discrete measurements of barite content in marine sediments are laborious, and indirect estimates provide unclear results. We have developed a fast, high-throughput method for reliable measurement of barite content that relies on selective extraction of barite rather than sample digestion and quantification of remaining barite. Tests of the new method reveal that it gives the expected results for a wide variety of sediment types and can quantitatively extract 10-100 times the amount of barite typically encountered in natural sediments. Altogether, our method provides an estimated ten-fold increase in analysis efficiency over current sample digestion methods and also works reliably on small (1 g or less) sediment samples. Furthermore, the instrumentation requirements of this method are minor, so samples can be analyzed in shipboard labs to generate real-time paleoproductivity records during coring expeditions. Because of the magnitude of throughput improvement, this new technique will permit the generation of large datasets needed to address previously intractable paleoclimate and paleoceanographic questions. One such question is how export productivity changes during climatic extremes. We used our new method to analyze globally distributed sediment cores to determine if the EECO represented a period of anomalous export productivity either due to higher rates of primary production or more vigorous heterotrophic metabolisms. An increase in export productivity could provide a mechanism for exiting periods of extreme warmth, and understanding the interplay between temperature, atmospheric CO2 levels, and export productivity during the EECO will help clarify how the marine biologic system functions as a whole.
Role of microextraction sampling procedures in forensic toxicology.
Barroso, Mário; Moreno, Ivo; da Fonseca, Beatriz; Queiroz, João António; Gallardo, Eugenia
2012-07-01
The last two decades have provided analysts with more sensitive technology, enabling scientists from all analytical fields to see what they were not able to see just a few years ago. This increased sensitivity has allowed drug detection at very low concentrations and testing in unconventional samples (e.g., hair, oral fluid and sweat), which, despite their low analyte concentrations, have also led to a reduction in sample size. Along with this reduction, and as a result of the use of excessive amounts of potentially toxic organic solvents (with the subsequent environmental pollution and costs associated with their proper disposal), there has been a growing tendency to use miniaturized sampling techniques. These sampling procedures reduce organic solvent consumption to a minimum and at the same time provide a rapid, simple and cost-effective approach. In addition, it is possible to achieve at least some degree of automation when using these techniques, which will enhance sample throughput. These miniaturized sample preparation techniques may be roughly categorized into solid-phase and liquid-phase microextraction, depending on the nature of the analyte. This paper reviews recently published literature on the use of microextraction sampling procedures, with a special focus on the field of forensic toxicology.
PHILIS (PORTABLE HIGH-THROUGHPUT INTEGRATED LABORATORY IDENTIFICATION SYSTEM)
These mobile laboratory assets, for the on-site analysis of chemical warfare agent (CWA) and toxic industrial compound (TIC) contaminated environmental samples, are part of the evolving Environmental Response Laboratory Network (ERLN).
L-edge spectroscopy of dilute, radiation-sensitive systems using a transition-edge-sensor array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Titus, Charles J.; Baker, Michael L.; Lee, Sang Jun
Here, we present X-ray absorption spectroscopy and resonant inelastic X-ray scattering (RIXS) measurements on the iron L-edge of 0.5 mM aqueous ferricyanide. These measurements then demonstrate the ability of high-throughput transition-edge-sensor (TES) spectrometers to access the rich soft X-ray (100–2000 eV) spectroscopy regime for dilute and radiation-sensitive samples. Our low-concentration data are in agreement with high-concentration measurements recorded by grating spectrometers. These results show that soft-X-ray RIXS spectroscopy acquired by high-throughput TES spectrometers can be used to study the local electronic structure of dilute metal-centered complexes relevant to biology, chemistry, and catalysis. In particular, TES spectrometers have a unique ability to characterize frozen solutions of radiation- and temperature-sensitive samples.
Tran, Thi-Nguyen-Ny; Signoli, Michel; Fozzati, Luigi; Aboudharam, Gérard; Raoult, Didier; Drancourt, Michel
2011-03-10
Historical records suggest that multiple burial sites from the 14th-16th centuries in Venice, Italy, were used during the Black Death and subsequent plague epidemics. High throughput, multiplexed real-time PCR detected DNA of seven highly transmissible pathogens in 173 dental pulp specimens collected from 46 graves. Bartonella quintana DNA was identified in five (2.9%) samples, including three from the 16th century and two from the 15th century, and Yersinia pestis DNA was detected in three (1.7%) samples, including two from the 14th century and one from the 16th century. Partial glpD gene sequencing indicated that the detected Y. pestis was the Orientalis biotype. These data document for the first time successive plague epidemics in the medieval European city where quarantine was first instituted in the 14th century.
Shen, Shaofei; Ma, Chao; Zhao, Lei; Wang, Yaolei; Wang, Jian-Chun; Xu, Juan; Li, Tianbao; Pang, Long; Wang, Jinyi
2014-07-21
The presence and quantity of rare cells in the bloodstream of cancer patients provide a potentially accessible source for the early detection of invasive cancer and for monitoring the treatment of advanced diseases. The separation of rare cells from peripheral blood, as a "virtual and real-time liquid biopsy", is expected to replace conventional tissue biopsies of metastatic tumors for therapy guidance. However, technical obstacles, similar to looking for a needle in a haystack, have hindered the broad clinical utility of this method. In this study, we developed a multistage microfluidic device for continuous label-free separation and enrichment of rare cells from blood samples based on cell size and deformability. We successfully separated tumor cells (MCF-7 and HeLa cells) and leukemic (K562) cells spiked in diluted whole blood using a unique complementary combination of inertial microfluidics and steric hindrance in a microfluidic system. The processing parameters of the inertial focusing and steric hindrance regions were optimized to achieve high-throughput and high-efficiency separation, significant advantages compared with existing rare cell isolation technologies. The results from experiments with rare cells spiked in 1% hematocrit blood indicated >90% cell recovery at a throughput of 2.24 × 10⁷ cells min⁻¹. The enrichment of rare cells was >2.02 × 10⁵-fold. Thus, this microfluidic system driven by purely hydrodynamic forces has practical potential to be applied either alone or as a sample preparation platform for fundamental studies and clinical applications.
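The recovery and enrichment figures follow from simple ratios of spiked-in counts; the sketch below shows the arithmetic with illustrative numbers chosen only to mimic the reported orders of magnitude, not the actual experimental counts.

```python
def recovery(target_out, target_in):
    """Fraction of spiked rare cells recovered at the outlet."""
    return target_out / target_in

def enrichment_factor(target_in, background_in, target_out, background_out):
    """Ratio of target-to-background purity after separation versus before."""
    return (target_out / background_out) / (target_in / background_in)

# Hypothetical spike-in experiment (counts are illustrative only).
print(recovery(920, 1000))                           # 0.92, i.e. >90% recovery
print(enrichment_factor(1000, 5.0e8, 920, 2.3e3))    # ~2e5-fold enrichment
```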
High-Throughput Sequencing: A Roadmap Toward Community Ecology
Poisot, Timothée; Péquin, Bérangère; Gravel, Dominique
2013-01-01
High-throughput sequencing is becoming increasingly important in microbial ecology, yet it is surprisingly under-used to generate or test biogeographic hypotheses. In this contribution, we highlight how adding these methods to the ecologist toolbox will allow the detection of new patterns, and will help our understanding of the structure and dynamics of diversity. Starting with a review of ecological questions that can be addressed, we move on to the technical and analytical issues that will benefit from an increased collaboration between different disciplines. PMID:23610649
Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K
2018-01-01
The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with highest product purity requiring suboptimal values for other criteria, but also allowed for AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.
Alterman, Julia F; Coles, Andrew H; Hall, Lauren M; Aronin, Neil; Khvorova, Anastasia; Didiot, Marie-Cécile
2017-08-20
Primary neurons represent an ideal cellular system for the identification of therapeutic oligonucleotides for the treatment of neurodegenerative diseases. However, due to the sensitive nature of primary cells, the transfection of small interfering RNAs (siRNA) using classical methods is laborious and often shows low efficiency. Recent progress in oligonucleotide chemistry has enabled the development of stabilized and hydrophobically modified small interfering RNAs (hsiRNAs). This new class of oligonucleotide therapeutics shows extremely efficient self-delivery properties and supports potent and durable effects in vitro and in vivo. We have developed a high-throughput in vitro assay to identify and test hsiRNAs in primary neuronal cultures. To simply, rapidly, and accurately quantify the mRNA silencing of hundreds of hsiRNAs, we use the QuantiGene 2.0 quantitative gene expression assay. This high-throughput, 96-well plate-based assay can quantify mRNA levels directly from sample lysate. Here, we describe a method to prepare short-term cultures of mouse primary cortical neurons in a 96-well plate format for high-throughput testing of oligonucleotide therapeutics. This method supports the testing of hsiRNA libraries and the identification of potential therapeutics within just two weeks. We detail methodologies of our high throughput assay workflow from primary neuron preparation to data analysis. This method can help identify oligonucleotide therapeutics for treatment of various neurological diseases.
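A typical way to reduce such plate-based readouts to percent mRNA remaining is to normalize the target signal to a housekeeping gene and then to untreated control wells; the sketch below is a generic illustration with hypothetical signal values, not the authors' analysis script.

```python
import numpy as np

def percent_mrna_remaining(target_signal, housekeeping_signal,
                           control_target, control_housekeeping):
    """Normalize target mRNA signal to a housekeeping gene, then express it as a
    percentage of the mean of untreated-control wells."""
    treated = np.asarray(target_signal, float) / np.asarray(housekeeping_signal, float)
    control = np.mean(np.asarray(control_target, float) /
                      np.asarray(control_housekeeping, float))
    return 100.0 * treated / control

# Illustrative raw readouts for three hsiRNA-treated wells and three control wells.
print(percent_mrna_remaining([1200, 800, 400], [5000, 5100, 4900],
                             [2500, 2600, 2400], [5000, 5050, 4950]))
```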
Activity and diversity of methane-oxidizing bacteria along a Norwegian sub-Arctic glacier forefield.
Mateos-Rivera, Alejandro; Øvreås, Lise; Wilson, Bryan; Yde, Jacob C; Finster, Kai W
2018-05-01
Methane (CH4) is one of the most abundant greenhouse gases in the atmosphere and identification of its sources and sinks is crucial for the reliability of climate model outputs. Although CH4 production and consumption rates have been reported from a broad spectrum of environments, data obtained from glacier forefields are restricted to a few locations. We report the activities of methanotrophic communities and their diversity along a chronosequence in front of a sub-Arctic glacier using high-throughput sequencing and gas flux measurements. CH4 oxidation rates were measured in the field throughout the growing season during three sampling times at eight different sampling points in combination with laboratory incubation experiments. The overall results showed that the methanotrophic community had similar trends of increased CH4 consumption and increased abundance as a function of soil development and time of year. Sequencing results revealed that the methanotrophic community was dominated by a few OTUs and that a short-term increase in CH4 concentration, as performed in the field measurements, slightly altered the relative abundance of the OTUs.
A 0.13-µm implementation of 5 Gb/s and 3-mW folded parallel architecture for AES algorithm
NASA Astrophysics Data System (ADS)
Rahimunnisa, K.; Karthigaikumar, P.; Kirubavathy, J.; Jayakumar, J.; Kumar, S. Suresh
2014-02-01
A new architecture for encrypting and decrypting confidential data using the Advanced Encryption Standard (AES) algorithm is presented in this article. This structure combines a folded structure with a parallel architecture to increase throughput. The whole architecture achieves high throughput with low power. The proposed architecture is implemented in 0.13-µm complementary metal-oxide-semiconductor (CMOS) technology. The proposed structure is compared with different existing structures, and the results show that it delivers higher throughput and lower power consumption than existing designs.
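For context on how such throughput figures relate to the underlying datapath, the sustained throughput of a block-cipher core follows the generic relation below; the 128-bit block size is fixed by AES, while the clock frequency and cycles per block in the worked example are hypothetical, not figures reported by the authors.

```latex
% Generic block-cipher throughput relation with a hypothetical worked example.
\text{Throughput} = \frac{N_{\text{block}} \cdot f_{\text{clk}}}{N_{\text{cycles/block}}}
\quad\Rightarrow\quad
\frac{128\,\text{bit} \times 156\,\text{MHz}}{4\,\text{cycles/block}} \approx 5\,\text{Gb/s}
```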
High-Throughput Incubation and Quantification of Agglutination Assays in a Microfluidic System.
Castro, David; Conchouso, David; Kodzius, Rimantas; Arevalo, Arpys; Foulds, Ian G
2018-06-04
In this paper, we present a two-phase microfluidic system capable of incubating and quantifying microbead-based agglutination assays. The microfluidic system is based on a simple fabrication solution, which requires only laboratory tubing filled with carrier oil, driven by negative pressure using a syringe pump. We provide a user-friendly interface, in which a pipette is used to insert single droplets of a 1.25-µL volume into a system that is continuously running and therefore works entirely on demand without the need for stopping, resetting or washing the system. These assays are incubated by highly efficient passive mixing with a sample-to-answer time of 2.5 min, a 5-10-fold improvement over traditional agglutination assays. We study system parameters such as channel length, incubation time and flow speed to select optimal assay conditions, using the streptavidin-biotin interaction as a model analyte quantified using optical image processing. We then investigate the effect of changing both analyte and microbead concentrations, with a minimum detection limit of 100 ng/mL. The system can be both low- and high-throughput, depending on the rate at which assays are inserted. In our experiments, we were able to easily produce throughputs of 360 assays per hour by simple manual pipetting, which could be increased even further by automation and parallelization. Agglutination assays are a versatile tool, capable of detecting an ever-growing catalog of infectious diseases, proteins and metabolites. A system such as this one is a step towards being able to produce high-throughput microfluidic diagnostic solutions with widespread adoption. The development of analytical techniques in the microfluidic format, such as the one presented in this work, is an important step in being able to continuously monitor the performance and microfluidic outputs of organ-on-chip devices.
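The quoted throughput is set by the droplet insertion rate rather than the 2.5-min incubation time, since many droplets incubate in the tubing simultaneously; the sketch below illustrates this with an assumed 10 s insertion interval (an assumption consistent with, but not stated in, the abstract).

```python
def assays_per_hour(insertion_interval_s):
    """Steady-state throughput is set by how often a new droplet is inserted."""
    return 3600.0 / insertion_interval_s

def droplets_in_flight(insertion_interval_s, residence_time_s=150.0):
    """Approximate number of assays incubating in the tubing at once,
    assuming a 2.5-min (150 s) residence time."""
    return residence_time_s / insertion_interval_s

print(assays_per_hour(10.0), droplets_in_flight(10.0))   # 360 assays/h, ~15 in flight
```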
High-throughput method to predict extrusion pressure of ceramic pastes.
Cao, Kevin; Liu, Yang; Tucker, Christopher; Baumann, Michael; Grit, Grote; Lakso, Steven
2014-04-14
A new method was developed to measure the rheology of extrudable ceramic pastes using a Hamilton MicroLab Star liquid handler. The Hamilton instrument, normally used for high throughput liquid processing, was expanded to function as a low pressure capillary rheometer. Diluted ceramic pastes were forced through the modified pipettes, which produced pressure drop data that were converted to standard rheology data. A known ceramic paste containing cellulose ether was made and diluted to various concentrations in water. The most dilute paste samples were tested in the Hamilton instrument and the more typical, highly concentrated ceramic pastes were tested with a hydraulic ram extruder fitted with a capillary die and pressure measurement system. The rheology data from this study indicate that the dilute high throughput method using the Hamilton instrument correlates to, and can predict, the rheology of concentrated ceramic pastes normally used in ceramic extrusion production processes.
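For reference, converting a measured pressure drop into rheological quantities in a capillary geometry typically uses the standard relations below; this is a textbook sketch (the Rabinowitsch correction needed for strongly non-Newtonian pastes is omitted), not the specific data treatment used in the study.

```latex
% Standard capillary rheometry relations: Delta P is the pressure drop across a
% capillary of radius R and length L at volumetric flow rate Q.
\tau_w = \frac{\Delta P \, R}{2 L}, \qquad
\dot{\gamma}_{\mathrm{app}} = \frac{4 Q}{\pi R^{3}}, \qquad
\eta_{\mathrm{app}} = \frac{\tau_w}{\dot{\gamma}_{\mathrm{app}}}
```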
Shot-noise limited throughput of soft x-ray ptychography for nanometrology applications
NASA Astrophysics Data System (ADS)
Koek, Wouter; Florijn, Bastiaan; Bäumer, Stefan; Kruidhof, Rik; Sadeghian, Hamed
2018-03-01
Due to its potential for high resolution and three-dimensional imaging, soft x-ray ptychography has received interest for nanometrology applications. We have analyzed the measurement time per unit area when using soft x-ray ptychography for various nanometrology applications including mask inspection and wafer inspection, and are thus able to predict (order of magnitude) throughput figures. Here we show that for a typical measurement system, using a typical sampling strategy, and when aiming for 10-15 nm resolution, it is expected that a wafer-based topology (2.5D) measurement takes approximately 4 minutes per μm², and a full three-dimensional measurement takes roughly 6 hours per μm². Due to their much higher reflectivity, EUV masks can be measured considerably faster; a measurement speed of 0.1 seconds per μm² is expected. However, such speeds do not allow for full wafer or mask inspection at industrially relevant throughput.
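The scaling behind these estimates follows from shot-noise-limited counting statistics; the relation below is a general sketch with symbols defined here rather than taken from the paper, but it shows why a higher-reflectivity target such as an EUV mask can be measured faster.

```latex
% For a shot-noise-limited measurement, SNR ~ sqrt(N_detected), so the dwell time
% per unit area scales as SNR^2 / (Phi * R * eta), where Phi is the incident photon
% flux density, R the sample reflectivity, and eta the detection efficiency.
\mathrm{SNR} \approx \sqrt{N}, \qquad
t_{\text{per area}} \;\propto\; \frac{\mathrm{SNR}^{2}}{\Phi \, R \, \eta}
```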
High-throughput investigation of catalysts for JP-8 fuel cracking to liquefied petroleum gas.
Bedenbaugh, John E; Kim, Sungtak; Sasmaz, Erdem; Lauterbach, Jochen
2013-09-09
Portable power technologies for military applications necessitate the production of fuels similar to LPG from existing feedstocks. Catalytic cracking of military jet fuel to form a mixture of C₂-C₄ hydrocarbons was investigated using high-throughput experimentation. Cracking experiments were performed in a gas-phase, 16-sample high-throughput reactor. Zeolite ZSM-5 catalysts with low Si/Al ratios (≤25) demonstrated the highest production of C₂-C₄ hydrocarbons at moderate reaction temperatures (623-823 K). ZSM-5 catalysts were optimized for JP-8 cracking activity to LPG through varying reaction temperature and framework Si/Al ratio. The reducing atmosphere required during catalytic cracking resulted in coking of the catalyst and a commensurate decrease in conversion rate. Rare earth metal promoters for ZSM-5 catalysts were screened to reduce coking deactivation rates, while noble metal promoters reduced onset temperatures for coke burnoff regeneration.
Zhong, Qing; Rüschoff, Jan H.; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J.; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J.; Rupp, Niels J.; Fankhauser, Christian; Buhmann, Joachim M.; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A.; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C.; Jochum, Wolfram; Wild, Peter J.
2016-01-01
Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (ISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility. PMID:27052161
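As a simplified illustration of how per-nucleus ISH signal counts can be turned into a copy-number call (hypothetical counts and cut-offs; validated assays use probe-specific, clinically calibrated thresholds):

```python
import numpy as np

def cnv_ratio(target_spots, reference_spots):
    """Mean target/reference signal ratio across scored nuclei (dual-colour ISH)."""
    t = np.asarray(target_spots, float)
    r = np.asarray(reference_spots, float)
    return np.mean(t / r)

def classify(ratio, del_cutoff=0.8, amp_cutoff=2.0):
    """Hypothetical cut-offs for calling deletion or amplification."""
    if ratio < del_cutoff:
        return "deletion"
    if ratio > amp_cutoff:
        return "amplification"
    return "normal"

ratio = cnv_ratio([1, 1, 2, 1, 0], [2, 2, 2, 2, 2])   # illustrative spot counts
print(round(ratio, 2), classify(ratio))               # 0.5 -> "deletion"
```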
High-throughput screening of chemical effects on ...
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples on steroidogenesis via HPLC-MS/MS quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a three stage screening strategy. The first stage established the maximum tolerated concentration (MTC; >70% viability) per sample. The second stage quantified changes in hormone levels at the MTC while the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were pre-stimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2,060 chemical samples evaluated, 524 samples were selected for six-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into five distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A d
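Concentration-response screening of this kind typically fits a four-parameter Hill model to each chemical-hormone pair; the sketch below is a generic curve fit with synthetic data, not the study's actual analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, bottom, ac50, slope):
    """Four-parameter Hill model commonly used for concentration-response data."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

# Hypothetical 6-point concentration-response for one hormone (percent of control).
conc = np.array([0.1, 0.4, 1.2, 3.7, 11.1, 33.3])        # micromolar
response = np.array([98, 95, 80, 52, 25, 12], dtype=float)

params, _ = curve_fit(hill, conc, response,
                      p0=[100.0, 10.0, 3.0, 1.0], maxfev=10000)
top, bottom, ac50, slope = params
print(f"AC50 ~ {ac50:.2f} uM, Hill slope ~ {slope:.2f}")
```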
Genotype Imputation with Millions of Reference Samples
Browning, Brian L.; Browning, Sharon R.
2016-01-01
We present a genotype imputation method that scales to millions of reference samples. The imputation method, based on the Li and Stephens model and implemented in Beagle v.4.1, is parallelized and memory efficient, making it well suited to multi-core computer processors. It achieves fast, accurate, and memory-efficient genotype imputation by restricting the probability model to markers that are genotyped in the target samples and by performing linear interpolation to impute ungenotyped variants. We compare Beagle v.4.1 with Impute2 and Minimac3 by using 1000 Genomes Project data, UK10K Project data, and simulated data. All three methods have similar accuracy but different memory requirements and different computation times. When imputing 10 Mb of sequence data from 50,000 reference samples, Beagle’s throughput was more than 100× greater than Impute2’s throughput on our computer servers. When imputing 10 Mb of sequence data from 200,000 reference samples in VCF format, Minimac3 consumed 26× more memory per computational thread and 15× more CPU time than Beagle. We demonstrate that Beagle v.4.1 scales to much larger reference panels by performing imputation from a simulated reference panel having 5 million samples and a mean marker density of one marker per four base pairs. PMID:26748515
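The linear-interpolation step mentioned above can be pictured with the minimal sketch below. It is not Beagle's implementation; it only illustrates the idea of placing an ungenotyped variant's imputed value between its two flanking genotyped markers according to position.

```python
# Sketch of interpolation between flanking genotyped markers (simplified, not Beagle code).
# Given estimated allele dosages at the two flanking genotyped markers, the dosage at
# an ungenotyped variant is interpolated from its relative position between them.
def interpolate_dosage(pos, left_pos, right_pos, left_dosage, right_dosage):
    if right_pos == left_pos:
        return left_dosage
    w = (pos - left_pos) / (right_pos - left_pos)   # 0 at left marker, 1 at right marker
    return (1 - w) * left_dosage + w * right_dosage

# Ungenotyped variant at 10,500 bp between genotyped markers at 10,000 and 12,000 bp:
print(interpolate_dosage(10_500, 10_000, 12_000, left_dosage=0.1, right_dosage=1.6))
```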
Knaack, Jennifer S; Zhou, Yingtao; Abney, Carter W; Prezioso, Samantha M; Magnuson, Matthew; Evans, Ronald; Jakubowski, Edward M; Hardy, Katelyn; Johnson, Rudolph C
2012-11-20
We have developed a novel immunomagnetic scavenging technique for extracting cholinesterase inhibitors from aqueous matrixes using biological targeting and antibody-based extraction. The technique was characterized using the organophosphorus nerve agent VX. The limit of detection for VX in high-performance liquid chromatography (HPLC)-grade water, defined as the lowest calibrator concentration, was 25 pg/mL in a small, 500 μL sample. The method was characterized over the course of 22 sample sets containing calibrators, blanks, and quality control samples. Method precision, expressed as the mean relative standard deviation, was less than 9.2% for all calibrators. Quality control sample accuracy was 102% and 100% of the mean for VX spiked into HPLC-grade water at concentrations of 2.0 and 0.25 ng/mL, respectively. This method was successfully applied to aqueous extracts from soil, hamburger, and finished tap water spiked with VX. Recovery was 65%, 81%, and 100% from these matrixes, respectively. Biologically based extractions of organophosphorus compounds represent a new technique for sample extraction that provides an increase in extraction specificity and sensitivity.
Ogden, Samantha J; Horton, Jeffrey K; Stubbs, Simon L; Tatnell, Peter J
2015-01-01
The 1.2 mm Electric Coring Tool (e-Core™) was developed to increase the throughput of FTA™ sample collection cards used during forensic workflows and is similar to a 1.2 mm Harris manual micro-punch for sampling dried blood spots. Direct short tandem repeat (STR) DNA profiling was used to compare samples taken by the e-Core tool with those taken by the manual micro-punch. The performance of the e-Core device was evaluated using a commercially available PowerPlex™ 18D STR System. In addition, an analysis was performed that investigated the potential carryover of DNA via the e-Core punch from one FTA disc to another. This contamination study was carried out using Applied Biosystems AmpFLSTR™ Identifiler™ Direct PCR Amplification kits. The e-Core instrument does not contaminate FTA discs when a cleaning punch is used following excision of discs containing samples, and it generates STR profiles that are comparable to those generated by the manual micro-punch. © 2014 American Academy of Forensic Sciences.
Yin, Weiwei; Garimalla, Swetha; Moreno, Alberto; Galinski, Mary R; Styczynski, Mark P
2015-08-28
There are increasing efforts to bring high-throughput systems biology techniques to bear on complex animal model systems, often with a goal of learning about underlying regulatory network structures (e.g., gene regulatory networks). However, complex animal model systems typically have significant limitations on cohort sizes, number of samples, and the ability to perform follow-up and validation experiments. These constraints are particularly problematic for many current network learning approaches, which require large numbers of samples and may predict many more regulatory relationships than actually exist. Here, we test the idea that by leveraging the accuracy and efficiency of classifiers, we can construct high-quality networks that capture important interactions between variables in datasets with few samples. We start from a previously-developed tree-like Bayesian classifier and generalize its network learning approach to allow for arbitrary depth and complexity of tree-like networks. Using four diverse sample networks, we demonstrate that this approach performs consistently better at low sample sizes than the Sparse Candidate Algorithm, a representative approach for comparison because it is known to generate Bayesian networks with high positive predictive value. We develop and demonstrate a resampling-based approach to enable the identification of a viable root for the learned tree-like network, important for cases where the root of a network is not known a priori. We also develop and demonstrate an integrated resampling-based approach to the reduction of variable space for the learning of the network. Finally, we demonstrate the utility of this approach via the analysis of a transcriptional dataset of a malaria challenge in a non-human primate model system, Macaca mulatta, suggesting the potential to capture indicators of the earliest stages of cellular differentiation during leukopoiesis. We demonstrate that by starting from effective and efficient approaches for creating classifiers, we can identify interesting tree-like network structures with significant ability to capture the relationships in the training data. This approach represents a promising strategy for inferring networks with high positive predictive value under the constraint of small numbers of samples, meeting a need that will only continue to grow as more high-throughput studies are applied to complex model systems.
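The resampling-based root identification can be sketched roughly as follows. This is not the authors' procedure: `learn_tree_root` is a placeholder for whatever tree-like network learner is used, and the bootstrap-and-vote scheme and the toy learner are assumptions for illustration.

```python
# Hedged sketch: learn a tree-like network on bootstrap resamples and vote on the
# most frequently selected root variable.
import random
from collections import Counter

def pick_root(samples, learn_tree_root, n_resamples=100, seed=0):
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_resamples):
        boot = [rng.choice(samples) for _ in samples]   # resample with replacement
        counts[learn_tree_root(boot)] += 1
    root, votes = counts.most_common(1)[0]
    return root, votes / n_resamples                    # candidate root and its support

# Toy usage: a dummy learner that "selects" the variable with the largest variance.
def dummy_learner(data):
    import statistics
    n_vars = len(data[0])
    return max(range(n_vars), key=lambda j: statistics.pvariance([row[j] for row in data]))

toy = [(random.gauss(0, 1), random.gauss(0, 3)) for _ in range(50)]
print(pick_root(toy, dummy_learner))
```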
Davletbaeva, Polina; Chocholouš, Petr; Bulatov, Andrey; Šatínský, Dalibor; Solich, Petr
2017-09-05
Sequential Injection Chromatography (SIC) evolved from fast, automated, non-separation Sequential Injection Analysis (SIA) into a chromatographic separation method for multi-analyte analysis. However, the chromatographic step significantly reduces measurement speed (sample throughput). In this paper, a sub-1-min separation on a medium-polarity cyano monolithic column (5 mm × 4.6 mm) provided a fast and green separation with a sample throughput comparable to non-separation flow methods. Three synthetic water-soluble dyes (sunset yellow FCF, carmoisine and green S) were separated in gradient elution mode (0.02% ammonium acetate, pH 6.7 - water) at a flow rate of 3.0 mL min -1 , corresponding to a sample throughput of 30 h -1 . Spectrophotometric detection wavelengths were set to 480, 516 and 630 nm with a 10 Hz data collection rate. The separation performance was characterized by peak capacities of 3.48-7.67, peak symmetries of 1.72-1.84 and resolutions of 1.42-1.88. Validation parameters were: LODs of 0.15-0.35 mg L -1 , LOQs of 0.50-1.25 mg L -1 , calibration ranges of 0.50-150.00 mg L -1 (r > 0.998) and repeatability at 10.0 mg L -1 of RSD ≤ 0.98% (n = 6). The method was used to determine the dyes in a "forest berries" colored pharmaceutical cough-cold formulation. The sample matrix (pharmaceutical ingredients and excipients) did not interfere with the visible-range determination because it was not retained on the separation column and was colorless. The results proved the concept of a fast and green chromatography approach using a very short, medium-polarity monolithic column in SIC. Copyright © 2017 Elsevier B.V. All rights reserved.
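The abstract does not state how the LODs and LOQs were derived; one common convention, assumed here purely for illustration, estimates them from the calibration regression as 3.3·s/slope and 10·s/slope, where s is the residual standard deviation of the fit. The data in the sketch are synthetic, not the paper's.

```python
# Illustrative LOD/LOQ estimation from a linear calibration fit (assumed convention).
import numpy as np

def lod_loq(concentrations, responses):
    slope, intercept = np.polyfit(concentrations, responses, 1)
    residuals = responses - (slope * np.asarray(concentrations) + intercept)
    s = residuals.std(ddof=2)                 # residual standard deviation of the fit
    return 3.3 * s / slope, 10 * s / slope    # (LOD, LOQ) in concentration units

conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 150.0])    # mg L -1, synthetic calibrators
resp = 0.02 * conc + np.random.default_rng(1).normal(0, 0.01, conc.size)
print(lod_loq(conc, resp))
```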
Ramanathan, Ragu; Ghosal, Anima; Ramanathan, Lakshmi; Comstock, Kate; Shen, Helen; Ramanathan, Dil
2018-05-01
To evaluate HPLC-high-resolution mass spectrometry (HPLC-HRMS) full-scan acquisition with polarity switching for increasing the throughput of a human in vitro cocktail drug-drug interaction assay, microsomal incubates were analyzed using a high-resolution, high-mass-accuracy Q-Exactive mass spectrometer to collect integrated qualitative and quantitative (qual/quant) data. Within the assay, the positive-to-negative polarity-switching HPLC-HRMS method allowed quantification of eight and two probe compounds in the positive and negative ionization modes, respectively, while monitoring for LOR and its metabolites. LOR inhibited CYP2C19 and showed higher activity for CYP2D6, CYP2E1 and CYP3A4. Overall, LC-HRMS-based nontargeted full-scan quantitation improved the throughput of the in vitro cocktail drug-drug interaction assay.
System for high throughput water extraction from soil material for stable isotope analysis of water
USDA-ARS?s Scientific Manuscript database
A major limitation in the use of stable isotopes of water in ecological studies is the time required to extract water from soil and plant samples. Using vacuum distillation, the extraction time can be less than one hour per sample. Therefore, assembling a distillation system that can process m...
USDA-ARS?s Scientific Manuscript database
To assess the scope of virus disease problems of soybean in Ohio, USA, a survey was conducted during 2011 and 2012 soybean growing seasons. A total of 259 samples were collected from 80 soybean fields distributed in 42 Ohio counties, accounting for more than 90% of major soybean-growing counties in ...
Microarray profiling of chemical-induced effects is being increasingly used in medium and high-throughput formats. In this study, we describe computational methods to identify molecular targets from whole-genome microarray data using as an example the estrogen receptor α (ERα), ...
A noninvasive, direct real-time PCR method for sex determination in multiple avian species
Brubaker, Jessica L.; Karouna-Renier, Natalie K.; Chen, Yu; Jenko, Kathryn; Sprague, Daniel T.; Henry, Paula F.P.
2011-01-01
Polymerase chain reaction (PCR)-based methods to determine the sex of birds are well established and have seen few modifications since they were first introduced in the 1990s. Although these methods allowed for sex determination in species that were previously difficult to analyse, they were not conducive to high-throughput analysis because of the laboriousness of DNA extraction and gel electrophoresis. We developed a high-throughput real-time PCR-based method for analysis of sex in birds, which uses noninvasive sample collection and avoids DNA extraction and gel electrophoresis.
Nebula: reconstruction and visualization of scattering data in reciprocal space
Reiten, Andreas; Chernyshov, Dmitry; Mathiesen, Ragnvald H.
2015-01-01
Two-dimensional solid-state X-ray detectors can now operate at considerable data throughput rates that allow full three-dimensional sampling of scattering data from extended volumes of reciprocal space within second to minute timescales. For such experiments, simultaneous analysis and visualization allows for remeasurements and a more dynamic measurement strategy. A new software, Nebula, is presented. It efficiently reconstructs X-ray scattering data, generates three-dimensional reciprocal space data sets that can be visualized interactively, and aims to enable real-time processing in high-throughput measurements by employing parallel computing on commodity hardware. PMID:25844083
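The geometric core of such a reconstruction can be sketched as below for a generic flat detector with the beam along +z; this is an assumed geometry, not Nebula's implementation. Each pixel is mapped to a scattering vector q = k_out - k_in with |k_out| = |k_in| = 2π/λ, after which intensities can be binned onto a 3D reciprocal-space grid.

```python
# Generic pixel-to-q mapping for elastic X-ray scattering (illustrative geometry).
import numpy as np

def pixel_to_q(x_mm, y_mm, detector_distance_mm, wavelength_A):
    k = 2 * np.pi / wavelength_A                       # |k| in 1/Angstrom
    k_in = np.array([0.0, 0.0, k])                     # incident beam along +z (assumed)
    pixel = np.array([x_mm, y_mm, detector_distance_mm], dtype=float)
    k_out = k * pixel / np.linalg.norm(pixel)          # elastic scattering: |k_out| = |k_in|
    return k_out - k_in

# Pixel 20 mm off-axis on a detector 150 mm downstream, with 1 A X-rays:
print(pixel_to_q(20.0, 0.0, 150.0, 1.0))
```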
Photon-Counting H33D Detector for Biological Fluorescence Imaging
Michalet, X.; Siegmund, O.H.W.; Vallerga, J.V.; Jelinsky, P.; Millaud, J.E.; Weiss, S.
2010-01-01
We have developed a photon-counting High-temporal and High-spatial resolution, High-throughput 3-Dimensional detector (H33D) for biological imaging of fluorescent samples. The design is based on a 25 mm diameter S20 photocathode followed by a 3-microchannel plate stack, and a cross delay line anode. We describe the bench performance of the H33D detector, as well as preliminary imaging results obtained with fluorescent beads, quantum dots and live cells and discuss applications of future generation detectors for single-molecule imaging and high-throughput study of biomolecular interactions. PMID:20151021
Canela, Andrés; Vera, Elsa; Klatt, Peter; Blasco, María A
2007-03-27
A major limitation of studies of the relevance of telomere length to cancer and age-related diseases in human populations and to the development of telomere-based therapies has been the lack of suitable high-throughput (HT) assays to measure telomere length. We have developed an automated high-throughput quantitative telomere FISH (HT Q-FISH) platform, which allows the quantification of telomere length as well as the percentage of short telomeres in large human sample sets. We show here that this technique provides the accuracy and sensitivity to uncover associations between telomere length and human disease.
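The two summary statistics named above (telomere length and percentage of short telomeres) can be illustrated with the toy sketch below; the 3 kb cutoff and the assumption that calibrated per-telomere fluorescence intensities stand in for lengths are illustrative only, not the authors' parameters.

```python
# Toy summary of per-telomere measurements (not the HT Q-FISH pipeline): mean length
# and the percentage of telomeres below a chosen "short" cutoff.
from statistics import mean

def telomere_summary(lengths_kb, short_cutoff_kb=3.0):
    return {
        "mean_length_kb": mean(lengths_kb),
        "percent_short": 100 * sum(t < short_cutoff_kb for t in lengths_kb)
                         / len(lengths_kb),
    }

print(telomere_summary([2.1, 4.5, 7.8, 3.2, 1.9, 10.4]))
```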