Sample records for perform high throughput

  1. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    NASA Astrophysics Data System (ADS)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage, and volume change of the bulk were considered. It is important to include more structure-property relationships, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we established a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.

  2. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations.

    PubMed

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage, and volume change of the bulk were considered. It is important to include more structure-property relationships, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we established a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.

  3. Quantitative description on structure–property relationships of Li-ion battery materials for high-throughput computations

    PubMed Central

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-01-01

    Abstract Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage, and volume change of the bulk were considered. It is important to include more structure–property relationships, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we established a quantitative description of structure–property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure–property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials. PMID:28458737
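    The screening flow path proposed in this review lends itself to a simple threshold filter over computed bulk descriptors. Below is a minimal illustrative sketch of such a pass; the property names, candidate values, and cutoffs are assumptions chosen for illustration and are not taken from the paper.

    ```python
    # Minimal sketch of a threshold-based screening pass over candidate cathode
    # materials. Property names, candidate values, and cutoffs are illustrative
    # assumptions, not parameters from the review.

    candidates = [
        {"formula": "LiCoO2",  "capacity_mAh_g": 274, "voltage_V": 3.9, "volume_change_pct": 2.0},
        {"formula": "LiFePO4", "capacity_mAh_g": 170, "voltage_V": 3.4, "volume_change_pct": 6.8},
        {"formula": "LiMn2O4", "capacity_mAh_g": 148, "voltage_V": 4.0, "volume_change_pct": 7.3},
    ]

    def passes_screen(mat, min_capacity=150.0, min_voltage=3.0, max_volume_change=8.0):
        """Keep materials whose bulk descriptors clear all thresholds."""
        return (mat["capacity_mAh_g"] >= min_capacity
                and mat["voltage_V"] >= min_voltage
                and mat["volume_change_pct"] <= max_volume_change)

    shortlist = [m["formula"] for m in candidates if passes_screen(m)]
    print(shortlist)
    ```

    A real screening flow would chain several such filters (bulk descriptors first, then the defect, surface, and nanosize descriptors discussed in the review) so that expensive calculations are only run on candidates that survive the cheap ones.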

  4. The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD

    NASA Astrophysics Data System (ADS)

    Cox, M. A.; Reed, R.; Mellado, B.

    2015-01-01

    After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.

  5. High-throughput measurements of biochemical responses using the plate::vision multimode 96 minilens array reader.

    PubMed

    Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich

    2006-01-01

    The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be quite comparable with other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have speeded up our drug discovery process greatly.

  6. A high-throughput AO/PI-based cell concentration and viability detection method using the Celigo image cytometry.

    PubMed

    Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean

    2016-10-01

    To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measuring multiple samples and conditions, so automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required compared with single-sample automated cell counters. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
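    For readers unfamiliar with AO/PI counting, the underlying arithmetic is simple: acridine orange (AO) stains all nucleated cells while propidium iodide (PI) stains only dead cells, so concentration follows from the AO count over the imaged volume and viability from the live fraction. The sketch below uses invented counts and volumes and is not the Celigo software's actual calculation.

    ```python
    # Sketch of the arithmetic behind AO/PI-based counting. AO marks all
    # nucleated cells; PI marks dead cells. Counts, imaged volume, and dilution
    # below are hypothetical example numbers, not instrument output.

    def concentration_and_viability(ao_count, pi_count, imaged_volume_ul, dilution=1.0):
        live = ao_count - pi_count                        # AO-positive, PI-negative
        total_per_ml = ao_count / imaged_volume_ul * 1e3 * dilution
        viability = live / ao_count if ao_count else 0.0
        return total_per_ml, viability

    conc, viab = concentration_and_viability(ao_count=1250, pi_count=90, imaged_volume_ul=2.5)
    print(f"{conc:.2e} cells/mL, {viab:.1%} viable")
    ```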

  7. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    PubMed

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  8. 20180312 - Uncertainty and Variability in High-Throughput Toxicokinetics for Risk Prioritization (SOT)

    EPA Science Inventory

    Streamlined approaches that use in vitro experimental data to predict chemical toxicokinetics (TK) are increasingly being used to perform risk-based prioritization based upon dosimetric adjustment of high-throughput screening (HTS) data across thousands of chemicals. However, ass...

  9. Graph-based signal integration for high-throughput phenotyping

    PubMed Central

    2012-01-01

    Background Electronic Health Records aggregated in Clinical Data Warehouses (CDWs) promise to revolutionize Comparative Effectiveness Research and suggest new avenues of research. However, the effectiveness of CDWs is diminished by the lack of properly labeled data. We present a novel approach that integrates knowledge from the CDW, the biomedical literature, and the Unified Medical Language System (UMLS) to perform high-throughput phenotyping. In this paper, we automatically construct a graphical knowledge model and then use it to phenotype breast cancer patients. We compare the performance of this approach to using MetaMap when labeling records. Results MetaMap's overall accuracy at identifying breast cancer patients was 51.1% (n=428); recall=85.4%, precision=26.2%, and F1=40.1%. Our unsupervised graph-based high-throughput phenotyping had an accuracy of 84.1%; recall=46.3%, precision=61.2%, and F1=52.8%. Conclusions We conclude that our approach is a promising alternative for unsupervised high-throughput phenotyping. PMID:23320851
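    The F1 values quoted above follow the standard harmonic mean of precision and recall; the quick check below reproduces them from the reported percentages.

    ```python
    # F1 is the harmonic mean of precision and recall; these two calls reproduce
    # the abstract's reported scores from its precision/recall values.

    def f1(precision, recall):
        return 2 * precision * recall / (precision + recall)

    print(round(f1(0.262, 0.854), 3))  # MetaMap: ~0.401
    print(round(f1(0.612, 0.463), 3))  # graph-based approach: ~0.527 (abstract reports 52.8%;
                                       # the small difference is rounding of the inputs)
    ```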

  10. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    PubMed Central

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  11. The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio).

    PubMed

    Usui, Takuji; Noble, Daniel W A; O'Dea, Rose E; Fangmeier, Melissa L; Lagisz, Malgorzata; Hesselson, Daniel; Nakagawa, Shinichi

    2018-01-01

    Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34-0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes.

  12. The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio)

    PubMed Central

    Usui, Takuji; Noble, Daniel W.A.; O’Dea, Rose E.; Fangmeier, Melissa L.; Lagisz, Malgorzata; Hesselson, Daniel

    2018-01-01

    Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34–0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes. PMID:29372124
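    The repeatability estimates quoted in both records are intra-class correlations. A common way to compute a one-way random-effects ICC(1,1) from repeated trials per fish is sketched below; the data matrix is invented for illustration, and the paper's actual analysis is not reproduced here.

    ```python
    # One-way random-effects ICC(1,1) from an ANOVA decomposition:
    # ICC = (MS_between - MS_within) / (MS_between + (k - 1) * MS_within).
    # The swim-speed matrix (rows = fish, columns = repeated trials) is invented.
    import numpy as np

    def icc_1_1(scores):
        scores = np.asarray(scores, dtype=float)   # shape (n_subjects, k_trials)
        n, k = scores.shape
        grand = scores.mean()
        subj_means = scores.mean(axis=1)
        ms_between = k * ((subj_means - grand) ** 2).sum() / (n - 1)
        ms_within = ((scores - subj_means[:, None]) ** 2).sum() / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    trials = np.array([[42.1, 40.8], [35.3, 37.0], [48.9, 47.5], [39.0, 36.2]])  # cm/s, invented
    print(round(icc_1_1(trials), 2))
    ```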

  13. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid the data glut associated with rapid, high-throughput sample analysis.

  14. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakala, Jacqueline Alexandra

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid the data glut associated with rapid, high-throughput sample analysis.
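    As an illustration of the kind of data-reduction step such a workflow automates, the sketch below applies the exponential-law mass-bias correction commonly used for Sr isotopes, normalizing to the canonical 88Sr/86Sr ratio. The isotope masses and normalization value are standard literature constants, the measured ratios are invented, and this is not the report's actual Neptune reduction script.

    ```python
    # Exponential-law mass-bias correction commonly applied in MC-ICP-MS Sr data
    # reduction. Masses and the canonical 88Sr/86Sr are standard literature
    # values (approximate); the measured ratios below are invented examples.
    import math

    M86, M87, M88 = 85.909261, 86.908878, 87.905612   # atomic masses (u), approximate
    R88_86_TRUE = 8.375209                             # canonical 88Sr/86Sr used for normalization

    def correct_87_86(meas_87_86, meas_88_86):
        beta = math.log(R88_86_TRUE / meas_88_86) / math.log(M88 / M86)
        return meas_87_86 * (M87 / M86) ** beta

    print(round(correct_87_86(meas_87_86=0.71034, meas_88_86=8.3752 * 1.002), 5))
    ```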

  15. Genomics tools available for unravelling mechanisms underlying agronomical traits in strawberry with more to come

    USDA-ARS?s Scientific Manuscript database

    In the last few years, high-throughput genomics promised to bridge the gap between plant physiology and plant sciences. In addition, high-throughput genotyping technologies facilitate marker-based selection for better performing genotypes. In strawberry, Fragaria vesca was the first reference sequen...

  16. FPGA cluster for high-performance AO real-time control system

    NASA Astrophysics Data System (ADS)

    Geng, Deli; Goodsell, Stephen J.; Basden, Alastair G.; Dipper, Nigel A.; Myers, Richard M.; Saunter, Chris D.

    2006-06-01

    Whilst the high throughput and low latency requirements for the next generation AO real-time control systems have posed a significant challenge to von Neumann processor architectures, the Field Programmable Gate Array (FPGA) has emerged as a long-term solution with high throughput performance and excellent latency predictability. Moreover, FPGA devices have highly capable programmable interfacing, which leads to a more highly integrated system. Nevertheless, a single FPGA is still not enough: multiple FPGA devices need to be clustered to perform the required subaperture processing and the reconstruction computation. In an AO real-time control system, the memory bandwidth is often the bottleneck of the system, simply because a vast amount of supporting data, e.g. pixel calibration maps and the reconstruction matrix, needs to be accessed within a short period. The cluster, as a general computing architecture, has excellent scalability in processing throughput, memory bandwidth, memory capacity, and communication bandwidth. Problems such as task distribution, node communication, and system verification are discussed.

  17. High Performance Computing Modernization Program Kerberos Throughput Test Report

    DTIC Science & Technology

    2017-10-26

    functionality as Kerberos plugins. The pre-release production kit was used in these tests to compare against the current release kit. YubiKey support... Throughput testing was done to determine the benefits of the pre-... both the current release kit and the pre-release production kit for a total of 378 individual tests in order to note any improvements. Based on work

  18. High throughput system for magnetic manipulation of cells, polymers, and biomaterials

    PubMed Central

    Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.

    2008-01-01

    In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of SR2+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357

  19. TCP Throughput Profiles Using Measurements over Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata

    Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
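    To make the ramp-up/sustainment intuition concrete, here is a toy model (not the authors' formulation) of a single TCP transfer: a slow-start phase whose duration scales with RTT, followed by a sustained phase at link capacity. For large transfers the sustained phase dominates, so mean throughput falls off only slowly with RTT, which is the concave behaviour described above; short, ramp-up-dominated transfers fall off much faster.

    ```python
    # Toy two-phase transfer model (illustrative only): slow-start doubles the
    # window each RTT until the bandwidth-delay product is reached, then the
    # flow sustains at link capacity. Mean throughput = bytes sent / total time.
    import math

    def mean_throughput_gbps(size_gb, capacity_gbps, rtt_s, init_window_bytes=64 * 1024):
        size_bits = size_gb * 8e9
        cap_bps = capacity_gbps * 1e9
        bdp_bits = cap_bps * rtt_s                      # bandwidth-delay product
        rounds = max(0, math.ceil(math.log2(bdp_bits / (init_window_bytes * 8))))
        ramp_bits = min(size_bits, init_window_bytes * 8 * (2 ** rounds - 1))
        ramp_time = rounds * rtt_s
        sustain_time = max(0.0, (size_bits - ramp_bits) / cap_bps)
        return size_bits / (ramp_time + sustain_time) / 1e9

    for rtt_ms in (10, 50, 100, 200, 366):
        print(rtt_ms, "ms:", round(mean_throughput_gbps(50, 10, rtt_ms / 1e3), 2), "Gbps")
    ```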

  20. High-Throughput, Motility-Based Sorter for Microswimmers such as C. elegans

    PubMed Central

    Yuan, Jinzhou; Zhou, Jessie; Raizen, David M.; Bau, Haim H.

    2015-01-01

    Animal motility varies with genotype, disease, aging, and environmental conditions. In many studies, it is desirable to carry out high throughput motility-based sorting to isolate rare animals for, among other things, forward genetic screens to identify genetic pathways that regulate phenotypes of interest. Many commonly used screening processes are labor-intensive, lack sensitivity, and require extensive investigator training. Here, we describe a sensitive, high throughput, automated, motility-based method for sorting nematodes. Our method is implemented in a simple microfluidic device capable of sorting thousands of animals per hour per module, and is amenable to parallelism. The device successfully enriches for known C. elegans motility mutants. Furthermore, using this device, we isolate low-abundance mutants capable of suppressing the somnogenic effects of the flp-13 gene, which regulates C. elegans sleep. By performing genetic complementation tests, we demonstrate that our motility-based sorting device efficiently isolates mutants for the same gene identified by tedious visual inspection of behavior on an agar surface. Therefore, our motility-based sorter is capable of performing high throughput gene discovery approaches to investigate fundamental biological processes. PMID:26008643

  1. High-throughput fabrication and screening improves gold nanoparticle chemiresistor sensor performance.

    PubMed

    Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard

    2015-02-09

    Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays, a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library consisting of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram per liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which, in combination with random forest analysis, was able to predict individual analyte concentrations with overall root-mean-square errors ranging between 8 and 17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications for future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening to improve the performance of gold nanoparticle based chemiresistors for both new and existing applications.
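    The analysis step named in the abstract, a random-forest model mapping array responses to analyte concentrations and scored by root-mean-square error, can be sketched with scikit-learn as below. The sensor responses here are synthetic stand-ins; only the modelling idea mirrors the study.

    ```python
    # Random-forest regression from sensor-array responses to concentration,
    # scored by RMSE. The data are synthetic placeholders for measured responses.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)
    n_samples, n_sensors = 200, 12
    conc = rng.uniform(0, 100, size=(n_samples, 1))                       # µg/L, single analyte
    responses = conc @ rng.uniform(0.5, 1.5, (1, n_sensors)) + rng.normal(0, 5, (n_samples, n_sensors))

    X_train, X_test, y_train, y_test = train_test_split(responses, conc.ravel(), random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
    rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
    print(f"RMSE: {rmse:.1f} µg/L")
    ```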

  2. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    PubMed

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers, Biotechnol. Prog., 34:130-140, 2018.

  3. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    PubMed Central

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980

  4. Inhibition of Retinoblastoma Protein Inactivation

    DTIC Science & Technology

    2016-09-01

    Retinoblastoma protein, E2F transcription factor, high throughput screen, drug discovery, x-ray crystallography... developed a method to perform fragment-based screening by x-ray crystallography. 2.0 KEYWORDS Retinoblastoma (Rb) pathway, E2F transcription factor... cancer, cell-cycle inhibition, activation, modulation, inhibition, high throughput screening, fragment-based screening, x-ray crystallography

  5. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    PubMed

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, unregulated cell aggregation may skew the results. In contrast, microfluidic 3D culture can allow precise control of cell microenvironments and provide higher throughput by orders of magnitude. In this chapter, we will look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  6. Identification of functional modules using network topology and high-throughput data.

    PubMed

    Ulitsky, Igor; Shamir, Ron

    2007-01-26

    With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in the analysis of high-throughput data.
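    The authors' algorithms are not reproduced here, but the integration idea can be illustrated with a deliberately simple seed-and-extend heuristic: grow a connected module on the network, admitting a neighbour only while the module's mean pairwise similarity stays above a threshold. The graph and similarity values below are toy data.

    ```python
    # Toy seed-and-extend module search (illustrative heuristic, not the paper's
    # algorithm): expand a connected module while its mean pairwise similarity
    # (e.g., from correlated expression profiles) stays above a threshold.
    import itertools
    import networkx as nx

    def grow_module(graph, similarity, seed, min_mean_sim=0.6):
        module = {seed}
        frontier = set(graph.neighbors(seed))
        while frontier:
            candidate = frontier.pop()
            trial = module | {candidate}
            pairs = list(itertools.combinations(trial, 2))
            mean_sim = sum(similarity.get(frozenset(p), 0.0) for p in pairs) / len(pairs)
            if mean_sim >= min_mean_sim:
                module = trial
                frontier |= set(graph.neighbors(candidate)) - module
        return module

    G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("A", "C")])
    sim = {frozenset(p): s for p, s in [(("A", "B"), 0.9), (("B", "C"), 0.8),
                                        (("A", "C"), 0.7), (("C", "D"), 0.1)]}
    print(grow_module(G, sim, seed="A"))   # {'A', 'B', 'C'}; D is excluded as dissimilar
    ```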

  7. SwellGel: an affinity chromatography technology for high-capacity and high-throughput purification of recombinant-tagged proteins.

    PubMed

    Draveling, C; Ren, L; Haney, P; Zeisse, D; Qoronfleh, M W

    2001-07-01

    The revolution in genomics and proteomics is having a profound impact on drug discovery. Today's protein scientist demands a faster, easier, more reliable way to purify proteins. A new high-capacity, high-throughput technology has been developed at Perbio Sciences for affinity protein purification. This technology utilizes selected chromatography media that are dehydrated to form uniform aggregates. The SwellGel aggregates will instantly rehydrate upon addition of the protein sample, allowing purification and direct performance of multiple assays in a variety of formats. SwellGel technology has greater stability and is easier to handle than standard wet chromatography resins. The microplate format of this technology provides high-capacity, high-throughput features, recovering milligram quantities of protein suitable for high-throughput screening or biophysical/structural studies. Data will be presented applying SwellGel technology to recombinant 6x His-tagged protein and glutathione-S-transferase (GST) fusion protein purification. Copyright 2001 Academic Press.

  8. A high performance hardware implementation image encryption with AES algorithm

    NASA Astrophysics Data System (ADS)

    Farmani, Ali; Jafari, Mohamad; Miremadi, Seyed Sohrab

    2011-06-01

    This paper describes the implementation of a high-speed, high-throughput algorithm for image encryption. We select the highly secure symmetric-key encryption algorithm AES (Advanced Encryption Standard) and increase speed and throughput using a four-stage pipeline, a control unit based on logic gates, an optimized design of the multiplier blocks in the MixColumns phase, and simultaneous generation of keys and rounds. This approach makes AES suitable for fast image encryption. A 128-bit AES implementation on an Altera FPGA achieves a throughput of 6 Gbps at 471 MHz. The encryption time for a 32×32 test image is 1.15 ms.
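    The paper's contribution is the pipelined FPGA architecture, which is not reproduced here. Purely to illustrate the underlying cipher applied to image data, the sketch below runs 128-bit AES (CTR mode) over raw image bytes with the Python cryptography package; the key, nonce, and stand-in image are throwaway values.

    ```python
    # Software analogue only: 128-bit AES in CTR mode applied to raw image bytes
    # using the `cryptography` package. This illustrates the cipher, not the
    # paper's hardware pipeline. Key and nonce are random throwaway values.
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(16)            # 128-bit key
    nonce = os.urandom(16)          # CTR counter block
    pixels = bytes(range(256)) * 4  # stand-in for a 32x32 grayscale image (1024 bytes)

    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = encryptor.update(pixels) + encryptor.finalize()

    decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    assert decryptor.update(ciphertext) + decryptor.finalize() == pixels
    ```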

  9. A Self-Reporting Photocatalyst for Online Fluorescence Monitoring of High Throughput RAFT Polymerization.

    PubMed

    Yeow, Jonathan; Joshi, Sanket; Chapman, Robert; Boyer, Cyrille Andre Jean Marie

    2018-04-25

    Translating controlled/living radical polymerization (CLRP) from batch to the high throughput production of polymer libraries presents several challenges in terms of both polymer synthesis and characterization. Although recently there have been significant advances in the field of low volume, high throughput CLRP, techniques able to simultaneously monitor multiple polymerizations in an "online" manner have not yet been developed. Here, we report our discovery that 5,10,15,20-tetraphenyl-21H,23H-porphine zinc (ZnTPP) is a self-reporting photocatalyst that can mediate PET-RAFT polymerization as well as report on monomer conversion via changes in its fluorescence properties. This enables the use of a microplate reader to conduct high throughput "online" monitoring of PET-RAFT polymerizations performed directly in 384-well, low volume microtiter plates. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. High-Throughput Bit-Serial LDPC Decoder LSI Based on Multiple-Valued Asynchronous Interleaving

    NASA Astrophysics Data System (ADS)

    Onizawa, Naoya; Hanyu, Takahiro; Gaudet, Vincent C.

    This paper presents a high-throughput bit-serial low-density parity-check (LDPC) decoder that uses an asynchronous interleaver. Since consecutive log-likelihood message values on the interleaver are similar, node computations are continuously performed by using the most recently arrived messages without significantly affecting bit-error rate (BER) performance. In the asynchronous interleaver, each message's arrival rate is based on the delay due to the wire length, so that the decoding throughput is not restricted by the worst-case latency, which results in a higher average rate of computation. Moreover, the use of a multiple-valued data representation makes it possible to multiplex control signals and data from mutual nodes, thus minimizing the number of handshaking steps in the asynchronous interleaver and eliminating the clock signal entirely. As a result, the decoding throughput becomes 1.3 times faster than that of a bit-serial synchronous decoder under a 90nm CMOS technology, at a comparable BER.

  11. A high-throughput, multi-channel photon-counting detector with picosecond timing

    NASA Astrophysics Data System (ADS)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies (small-pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN) provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project, and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration, which uses the multi-channel HPTDC timing ASIC.

  12. High performance hybrid magnetic structure for biotechnology applications

    DOEpatents

    Humphries, David E; Pollard, Martin J; Elkin, Christopher J

    2005-10-11

    The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides means for separation and other biotechnology applications involving holding, manipulation, or separation of magnetizable molecular structures and targets. Also disclosed are: a method of assembling the hybrid magnetic plates, a high throughput protocol featuring the hybrid magnetic structure, and other embodiments of the ferromagnetic pole shape, attachment and adapter interfaces for adapting the use of the hybrid magnetic structure for use with liquid handling and other robots for use in high throughput processes.

  13. High performance hybrid magnetic structure for biotechnology applications

    DOEpatents

    Humphries, David E.; Pollard, Martin J.; Elkin, Christopher J.

    2006-12-12

    The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides for separation and other biotechnology applications involving holding, manipulation, or separation of magnetic or magnetizable molecular structures and targets. Also disclosed are: a method of assembling the hybrid magnetic plates, a high throughput protocol featuring the hybrid magnetic structure, and other embodiments of the ferromagnetic pole shape, attachment and adapter interfaces for adapting the use of the hybrid magnetic structure for use with liquid handling and other robots for use in high throughput processes.

  14. Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation

    NASA Astrophysics Data System (ADS)

    Potyrailo, Radislav A.; Mirsky, Vladimir M.

    New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.

  15. A high-throughput exploration of magnetic materials by using structure predicting methods

    NASA Astrophysics Data System (ADS)

    Arapan, S.; Nieves, P.; Cuesta-López, S.

    2018-02-01

    We study the capability of a structure predicting method based on a genetic/evolutionary algorithm for a high-throughput exploration of magnetic materials. We use the USPEX and VASP codes to predict stable and generate low-energy meta-stable structures for a set of representative magnetic structures comprising intermetallic alloys, oxides, interstitial compounds, and systems containing rare-earth elements, for both ferromagnetic and antiferromagnetic ordering. We have modified the interface between the USPEX and VASP codes to improve the performance of structural optimization as well as to perform calculations in a high-throughput manner. We show that exploring the structure phase space with a structure predicting technique reveals large sets of low-energy metastable structures, which not only improve currently existing databases, but also may provide understanding and solutions to stabilize and synthesize magnetic materials suitable for permanent magnet applications.

  16. Development of a Rapid Fluorescence-Based High-Throughput Screening Assay to Identify Novel Kynurenine 3-Monooxygenase Inhibitor Scaffolds.

    PubMed

    Jacobs, K R; Guillemin, G J; Lovejoy, D B

    2018-02-01

    Kynurenine 3-monooxygenase (KMO) is a well-validated therapeutic target for the treatment of neurodegenerative diseases, including Alzheimer's disease (AD) and Huntington's disease (HD). This work reports a facile fluorescence-based KMO assay optimized for high-throughput screening (HTS) that achieves a throughput approximately 20-fold higher than the fastest KMO assay currently reported. The screen was run with excellent performance (average Z' value of 0.80) from 110,000 compounds across 341 plates and exceeded all statistical parameters used to describe a robust HTS assay. A subset of molecules was selected for validation by ultra-high-performance liquid chromatography, resulting in the confirmation of a novel hit with an IC50 comparable to that of the well-described KMO inhibitor Ro-61-8048. A medicinal chemistry program is currently underway to further develop our novel KMO inhibitor scaffolds.
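    The Z' quoted above is the standard screening-window coefficient of Zhang et al. (1999): Z' = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|. The control-well readings in the sketch below are invented numbers used only to show the calculation.

    ```python
    # Z'-factor (screening-window coefficient) from positive and negative
    # control wells; values >= 0.5 are conventionally taken as a robust assay.
    # The control readings below are invented example numbers.
    import statistics as st

    pos_controls = [980, 1010, 995, 1005, 990]   # e.g., uninhibited signal
    neg_controls = [110, 95, 105, 100, 90]       # e.g., fully inhibited signal

    z_prime = 1 - 3 * (st.stdev(pos_controls) + st.stdev(neg_controls)) / abs(
        st.mean(pos_controls) - st.mean(neg_controls))
    print(round(z_prime, 2))
    ```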

  17. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis.

    PubMed

    Wen, Na; Zhao, Zhan; Fan, Beiyuan; Chen, Deyong; Men, Dong; Wang, Junbo; Chen, Jian

    2016-07-05

    This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.

  18. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    PubMed

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system need not be the one with the highest product titers, but the one resulting in an overall superior process performance in both up- and downstream processing. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  19. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fludigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  20. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fludigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930

  1. Latest performance of ArF immersion scanner NSR-S630D for high-volume manufacturing for 7nm node

    NASA Astrophysics Data System (ADS)

    Funatsu, Takayuki; Uehara, Yusaku; Hikida, Yujiro; Hayakawa, Akira; Ishiyama, Satoshi; Hirayama, Toru; Kono, Hirotaka; Shirata, Yosuke; Shibazaki, Yuichi

    2015-03-01

    In order to achieve stable operation in cutting-edge semiconductor manufacturing, Nikon has developed the NSR-S630D with extremely accurate overlay while maintaining throughput in various conditions resembling a real production environment. In addition, the NSR-S630D has been equipped with enhanced capabilities to maintain long-term overlay stability and an improved user interface, both enabled by our newly developed application software platform. In this paper, we describe the most recent S630D performance in various conditions similar to real production environments. In a production environment, superior overlay accuracy with high dose conditions and high throughput are often required; therefore, we have performed several experiments with high dose conditions to demonstrate the NSR's thermal aberration capabilities in order to achieve world-class overlay performance. Furthermore, we introduce our new software that enables long-term overlay performance.

  2. Novel molecular diagnostic tools for malaria elimination: a review of options from the point of view of high-throughput and applicability in resource limited settings.

    PubMed

    Britton, Sumudu; Cheng, Qin; McCarthy, James S

    2016-02-16

    As malaria transmission continues to decrease, an increasing number of countries will enter pre-elimination and elimination. To interrupt transmission, changes in control strategies are likely to require more accurate identification of all carriers of Plasmodium parasites, both symptomatic and asymptomatic, using diagnostic tools that are highly sensitive and high throughput, with fast turnaround times, and preferably performed in local health service settings. Currently available immunochromatographic lateral flow rapid diagnostic tests and field microscopy are unlikely to consistently detect infections at parasite densities less than 100 parasites/µL, making them insufficiently sensitive for detecting all carriers. Molecular diagnostic platforms, such as PCR and LAMP, are currently available in reference laboratories, but at a cost both financially and in turnaround time. This review describes the recent progress in developing molecular diagnostic tools in terms of their capacity for high throughput and their potential for performance in non-reference laboratories for malaria elimination.

  3. High-throughput heterodyne thermoreflectance: Application to thermal conductivity measurements of a Fe-Si-Ge thin film alloy library.

    PubMed

    d'Acremont, Quentin; Pernot, Gilles; Rampnoux, Jean-Michel; Furlan, Andrej; Lacroix, David; Ludwig, Alfred; Dilhaire, Stefan

    2017-07-01

    A High-Throughput Time-Domain ThermoReflectance (HT-TDTR) technique was developed to perform fast thermal conductivity measurements with minimum user actions required. This new setup is based on a heterodyne picosecond thermoreflectance system. The use of two different laser oscillators has been proven to reduce the acquisition time by two orders of magnitude and avoid the experimental artefacts usually induced by moving the elements present in TDTR systems. An amplitude modulation associated to a lock-in detection scheme is included to maintain a high sensitivity to thermal properties. We demonstrate the capabilities of the HT-TDTR setup to perform high-throughput thermal analysis by mapping thermal conductivity and interface resistances of a ternary thin film silicide library FexSiyGe100-x-y (20

  4. Robot-Based High-Throughput Engineering of Alcoholic Polymer: Fullerene Nanoparticle Inks for an Eco-Friendly Processing of Organic Solar Cells.

    PubMed

    Xie, Chen; Tang, Xiaofeng; Berlinghof, Marvin; Langner, Stefan; Chen, Shi; Späth, Andreas; Li, Ning; Fink, Rainer H; Unruh, Tobias; Brabec, Christoph J

    2018-06-27

    Development of high-quality organic nanoparticle inks is a significant scientific challenge for the industrial production of solution-processed organic photovoltaics (OPVs) with eco-friendly processing methods. In this work, we demonstrate a novel, robot-based, high-throughput procedure performing automatic poly(3-hexylthiophene-2,5-diyl) and indene-C60 bisadduct nanoparticle ink synthesis in nontoxic alcohols. A novel methodology to prepare particle dispersions for fully functional OPVs by manipulating the particle size and solvent system was studied in detail. The ethanol dispersion with a particle diameter of around 80-100 nm exhibits reduced degradation, yielding a power conversion efficiency of 4.52%, which is the highest performance reported so far for water/alcohol-processed OPV devices. By successfully deploying the high-throughput robot-based approach for organic nanoparticle ink preparation, we believe that the findings demonstrated in this work will trigger more research interest and effort on eco-friendly industrial production of OPVs.

  5. High-throughput heterodyne thermoreflectance: Application to thermal conductivity measurements of a Fe-Si-Ge thin film alloy library

    NASA Astrophysics Data System (ADS)

    d'Acremont, Quentin; Pernot, Gilles; Rampnoux, Jean-Michel; Furlan, Andrej; Lacroix, David; Ludwig, Alfred; Dilhaire, Stefan

    2017-07-01

    A High-Throughput Time-Domain ThermoReflectance (HT-TDTR) technique was developed to perform fast thermal conductivity measurements with minimum user actions required. This new setup is based on a heterodyne picosecond thermoreflectance system. The use of two different laser oscillators has been proven to reduce the acquisition time by two orders of magnitude and avoid the experimental artefacts usually induced by moving the elements present in TDTR systems. An amplitude modulation associated to a lock-in detection scheme is included to maintain a high sensitivity to thermal properties. We demonstrate the capabilities of the HT-TDTR setup to perform high-throughput thermal analysis by mapping thermal conductivity and interface resistances of a ternary thin film silicide library FexSiyGe100-x-y (20

  6. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

    The last several years have seen the rapid adoption of high-resolution MS (HRMS) for bioanalytical support of high-throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high-throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file sizes typical of HRMS analyses.

  7. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide

    PubMed Central

    Bae, Hyeonhu; Park, Minwoo; Jang, Byungryul; Kang, Yura; Park, Jinwoo; Lee, Hosik; Chung, Haegeun; Chung, ChiHye; Hong, Suklyun; Kwon, Yongkyung; Yakobson, Boris I.; Lee, Hoonkyung

    2016-01-01

    Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first-principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures (~10−3 bar) at 300 K and release it at ~450 K. CO2 binding to elements involves hybridization of the metal d orbitals with the CO2 π orbitals, and CO2-transition metal complexes were observed in experiments. This result allows us to perform high-throughput screening to discover novel promising CO2 capture materials with empty d orbitals (e.g., Sc– or V–porphyrin-like graphene) and predict their capture performance under various conditions. Moreover, these findings provide physical insights into selective CO2 capture and open a new path to explore CO2 capture materials. PMID:26902156
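
    As an illustration of the first-principles thermodynamic criterion described above, the following minimal Python sketch evaluates the equilibrium occupancy of a single CO2 adsorption site as a function of temperature and partial pressure. The binding energy, reference chemical potential, and entropy slope are hypothetical placeholder values chosen only to reproduce the capture-at-300-K / release-at-450-K behaviour; they are not taken from the paper.

```python
# Minimal sketch (not the authors' code): single-site grand-canonical occupancy
# for CO2 on a metal-porphyrin-like site. E_BIND, MU0, and the entropy slope
# are hypothetical placeholders chosen only to illustrate the capture/release logic.
import math

K_B = 8.617e-5          # Boltzmann constant, eV/K
E_BIND = -0.75          # assumed CO2 adsorption energy on the site, eV (hypothetical)
MU0 = -0.45             # assumed CO2 gas chemical potential at 300 K, 1 bar, eV (hypothetical)

def mu_co2(temperature_k, pressure_bar, mu_ref=MU0, t_ref=300.0):
    """Ideal-gas estimate of the CO2 chemical potential.

    Only the pressure term is treated exactly; the temperature dependence is a
    crude linear slope standing in for tabulated thermochemistry (~ -S_gas)."""
    slope_ev_per_k = -2.2e-3
    return mu_ref + slope_ev_per_k * (temperature_k - t_ref) + K_B * temperature_k * math.log(pressure_bar)

def occupancy(temperature_k, pressure_bar, e_bind=E_BIND):
    """Equilibrium site occupancy theta in [0, 1] for one CO2 per site."""
    mu = mu_co2(temperature_k, pressure_bar)
    return 1.0 / (1.0 + math.exp((e_bind - mu) / (K_B * temperature_k)))

if __name__ == "__main__":
    # Capture condition: flue-gas partial pressure ~1e-3 bar at 300 K.
    print("theta(300 K, 1e-3 bar) =", round(occupancy(300.0, 1e-3), 3))
    # Release condition: mild heating at the same partial pressure.
    print("theta(450 K, 1e-3 bar) =", round(occupancy(450.0, 1e-3), 3))
```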

  8. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide.

    PubMed

    Bae, Hyeonhu; Park, Minwoo; Jang, Byungryul; Kang, Yura; Park, Jinwoo; Lee, Hosik; Chung, Haegeun; Chung, ChiHye; Hong, Suklyun; Kwon, Yongkyung; Yakobson, Boris I; Lee, Hoonkyung

    2016-02-23

    Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first-principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures (~10−3 bar) at 300 K and release it at ~450 K. CO2 binding to elements involves hybridization of the metal d orbitals with the CO2 π orbitals, and CO2-transition metal complexes were observed in experiments. This result allows us to perform high-throughput screening to discover novel promising CO2 capture materials with empty d orbitals (e.g., Sc- or V-porphyrin-like graphene) and predict their capture performance under various conditions. Moreover, these findings provide physical insights into selective CO2 capture and open a new path to explore CO2 capture materials.

  9. High-throughput screening of metal-porphyrin-like graphenes for selective capture of carbon dioxide

    NASA Astrophysics Data System (ADS)

    Bae, Hyeonhu; Park, Minwoo; Jang, Byungryul; Kang, Yura; Park, Jinwoo; Lee, Hosik; Chung, Haegeun; Chung, Chihye; Hong, Suklyun; Kwon, Yongkyung; Yakobson, Boris I.; Lee, Hoonkyung

    2016-02-01

    Nanostructured materials, such as zeolites and metal-organic frameworks, have been considered to capture CO2. However, their application has been limited largely because they exhibit poor selectivity for flue gases and low capture capacity under low pressures. We perform a high-throughput screening for selective CO2 capture from flue gases by using first-principles thermodynamics. We find that elements with empty d orbitals selectively attract CO2 from gaseous mixtures under low CO2 pressures (~10−3 bar) at 300 K and release it at ~450 K. CO2 binding to elements involves hybridization of the metal d orbitals with the CO2 π orbitals, and CO2-transition metal complexes were observed in experiments. This result allows us to perform high-throughput screening to discover novel promising CO2 capture materials with empty d orbitals (e.g., Sc- or V-porphyrin-like graphene) and predict their capture performance under various conditions. Moreover, these findings provide physical insights into selective CO2 capture and open a new path to explore CO2 capture materials.

  10. Optical tools for high-throughput screening of abrasion resistance of combinatorial libraries of organic coatings

    NASA Astrophysics Data System (ADS)

    Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.

    2002-02-01

    Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8 × 6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coating performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.

  11. A review of the theory, methods and recent applications of high-throughput single-cell droplet microfluidics

    NASA Astrophysics Data System (ADS)

    Lagus, Todd P.; Edd, Jon F.

    2013-03-01

    Most cell biology experiments are performed in bulk cell suspensions where cell secretions become diluted and mixed in a contiguous sample. Confinement of single cells to small, picoliter-sized droplets within a continuous phase of oil provides chemical isolation of each cell, creating individual microreactors where rare cell qualities are highlighted and otherwise undetectable signals can be concentrated to measurable levels. Recent work in microfluidics has yielded methods for the encapsulation of cells in aqueous droplets and hydrogels at kilohertz rates, creating the potential for millions of parallel single-cell experiments. However, commercial applications of high-throughput microdroplet generation and downstream sensing and actuation methods are still emerging for cells. Using fluorescence-activated cell sorting (FACS) as a benchmark for commercially available high-throughput screening, this focused review discusses the fluid physics of droplet formation, methods for cell encapsulation in liquids and hydrogels, sensors and actuators, and notable biological applications of high-throughput single-cell droplet microfluidics.

  12. A high-throughput media design approach for high performance mammalian fed-batch cultures

    PubMed Central

    Rouiller, Yolande; Périlleux, Arnaud; Collet, Natacha; Jordan, Martin; Stettler, Matthieu; Broly, Hervé

    2013-01-01

    An innovative high-throughput medium development method based on media blending was successfully used to improve the performance of a Chinese hamster ovary fed-batch medium in shaking 96-deepwell plates. Starting from a proprietary chemically-defined medium, 16 formulations testing 43 of 47 components at 3 different levels were designed. Media blending was performed following a custom-made mixture design of experiments considering binary blends, resulting in 376 different blends that were tested during both cell expansion and fed-batch production phases in one single experiment. Three approaches were chosen to provide the best output of the large amount of data obtained. A simple ranking of conditions was first used as a quick approach to select new formulations with promising features. Then, prediction of the best mixes was done to maximize both growth and titer using the Design Expert software. Finally, a multivariate analysis enabled identification of individual potential critical components for further optimization. Applying this high-throughput method on a fed-batch, rather than on a simple batch, process opens new perspectives for medium and feed development that enables identification of an optimized process in a short time frame. PMID:23563583
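
    The abstract reports 376 binary blends prepared from 16 formulations but does not spell out the mixing grid. One arrangement consistent with that count is all C(16, 2) = 120 pairs at three mixing ratios plus the 16 unblended media (120 × 3 + 16 = 376); the sketch below enumerates such a design, with the specific ratios being an assumption for illustration only.

```python
# Hedged sketch: enumerate binary media blends from 16 base formulations.
# The abstract reports 376 blends; one arrangement consistent with that count
# is C(16, 2) = 120 pairs x 3 mixing ratios + 16 unblended media = 376.
# The specific ratios (25/75, 50/50, 75/25) are an assumption for illustration.
from itertools import combinations

def design_binary_blends(media, ratios=(0.25, 0.50, 0.75)):
    """Return a list of (medium_a, medium_b, fraction_a) blend descriptors,
    including each unblended medium as (medium, medium, 1.0)."""
    blends = [(m, m, 1.0) for m in media]                 # pure formulations
    for a, b in combinations(media, 2):                   # binary blends
        for frac_a in ratios:
            blends.append((a, b, frac_a))
    return blends

if __name__ == "__main__":
    media = [f"F{i:02d}" for i in range(1, 17)]           # 16 designed formulations
    blends = design_binary_blends(media)
    print(len(blends), "conditions")                      # -> 376
    print(blends[:3], "...")
```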

  13. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    PubMed Central

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay for adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutritional components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provides rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842

  14. Data transfer nodes and demonstration of 100-400 Gbps wide area throughput using the Caltech SDN testbed

    NASA Astrophysics Data System (ADS)

    Mughal, A.; Newman, H.

    2017-10-01

    We review and demonstrate the design of efficient data transfer nodes (DTNs), from the perspective of the highest throughput over both local and wide area networks, as well as the highest performance per unit cost. A careful system-level design is required for the hardware, firmware, OS and software components. Furthermore, additional tuning of these components, and the identification and elimination of any remaining bottlenecks, are needed once the system is assembled and commissioned, in order to obtain optimal performance. For high-throughput data transfers, specialized software is used to overcome the traditional limits in performance caused by the OS, file system, file structures used, etc. Concretely, we will discuss and present the latest results using Fast Data Transfer (FDT), developed by Caltech. We present and discuss the design choices for three generations of Caltech DTNs. Their transfer capabilities range from 40 Gbps to 400 Gbps. Disk throughput is still the biggest challenge in the current generation of available hardware. However, new NVME drives combined with RDMA and a new NVME network fabric are expected to improve the overall data-transfer throughput and simultaneously reduce the CPU load on the end nodes.

  15. High-Throughput Screening for a Moderately Halophilic Phenol-Degrading Strain and Its Salt Tolerance Response

    PubMed Central

    Lu, Zhi-Yan; Guo, Xiao-Jue; Li, Hui; Huang, Zhong-Zi; Lin, Kuang-Fei; Liu, Yong-Di

    2015-01-01

    A high-throughput screening system for moderately halophilic phenol-degrading bacteria from various habitats was developed to replace the conventional strain screening owing to its high efficiency. Bacterial enrichments were cultivated in 48 deep well microplates instead of shake flasks or tubes. Measurement of phenol concentrations was performed in 96-well microplates instead of using the conventional spectrophotometric method or high-performance liquid chromatography (HPLC). The high-throughput screening system was used to cultivate forty-three bacterial enrichments and yielded a halophilic bacterial community, E3, with the best phenol-degrading capability. Halomonas sp. strain 4-5 was isolated from the E3 community. Strain 4-5 was able to degrade more than 94% of the phenol (500 mg·L−1 starting concentration) over a range of 3%–10% NaCl. Additionally, the strain accumulated the compatible solute, ectoine, with increasing salt concentrations. PCR detection of the functional genes suggested that the largest subunit of multicomponent phenol hydroxylase (LmPH) and catechol 1,2-dioxygenase (C12O) were active in the phenol degradation process. PMID:26020478

  16. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.

    PubMed

    Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan

    2016-04-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.

  17. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot-detection data are made available to facilitate implementation of the method.
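
    The commercial spot-detection tool itself is not described in the abstract; the following minimal scipy/NumPy sketch shows one way the core step could look in an open-source pipeline: smooth a fluorescence channel, threshold it, label connected spots, and report their number and pixel sizes. The filter and threshold parameters are placeholders, not the authors' settings.

```python
# Minimal open-source analogue (assumption: not the authors' pipeline) of the
# spot-detection step: smooth, threshold, label connected components, and
# report the number and pixel size of intracellular colonies per image.
import numpy as np
from scipy import ndimage

def count_spots(image, smooth_sigma=2.0, threshold=None, min_pixels=4):
    """Return (n_spots, sizes) for bright spots in a 2-D fluorescence image."""
    smoothed = ndimage.gaussian_filter(image.astype(float), smooth_sigma)
    if threshold is None:
        # Placeholder rule: mean + 3 sigma of the smoothed image.
        threshold = smoothed.mean() + 3.0 * smoothed.std()
    labels, n = ndimage.label(smoothed > threshold)
    sizes = np.bincount(labels.ravel())[1:]              # drop the background bin
    keep = sizes >= min_pixels                           # suppress single-pixel noise
    return int(keep.sum()), sizes[keep]

if __name__ == "__main__":
    # Synthetic test frame: dim background with a few bright "colonies".
    rng = np.random.default_rng(0)
    frame = rng.normal(100.0, 5.0, size=(256, 256))
    for y, x in [(40, 60), (120, 180), (200, 90)]:
        frame[y:y + 6, x:x + 6] += 300.0
    n, sizes = count_spots(frame)
    print(n, "colonies, pixel sizes:", sizes)
```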

  18. Microscale High-Throughput Experimentation as an Enabling Technology in Drug Discovery: Application in the Discovery of (Piperidinyl)pyridinyl-1H-benzimidazole Diacylglycerol Acyltransferase 1 Inhibitors.

    PubMed

    Cernak, Tim; Gesmundo, Nathan J; Dykstra, Kevin; Yu, Yang; Wu, Zhicai; Shi, Zhi-Cai; Vachal, Petr; Sperbeck, Donald; He, Shuwen; Murphy, Beth Ann; Sonatore, Lisa; Williams, Steven; Madeira, Maria; Verras, Andreas; Reiter, Maud; Lee, Claire Heechoon; Cuff, James; Sherer, Edward C; Kuethe, Jeffrey; Goble, Stephen; Perrotto, Nicholas; Pinto, Shirly; Shen, Dong-Ming; Nargund, Ravi; Balkovec, James; DeVita, Robert J; Dreher, Spencer D

    2017-05-11

    Miniaturization and parallel processing play an important role in the evolution of many technologies. We demonstrate the application of miniaturized high-throughput experimentation methods to resolve synthetic chemistry challenges on the frontlines of a lead optimization effort to develop diacylglycerol acyltransferase (DGAT1) inhibitors. Reactions were performed on ∼1 mg scale using glass microvials, providing a miniaturized high-throughput experimentation capability that was used to study a challenging SNAr reaction. The availability of robust synthetic chemistry conditions discovered in these miniaturized investigations enabled the development of structure-activity relationships that ultimately led to the discovery of soluble, selective, and potent inhibitors of DGAT1.

  19. High-resolution and high-throughput multichannel Fourier transform spectrometer with two-dimensional interferogram warping compensation

    NASA Astrophysics Data System (ADS)

    Watanabe, A.; Furukawa, H.

    2018-04-01

    The resolution of multichannel Fourier transform (McFT) spectroscopy is insufficient for many applications despite its extreme advantage of high throughput. We propose an improved configuration to realise both high resolution and high throughput using a two-dimensional area sensor. For the spectral resolution, we obtained the interferogram of a larger optical path difference by shifting the area sensor without altering any optical components. The non-linear phase error of the interferometer was successfully corrected using a phase-compensation calculation. Warping compensation was also applied to realise a higher throughput by accumulating the signal across vertical pixels. Our approach significantly improved the resolution and signal-to-noise ratio by factors of 1.7 and 34, respectively. This high-resolution and high-sensitivity McFT spectrometer will be useful for detecting weak light signals such as those in non-invasive diagnosis.

  20. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    PubMed

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
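
    The abstract states that data are distributed by partitioning a spatial index without naming the scheme. A Morton (Z-order) key is one common choice for 3-d image cubes; the sketch below, with hypothetical bit widths and node counts, shows how interleaving cuboid coordinates yields keys whose contiguous ranges can be assigned to cluster nodes.

```python
# Hedged sketch of spatial-index partitioning (the paper does not specify the
# exact scheme): interleave the bits of 3-d cuboid coordinates into a Morton
# (Z-order) key and assign contiguous key ranges to cluster nodes.
def morton3d(x, y, z, bits=10):
    """Interleave the low `bits` bits of x, y, z into a single Z-order key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

def node_for_cuboid(x, y, z, n_nodes=16, bits=10):
    """Map a cuboid's grid coordinates to one of n_nodes storage nodes by
    splitting the Z-order key space into equal contiguous ranges."""
    key = morton3d(x, y, z, bits)
    return key * n_nodes // (1 << (3 * bits))

if __name__ == "__main__":
    # Neighbouring cuboids tend to share a node, which keeps spatial reads local.
    for coords in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (512, 512, 64)]:
        print(coords, "-> node", node_for_cuboid(*coords))
```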

  1. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    PubMed Central

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes—neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992

  2. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2015-05-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  3. Droplet microfluidic technology for single-cell high-throughput screening.

    PubMed

    Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L

    2009-08-25

    We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at very high throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically-coded droplet library enabling the identification of the droplets' composition during the assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together, our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications including high-throughput single-cell analyses, combinatorial screening, and facilitating small sample analyses.

  4. Microfluidic guillotine for single-cell wound repair studies

    NASA Astrophysics Data System (ADS)

    Blauch, Lucas R.; Gai, Ya; Khor, Jian Wei; Sood, Pranidhi; Marshall, Wallace F.; Tang, Sindy K. Y.

    2017-07-01

    Wound repair is a key feature distinguishing living from nonliving matter. Single cells are increasingly recognized to be capable of healing wounds. The lack of reproducible, high-throughput wounding methods has hindered single-cell wound repair studies. This work describes a microfluidic guillotine for bisecting single Stentor coeruleus cells in a continuous-flow manner. Stentor is used as a model due to its robust repair capacity and the ability to perform gene knockdown in a high-throughput manner. Local cutting dynamics reveals two regimes under which cells are bisected, one at low viscous stress where cells are cut with small membrane ruptures and high viability and one at high viscous stress where cells are cut with extended membrane ruptures and decreased viability. A cutting throughput up to 64 cells per minute—more than 200 times faster than current methods—is achieved. The method allows the generation of more than 100 cells in a synchronized stage of their repair process. This capacity, combined with high-throughput gene knockdown in Stentor, enables time-course mechanistic studies impossible with current wounding methods.

  5. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering☆

    PubMed Central

    Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan

    2016-01-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875

  6. Evaluation of high throughput gene expression platforms using a genomic biomarker signature for prediction of skin sensitization.

    PubMed

    Forreryd, Andy; Johansson, Henrik; Albrekt, Ann-Sofie; Lindstedt, Malin

    2014-05-16

    Allergic contact dermatitis (ACD) develops upon exposure to certain chemical compounds termed skin sensitizers. To reduce the occurrence of skin sensitizers, chemicals are regularly screened for their capacity to induce sensitization. The recently developed Genomic Allergen Rapid Detection (GARD) assay is an in vitro alternative to animal testing for identification of skin sensitizers, classifying chemicals by evaluating transcriptional levels of a genomic biomarker signature. During assay development and biomarker identification, genome-wide expression analysis was applied using microarrays covering approximately 30,000 transcripts. However, the microarray platform suffers from drawbacks in terms of low sample throughput, high cost per sample, and time-consuming protocols, and is a limiting factor for adaptation of GARD into a routine assay for screening of potential sensitizers. With the purpose of simplifying assay procedures, improving technical parameters, and increasing sample throughput, we assessed the performance of three high-throughput gene expression platforms (nCounter®, BioMark HD™, and OpenArray®) and correlated their performance metrics against our previously generated microarray data. We measured the levels of 30 transcripts from the GARD biomarker signature across 48 samples. Detection sensitivity, reproducibility, correlations and overall structure of gene expression measurements were compared across platforms. Gene expression data from all of the evaluated platforms could be used to classify most of the sensitizers from non-sensitizers in the GARD assay. Results also showed high data quality and acceptable reproducibility for all platforms but only medium to poor correlations of expression measurements across platforms. In addition, evaluated platforms were superior to the microarray platform in terms of cost efficiency, simplicity of protocols and sample throughput. We evaluated the performance of three non-array based platforms using a limited set of transcripts from the GARD biomarker signature. We demonstrated that it was possible to achieve acceptable discriminatory power in terms of separation between sensitizers and non-sensitizers in the GARD assay while reducing assay costs, simplifying assay procedures, and increasing sample throughput by using an alternative platform, providing a first step towards preparing GARD for formal validation and adapting the assay for industrial screening of potential sensitizers.
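
    A minimal sketch of the cross-platform comparison step is shown below: per-transcript Pearson correlations between two platforms' expression measurements. The arrays are synthetic and only mirror the study's 30-transcript × 48-sample shape; the platform names are used as labels, not as real data.

```python
# Minimal sketch of the cross-platform comparison step (synthetic data; the
# 30-transcript x 48-sample shape mirrors the study, the values do not).
import numpy as np

def per_transcript_correlation(platform_a, platform_b):
    """Pearson r between two platforms for each transcript (rows)."""
    a = platform_a - platform_a.mean(axis=1, keepdims=True)
    b = platform_b - platform_b.mean(axis=1, keepdims=True)
    num = (a * b).sum(axis=1)
    den = np.sqrt((a ** 2).sum(axis=1) * (b ** 2).sum(axis=1))
    return num / den

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.normal(8.0, 2.0, size=(30, 48))                 # "true" expression levels
    platform_a = truth + rng.normal(0.0, 0.5, truth.shape)      # platform A with noise
    platform_b = 1.1 * truth + rng.normal(0.0, 1.5, truth.shape)  # platform B, noisier
    r = per_transcript_correlation(platform_a, platform_b)
    print("median per-transcript r:", round(float(np.median(r)), 2))
```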

  7. Experiments and Analyses of Data Transfers Over Wide-Area Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata

    Dedicated wide-area network connections are increasingly employed in high-performance computing and big data scenarios. One might expect the performance and dynamics of data transfers over such connections to be easy to analyze due to the lack of competing traffic. However, non-linear transport dynamics and end-system complexities (e.g., multi-core hosts and distributed filesystems) can in fact make analysis surprisingly challenging. We present extensive measurements of memory-to-memory and disk-to-disk file transfers over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory-to-memory transfers, profiles of both TCP and UDT throughput as a function of RTT show concave and convex regions; large buffer sizes and more parallel flows lead to wider concave regions, which are highly desirable. TCP and UDT both also display complex throughput dynamics, as indicated by their Poincaré maps and Lyapunov exponents. For disk-to-disk transfers, we determine that high throughput can be achieved via a combination of parallel I/O threads, parallel network threads, and direct I/O mode. Our measurements also show that Lustre filesystems can be mounted over long-haul connections using LNet routers, although challenges remain in jointly optimizing file I/O and transport method parameters to achieve peak throughput.
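
    To see why larger buffers and more parallel flows widen the flat (concave) throughput region, a simple window-limited model helps: per-flow throughput is capped at W/RTT and the aggregate at the link capacity. The sketch below is an illustrative model with assumed buffer sizes and flow counts, not the measurements reported in the paper.

```python
# Illustrative model only (not the measurements in the paper): steady-state
# throughput of N parallel flows, each window-limited to W bytes, on a link of
# capacity C, as a function of round-trip time.
def aggregate_throughput_gbps(rtt_ms, n_flows, window_mb, capacity_gbps=10.0):
    """min(link capacity, N * W / RTT) expressed in Gbps."""
    if rtt_ms == 0:
        return capacity_gbps
    window_bits = window_mb * 8e6
    per_flow_gbps = window_bits / (rtt_ms * 1e-3) / 1e9
    return min(capacity_gbps, n_flows * per_flow_gbps)

if __name__ == "__main__":
    # Larger buffers and more flows push the knee (where RTT starts to hurt)
    # to longer round-trip times, widening the flat high-throughput region.
    for rtt in (10, 50, 100, 200, 366):
        base = aggregate_throughput_gbps(rtt, n_flows=1, window_mb=32)
        tuned = aggregate_throughput_gbps(rtt, n_flows=8, window_mb=128)
        print(f"RTT {rtt:3d} ms: 1x32MB -> {base:5.2f} Gbps, 8x128MB -> {tuned:5.2f} Gbps")
```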

  8. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  9. Performance comparison of SNP detection tools with illumina exome sequencing data—an assessment using both family pedigree information and sample-matched SNP array data

    PubMed Central

    Yi, Ming; Zhao, Yongmei; Jia, Li; He, Mei; Kebebew, Electron; Stephens, Robert M.

    2014-01-01

    To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We have used an Illumina exome-seq dataset as a benchmark, with two validation scenarios (family pedigree information and SNP array data for the same samples) permitting global high-throughput cross-validation, to evaluate the quality of SNP calls derived from several popular variant discovery tools from both the open-source and commercial communities using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data, which allows us to gain insights into the accuracy of SNP calling in an unprecedented way; previously reported comparison studies have only assessed concordance of these tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools with a focus on exome-seq, which can be used to compare any forthcoming tool(s) of interest. PMID:24831545
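
    The Mendelian-inheritance check used for validation can be reduced to a simple rule for autosomal, unphased genotypes: a child call is consistent if one allele can come from each parent. The sketch below is a simplified illustration of that rule, not the published pipeline.

```python
# Simplified autosomal Mendelian-consistency check (not the published pipeline):
# an unphased child genotype is consistent if one allele can be drawn from each
# parent. Genotypes are ('A', 'G')-style allele pairs.
def mendelian_consistent(child, mother, father):
    """True if some assignment of the child's two alleles to the parents works."""
    c1, c2 = child
    return any(
        (a in mother and b in father)
        for a, b in ((c1, c2), (c2, c1))
    )

def trio_error_rate(trios):
    """Fraction of trio genotype calls that violate Mendelian inheritance."""
    violations = sum(not mendelian_consistent(c, m, f) for c, m, f in trios)
    return violations / len(trios)

if __name__ == "__main__":
    trios = [
        (("A", "G"), ("A", "A"), ("G", "G")),   # consistent
        (("G", "G"), ("A", "A"), ("A", "G")),   # violation: mother cannot give G
        (("A", "A"), ("A", "G"), ("A", "G")),   # consistent
    ]
    print("Mendelian error rate:", round(trio_error_rate(trios), 2))
```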

  10. Fluorescence-based high-throughput screening of dicer cleavage activity.

    PubMed

    Podolska, Katerina; Sedlak, David; Bartunek, Petr; Svoboda, Petr

    2014-03-01

    Production of small RNAs by ribonuclease III Dicer is a key step in microRNA and RNA interference pathways, which employ Dicer-produced small RNAs as sequence-specific silencing guides. Further studies and manipulations of microRNA and RNA interference pathways would benefit from identification of small-molecule modulators. Here, we report a study of a fluorescence-based in vitro Dicer cleavage assay, which was adapted for high-throughput screening. The kinetic assay can be performed under single-turnover conditions (35 nM substrate and 70 nM Dicer) in a small volume (5 µL), which makes it suitable for high-throughput screening in a 1536-well format. As a proof of principle, a small library of bioactive compounds was analyzed, demonstrating the potential of the assay.
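
    Under single-turnover conditions, the fluorescence time course from such an assay is commonly reduced to an observed rate constant by fitting a single exponential. The sketch below fits synthetic data with SciPy; the rate, amplitude, and noise level are placeholders, not values from the paper.

```python
# Hedged sketch: fit a single-exponential to a fluorescence time course to get
# the observed single-turnover cleavage rate k_obs. Data below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def single_exponential(t, amplitude, k_obs, offset):
    """F(t) = amplitude * (1 - exp(-k_obs * t)) + offset."""
    return amplitude * (1.0 - np.exp(-k_obs * t)) + offset

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 60.0, 31)                       # minutes
    true_amp, true_k, true_off = 1000.0, 0.12, 50.0
    signal = single_exponential(t, true_amp, true_k, true_off)
    noisy = signal + rng.normal(0.0, 15.0, t.shape)
    popt, _ = curve_fit(single_exponential, t, noisy, p0=(800.0, 0.05, 0.0))
    amp, k_obs, off = popt
    print(f"fitted k_obs = {k_obs:.3f} per minute (true 0.12)")
```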

  11. Behavioral barcoding in the cloud: embracing data-intensive digital phenotyping in neuropharmacology.

    PubMed

    Kokel, David; Rennekamp, Andrew J; Shah, Asmi H; Liebel, Urban; Peterson, Randall T

    2012-08-01

    For decades, studying the behavioral effects of individual drugs and genetic mutations has been at the heart of efforts to understand and treat nervous system disorders. High-throughput technologies adapted from other disciplines (e.g., high-throughput chemical screening, genomics) are changing the scale of data acquisition in behavioral neuroscience. Massive behavioral datasets are beginning to emerge, particularly from zebrafish labs, where behavioral assays can be performed rapidly and reproducibly in 96-well, high-throughput format. Mining these datasets and making comparisons across different assays are major challenges for the field. Here, we review behavioral barcoding, a process by which complex behavioral assays are reduced to a string of numeric features, facilitating analysis and comparison within and across datasets. Copyright © 2012 Elsevier Ltd. All rights reserved.
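
    As a toy illustration of what reducing an assay to a string of numeric features can look like, the sketch below bins a per-well activity trace after a stimulus and z-scores the bins against the pre-stimulus baseline. The feature choices are hypothetical and do not reproduce any published barcode definition.

```python
# Toy illustration of behavioral barcoding (feature choices are hypothetical,
# not the published definition): reduce a per-well activity trace to a short,
# comparable feature vector.
import numpy as np

def behavioral_barcode(activity, stimulus_index, n_bins=8):
    """Return a fixed-length feature vector: mean activity in equal time bins
    after the stimulus, z-scored against the pre-stimulus baseline."""
    baseline = activity[:stimulus_index]
    post = activity[stimulus_index:]
    bins = np.array_split(post, n_bins)
    features = np.array([segment.mean() for segment in bins])
    return (features - baseline.mean()) / (baseline.std() + 1e-9)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    trace = rng.normal(1.0, 0.2, 600)                    # 10 min at 1 Hz, baseline
    trace[300:360] += 2.0                                # transient response to drug
    print(np.round(behavioral_barcode(trace, stimulus_index=300), 2))
```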

  12. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large-aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10−10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  13. 3D pulsed laser-triggered high-speed microfluidic fluorescence-activated cell sorter

    PubMed Central

    Chen, Yue; Wu, Ting-Hsiang; Kung, Yu-Chun; Teitell, Michael A.; Chiou, Pei-Yu

    2014-01-01

    We report a 3D microfluidic pulsed laser-triggered fluorescence-activated cell sorter capable of sorting at a throughput of 23,000 cells sec−1 with 90% purity in high-purity mode and at a throughput of 45,000 cells sec−1 with 45% purity in enrichment mode in one stage and in a single channel. This performance is realized by exciting laser-induced cavitation bubbles in a 3D PDMS microfluidic channel to generate high-speed liquid jets that deflect detected fluorescent cells and particles focused by 3D sheath flows. The ultrafast switching mechanism (20 μsec complete on-off cycle), small liquid jet perturbation volume, and three-dimensional sheath flow focusing for accurate timing control of fast (1.5 m sec−1) passing cells and particles are three critical factors enabling high-purity sorting at high throughput in this sorter. PMID:23844418

  14. A high throughput architecture for a low complexity soft-output demapping algorithm

    NASA Astrophysics Data System (ADS)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

    Iterative channel decoders such as Turbo-Code and LDPC decoders show exceptional performance and therefore they are a part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the logarithmic likelihood ratio (LLR) of the received bits with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high-order modulation systems, and therefore low-complexity demapping algorithms are indispensable in low-power receivers. In the presence of multiple wireless communication standards where each standard defines multiple modulation schemes, there is a need for an efficient demapper architecture covering all the flexibility requirements of these standards. Another challenge associated with hardware implementation of the demapper is to achieve a very high throughput in double iterative systems, for instance, MIMO and Code-Aided Synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low-complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high-throughput, flexible, and area-efficient architecture. We describe architectures to execute the investigated algorithms. We implement these architectures on an FPGA device to evaluate their hardware performance. The work has resulted in a hardware architecture based on the best of the investigated low-complexity algorithms, delivering a high throughput of 166 Msymbols/second for Gray-mapped 16-QAM modulation on Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs, and 2 DSP48Es.
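
    A common low-complexity soft demapper is the max-log approximation, in which each bit LLR is the difference of the two smallest Euclidean metrics over the symbols labelled with that bit value. The sketch below is a floating-point reference for Gray-mapped 16-QAM, not the hardware architecture described in the paper; the constellation scaling and LLR sign convention are assumptions.

```python
# Reference sketch (not the hardware architecture in the paper): max-log LLR
# computation for Gray-mapped 16-QAM. Sign convention: positive LLR favours bit 0.
import numpy as np

GRAY_PAM = {(0, 0): -3.0, (0, 1): -1.0, (1, 1): +1.0, (1, 0): +3.0}

def build_16qam():
    """Return (symbols, bit_labels) for Gray-mapped 16-QAM, unit average energy."""
    symbols, labels = [], []
    for b0 in (0, 1):
        for b1 in (0, 1):
            for b2 in (0, 1):
                for b3 in (0, 1):
                    point = GRAY_PAM[(b0, b1)] + 1j * GRAY_PAM[(b2, b3)]
                    symbols.append(point / np.sqrt(10.0))   # E{|s|^2} = 1
                    labels.append((b0, b1, b2, b3))
    return np.array(symbols), np.array(labels)

def maxlog_llrs(received, noise_var):
    """Max-log LLRs for one received complex symbol, one value per bit."""
    symbols, labels = build_16qam()
    metrics = np.abs(received - symbols) ** 2 / noise_var
    llrs = []
    for bit in range(4):
        m0 = metrics[labels[:, bit] == 0].min()
        m1 = metrics[labels[:, bit] == 1].min()
        llrs.append(m1 - m0)           # > 0 means bit 0 is more likely
    return np.array(llrs)

if __name__ == "__main__":
    tx = (GRAY_PAM[(1, 0)] + 1j * GRAY_PAM[(0, 1)]) / np.sqrt(10.0)   # bits 1 0 0 1
    rx = tx + 0.05 * (1 + 1j)                                          # mild noise
    print(np.round(maxlog_llrs(rx, noise_var=0.1), 2))
```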

  15. Passive and Active Monitoring on a High Performance Research Network.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Warren

    2001-05-01

    The bold network challenges described in "Internet End-to-end Performance Monitoring for the High Energy and Nuclear Physics Community" presented at PAM 2000 have been tackled by the intrepid administrators and engineers providing the network services. After less than a year, the BaBar collaboration has collected almost 100 million particle collision events in a database approaching 165 TB (tera = 10^12). Around 20 TB has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, for processing, and around 40 TB of simulated events have been imported to SLAC from Lawrence Livermore National Laboratory (LLNL). An unforeseen challenge has arisen due to recent events and highlighted security concerns at DoE-funded labs. New rules and regulations suggest it is only a matter of time before many active performance measurements may not be possible between many sites. Yet, at the same time, the importance of understanding every aspect of the network and eradicating packet loss for high-throughput data transfers has become apparent. Work at SLAC to employ passive monitoring using netflow and OC3MON is underway and techniques to supplement and possibly replace the active measurements are being considered. This paper will detail the special needs and traffic characterization of a remarkable research project, and how the networking hurdles have been resolved (or not!) to achieve the required high data throughput. Results from active and passive measurements will be compared, and methods for achieving high throughput and the effect on the network will be assessed along with tools that directly measure throughput and applications used to actually transfer data.

  16. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    PubMed

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
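
    The core cross-referencing operation can be pictured as a set intersection between the user's gene list and each stored dataset, scored with a hypergeometric enrichment test. The sketch below illustrates that pattern with placeholder datasets; it is not CrossCheck's actual database or scoring.

```python
# Minimal sketch of the cross-referencing step (datasets below are placeholders,
# not the CrossCheck database): intersect the user's gene list with each stored
# dataset and score the overlap with a hypergeometric test.
from scipy.stats import hypergeom

def cross_check(user_genes, datasets, background_size=20000):
    """Return [(dataset_name, shared_genes, p_value)] sorted by p-value."""
    user = set(user_genes)
    results = []
    for name, genes in datasets.items():
        hits = user & set(genes)
        # P(X >= |hits|) for drawing |user| genes from a background that
        # contains |genes| "successes".
        p = hypergeom.sf(len(hits) - 1, background_size, len(set(genes)), len(user))
        results.append((name, sorted(hits), p))
    return sorted(results, key=lambda r: r[2])

if __name__ == "__main__":
    datasets = {
        "RNAi_screen_A": ["TP53", "ATM", "CHEK2", "MDM2"],
        "kinase_substrates_B": ["AKT1", "GSK3B", "TSC2"],
    }
    for name, hits, p in cross_check(["TP53", "MDM2", "AKT1"], datasets):
        print(f"{name}: shared={hits}, p={p:.2e}")
```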

  17. High-Reflectivity Coatings for a Vacuum Ultraviolet Spectropolarimeter

    NASA Astrophysics Data System (ADS)

    Narukage, Noriyuki; Kubo, Masahito; Ishikawa, Ryohko; Ishikawa, Shin-nosuke; Katsukawa, Yukio; Kobiki, Toshihiko; Giono, Gabriel; Kano, Ryouhei; Bando, Takamasa; Tsuneta, Saku; Auchère, Frédéric; Kobayashi, Ken; Winebarger, Amy; McCandless, Jim; Chen, Jianrong; Choi, Joanne

    2017-03-01

    Precise polarization measurements in the vacuum ultraviolet (VUV) region are expected to be a new tool for inferring the magnetic fields in the upper atmosphere of the Sun. High-reflectivity coatings are key elements to achieving high-throughput optics for precise polarization measurements. We fabricated three types of high-reflectivity coatings for a solar spectropolarimeter in the hydrogen Lyman-α (Lyα; 121.567 nm) region and evaluated their performance. The first high-reflectivity mirror coating offers a reflectivity of more than 80 % in Lyα optics. The second is a reflective narrow-band filter coating that has a peak reflectivity of 57 % in Lyα, whereas its reflectivity in the visible light range is lower than 1/10 of the peak reflectivity (~5 % on average). This coating can be used to easily realize a visible light rejection system, which is indispensable for a solar telescope, while maintaining high throughput in the Lyα line. The third is a high-efficiency reflective polarizing coating that almost exclusively reflects an s-polarized beam at its Brewster angle of 68° with a reflectivity of 55 %. This coating achieves both high polarizing power and high throughput. These coatings contributed to the high-throughput solar VUV spectropolarimeter called the Chromospheric Lyman-Alpha SpectroPolarimeter (CLASP), which was launched on 3 September, 2015.

  18. Heterogeneous high throughput scientific computing with APM X-Gene and Intel Xeon Phi

    DOE PAGES

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; ...

    2015-05-22

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. As a result, we report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  19. High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila.

    PubMed

    Chiaraviglio, Lucius; Kirby, James E

    2015-12-01

    Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  20. High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila

    PubMed Central

    Chiaraviglio, Lucius

    2015-01-01

    Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. PMID:26392509

  1. Semantically enabled and statistically supported biological hypothesis testing with tissue microarray databases

    PubMed Central

    2011-01-01

    Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high-throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform relevant statistical testing on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses awaiting high-throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases by semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQL queries to reflect the semantic structures of the hypotheses, and (3) performing statistical tests on the result sets returned by the SPARQL queries. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high-throughput biological hypothesis testing. We believe that such preliminary investigation can be beneficial before performing highly controlled experiments. PMID:21342584
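
    The query-then-test pattern described above can be sketched as follows: run a SPARQL query over RDF-encoded TMA data, tabulate the result set into a contingency table, and apply a statistical test. The ontology prefix, predicates, and data below are hypothetical, and rdflib plus SciPy stand in for whatever store and statistics Xperanto-RDF actually used.

```python
# Hedged sketch of the query-then-test pattern (ontology predicates and data are
# hypothetical; rdflib stands in for the actual Xperanto-RDF store). The SPARQL
# result is tabulated into a 2x2 table and tested with Fisher's exact test.
import rdflib
from scipy.stats import fisher_exact

TURTLE = """
@prefix ex: <http://example.org/tma#> .
ex:core1 ex:markerPositive true  ; ex:tumorGrade "high" .
ex:core2 ex:markerPositive true  ; ex:tumorGrade "high" .
ex:core3 ex:markerPositive false ; ex:tumorGrade "low"  .
ex:core4 ex:markerPositive false ; ex:tumorGrade "high" .
ex:core5 ex:markerPositive true  ; ex:tumorGrade "low"  .
"""

QUERY = """
PREFIX ex: <http://example.org/tma#>
SELECT ?positive ?grade WHERE { ?core ex:markerPositive ?positive ; ex:tumorGrade ?grade . }
"""

if __name__ == "__main__":
    graph = rdflib.Graph()
    graph.parse(data=TURTLE, format="turtle")
    table = [[0, 0], [0, 0]]                     # rows: marker +/-, cols: grade high/low
    for positive, grade in graph.query(QUERY):
        row = 0 if positive.toPython() else 1
        col = 0 if str(grade) == "high" else 1
        table[row][col] += 1
    odds_ratio, p_value = fisher_exact(table)
    print("contingency table:", table, "p =", round(p_value, 3))
```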

  2. The MAMMOTH project

    NASA Technical Reports Server (NTRS)

    Gerchar, Tim

    1994-01-01

    On the surface, MAMMOTH is a high-performance 5.25-inch half-high 8 mm helical-scan tape drive that records a native 20 Gigabytes of data on Advanced Metal Evaporated media at a sustained throughput of 3 Megabytes per second over a high-speed SCSI interface, and is scheduled for production in the second half of 1995. But it's much more than that. Inside its custom-designed sheet-metal enclosure lies one of the greatest technical achievements of its kind. Exabyte's strategic direction is to increase throughput and capacity while continuing to improve drive, data, and media reliability across its products. MAMMOTH adheres to that direction, and its technical advances are described in this paper. MAMMOTH can be broken down into four main functional assemblies: high-performance integrated digital electronics, high-reliability tape transport mechanism, high-performance scanner, and advanced metal evaporated media. All this technology is packaged into a standard 5.25-inch half-high form factor that dissipates only 15 watts.

  3. Determination of equilibrium dissociation constants for recombinant antibodies by high-throughput affinity electrophoresis.

    PubMed

    Pan, Yuchen; Sackmann, Eric K; Wypisniak, Karolina; Hornsby, Michael; Datwani, Sammy S; Herr, Amy E

    2016-12-23

    High-quality immunoreagents enhance the performance and reproducibility of immunoassays and, in turn, the quality of both biological and clinical measurements. High-quality recombinant immunoreagents are generated using antibody-phage display. One metric of antibody quality, the binding affinity, is quantified through the dissociation constant (KD) of each recombinant antibody and the target antigen. To characterize the KD of recombinant antibodies and target antigen, we introduce affinity electrophoretic mobility shift assays (EMSAs) in a high-throughput format suitable for small volume samples. A microfluidic card comprised of free-standing polyacrylamide gel (fsPAG) separation lanes supports 384 concurrent EMSAs in 30 s using a single power source. Sample is dispensed onto the microfluidic EMSA card by acoustic droplet ejection (ADE), which reduces EMSA variability compared to sample dispensing using manual or pin tools. The KD for each of a six-member fragment antigen-binding fragment library is reported using ~25-fold less sample mass and ~5-fold less time than conventional heterogeneous assays. Given the form factor and performance of this micro- and mesofluidic workflow, we have developed a sample-sparing, high-throughput, solution-phase alternative for biomolecular affinity characterization.
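
    Given per-lane fractions of shifted probe across a titration, KD can be estimated by fitting a 1:1 binding isotherm. The sketch below does this on synthetic data with SciPy; the concentrations and the true KD are placeholders, not the paper's measurements.

```python
# Hedged sketch (synthetic data, not the paper's measurements): estimate KD by
# fitting the fraction of probe shifted in each lane to a 1:1 binding isotherm.
import numpy as np
from scipy.optimize import curve_fit

def binding_isotherm(conc_nm, kd_nm):
    """Fraction bound for a 1:1 interaction with titrant in excess."""
    return conc_nm / (kd_nm + conc_nm)

if __name__ == "__main__":
    conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)   # nM antibody
    rng = np.random.default_rng(4)
    fraction_bound = binding_isotherm(conc, kd_nm=25.0) + rng.normal(0, 0.03, conc.size)
    (kd_fit,), _ = curve_fit(binding_isotherm, conc, fraction_bound, p0=[50.0])
    print(f"fitted KD ~ {kd_fit:.1f} nM (true 25 nM)")
```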

  4. Determination of equilibrium dissociation constants for recombinant antibodies by high-throughput affinity electrophoresis

    PubMed Central

    Pan, Yuchen; Sackmann, Eric K.; Wypisniak, Karolina; Hornsby, Michael; Datwani, Sammy S.; Herr, Amy E.

    2016-01-01

    High-quality immunoreagents enhance the performance and reproducibility of immunoassays and, in turn, the quality of both biological and clinical measurements. High quality recombinant immunoreagents are generated using antibody-phage display. One metric of antibody quality – the binding affinity – is quantified through the dissociation constant (KD) of each recombinant antibody and the target antigen. To characterize the KD of recombinant antibodies and target antigen, we introduce affinity electrophoretic mobility shift assays (EMSAs) in a high-throughput format suitable for small volume samples. A microfluidic card comprised of free-standing polyacrylamide gel (fsPAG) separation lanes supports 384 concurrent EMSAs in 30 s using a single power source. Sample is dispensed onto the microfluidic EMSA card by acoustic droplet ejection (ADE), which reduces EMSA variability compared to sample dispensing using manual or pin tools. The KD for each of a six-member fragment antigen-binding fragment library is reported using ~25-fold less sample mass and ~5-fold less time than conventional heterogeneous assays. Given the form factor and performance of this micro- and mesofluidic workflow, we have developed a sample-sparing, high-throughput, solution-phase alternative for biomolecular affinity characterization. PMID:28008969

  5. High throughput computing: a solution for scientific analysis

    USGS Publications Warehouse

    O'Donnell, M.

    2011-01-01

    handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).

  6. Photon-Counting H33D Detector for Biological Fluorescence Imaging

    PubMed Central

    Michalet, X.; Siegmund, O.H.W.; Vallerga, J.V.; Jelinsky, P.; Millaud, J.E.; Weiss, S.

    2010-01-01

    We have developed a photon-counting High-temporal and High-spatial resolution, High-throughput 3-Dimensional detector (H33D) for biological imaging of fluorescent samples. The design is based on a 25 mm diameter S20 photocathode followed by a 3-microchannel plate stack, and a cross delay line anode. We describe the bench performance of the H33D detector, as well as preliminary imaging results obtained with fluorescent beads, quantum dots and live cells and discuss applications of future generation detectors for single-molecule imaging and high-throughput study of biomolecular interactions. PMID:20151021

  7. Performance Evaluation of the Sysmex CS-5100 Automated Coagulation Analyzer.

    PubMed

    Chen, Liming; Chen, Yu

    2015-01-01

    Coagulation testing is widely applied clinically, and laboratories increasingly demand automated coagulation analyzers with short turn-around times and high throughput. The purpose of this study was to evaluate the performance of the Sysmex CS-5100 automated coagulation analyzer for routine use in a clinical laboratory. The prothrombin time (PT), international normalized ratio (INR), activated partial thromboplastin time (APTT), fibrinogen (Fbg), and D-dimer were compared between the Sysmex CS-5100 and Sysmex CA-7000 analyzers, and the imprecision, comparison, throughput, STAT function, and performance for abnormal samples were measured in each. The within-run and between-run coefficients of variation (CV) for the PT, APTT, INR, and D-dimer analyses showed excellent results both in the normal and pathologic ranges. Results from the Sysmex CS-5100 and Sysmex CA-7000 were highly correlated. The throughput of the Sysmex CS-5100 was higher than that of the Sysmex CA-7000. There was no interference from total bilirubin or triglyceride concentrations in the Sysmex CS-5100 analyzer. We demonstrated that the Sysmex CS-5100 performs with satisfactory imprecision and is well suited for coagulation analysis in laboratories processing large sample numbers and icteric and lipemic samples.

  8. REDItools: high-throughput RNA editing detection made easy.

    PubMed

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. Availability: REDItools are written in the Python programming language and are freely available at http://code.google.com/p/reditools/. Contact: ernesto.picardi@uniba.it or graziano.pesole@uniba.it. Supplementary data are available at Bioinformatics online.
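
    A toy sketch of the core idea behind RNA editing detection (not REDItools itself, which operates on sequence alignments): compare RNA base calls against the genomic reference and flag A-to-G mismatches that pass coverage and frequency thresholds. The pileup data below are invented.

        # Hedged sketch: flag candidate A-to-I(G) editing sites from pre-tabulated
        # per-position base counts. Positions and bases are made-up examples.
        MIN_COVERAGE = 10
        MIN_EDIT_FREQ = 0.1

        # (position, reference_base, RNA bases observed at that position)
        pileup = [
            (1042, "A", ["A", "G", "G", "A", "G", "A", "A", "G", "A", "G", "A", "G"]),
            (1187, "C", ["C"] * 15),
            (2390, "A", ["A", "A", "G", "A", "A", "A", "A", "A", "A", "A", "A"]),
        ]

        for pos, ref, bases in pileup:
            coverage = len(bases)
            if ref != "A" or coverage < MIN_COVERAGE:
                continue
            edit_freq = bases.count("G") / coverage
            if edit_freq >= MIN_EDIT_FREQ:
                print(f"candidate editing site at {pos}: "
                      f"coverage={coverage}, editing frequency={edit_freq:.2f}")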

  9. Fluorescence imaging technology (FI) for high-throughput screening of selenide-modified nano-TiO2 catalysts.

    PubMed

    Wang, Liping; Lee, Jianchao; Zhang, Meijuan; Duan, Qiannan; Zhang, Jiarui; Qi, Hailang

    2016-02-18

    A high-throughput screening (HTS) method based on fluorescence imaging (FI) was implemented to evaluate the catalytic performance of selenide-modified nano-TiO2. Chemical ink-jet printing (IJP) technology was reformed to fabricate a catalyst library comprising 1405 (Ni(a)Cu(b)Cd(c)Ce(d)In(e)Y(f))Se(x)/TiO2 (M6Se/Ti) composite photocatalysts. Nineteen M6Se/Tis were screened out from the 1405 candidates efficiently.

  10. High-throughput GPU-based LDPC decoding

    NASA Astrophysics Data System (ADS)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) code is a linear block code known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, Wi-Fi and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize a practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.
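
    For illustration, a simplified hard-decision bit-flipping decoder over a toy parity-check matrix is sketched below. The paper's GPU decoder implements the iterative sum-product algorithm, so this sketch only shows the iterative check/flip structure of LDPC decoding, not the described implementation.

        # Hedged sketch: hard-decision bit-flipping LDPC decoding on a toy code.
        import numpy as np

        def bit_flip_decode(H, received, max_iters=20):
            """Iteratively flip the bit involved in the most unsatisfied parity checks."""
            word = received.copy()
            for _ in range(max_iters):
                syndrome = H.dot(word) % 2          # 1 marks an unsatisfied check
                if not syndrome.any():
                    return word                      # all parity checks satisfied
                # Count, for each bit, how many failing checks it participates in.
                failures = syndrome.dot(H)
                word[np.argmax(failures)] ^= 1       # flip the most suspicious bit
            return word

        # Toy (n=6) parity-check matrix; received word is the codeword
        # [1, 0, 1, 1, 1, 0] with bit 1 flipped by a channel error.
        H = np.array([[1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 1, 0, 0, 1]])
        received = np.array([1, 1, 1, 1, 1, 0])
        print("decoded:", bit_flip_decode(H, received))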

  11. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orton, Daniel J.; Tfaily, Malak M.; Moore, Ronald J.

    To better understand disease conditions and environmental perturbations, multi-omic studies (i.e. proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover as long as washing buffers specific to each molecular analysis were utilized.

  12. Study of data I/O performance on distributed disk system in mask data preparation

    NASA Astrophysics Data System (ADS)

    Ohara, Shuichiro; Odaira, Hiroyuki; Chikanaga, Tomoyuki; Hamaji, Masakazu; Yoshioka, Yasuharu

    2010-09-01

    Data volume is getting larger every day in Mask Data Preparation (MDP). In the meantime, faster data handling is always required. The MDP flow typically introduces a Distributed Processing (DP) system to meet this demand, because using hundreds of CPUs is a reasonable solution. However, even if the number of CPUs were increased, the throughput might saturate because hard disk I/O and network speeds can become bottlenecks. MDP therefore needs to invest not only in hundreds of CPUs but also in storage and network devices to increase throughput. NCS introduces a new distributed processing system called "NDE". NDE is a distributed disk system that increases throughput without large investment because it is designed to use multiple conventional hard drives appropriately over the network. In this paper, NCS studies I/O performance with the OASIS® data format on NDE, which contributes to realizing high throughput.

  13. QoS support for end users of I/O-intensive applications using shared storage systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Marion Kei; Zhang, Xuechen; Jiang, Song

    2011-01-19

    I/O-intensive applications are becoming increasingly common on today's high-performance computing systems. While performance of compute-bound applications can be effectively guaranteed with techniques such as space sharing or QoS-aware process scheduling, it remains a challenge to meet QoS requirements for end users of I/O-intensive applications using shared storage systems because it is difficult to differentiate I/O services for different applications with individual quality requirements. Furthermore, it is difficult for end users to accurately specify performance goals to the storage system using I/O-related metrics such as request latency or throughput. As access patterns, request rates, and the system workload change over time, a fixed I/O performance goal, such as bounds on throughput or latency, can be expensive to achieve and may not lead to meaningful performance guarantees such as bounded program execution time. We propose a scheme supporting end-users' QoS goals, specified in terms of program execution time, in shared storage environments. We automatically translate the users' performance goals into instantaneous I/O throughput bounds using a machine learning technique, and use dynamically determined service time windows to efficiently meet the throughput bounds. We have implemented this scheme in the PVFS2 parallel file system and have conducted an extensive evaluation. Our results show that this scheme can satisfy realistic end-user QoS requirements by making highly efficient use of the I/O resources. The scheme seeks to balance programs' attainment of QoS requirements and saves as much of the remaining I/O capacity as possible for best-effort programs.
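
    A minimal sketch of the general idea of translating an execution-time goal into a throughput bound: fit a simple runtime-versus-throughput model to historical runs and invert it. The training points and the 1/throughput model are assumptions made for illustration; the paper's machine learning technique is not reproduced here.

        # Hedged sketch: learn runtime ~= a / throughput + b from past runs, then
        # invert it to get the minimum throughput for a runtime goal. Data invented.
        import numpy as np

        throughput = np.array([20.0, 40.0, 80.0, 160.0, 320.0])   # MB/s, made up
        runtime = np.array([410.0, 215.0, 120.0, 72.0, 48.0])     # seconds, made up

        # Linear fit of runtime against 1/throughput (I/O time plus compute floor).
        a, b = np.polyfit(1.0 / throughput, runtime, 1)

        def throughput_bound(runtime_goal_s):
            """Invert the model: smallest throughput expected to hit the runtime goal."""
            if runtime_goal_s <= b:
                raise ValueError("goal below the model's compute-bound floor")
            return a / (runtime_goal_s - b)

        print(f"Need ~= {throughput_bound(100.0):.1f} MB/s to finish in 100 s")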

  14. A high-throughput assay for DNA topoisomerases and other enzymes, based on DNA triplex formation.

    PubMed

    Burrell, Matthew R; Burton, Nicolas P; Maxwell, Anthony

    2010-01-01

    We have developed a rapid, high-throughput assay for measuring the catalytic activity (DNA supercoiling or relaxation) of topoisomerase enzymes that is also capable of monitoring the activity of other enzymes that alter the topology of DNA. The assay utilises intermolecular triplex formation to resolve supercoiled and relaxed forms of DNA, the principle being that a negatively supercoiled plasmid forms an intermolecular triplex with an immobilised oligonucleotide more efficiently than the relaxed form. The assay provides a number of advantages over the standard gel-based methods, including greater speed of analysis, reduced sample handling, better quantitation and improved reliability and accuracy of output data. The assay is performed in microtitre plates and can be adapted to high-throughput screening of libraries of potential inhibitors of topoisomerases including bacterial DNA gyrase.

  15. Improving Data Transfer Throughput with Direct Search Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balaprakash, Prasanna; Morozov, Vitali; Kettimuthu, Rajkumar

    2016-01-01

    Improving data transfer throughput over high-speed long-distance networks has become increasingly difficult. Numerous factors such as nondeterministic congestion, dynamics of the transfer protocol, and multiuser and multitask source and destination endpoints, as well as interactions among these factors, contribute to this difficulty. A promising approach to improving throughput consists in using parallel streams at the application layer. We formulate and solve the problem of choosing the number of such streams from a mathematical optimization perspective. We propose the use of direct search methods, a class of easy-to-implement and light-weight mathematical optimization algorithms, to improve the performance of data transfers by dynamically adapting the number of parallel streams in a manner that does not require domain expertise, instrumentation, analytical models, or historic data. We apply our method to transfers performed with the GridFTP protocol, and illustrate the effectiveness of the proposed algorithm when used within Globus, a state-of-the-art data transfer tool, on production WAN links and servers. We show that when compared to user default settings our direct search methods can achieve up to 10x performance improvement under certain conditions. We also show that our method can overcome performance degradation due to external compute and network load on source end points, a common scenario at high performance computing facilities.
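
    The sketch below illustrates a direct-search loop over the number of parallel streams: probe nearby stream counts and move while measured throughput improves. The measure_throughput function is a hypothetical stand-in for timing a real GridFTP/Globus transfer, and the search is deliberately simpler than the algorithms evaluated in the paper.

        # Hedged sketch: hill-climbing style direct search on the stream count.
        # measure_throughput() is a synthetic placeholder, not a real transfer probe.
        import random

        def measure_throughput(num_streams):
            """Placeholder: pretend-measure transfer throughput for a stream count."""
            # Synthetic concave response with noise, peaking around 8 streams.
            return 100 * num_streams / (1 + (num_streams / 8.0) ** 2) + random.uniform(-5, 5)

        def direct_search(start=1, max_streams=64, max_probes=20):
            current = start
            best = measure_throughput(current)
            for _ in range(max_probes):
                improved = False
                for candidate in (current * 2, max(1, current // 2), current + 1, max(1, current - 1)):
                    if candidate == current or candidate > max_streams:
                        continue
                    rate = measure_throughput(candidate)
                    if rate > best:
                        current, best, improved = candidate, rate, True
                        break
                if not improved:
                    break
            return current, best

        streams, rate = direct_search()
        print(f"chosen parallel streams: {streams}, estimated throughput: {rate:.1f} MB/s")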

  16. Improvement in electron-beam lithography throughput by exploiting relaxed patterning fidelity requirements with directed self-assembly

    NASA Astrophysics Data System (ADS)

    Yu, Hao Yun; Liu, Chun-Hung; Shen, Yu Tian; Lee, Hsuan-Ping; Tsai, Kuen Yu

    2014-03-01

    Line edge roughness (LER) influencing the electrical performance of circuit components is a key challenge for electron-beam lithography (EBL) due to the continuous scaling of technology feature sizes. Controlling LER within an acceptable tolerance that satisfies International Technology Roadmap for Semiconductors requirements while achieving high throughput has become a challenging issue. Although lower dosage and more-sensitive resist can be used to improve throughput, they would result in serious LER-related problems because of increasing relative fluctuation in the incident positions of electrons. Directed self-assembly (DSA) is a promising technique to relax LER-related pattern fidelity (PF) requirements because of its self-healing ability, which may benefit throughput. To quantify the potential throughput improvement in EBL from introducing DSA for post healing, rigorous numerical methods are proposed to maximize throughput by adjusting writing parameters of EBL systems subject to relaxed LER-related PF requirements. A fast, continuous model for parameter sweeping and a hybrid model for more accurate patterning prediction are employed for the patterning simulation. The tradeoff between throughput and DSA self-healing ability is investigated. Preliminary results indicate that significant throughput improvements are achievable at certain process conditions.

  17. VIRTEX-5 Fpga Implementation of Advanced Encryption Standard Algorithm

    NASA Astrophysics Data System (ADS)

    Rais, Muhammad H.; Qasim, Syed M.

    2010-06-01

    In this paper, we present an implementation of Advanced Encryption Standard (AES) cryptographic algorithm using state-of-the-art Virtex-5 Field Programmable Gate Array (FPGA). The design is coded in Very High Speed Integrated Circuit Hardware Description Language (VHDL). Timing simulation is performed to verify the functionality of the designed circuit. Performance evaluation is also done in terms of throughput and area. The design implemented on Virtex-5 (XC5VLX50FFG676-3) FPGA achieves a maximum throughput of 4.34 Gbps utilizing a total of 399 slices.
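
    A back-of-the-envelope calculation of the throughput metric used for such FPGA designs is sketched below; the clock frequency and cycle count are illustrative assumptions, not figures reported for the Virtex-5 (XC5VLX50FFG676-3) implementation.

        # Hedged sketch: FPGA AES throughput = (block bits x clock frequency) / cycles per block.
        BLOCK_BITS = 128           # AES operates on 128-bit blocks
        f_max_hz = 340.0e6         # assumed maximum clock frequency (illustrative)
        cycles_per_block = 10      # assumed one AES-128 round per cycle (illustrative)

        throughput_gbps = BLOCK_BITS * f_max_hz / cycles_per_block / 1e9
        print(f"Throughput ~= {throughput_gbps:.2f} Gbps")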

  18. High-throughput microfluidic single-cell digital polymerase chain reaction.

    PubMed

    White, A K; Heyries, K A; Doolin, C; Vaninsberghe, M; Hansen, C L

    2013-08-06

    Here we present an integrated microfluidic device for the high-throughput digital polymerase chain reaction (dPCR) analysis of single cells. This device allows for the parallel processing of single cells and executes all steps of analysis, including cell capture, washing, lysis, reverse transcription, and dPCR analysis. The cDNA from each single cell is distributed into a dedicated dPCR array consisting of 1020 chambers, each having a volume of 25 pL, using surface-tension-based sample partitioning. The high density of this dPCR format (118,900 chambers/cm(2)) allows the analysis of 200 single cells per run, for a total of 204,000 PCR reactions using a device footprint of 10 cm(2). Experiments using RNA dilutions show this device achieves shot-noise-limited performance in quantifying single molecules, with a dynamic range of 10(4). We performed over 1200 single-cell measurements, demonstrating the use of this platform in the absolute quantification of both high- and low-abundance mRNA transcripts, as well as micro-RNAs that are not easily measured using alternative hybridization methods. We further apply the specificity and sensitivity of single-cell dPCR to performing measurements of RNA editing events in single cells. High-throughput dPCR provides a new tool in the arsenal of single-cell analysis methods, with a unique combination of speed, precision, sensitivity, and specificity. We anticipate this approach will enable new studies where high-performance single-cell measurements are essential, including the analysis of transcriptional noise, allelic imbalance, and RNA processing.
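
    Digital PCR quantification conventionally corrects the positive-chamber count for multi-molecule occupancy using Poisson statistics. The sketch below uses the 1020-chamber, 25 pL array geometry from the abstract with a made-up positive-chamber count.

        # Hedged sketch: standard dPCR Poisson correction; positive count is invented.
        import math

        chambers = 1020
        chamber_volume_pl = 25.0
        positives = 240                      # hypothetical positive chambers in one array

        p = positives / chambers             # fraction of positive chambers
        lam = -math.log(1.0 - p)             # mean molecules per chamber (Poisson)
        total_molecules = lam * chambers
        conc_per_ul = total_molecules / (chambers * chamber_volume_pl * 1e-6)  # pL -> uL

        print(f"lambda = {lam:.3f} molecules/chamber, "
              f"~{total_molecules:.0f} molecules, "
              f"~{conc_per_ul:.0f} molecules/uL")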

  19. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  20. Recent advances in high-throughput QCL-based infrared microspectral imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Rowlette, Jeremy A.; Fotheringham, Edeline; Nichols, David; Weida, Miles J.; Kane, Justin; Priest, Allen; Arnone, David B.; Bird, Benjamin; Chapman, William B.; Caffey, David B.; Larson, Paul; Day, Timothy

    2017-02-01

    The field of infrared spectral imaging and microscopy is advancing rapidly due in large measure to the recent commercialization of the first high-throughput, high-spatial-definition quantum cascade laser (QCL) microscope. Having speed, resolution and noise performance advantages while also eliminating the need for cryogenic cooling, its introduction has established a clear path to translating the well-established diagnostic capability of infrared spectroscopy into clinical and pre-clinical histology, cytology and hematology workflows. Demand for even higher throughput while maintaining high-spectral fidelity and low-noise performance continues to drive innovation in QCL-based spectral imaging instrumentation. In this talk, we will present for the first time, recent technological advances in tunable QCL photonics which have led to an additional 10X enhancement in spectral image data collection speed while preserving the high spectral fidelity and SNR exhibited by the first generation of QCL microscopes. This new approach continues to leverage the benefits of uncooled microbolometer focal plane array cameras, which we find to be essential for ensuring both reproducibility of data across instruments and achieving the high-reliability needed in clinical applications. We will discuss the physics underlying these technological advancements as well as the new biomedical applications these advancements are enabling, including automated whole-slide infrared chemical imaging on clinically relevant timescales.

  1. Scalable software-defined optical networking with high-performance routing and wavelength assignment algorithms.

    PubMed

    Lee, Chankyun; Cao, Xiaoyuan; Yoshikane, Noboru; Tsuritani, Takehiro; Rhee, June-Koo Kevin

    2015-10-19

    The feasibility of software-defined optical networking (SDON) for practical applications critically depends on the scalability of centralized control performance. In this paper, highly scalable routing and wavelength assignment (RWA) algorithms are investigated on an OpenFlow-based SDON testbed for a proof-of-concept demonstration. Efficient RWA algorithms are proposed to achieve high network capacity at reduced computation cost, a significant attribute for a scalable centrally controlled SDON. The proposed heuristic RWA algorithms differ in the order in which requests are processed and in the procedures for routing table updates. Combined with a shortest-path-based routing algorithm, a hottest-request-first processing policy that considers demand intensity and end-to-end distance information offers both the highest network throughput and acceptable computation scalability. We further investigate the trade-off relationship between network throughput and computation complexity in the routing table update procedure through a simulation study.
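
    A toy sketch of a hottest-request-first RWA heuristic follows: order lightpath requests by demand intensity, route each on a shortest path, and assign the first wavelength that is free on every hop. The topology, request weights, and wavelength count are invented, and the ordering ignores the end-to-end distance term the paper also considers.

        # Hedged sketch: shortest-path routing with first-fit wavelength assignment,
        # requests processed hottest (largest demand) first. All values are toy data.
        import heapq
        from itertools import count

        def shortest_path(adj, src, dst):
            """Dijkstra over an adjacency dict {node: {neighbor: length}}."""
            tie = count()
            heap, seen = [(0, next(tie), src, [src])], set()
            while heap:
                dist, _, node, path = heapq.heappop(heap)
                if node == dst:
                    return path
                if node in seen:
                    continue
                seen.add(node)
                for nbr, length in adj[node].items():
                    if nbr not in seen:
                        heapq.heappush(heap, (dist + length, next(tie), nbr, path + [nbr]))
            return None

        adj = {"A": {"B": 1, "C": 2}, "B": {"A": 1, "C": 1, "D": 2},
               "C": {"A": 2, "B": 1, "D": 1}, "D": {"B": 2, "C": 1}}
        wavelengths = 4
        used = {}                                   # link -> set of wavelengths in use

        requests = [("A", "D", 5), ("B", "D", 2), ("A", "C", 9)]  # (src, dst, demand)
        for src, dst, demand in sorted(requests, key=lambda r: -r[2]):
            path = shortest_path(adj, src, dst)
            links = [tuple(sorted(l)) for l in zip(path, path[1:])]
            for w in range(wavelengths):            # first-fit wavelength assignment
                if all(w not in used.setdefault(l, set()) for l in links):
                    for l in links:
                        used[l].add(w)
                    print(f"{src}->{dst}: path {path}, wavelength {w}")
                    break
            else:
                print(f"{src}->{dst}: blocked")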

  2. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy

    PubMed Central

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-01-01

    Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925

  3. Quantitative digital image analysis of chromogenic assays for high throughput screening of alpha-amylase mutant libraries.

    PubMed

    Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy

    2009-08-01

    An image analysis-based method for high throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates and high resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive in quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective screening method for improved alpha-amylase activity with a coefficient of variation of 18%.
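
    A minimal sketch of image-based chromogen quantification: average the pixel intensity inside each well of a plate image and convert it to concentration with a linear standard curve. The grid geometry, synthetic image, and calibration constants are assumptions for illustration; the VMR script itself is not reproduced here.

        # Hedged sketch: per-well mean intensity plus an assumed linear standard curve.
        import numpy as np

        rows, cols, well_px = 8, 12, 40                 # 96-well grid, 40 px per well (assumed)
        plate = np.random.default_rng(0).uniform(0, 255, (rows * well_px, cols * well_px))

        # Assumed standard curve: intensity = slope * concentration + intercept.
        slope, intercept = -600.0, 240.0                # darker wells = more chromogen (illustrative)

        def well_concentrations(image):
            conc = np.empty((rows, cols))
            for r in range(rows):
                for c in range(cols):
                    well = image[r * well_px:(r + 1) * well_px, c * well_px:(c + 1) * well_px]
                    conc[r, c] = (well.mean() - intercept) / slope
            return conc

        print(well_concentrations(plate).round(3))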

  4. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping

    PubMed Central

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-01-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contained 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy (up to 0.98 and 0.95, respectively). In summary, IAP provides a broad set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818

  5. Screening Chemicals for Estrogen Receptor Bioactivity Using a Computational Model.

    PubMed

    Browne, Patience; Judson, Richard S; Casey, Warren M; Kleinstreuer, Nicole C; Thomas, Russell S

    2015-07-21

    The U.S. Environmental Protection Agency (EPA) is considering high-throughput and computational methods to evaluate the endocrine bioactivity of environmental chemicals. Here we describe a multistep, performance-based validation of new methods and demonstrate that these new tools are sufficiently robust to be used in the Endocrine Disruptor Screening Program (EDSP). Results from 18 estrogen receptor (ER) ToxCast high-throughput screening assays were integrated into a computational model that can discriminate bioactivity from assay-specific interference and cytotoxicity. Model scores range from 0 (no activity) to 1 (bioactivity of 17β-estradiol). ToxCast ER model performance was evaluated for reference chemicals, as well as results of EDSP Tier 1 screening assays in current practice. The ToxCast ER model accuracy was 86% to 93% when compared to reference chemicals, and the model predicted results of EDSP Tier 1 guideline and other uterotrophic studies with 84% to 100% accuracy. The performance of high-throughput assays and ToxCast ER model predictions demonstrates that these methods correctly identify active and inactive reference chemicals, provide a measure of relative ER bioactivity, and rapidly identify chemicals with potential endocrine bioactivities for additional screening and testing. EPA is accepting ToxCast ER model data for 1812 chemicals as alternatives for EDSP Tier 1 ER binding, ER transactivation, and uterotrophic assays.

  6. Label-free cancer cell separation from human whole blood using inertial microfluidics at low shear stress.

    PubMed

    Lee, Myung Gwon; Shin, Joong Ho; Bae, Chae Yun; Choi, Sungyoung; Park, Je-Kyun

    2013-07-02

    We report a contraction-expansion array (CEA) microchannel device that performs label-free high-throughput separation of cancer cells from whole blood at low Reynolds number (Re). The CEA microfluidic device utilizes a hydrodynamic field effect for cancer cell separation based on two kinds of inertial effects, (1) inertial lift force and (2) Dean flow, which result in label-free size-based separation with high throughput. To avoid cell damage potentially caused by high shear stress in conventional inertial separation techniques, the CEA microfluidic device isolates the cells at low operational Re, maintaining high-throughput separation, using nondiluted whole blood samples (hematocrit ~45%). We characterized inertial particle migration and investigated the migration of blood cells and various cancer cells (MCF-7, SK-BR-3, and HCC70) in the CEA microchannel. The separation of cancer cells from whole blood was demonstrated with a cancer cell recovery rate of 99.1%, a blood cell rejection ratio of 88.9%, and a throughput of 1.1 × 10(8) cells/min. In addition, the blood cell rejection ratio was further improved to 97.3% by a two-step filtration process with two devices connected in series.

  7. THE RABIT: A RAPID AUTOMATED BIODOSIMETRY TOOL FOR RADIOLOGICAL TRIAGE

    PubMed Central

    Garty, Guy; Chen, Youhua; Salerno, Alessio; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Amundson, Sally A.; Brenner, David J.

    2010-01-01

    In response to the recognized need for high throughput biodosimetry methods for use after large scale radiological events, a logical approach is complete automation of standard biodosimetric assays that are currently performed manually. We describe progress to date on the RABIT (Rapid Automated BIodosimetry Tool), designed to score micronuclei or γ-H2AX fluorescence in lymphocytes derived from a single drop of blood from a fingerstick. The RABIT system is designed to be completely automated, from the input of the capillary blood sample into the machine to the output of a dose estimate. Improvements in throughput are achieved through use of a single drop of blood, optimization of the biological protocols for in-situ analysis in multi-well plates, implementation of robotic plate and liquid handling, and new developments in high-speed imaging. Automating well-established bioassays represents a promising approach to high-throughput radiation biodosimetry, both because high throughputs can be achieved and because the time to deployment is potentially much shorter than for a new biological assay. Here we describe the development of each of the individual modules of the RABIT system, and show preliminary data from key modules. System integration is ongoing, to be followed by calibration and validation. PMID:20065685

  8. Lens-free shadow image based high-throughput continuous cell monitoring technique.

    PubMed

    Jin, Geonsoo; Yoo, In-Hwa; Pack, Seung Pil; Yang, Ji-Woon; Ha, Un-Hwan; Paek, Se-Hwan; Seo, Sungkyu

    2012-01-01

    A high-throughput continuous cell monitoring technique which does not require any labeling reagents or destruction of the specimen is demonstrated. More than 6000 human alveolar epithelial A549 cells are monitored for up to 72 h simultaneously and continuously with a single digital image within a cost- and space-effective lens-free shadow imaging platform. In an experiment performed within a custom built incubator integrated with the lens-free shadow imaging platform, the cell nucleus division process could be successfully characterized by calculating the signal-to-noise ratios (SNRs) and the shadow diameters (SDs) of the cell shadow patterns. The versatile nature of this platform also enabled a single cell viability test followed by live cell counting. This study is the first to show that the lens-free shadow imaging technique can provide continuous cell monitoring without any staining/labeling reagent or destruction of the specimen. This high-throughput continuous cell monitoring technique based on lens-free shadow imaging may be widely utilized as a compact, low-cost, and high-throughput cell monitoring tool in the fields of drug and food screening or cell proliferation and viability testing. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. High-throughput countercurrent microextraction in passive mode.

    PubMed

    Xie, Tingliang; Xu, Cong

    2018-05-15

    Although microextraction is much more efficient than conventional macroextraction, its practical application has been limited by low throughputs and difficulties in constructing robust countercurrent microextraction (CCME) systems. In this work, a robust CCME process was established based on a novel passive microextractor with four units without any moving parts. The passive microextractor has internal recirculation and can efficiently mix two immiscible liquids. The hydraulic characteristics as well as the extraction and back-extraction performance of the passive CCME were investigated experimentally. The recovery efficiencies of the passive CCME were 1.43-1.68 times larger than the best values achieved using cocurrent extraction. Furthermore, the total throughput of the passive CCME developed in this work was about one to three orders of magnitude higher than that of other passive CCME systems reported in the literature. Therefore, a robust CCME process with high throughputs has been successfully constructed, which may promote the application of passive CCME in a wide variety of fields.

  10. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.

    PubMed

    Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli

    2018-01-23

    So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (i.e., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand.

  11. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic

    PubMed Central

    Kerkhove, Dwight; Tian, Le; Munteanu, Adrian; De Poorter, Eli

    2018-01-01

    So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (i.e., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand. PMID:29360798

  12. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST)

    PubMed Central

    Dowd, Scot E; Zaragoza, Joaquin; Rodriguez, Javier R; Oliver, Melvin J; Payton, Paxton R

    2005-01-01

    Background BLAST is one of the most common and useful tools for genetic research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy to use, fault tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows-based machines across local area networks (LANs). W.ND-BLAST provides intuitive Graphic User Interfaces (GUI) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is freely downloadable from . With registration the software is free; installation, networking, and usage instructions are provided, as well as a support forum. PMID:15819992
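
    The general pattern W.ND-BLAST automates, distributing BLAST queries across workers, can be sketched in a few lines of Python: split the query FASTA into chunks and run them in parallel. This stand-in shells out to the NCBI BLAST+ blastn command on one machine rather than dispatching to Windows nodes over a LAN; the file names and database name are placeholders.

        # Hedged sketch: chunk a FASTA and run blastn on the chunks in parallel.
        # Assumes NCBI BLAST+ is installed and "queries.fasta" / "my_est_db" exist.
        import subprocess
        from concurrent.futures import ProcessPoolExecutor
        from pathlib import Path

        def split_fasta(path, n_chunks):
            records = Path(path).read_text().split(">")[1:]
            paths = []
            for i in range(n_chunks):
                p = Path(f"chunk_{i}.fasta")
                p.write_text("".join(">" + rec for rec in records[i::n_chunks]))
                paths.append(p)
            return paths

        def run_blast(chunk_path):
            out = chunk_path.with_suffix(".blast.txt")
            subprocess.run(["blastn", "-query", str(chunk_path), "-db", "my_est_db",
                            "-out", str(out), "-outfmt", "6"], check=True)
            return out

        if __name__ == "__main__":
            chunks = split_fasta("queries.fasta", n_chunks=4)
            with ProcessPoolExecutor(max_workers=4) as pool:
                for result in pool.map(run_blast, chunks):
                    print("finished", result)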

  13. Engineering and Characterizing Light-Matter Interactions in Photonic Crystals

    DTIC Science & Technology

    2010-01-01

    photonic crystal effects would occur at wavelengths in the infrared spectrum. These effects would not be easily measured by our available...spectrometers which operate in the visible and near-infrared, at wavelengths shorter than 1.6 microns. Similarly, the majority of interesting luminescent...periodicity of the photonic crystal is defined by the high-throughput method while the low-throughput method performs the complementary task of adding a

  14. Custom Super-Resolution Microscope for the Structural Analysis of Nanostructures

    DTIC Science & Technology

    2018-05-29

    research community. As part of our validation of the new design approach, we performed two-color imaging of pairs of adjacent oligo probes hybridized...nanostructures and biological targets. Our microscope features a large field of view and custom optics that facilitate 3D imaging and enhanced contrast in...our imaging throughput by creating two microscopy platforms for high-throughput, super-resolution materials characterization, with the AO set-up being

  15. On Data Transfers Over Wide-Area Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang

    Dedicated wide-area network connections are employed in big data and high-performance computing scenarios, since the absence of cross-traffic promises to make it easier to analyze and optimize data transfers over them. However, nonlinear transport dynamics and end-system complexity due to multi-core hosts and distributed file systems make these tasks surprisingly challenging. We present an overview of methods to analyze memory and disk file transfers using extensive measurements over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory transfers, we derive performance profiles of TCP and UDT throughput as a function of RTT, which show concave regions in contrast to entirely convex regions predicted by previous models. These highly desirable concave regions can be expanded by utilizing large buffers and more parallel flows. We also present Poincaré maps and Lyapunov exponents of TCP and UDT throughput traces that indicate complex throughput dynamics. For disk file transfers, we show that throughput can be optimized using a combination of parallel I/O and network threads under direct I/O mode. Our initial throughput measurements of Lustre filesystems mounted over long-haul connections using LNet routers show convex profiles indicative of I/O limits.

  16. High-throughput sequencing: a failure mode analysis.

    PubMed

    Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A

    2005-01-04

    Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.

  17. Comprehensive analysis of the T-cell receptor beta chain gene in rhesus monkey by high throughput sequencing

    PubMed Central

    Li, Zhoufang; Liu, Guangjie; Tong, Yin; Zhang, Meng; Xu, Ying; Qin, Li; Wang, Zhanhui; Chen, Xiaoping; He, Jiankui

    2015-01-01

    Profiling immune repertoires by high throughput sequencing enhances our understanding of immune system complexity and immune-related diseases in humans. Previously, cloning and Sanger sequencing identified limited numbers of T cell receptor (TCR) nucleotide sequences in rhesus monkeys, thus their full immune repertoire is unknown. We applied multiplex PCR and Illumina high throughput sequencing to study the TCRβ of rhesus monkeys. We identified 1.26 million TCRβ sequences corresponding to 643,570 unique TCRβ sequences and 270,557 unique complementarity-determining region 3 (CDR3) gene sequences. Precise measurements of CDR3 length distribution, CDR3 amino acid distribution, length distribution of N nucleotide of junctional region, and TCRV and TCRJ gene usage preferences were performed. A comprehensive profile of rhesus monkey immune repertoire might aid human infectious disease studies using rhesus monkeys. PMID:25961410
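
    One of the repertoire metrics mentioned above, the CDR3 length distribution, can be computed in a few lines of Python; the sequences below are invented examples, not rhesus monkey data.

        # Hedged sketch: tally CDR3 amino-acid lengths; sequences are made-up examples.
        from collections import Counter

        cdr3_sequences = [
            "CASSLGTDTQYF", "CASSPGQGAYEQYF", "CASSLRDRGNTIYF",
            "CASSQDRGSYEQYF", "CASSLGTDTQYF", "CASRTGESNQPQHF",
        ]

        length_counts = Counter(len(seq) for seq in cdr3_sequences)
        total = sum(length_counts.values())
        for length in sorted(length_counts):
            frac = length_counts[length] / total
            print(f"CDR3 length {length}: {length_counts[length]} sequences ({frac:.1%})")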

  18. High-throughput investigation of catalysts for JP-8 fuel cracking to liquefied petroleum gas.

    PubMed

    Bedenbaugh, John E; Kim, Sungtak; Sasmaz, Erdem; Lauterbach, Jochen

    2013-09-09

    Portable power technologies for military applications necessitate the production of fuels similar to LPG from existing feedstocks. Catalytic cracking of military jet fuel to form a mixture of C₂-C₄ hydrocarbons was investigated using high-throughput experimentation. Cracking experiments were performed in a gas-phase, 16-sample high-throughput reactor. Zeolite ZSM-5 catalysts with low Si/Al ratios (≤25) demonstrated the highest production of C₂-C₄ hydrocarbons at moderate reaction temperatures (623-823 K). ZSM-5 catalysts were optimized for JP-8 cracking activity to LPG through varying reaction temperature and framework Si/Al ratio. The reducing atmosphere required during catalytic cracking resulted in coking of the catalyst and a commensurate decrease in conversion rate. Rare earth metal promoters for ZSM-5 catalysts were screened to reduce coking deactivation rates, while noble metal promoters reduced onset temperatures for coke burnoff regeneration.

  19. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships

    PubMed Central

    2010-01-01

    Background The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. Results In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. Conclusion High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data. PMID:20122245
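
    A toy sketch of the general idea of knowledge-based prediction: score candidate transcription-factor/target pairs by expression correlation and boost pairs already present in a knowledge base. The expression vectors, prior interactions, and weighting scheme are illustrative stand-ins for the paper's algorithm and the Yeast Proteome Database.

        # Hedged sketch: correlation score plus a prior-knowledge boost; data invented.
        import numpy as np

        expression = {                           # gene -> expression profile across arrays
            "TF1":   np.array([1.0, 2.1, 3.2, 4.1, 5.0]),
            "geneA": np.array([1.1, 2.0, 3.1, 4.2, 4.9]),
            "geneB": np.array([5.0, 1.2, 4.8, 0.9, 3.3]),
        }
        known_interactions = {("TF1", "geneA")}  # prior knowledge (placeholder)
        PRIOR_BOOST = 0.3                        # assumed weighting, for illustration only

        def regulation_score(tf, target):
            corr = abs(np.corrcoef(expression[tf], expression[target])[0, 1])
            prior = PRIOR_BOOST if (tf, target) in known_interactions else 0.0
            return corr + prior

        for target in ("geneA", "geneB"):
            print(f"TF1 -> {target}: score = {regulation_score('TF1', target):.2f}")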

  20. Knowledge-based analysis of microarrays for the discovery of transcriptional regulation relationships.

    PubMed

    Seok, Junhee; Kaushal, Amit; Davis, Ronald W; Xiao, Wenzhong

    2010-01-18

    The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data.

  1. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    PubMed

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive imaging based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high throughput amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis leads to a substantial extension of the feature spectrum to be assessed in the presented high throughput automated plant phenotyping platforms, thus enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m height) and architectures. The deeper understanding of the relation of plant architecture, biomass formation and photosynthetic efficiency has a great potential with respect to crop and yield improvement strategies.
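
    For reference, the PSII operating efficiency is conventionally computed from steady-state and maximal fluorescence as ΦPSII = (Fm' - Fs) / Fm'; a minimal sketch with invented per-plant values follows (the numbers are not data from the FluorCam/Scanalyzer setup).

        # Hedged sketch: standard PhiPSII calculation from chlorophyll fluorescence values.
        plants = {
            "plant_01": {"Fs": 420.0, "Fm_prime": 980.0},   # invented example values
            "plant_02": {"Fs": 510.0, "Fm_prime": 1010.0},
        }

        for name, f in plants.items():
            phi_psii = (f["Fm_prime"] - f["Fs"]) / f["Fm_prime"]
            print(f"{name}: PhiPSII = {phi_psii:.3f}")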

  2. Potential of dynamically harmonized Fourier transform ion cyclotron resonance cell for high-throughput metabolomics fingerprinting: control of data quality.

    PubMed

    Habchi, Baninia; Alves, Sandra; Jouan-Rimbaud Bouveresse, Delphine; Appenzeller, Brice; Paris, Alain; Rutledge, Douglas N; Rathahao-Paris, Estelle

    2018-01-01

    Due to the presence of pollutants in the environment and food, the assessment of human exposure is required. This necessitates high-throughput approaches enabling large-scale analysis and, as a consequence, the use of high-performance analytical instruments to obtain highly informative metabolomic profiles. In this study, direct introduction mass spectrometry (DIMS) was performed using a Fourier transform ion cyclotron resonance (FT-ICR) instrument equipped with a dynamically harmonized cell. Data quality was evaluated based on mass resolving power (RP), mass measurement accuracy, and ion intensity drifts from the repeated injections of a quality control sample (QC) along the analytical process. The large DIMS data size entails the use of bioinformatic tools for the automatic selection of common ions found in all QC injections and for robustness assessment and correction of possible technical drifts. RP values greater than 10(6) and mass measurement accuracy of lower than 1 ppm were obtained using broadband mode, resulting in the detection of isotopic fine structure. Hence, a very accurate relative isotopic mass defect (RΔm) value was calculated. This significantly reduces the number of elemental composition (EC) candidates and greatly improves compound annotation. A very satisfactory estimate of repeatability of both peak intensity and mass measurement was demonstrated. Although a non-negligible ion intensity drift was observed for negative ion mode data, a normalization procedure was easily applied to correct this phenomenon. This study illustrates the performance and robustness of the dynamically harmonized FT-ICR cell to perform large-scale high-throughput metabolomic analyses in routine conditions. Graphical abstract: Analytical performance of an FT-ICR instrument equipped with a dynamically harmonized cell.

  3. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

    PubMed

    Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

    2014-12-01

    As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.

  4. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    NASA Technical Reports Server (NTRS)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high throughput end-to-end post-fabrication processing of high performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs) that are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  5. Identification of Inhibitors in Lignocellulosic Slurries and Determination of Their Effect on Hydrocarbon-Producing Microorganisms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Shihui; Franden, Mary A; Yang, Qing

    The aim of this work was to identify inhibitors in pretreated lignocellulosic slurries, evaluate high-throughput screening strategies, and investigate the impact of inhibitors on potential hydrocarbon-producing microorganisms. Compounds present in slurries that could inhibit microbial growth were identified through a detailed analysis of saccharified slurries by applying a combination of high-performance liquid chromatography, GC-MS, LC-DAD-MS, and ICP-MS. Several high-throughput assays were then evaluated to generate toxicity profiles. Our results demonstrated that Bioscreen C was useful for analyzing bacterial toxicity but not for yeast. The AlamarBlue reduction assay can be a useful high-throughput assay for both bacterial and yeast strains as long as medium components do not interfere with fluorescence measurements. In addition, this work identified two major inhibitors (furfural and ammonium acetate) for three potential hydrocarbon-producing bacterial species that include Escherichia coli, Cupriavidus necator, and Rhodococcus opacus PD630, which are also the primary inhibitors for ethanologens. Here, this study strived to establish a pipeline to quantify inhibitory compounds in biomass slurries and high-throughput approaches to investigate the effect of inhibitors on microbial biocatalysts, which can be applied to various biomass slurries or hydrolyzates generated through different pretreatment and enzymatic hydrolysis processes or different microbial candidates.

  6. Matrix-assisted laser desorption/ionisation, time-of-flight mass spectrometry-based blood group genotyping--the alternative approach.

    PubMed

    Gassner, Christoph; Meyer, Stefan; Frey, Beat M; Vollmert, Caren

    2013-01-01

    Although matrix-assisted laser desorption/ionisation, time-of-flight mass spectrometry (MALDI-TOF MS) has previously been reported for high throughput blood group genotyping, those reports are limited to only a few blood group systems. This review describes the development of a large cooperative Swiss-German project, aiming to employ MALDI-TOF MS for the molecular detection of the blood groups Rh, Kell, Kidd, Duffy, MNSs, a comprehensive collection of low incidence antigens, as well as the platelet and granulocyte antigens HPA and HNA, representing a total of 101 blood group antigens encoded by 170 alleles. Recent reports describe MALDI-TOF MS as a technology with short time-to-resolution, ability for high throughput, and cost-efficiency when used in genetic analysis, including forensics, pharmacogenetics, oncology and hematology. Furthermore, Kell and RhD genotyping have been performed on fetal DNA from maternal plasma with excellent results. In summary, this article introduces a new technological approach for high throughput blood group genotyping by means of MALDI-TOF MS. Although all data presented are preliminary, the observed success rates, data quality and concordance with known blood group types are highly impressive, underlining the accuracy and reliability of this cost-efficient high throughput method. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Identification of Inhibitors in Lignocellulosic Slurries and Determination of Their Effect on Hydrocarbon-Producing Microorganisms

    DOE PAGES

    Yang, Shihui; Franden, Mary A; Yang, Qing; ...

    2018-04-04

    The aim of this work was to identify inhibitors in pretreated lignocellulosic slurries, evaluate high-throughput screening strategies, and investigate the impact of inhibitors on potential hydrocarbon-producing microorganisms. Compounds present in slurries that could inhibit microbial growth were identified through a detailed analysis of saccharified slurries by applying a combination of high-performance liquid chromatography, GC-MS, LC-DAD-MS, and ICP-MS. Several high-throughput assays were then evaluated to generate toxicity profiles. Our results demonstrated that Bioscreen C was useful for analyzing bacterial toxicity but not for yeast. The AlamarBlue reduction assay can be a useful high-throughput assay for both bacterial and yeast strains as long as medium components do not interfere with fluorescence measurements. In addition, this work identified two major inhibitors (furfural and ammonium acetate) for three potential hydrocarbon-producing bacterial species that include Escherichia coli, Cupriavidus necator, and Rhodococcus opacus PD630, which are also the primary inhibitors for ethanologens. Here, this study strived to establish a pipeline to quantify inhibitory compounds in biomass slurries and high-throughput approaches to investigate the effect of inhibitors on microbial biocatalysts, which can be applied to various biomass slurries or hydrolyzates generated through different pretreatment and enzymatic hydrolysis processes or different microbial candidates.

  8. High-throughput and selective solid-phase extraction of urinary catecholamines by crown ether-modified resin composite fiber.

    PubMed

    Chen, LiQin; Wang, Hui; Xu, Zhen; Zhang, QiuYue; Liu, Jia; Shen, Jun; Zhang, WanQi

    2018-08-03

    In the present study, we developed a simple and high-throughput solid phase extraction (SPE) procedure for selective extraction of catecholamines (CAs) in urine samples. The SPE adsorbents were electrospun composite fibers functionalized with 4-carboxybenzo-18-crown-6 ether modified XAD resin and polystyrene, which were packed into 96-well columns and used for high-throughput selective extraction of CAs in healthy human urine samples. Moreover, the extraction efficiency of packed-fiber SPE (PFSPE) was examined by high performance liquid chromatography coupled with fluorescence detector. The parameters affecting the extraction efficiency and impurity removal efficiency were optimized, and good linearity ranging from 0.5 to 400 ng/mL was obtained with a low limit of detection (LOD, 0.2-0.5 ng/mL) and a good repeatability (2.7%-3.7%, n = 6). The extraction recoveries of three CAs ranged from 70.5% to 119.5%. Furthermore, stable and reliable results obtained by the fluorescence detector were superior to those obtained by the electrochemical detector. Collectively, PFSPE coupled with 96-well columns was a simple, rapid, selective, high-throughput and cost-efficient method, and the proposed method could be applied in clinical chemistry. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis, to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.

  10. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    PubMed Central

    2010-01-01

    Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Conclusions Our results indicate that the preferred transformation for fluorescence channels is a parameter-optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available. PMID:21050468

  11. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    PubMed

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Our results indicate that the preferred transformation for fluorescence channels is a parameter-optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available.
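
    The generalized hyperbolic arcsine named in these two records is the simplest of the listed transformations to illustrate: it is nearly linear around zero and logarithmic for large values, which compresses the several-orders-of-magnitude range of cytometry measurements. The sketch below applies it with a fixed scale parameter; in flowTrans that parameter is instead chosen by the maximum likelihood criteria described above, and the default of 150 used here is only an illustrative assumption.

    import numpy as np

    def arcsinh_transform(x, cofactor=150.0):
        """Hyperbolic arcsine transform: ~x/cofactor near zero, ~log(2x/cofactor) for large x."""
        return np.arcsinh(np.asarray(x, dtype=float) / cofactor)

    events = np.array([-50.0, 0.0, 100.0, 10_000.0, 250_000.0])
    print(arcsinh_transform(events))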

  12. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    PubMed

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or ChIP-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by multiplex primer interactions. The detection throughput cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, showing that the sensitivity of ME-qPCR is higher than that of the original qPCR. The absolute limit of detection for ME-qPCR could reach levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract: In the first-step amplification, four primers (A, B, C, and D) are added to the reaction volume, generating four kinds of amplicons. All four amplicons can be regarded as targets of the second-step PCR. In the second-step amplification, three parallel reactions are run for the final evaluation, from which the final amplification curves and melting curves are obtained.

  13. High-rate dead-time corrections in a general purpose digital pulse processing system

    PubMed Central

    Abbene, Leonardo; Gerardi, Gaetano

    2015-01-01

    Dead-time losses are well recognized and studied drawbacks in counting and spectroscopic systems. In this work, the dead-time correction capabilities of a real-time digital pulse processing (DPP) system for high-rate, high-resolution radiation measurements are presented. The DPP system, through a fast and slow analysis of the output waveform from radiation detectors, is able to perform multi-parameter analysis (arrival time, pulse width, pulse height, pulse shape, etc.) at high input counting rates (ICRs), allowing accurate counting-loss corrections even for variable or transient radiation. The fast analysis is used to obtain both the ICR and energy spectra with high throughput, while the slow analysis is used to obtain high-resolution energy spectra. A complete characterization of the counting capabilities, through both theoretical and experimental approaches, was performed. The dead-time modeling, the throughput curves, the experimental time-interval distributions (TIDs) and the counting uncertainty of the recorded events of both the fast and the slow channels, measured with a planar CdTe (cadmium telluride) detector, will be presented. The throughput formula for a series of two types of dead-time is also derived. The results of dead-time corrections, performed through different methods, will be reported and discussed, pointing out the error on ICR estimation and the simplicity of the procedure. Accurate ICR estimations (nonlinearity < 0.5%) were performed by using the time widths and the TIDs (using a 10 ns time bin width) of the detected pulses up to 2.2 Mcps. The digital system allows, after a simple parameter setting, different and sophisticated procedures for dead-time correction, traditionally implemented in complex/dedicated systems and time-consuming set-ups. PMID:26289270
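
    For orientation, the two textbook dead-time models behind such throughput curves relate the true input counting rate n to the recorded output rate m through the dead-time τ: m = n/(1 + nτ) for a non-paralyzable dead-time and m = n·exp(−nτ) for a paralyzable one. The sketch below evaluates these generic formulas (not the series-of-two-dead-times expression derived in the paper) with an assumed 100 ns dead-time.

    import numpy as np

    def nonparalyzable_output(n, tau):
        """Recorded rate for a non-paralyzable dead-time: m = n / (1 + n*tau)."""
        return n / (1.0 + n * tau)

    def paralyzable_output(n, tau):
        """Recorded rate for a paralyzable dead-time: m = n * exp(-n*tau)."""
        return n * np.exp(-n * tau)

    n = np.linspace(0.0, 5e6, 6)   # true input counting rate (counts per second)
    tau = 100e-9                   # assumed dead-time (seconds)
    print(nonparalyzable_output(n, tau))
    print(paralyzable_output(n, tau))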

  14. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    NASA Astrophysics Data System (ADS)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  15. An evaluation of MPI message rate on hybrid-core processors

    DOE PAGES

    Barrett, Brian W.; Brightwell, Ron; Grant, Ryan; ...

    2014-11-01

    Power and energy concerns are motivating chip manufacturers to consider future hybrid-core processor designs that may combine a small number of traditional cores optimized for single-thread performance with a large number of simpler cores optimized for throughput performance. This trend is likely to impact the way in which compute resources for network protocol processing functions are allocated and managed. In particular, the performance of MPI match processing is critical to achieving high message throughput. In this paper, we analyze the ability of simple and more complex cores to perform MPI matching operations for various scenarios in order to gain insight into how MPI implementations for future hybrid-core processors should be designed.

  16. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    NASA Astrophysics Data System (ADS)

    Alexander, Kristen; Hampton, Meredith; Lopez, Rene; Desimone, Joseph

    2009-03-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a "dimer") give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  17. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    NASA Astrophysics Data System (ADS)

    Alexander, Kristen; Lopez, Rene; Hampton, Meredith; Desimone, Joseph

    2008-10-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a "dimer") give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  18. Novel method for high-throughput colony PCR screening in nanoliter-reactors

    PubMed Central

    Walser, Marcel; Pellaux, Rene; Meyer, Andreas; Bechtold, Matthias; Vanderschuren, Herve; Reinhardt, Richard; Magyar, Joseph; Panke, Sven; Held, Martin

    2009-01-01

    We introduce a technology for the rapid identification and sequencing of conserved DNA elements employing a novel suspension array based on nanoliter (nl)-reactors made from alginate. The reactors have a volume of 35 nl and serve as reaction compartments during monoseptic growth of microbial library clones, colony lysis, thermocycling and screening for sequence motifs via semi-quantitative fluorescence analyses. nl-Reactors were kept in suspension during all high-throughput steps which allowed performing the protocol in a highly space-effective fashion and at negligible expenses of consumables and reagents. As a first application, 11 high-quality microsatellites for polymorphism studies in cassava were isolated and sequenced out of a library of 20 000 clones in 2 days. The technology is widely scalable and we envision that throughputs for nl-reactor based screenings can be increased up to 100 000 and more samples per day thereby efficiently complementing protocols based on established deep-sequencing technologies. PMID:19282448

  19. Ethoscopes: An open platform for high-throughput ethomics.

    PubMed

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J; French, Alice S; Jamasb, Arian R; Gilestro, Giorgio F

    2017-10-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  20. Optima MDxt: A high throughput 335 keV mid-dose implanter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eisner, Edward; David, Jonathan; Justesen, Perry

    2012-11-06

    The continuing demand for both energy purity and implant angle control along with high wafer throughput drove the development of the Axcelis Optima MDxt mid-dose ion implanter. The system utilizes electrostatic scanning, an electrostatic parallelizing lens and an electrostatic energy filter to produce energetically pure beams with high angular integrity. Based on field-proven components, the Optima MDxt beamline architecture offers the high beam currents possible with singly charged species including arsenic at energies up to 335 keV as well as large currents from multiply charged species at energies extending over 1 MeV. Conversely, the excellent energy filtering capability allows high currents at low beam energies, since it is safe to utilize large deceleration ratios. This beamline is coupled with the >500 WPH capable endstation technology used on the Axcelis Optima XEx high energy ion implanter. The endstation includes in-situ angle measurements of the beam in order to maintain excellent beam-to-wafer implant angle control in both the horizontal and vertical directions. The Optima platform control system provides a new-generation dose control system that assures excellent dosimetry and charge control. This paper will describe the features and technologies that allow the Optima MDxt to provide superior process performance at the highest wafer throughput, and will provide examples of the process performance achievable.

  1. Comparative Analysis of Performance and Microbial Characteristics Between High-Solid and Low-Solid Anaerobic Digestion of Sewage Sludge Under Mesophilic Conditions.

    PubMed

    Lu, Qin; Yi, Jing; Yang, Dianhai

    2016-01-01

    High-solid anaerobic digestion of sewage sludge achieves more efficient volatile solid reduction and higher production of volatile fatty acids (VFA) and methane than conventional low-solid anaerobic digestion. In this study, the potential mechanisms behind the better performance of high-solid anaerobic digestion of sewage sludge were investigated by using 454 high-throughput pyrosequencing and real-time PCR to analyze the microbial characteristics in sewage sludge fermentation reactors. The results obtained by 454 high-throughput pyrosequencing revealed that the phyla Chloroflexi, Bacteroidetes, and Firmicutes were the dominant functional microorganisms in high-solid and low-solid anaerobic systems. Meanwhile, the real-time PCR assays showed that high-solid anaerobic digestion significantly increased the number of total bacteria, which enhanced the hydrolysis and acidification of sewage sludge. Further study indicated that the number of total archaea (dominated by Methanosarcina) in a high-solid anaerobic fermentation reactor was also higher than that in a low-solid reactor, resulting in higher VFA consumption and methane production. Hence, the increased numbers of key bacteria and methanogenic archaea involved in sewage sludge hydrolysis, acidification, and methanogenesis resulted in the better performance of high-solid anaerobic sewage sludge fermentation.

  2. A paper-based microbial fuel cell array for rapid and high-throughput screening of electricity-producing bacteria.

    PubMed

    Choi, Gihoon; Hassett, Daniel J; Choi, Seokheun

    2015-06-21

    There is a large global effort to improve microbial fuel cell (MFC) techniques and advance their translational potential toward practical, real-world applications. Significant boosts in MFC performance can be achieved with the development of new techniques in synthetic biology that can regulate microbial metabolic pathways or control their gene expression. For these new directions, a high-throughput and rapid screening tool for microbial biopower production is needed. In this work, a 48-well, paper-based sensing platform was developed for the high-throughput and rapid characterization of the electricity-producing capability of microbes. 48 spatially distinct wells of a sensor array were prepared by patterning 48 hydrophilic reservoirs on paper with hydrophobic wax boundaries. This paper-based platform exploited the ability of paper to quickly wick fluid and promoted bacterial attachment to the anode pads, resulting in instant current generation upon loading of the bacterial inoculum. We validated the utility of our MFC array by studying how strategic genetic modifications impacted the electrochemical activity of various Pseudomonas aeruginosa mutant strains. Within just 20 minutes, we successfully determined the electricity generation capacity of eight isogenic mutants of P. aeruginosa. These efforts demonstrate that our MFC array displays highly comparable performance characteristics and identifies genes in P. aeruginosa that can trigger a higher power density.

  3. Probabilistic Assessment of High-Throughput Wireless Sensor Networks

    PubMed Central

    Kim, Robin E.; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F.; Song, Junho

    2016-01-01

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieving a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a small number of packets, to assess network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a large number of packets delivered within each communication, such an approach is not sufficient. Instead, in this study, a model that can assess, probabilistically, the long-term performance of the network is proposed. The proposed model is based on readily available measured data sets that represent communication quality during high-throughput data transfer. Then, an empirical limit-state function is determined, which is further used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small and a full-bridge wireless network. By performing the proposed analysis in complex sensor networks, an optimized sensor topology can be achieved. PMID:27258270
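
    The reliability assessment described above reduces to estimating the probability that an empirical limit-state function is violated, which Monte Carlo simulation does by sampling communication-quality data and counting violations. The sketch below is a generic illustration of that procedure; the limit-state form, the required delivery rate, and the Beta distribution standing in for measured data are all assumptions made for the example, not the paper's fitted model.

    import numpy as np

    rng = np.random.default_rng(0)

    def limit_state(packet_success_rate, required_rate=0.95):
        """Placeholder limit-state g(x): communication failure when g(x) < 0."""
        return packet_success_rate - required_rate

    # Sampled communication quality (stand-in for the measured high-throughput data sets)
    samples = rng.beta(a=40, b=2, size=100_000)
    p_failure = np.mean(limit_state(samples) < 0.0)
    print(f"estimated probability of communication failure: {p_failure:.4f}")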

  4. Gas pressure assisted microliquid-liquid extraction coupled online to direct infusion mass spectrometry: a new automated screening platform for bioanalysis.

    PubMed

    Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas

    2014-10-21

    In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments that have been made in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R(2) ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R(2) of 0.998) and precision (RSD of 9%). In conclusion, we present the proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.

  5. Experimental Study of an Advanced Concept of Moderate-resolution Holographic Spectrographs

    NASA Astrophysics Data System (ADS)

    Muslimov, Eduard; Valyavin, Gennady; Fabrika, Sergei; Musaev, Faig; Galazutdinov, Gazinur; Pavlycheva, Nadezhda; Emelianov, Eduard

    2018-07-01

    We present the results of an experimental study of an advanced moderate-resolution spectrograph based on a cascade of narrow-band holographic gratings. The main goal of the project is to achieve a moderately high spectral resolution with R up to 5000 simultaneously in the 4300–6800 Å visible spectral range on a single standard CCD, together with an increased throughput. The experimental study consisted of (1) resolution and image quality tests performed using the solar spectrum, and (2) a total throughput test performed for a number of wavelengths using a calibrated lab monochromator. The measured spectral resolving power reaches values over R > 4000 while the experimental throughput is as high as 55%, which agrees well with the modeling results. Comparing the obtained characteristics of the spectrograph under consideration with the best existing spectrographs, we conclude that the used concept can be considered as a very competitive and cheap alternative to the existing spectrographs of the given class. We propose several astrophysical applications for the instrument and discuss the prospect of creating its full-scale version.
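
    As a point of reference for the numbers above, spectral resolving power is R = λ/Δλ, so the resolution element Δλ implied by R ≈ 4000 can be evaluated directly across the 4300-6800 Å band; the short calculation below does only that and assumes nothing beyond the quoted figures.

    def delta_lambda(wavelength_angstrom, resolving_power):
        """Smallest resolvable wavelength interval: Δλ = λ / R."""
        return wavelength_angstrom / resolving_power

    for wl in (4300.0, 5500.0, 6800.0):
        print(f"λ = {wl:.0f} Å  ->  Δλ ≈ {delta_lambda(wl, 4000.0):.2f} Å")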

  6. Strategic and Operational Plan for Integrating Transcriptomics ...

    EPA Pesticide Factsheets

    Plans for incorporating high throughput transcriptomics into the current high throughput screening activities at NCCT; the details are in the attached slide presentation, given at the OECD meeting on June 23, 2016.

  7. High-Throughput Experimental Approach Capabilities | Materials Science |

    Science.gov Websites

    NREL's high-throughput experimental approach capabilities include combinatorial sputtering of (S, Se, Te)-based and oxysulfide materials as well as, in the Combi-5 chamber, nitride and oxynitride sputtering, along with several additional capabilities.

  8. Short-read, high-throughput sequencing technology for STR genotyping

    PubMed Central

    Bornman, Daniel M.; Hester, Mark E.; Schuetter, Jared M.; Kasoji, Manjula D.; Minard-Smith, Angela; Barden, Curt A.; Nelson, Scott C.; Godbold, Gene D.; Baker, Christine H.; Yang, Boyu; Walther, Jacquelyn E.; Tornes, Ivan E.; Yan, Pearlly S.; Rodriguez, Benjamin; Bundschuh, Ralf; Dickens, Michael L.; Young, Brian A.; Faith, Seth A.

    2013-01-01

    DNA-based methods for human identification principally rely upon genotyping of short tandem repeat (STR) loci. Electrophoretic-based techniques for variable-length classification of STRs are universally utilized, but are limited in that they have relatively low throughput and do not yield nucleotide sequence information. High-throughput sequencing technology may provide a more powerful instrument for human identification, but is not currently validated for forensic casework. Here, we present a systematic method to perform high-throughput genotyping analysis of the Combined DNA Index System (CODIS) STR loci using short-read (150 bp) massively parallel sequencing technology. Open source reference alignment tools were optimized to evaluate PCR-amplified STR loci using a custom designed STR genome reference. Evaluation of this approach demonstrated that the 13 CODIS STR loci and amelogenin (AMEL) locus could be accurately called from individual and mixture samples. Sensitivity analysis showed that as few as 18,500 reads, aligned to an in silico referenced genome, were required to genotype an individual (>99% confidence) for the CODIS loci. The power of this technology was further demonstrated by identification of variant alleles containing single nucleotide polymorphisms (SNPs) and the development of quantitative measurements (reads) for resolving mixed samples. PMID:25621315

  9. High throughput screening of particle conditioning operations: I. System design and method development.

    PubMed

    Noyes, Aaron; Huffman, Ben; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Sunasara, Khurram; Mukhopadhyay, Tarit

    2015-08-01

    The biotech industry is under increasing pressure to decrease both time to market and development costs. Simultaneously, regulators are expecting increased process understanding. High throughput process development (HTPD) employs small volumes, parallel processing, and high throughput analytics to reduce development costs and speed the development of novel therapeutics. As such, HTPD is increasingly viewed as integral to improving developmental productivity and deepening process understanding. Particle conditioning steps such as precipitation and flocculation may be used to aid the recovery and purification of biological products. In this first part of two articles, we describe an ultra scale-down system (USD) for high throughput particle conditioning (HTPC) composed of off-the-shelf components. The apparatus is comprised of a temperature-controlled microplate with magnetically driven stirrers and integrated with a Tecan liquid handling robot. With this system, 96 individual reaction conditions can be evaluated in parallel, including downstream centrifugal clarification. A comprehensive suite of high throughput analytics enables measurement of product titer, product quality, impurity clearance, clarification efficiency, and particle characterization. HTPC at the 1 mL scale was evaluated with fermentation broth containing a vaccine polysaccharide. The response profile was compared with the Pilot-scale performance of a non-geometrically similar, 3 L reactor. An engineering characterization of the reactors and scale-up context examines theoretical considerations for comparing this USD system with larger scale stirred reactors. In the second paper, we will explore application of this system to industrially relevant vaccines and test different scale-up heuristics. © 2015 Wiley Periodicals, Inc.

  10. Multiplexed single-molecule force spectroscopy using a centrifuge.

    PubMed

    Yang, Darren; Ward, Andrew; Halvorsen, Ken; Wong, Wesley P

    2016-03-17

    We present a miniature centrifuge force microscope (CFM) that repurposes a benchtop centrifuge for high-throughput single-molecule experiments with high-resolution particle tracking, a large force range, temperature control and simple push-button operation. Incorporating DNA nanoswitches to enable repeated interrogation by force of single molecular pairs, we demonstrate increased throughput, reliability and the ability to characterize population heterogeneity. We perform spatiotemporally multiplexed experiments to collect 1,863 bond rupture statistics from 538 traceable molecular pairs in a single experiment, and show that 2 populations of DNA zippers can be distinguished using per-molecule statistics to reduce noise.

  11. Multiplexed single-molecule force spectroscopy using a centrifuge

    PubMed Central

    Yang, Darren; Ward, Andrew; Halvorsen, Ken; Wong, Wesley P.

    2016-01-01

    We present a miniature centrifuge force microscope (CFM) that repurposes a benchtop centrifuge for high-throughput single-molecule experiments with high-resolution particle tracking, a large force range, temperature control and simple push-button operation. Incorporating DNA nanoswitches to enable repeated interrogation by force of single molecular pairs, we demonstrate increased throughput, reliability and the ability to characterize population heterogeneity. We perform spatiotemporally multiplexed experiments to collect 1,863 bond rupture statistics from 538 traceable molecular pairs in a single experiment, and show that 2 populations of DNA zippers can be distinguished using per-molecule statistics to reduce noise. PMID:26984516
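
    The force applied to each tethered bead in a centrifuge force microscope follows from the buoyancy-corrected mass of the bead spinning at the rotor radius, F = (ρ_bead − ρ_fluid)·V·ω²·r. The sketch below evaluates this standard relation; the bead size, densities, rotation speed, and radius are illustrative assumptions, not parameters from this record.

    import math

    def centrifugal_force_pN(bead_radius_m, bead_density, fluid_density, rpm, rotor_radius_m):
        """Force on a tethered bead: F = (rho_bead - rho_fluid) * V * omega^2 * r, in piconewtons."""
        volume = 4.0 / 3.0 * math.pi * bead_radius_m ** 3   # bead volume (m^3)
        omega = 2.0 * math.pi * rpm / 60.0                   # angular velocity (rad/s)
        force_newton = (bead_density - fluid_density) * volume * omega ** 2 * rotor_radius_m
        return force_newton * 1e12

    # Example: 1.5 um radius silica bead (~2000 kg/m^3) in water, 2000 rpm, 10 cm rotor radius
    print(centrifugal_force_pN(1.5e-6, 2000.0, 1000.0, 2000.0, 0.10))   # ~60 pN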

  12. High-Throughput Density Measurement Using Magnetic Levitation.

    PubMed

    Ge, Shencheng; Wang, Yunzhe; Deshler, Nicolas J; Preston, Daniel J; Whitesides, George M

    2018-06-20

    This work describes the development of an integrated analytical system that enables high-throughput density measurements of diamagnetic particles (including cells) using magnetic levitation (MagLev), 96-well plates, and a flatbed scanner. MagLev is a simple and useful technique with which to carry out density-based analysis and separation of a broad range of diamagnetic materials with different physical forms (e.g., liquids, solids, gels, pastes, gums, etc.); one major limitation, however, is the capacity to perform high-throughput density measurements. This work addresses this limitation by (i) re-engineering the shape of the magnetic fields so that the MagLev system is compatible with 96-well plates, and (ii) integrating a flatbed scanner (and simple optical components) to carry out imaging of the samples that levitate in the system. The resulting system is compatible with both biological samples (human erythrocytes) and nonbiological samples (simple liquids and solids, such as 3-chlorotoluene, cholesterol crystals, glass beads, copper powder, and polymer beads). The high-throughput capacity of this integrated MagLev system will enable new applications in chemistry (e.g., analysis and separation of materials) and biochemistry (e.g., cellular responses under environmental stresses) in a simple and label-free format on the basis of a universal property of all matter, i.e., density.
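
    In MagLev measurements of this kind, the levitation height of a diamagnetic sample maps approximately linearly to its density over the working range, so in practice the mapping is often calibrated with beads of known density and then inverted for unknowns. The sketch below shows that calibration-and-lookup step only; all heights and densities are invented for illustration, and the linear model is an assumption rather than the paper's calibration.

    import numpy as np

    # Hypothetical calibration beads: levitation height (mm) vs. known density (g/cm^3)
    heights = np.array([2.0, 10.0, 18.0, 26.0])
    densities = np.array([1.12, 1.07, 1.02, 0.97])

    slope, intercept = np.polyfit(heights, densities, deg=1)

    def density_from_height(h_mm):
        """Estimate sample density from its levitation height via the linear calibration."""
        return slope * h_mm + intercept

    print(f"rho(14 mm) ≈ {density_from_height(14.0):.3f} g/cm^3")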

  13. High Throughput, Polymeric Aqueous Two-Phase Printing of Tumor Spheroids

    PubMed Central

    Atefi, Ehsan; Lemmo, Stephanie; Fyffe, Darcy; Luker, Gary D.; Tavana, Hossein

    2014-01-01

    This paper presents a new 3D culture microtechnology for high throughput production of tumor spheroids and validates its utility for screening anti-cancer drugs. We use two immiscible polymeric aqueous solutions and microprint a submicroliter drop of the “patterning” phase containing cells into a bath of the “immersion” phase. Selecting proper formulations of biphasic systems using a panel of biocompatible polymers results in the formation of a round drop that confines cells to facilitate spontaneous formation of a spheroid without any external stimuli. Adapting this approach to robotic tools enables straightforward generation and maintenance of spheroids of well-defined size in standard microwell plates and biochemical analysis of spheroids in situ, which is not possible with existing techniques for spheroid culture. To enable high throughput screening, we establish a phase diagram to identify minimum cell densities within specific volumes of the patterning drop to result in a single spheroid. Spheroids show normal growth over long-term incubation and dose-dependent decrease in cellular viability when treated with drug compounds, but present significant resistance compared to monolayer cultures. The unprecedented ease of implementing this microtechnology and its robust performance will benefit high throughput studies of drug screening against cancer cells with physiologically-relevant 3D tumor models. PMID:25411577

  14. Lessons we learned from high-throughput and top-down systems biology analyses about glioma stem cells.

    PubMed

    Mock, Andreas; Chiblak, Sara; Herold-Mende, Christel

    2014-01-01

    A growing body of evidence suggests that glioma stem cells (GSCs) account for tumor initiation, therapy resistance, and the subsequent regrowth of gliomas. Thus, continuous efforts have been undertaken to further characterize this subpopulation of less differentiated tumor cells. Although we are able to enrich GSCs, we still lack a comprehensive understanding of GSC phenotypes and behavior. The advent of high-throughput technologies raised hope that incorporation of these newly developed platforms would help to tackle such questions. Since then a couple of comparative genome-, transcriptome- and proteome-wide studies on GSCs have been conducted giving new insights in GSC biology. However, lessons had to be learned in designing high-throughput experiments and some of the resulting conclusions fell short of expectations because they were performed on only a few GSC lines or at one molecular level instead of an integrative poly-omics approach. Despite these shortcomings, our knowledge of GSC biology has markedly expanded due to a number of survival-associated biomarkers as well as glioma-relevant signaling pathways and therapeutic targets being identified. In this article we review recent findings obtained by comparative high-throughput analyses of GSCs. We further summarize fundamental concepts of systems biology as well as its applications for glioma stem cell research.

  15. Throughput, latency and cost comparisons of microcontroller-based implementations of wireless sensor network (WSN) in high jump sports

    NASA Astrophysics Data System (ADS)

    Ahmad, Afandi; Roslan, Muhammad Faris; Amira, Abbes

    2017-09-01

    In high jump sports, approach take-off speed and the force during take-off are the two main parameters for achieving a maximum jump. To measure both parameters, a wireless sensor network (WSN) containing a microcontroller and sensors is needed to report speed and force for jumpers. Most microcontrollers exhibit transmission issues in terms of throughput, latency and cost. Thus, this study presents a comparison of wireless microcontrollers in terms of throughput, latency and cost; the microcontroller with the best performance and cost will be implemented in a high jump wearable device. In the experiments, three parts were integrated: input, processing and output. A force sensor (at the ankle) and a global positioning system (GPS) sensor (at the body waist) act as inputs for data transmission. These data were then processed by both microcontrollers, ESP8266 and Arduino Yun Mini, which transmit the sensor data to the server (host PC) via the message queuing telemetry transport (MQTT) protocol. The server acts as the receiver, and the results were calculated from the MQTT log files. In the end, the ESP8266 microcontroller was chosen since it achieved higher throughput and lower latency and was 11 times cheaper than the Arduino Yun Mini microcontroller.
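
    Throughput and latency of the kind compared in this record can be computed directly from timestamped entries in the server-side MQTT log; the sketch below assumes each parsed record carries a send time, a receive time, and a payload size, which is an illustrative format rather than the authors' log layout.

    from statistics import mean

    # Hypothetical parsed log entries: (send_time_s, receive_time_s, payload_bytes)
    records = [
        (0.000, 0.042, 64),
        (0.100, 0.138, 64),
        (0.200, 0.247, 64),
        (0.300, 0.351, 64),
    ]

    latencies = [rx - tx for tx, rx, _ in records]            # per-message latency (s)
    duration = records[-1][1] - records[0][0]                 # total transfer window (s)
    throughput_bps = sum(size * 8 for _, _, size in records) / duration

    print(f"mean latency: {mean(latencies) * 1e3:.1f} ms")
    print(f"throughput:   {throughput_bps:.0f} bit/s")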

  16. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  17. From genes to protein mechanics on a chip.

    PubMed

    Otten, Marcus; Ott, Wolfgang; Jobst, Markus A; Milles, Lukas F; Verdorfer, Tobias; Pippig, Diana A; Nash, Michael A; Gaub, Hermann E

    2014-11-01

    Single-molecule force spectroscopy enables mechanical testing of individual proteins, but low experimental throughput limits the ability to screen constructs in parallel. We describe a microfluidic platform for on-chip expression, covalent surface attachment and measurement of single-molecule protein mechanical properties. A dockerin tag on each protein molecule allowed us to perform thousands of pulling cycles using a single cohesin-modified cantilever. The ability to synthesize and mechanically probe protein libraries enables high-throughput mechanical phenotyping.

  18. Achieving High Throughput for Data Transfer over ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.
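
    A transfer protocol that trades reliability for throughput, as described above, can be layered directly on UDP datagrams; the minimal sender below illustrates the idea with Python's standard socket API. The chunk size, sequence-number framing, and endpoint are arbitrary choices for the sketch, and no retransmission or loss recovery is attempted, so a small fraction of packets may simply be dropped.

    import socket

    def send_file_udp(path, host, port, chunk_size=8192):
        """Stream a file as fixed-size UDP datagrams; lost packets are not retransmitted."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            with open(path, "rb") as f:
                seq = 0
                while chunk := f.read(chunk_size):
                    # Prefix a sequence number so the receiver can detect gaps.
                    sock.sendto(seq.to_bytes(4, "big") + chunk, (host, port))
                    seq += 1
        finally:
            sock.close()

    # send_file_udp("image.raw", "198.51.100.7", 5005)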

  19. High throughput two-step ultrasonic spray deposited CH3NH3PbI3 thin film layer for solar cell application

    NASA Astrophysics Data System (ADS)

    Lan, Ding-Hung; Hong, Shao-Huan; Chou, Li-Hui; Wang, Xiao-Feng; Liu, Cheng-Liang

    2018-06-01

    Organometal halide perovskite materials have demonstrated tremendous advances in the photovoltaic field recently because of their advantageous features of simple fabrication and high power conversion efficiency. To meet the high demand for high-throughput and cost-effective fabrication, we present a wet-process method that enables probing of the parameters for perovskite layer deposition through two-step sequential ultrasonic spray-coating. This paper describes a detailed investigation of the effects of modifying the spray precursor solutions (PbI2 and CH3NH3I precursor concentrations and solvents used) and the post-annealing conditions (temperature and time), which can be tuned to achieve optimal film quality as well as improved device efficiency. Through this systematic optimization, the inverted planar perovskite solar cells show reproducible photovoltaic properties with a best power conversion efficiency (PCE) of 10.40% and an average PCE of 9.70 ± 0.40%. A continuous spray-coating technique for the rapid fabrication of a total of 16 perovskite films was demonstrated, providing a viable alternative for the high-throughput production of perovskite solar cells.

  20. External evaluation of the Dimension Vista 1500® intelligent lab system.

    PubMed

    Bruneel, Arnaud; Dehoux, Monique; Barnier, Anne; Boutten, Anne

    2012-09-01

    Dimension Vista® analyzer combines four technologies (photometry, nephelometry, V-LYTE® integrated multisensor potentiometry, and LOCI® chemiluminescence) into one high-throughput system. We assessed analytical performance of assays routinely performed in our emergency laboratory according to the VALTEC protocol, and practicability. Precision was good for most parameters. Analytical domain was large and suitable for undiluted analysis in most clinical settings encountered in our hospital. Data were comparable and correlated to our routine analyzers (Roche Modular DP®, Abbott AXSYM®, Siemens Dimension® RxL, and BN ProSpec®). Performance of nephelometric and LOCI modules was excellent. Functional sensitivities of high-sensitivity C-reactive protein and cardiac troponin I were 0.165 mg/l and 0.03 ng/ml, respectively (coefficient of variation; CV < 10%). The influence of interfering substances (i.e., hemoglobin, bilirubin, or lipids) was moderate, and Dimension Vista® specifically alerted for interference according to HIL (hemolysis, icterus, lipemia) indices. Good instrument performance and full functionality (no reagent or sample carryover in the conditions evaluated, effective sample-volume detection, and clot detection) were confirmed. Simulated routine testing demonstrated excellent practicability, throughput, ease of use of software and security. Performance and practicability of Dimension Vista® are highly suitable for both routine and emergency use. Since no volume detection, and thus no warning, is available on limited sample racks, pediatric samples require special caution, in addition to the Siemens protocol, to be analyzed under secure conditions. Our experience in routine practice is also discussed, i.e., the impact of daily workload, "manual" steps resulting from dilutions and pediatric samples, maintenance, and flex hydration on instrument performance, throughput and turnaround time. © 2012 Wiley Periodicals, Inc.

  1. Netest: A Tool to Measure the Maximum Burst Size, Available Bandwidth and Achievable Throughput

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Guojun; Tierney, Brian

    2003-01-31

    Distinguishing available bandwidth and achievable throughput is essential for improving network applications' performance. Achievable throughput is the throughput considering a number of factors such as network protocol, host speed, network path, and TCP buffer space, whereas available bandwidth only considers the network path. Without understanding this difference, trying to improve network applications' performance is like ''blind men feeling the elephant'' [4]. In this paper, we define and distinguish bandwidth and throughput, and debate which part of each is achievable and which is available. Also, we introduce and discuss a new concept, Maximum Burst Size, that is crucial to network performance and bandwidth sharing. A tool, netest, is introduced to help users determine the available bandwidth; it provides information to achieve better throughput with fair sharing of the available bandwidth, thus reducing misuse of the network.
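
    As a back-of-the-envelope illustration of the distinction drawn above (not part of netest itself), achievable throughput is capped both by the path's available bandwidth and by end-host limits such as the TCP window divided by the round-trip time; the numbers in the example are hypothetical.

      def achievable_throughput_mbps(available_bw_mbps, tcp_window_bytes, rtt_ms):
          """Achievable throughput is capped by the path's available bandwidth
          and by host-side limits such as window size / round-trip time."""
          window_limit_mbps = (tcp_window_bytes * 8) / (rtt_ms / 1000.0) / 1e6
          return min(available_bw_mbps, window_limit_mbps)

      # Example: a 1 Gb/s path with a 64 KiB window and 40 ms RTT is window-limited
      # to roughly 13 Mb/s even though far more bandwidth is available.
      print(achievable_throughput_mbps(1000.0, 64 * 1024, 40.0))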

  2. A radial flow microfluidic device for ultra-high-throughput affinity-based isolation of circulating tumor cells.

    PubMed

    Murlidhar, Vasudha; Zeinali, Mina; Grabauskiene, Svetlana; Ghannad-Rezaie, Mostafa; Wicha, Max S; Simeone, Diane M; Ramnath, Nithya; Reddy, Rishindra M; Nagrath, Sunitha

    2014-12-10

    Circulating tumor cells (CTCs) are believed to play an important role in metastasis, a process responsible for the majority of cancer-related deaths. However, their rarity in the bloodstream makes microfluidic isolation complex and time-consuming, and low processing speeds can hinder obtaining higher yields of CTCs, limiting their potential use as biomarkers for early diagnosis. Here, a high throughput microfluidic technology, the OncoBean Chip, is reported. It employs radial flow that introduces a varying shear profile across the device, enabling efficient cell capture by affinity at high flow rates. The recovery from whole blood is validated with cancer cell lines H1650 and MCF7, achieving a mean efficiency >80% at a throughput of 10 mL h(-1) in contrast to the flow rate of 1 mL h(-1) typically reported with other microfluidic devices. Cells are recovered with a viability rate of 93% at these high speeds, increasing the ability to use captured CTCs for downstream analysis. Broad clinical application is demonstrated using comparable flow rates from blood specimens obtained from breast, pancreatic, and lung cancer patients. Comparable CTC numbers are recovered in all the samples at the two flow rates, demonstrating the ability of the technology to perform at high throughputs. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Wei; Shabbir, Faizan; Gong, Chao

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  4. Detection of protein-small molecule binding using a self-referencing external cavity laser biosensor.

    PubMed

    Meng Zhang; Peh, Jessie; Hergenrother, Paul J; Cunningham, Brian T

    2014-01-01

    High throughput screening of protein-small molecule binding interactions using label-free optical biosensors is challenging, as the detected signals are often similar in magnitude to experimental noise. Here, we describe a novel self-referencing external cavity laser (ECL) biosensor approach that achieves high resolution and high sensitivity, while eliminating thermal noise with sub-picometer wavelength accuracy. Using the self-referencing ECL biosensor, we demonstrate detection of binding between small molecules and a variety of immobilized protein targets with binding affinities or inhibition constants in the sub-nanomolar to low micromolar range. The demonstrated ability to perform detection in the presence of several interfering compounds opens the potential for increasing the throughput of the approach. As an example application, we performed a "needle-in-the-haystack" screen for inhibitors against carbonic anhydrase isozyme II (CA II), in which known inhibitors are clearly differentiated from inactive molecules within a compound library.

  5. Dynamic bandwidth allocation based on multiservice in software-defined wavelength-division multiplexing time-division multiplexing passive optical network

    NASA Astrophysics Data System (ADS)

    Wang, Fu; Liu, Bo; Zhang, Lijia; Jin, Feifei; Zhang, Qi; Tian, Qinghua; Tian, Feng; Rao, Lan; Xin, Xiangjun

    2017-03-01

    The wavelength-division multiplexing passive optical network (WDM-PON) is a potential technology to carry multiple services in an optical access network. However, it suffers from high cost and immature technology for users. A software-defined WDM/time-division multiplexing PON was proposed to meet the requirements of high bandwidth, high performance, and multiple services, together with a reasonable and effective uplink dynamic bandwidth allocation algorithm. A controller with dynamic wavelength and slot assignment was introduced, and a different optical dynamic bandwidth management strategy was formulated flexibly for services of different priorities according to the network loading. The simulation compares the proposed algorithm with the interleaved polling with adaptive cycle time algorithm. The proposed algorithm shows better performance in average delay, throughput, and bandwidth utilization. The results show that the delay is reduced to 62% and the throughput is improved by 35%.
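
    A toy sketch of a priority-aware allocation step of the kind such an algorithm performs each polling cycle (a simplification for illustration, not the algorithm proposed in the paper; the ONU names and slot counts are made up):

      def allocate_slots(requests, cycle_capacity):
          """requests: list of (onu_id, priority, requested_slots).
          Higher-priority requests are granted first; leftover capacity goes
          to lower priorities until the cycle is full."""
          remaining = cycle_capacity
          grants = {}
          for onu_id, _priority, requested in sorted(requests, key=lambda r: r[1], reverse=True):
              grant = min(requested, remaining)
              grants[onu_id] = grant
              remaining -= grant
          return grants

      # Example: the high-priority ONU is served fully before best-effort traffic.
      print(allocate_slots([("onu1", 2, 30), ("onu2", 1, 50), ("onu3", 0, 40)], 100))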

  6. Design, motivation, and on-sky tests of an efficient fiber coupling unit for 1-meter class telescopes

    NASA Astrophysics Data System (ADS)

    Bottom, Michael; Muirhead, Philip S.; Swift, Jonathan J.; Zhao, Ming; Gardner, Paul; Plavchan, Peter P.; Riddle, Reed L.; Herzig, Erich; Johnson, John A.; Wright, Jason T.; McCrady, Nate; Wittenmyer, Robert A.

    2014-08-01

    We present the science motivation, design, and on-sky test data of a high-throughput fiber coupling unit suitable for automated 1-meter class telescopes. The optical and mechanical design of the fiber coupling is detailed and we describe a flexible controller software designed specifically for this unit. The system performance is characterized with a set of numerical simulations, and we present on-sky results that validate the performance of the controller and the expected throughput of the fiber coupling. This unit was designed specifically for the MINERVA array, a robotic observatory consisting of multiple 0.7 m telescopes linked to a single high-resolution stabilized spectrograph for the purpose of exoplanet discovery using high-cadence radial velocimetry. However, this unit could easily be used for general astronomical purposes requiring fiber coupling or precise guiding.

  7. Crystal Symmetry Algorithms in a High-Throughput Framework for Materials

    NASA Astrophysics Data System (ADS)

    Taylor, Richard

    The high-throughput framework AFLOW that has been developed and used successfully over the last decade is improved to include fully-integrated software for crystallographic symmetry characterization. The standards used in the symmetry algorithms conform with the conventions and prescriptions given in the International Tables for Crystallography (ITC). A standard cell choice with standard origin is selected, and the space group, point group, Bravais lattice, crystal system, lattice system, and representative symmetry operations are determined. Following the conventions of the ITC, the Wyckoff sites are also determined and their labels and site symmetry are provided. The symmetry code makes no assumptions on the input cell orientation, origin, or reduction and has been integrated in the AFLOW high-throughput framework for materials discovery by adding to the existing code base and making use of existing classes and functions. The software is written in object-oriented C++ for flexibility and reuse. A performance analysis and an examination of the algorithms' scaling with cell size and symmetry are also reported.
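
    As a small illustration of the kind of test such a symmetry code performs (a sketch, not AFLOW's implementation), a Cartesian point operation R is a symmetry of a lattice whose column vectors form the matrix L exactly when its representation in lattice coordinates, inv(L) R L, has integer entries:

      import numpy as np

      def is_lattice_symmetry(R, L, tol=1e-6):
          """R: 3x3 Cartesian point operation; L: 3x3 matrix whose columns are
          the lattice vectors. R maps the lattice onto itself exactly when its
          representation in lattice coordinates is an integer matrix."""
          M = np.linalg.inv(L) @ R @ L
          return np.allclose(M, np.round(M), atol=tol)

      # Example: a 90-degree rotation about z is a symmetry of a simple cubic lattice.
      L = np.eye(3)
      R90 = np.array([[0.0, -1.0, 0.0],
                      [1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0]])
      print(is_lattice_symmetry(R90, L))  # True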

  8. A gas trapping method for high-throughput metabolic experiments.

    PubMed

    Krycer, James R; Diskin, Ciana; Nelson, Marin E; Zeng, Xiao-Yi; Fazakerley, Daniel J; James, David E

    2018-01-01

    Research into cellular metabolism has become more high-throughput, with typical cell-culture experiments being performed in multiwell plates (microplates). This format presents a challenge when trying to collect gaseous products, such as carbon dioxide (CO2), which requires a sealed environment and a vessel separate from the biological sample. To address this limitation, we developed a gas trapping protocol using perforated plastic lids in sealed cell-culture multiwell plates. We used this trap design to measure CO2 production from glucose and fatty acid metabolism, as well as hydrogen sulfide production from cysteine-treated cells. Our data clearly show that this gas trap can be applied to liquid and solid gas-collection media and can be used to study gaseous product generation by both adherent cells and cells in suspension. Since our gas traps can be adapted to multiwell plates of various sizes, they present a convenient, cost-effective solution that can accommodate the trend toward high-throughput measurements in metabolic research.

  9. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    PubMed

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

    Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides simple, high-throughput classification of cells as pluripotent or nonpluripotent in a 7 min analysis, while being more cost-effective than conventional genomic tests.

  10. Using constitutive activity to define appropriate high-throughput screening assays for orphan g protein-coupled receptors.

    PubMed

    Ngo, Tony; Coleman, James L J; Smith, Nicola J

    2015-01-01

    Orphan G protein-coupled receptors represent an underexploited resource for drug discovery but pose a considerable challenge for assay development because their cognate G protein signaling pathways are often unknown. In this methodological chapter, we describe the use of constitutive activity, that is, the inherent ability of receptors to couple to their cognate G proteins in the absence of ligand, to inform the development of high-throughput screening assays for a particular orphan receptor. We specifically focus on a two-step process, whereby constitutive G protein coupling is first determined using yeast Gpa1/human G protein chimeras linked to growth and β-galactosidase generation. Coupling selectivity is then confirmed in mammalian cells expressing endogenous G proteins and driving accumulation of transcription factor-fused luciferase reporters specific to each of the classes of G protein. Based on these findings, high-throughput screening campaigns can be performed on the already miniaturized mammalian reporter system.

  11. A Method of High Throughput Monitoring Crop Physiology Using Chlorophyll Fluorescence and Multispectral Imaging.

    PubMed

    Wang, Heng; Qian, Xiangjie; Zhang, Lan; Xu, Sailong; Li, Haifeng; Xia, Xiaojian; Dai, Liankui; Xu, Liang; Yu, Jingquan; Liu, Xu

    2018-01-01

    We present a high-throughput crop physiology monitoring system and a corresponding monitoring method. The system performs large-area chlorophyll fluorescence imaging and multispectral imaging, and the method determines the current crop condition continuously and non-destructively. We chose chlorophyll fluorescence parameters and relative multispectral reflectance as indicators of crop physiological status. Using tomato as the experimental subject, typical crop physiological stresses, such as drought, nutrient deficiency and plant disease, can be distinguished by the monitoring method. Furthermore, we studied the correlation between the physiological indicators and the degree of stress. Beyond continuous monitoring of crop physiology, the system and method open the possibility of automatic machine diagnosis of plant physiology. Highlights: A newly designed high-throughput crop physiology monitoring system and the corresponding monitoring method are described in this study. Different types of stress induce distinct fluorescence and spectral characteristics, which can be used to evaluate the physiological status of plants.

  12. High-throughput characterization for solar fuels materials discovery

    NASA Astrophysics Data System (ADS)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

    In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts or light absorbers for the PEC do not exist. The mission of HTE is to provide the accelerated discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). In this talk I present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  13. An automated high throughput tribometer for adhesion, wear, and friction measurements

    NASA Astrophysics Data System (ADS)

    Kalihari, Vivek; Timpe, Shannon J.; McCarty, Lyle; Ninke, Matthew; Whitehead, Jim

    2013-03-01

    Understanding the origin and correlation of different surface properties under a multitude of operating conditions is critical in tribology. Diverse tribological properties and a lack of a single instrument to measure all make it difficult to compare and correlate properties, particularly in light of the wide range of interfaces commonly investigated. In the current work, a novel automated tribometer has been designed and validated, providing a unique experimental platform capable of high throughput adhesion, wear, kinetic friction, and static friction measurements. The innovative design aspects are discussed that allow for a variety of probes, sample surfaces, and testing conditions. Critical components of the instrument and their design criteria are described along with examples of data collection schemes. A case study is presented with multiple surface measurements performed on a set of characteristic substrates. Adhesion, wear, kinetic friction, and static friction are analyzed and compared across surfaces, highlighting the comprehensive nature of the surface data that can be generated using the automated high throughput tribometer.

  14. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules

    PubMed Central

    Panzeri, Francesco

    2017-01-01

    We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions. PMID:28419142

  15. Ethoscopes: An open platform for high-throughput ethomics

    PubMed Central

    Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J.; French, Alice S.; Jamasb, Arian R.

    2017-01-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope. PMID:29049280

  16. Fluorescence Adherence Inhibition Assay: A Novel Functional Assessment of Blocking Virus Attachment by Vaccine-Induced Antibodies

    PubMed Central

    Asati, Atul; Kachurina, Olga; Karol, Alex; Dhir, Vipra; Nguyen, Michael; Parkhill, Robert; Kouiavskaia, Diana; Chumakov, Konstantin; Warren, William; Kachurin, Anatoly

    2016-01-01

    Neutralizing antibodies induced by vaccination or natural infection play a critically important role in protection against viral diseases. In general, neutralization of the viral infection occurs via two major pathways: pre- and post-attachment modes, the first being the most important for such infections as influenza and polio, the latter being significant for filoviruses. Neutralizing capacity of antibodies is typically evaluated by virus neutralization assays that assess reduction of viral infectivity to the target cells in the presence of functional antibodies. Plaque reduction neutralization test, microneutralization and immunofluorescent assays are often used as gold standard virus neutralization assays. However, these methods are associated with several important prerequisites such as use of live virus requiring safety precautions, tedious evaluation procedures and long assessment times. Hence, there is a need for a robust, inexpensive, high-throughput functional assay that can be performed rapidly using inactivated virus, without extensive safety precautions. Herein, we report a novel high-throughput Fluorescence Adherence Inhibition assay (fADI) using inactivated virus labeled with fluorescent secondary antibodies, with Vero cells or erythrocytes as targets. It requires only a few hours to assess the pre-attachment neutralizing capacity of donor sera. The fADI assay was tested successfully on donors immunized with polio, yellow fever and influenza vaccines. To further simplify and improve the throughput of the assay, we have developed a mathematical approach for calculating the 50% titers from a single sample dilution, without the need to analyze multi-point titration curves. Assessment of pre- and post-vaccination human sera from subjects immunized with IPOL®, YF-VAX® and 2013–2014 Fluzone® vaccines demonstrated high efficiency of the assay. The results correlated very well with the microneutralization assay performed independently by the FDA Center for Biologics Evaluation and Research, with the plaque reduction neutralization test performed by Focus Diagnostics, and with the hemagglutination inhibition assay performed in-house at Sanofi Pasteur. Taken together, the fADI assay appears to be a useful high-throughput functional immunoassay for assessment of antibody-related neutralization of viral infections for which the pre-attachment neutralization pathway is predominant, such as polio, influenza, yellow fever and dengue. PMID:26863313

  17. The iPlant collaborative: cyberinfrastructure for enabling data to discovery for the life sciences

    USDA-ARS?s Scientific Manuscript database

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning m...

  18. Modulation and coding for throughput-efficient optical free-space links

    NASA Technical Reports Server (NTRS)

    Georghiades, Costas N.

    1993-01-01

    Optical direct-detection systems are currently being considered for some high-speed inter-satellite links, where data-rates of a few hundred megabits per second are envisioned under power and pulsewidth constraints. In this paper we investigate the capacity, cutoff-rate and error-probability performance of uncoded and trellis-coded systems for various modulation schemes and under various throughput and power constraints. Modulation schemes considered are on-off keying (OOK), pulse-position modulation (PPM), overlapping PPM (OPPM) and multi-pulse (combinatorial) PPM (MPPM).

  19. Supplemental treatment of air in airborne infection isolation rooms using high-throughput in-room air decontamination units.

    PubMed

    Bergeron, Vance; Chalfine, Annie; Misset, Benoît; Moules, Vincent; Laudinet, Nicolas; Carlet, Jean; Lina, Bruno

    2011-05-01

    Evidence has recently emerged indicating that in addition to large airborne droplets, fine aerosol particles can be an important mode of influenza transmission that may have been hitherto underestimated. Furthermore, recent performance studies evaluating airborne infection isolation (AII) rooms designed to house infectious patients have revealed major discrepancies between what is prescribed and what is actually measured. We conducted an experimental study to investigate the use of high-throughput in-room air decontamination units for supplemental protection against airborne contamination in areas that host infectious patients. The study included both intrinsic performance tests of the air-decontamination unit against biological aerosols of particular epidemiologic interest and field tests in a hospital AII room under different ventilation scenarios. The unit tested efficiently eradicated airborne H5N2 influenza and Mycobacterium bovis (a 4- to 5-log single-pass reduction) and, when implemented with a room extractor, reduced the peak contamination levels by a factor of 5, with decontamination rates at least 33% faster than those achieved with the extractor alone. High-throughput in-room air treatment units can provide supplemental control of airborne pathogen levels in patient isolation rooms. Copyright © 2011 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.

  20. GPU Lossless Hyperspectral Data Compression System

    NASA Technical Reports Server (NTRS)

    Aranki, Nazeeh I.; Keymeulen, Didier; Kiely, Aaron B.; Klimesh, Matthew A.

    2014-01-01

    Hyperspectral imaging systems onboard aircraft or spacecraft can acquire large amounts of data, putting a strain on limited downlink and storage resources. Onboard data compression can mitigate this problem but may require a system capable of a high throughput. In order to achieve a high throughput with a software compressor, a graphics processing unit (GPU) implementation of a compressor was developed targeting the current state-of-the-art GPUs from NVIDIA(R). The implementation is based on the fast lossless (FL) compression algorithm reported in "Fast Lossless Compression of Multispectral-Image Data" (NPO- 42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which operates on hyperspectral data and achieves excellent compression performance while having low complexity. The FL compressor uses an adaptive filtering method and achieves state-of-the-art performance in both compression effectiveness and low complexity. The new Consultative Committee for Space Data Systems (CCSDS) Standard for Lossless Multispectral & Hyperspectral image compression (CCSDS 123) is based on the FL compressor. The software makes use of the highly-parallel processing capability of GPUs to achieve a throughput at least six times higher than that of a software implementation running on a single-core CPU. This implementation provides a practical real-time solution for compression of data from airborne hyperspectral instruments.

  1. High-performance single cell genetic analysis using microfluidic emulsion generator arrays.

    PubMed

    Zeng, Yong; Novak, Richard; Shuga, Joe; Smith, Martyn T; Mathies, Richard A

    2010-04-15

    High-throughput genetic and phenotypic analysis at the single cell level is critical to advance our understanding of the molecular mechanisms underlying cellular function and dysfunction. Here we describe a high-performance single cell genetic analysis (SCGA) technique that combines high-throughput microfluidic emulsion generation with single cell multiplex polymerase chain reaction (PCR). Microfabricated emulsion generator array (MEGA) devices containing 4, 32, and 96 channels are developed to confer a flexible capability of generating up to 3.4 x 10(6) nanoliter-volume droplets per hour. Hybrid glass-polydimethylsiloxane diaphragm micropumps integrated into the MEGA chips afford uniform droplet formation, controlled generation frequency, and effective transportation and encapsulation of primer functionalized microbeads and cells. A multiplex single cell PCR method is developed to detect and quantify both wild type and mutant/pathogenic cells. In this method, microbeads functionalized with multiple forward primers targeting specific genes from different cell types are used for solid-phase PCR in droplets. Following PCR, the droplets are lysed and the beads are pooled and rapidly analyzed by multicolor flow cytometry. Using Escherichia coli bacterial cells as a model, we show that this technique enables digital detection of pathogenic E. coli O157 cells in a high background of normal K12 cells, with a detection limit on the order of 1/10(5). This result demonstrates that multiplex SCGA is a promising tool for high-throughput quantitative digital analysis of genetic variation in complex populations.

  2. High-Performance Single Cell Genetic Analysis Using Microfluidic Emulsion Generator Arrays

    PubMed Central

    Zeng, Yong; Novak, Richard; Shuga, Joe; Smith, Martyn T.; Mathies, Richard A.

    2010-01-01

    High-throughput genetic and phenotypic analysis at the single cell level is critical to advance our understanding of the molecular mechanisms underlying cellular function and dysfunction. Here we describe a high-performance single cell genetic analysis (SCGA) technique that combines high-throughput microfluidic emulsion generation with single cell multiplex PCR. Microfabricated emulsion generator array (MEGA) devices containing 4, 32 and 96 channels are developed to confer a flexible capability of generating up to 3.4 × 106 nanoliter-volume droplets per hour. Hybrid glass-polydimethylsiloxane diaphragm micropumps integrated into the MEGA chips afford uniform droplet formation, controlled generation frequency, and effective transportation and encapsulation of primer functionalized microbeads and cells. A multiplex single cell PCR method is developed to detect and quantify both wild type and mutant/pathogenic cells. In this method, microbeads functionalized with multiple forward primers targeting specific genes from different cell types are used for solid-phase PCR in droplets. Following PCR, the droplets are lysed, the beads are pooled and rapidly analyzed by multi-color flow cytometry. Using E. coli bacterial cells as a model, we show that this technique enables digital detection of pathogenic E. coli O157 cells in a high background of normal K12 cells, with a detection limit on the order of 1:105. This result demonstrates that multiplex SCGA is a promising tool for high-throughput quantitative digital analysis of genetic variation in complex populations. PMID:20192178

  3. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  4. Optimizing SIEM Throughput on the Cloud Using Parallelization.

    PubMed

    Alam, Masoom; Ihsan, Asif; Khan, Muazzam A; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, Muhammad Khurram; Farooq, Sajid

    2016-01-01

    Processing large amounts of data in real time for identifying security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of a security framework, OSTROM, built on the Esper complex event processing (CEP) engine under a parallel and non-parallel computational framework. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of events coming in rather than the queries being processed. The architecture where 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory and CPU usage.
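
    A simplified sketch of the partitioning idea evaluated above (not OSTROM or Esper code): incoming events are split across N worker processes and every worker evaluates the full query set on its share of the stream; the example queries are hypothetical.

      from multiprocessing import Pool

      # Hypothetical example queries; in a real CEP deployment these would be
      # Esper EPL statements rather than Python predicates.
      QUERIES = [
          ("failed_login", lambda e: e.get("type") == "auth" and not e.get("ok", True)),
          ("large_transfer", lambda e: e.get("bytes", 0) > 10_000_000),
      ]

      def run_all_queries(events):
          hits = {name: 0 for name, _ in QUERIES}
          for event in events:
              for name, predicate in QUERIES:
                  if predicate(event):
                      hits[name] += 1
          return hits

      def process_parallel(events, workers=4):
          shards = [events[i::workers] for i in range(workers)]  # 1/N of the stream each
          with Pool(workers) as pool:  # guard with a __main__ check when run as a script
              partial = pool.map(run_all_queries, shards)
          return {name: sum(p[name] for p in partial) for name, _ in QUERIES}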

  5. Arioc: high-throughput read alignment with GPU-accelerated exploration of the seed-and-extend search space

    PubMed Central

    Budavari, Tamas; Langmead, Ben; Wheelan, Sarah J.; Salzberg, Steven L.; Szalay, Alexander S.

    2015-01-01

    When computing alignments of DNA sequences to a large genome, a key element in achieving high processing throughput is to prioritize locations in the genome where high-scoring mappings might be expected. We formulated this task as a series of list-processing operations that can be efficiently performed on graphics processing unit (GPU) hardware. We followed this approach in implementing a read aligner called Arioc that uses GPU-based parallel sort and reduction techniques to identify high-priority locations where potential alignments may be found. We then carried out a read-by-read comparison of Arioc's reported alignments with the alignments found by several leading read aligners. With simulated reads, Arioc has comparable or better accuracy than the other read aligners we tested. With human sequencing reads, Arioc demonstrates significantly greater throughput than the other aligners we evaluated across a wide range of sensitivity settings. The Arioc software is available at https://github.com/RWilton/Arioc. It is released under a BSD open-source license. PMID:25780763
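
    A conceptual CPU-side sketch of the prioritization step described above (not Arioc's GPU implementation): seeds from the read are looked up in a precomputed genome seed index and candidate alignment positions are ranked by how many seeds support them; the GPU version performs the equivalent ranking with parallel sort and reduction.

      from collections import Counter, defaultdict

      def build_seed_index(genome, k=11):
          index = defaultdict(list)
          for pos in range(len(genome) - k + 1):
              index[genome[pos:pos + k]].append(pos)
          return index

      def prioritized_candidates(read, index, k=11, top=5):
          votes = Counter()
          for offset in range(len(read) - k + 1):
              for pos in index.get(read[offset:offset + k], ()):
                  votes[pos - offset] += 1  # candidate alignment start position
          return votes.most_common(top)     # most-supported locations are extended first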

  6. Genome editing in the mushroom-forming basidiomycete Coprinopsis cinerea, optimized by a high-throughput transformation system.

    PubMed

    Sugano, Shigeo S; Suzuki, Hiroko; Shimokita, Eisuke; Chiba, Hirofumi; Noji, Sumihare; Osakabe, Yuriko; Osakabe, Keishi

    2017-04-28

    Mushroom-forming basidiomycetes produce a wide range of metabolites and have great value not only as food but also as an important global natural resource. Here, we demonstrate CRISPR/Cas9-based genome editing in the model species Coprinopsis cinerea. Using a high-throughput reporter assay with cryopreserved protoplasts, we identified a novel promoter, CcDED1pro, with seven times stronger activity in this assay than the conventional promoter GPD2. To develop highly efficient genome editing using CRISPR/Cas9 in C. cinerea, we used CcDED1pro to express Cas9 and a U6-snRNA promoter from C. cinerea to express gRNA. Finally, CRISPR/Cas9-mediated GFP mutagenesis was performed in a stable GFP expression line. Individual genome-edited lines were isolated, and loss of GFP function was detected in hyphae and fruiting body primordia. This novel method of high-throughput CRISPR/Cas9-based genome editing using cryopreserved protoplasts should be a powerful tool in the study of edible mushrooms.

  7. High-Throughput and Low-Latency Network Communication with NetIO

    NASA Astrophysics Data System (ADS)

    Schumacher, Jörn; Plessl, Christian; Vandelli, Wainer

    2017-10-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well specified number of systems. Unfortunately traditional network communication APIs for HPC clusters like MPI or PGAS exclusively target the HPC community and are not suited well for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs, but it requires a non-negligible effort and expert knowledge. At the same time, message services like ZeroMQ have gained popularity in the HEP community. They make it possible to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on top of Infiniband and OmniPath, this approach may not be very efficient compared to a direct use of native APIs. NetIO is a simple, novel asynchronous message service that can operate on Ethernet, Infiniband and similar network fabrics. In this paper the design and implementation of NetIO is presented and described, and its use is evaluated in comparison to other approaches. NetIO supports different high-level programming models and typical workloads of HEP applications. The ATLAS FELIX project [1] successfully uses NetIO as its central communication platform. The architecture of NetIO is described in this paper, including the user-level API and the internal data-flow design. The paper includes a performance evaluation of NetIO including throughput and latency measurements. The performance is compared against the state-of-the-art ZeroMQ message service. Performance measurements are performed in a lab environment with Ethernet and FDR Infiniband networks.

  8. High Throughput System for Plant Height and Hyperspectral Measurement

    NASA Astrophysics Data System (ADS)

    Zhao, H.; Xu, L.; Jiang, H.; Shi, S.; Chen, D.

    2018-04-01

    Hyperspectral and three-dimensional measurements capture the intrinsic physicochemical properties and external geometrical characteristics of objects, respectively. Currently, a variety of sensors are integrated into a system to collect spectral and morphological information in agriculture. However, previous experiments were usually performed with several commercial devices on a single platform. Inadequate registration and synchronization among instruments often resulted in mismatch between spectral and 3D information of the same target, and a narrow field of view (FOV) extends working hours in the field. Therefore, we propose a high-throughput prototype that combines stereo vision and grating dispersion to simultaneously acquire hyperspectral and 3D information.

  9. High-Throughput Continuous Hydrothermal Synthesis of Transparent Conducting Aluminum and Gallium Co-doped Zinc Oxides.

    PubMed

    Howard, Dougal P; Marchand, Peter; McCafferty, Liam; Carmalt, Claire J; Parkin, Ivan P; Darr, Jawwad A

    2017-04-10

    High-throughput continuous hydrothermal flow synthesis was used to generate a library of aluminum- and gallium-codoped zinc oxide nanoparticles of specific atomic ratios. Resistivities of the materials were determined by Hall effect measurements on heat-treated pressed discs and the results collated into a conductivity-composition map. Optimal resistivities of ∼9 × 10^-3 Ω cm were reproducibly achieved for several samples, for example, codoped ZnO with 2 at% Ga and 1 at% Al. The optimum sample on balance of performance and cost was deemed to be ZnO codoped with 3 at% Al and 1 at% Ga.

  10. A High-Throughput Screen Reveals New Small-Molecule Activators and Inhibitors of Pantothenate Kinases

    PubMed Central

    2016-01-01

    Pantothenate kinase (PanK) is a regulatory enzyme that controls coenzyme A (CoA) biosynthesis. The association of PanK with neurodegeneration and diabetes suggests that chemical modifiers of PanK activity may be useful therapeutics. We performed a high throughput screen of >520000 compounds from the St. Jude compound library and identified new potent PanK inhibitors and activators with chemically tractable scaffolds. The HTS identified PanK inhibitors exemplified by the detailed characterization of a tricyclic compound (7) and a preliminary SAR. Biophysical studies reveal that the PanK inhibitor acts by binding to the ATP–enzyme complex. PMID:25569308

  11. Convenient, Sensitive and High-Throughput Method for Screening Botanic Origin

    NASA Astrophysics Data System (ADS)

    Yuan, Yuan; Jiang, Chao; Liu, Libing; Yu, Shulin; Cui, Zhanhu; Chen, Min; Lin, Shufang; Wang, Shu; Huang, Luqi

    2014-06-01

    In this work, a rapid (within 4-5 h), sensitive and visual new method for assessing botanic origin is developed by combining loop-mediated isothermal amplification with cationic conjugated polymers. Two Chinese medicinal materials (Jin-Yin-Hua and Shan-Yin-Hua) with similar morphology and chemical composition were clearly distinguished by SNP genotyping assays. The identification of plant species in patented Chinese drugs containing Lonicera buds is successfully performed using this detection system. The method is also robust enough to be used in high-throughput screening. This new method is very helpful for identifying herbal materials and for assessing the safety and quality of botanic products.

  12. Application of High-Throughput In Vitro Assays for Risk-Based ...

    EPA Pesticide Factsheets

    Multiple drivers shape the types of human-health assessments performed on chemicals by U.S. EPA, resulting in chemical assessments that are "fit-for-purpose," ranging from prioritization for further testing to full risk assessments. Layered on top of the diverse assessment needs are the resource intensive nature of traditional toxicological studies used to test chemicals and the lack of toxicity information on many chemicals. To address these challenges, the Agency initiated the ToxCast program to screen thousands of chemicals across hundreds of high-throughput screening assays in concentration-response format. One of the findings of the project has been that the majority of chemicals interact with multiple biological targets within a narrow concentration range and the extent of interactions increases rapidly near the concentration causing cytotoxicity. This means that application of high-throughput in vitro assays to chemical assessments will need to identify both the relative selectivity with which chemicals interact with biological targets and the concentration at which these interactions perturb signaling pathways. The integrated analyses will be used to both define a point-of-departure for comparison with human exposure estimates and identify which chemicals may benefit from further studies in a mode-of-action or adverse outcome pathway framework. The application of new technologies in a risk-based, tiered manner provides flexibility in matching throughput and cos
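
    A hedged illustration of the two quantities highlighted above, using made-up potencies and a hypothetical helper function (not an EPA tool): selectivity is expressed relative to the cytotoxicity concentration, and the most sensitive potency serves as a crude point of departure for comparison with an exposure estimate.

      import math

      def summarize_hits(assay_ac50_uM, cytotox_ac50_uM, exposure_estimate_uM):
          """assay_ac50_uM: hypothetical target potencies; selectivity is the log10
          ratio of the cytotoxicity concentration to each target AC50, and the most
          sensitive AC50 serves as a crude point of departure."""
          selectivity = {assay: math.log10(cytotox_ac50_uM / ac50)
                         for assay, ac50 in assay_ac50_uM.items()}
          pod = min(assay_ac50_uM.values())
          margin = math.log10(pod / exposure_estimate_uM)  # margin relative to exposure estimate
          return selectivity, pod, margin

      print(summarize_hits({"ER_agonist": 0.5, "PPARg": 8.0},
                           cytotox_ac50_uM=30.0, exposure_estimate_uM=0.001))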

  13. High-throughput electrical characterization for robust overlay lithography control

    NASA Astrophysics Data System (ADS)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput and robust overlay measurement is a challenge in the current 14nm and upcoming advanced nodes with the transition to 300mm and upcoming 450mm semiconductor manufacturing, where a slight deviation in overlay has a significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements due to sensitivity, throughput and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance, where minima and inflection points are extracted to characterize overlay control and the process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and the overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss challenges of integrating the electrical measurement based approach in semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for accessing overlay as well as process window and margins simultaneously from a robust, high-throughput electrical measurement approach.
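
    A minimal sketch of the fitting idea (not the authors' analysis code, and with synthetic numbers): resistance measured at intentionally programmed misalignments is fitted with a parabola whose vertex estimates the overlay error.

      import numpy as np

      def overlay_from_resistance(misalignment_nm, resistance_ohm):
          a, b, c = np.polyfit(misalignment_nm, resistance_ohm, 2)  # R ~ a*x^2 + b*x + c
          return -b / (2 * a)                                       # misalignment at minimum resistance

      # Synthetic example: a programmed sweep recovers a true overlay error of ~3 nm.
      x = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])
      r = 0.02 * (x - 3.0) ** 2 + 100.0
      print(round(overlay_from_resistance(x, r), 2))  # ~3.0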

  14. 20150325 - Application of High-Throughput In Vitro Assays for ...

    EPA Pesticide Factsheets

    Multiple drivers shape the types of human-health assessments performed on chemicals by U.S. EPA, resulting in chemical assessments that are "fit-for-purpose," ranging from prioritization for further testing to full risk assessments. Layered on top of the diverse assessment needs are the resource intensive nature of traditional toxicological studies used to test chemicals and the lack of toxicity information on many chemicals. To address these challenges, the Agency initiated the ToxCast program to screen thousands of chemicals across hundreds of high-throughput screening assays in concentration-response format. One of the findings of the project has been that the majority of chemicals interact with multiple biological targets within a narrow concentration range and the extent of interactions increases rapidly near the concentration causing cytotoxicity. This means that application of high-throughput in vitro assays to chemical assessments will need to identify both the relative selectivity with which chemicals interact with biological targets and the concentration at which these interactions perturb signaling pathways. The integrated analyses will be used to both define a point-of-departure for comparison with human exposure estimates and identify which chemicals may benefit from further studies in a mode-of-action or adverse outcome pathway framework. The application of new technologies in a risk-based, tiered manner provides flexibility in matching throughput and cos

  15. High Throughput PBTK: Open-Source Data and Tools for ...

    EPA Pesticide Factsheets

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  16. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073

  17. High-Throughput Platform for Synthesis of Melamine-Formaldehyde Microcapsules.

    PubMed

    Çakir, Seda; Bauters, Erwin; Rivero, Guadalupe; Parasote, Tom; Paul, Johan; Du Prez, Filip E

    2017-07-10

    The synthesis of microcapsules via in situ polymerization is a labor-intensive and time-consuming process, where many composition and process factors affect the microcapsule formation and its morphology. Herein, we report a novel combinatorial technique for the preparation of melamine-formaldehyde microcapsules, using a custom-made and automated high-throughput platform (HTP). After performing validation experiments to ensure the accuracy and reproducibility of the novel platform, a design-of-experiments study was performed. The influence of different encapsulation parameters was investigated, such as surfactant type, surfactant concentration and core/shell ratio. As a result, this HTP is suitable for the synthesis of different types of microcapsules in an automated and controlled way, allowing different reaction parameters to be screened in a shorter time than manual synthesis techniques.

  18. Influence relevance voting: an accurate and interpretable virtual high throughput screening method.

    PubMed

    Swamidass, S Joshua; Azencott, Chloé-Agathe; Lin, Ting-Wan; Gramajo, Hugo; Tsai, Shiou-Chuan; Baldi, Pierre

    2009-04-01

    Given activity training data from high-throughput screening (HTS) experiments, virtual high-throughput screening (vHTS) methods aim to predict in silico the activity of untested chemicals. We present a novel method, the Influence Relevance Voter (IRV), specifically tailored for the vHTS task. The IRV is a low-parameter neural network which refines a k-nearest neighbor classifier by nonlinearly combining the influences of a chemical's neighbors in the training set. Influences are decomposed, also nonlinearly, into a relevance component and a vote component. The IRV is benchmarked using the data and rules of two large, open, competitions, and its performance compared to the performance of other participating methods, as well as of an in-house support vector machine (SVM) method. On these benchmark data sets, IRV achieves state-of-the-art results, comparable to the SVM in one case, and significantly better than the SVM in the other, retrieving three times as many actives in the top 1% of its prediction-sorted list. The IRV presents several other important advantages over SVMs and other methods: (1) the output predictions have a probabilistic semantic; (2) the underlying inferences are interpretable; (3) the training time is very short, on the order of minutes even for very large data sets; (4) the risk of overfitting is minimal, due to the small number of free parameters; and (5) additional information can easily be incorporated into the IRV architecture. Combined with its performance, these qualities make the IRV particularly well suited for vHTS.
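
    A simplified sketch of the voting scheme described above (a toy re-implementation of the idea, not the published IRV network; the weights and similarity values are arbitrary): each of the k nearest training compounds contributes an influence equal to a relevance term derived from its similarity multiplied by a vote term derived from its label.

      import math

      def irv_like_score(similarities, labels, k=5, w_rel=4.0, w_vote=2.0, bias=0.0):
          """similarities: similarity of the query compound to each training compound;
          labels: 1 for active, 0 for inactive. Returns a probability-like score."""
          neighbors = sorted(zip(similarities, labels), reverse=True)[:k]
          z = bias
          for sim, label in neighbors:
              relevance = math.tanh(w_rel * sim)              # how much this neighbor matters
              vote = w_vote * (1.0 if label == 1 else -1.0)   # which way it pushes the prediction
              z += relevance * vote
          return 1.0 / (1.0 + math.exp(-z))                   # logistic output

      print(irv_like_score([0.9, 0.8, 0.2, 0.7, 0.1, 0.6], [1, 1, 0, 1, 0, 0]))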

  19. Condor-COPASI: high-throughput computing for biochemical networks

    PubMed Central

    2012-01-01

    Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
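
    A toy sketch of the task-splitting idea (not Condor-COPASI's implementation; the job fields are hypothetical, though CopasiSE is the command-line COPASI simulator): a large parameter scan is divided into chunks, and one job description is generated per chunk for submission to the pool.

      def split_parameter_scan(values, chunk_size):
          return [values[i:i + chunk_size] for i in range(0, len(values), chunk_size)]

      def job_descriptions(model_file, values, chunk_size=100):
          jobs = []
          for n, chunk in enumerate(split_parameter_scan(values, chunk_size)):
              jobs.append({
                  "executable": "CopasiSE",   # command-line COPASI simulator
                  "arguments": [model_file],
                  "chunk_id": n,
                  "parameters": chunk,        # the slice of the scan this job covers
              })
          return jobs

      print(len(job_descriptions("model.cps", list(range(1000)), chunk_size=100)))  # 10 jobs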

  20. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hui

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalyst screening and single-cell studies. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm^2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  1. Fast and Adaptive Lossless Onboard Hyperspectral Data Compression System

    NASA Technical Reports Server (NTRS)

    Aranki, Nazeeh I.; Keymeulen, Didier; Kimesh, Matthew A.

    2012-01-01

    Modern hyperspectral imaging systems are able to acquire far more data than can be downlinked from a spacecraft. Onboard data compression helps to alleviate this problem, but requires a system capable of power efficiency and high throughput. Software solutions have limited throughput performance and are power-hungry. Dedicated hardware solutions can provide both high throughput and power efficiency, while taking the load off of the main processor. Thus a hardware compression system was developed. The implementation uses a field-programmable gate array (FPGA). The implementation is based on the fast lossless (FL) compression algorithm reported in Fast Lossless Compression of Multispectral-Image Data (NPO-42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which achieves excellent compression performance and has low complexity. This algorithm performs predictive compression using an adaptive filtering method, and uses adaptive Golomb coding. The implementation also packetizes the coded data. The FL algorithm is well suited for implementation in hardware. In the FPGA implementation, one sample is compressed every clock cycle, which makes for a fast and practical realtime solution for space applications. Benefits of this implementation are: 1) The underlying algorithm achieves a combination of low complexity and compression effectiveness that exceeds that of techniques currently in use. 2) The algorithm requires no training data or other specific information about the nature of the spectral bands for a fixed instrument dynamic range. 3) Hardware acceleration provides a throughput improvement of 10 to 100 times vs. the software implementation. A prototype of the compressor is available in software, but it runs at a speed that does not meet spacecraft requirements. The hardware implementation targets the Xilinx Virtex IV FPGAs, and makes the use of this compressor practical for Earth satellites as well as beyond-Earth missions with hyperspectral instruments.
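    The abstract does not spell out the coder details; as a rough illustration of the general pattern it names (predictive compression followed by Golomb coding of residuals), a sketch is given below. The previous-sample predictor and fixed Rice parameter are stand-ins, not the FL algorithm's adaptive filter and parameter adaptation.

```python
# Illustrative predictive compression with Golomb-Rice coding of mapped residuals.
def golomb_rice_encode(value, k):
    """Encode a non-negative integer: unary quotient, then k-bit remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def map_signed(e):
    """Map a signed residual to a non-negative integer (0,-1,1,-2,... -> 0,1,2,3,...)."""
    return 2 * e if e >= 0 else -2 * e - 1

def compress_band(samples, k=3):
    bits, prev = [], 0
    for s in samples:
        residual = s - prev          # simple previous-sample predictor (stand-in)
        bits.append(golomb_rice_encode(map_signed(residual), k))
        prev = s
    return "".join(bits)

print(compress_band([100, 101, 103, 102, 102]))
```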

  2. Development of combinatorial chemistry methods for coatings: high-throughput adhesion evaluation and scale-up of combinatorial leads.

    PubMed

    Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia

    2003-01-01

    Coupling of combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for evaluation of adhesion of 8 x 6 arrays of coating elements that are discretely deposited on a single 9 x 12 cm plastic substrate. Performance of coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities of high speed and reproducibility of testing by using a robotic automation, an expanded range of types of tested coatings by using the coating tagging strategy, and an improved quantitation by using high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Using our HT methodology, we have developed several coatings leads. These HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing. These validation results have confirmed the superb performance of combinatorially developed coatings over conventional coatings on the traditional scale.

  3. SACRB-MAC: A High-Capacity MAC Protocol for Cognitive Radio Sensor Networks in Smart Grid

    PubMed Central

    Yang, Zhutian; Shi, Zhenguo; Jin, Chunlin

    2016-01-01

    The Cognitive Radio Sensor Network (CRSN) is considered a viable solution to enhance various aspects of the electric power grid and to realize a smart grid. However, several challenges for CRSNs arise from the harsh wireless environment in a smart grid. As a result, throughput and reliability become critical issues. On the other hand, the spectrum aggregation technique is expected to play an important role in CRSNs in a smart grid. By using spectrum aggregation, the throughput of CRSNs can be improved efficiently, so as to address the unique challenges of CRSNs in a smart grid. In this regard, we propose Spectrum Aggregation Cognitive Receiver-Based MAC (SACRB-MAC), which employs the spectrum aggregation technique to improve the throughput performance of CRSNs in a smart grid. Moreover, SACRB-MAC is a receiver-based MAC protocol, which provides good reliability. Analytical and simulation results demonstrate that SACRB-MAC is a promising solution for CRSNs in a smart grid. PMID:27043573

  4. SACRB-MAC: A High-Capacity MAC Protocol for Cognitive Radio Sensor Networks in Smart Grid.

    PubMed

    Yang, Zhutian; Shi, Zhenguo; Jin, Chunlin

    2016-03-31

    The Cognitive Radio Sensor Network (CRSN) is considered a viable solution to enhance various aspects of the electric power grid and to realize a smart grid. However, several challenges for CRSNs arise from the harsh wireless environment in a smart grid. As a result, throughput and reliability become critical issues. On the other hand, the spectrum aggregation technique is expected to play an important role in CRSNs in a smart grid. By using spectrum aggregation, the throughput of CRSNs can be improved efficiently, so as to address the unique challenges of CRSNs in a smart grid. In this regard, we propose Spectrum Aggregation Cognitive Receiver-Based MAC (SACRB-MAC), which employs the spectrum aggregation technique to improve the throughput performance of CRSNs in a smart grid. Moreover, SACRB-MAC is a receiver-based MAC protocol, which provides good reliability. Analytical and simulation results demonstrate that SACRB-MAC is a promising solution for CRSNs in a smart grid.

  5. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    PubMed

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J; Cox, David D

    2009-11-01

    While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.
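    In the spirit of the screening loop described above, a minimal sketch follows; the parameter space, the number of candidates, and the evaluate() stand-in are illustrative assumptions, not the authors' model family or hardware.

```python
# Sketch of a high-throughput parameter screen: sample many random model
# instantiations, evaluate each, keep the most promising for further analysis.
import random

PARAM_SPACE = {
    "units_per_layer": [64, 128, 256, 512],
    "pool_size":       [3, 5, 7, 9],
    "norm_exponent":   [1.0, 1.5, 2.0],
    "n_layers":        [2, 3, 4],
}

def sample_params():
    return {name: random.choice(values) for name, values in PARAM_SPACE.items()}

def evaluate(params):
    """Stand-in for instantiating the model and measuring recognition accuracy."""
    return random.random()   # replace with a real train/test evaluation

def screen(n_candidates=1000, keep=10):
    scored = [(evaluate(p), p) for p in (sample_params() for _ in range(n_candidates))]
    scored.sort(key=lambda t: t[0], reverse=True)
    return scored[:keep]      # promising instantiations kept for closer study

print(screen(n_candidates=100, keep=3))
```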

  6. A high-throughput lab-on-a-chip interface for zebrafish embryo tests in drug discovery and ecotoxicology

    NASA Astrophysics Data System (ADS)

    Zhu, Feng; Akagi, Jin; Hall, Chris J.; Crosier, Kathryn E.; Crosier, Philip S.; Delaage, Pierre; Wlodkowic, Donald

    2013-12-01

    Drug discovery screenings performed on zebrafish embryos mirror, with a high level of accuracy, the tests usually performed on mammalian animal models, and the fish embryo toxicity assay (FET) is one of the most promising alternative approaches to acute ecotoxicity testing with adult fish. Notwithstanding this, conventional methods utilising 96-well microtiter plates and manual dispensing of fish embryos are very time-consuming. They rely on laborious and iterative manual pipetting that is a main source of analytical errors and low throughput. In this work, we present the development of a miniaturised and high-throughput Lab-on-a-Chip (LOC) platform for automation of FET assays. The 3D high-density LOC array was fabricated in poly-methyl methacrylate (PMMA) transparent thermoplastic using infrared laser micromachining, while the off-chip interfaces were fabricated using additive manufacturing processes (FDM and SLA). The system's design facilitates rapid loading and immobilization of a large number of embryos in predefined clusters of traps during continuous microperfusion of drugs/toxins. It has been conceptually designed to seamlessly interface with both upright and inverted fluorescent imaging systems and also to directly interface with conventional microtiter plate readers that accept 96-well plates. We also present proof-of-concept interfacing with a high-speed imaging cytometer, Plate RUNNER HD®, capable of multispectral image acquisition with a resolution of up to 8192 × 8192 pixels and a depth of field of about 40 μm. Furthermore, we developed a miniaturized and self-contained analytical device interfaced with a miniaturized USB microscope. This system modification is capable of performing rapid imaging of multiple embryos at low resolution for drug toxicity analysis.

  7. A High-Throughput Screening Approach to Discovering Good Forms of Biologically Inspired Visual Representation

    PubMed Central

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J.; Cox, David D.

    2009-01-01

    While many models of biological object recognition share a common set of “broad-stroke” properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model—e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct “parts” have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision. PMID:19956750

  8. VLab: A Science Gateway for Distributed First Principles Calculations in Heterogeneous High Performance Computing Systems

    ERIC Educational Resources Information Center

    da Silveira, Pedro Rodrigo Castro

    2014-01-01

    This thesis describes the development and deployment of a cyberinfrastructure for distributed high-throughput computations of materials properties at high pressures and/or temperatures--the Virtual Laboratory for Earth and Planetary Materials--VLab. VLab was developed to leverage the aggregated computational power of grid systems to solve…

  9. Application of ToxCast High-Throughput Screening and ...

    EPA Pesticide Factsheets

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  10. Polymer-Based Dense Fluidic Networks for High Throughput Screening with Ultrasensitive Fluorescence Detection

    PubMed Central

    Okagbare, Paul I.; Soper, Steven A.

    2011-01-01

    Microfluidics represents a viable platform for performing High Throughput Screening (HTS) due to its ability to automate fluid handling and generate fluidic networks with high number densities over small footprints appropriate for the simultaneous optical interrogation of many screening assays. While most HTS campaigns depend on fluorescence, readers typically use point detection and serially address the assay results significantly lowering throughput or detection sensitivity due to a low duty cycle. To address this challenge, we present here the fabrication of a high density microfluidic network packed into the imaging area of a large field-of-view (FoV) ultrasensitive fluorescence detection system. The fluidic channels were 1, 5 or 10 μm (width), 1 μm (depth) with a pitch of 1–10 μm and each fluidic processor was individually addressable. The fluidic chip was produced from a molding tool using hot embossing and thermal fusion bonding to enclose the fluidic channels. A 40X microscope objective (numerical aperture = 0.75) created a FoV of 200 μm, providing the ability to interrogate ~25 channels using the current fluidic configuration. An ultrasensitive fluorescence detection system with a large FoV was used to transduce fluorescence signals simultaneously from each fluidic processor onto the active area of an electron multiplying charge-coupled device (EMCCD). The utility of these multichannel networks for HTS was demonstrated by carrying out the high throughput monitoring of the activity of an enzyme, APE1, used as a model screening assay. PMID:20872611

  11. High-speed Fourier ptychographic microscopy based on programmable annular illuminations.

    PubMed

    Sun, Jiasong; Zuo, Chao; Zhang, Jialin; Fan, Yao; Chen, Qian

    2018-05-16

    High-throughput quantitative phase imaging (QPI) is essential to characterizing cellular phenotypes as it allows high-content cell analysis and avoids adverse effects of staining reagents on cellular viability and cell signaling. Among different approaches, Fourier ptychographic microscopy (FPM) is probably the most promising technique to realize high-throughput QPI by synthesizing a wide-field, high-resolution complex image from multiple angle-variably illuminated, low-resolution images. However, the large dataset requirement in conventional FPM significantly limits its imaging speed, resulting in low temporal throughput. Moreover, the underlying theoretical mechanism as well as the optimum illumination scheme for high-accuracy phase imaging in FPM remains unclear. Herein, we report a high-speed FPM technique based on programmable annular illuminations (AIFPM). The optical-transfer-function (OTF) analysis of FPM reveals that the low-frequency phase information can only be correctly recovered if the LEDs are precisely located at the edge of the objective numerical aperture (NA) in the frequency space. By using only 4 low-resolution images corresponding to 4 tilted illuminations matching a 10×, 0.4 NA objective, we present high-speed imaging results of in vitro HeLa cell mitosis and apoptosis at a frame rate of 25 Hz with a full-pitch resolution of 655 nm at a wavelength of 525 nm (effective NA = 0.8) across a wide field-of-view (FOV) of 1.77 mm², corresponding to a space-bandwidth-time product of 411 megapixels per second. Our work reveals an important capability of FPM towards high-speed, high-throughput imaging of in vitro live cells, achieving video-rate QPI performance across a wide range of scales, both spatial and temporal.
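    The quoted space-bandwidth-time product can be checked from the stated numbers, assuming one reconstructed pixel per half-pitch (655 nm / 2):

```latex
\[
\mathrm{SBP} \approx \frac{\text{FOV}}{(\text{half-pitch})^{2}}
             = \frac{1.77\,\mathrm{mm^{2}}}{(0.3275\,\mu\mathrm{m})^{2}}
             \approx 1.65\times10^{7}\ \text{pixels},
\qquad
\mathrm{SBP\text{-}T} \approx 1.65\times10^{7}\times 25\,\mathrm{Hz}
             \approx 4.1\times10^{8}\ \text{pixels per second}.
\]
```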

  12. High Throughput Screening For Hazard and Risk of Environmental Contaminants

    EPA Science Inventory

    High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...

  13. A conifer-friendly high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis

    NASA Astrophysics Data System (ADS)

    Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.

    2012-12-01

    Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To be able to accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.

  14. A Low Power and High Throughput Self Synchronous FPGA Using 65nm CMOS with Throughput Optimization by Pipeline Alignment

    NASA Astrophysics Data System (ADS)

    Stefan Devlin, Benjamin; Nakura, Toru; Ikeda, Makoto; Asada, Kunihiro

    We detail a self synchronous field programmable gate array (SSFPGA) with dual-pipeline (DP) architecture to conceal pre-charge time for dynamic logic, and its throughput optimization by using pipeline alignment implemented on benchmark circuits. A self synchronous LUT (SSLUT) consists of a three input tree-type structure with 8 bits of SRAM for programming. A self synchronous switch box (SSSB) consists of both pass transistors and buffers to route signals, with 12 bits of SRAM. One common block with one SSLUT and one SSSB occupies 2.2 Mλ² area with 35 bits of SRAM, and the prototype SSFPGA with 34 × 30 (1020) blocks is designed and fabricated using 65 nm CMOS. Measured results show, at 1.2 V, 430 MHz and 647 MHz operation for a 3-bit ripple carry adder, without and with throughput optimization, respectively. We find that using the proposed pipeline alignment techniques we can perform at a maximum throughput of 647 MHz in various benchmarks on the SSFPGA. We demonstrate up to 56.1 times throughput improvement with our pipeline alignment techniques. The pipeline alignment is carried out within the number of logic elements in the array and pipeline buffers in the switching matrix.

  15. Entropy as a Gene-Like Performance Indicator Promoting Thermoelectric Materials.

    PubMed

    Liu, Ruiheng; Chen, Hongyi; Zhao, Kunpeng; Qin, Yuting; Jiang, Binbin; Zhang, Tiansong; Sha, Gang; Shi, Xun; Uher, Ctirad; Zhang, Wenqing; Chen, Lidong

    2017-10-01

    High-throughput explorations of novel thermoelectric materials based on the Materials Genome Initiative paradigm have so far focused on digging into the structure-property space using nonglobal indicators to design materials with tunable electrical and thermal transport properties. As the genomic units, following the biogene tradition, such indicators include localized crystal structural blocks in real space or band degeneracy at certain points in reciprocal space. However, this nonglobal approach does not consider how real materials differentiate from others. Here, this study successfully develops a strategy of using entropy as a global, gene-like performance indicator and shows how multicomponent thermoelectric materials with high entropy can be designed via a high-throughput screening method. Optimizing entropy works as an effective guide to greatly improve the thermoelectric performance, either by significantly depressing the lattice thermal conductivity down to its theoretical minimum value or by enhancing the crystal structure symmetry to yield large Seebeck coefficients. Entropy engineering using multicomponent crystal structures or other possible techniques provides a new avenue for improving thermoelectric performance beyond current methods and approaches. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
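    For reference, the ideal configurational (mixing) entropy is the standard quantity behind "high-entropy" design criteria in multicomponent materials; whether this exact form is the indicator used in the work above is an assumption:

```latex
% Ideal configurational entropy per mole of lattice sites for an n-component
% solid solution with site fractions x_i, maximized by the equimolar composition.
\[
\Delta S_{\mathrm{conf}} = -R \sum_{i=1}^{n} x_i \ln x_i ,
\qquad
\Delta S_{\mathrm{conf}}^{\max} = R \ln n \quad (x_i = 1/n).
\]
```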

  16. High Throughput Transcriptomics: From screening to pathways

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  17. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system.

    PubMed

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
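    A minimal sketch of the detect-then-classify pattern described above, using standard tools (OpenCV, SciPy, scikit-learn); the thresholds, marker construction and per-cell features are illustrative choices, not the published algorithm.

```python
# Two-stage detection (threshold, then distance transform + watershed to split
# clustered cells) followed by GMM classification of per-cell features.
import numpy as np
import cv2
from scipy import ndimage
from sklearn.mixture import GaussianMixture

def detect_cells(gray):
    # Stage 1: extract candidate cell regions.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Stage 2: separate touching cells with distance transform + watershed.
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    sure_fg = (dist > 0.5 * dist.max()).astype(np.uint8) * 255
    sure_bg = cv2.dilate(mask, np.ones((3, 3), np.uint8), iterations=3)
    unknown = cv2.subtract(sure_bg, sure_fg)
    markers, _ = ndimage.label(sure_fg)
    markers = markers + 1                # reserve 0 for "unknown" pixels
    markers[unknown > 0] = 0
    labels = cv2.watershed(cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR),
                           markers.astype(np.int32))
    return labels                        # one integer label per separated cell

def classify_cells(features, n_classes=2):
    # Unsupervised GMM over simple per-cell features (e.g. area, mean intensity).
    gmm = GaussianMixture(n_components=n_classes, random_state=0).fit(features)
    return gmm.predict(features)
```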

  18. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.

  19. Using ALFA for high throughput, distributed data transmission in the ALICE O2 system

    NASA Astrophysics Data System (ADS)

    Wegrzynek, A.; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is a heavy-ion detector designed to study the physics of strongly interacting matter (the Quark-Gluon Plasma) at the CERN LHC (Large Hadron Collider). ALICE has been successfully collecting physics data in Run 2 since spring 2015. In parallel, preparations for a major upgrade of the computing system, called O2 (Online-Offline), scheduled for the Long Shutdown 2 in 2019-2020, are being made. One of the major requirements of the system is the capacity to transport data between the so-called FLPs (First Level Processors), equipped with readout cards, and the EPNs (Event Processing Nodes), which perform data aggregation, frame building and partial reconstruction. It is foreseen to have 268 FLPs dispatching data to 1500 EPNs with an average output of 20 Gb/s each. Overall, the O2 processing system will operate at terabits per second of throughput while handling millions of concurrent connections. The ALFA framework will standardize and handle software-related tasks such as readout, data transport, frame building, calibration, online reconstruction and more in the upgraded computing system. ALFA supports two data transport libraries: ZeroMQ and nanomsg. This paper discusses the efficiency of ALFA in terms of high-throughput data transport. The tests were performed with multiple FLPs pushing data to multiple EPNs. The transfer was done using push-pull communication patterns and two socket configurations: bind and connect. The set of benchmarks was prepared to obtain the most performant results on each hardware setup. The paper presents the measurement process and final results: data throughput combined with computing resource usage as a function of block size. The high number of nodes and connections in the final setup may cause race conditions that can lead to uneven load balancing and poor scalability. The performed tests allow us to validate whether the traffic is distributed evenly over all receivers. They also measure the behaviour of the network in saturation and evaluate scalability from a 1-to-1 to an N-to-M solution.
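    ALFA itself is a C++ framework; the sketch below only illustrates the ZeroMQ push-pull pattern with the bind/connect socket configurations mentioned above, using pyzmq. Endpoints, block sizes and payloads are invented for the example.

```python
# Push-pull transport sketch: a sender (FLP role) binds and pushes data blocks,
# a receiver (EPN role) connects and pulls them for aggregation.
import zmq

def flp_sender(endpoint="tcp://*:5555", n_frames=10, block_size=1 << 20):
    ctx = zmq.Context()
    push = ctx.socket(zmq.PUSH)
    push.bind(endpoint)                      # "bind" socket configuration
    payload = b"\x00" * block_size           # stand-in for a data block
    for i in range(n_frames):
        push.send_multipart([i.to_bytes(4, "little"), payload])
    push.close()
    ctx.term()

def epn_receiver(endpoint="tcp://localhost:5555", n_frames=10):
    ctx = zmq.Context()
    pull = ctx.socket(zmq.PULL)
    pull.connect(endpoint)                   # "connect" socket configuration
    received = 0
    while received < n_frames:
        frame_id, payload = pull.recv_multipart()
        received += 1                        # aggregation / frame building would go here
    pull.close()
    ctx.term()
```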

  20. High Throughput Experimental Materials Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakutayev, Andriy; Perkins, John; Schwarting, Marcus

    The mission of the High Throughput Experimental Materials Database (HTEM DB) is to enable the discovery of new materials with useful properties by releasing large amounts of high-quality experimental data to the public. The HTEM DB contains information about materials obtained from high-throughput experiments at the National Renewable Energy Laboratory (NREL).

  1. Pneumatic Microvalve-Based Hydrodynamic Sample Injection for High-Throughput, Quantitative Zone Electrophoresis in Capillaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Ryan T.; Wang, Chenchen; Rausch, Sarah J.

    2014-07-01

    A hybrid microchip/capillary CE system was developed to allow unbiased and lossless sample loading and high throughput repeated injections. This new hybrid CE system consists of a polydimethylsiloxane (PDMS) microchip sample injector featuring a pneumatic microvalve that separates a sample introduction channel from a short sample loading channel and a fused silica capillary separation column that connects seamlessly to the sample loading channel. The sample introduction channel is pressurized such that when the pneumatic microvalve opens briefly, a variable-volume sample plug is introduced into the loading channel. A high voltage for CE separation is continuously applied across the loading channel and the fused silica capillary separation column. Analytes are rapidly separated in the fused silica capillary with high resolution. High sensitivity MS detection after CE separation is accomplished via a sheathless CE/ESI-MS interface. The performance evaluation of the complete CE/ESI-MS platform demonstrated that reproducible sample injection with well controlled sample plug volumes could be achieved by using the PDMS microchip injector. The absence of band broadening from microchip to capillary indicated a minimum dead volume at the junction. The capabilities of the new CE/ESI-MS platform in performing high throughput and quantitative sample analyses were demonstrated by the repeated sample injection without interrupting an ongoing separation and a good linear dependence of the total analyte ion abundance on the sample plug volume using a mixture of peptide standards. The separation efficiency of the new platform was also evaluated systematically at different sample injection times, flow rates and CE separation voltages.

  2. Combinatorial electrochemical cell array for high throughput screening of micro-fuel-cells and metal/air batteries.

    PubMed

    Jiang, Rongzhong

    2007-07-01

    An electrochemical cell array was designed that contains a common air electrode and 16 microanodes for high throughput screening of both fuel cells (based on polymer electrolyte membrane) and metal/air batteries (based on liquid electrolyte). Electrode materials can easily be coated on the anodes of the electrochemical cell array and screened by switching a graphite probe from one cell to the others. The electrochemical cell array was used to study direct methanol fuel cells (DMFCs), including high throughput screening of electrode catalysts and determination of optimum operating conditions. For screening of DMFCs, there is about 6% relative standard deviation (percentage of standard deviation versus mean value) for discharge current from 10 to 20 mA cm⁻². The electrochemical cell array was also used to study tin/air batteries. The effect of Cu content in the anode electrode on the discharge performance of the tin/air battery was investigated. The relative standard deviations for screening of metal/air battery (based on zinc/air) are 2.4%, 3.6%, and 5.1% for discharge current at 50, 100, and 150 mA cm⁻², respectively.

  3. Fast multiclonal clusterization of V(D)J recombinations from high-throughput sequencing.

    PubMed

    Giraud, Mathieu; Salson, Mikaël; Duez, Marc; Villenet, Céline; Quief, Sabine; Caillault, Aurélie; Grardel, Nathalie; Roumier, Christophe; Preudhomme, Claude; Figeac, Martin

    2014-05-28

    V(D)J recombinations in lymphocytes are essential for immunological diversity. They are also useful markers of pathologies. In leukemia, they are used to quantify the minimal residual disease during patient follow-up. However, the full breadth of lymphocyte diversity is not fully understood. We propose new algorithms that process high-throughput sequencing (HTS) data to extract unnamed V(D)J junctions and gather them into clones for quantification. This analysis is based on a seed heuristic and is fast and scalable because, in the first phase, no alignment is performed against germline database sequences. The algorithms were applied to TRγ HTS data from a patient with acute lymphoblastic leukemia, and also to data simulating hypermutations. Our methods identified the main clone, as well as additional clones that were not identified with standard protocols. The proposed algorithms provide new insight into the analysis of high-throughput sequencing data for leukemia, as well as into the quantitative assessment of any immunological profile. The methods described here are implemented in a C++ open-source program called Vidjil.
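    A toy sketch of the alignment-free first phase follows: reads sharing an identical junction-spanning window are grouped into one clone and counted. The window extraction here is a trivial stand-in for the seed-based heuristic that locates the junction.

```python
# Group reads by a fixed-length junction window (no germline alignment) and
# count clone abundances; window detection is deliberately simplified.
from collections import Counter

def junction_window(read, w=50):
    """Stand-in: take a fixed-length window from the centre of the read."""
    mid = len(read) // 2
    return read[mid - w // 2 : mid + w // 2]

def cluster_clones(reads, w=50):
    clones = Counter(junction_window(r, w) for r in reads if len(r) >= w)
    return clones.most_common()   # most abundant windows = main clones

reads = ["ACGT" * 30, "ACGT" * 30, "TTGCA" * 25]
print(cluster_clones(reads)[:2])
```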

  4. Projection Exposure with Variable Axis Immersion Lenses: A High-Throughput Electron Beam Approach to “Suboptical” Lithography

    NASA Astrophysics Data System (ADS)

    Pfeiffer, Hans

    1995-12-01

    IBM's high-throughput e-beam stepper approach, PRojection Exposure with Variable Axis Immersion Lenses (PREVAIL), is reviewed. The PREVAIL concept combines technology building blocks of our probe-forming EL-3 and EL-4 systems with the exposure efficiency of pattern projection. The technology represents an extension of the shaped-beam approach toward massively parallel pixel projection. As demonstrated, the use of variable-axis lenses can provide large field coverage through reduction of the off-axis aberrations which limit the performance of conventional projection systems. Subfield pattern sections containing 10⁷ or more pixels can be electronically selected (mask plane), projected and positioned (wafer plane) at high speed. To generate the entire chip pattern, subfields must be stitched together sequentially in a combination of electronic and mechanical positioning of mask and wafer. The PREVAIL technology promises throughput levels competitive with those of optical steppers at superior resolution. The PREVAIL project is being pursued to demonstrate the viability of the technology and to develop an e-beam alternative to “suboptical” lithography.

  5. Genome-scale measurement of off-target activity using Cas9 toxicity in high-throughput screens.

    PubMed

    Morgens, David W; Wainberg, Michael; Boyle, Evan A; Ursu, Oana; Araya, Carlos L; Tsui, C Kimberly; Haney, Michael S; Hess, Gaelen T; Han, Kyuho; Jeng, Edwin E; Li, Amy; Snyder, Michael P; Greenleaf, William J; Kundaje, Anshul; Bassik, Michael C

    2017-05-05

    CRISPR-Cas9 screens are powerful tools for high-throughput interrogation of genome function, but can be confounded by nuclease-induced toxicity at both on- and off-target sites, likely due to DNA damage. Here, to test potential solutions to this issue, we design and analyse a CRISPR-Cas9 library with 10 variable-length guides per gene and thousands of negative controls targeting non-functional, non-genic regions (termed safe-targeting guides), in addition to non-targeting controls. We find this library has excellent performance in identifying genes affecting growth and sensitivity to the ricin toxin. The safe-targeting guides allow for proper control of toxicity from on-target DNA damage. Using this toxicity as a proxy to measure off-target cutting, we demonstrate with tens of thousands of guides both the nucleotide position-dependent sensitivity to single mismatches and the reduction of off-target cutting using truncated guides. Our results demonstrate a simple strategy for high-throughput evaluation of target specificity and nuclease toxicity in Cas9 screens.

  6. Genome-scale measurement of off-target activity using Cas9 toxicity in high-throughput screens

    PubMed Central

    Morgens, David W.; Wainberg, Michael; Boyle, Evan A.; Ursu, Oana; Araya, Carlos L.; Tsui, C. Kimberly; Haney, Michael S.; Hess, Gaelen T.; Han, Kyuho; Jeng, Edwin E.; Li, Amy; Snyder, Michael P.; Greenleaf, William J.; Kundaje, Anshul; Bassik, Michael C.

    2017-01-01

    CRISPR-Cas9 screens are powerful tools for high-throughput interrogation of genome function, but can be confounded by nuclease-induced toxicity at both on- and off-target sites, likely due to DNA damage. Here, to test potential solutions to this issue, we design and analyse a CRISPR-Cas9 library with 10 variable-length guides per gene and thousands of negative controls targeting non-functional, non-genic regions (termed safe-targeting guides), in addition to non-targeting controls. We find this library has excellent performance in identifying genes affecting growth and sensitivity to the ricin toxin. The safe-targeting guides allow for proper control of toxicity from on-target DNA damage. Using this toxicity as a proxy to measure off-target cutting, we demonstrate with tens of thousands of guides both the nucleotide position-dependent sensitivity to single mismatches and the reduction of off-target cutting using truncated guides. Our results demonstrate a simple strategy for high-throughput evaluation of target specificity and nuclease toxicity in Cas9 screens. PMID:28474669

  7. QR-on-a-chip: a computer-recognizable micro-pattern engraved microfluidic device for high-throughput image acquisition.

    PubMed

    Yun, Kyungwon; Lee, Hyunjae; Bang, Hyunwoo; Jeon, Noo Li

    2016-02-21

    This study proposes a novel way to achieve high-throughput image acquisition based on a computer-recognizable micro-pattern implemented on a microfluidic device. We integrated the QR code, a two-dimensional barcode system, onto the microfluidic device to simplify imaging of multiple ROIs (regions of interest). A standard QR code pattern was modified into arrays of cylindrical structures of polydimethylsiloxane (PDMS). Utilizing the recognition of the micro-pattern, the proposed system enables: (1) device identification, which allows referencing additional information about the device, such as device imaging sequences or the ROIs, and (2) composing a coordinate system for an arbitrarily located microfluidic device with respect to the stage. Based on these functionalities, the proposed method performs one-step high-throughput imaging for data acquisition in microfluidic devices without further manual exploration and locating of the desired ROIs. In our experience, the proposed method significantly reduced the time needed to prepare an acquisition. We expect that the method will substantially improve data acquisition and analysis for prototype devices.
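    One way to realize the coordinate-system step, assuming the recognized pattern yields a few point correspondences between chip coordinates and stage coordinates, is a least-squares affine fit; the point values below are invented for illustration.

```python
# Fit an affine map from chip coordinates to stage coordinates from recognized
# pattern features, then use it to drive the stage to any ROI on the chip.
import numpy as np

def fit_affine(chip_pts, stage_pts):
    """Solve stage ~= A @ [x, y, 1] for a 2x3 affine matrix A (least squares)."""
    chip = np.asarray(chip_pts, dtype=float)
    stage = np.asarray(stage_pts, dtype=float)
    X = np.hstack([chip, np.ones((len(chip), 1))])    # (n, 3)
    A, *_ = np.linalg.lstsq(X, stage, rcond=None)     # (3, 2)
    return A.T                                        # (2, 3)

def chip_to_stage(A, point):
    x, y = point
    return A @ np.array([x, y, 1.0])

# Three or more recognized pattern features fix the transform.
A = fit_affine([(0, 0), (10, 0), (0, 10)],
               [(102.3, 55.1), (112.4, 54.9), (102.0, 65.2)])
print(chip_to_stage(A, (5, 5)))   # stage coordinates of an ROI at chip (5, 5)
```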

  8. A multilayer microdevice for cell-based high-throughput drug screening

    NASA Astrophysics Data System (ADS)

    Liu, Chong; Wang, Lei; Xu, Zheng; Li, Jingmin; Ding, Xiping; Wang, Qi; Chunyu, Li

    2012-06-01

    A multilayer polydimethylsiloxane microdevice for cell-based high-throughput drug screening is described in this paper. This microdevice was based on a modularization method and integrated a drug/medium concentration gradient generator (CGG), pneumatic microvalves and a cell culture microchamber array. The CGG was able to generate five steps of linear concentrations with the same outlet flow rate. The medium/drug flowed through the CGG and then vertically into the pear-shaped cell culture microchambers. This vertical perfusion mode was used to reduce the impact on the physiology of cells of the shear stress induced by the fluid flow in the microchambers. Pear-shaped microchambers with two arrays of micropillars at each outlet were adopted in this microdevice, which were beneficial to cell distribution. Apoptosis experiments in which the chemotherapeutic cisplatin (DDP) was applied to the cisplatin-resistant cell line A549/DDP were performed successfully on this platform. The results showed that this novel microdevice could not only provide well-defined and stable conditions for cell culture, but was also useful for cell-based high-throughput drug screening with less reagent and time consumption.

  9. HTS-Net: An integrated regulome-interactome approach for establishing network regulation models in high-throughput screenings

    PubMed Central

    Rioualen, Claire; Da Costa, Quentin; Chetrit, Bernard; Charafe-Jauffret, Emmanuelle; Ginestier, Christophe

    2017-01-01

    High-throughput RNAi screenings (HTS) allow quantifying the impact of the deletion of each gene on any particular function, from virus-host interactions to cell differentiation. However, fewer functional analysis tools dedicated to RNAi analyses have been developed. HTS-Net, a network-based analysis program, was developed to identify gene regulatory modules impacted in high-throughput screenings by integrating transcription factor-target gene interaction data (regulome) and protein-protein interaction networks (interactome) on top of screening z-scores. HTS-Net produces exhaustive HTML reports for results navigation and exploration. HTS-Net is a new pipeline for RNA interference screening analyses that achieves better performance than simple gene rankings by z-scores, by re-prioritizing genes and placing them in their biological context, as shown by the three studies that we reanalyzed. Formatted input data for the three studied datasets, source code and a web site for testing the system are available from the companion web site at http://htsnet.marseille.inserm.fr/. We also compared our program with existing algorithms (CARD and hotnet2). PMID:28949986

  10. Transition-metal-free catalysts for the sustainable epoxidation of alkenes: from discovery to optimisation by means of high throughput experimentation.

    PubMed

    Lueangchaichaweng, Warunee; Geukens, Inge; Peeters, Annelies; Jarry, Benjamin; Launay, Franck; Bonardet, Jean-Luc; Jacobs, Pierre A; Pescarmona, Paolo P

    2012-02-01

    Transition-metal-free oxides were studied as heterogeneous catalysts for the sustainable epoxidation of alkenes with aqueous H₂O₂ by means of high throughput experimentation (HTE) techniques. A full-factorial HTE approach was applied in the various stages of the development of the catalysts: the synthesis of the materials, their screening as heterogeneous catalysts in liquid-phase epoxidation and the optimisation of the reaction conditions. Initially, the chemical composition of transition-metal-free oxides was screened, leading to the discovery of gallium oxide as a novel, active and selective epoxidation catalyst. On the basis of these results, the research line was continued with the study of structured porous aluminosilicates, gallosilicates and silica-gallia composites. In general, the gallium-based materials showed the best catalytic performances. This family of materials represents a promising class of heterogeneous catalysts for the sustainable epoxidation of alkenes and offers a valid alternative to the transition-metal heterogeneous catalysts commonly used in epoxidation. High throughput experimentation played an important role in promoting the development of these catalytic systems.

  11. A high-throughput two channel discrete wavelet transform architecture for the JPEG2000 standard

    NASA Astrophysics Data System (ADS)

    Badakhshannoory, Hossein; Hashemi, Mahmoud R.; Aminlou, Alireza; Fatemi, Omid

    2005-07-01

    The Discrete Wavelet Transform (DWT) is increasingly used in image and video compression standards, as indicated by its adoption in JPEG2000. The lifting scheme algorithm is an alternative DWT implementation that has lower computational complexity and reduced resource requirements. In the JPEG2000 standard, two lifting-scheme-based filter banks are introduced: the 5/3 and the 9/7. In this paper a high-throughput, two-channel DWT architecture for both of the JPEG2000 DWT filters is presented. The proposed pipelined architecture has two separate input channels that process the incoming samples simultaneously with minimum memory requirements for each channel. The architecture was implemented in VHDL and synthesized on a Xilinx Virtex2 XCV1000. The proposed architecture applies the DWT to a 2K by 1K image at 33 fps with a 75 MHz clock frequency. This performance is achieved with 70% fewer resources than two independent single-channel modules. The high throughput and reduced resource requirements make this architecture a suitable choice for real-time applications such as Digital Cinema.
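    For reference, the reversible 5/3 lifting decomposition used by JPEG2000 consists of one predict step and one update step; the sketch below shows the one-dimensional form, using simple edge replication at the borders rather than the standard's symmetric-extension rules.

```python
# Reversible 5/3 lifting DWT (1-D): predict odd samples from even neighbours,
# then update even samples from the detail coefficients.
def dwt53_1d(x):
    even, odd = x[0::2], x[1::2]
    ext = lambda seq, i: seq[min(max(i, 0), len(seq) - 1)]   # clamp at borders
    # Predict: detail = odd - floor((left even + right even) / 2)
    d = [odd[i] - ((ext(even, i) + ext(even, i + 1)) >> 1) for i in range(len(odd))]
    # Update: approx = even + floor((left detail + right detail + 2) / 4)
    s = [even[i] + ((ext(d, i - 1) + ext(d, i) + 2) >> 2) for i in range(len(even))]
    return s, d   # low-pass and high-pass subbands

print(dwt53_1d([10, 12, 14, 13, 11, 9, 8, 8]))
```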

  12. Implicit Block ACK Scheme for IEEE 802.11 WLANs

    PubMed Central

    Sthapit, Pranesh; Pyun, Jae-Young

    2016-01-01

    The throughput of the IEEE 802.11 standard is significantly bounded by the associated Medium Access Control (MAC) overhead. Because of this overhead, an upper limit on throughput exists even in situations where data rates are extremely high. Therefore, an overhead reduction is necessary to achieve higher throughput. The IEEE 802.11e amendment introduced the block ACK mechanism to reduce the number of control messages in the MAC. Although the block ACK scheme greatly reduces overhead, further improvements are possible. In this letter, we propose an implicit block ACK method that further reduces the overhead associated with IEEE 802.11e's block ACK scheme. Mathematical analysis results are presented for both the original protocol and the proposed scheme. A performance improvement of greater than 10% was achieved with the proposed implementation.

  13. High-throughput state-machine replication using software transactional memory.

    PubMed

    Zhao, Wenbing; Yang, William; Zhang, Honglei; Yang, Jack; Luo, Xiong; Zhu, Yueqin; Yang, Mary; Luo, Chaomin

    2016-11-01

    State-machine replication is a common way of constructing general purpose fault tolerance systems. To ensure replica consistency, requests must be executed sequentially according to some total order at all non-faulty replicas. Unfortunately, this could severely limit the system throughput. This issue has been partially addressed by identifying non-conflicting requests based on application semantics and executing these requests concurrently. However, identifying and tracking non-conflicting requests require intimate knowledge of application design and implementation, and a custom fault tolerance solution developed for one application cannot be easily adopted by other applications. Software transactional memory offers a new way of constructing concurrent programs. In this article, we present the mechanisms needed to retrofit existing concurrency control algorithms designed for software transactional memory for state-machine replication. The main benefit of using software transactional memory in state-machine replication is that general purpose concurrency control mechanisms can be designed without deep knowledge of application semantics. As such, new fault tolerance systems based on state-machine replication with excellent throughput can be easily designed and maintained. In this article, we introduce three different concurrency control mechanisms for state-machine replication using software transactional memory, namely, ordered strong strict two-phase locking, conventional timestamp-based multiversion concurrency control, and speculative timestamp-based multiversion concurrency control. Our experiments show that the speculative timestamp-based multiversion concurrency control mechanism has the best performance in all types of workload, while the conventional timestamp-based multiversion concurrency control offers the worst performance due to its high abort rate in the presence of even moderate contention between transactions. The ordered strong strict two-phase locking mechanism offers the simplest solution, with excellent performance in low contention workloads and fairly good performance in high contention workloads.
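    A toy sketch of the ordering constraint common to all three mechanisms is shown below: requests may execute concurrently, but their effects are committed strictly in the agreed total order, so every replica applies the same sequence of state changes. The conflict detection and abort handling of a real software transactional memory are omitted.

```python
# Concurrent execution with strictly ordered commits (ordering constraint only;
# no STM conflict detection is modelled here).
import threading

class OrderedCommitter:
    def __init__(self):
        self.next_to_commit = 0
        self.cv = threading.Condition()

    def commit_in_order(self, seqno, apply_fn):
        with self.cv:
            while seqno != self.next_to_commit:   # wait for our turn in the total order
                self.cv.wait()
            apply_fn()                            # apply buffered writes to replica state
            self.next_to_commit += 1
            self.cv.notify_all()

state, committer = {}, OrderedCommitter()

def handle_request(seqno, key, value):
    result = value * 2                            # "execute" concurrently (speculatively)
    committer.commit_in_order(seqno, lambda: state.update({key: result}))

threads = [threading.Thread(target=handle_request, args=(i, f"k{i}", i)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(state)
```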

  14. High-throughput state-machine replication using software transactional memory

    PubMed Central

    Yang, William; Zhang, Honglei; Yang, Jack; Luo, Xiong; Zhu, Yueqin; Yang, Mary; Luo, Chaomin

    2017-01-01

    State-machine replication is a common way of constructing general purpose fault tolerance systems. To ensure replica consistency, requests must be executed sequentially according to some total order at all non-faulty replicas. Unfortunately, this could severely limit the system throughput. This issue has been partially addressed by identifying non-conflicting requests based on application semantics and executing these requests concurrently. However, identifying and tracking non-conflicting requests require intimate knowledge of application design and implementation, and a custom fault tolerance solution developed for one application cannot be easily adopted by other applications. Software transactional memory offers a new way of constructing concurrent programs. In this article, we present the mechanisms needed to retrofit existing concurrency control algorithms designed for software transactional memory for state-machine replication. The main benefit of using software transactional memory in state-machine replication is that general purpose concurrency control mechanisms can be designed without deep knowledge of application semantics. As such, new fault tolerance systems based on state-machine replication with excellent throughput can be easily designed and maintained. In this article, we introduce three different concurrency control mechanisms for state-machine replication using software transactional memory, namely, ordered strong strict two-phase locking, conventional timestamp-based multiversion concurrency control, and speculative timestamp-based multiversion concurrency control. Our experiments show that the speculative timestamp-based multiversion concurrency control mechanism has the best performance in all types of workload, while the conventional timestamp-based multiversion concurrency control offers the worst performance due to its high abort rate in the presence of even moderate contention between transactions. The ordered strong strict two-phase locking mechanism offers the simplest solution, with excellent performance in low contention workloads and fairly good performance in high contention workloads. PMID:29075049

  15. New Tools For Understanding Microbial Diversity Using High-throughput Sequence Data

    NASA Astrophysics Data System (ADS)

    Knight, R.; Hamady, M.; Liu, Z.; Lozupone, C.

    2007-12-01

    High-throughput sequencing techniques such as 454 are straining the limits of tools traditionally used to build trees, choose OTUs, and perform other essential sequencing tasks. We have developed a workflow for phylogenetic analysis of large-scale sequence data sets that combines existing tools, such as the Arb phylogeny package and the NAST multiple sequence alignment tool, with new methods for choosing and clustering OTUs and for performing phylogenetic community analysis with UniFrac. This talk discusses the cyberinfrastructure we are developing to support the human microbiome project, and the application of these workflows to analyze very large data sets that contrast the gut microbiota with a range of physical environments. These tools will ultimately help to define core and peripheral microbiomes in a range of environments, and will allow us to understand the physical and biotic factors that contribute most to differences in microbial diversity.

  16. Combinatorial high-throughput optical screening of high performance Pd alloy cathode for hybrid Li-air battery.

    PubMed

    Jun, Young Jin; Park, Sung Hyeon; Woo, Seong Ihl

    2014-12-08

    A combinatorial high-throughput optical screening method was developed to find the optimum composition of highly active Pd-based catalysts at the cathode of the hybrid Li-air battery. Pd alone, which is one-third the cost of Pt, has difficulty replacing Pt; therefore, the integration of other metals was investigated to improve its performance toward the oxygen reduction reaction (ORR). Among the binary Pd-based catalysts, the Pd-Ir derived compositions had higher performance toward the ORR compared to other Pd-based binary combinations. The composition at 88:12 at. % (Pd:Ir) showed the highest activity toward the ORR at the cathode of the hybrid Li-air battery. The prepared Pd(88)Ir(12)/C catalyst showed a current density of -2.58 mA cm⁻² at 0.8 V (vs RHE), which was around 30% higher than that of Pd/C (-1.97 mA cm⁻²). When the prepared Pd(88)Ir(12)/C catalyst was applied to the hybrid Li-air battery, the polarization of the cell was reduced and the energy efficiency of the cell was about 30% higher than that of the cell with Pd/C.

  17. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    EPA Science Inventory

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  18. Evaluation of Sequencing Approaches for High-Throughput Transcriptomics - (BOSC)

    EPA Science Inventory

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. The generation of high-throughput global gene expression...

  19. High performance computing environment for multidimensional image analysis

    PubMed Central

    Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo

    2007-01-01

    Background: The processing of images acquired through microscopy is a challenging task due to the large size of datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact in microscopy applications. Results: We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors, and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 GHz Intel CPU, the task of 3D median filtering on a typical 256 megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478× speedup. Conclusion: Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large scale experiments with massive datasets. PMID:17634099
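    A single-machine sketch of the decomposition idea follows, using multiprocessing and SciPy; the Blue Gene/MPI specifics (torus mapping, nearest-neighbour halo exchange) are not reproduced. Each slab carries a halo wide enough for the filter window so the stitched result matches a global filter.

```python
# Split a 3D volume into z-slabs with halos, median-filter the slabs in
# parallel, drop the halo planes and reassemble.
import numpy as np
from multiprocessing import Pool
from scipy.ndimage import median_filter

SIZE = 3  # median filter window

def filter_slab(args):
    slab, lo_halo, hi_halo = args
    out = median_filter(slab, size=SIZE)
    return out[lo_halo : slab.shape[0] - hi_halo]   # keep only non-halo planes

def parallel_median(volume, n_workers=4):
    halo = SIZE // 2
    bounds = np.linspace(0, volume.shape[0], n_workers + 1, dtype=int)
    jobs = []
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        a, b = max(lo - halo, 0), min(hi + halo, volume.shape[0])
        jobs.append((volume[a:b], lo - a, b - hi))
    with Pool(n_workers) as pool:
        return np.concatenate(pool.map(filter_slab, jobs), axis=0)

if __name__ == "__main__":
    vol = np.random.rand(64, 64, 64)
    assert np.allclose(parallel_median(vol), median_filter(vol, size=SIZE))
```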

  20. High performance computing environment for multidimensional image analysis.

    PubMed

    Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo

    2007-07-10

    The processing of images acquired through microscopy is a challenging task due to the large size of datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact in microscopy applications. We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors, and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 GHz Intel CPU, the task of 3D median filtering on a typical 256 megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478× speedup. Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large scale experiments with massive datasets.

  1. Microplate-Based Method for High-Throughput Screening (HTS) of Chromatographic Conditions Studies for Recombinant Protein Purification.

    PubMed

    Carvalho, Rimenys J; Cruz, Thayana A

    2018-01-01

    High-throughput screening (HTS) systems have emerged as important tools to provide fast and low-cost evaluation of several conditions at once, since they require small quantities of material and small sample volumes. These characteristics are extremely valuable for experiments with a large number of variables, enabling the application of design of experiments (DoE) strategies or simple experimental planning approaches. Once the capacity of HTS systems to mimic chromatographic purification steps was established, several studies were performed successfully, including scale-down purification. Here, we propose a method for studying different purification conditions that can be used for any recombinant protein, including complex and glycosylated proteins, using low-binding filter microplates.

  2. Convenient, sensitive and high-throughput method for screening botanic origin.

    PubMed

    Yuan, Yuan; Jiang, Chao; Liu, Libing; Yu, Shulin; Cui, Zhanhu; Chen, Min; Lin, Shufang; Wang, Shu; Huang, Luqi

    2014-06-23

    In this work, a rapid (within 4-5 h), sensitive and visible new method for assessing botanic origin is developed by combining loop-mediated isothermal amplification with cationic conjugated polymers. Two Chinese medicinal materials (Jin-Yin-Hua and Shan-Yin-Hua) with similar morphology and chemical composition were clearly distinguished by gene SNP genotyping assays. The identification of plant species in patented Chinese drugs containing Lonicera buds is successfully performed using this detection system. The method is also robust enough to be used in high-throughput screening. This new method is very helpful for identifying herbal materials and is beneficial for assessing the safety and quality of botanic products.

  3. High-throughput SNP-genotyping analysis of the relationships among Ponto-Caspian sturgeon species

    PubMed Central

    Rastorguev, Sergey M; Nedoluzhko, Artem V; Mazur, Alexander M; Gruzdeva, Natalia M; Volkov, Alexander A; Barmintseva, Anna E; Mugue, Nikolai S; Prokhortchouk, Egor B

    2013-01-01

    Abstract Legally certified sturgeon fisheries require population protection and conservation methods, including DNA tests to identify the source of valuable sturgeon roe. However, the available genetic data are insufficient to distinguish between different sturgeon populations, and are even unable to distinguish between some species. We performed high-throughput single-nucleotide polymorphism (SNP)-genotyping analysis on different populations of Russian (Acipenser gueldenstaedtii), Persian (A. persicus), and Siberian (A. baerii) sturgeon species from the Caspian Sea region (Volga and Ural Rivers), the Azov Sea, and two Siberian rivers. We found that Russian sturgeons from the Volga and Ural Rivers were essentially indistinguishable, but they differed from Russian sturgeons in the Azov Sea, and from Persian and Siberian sturgeons. We identified eight SNPs that were sufficient to distinguish these sturgeon populations with 80% confidence, and allowed the development of markers to distinguish sturgeon species. Finally, on the basis of our SNP data, we propose that the A. baerii-like mitochondrial DNA found in some Russian sturgeons from the Caspian Sea arose via an introgression event during the Pleistocene glaciation. In the present study, high-throughput genotyping analysis of several sturgeon populations was performed and SNP markers for species identification were defined. A possible explanation for the presence of the baerii-like mitotype in some Russian sturgeons from the Caspian Sea was also suggested. PMID:24567827

  4. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types

    PubMed Central

    Pagès, Hervé

    2018-01-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set. PMID:29723188
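
    beachmat itself is a C++ interface, but the dense-versus-sparse trade-off it abstracts over is easy to see in any language. A rough illustration in Python using scipy containers in place of R/Bioconductor matrix types; the matrix dimensions and sparsity are arbitrary, chosen only to mimic a small scRNA-seq count matrix:

    ```python
    # Dense vs. sparse storage of a small, scRNA-seq-like count matrix
    # (illustration of the trade-off beachmat abstracts over, not beachmat itself).
    import numpy as np
    from scipy import sparse

    genes, cells, density = 20_000, 1_000, 0.05   # arbitrary, illustrative values
    rng = np.random.default_rng(0)

    dense = np.zeros((genes, cells))
    nnz = int(genes * cells * density)
    dense[rng.integers(0, genes, nnz), rng.integers(0, cells, nnz)] = rng.poisson(5, nnz)

    sp = sparse.csc_matrix(dense)

    print("dense  : %.0f MB" % (dense.nbytes / 1e6))
    print("sparse : %.0f MB" % ((sp.data.nbytes + sp.indices.nbytes + sp.indptr.nbytes) / 1e6))

    # Per-cell (column) access works for both representations; the cost profile differs.
    assert np.allclose(dense[:, 0], sp[:, 0].toarray().ravel())
    ```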

  5. Genome-wide association study of rice (Oryza sativa L.) leaf traits with a high-throughput leaf scorer.

    PubMed

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Wang, Ke; Jiang, Ni; Feng, Hui; Chen, Guoxing; Liu, Qian; Xiong, Lizhong

    2015-09-01

    Leaves are the plant's solar panel and food factory, and leaf traits are always key issues to investigate in plant research. Traditional methods for leaf trait measurement are time-consuming. In this work, an engineering prototype has been established for high-throughput leaf scoring (HLS) of a large number of Oryza sativa accessions. The mean absolute percentage errors between traditional measurements and HLS were below 5% for leaf number, area, shape, and colour. Moreover, HLS can measure up to 30 leaves per minute. To demonstrate the usefulness of HLS in dissecting the genetic bases of leaf traits, a genome-wide association study (GWAS) was performed for 29 leaf traits related to leaf size, shape, and colour at three growth stages using HLS on a panel of 533 rice accessions. Nine associated loci contained known leaf-related genes, such as Nal1 for controlling the leaf width. In addition, a total of 73, 123, and 177 new loci were detected for traits associated with leaf size, colour, and shape, respectively. In summary, after evaluating the performance with a large number of rice accessions, the combination of GWAS and high-throughput leaf phenotyping (HLS) has proven a valuable strategy to identify the genetic loci controlling rice leaf traits. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  6. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types.

    PubMed

    Lun, Aaron T L; Pagès, Hervé; Smith, Mike L

    2018-05-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set.

  7. Design and construction of a first-generation high-throughput integrated robotic molecular biology platform for bioenergy applications.

    PubMed

    Hughes, Stephen R; Butt, Tauseef R; Bartolett, Scott; Riedmuller, Steven B; Farrelly, Philip

    2011-08-01

    The molecular biological techniques for plasmid-based assembly and cloning of gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. High-throughput integrated robotic molecular biology platforms that have the capacity to rapidly clone and express heterologous gene open reading frames in bacteria and yeast and to screen large numbers of expressed proteins for optimized function are an important technology for improving microbial strains for biofuel production. The process involves the production of full-length complementary DNA libraries as a source of plasmid-based clones to express the desired proteins in active form for determination of their functions. Proteins that were identified by high-throughput screening as having desired characteristics are overexpressed in microbes to enable them to perform functions that will allow more cost-effective and sustainable production of biofuels. Because the plasmid libraries are composed of several thousand unique genes, automation of the process is essential. This review describes the design and implementation of an automated integrated programmable robotic workcell capable of producing complementary DNA libraries, colony picking, isolating plasmid DNA, transforming yeast and bacteria, expressing protein, and performing appropriate functional assays. These operations will allow tailoring microbial strains to use renewable feedstocks for production of biofuels, bioderived chemicals, fertilizers, and other coproducts for profitable and sustainable biorefineries. Published by Elsevier Inc.

  8. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  9. Effect of PS-PVD production throughput on Si nanoparticles for negative electrode of lithium ion batteries

    NASA Astrophysics Data System (ADS)

    Ohta, R.; Fukada, K.; Tashiro, T.; Dougakiuchi, M.; Kambara, M.

    2018-03-01

    Silicon nanoparticles (Si-NPs) have been produced by plasma spray physical vapor deposition at a throughput as high as 1 kg h-1 (17 g min-1), and the effect on battery performance is investigated. When the Si powder feed-rate is changed from 1 to 17 g min-1, although the average primary particle size increases to 50 nm, the cycle capacity of the batteries using these Si-NPs is improved slightly owing to their less agglomerated structure. In contrast, when Ni is added to the Si feedstock, the cycle capacity is improved at 1 g min-1 due to a modified Si-NP structure having a SiNi2 interface. However, the batteries with Si-NPs produced at 17 g min-1 show a significant decrease in cycle capacity because of excess Ni silicide formation, which results from the elevated co-condensation point and the increased reaction area at high throughputs despite the constant Ni concentration in the feedstock.

  10. High-Throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in Organic Field Effect Transistors.

    PubMed

    Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa

    2017-10-18

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
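
    The "local orientational order" metric referenced above is, for 2D fibre images, commonly computed as a nematic-style order parameter over fibre segment orientations within a window. A minimal sketch under that assumption; the angle-extraction and windowing steps of the actual pipeline are omitted:

    ```python
    # 2D nematic-style orientational order parameter for fibre segment angles:
    # S ~ 1 for perfectly aligned fibres, S ~ 0 for an isotropic distribution.
    import numpy as np

    def orientational_order(angles_rad):
        """Order parameter from fibre segment orientations (radians, defined mod pi)."""
        c = np.cos(2 * np.asarray(angles_rad)).mean()
        s = np.sin(2 * np.asarray(angles_rad)).mean()
        return float(np.hypot(c, s))

    rng = np.random.default_rng(1)
    aligned = rng.normal(0.0, 0.1, 1000)        # tightly aligned segments
    isotropic = rng.uniform(0.0, np.pi, 1000)   # random orientations

    print("aligned  :", round(orientational_order(aligned), 2))   # close to 1
    print("isotropic:", round(orientational_order(isotropic), 2)) # close to 0
    ```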

  11. SINA: accurate high-throughput multiple sequence alignment of ribosomal RNA genes.

    PubMed

    Pruesse, Elmar; Peplies, Jörg; Glöckner, Frank Oliver

    2012-07-15

    In the analysis of homologous sequences, computation of multiple sequence alignments (MSAs) has become a bottleneck. This is especially troublesome for marker genes like the ribosomal RNA (rRNA) where already millions of sequences are publicly available and individual studies can easily produce hundreds of thousands of new sequences. Methods have been developed to cope with such numbers, but further improvements are needed to meet accuracy requirements. In this study, we present the SILVA Incremental Aligner (SINA) used to align the rRNA gene databases provided by the SILVA ribosomal RNA project. SINA uses a combination of k-mer searching and partial order alignment (POA) to maintain very high alignment accuracy while satisfying high throughput performance demands. SINA was evaluated in comparison with the commonly used high throughput MSA programs PyNAST and mothur. The three BRAliBase III benchmark MSAs could be reproduced with 99.3%, 97.6% and 96.1% accuracy. A larger benchmark MSA comprising 38 772 sequences could be reproduced with 98.9% and 99.3% accuracy using reference MSAs comprising 1000 and 5000 sequences. SINA was able to achieve higher accuracy than PyNAST and mothur in all performed benchmarks. Alignment of up to 500 sequences using the latest SILVA SSU/LSU Ref datasets as reference MSA is offered at http://www.arb-silva.de/aligner. This page also links to Linux binaries, user manual and tutorial. SINA is made available under a personal use license.

  12. Field Evaluation of a High Throughput Loop Mediated Isothermal Amplification Test for the Detection of Asymptomatic Plasmodium Infections in Zanzibar

    PubMed Central

    Morris, Ulrika; Ding, Xavier C.; Jovel, Irina; Msellem, Mwinyi I.; Bergman, Daniel; Islam, Atiqul; Ali, Abdullah S.; Polley, Spencer; Gonzalez, Iveth J.; Mårtensson, Andreas; Björkman, Anders

    2017-01-01

    Background New field applicable diagnostic tools are needed for highly sensitive detection of residual malaria infections in pre-elimination settings. Field performance of a high throughput DNA extraction system for loop mediated isothermal amplification (HTP-LAMP) was therefore evaluated for detecting malaria parasites among asymptomatic individuals in Zanzibar. Methods HTP-LAMP performance was evaluated against real-time PCR on 3008 paired blood samples collected on filter papers in a community-based survey in 2015. Results The PCR and HTP-LAMP determined malaria prevalences were 1.6% (95%CI 1.3–2.4) and 0.7% (95%CI 0.4–1.1), respectively. The sensitivity of HTP-LAMP compared to PCR was 40.8% (CI95% 27.0–55.8) and the specificity was 99.9% (CI95% 99.8–100). For the PCR positive samples, there was no statistically significant difference between the geometric mean parasite densities among the HTP-LAMP positive (2.5 p/μL, range 0.2–770) and HTP-LAMP negative (1.4 p/μL, range 0.1–7) samples (p = 0.088). Two lab technicians analysed up to 282 samples per day and the HTP-LAMP method was found to be user friendly. Conclusions Although field applicable, this high throughput format of LAMP as used here was not sensitive enough to be recommended for detection of asymptomatic low-density infections in areas like Zanzibar that are approaching malaria elimination. PMID:28095434

  13. Field Evaluation of a High Throughput Loop Mediated Isothermal Amplification Test for the Detection of Asymptomatic Plasmodium Infections in Zanzibar.

    PubMed

    Aydin-Schmidt, Berit; Morris, Ulrika; Ding, Xavier C; Jovel, Irina; Msellem, Mwinyi I; Bergman, Daniel; Islam, Atiqul; Ali, Abdullah S; Polley, Spencer; Gonzalez, Iveth J; Mårtensson, Andreas; Björkman, Anders

    2017-01-01

    New field applicable diagnostic tools are needed for highly sensitive detection of residual malaria infections in pre-elimination settings. Field performance of a high throughput DNA extraction system for loop mediated isothermal amplification (HTP-LAMP) was therefore evaluated for detecting malaria parasites among asymptomatic individuals in Zanzibar. HTP-LAMP performance was evaluated against real-time PCR on 3008 paired blood samples collected on filter papers in a community-based survey in 2015. The PCR and HTP-LAMP determined malaria prevalences were 1.6% (95%CI 1.3-2.4) and 0.7% (95%CI 0.4-1.1), respectively. The sensitivity of HTP-LAMP compared to PCR was 40.8% (CI95% 27.0-55.8) and the specificity was 99.9% (CI95% 99.8-100). For the PCR positive samples, there was no statistically significant difference between the geometric mean parasite densities among the HTP-LAMP positive (2.5 p/μL, range 0.2-770) and HTP-LAMP negative (1.4 p/μL, range 0.1-7) samples (p = 0.088). Two lab technicians analysed up to 282 samples per day and the HTP-LAMP method was found to be user friendly. Although field applicable, this high throughput format of LAMP as used here was not sensitive enough to be recommended for detection of asymptomatic low-density infections in areas like Zanzibar that are approaching malaria elimination.
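
    Figures such as the 40.8% sensitivity and 99.9% specificity quoted in these two records come from a simple 2x2 comparison of the index test against PCR. A sketch of that calculation with a normal-approximation confidence interval; the counts below are hypothetical values chosen only to be consistent with the reported totals, not the study's raw data:

    ```python
    # Sensitivity/specificity with normal-approximation 95% CIs from a 2x2 table of
    # HTP-LAMP (index test) vs. PCR (reference). Counts are hypothetical, chosen only
    # to be consistent with the reported totals (3008 samples, ~1.6% PCR prevalence).
    import math

    def proportion_with_ci(k, n, z=1.96):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)

    tp, fn = 20, 29     # LAMP+/LAMP- among PCR-positive samples (hypothetical)
    tn, fp = 2955, 4    # among PCR-negative samples (hypothetical)

    sens = proportion_with_ci(tp, tp + fn)
    spec = proportion_with_ci(tn, tn + fp)
    print("sensitivity: %.1f%% (95%% CI %.1f-%.1f)" % tuple(100 * v for v in sens))
    print("specificity: %.1f%% (95%% CI %.1f-%.1f)" % tuple(100 * v for v in spec))
    ```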

  14. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit

    PubMed Central

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R.; Smith, Jeremy C.; Kasson, Peter M.; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-01-01

    Motivation: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Results: Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. Availability: GROMACS is an open source and free software available from http://www.gromacs.org. Contact: erik.lindahl@scilifelab.se Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23407358

  15. Performance analysis of TCP traffic and its influence on ONU's energy saving in energy efficient TDM-PON

    NASA Astrophysics Data System (ADS)

    Alaelddin, Fuad Yousif Mohammed; Newaz, S. H. Shah; Lee, Joohyung; Uddin, Mohammad Rakib; Lee, Gyu Myoung; Choi, Jun Kyun

    2015-12-01

    The majority of the traffic over the Internet is TCP-based, which is very sensitive to packet loss and delay. Existing research efforts in TDM-Passive Optical Networks (TDM-PONs) mostly evaluate energy saving and traffic delay performances under different energy saving solutions. However, to the best of our knowledge, how energy saving mechanisms could affect TCP traffic performance in TDM-PONs has hardly been studied. In this paper, by means of our state-of-the-art OPNET Modular based TDM-PON simulator, we evaluate TCP traffic delay, throughput, and Optical Network Unit (ONU) energy consumption performances in a TDM-PON where energy saving mechanisms are employed in ONUs. Here, we study the performances under commonly used energy saving mechanisms defined in standards for TDM-PONs: cyclic sleep and doze mode. In cyclic sleep mode, we evaluate the performances under two well-known sleep interval length deciding algorithms (i.e. fixed sleep interval (FSI) and exponential sleep interval deciding (ESID)) that an OLT uses to decide sleep interval lengths for an ONU. Findings in this paper put forward the strong relationship among TCP traffic delay, throughput and ONU energy consumption under different sleep interval lengths. Moreover, we reveal that under high TCP traffic, both FSI and ESID will end up showing similar delay, energy and throughput performance. Our findings also show that doze mode can offer better TCP throughput and delay performance at the price of consuming more energy than cyclic sleep mode. In addition, our results provide insight into the point at which doze mode becomes futile in improving energy saving of an ONU under TCP traffic. Furthermore, in this paper, we highlight important research issues that should be addressed in future work to maximize energy saving in TDM-PONs while meeting traffic Quality of Service requirements.
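
    FSI and ESID, the two sleep-interval policies compared above, are usually described in the TDM-PON literature as follows: FSI always assigns the same sleep interval, while ESID doubles the interval while the ONU stays idle (up to a maximum) and resets it once traffic arrives. A schematic sketch under that common description, not the paper's exact algorithm; the interval values are illustrative:

    ```python
    # Schematic FSI vs. ESID sleep-interval decisions for an ONU.
    # Interval bounds are illustrative; reset/growth rules follow the common description.
    T_MIN, T_MAX, T_FIXED = 0.010, 0.320, 0.050   # seconds

    def next_interval_fsi(prev, traffic_arrived):
        return T_FIXED                     # fixed sleep interval regardless of history

    def next_interval_esid(prev, traffic_arrived):
        if traffic_arrived:
            return T_MIN                   # reset once traffic for the ONU appears
        return min(2 * prev, T_MAX) if prev else T_MIN

    # An idle ONU under ESID: intervals grow 10, 20, 40, ... ms until capped at 320 ms.
    interval = 0.0
    for _ in range(7):
        interval = next_interval_esid(interval, traffic_arrived=False)
        print(int(interval * 1000), "ms")
    ```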

  16. High Throughput Determination of Critical Human Dosing Parameters (SOT)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...

  17. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  18. Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos

    EPA Science Inventory

    Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...

  19. The use of FTA cards for preserving unfixed cytological material for high-throughput molecular analysis.

    PubMed

    Saieg, Mauro Ajaj; Geddie, William R; Boerner, Scott L; Liu, Ni; Tsao, Ming; Zhang, Tong; Kamel-Reid, Suzanne; da Cunha Santos, Gilda

    2012-06-25

    Novel high-throughput molecular technologies have made the collection and storage of cells and small tissue specimens a critical issue. The FTA card provides an alternative to cryopreservation for biobanking fresh unfixed cells. The current study compared the quality and integrity of the DNA obtained from 2 types of FTA cards (Classic and Elute) using 2 different extraction protocols ("Classic" and "Elute") and assessed the feasibility of performing multiplex mutational screening using fine-needle aspiration (FNA) biopsy samples. Residual material from 42 FNA biopsies was collected in the cards (21 Classic and 21 Elute cards). DNA was extracted using the Classic protocol for Classic cards and both protocols for Elute cards. Polymerase chain reaction for p53 (1.5 kilobase) and CARD11 (500 base pair) was performed to assess DNA integrity. Successful p53 amplification was achieved in 95.2% of the samples from the Classic cards and in 80.9% of the samples from the Elute cards using the Classic protocol and 28.5% using the Elute protocol (P = .001). All samples (both cards) could be amplified for CARD11. There was no significant difference in the DNA concentration or 260/280 purity ratio when the 2 types of cards were compared. Five samples were also successfully analyzed by multiplex MassARRAY spectrometry, with a mutation in KRAS found in 1 case. High molecular weight DNA was extracted from the cards in sufficient amounts and quality to perform high-throughput multiplex mutation assays. The results of the current study also suggest that FTA Classic cards preserve better DNA integrity for molecular applications compared with the FTA Elute cards. Copyright © 2012 American Cancer Society.

  20. Optimizing SIEM Throughput on the Cloud Using Parallelization

    PubMed Central

    Alam, Masoom; Ihsan, Asif; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, M Khurram; Farooq, Sajid

    2016-01-01

    Processing large amounts of data in real time for identifying security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of a security framework OSTROM built on the Esper complex event processing (CEP) engine under a parallel and non-parallel computational framework. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of events coming in rather than the queries being processed. The architecture where 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory and CPU usage. PMID:27851762
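
    The best-performing configuration described above is a plain data-parallel layout: the incoming event stream is split evenly across instances and every instance evaluates the full query set. A rough sketch of that layout, with simple predicate functions standing in for Esper EPL queries; this is an illustration of the architecture, not the OSTROM framework itself:

    ```python
    # Data-parallel layout: each worker receives 1/N of the events and evaluates
    # every query. Plain predicates stand in for Esper EPL queries (illustrative only).
    from multiprocessing import Pool

    QUERIES = {
        "failed_logins": lambda e: e["type"] == "auth" and not e["success"],
        "large_transfers": lambda e: e["type"] == "netflow" and e["bytes"] > 10_000_000,
    }

    def process_partition(events):
        hits = {name: 0 for name in QUERIES}
        for event in events:
            for name, predicate in QUERIES.items():
                if predicate(event):
                    hits[name] += 1
        return hits

    def run(events, workers=4):
        partitions = [events[i::workers] for i in range(workers)]   # 1/N of events each
        with Pool(workers) as pool:
            partials = pool.map(process_partition, partitions)
        return {name: sum(p[name] for p in partials) for name in QUERIES}

    if __name__ == "__main__":
        sample = ([{"type": "auth", "success": False}] * 10
                  + [{"type": "netflow", "bytes": 20_000_000}] * 3)
        print(run(sample))    # {'failed_logins': 10, 'large_transfers': 3}
    ```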

  1. Development and clinical performance of high throughput loop-mediated isothermal amplification for detection of malaria

    PubMed Central

    Perera, Rushini S.; Ding, Xavier C.; Tully, Frank; Oliver, James; Bright, Nigel; Bell, David; Chiodini, Peter L.; Gonzalez, Iveth J.; Polley, Spencer D.

    2017-01-01

    Background Accurate and efficient detection of sub-microscopic malaria infections is crucial for enabling rapid treatment and interruption of transmission. Commercially available malaria LAMP kits have excellent diagnostic performance, though throughput is limited by the need to prepare samples individually. Here, we evaluate the clinical performance of a newly developed high throughput (HTP) sample processing system for use in conjunction with the Eiken malaria LAMP kit. Methods The HTP system utilised dried blood spots (DBS) and liquid whole blood (WB), with parallel sample processing of 94 samples per run. The system was evaluated using 699 samples of known infection status pre-determined by gold standard nested PCR. Results The sensitivity and specificity of WB-HTP-LAMP were 98.6% (95% CI, 95.7–100) and 99.7% (95% CI, 99.2–100); the sensitivity of DBS-HTP-LAMP was 97.1% (95% CI, 93.1–100), and specificity 100% against PCR. At parasite densities greater than or equal to 2 parasites/μL, WB and DBS HTP-LAMP showed 100% sensitivity and specificity against PCR. At densities less than 2 p/μL, WB-HTP-LAMP sensitivity was 88.9% (95% CI, 77.1–100) and specificity was 99.7% (95% CI, 99.2–100); sensitivity and specificity of DBS-HTP-LAMP were 77.8% (95% CI, 54.3–99.5) and 100% respectively. Conclusions The HTP-LAMP system is a highly sensitive diagnostic test, with the potential to allow large scale population screening in malaria elimination campaigns. PMID:28166235

  2. Chromatin immunoprecipitation with fixed animal tissues and preparation for high-throughput sequencing.

    PubMed

    Cotney, Justin L; Noonan, James P

    2015-02-02

    Chromatin immunoprecipitation coupled with high-throughput sequencing (ChIP-Seq) is a powerful method used to identify genome-wide binding patterns of transcription factors and distribution of various histone modifications associated with different chromatin states. In most published studies, ChIP-Seq has been performed on cultured cells grown under controlled conditions, allowing generation of large amounts of material in a homogeneous biological state. Although such studies have provided great insight into the dynamic landscapes of animal genomes, they do not allow the examination of transcription factor binding and chromatin states in adult tissues, developing embryonic structures, or tumors. Such knowledge is critical to understanding the information required to create and maintain a complex biological tissue and to identify noncoding regions of the genome directly involved in tissues affected by complex diseases such as autism. Studying these tissue types with ChIP-Seq can be challenging due to the limited availability of tissues and the lack of complex biological states able to be achieved in culture. These inherent differences require alterations of standard cross-linking and chromatin extraction typically used in cell culture. Here we describe a general approach for using small amounts of animal tissue to perform ChIP-Seq directed at histone modifications and transcription factors. Tissue is homogenized before treatment with formaldehyde to ensure proper cross-linking, and a two-step nuclear isolation is performed to increase extraction of soluble chromatin. Small amounts of soluble chromatin are then used for immunoprecipitation (IP) and prepared for multiplexed high-throughput sequencing. © 2015 Cold Spring Harbor Laboratory Press.

  3. Protein Sequence Annotation Tool (PSAT): A centralized web-based meta-server for high-throughput sequence annotations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Elo; Huang, Amy; Cadag, Eithon

    In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein FASTA data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well-annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA. PSAT is most appropriately applied in annotation of large protein FASTA sets that may or may not be associated with a single genome.

  4. Protein Sequence Annotation Tool (PSAT): A centralized web-based meta-server for high-throughput sequence annotations

    DOE PAGES

    Leung, Elo; Huang, Amy; Cadag, Eithon; ...

    2016-01-20

    In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein FASTA data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well-annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA. PSAT is most appropriately applied in annotation of large protein FASTA sets that may or may not be associated with a single genome.

  5. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  6. Generalized empirical Bayesian methods for discovery of differential data in high-throughput biology.

    PubMed

    Hardcastle, Thomas J

    2016-01-15

    High-throughput data are now commonplace in biological research. Rapidly changing technologies and applications mean that novel methods for detecting differential behaviour that account for a 'large P, small n' setting are required at an increasing rate. The development of such methods is, in general, being done on an ad hoc basis, requiring further development cycles and resulting in a lack of standardization between analyses. We present here a generalized method for identifying differential behaviour within high-throughput biological data through empirical Bayesian methods. This approach is based on our baySeq algorithm for identification of differential expression in RNA-seq data based on a negative binomial distribution, and in paired data based on a beta-binomial distribution. Here we show how the same empirical Bayesian approach can be applied to any parametric distribution, removing the need for lengthy development of novel methods for differently distributed data. Comparisons with existing methods developed to address specific problems in high-throughput biological data show that these generic methods can achieve equivalent or better performance. A number of enhancements to the basic algorithm are also presented to increase flexibility and reduce computational costs. The methods are implemented in the R baySeq (v2) package, available on Bioconductor http://www.bioconductor.org/packages/release/bioc/html/baySeq.html. tjh48@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
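
    The core of this kind of approach is a per-gene comparison of a "no difference" model (one shared mean across groups) against a "difference" model (group-specific means) under the chosen parametric distribution, weighted by a prior over the two models. A much-simplified sketch for negative binomial counts; in baySeq the priors and dispersions are estimated empirically across all genes, whereas here they are simply assumed and plug-in means are used:

    ```python
    # Simplified model-comparison step for one gene with negative binomial counts.
    # baySeq estimates priors/dispersions empirically across genes; here they are assumed.
    import numpy as np
    from scipy import stats

    def nb_loglik(counts, mean, dispersion):
        n = 1.0 / dispersion              # scipy: n = 1/dispersion, p = n / (n + mean)
        p = n / (n + mean)
        return stats.nbinom.logpmf(counts, n, p).sum()

    def posterior_differential(group_a, group_b, dispersion=0.1, prior_de=0.1):
        pooled = np.concatenate([group_a, group_b])
        log_null = np.log(1 - prior_de) + nb_loglik(pooled, pooled.mean(), dispersion)
        log_de = (np.log(prior_de)
                  + nb_loglik(group_a, group_a.mean(), dispersion)
                  + nb_loglik(group_b, group_b.mean(), dispersion))
        m = max(log_null, log_de)
        w_null, w_de = np.exp(log_null - m), np.exp(log_de - m)
        return w_de / (w_null + w_de)

    print(posterior_differential(np.array([5, 7, 6, 8]), np.array([20, 25, 18, 22])))  # high
    print(posterior_differential(np.array([5, 7, 6, 8]), np.array([6, 5, 8, 7])))      # low
    ```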

  7. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  8. High Resolution Melting (HRM) for High-Throughput Genotyping-Limitations and Caveats in Practical Case Studies.

    PubMed

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik

    2017-11-03

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  9. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    PubMed Central

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz

    2017-01-01

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup. PMID:29099791

  10. Development of a high-performance multichannel system for time-correlated single photon counting

    NASA Astrophysics Data System (ADS)

    Peronio, P.; Cominelli, A.; Acconcia, G.; Rech, I.; Ghioni, M.

    2017-05-01

    Time-Correlated Single Photon Counting (TCSPC) is one of the most effective techniques for measuring weak and fast optical signals. It outperforms traditional "analog" techniques due to its high sensitivity along with high temporal resolution. Despite those significant advantages, a main drawback still exists, which is related to the long acquisition time needed to perform a measurement. In past years, many TCSPC systems have been developed with ever higher numbers of channels, aimed at dealing with that limitation. Nevertheless, modern systems suffer from a strong trade-off between parallelism level and performance: the higher the number of channels, the poorer the performance. In this work we present the design of a 32x32 TCSPC system meant to overcome the existing trade-off. To this aim, different technologies have been employed to get the best performance from both detectors and sensing circuits. The exploitation of different technologies will be enabled by Through Silicon Vias (TSVs), which will be investigated as a possible solution for connecting the detectors to the sensing circuits. When dealing with a high number of channels, the count rate is inevitably set by the affordable throughput to the external PC. We targeted a throughput of 10 Gb/s, which is beyond the state of the art, and designed the number of TCSPC channels accordingly. Dynamic-routing logic will connect the detectors to the smaller number of acquisition chains.
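
    The step of sizing the channel count against the affordable link throughput is straightforward budget arithmetic. A sketch of that calculation; the bits-per-photon record and per-channel count rate below are assumptions for illustration, not values taken from the paper:

    ```python
    # Link-budget arithmetic: how many photon events per second a 10 Gb/s link sustains,
    # and how many channels that supports at an assumed per-channel count rate.
    LINK_BPS = 10e9           # targeted throughput to the external PC (from the abstract)
    BITS_PER_EVENT = 48       # assumed size of one timestamp + channel-address record
    RATE_PER_CHANNEL = 1e6    # assumed average detected counts per second per channel

    events_per_s = LINK_BPS / BITS_PER_EVENT
    channels = events_per_s / RATE_PER_CHANNEL

    print(f"{events_per_s:.2e} events/s over the link")
    print(f"~{channels:.0f} channels sustainable at the assumed per-channel rate")
    ```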

  11. High-Throughput Quantitative Lipidomics Analysis of Nonesterified Fatty Acids in Plasma by LC-MS.

    PubMed

    Christinat, Nicolas; Morin-Rivron, Delphine; Masoodi, Mojgan

    2017-01-01

    Nonesterified fatty acids are important biological molecules which have multiple functions such as energy storage, gene regulation, or cell signaling. Comprehensive profiling of nonesterified fatty acids in biofluids can facilitate studying and understanding their roles in biological systems. For these reasons, we have developed and validated a high-throughput, nontargeted lipidomics method coupling liquid chromatography to high-resolution mass spectrometry for quantitative analysis of nonesterified fatty acids. Sufficient chromatographic separation is achieved to separate positional isomers such as polyunsaturated and branched-chain species and quantify a wide range of nonesterified fatty acids in human plasma samples. However, this method is not limited only to these fatty acid species and offers the possibility to perform untargeted screening of additional nonesterified fatty acid species.

  12. Automated Transition State Theory Calculations for High-Throughput Kinetics.

    PubMed

    Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H

    2017-09-21

    A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated, reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparison against high-level theoretical calculations shows that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will be improved by accounting for internal rotor contributions and by improving methods to determine molecular symmetry.

  13. Study of high resolution x-ray spectrometer concepts for NIF experiments

    NASA Astrophysics Data System (ADS)

    Hill, K. W.; Bitter, M.; Delgado-Aparicio, L.; Efthimion, P.; Gao, L.; Maddox, J.; Pablant, N. A.; Beiersdorfer, P.; Chen, H.; Coppari, F.; Ma, T.; Nora, R.; Scott, H.; Schneider, M.; Mancini, R.

    2015-11-01

    Options have been investigated for DIM-insertable (Diagnostic Instrument Manipulator) high resolution (E/ΔE ~ 3000-5000) Bragg crystal x-ray spectrometers for experiments on the NIF. Of interest are time-integrated Cu K- and Ta L-edge absorption spectra and time-resolved Kr He-β emission from compressed symcaps for inference of electron temperature from dielectronic satellites and electron density from Stark broadening. Cylindrical and conical von Hamos, Johann, and advanced high throughput designs have been studied. Predicted x-ray intensities, spectrometer throughputs, spectral resolution, and spatial focusing properties, as well as lab evaluations of some spectrometer candidates will be presented. Performed under the auspices of the US DOE by PPPL under contract DE-AC02-09CH11466 and by LLNL under contract DE-AC52-07NA27344.

  14. QoS-aware integrated fiber-wireless standard compliant architecture based on XGPON and EDCA

    NASA Astrophysics Data System (ADS)

    Kaur, Ravneet; Srivastava, Anand

    2018-01-01

    Converged Fiber-Wireless (FiWi) broadband access network proves to be a promising candidate that is reliable, robust, cost efficient, ubiquitous and capable of providing huge amount of bandwidth. To meet the ever-increasing bandwidth requirements, it has become crucial to investigate the performance issues that arise with the deployment of next-generation Passive Optical Network (PON) and its integration with various wireless technologies. Apart from providing high speed internet access for mass use, this combined architecture aims to enable delivery of high quality and effective e-services in different categories including health, education, finance, banking, agriculture and e-government. In this work, we present an integrated architecture of 10-Gigabit-capable PON (XG-PON) and Enhanced Distributed Channel Access (EDCA) that combines the benefits of both technologies to meet the QoS demands of subscribers. Performance evaluation of the standards-compliant hybrid network is done using discrete-event Network Simulator-3 (NS-3) and results are reported in terms of throughput, average delay, average packet loss rate and fairness index. Per-class throughput signifies effectiveness of QoS distribution whereas aggregate throughput indicates effective utilization of the wireless channel. This work has not been reported so far to the best of our knowledge.

  15. Spatial tuning of acoustofluidic pressure nodes by altering net sonic velocity enables high-throughput, efficient cell sorting

    DOE PAGES

    Jung, Seung-Yong; Notton, Timothy; Fong, Erika; ...

    2015-01-07

    Particle sorting using acoustofluidics has enormous potential but widespread adoption has been limited by complex device designs and low throughput. Here, we report high-throughput separation of particles and T lymphocytes (600 μL min-1) by altering the net sonic velocity to reposition acoustic pressure nodes in a simple two-channel device. Finally, the approach is generalizable to other microfluidic platforms for rapid, high-throughput analysis.

  16. Pneumatic Microvalve-Based Hydrodynamic Sample Injection for High-Throughput, Quantitative Zone Electrophoresis in Capillaries

    PubMed Central

    2015-01-01

    A hybrid microchip/capillary electrophoresis (CE) system was developed to allow unbiased and lossless sample loading and high-throughput repeated injections. This new hybrid CE system consists of a poly(dimethylsiloxane) (PDMS) microchip sample injector featuring a pneumatic microvalve that separates a sample introduction channel from a short sample loading channel, and a fused-silica capillary separation column that connects seamlessly to the sample loading channel. The sample introduction channel is pressurized such that when the pneumatic microvalve opens briefly, a variable-volume sample plug is introduced into the loading channel. A high voltage for CE separation is continuously applied across the loading channel and the fused-silica capillary separation column. Analytes are rapidly separated in the fused-silica capillary, and following separation, high-sensitivity MS detection is accomplished via a sheathless CE/ESI-MS interface. The performance evaluation of the complete CE/ESI-MS platform demonstrated that reproducible sample injection with well controlled sample plug volumes could be achieved by using the PDMS microchip injector. The absence of band broadening from microchip to capillary indicated a minimum dead volume at the junction. The capabilities of the new CE/ESI-MS platform in performing high-throughput and quantitative sample analyses were demonstrated by the repeated sample injection without interrupting an ongoing separation and a linear dependence of the total analyte ion abundance on the sample plug volume using a mixture of peptide standards. The separation efficiency of the new platform was also evaluated systematically at different sample injection times, flow rates, and CE separation voltages. PMID:24865952

  17. Pipeline for illumination correction of images for high-throughput microscopy.

    PubMed

    Singh, S; Bray, M-A; Jones, T R; Carpenter, A E

    2014-12-01

    The presence of systematic noise in images in high-throughput microscopy experiments can significantly impact the accuracy of downstream results. Among the most common sources of systematic noise is non-homogeneous illumination across the image field. This often adds an unacceptable level of noise, obscures true quantitative differences and precludes biological experiments that rely on accurate fluorescence intensity measurements. In this paper, we seek to quantify the improvement in the quality of high-content screen readouts due to software-based illumination correction. We present a straightforward illumination correction pipeline that has been used by our group across many experiments. We test the pipeline on real-world high-throughput image sets and evaluate the performance of the pipeline at two levels: (a) Z'-factor to evaluate the effect of the image correction on a univariate readout, representative of a typical high-content screen, and (b) classification accuracy on phenotypic signatures derived from the images, representative of an experiment involving more complex data mining. We find that applying the proposed post-hoc correction method improves performance in both experiments, even when illumination correction has already been applied using software associated with the instrument. To facilitate the ready application and future development of illumination correction methods, we have made our complete test data sets as well as open-source image analysis pipelines publicly available. This software-based solution has the potential to improve outcomes for a wide variety of image-based HTS experiments. © 2014 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
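
    Two of the quantities in this record are easy to make concrete: a retrospective illumination surface estimated from many images (here the smoothed per-pixel mean, one common choice; the published pipeline may differ in detail) and the Z'-factor used to score the univariate readout. A minimal sketch on simulated data:

    ```python
    # Retrospective illumination correction (smoothed per-pixel mean as the surface)
    # and the Z'-factor used to score a screening readout. Simulated data throughout.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def illumination_surface(images, sigma=50):
        surface = gaussian_filter(np.mean(images, axis=0), sigma)
        return surface / surface.mean()            # normalised multiplicative surface

    def z_prime(pos, neg):
        """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
        return 1 - 3 * (np.std(pos) + np.std(neg)) / abs(np.mean(pos) - np.mean(neg))

    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[0:256, 0:256]
    gradient = 0.6 + 0.8 * xx / 255                # smooth left-to-right illumination bias
    stack = np.array([gradient * (100 + rng.normal(0, 5, (256, 256))) for _ in range(20)])

    corrected = stack[0] / illumination_surface(stack)
    print("coefficient of variation before: %.3f  after: %.3f"
          % (stack[0].std() / stack[0].mean(), corrected.std() / corrected.mean()))

    # Z'-factor on hypothetical positive/negative control well readouts
    pos, neg = rng.normal(1000, 50, 96), rng.normal(200, 40, 96)
    print("Z' = %.2f" % z_prime(pos, neg))
    ```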

  18. Use of the melting curve assay as a means for high-throughput quantification of Illumina sequencing libraries.

    PubMed

    Shinozuka, Hiroshi; Forster, John W

    2016-01-01

    Background. Multiplexed sequencing is commonly performed on massively parallel short-read sequencing platforms such as Illumina, and the efficiency of library normalisation can affect the quality of the output dataset. Although several library normalisation approaches have been established, none are ideal for highly multiplexed sequencing due to issues of cost and/or processing time. Methods. An inexpensive and high-throughput library quantification method has been developed, based on an adaptation of the melting curve assay. Sequencing libraries were subjected to the assay using the Bio-Rad Laboratories CFX Connect(TM) Real-Time PCR Detection System. The library quantity was calculated through summation of reduction of relative fluorescence units between 86 and 95 °C. Results. PCR-enriched sequencing libraries are suitable for this quantification without pre-purification of DNA. Short DNA molecules, which ideally should be eliminated from the library for subsequent processing, were differentiated from the target DNA in a mixture on the basis of differences in melting temperature. Quantification results for long sequences targeted using the melting curve assay were correlated with those from existing methods (R² > 0.77), and with that observed from MiSeq sequencing (R² = 0.82). Discussion. The results of multiplexed sequencing suggested that the normalisation performance of the described method is equivalent to that of another recently reported high-throughput bead-based method, BeNUS. However, costs for the melting curve assay are considerably lower and processing times shorter than those of other existing methods, suggesting greater suitability for highly multiplexed sequencing applications.
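
    The quantification rule described here, summing the drop in relative fluorescence units between 86 and 95 °C and reading concentration off a standard, is simple enough to sketch directly. The melting curves and the calibration value below are hypothetical:

    ```python
    # Sum the RFU reduction across the 86-95 °C window of a melting curve, then convert
    # to concentration against a standard of known quantity (curves are hypothetical).
    import numpy as np

    def rfu_drop(temps_c, rfu, lo=86.0, hi=95.0):
        mask = (temps_c >= lo) & (temps_c <= hi)
        decreases = -np.diff(rfu[mask])
        return decreases[decreases > 0].sum()     # total reduction within the window

    temps = np.arange(70.0, 95.5, 0.5)
    melt = lambda amplitude: amplitude / (1 + np.exp((temps - 89.0) / 0.8)) + 300.0
    standard_rfu, sample_rfu = melt(5000.0), melt(2600.0)

    standard_nM = 4.0                             # hypothetical known library concentration
    nM_per_rfu = standard_nM / rfu_drop(temps, standard_rfu)
    print("sample library ~ %.2f nM" % (nM_per_rfu * rfu_drop(temps, sample_rfu)))
    ```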

  19. MAPPER: high-throughput maskless lithography

    NASA Astrophysics Data System (ADS)

    Wieland, M. J.; de Boer, G.; ten Berge, G. F.; Jager, R.; van de Peut, T.; Peijster, J. J. M.; Slot, E.; Steenbrink, S. W. H. K.; Teepen, T. F.; van Veen, A. H. V.; Kampherbeek, B. J.

    2009-03-01

    Maskless electron beam lithography, or electron beam direct write, has been around for a long time in the semiconductor industry and was pioneered from the mid-1960s onwards. This technique has been used for mask writing applications as well as device engineering and in some cases chip manufacturing. However, because of its relatively low throughput compared to optical lithography, electron beam lithography has never been the mainstream lithography technology. To extend optical lithography, double patterning (as a bridging technology) and EUV lithography are currently being explored. Irrespective of the technical viability of both approaches, one thing seems clear: they will be expensive [1]. MAPPER Lithography is developing a maskless lithography technology based on massively-parallel electron-beam writing with high speed optical data transport for switching the electron beams. In this way optical columns can be made with a throughput of 10-20 wafers per hour. By clustering several of these columns together, high throughputs can be realized in a small footprint. This enables a highly cost-competitive alternative to double patterning and EUV. In 2007 MAPPER obtained its Proof of Lithography milestone by exposing in its Demonstrator 45 nm half pitch structures with 110 electron beams in parallel, where all the beams were individually switched on and off [2]. In 2008 MAPPER took the next step in its development by building several tools. The objective of building these tools is to enable semiconductor companies to verify tool performance in their own environment. To enable this, the tools will have a 300 mm wafer stage in addition to a 110-beam optics column. First exposures at 45 nm half pitch resolution have been performed and analyzed. On the same wafer it is observed that all beams print, and based on analysis of 11 beams the CD for the different patterns is within 2.2 nm of target and the CD uniformity for the different patterns is better than 2.8 nm.

  20. High Throughput, Real-time, Dual-readout Testing of Intracellular Antimicrobial Activity and Eukaryotic Cell Cytotoxicity

    PubMed Central

    Chiaraviglio, Lucius; Kang, Yoon-Suk; Kirby, James E.

    2016-01-01

    Traditional measures of intracellular antimicrobial activity and eukaryotic cell cytotoxicity rely on endpoint assays. Such endpoint assays require several additional experimental steps prior to readout, such as cell lysis, colony forming unit determination, or reagent addition. When performing thousands of assays, for example, during high-throughput screening, the downstream effort required for these types of assays is considerable. Therefore, to facilitate high-throughput antimicrobial discovery, we developed a real-time assay to simultaneously identify inhibitors of intracellular bacterial growth and assess eukaryotic cell cytotoxicity. Specifically, real-time intracellular bacterial growth detection was enabled by marking bacterial screening strains with either a bacterial lux operon (1st generation assay) or fluorescent protein reporters (2nd generation, orthogonal assay). A non-toxic, cell membrane-impermeant, nucleic acid-binding dye was also added during initial infection of macrophages. These dyes are excluded from viable cells. However, non-viable host cells lose membrane integrity permitting entry and fluorescent labeling of nuclear DNA (deoxyribonucleic acid). Notably, DNA binding is associated with a large increase in fluorescent quantum yield that provides a solution-based readout of host cell death. We have used this combined assay to perform a high-throughput screen in microplate format, and to assess intracellular growth and cytotoxicity by microscopy. Notably, antimicrobials may demonstrate synergy in which the combined effect of two or more antimicrobials when applied together is greater than when applied separately. Testing for in vitro synergy against intracellular pathogens is normally a prodigious task as combinatorial permutations of antibiotics at different concentrations must be assessed. However, we found that our real-time assay combined with automated, digital dispensing technology permitted facile synergy testing. Using these approaches, we were able to systematically survey action of a large number of antimicrobials alone and in combination against the intracellular pathogen, Legionella pneumophila. PMID:27911388
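
    The abstract notes that synergy testing normally requires checkerboard-style combinations of antimicrobials at different concentrations. One conventional way to summarise such a checkerboard (not described in the abstract; added here purely for illustration) is the fractional inhibitory concentration (FIC) index, sketched below.

```python
def fic_index(mic_a_alone, mic_b_alone, mic_a_combo, mic_b_combo):
    """Fractional inhibitory concentration index for a two-drug combination.

    MIC values must share units; by convention an index <= 0.5 is read as
    synergy and > 4 as antagonism, with intermediate values as indifference.
    """
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone
```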

  1. High-throughput screening (HTS) and modeling of the retinoid ...

    EPA Pesticide Factsheets

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  2. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    EPA Science Inventory

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  3. High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes

    USDA-ARS?s Scientific Manuscript database

    High-throughput SNP genotyping provides a rapid way of developing a resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...

  4. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    EPA Science Inventory

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons, National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  5. A quantitative literature-curated gold standard for kinase-substrate pairs

    PubMed Central

    2011-01-01

    We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431

  6. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    PubMed

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., prepare, process, and evaluate samples in parallel), and miniaturization (i.e., reduce sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, and surface defects, among others, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  7. Outlook for Development of High-throughput Cryopreservation for Small-bodied Biomedical Model Fishes★

    PubMed Central

    Tiersch, Terrence R.; Yang, Huiping; Hu, E.

    2011-01-01

    With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts of and needs for high-throughput technology for model fishes, proposes approaches for technical development, and outlines future directions of this approach. PMID:21440666

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heusinkveld, Harm J.; Westerink, Remco H.S., E-mail: R.Westerink@uu.nl

    Calcium plays a crucial role in virtually all cellular processes, including neurotransmission. The intracellular Ca²⁺ concentration ([Ca²⁺]i) is therefore an important readout in neurotoxicological and neuropharmacological studies. Consequently, there is an increasing demand for high-throughput measurements of [Ca²⁺]i, e.g. using multi-well microplate readers, in hazard characterization, human risk assessment and drug development. However, changes in [Ca²⁺]i are highly dynamic, thereby creating challenges for high-throughput measurements. Nonetheless, several protocols are now available for real-time kinetic measurement of [Ca²⁺]i in plate reader systems, though the results of such plate reader-based measurements have been questioned. In view of the increasing use of plate reader systems for measurements of [Ca²⁺]i, a careful evaluation of current technologies is warranted. We therefore performed an extensive set of experiments, using two cell lines (PC12 and B35) and two fluorescent calcium-sensitive dyes (Fluo-4 and Fura-2), for comparison of a linear plate reader system with single cell fluorescence microscopy. Our data demonstrate that the use of plate reader systems for high-throughput real-time kinetic measurements of [Ca²⁺]i is associated with many pitfalls and limitations, including erroneous sustained increases in fluorescence, limited sensitivity and lack of single cell resolution. Additionally, our data demonstrate that probenecid, which is often used to prevent dye leakage, effectively inhibits the depolarization-evoked increase in [Ca²⁺]i. Overall, the data indicate that the use of current plate reader-based strategies for high-throughput real-time kinetic measurements of [Ca²⁺]i is associated with caveats and limitations that require further investigation. Research Highlights: (1) The use of plate readers for high-throughput screening of intracellular Ca²⁺ is associated with many pitfalls and limitations. (2) Single cell fluorescence microscopy is recommended for measurements of intracellular Ca²⁺. (3) Dual-wavelength dyes (Fura-2) are preferred over single-wavelength dyes (Fluo-4) for measurements of intracellular Ca²⁺. (4) Probenecid prevents dye leakage but abolishes depolarization-evoked Ca²⁺ influx, severely hampering measurements of Ca²⁺. (5) In general, care should be taken when interpreting data from high-throughput kinetic measurements.
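
    For the ratiometric dye (Fura-2) compared above, free intracellular calcium is commonly back-calculated from the 340/380 nm excitation ratio with the Grynkiewicz equation. The sketch below is that generic conversion, not the authors' protocol; the calibration constants are assumed to be measured separately.

```python
def calcium_from_fura2_ratio(R, R_min, R_max, Kd, beta):
    """Grynkiewicz conversion of a Fura-2 340/380 ratio to [Ca2+]i.

    R      : measured 340/380 fluorescence ratio
    R_min  : ratio at zero calcium (calibration)
    R_max  : ratio at saturating calcium (calibration)
    Kd     : Fura-2 dissociation constant (sets the units of the result)
    beta   : F380 at zero calcium divided by F380 at saturating calcium
    """
    return Kd * beta * (R - R_min) / (R_max - R)
```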

  9. Enabling a high throughput real time data pipeline for a large radio telescope array with GPUs

    NASA Astrophysics Data System (ADS)

    Edgar, R. G.; Clark, M. A.; Dale, K.; Mitchell, D. A.; Ord, S. M.; Wayth, R. B.; Pfister, H.; Greenhill, L. J.

    2010-10-01

    The Murchison Widefield Array (MWA) is a next-generation radio telescope currently under construction in the remote Western Australia Outback. Raw data will be generated continuously at 5 GiB s⁻¹, grouped into 8 s cadences. This high throughput motivates the development of on-site, real time processing and reduction in preference to archiving, transport and off-line processing. Each batch of 8 s data must be completely reduced before the next batch arrives. Maintaining real time operation will require a sustained performance of around 2.5 TFLOP s⁻¹ (including convolutions, FFTs, interpolations and matrix multiplications). We describe a scalable heterogeneous computing pipeline implementation, exploiting both the high computing density and FLOP-per-Watt ratio of modern GPUs. The architecture is highly parallel within and across nodes, with all major processing elements performed by GPUs. Necessary scatter-gather operations along the pipeline are loosely synchronized between the nodes hosting the GPUs. The MWA will be a frontier scientific instrument and a pathfinder for planned peta- and exa-scale facilities.

  10. Turbocharged molecular discovery of OLED emitters: from high-throughput quantum simulation to highly efficient TADF devices

    NASA Astrophysics Data System (ADS)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Ha, Dong-Gwang; Einzinger, Markus; Wu, Tony; Baldo, Marc A.; Aspuru-Guzik, Alán.

    2016-09-01

    Discovering new OLED emitters requires many experiments to synthesize candidates and test performance in devices. Large-scale computer simulation can greatly speed this search process, but the problem remains challenging enough that brute force application of massive computing power is not enough to successfully identify novel structures. We report a successful High Throughput Virtual Screening study that leveraged a range of methods to optimize the search process. The generation of candidate structures was constrained to contain the combinatorial explosion. Simulations were tuned to the specific problem and calibrated with experimental results. Experimentalists and theorists actively collaborated such that experimental feedback was regularly utilized to update and shape the computational search. Supervised machine learning methods prioritized candidate structures prior to quantum chemistry simulation to prevent wasting compute on likely poor performers. With this combination of techniques, each multiplying the strength of the search, this effort managed to navigate an area of molecular space and identify hundreds of promising OLED candidate structures. An experimentally validated selection of this set shows emitters with external quantum efficiencies as high as 22%.

  11. Analysis of high-throughput biological data using their rank values.

    PubMed

    Dembélé, Doulaye

    2018-01-01

    High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature become more specialized and often require high computational resources. Here, we propose a new versatile method based on the data-ordering rank values. We use linear algebra, the Perron-Frobenius theorem and also extend a method presented earlier for searching differentially expressed genes for the detection of recurrent copy number aberrations. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is to our knowledge the only one that applies to gene expression profiling and to cytogenetics data sets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the data set. Stability scores are also introduced as quality parameters. The performance and comparative analyses were carried out using real data sets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros.
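
    A minimal sketch of a rank-value one-sample t-test of the general kind described above: rank each column of a genes-by-comparisons matrix, convert to rank fractions, and test each gene's fractions against the null mid-rank of 0.5. This is an illustrative reading of the approach, not the fcros implementation.

```python
import numpy as np
from scipy import stats

def rank_value_test(matrix):
    """One-sample t-test on rank fractions, per gene (rows = genes, columns = comparisons)."""
    n_genes = matrix.shape[0]
    ranks = np.apply_along_axis(stats.rankdata, 0, matrix)  # rank genes within each column
    fractions = ranks / (n_genes + 1)                       # rank fractions in (0, 1)
    t_stat, p_val = stats.ttest_1samp(fractions, popmean=0.5, axis=1)
    return t_stat, p_val
```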

  12. High Performance Processors for Space Environments: A Subproject of the NASA Exploration Missions Systems Directorate "Radiation Hardened Electronics for Space Environments" Technology Development Program

    NASA Technical Reports Server (NTRS)

    Johnson, M.; Label, K.; McCabe, J.; Powell, W.; Bolotin, G.; Kolawa, E.; Ng, T.; Hyde, D.

    2007-01-01

    Implementation of challenging Exploration Systems Missions Directorate objectives and strategies can be constrained by onboard computing capabilities and power efficiencies. The Radiation Hardened Electronics for Space Environments (RHESE) High Performance Processors for Space Environments project will address this challenge by significantly advancing the sustained throughput and processing efficiency of high-performance radiation-hardened processors, targeting delivery of products by the end of FY12.

  13. DoD High Performance Computing Modernization Program Users Group Conference (HPCMP UGC 2011) Held in Portland, Oregon on June 20-23, 2011

    DTIC Science & Technology

    2011-06-01

    4. Conclusion: The Web-based AGeS system described in this paper is a computationally-efficient and scalable system for high-throughput genome... method for protecting web services involves making them more resilient to attack using autonomic computing techniques. This paper presents our initial... 20–23, 2011. 2011 DoD High Performance Computing Modernization Program Users Group Conference (HPCMP UGC 2011). The papers in this book comprise the...

  14. High performance hybrid magnetic structure for biotechnology applications

    DOEpatents

    Humphries, David E [El Cerrito, CA; Pollard, Martin J [El Cerrito, CA; Elkin, Christopher J [San Ramon, CA

    2009-02-03

    The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides means for separation and other biotechnology applications involving holding, manipulation, or separation of magnetic or magnetizable molecular structures and targets. Also disclosed are further improvements to aspects of the hybrid magnetic structure, including additional elements and for adapting the use of the hybrid magnetic structure for use in biotechnology and high throughput processes.

  15. Cyberinfrastructure and Scientific Collaboration: Application of a Virtual Team Performance Framework with Potential Relevance to Education. WCER Working Paper No. 2010-12

    ERIC Educational Resources Information Center

    Kraemer, Sara; Thorn, Christopher A.

    2010-01-01

    The purpose of this exploratory study was to identify and describe some of the dimensions of scientific collaborations using high throughput computing (HTC) through the lens of a virtual team performance framework. A secondary purpose was to assess the viability of using a virtual team performance framework to study scientific collaborations using…

  16. Alignment of high-throughput sequencing data inside in-memory databases.

    PubMed

    Firnkorn, Daniel; Knaup-Gregori, Petra; Lorenzo Bermejo, Justo; Ganzinger, Matthias

    2014-01-01

    In times of high-throughput DNA sequencing techniques, high-performance analysis of DNA sequences is of great importance. Computer-supported DNA analysis is still an intensive, time-consuming task. In this paper we explore the potential of a new In-Memory database technology by using SAP's High Performance Analytic Appliance (HANA). We focus on read alignment as one of the first steps in DNA sequence analysis. In particular, we examined the widely used Burrows-Wheeler Aligner (BWA) and implemented stored procedures in both HANA and the free database system MySQL to compare execution time and memory management. To ensure that the results are comparable, MySQL has been running in memory as well, utilizing its integrated memory engine for database table creation. We implemented stored procedures containing exact and inexact searching of DNA reads within the reference genome GRCh37. Due to technical restrictions in SAP HANA concerning recursion, the inexact matching problem could not be implemented on this platform. Hence, performance analysis between HANA and MySQL was made by comparing the execution time of the exact search procedures. Here, HANA was approximately 27 times faster than MySQL, which means that there is high potential within the new In-Memory concepts, leading to further developments of DNA analysis procedures in the future.

  17. A Mathematical Approach for Compiling and Optimizing Hardware Implementations of DSP Transforms

    DTIC Science & Technology

    2010-08-01

    [Figure residue from the original report: plots of DFT 64 (floating point) on a Xilinx Virtex-6 FPGA showing throughput (billion samples per second) and performance (Gflop/s) versus area (slices); only axis labels and tick values survived extraction.]

  18. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and a quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse-quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
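
    A minimal sketch of the measurement step only (major and minor axial lengths and an ellipsoid-based volume) using scikit-image region properties. The Otsu threshold stands in for SpheroidSizer's active-contour segmentation and the pixel size is an assumed input; this is not the published implementation.

```python
import numpy as np
from skimage import filters, measure

def spheroid_size(image, pixel_size_um=1.0):
    """Return (major_um, minor_um, volume_um3) for the largest object in a grayscale image."""
    mask = image > filters.threshold_otsu(image)        # placeholder segmentation
    labels = measure.label(mask)
    props = max(measure.regionprops(labels), key=lambda r: r.area)
    major = props.major_axis_length * pixel_size_um
    minor = props.minor_axis_length * pixel_size_um
    volume = np.pi / 6.0 * major * minor ** 2           # prolate spheroid: (pi/6) * L * W^2
    return major, minor, volume
```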

  19. Diffraction efficiency of radially-profiled off-plane reflection gratings

    NASA Astrophysics Data System (ADS)

    Miles, Drew M.; Tutt, James H.; DeRoo, Casey T.; Marlowe, Hannah; Peterson, Thomas J.; McEntaffer, Randall L.; Menz, Benedikt; Burwitz, Vadim; Hartner, Gisela; Laubis, Christian; Scholze, Frank

    2015-09-01

    Future X-ray missions will require gratings with high throughput and high spectral resolution. Blazed off-plane reflection gratings are capable of meeting these demands. A blazed grating profile optimizes grating efficiency, providing higher throughput to one side of zero-order on the arc of diffraction. This paper presents efficiency measurements made in the 0.3 - 1.5 keV energy band at the Physikalisch-Technische Bundesanstalt (PTB) BESSY II facility for three holographically-ruled gratings, two of which are blazed. Each blazed grating was tested in both the Littrow and anti-Littrow configurations in order to test the alignment sensitivity of these gratings with regard to throughput. This paper outlines the procedure of the grating experiment performed at BESSY II and discusses the resulting efficiency measurements across various energies. Experimental results are generally consistent with theory and demonstrate that the blaze does increase throughput to one side of zero-order. However, the total efficiency of the non-blazed, sinusoidal grating is greater than that of the blazed gratings, which suggests that the method of manufacturing these blazed profiles fails to produce facets with the desired level of precision. Finally, evidence of a successful blaze implementation from first diffraction results of prototype blazed gratings produced via a new fabrication technique at the University of Iowa is presented.

  20. High-throughput process development: I. Process chromatography.

    PubMed

    Rathore, Anurag S; Bhambure, Rahul

    2014-01-01

    Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare). This device is available in 96-well format with 2 or 6 μL well size. We also discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the data is indeed very significant (regression coefficient 0.93). We think that this protocol will be of significant value to those involved in performing high-throughput process development of process chromatography.

  1. DDBJ read annotation pipeline: a cloud computing-based pipeline for high-throughput analysis of next-generation sequencing data.

    PubMed

    Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu

    2013-08-01

    High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/.

  2. DDBJ Read Annotation Pipeline: A Cloud Computing-Based Pipeline for High-Throughput Analysis of Next-Generation Sequencing Data

    PubMed Central

    Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu

    2013-01-01

    High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/. PMID:23657089

  3. Large-scale protein-protein interactions detection by integrating big biosensing data with computational model.

    PubMed

    You, Zhu-Hong; Li, Shuai; Gao, Xin; Luo, Xin; Ji, Zhen

    2014-01-01

    Protein-protein interactions (PPIs) are the basis of biological functions, and studying these interactions on a molecular level is of crucial importance for understanding the functionality of a living cell. During the past decade, biosensors have emerged as an important tool for the high-throughput identification of proteins and their interactions. However, the high-throughput experimental methods for identifying PPIs are both time-consuming and expensive. On the other hand, high-throughput PPI data are often associated with high false-positive and high false-negative rates. To address these problems, we propose a method for PPI detection by integrating biosensor-based PPI data with a novel computational model. This method was developed based on the extreme learning machine algorithm combined with a novel protein sequence descriptor representation. When performed on the large-scale human protein interaction dataset, the proposed method achieved 84.8% prediction accuracy with 84.08% sensitivity at a specificity of 85.53%. We conducted more extensive experiments to compare the proposed method with a state-of-the-art technique, the support vector machine (SVM). The achieved results demonstrate that our approach is very promising for detecting new PPIs, and it can be a helpful supplement for biosensor-based PPI data detection.
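
    A minimal sketch of an extreme learning machine classifier of the kind the abstract combines with a protein sequence descriptor: a random hidden layer followed by a closed-form least-squares output layer. The featurisation step is omitted and all names here are illustrative assumptions.

```python
import numpy as np

class ExtremeLearningMachine:
    """Single-hidden-layer ELM: random hidden weights, least-squares output weights."""

    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        """X: (n_pairs, n_features) descriptors; y: 0/1 interaction labels."""
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)           # hidden-layer activations
        self.beta = np.linalg.pinv(H) @ y          # closed-form output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta > 0.5).astype(int)   # binary interaction call
```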

  4. Enhancing high throughput toxicology - development of putative adverse outcome pathways linking US EPA ToxCast screening targets to relevant apical hazards.

    EPA Science Inventory

    High throughput toxicology programs, such as ToxCast and Tox21, have provided biological effects data for thousands of chemicals at multiple concentrations. Compared to traditional, whole-organism approaches, high throughput assays are rapid and cost-effective, yet they generall...

  5. Evaluation of High-Throughput Chemical Exposure Models via Analysis of Matched Environmental and Biological Media Measurements

    EPA Science Inventory

    The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...

  6. [Current applications of high-throughput DNA sequencing technology in antibody drug research].

    PubMed

    Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong

    2012-03-01

    Since the publication in 2005 of a high-throughput DNA sequencing technology based on PCR carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.

  7. Application of visual basic in high-throughput mass spectrometry-directed purification of combinatorial libraries.

    PubMed

    Li, B; Chan, E C Y

    2003-01-01

    We present an approach to customize the sample submission process for high-throughput purification (HTP) of combinatorial parallel libraries using preparative liquid chromatography electrospray ionization mass spectrometry. In this study, Visual Basic and Visual Basic for Applications programs were developed using Microsoft Visual Basic 6 and Microsoft Excel 2000, respectively. These programs are subsequently applied for the seamless electronic submission and handling of data for HTP. Functions were incorporated into these programs so that medicinal chemists can perform on-line verification of the purification status and on-line retrieval of postpurification data. The application of these user-friendly and cost-effective programs in our HTP technology has greatly increased our work efficiency by reducing paperwork and manual manipulation of data.

  8. A high-throughput screen for mitochondrial function reveals known and novel mitochondrial toxicants in a library of environmental agents

    PubMed Central

    Datta, Sandipan; Sahdeo, Sunil; Gray, Jennifer A.; Morriseau, Christophe; Hammock, Bruce D.; Cortopassi, Gino

    2016-01-01

    Mitochondrial toxicity is emerging as a major mechanism underlying serious human health consequences. This work performs a high-throughput screen (HTS) of 176 environmental chemicals for mitochondrial toxicity utilizing a previously reported biosensor platform. This established HTS confirmed known mitochondrial toxins and identified novel mitochondrial uncouplers such as 2,2′-methylenebis(4-chlorophenol) and pentachlorophenol. It also identified a mitochondrial ‘structure-activity relationship’ (SAR) in the sense that multiple environmental chlorophenols are mitochondrial inhibitors and uncouplers. This study demonstrates proof-of-concept that a mitochondrial HTS assay detects known and novel environmental mitotoxicants, and could be used to quickly evaluate human health risks from mitotoxicants in the environment. PMID:27717841

  9. Fast liquid chromatography combined with mass spectrometry for the analysis of metabolites and proteins in human body fluids.

    PubMed

    Kortz, Linda; Helmschrodt, Christin; Ceglarek, Uta

    2011-03-01

    In the last decade various analytical strategies have been established to enhance separation speed and efficiency in high performance liquid chromatography applications. Chromatographic supports based on monolithic material, small porous particles, and porous layer beads have been developed and commercialized to improve throughput and separation efficiency. This paper provides an overview of current developments in fast chromatography combined with mass spectrometry for the analysis of metabolites and proteins in clinical applications. Advances and limitations of fast chromatography for the combination with mass spectrometry are discussed. Practical aspects of, recent developments in, and the present status of high-throughput analysis of human body fluids for therapeutic drug monitoring, toxicology, clinical metabolomics, and proteomics are presented.

  10. Reconstruction of metabolic networks from high-throughput metabolite profiling data: in silico analysis of red blood cell metabolism.

    PubMed

    Nemenman, Ilya; Escola, G Sean; Hlavacek, William S; Unkefer, Pat J; Unkefer, Clifford J; Wall, Michael E

    2007-12-01

    We investigate the ability of algorithms developed for reverse engineering of transcriptional regulatory networks to reconstruct metabolic networks from high-throughput metabolite profiling data. For benchmarking purposes, we generate synthetic metabolic profiles based on a well-established model for red blood cell metabolism. A variety of data sets are generated, accounting for different properties of real metabolic networks, such as experimental noise, metabolite correlations, and temporal dynamics. These data sets are made available online. We use ARACNE, a mainstream algorithm for reverse engineering of transcriptional regulatory networks from gene expression data, to predict metabolic interactions from these data sets. We find that the performance of ARACNE on metabolic data is comparable to that on gene expression data.
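
    A minimal sketch of the two core steps of an ARACNE-style reconstruction applied to a metabolite-by-sample matrix: pairwise mutual information from binned profiles, then data-processing-inequality pruning of the weakest edge in each triplet. This is a generic re-implementation for illustration, not the ARACNE code used in the study.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Mutual information (in nats) between two profiles via a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

def aracne_network(data, bins=10, eps=0.0):
    """data: (n_metabolites, n_samples). Returns an MI matrix pruned by the DPI."""
    n = data.shape[0]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi[i, j] = mi[j, i] = mutual_information(data[i], data[j], bins)
    keep = mi.copy()
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(n):
                if k not in (i, j) and mi[i, j] < min(mi[i, k], mi[j, k]) - eps:
                    keep[i, j] = keep[j, i] = 0.0   # weakest edge of the triplet
                    break
    return keep
```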

  11. When and Why Threats Go Undetected: Impacts of Event Rate and Shift Length on Threat Detection Accuracy During Airport Baggage Screening.

    PubMed

    Meuter, Renata F I; Lacherez, Philippe F

    2016-03-01

    We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. We examined threat detection in baggage screeners as a function of event rate (i.e., number of bags per minute) and time on task across 4 months. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of these busy screening shifts only. Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety. © 2015, Human Factors and Ergonomics Society.

  12. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    PubMed Central

    Luft, JR; Snell, EH; DeTitta, GT

    2011-01-01

    Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  13. High-throughput screening based on label-free detection of small molecule microarrays

    NASA Astrophysics Data System (ADS)

    Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong

    2017-02-01

    Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput preliminary drug screening platform based on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this platform, the OI-RD scanner serves as a label-free detection instrument that is able to monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening through the immobilization of tens of thousands of different compounds on a single phenyl-isocyanate-functionalized glass slide. Using this platform, we sequentially screened five target proteins (purified target proteins or cell lysates containing the target protein) in a high-throughput, label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by subsequent functional assays. Compared to traditional high-throughput screening assays, the platform has many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening, and it has great potential to serve as a complementary screening platform in the field of drug discovery.

  14. Experimental demonstration of a real-time high-throughput digital DC blocker for compensating ADC imperfections in optical fast-OFDM receivers.

    PubMed

    Zhang, Lu; Ouyang, Xing; Shao, Xiaopeng; Zhao, Jian

    2016-06-27

    Performance degradation induced by the DC components at the output of a real-time analogue-to-digital converter (ADC) is experimentally investigated for an optical fast-OFDM receiver. To compensate for this degradation, register transfer level (RTL) circuits for a real-time digital DC blocker with 20 GS/s throughput are proposed and implemented in a field programmable gate array (FPGA). The performance of the proposed real-time digital DC blocker is experimentally investigated in a 15 Gb/s optical fast-OFDM system with intensity modulation and direct detection over 40 km of standard single-mode fibre. The results show that the fixed-point DC blocker has negligible performance penalty compared to the offline floating-point one, and can overcome the error floor of the fast-OFDM receiver caused by the DC components from the real-time ADC output.
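
    A floating-point software model of a first-order digital DC blocker of the kind realised here in RTL (the standard recursion y[n] = x[n] - x[n-1] + a*y[n-1]). The pole value is an assumed example; the actual design described above is a parallel fixed-point hardware circuit, not this sequential code.

```python
import numpy as np

def dc_blocker(x, a=0.995):
    """First-order IIR DC blocker: y[n] = x[n] - x[n-1] + a * y[n-1]."""
    y = np.zeros(len(x), dtype=float)
    prev_x, prev_y = 0.0, 0.0
    for n, sample in enumerate(x):
        prev_y = sample - prev_x + a * prev_y
        prev_x = sample
        y[n] = prev_y
    return y
```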

  15. High-throughput analysis of yeast replicative aging using a microfluidic system

    PubMed Central

    Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong

    2015-01-01

    Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cells, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317

  16. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS covering formats (on- and off-line), sorbents, and experimental protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Low-Rank Coal Grinding Performance Versus Power Plant Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajive Ganguli; Sukumar Bandopadhyay

    2008-12-31

    The intent of this project was to demonstrate that Alaskan low-rank coal, which is high in volatile content, need not be ground as fine as bituminous coal (typically low in volatile content) for optimum combustion in power plants. The grind or particle size distribution (PSD), which is quantified by percentage of pulverized coal passing 74 microns (200 mesh), affects the pulverizer throughput in power plants. The finer the grind, the lower the throughput. For a power plant to maintain combustion levels, throughput needs to be high. The problem of particle size is compounded for Alaskan coal since it has a low Hardgrove grindability index (HGI); that is, it is difficult to grind. If the thesis of this project is demonstrated, then Alaskan coal need not be ground to the industry standard, thereby alleviating somewhat the low HGI issue (and, hopefully, furthering the salability of Alaskan coal). This project studied the relationship between PSD and power plant efficiency, emissions, and mill power consumption for low-rank high-volatile-content Alaskan coal. The emissions studied were CO, CO₂, NOₓ, SO₂, and Hg (only two tests). The tested PSD range was 42 to 81 percent passing 76 microns. Within the tested range, there was very little correlation between PSD and power plant efficiency, CO, NOₓ, and SO₂. Hg emissions were very low and, therefore, did not allow comparison between grind sizes. Mill power consumption was lower for coarser grinds.

  18. Advantages and application of label-free detection assays in drug screening.

    PubMed

    Cunningham, Brian T; Laing, Lance G

    2008-08-01

    Adoption is accelerating for a new family of label-free optical biosensors incorporated into standard format microplates owing to their ability to enable highly sensitive detection of small molecules, proteins and cells for high-throughput drug discovery applications. Label-free approaches are displacing other detection technologies owing to their ability to provide simple assay procedures for hit finding/validation, accessing difficult target classes, screening the interaction of cells with drugs and analyzing the affinity of small molecule inhibitors to target proteins. This review describes several new drug discovery applications that are under development for microplate-based photonic crystal optical biosensors and the key issues that will drive adoption of the technology. Microplate-based optical biosensors are enabling a variety of cell-based assays, inhibition assays, protein-protein binding assays and protein-small molecule binding assays to be performed with high-throughput and high sensitivity.

  19. Advanced phenotyping and phenotype data analysis for the study of plant growth and development.

    PubMed

    Rahaman, Md Matiur; Chen, Dijun; Gillani, Zeeshan; Klukas, Christian; Chen, Ming

    2015-01-01

    Due to an increase in the consumption of food, feed, fuel and to meet global food security needs for the rapidly growing human population, there is a necessity to breed high yielding crops that can adapt to the future climate changes, particularly in developing countries. To solve these global challenges, novel approaches are required to identify quantitative phenotypes and to explain the genetic basis of agriculturally important traits. These advances will facilitate the screening of germplasm with high performance characteristics in resource-limited environments. Recently, plant phenomics has offered and integrated a suite of new technologies, and we are on a path to improve the description of complex plant phenotypes. High-throughput phenotyping platforms have also been developed that capture phenotype data from plants in a non-destructive manner. In this review, we discuss recent developments of high-throughput plant phenotyping infrastructure including imaging techniques and corresponding principles for phenotype data analysis.

  20. Xi-cam: a versatile interface for data visualization and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  1. Xi-cam: a versatile interface for data visualization and analysis

    DOE PAGES

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...

    2018-05-31

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  2. Energy efficient strategy for throughput improvement in wireless sensor networks.

    PubMed

    Jabbar, Sohail; Minhas, Abid Ali; Imran, Muhammad; Khalid, Shehzad; Saleem, Kashif

    2015-01-23

    Network lifetime and throughput are among the prime concerns while designing routing protocols for wireless sensor networks (WSNs). However, most of the existing schemes are geared towards either prolonging network lifetime or improving throughput. This paper presents an energy-efficient routing scheme for throughput improvement in WSNs. The proposed scheme exploits a multilayer cluster design for energy-efficient forwarding node selection, cluster head rotation, and both inter- and intra-cluster routing. To improve throughput, we rotate the role of cluster head among various nodes based on two threshold levels, which reduces the number of dropped packets. We conducted simulations in the NS2 simulator to validate the performance of the proposed scheme. Simulation results demonstrate the performance efficiency of the proposed scheme in terms of various metrics compared to similar approaches published in the literature.

  3. Energy Efficient Strategy for Throughput Improvement in Wireless Sensor Networks

    PubMed Central

    Jabbar, Sohail; Minhas, Abid Ali; Imran, Muhammad; Khalid, Shehzad; Saleem, Kashif

    2015-01-01

    Network lifetime and throughput are among the prime concerns when designing routing protocols for wireless sensor networks (WSNs). However, most existing schemes are geared either towards prolonging network lifetime or towards improving throughput. This paper presents an energy-efficient routing scheme for throughput improvement in WSNs. The proposed scheme exploits a multilayer cluster design for energy-efficient forwarding-node selection, cluster-head rotation, and both inter- and intra-cluster routing. To improve throughput, we rotate the role of cluster head among various nodes based on two threshold levels, which reduces the number of dropped packets. We conducted simulations in the NS2 simulator to validate the performance of the proposed scheme. Simulation results demonstrate its efficiency in terms of various metrics compared to similar approaches published in the literature. PMID:25625902

  4. CA resist with high sensitivity and sub-100-nm resolution for advanced mask making

    NASA Astrophysics Data System (ADS)

    Huang, Wu-Song; Kwong, Ranee W.; Hartley, John G.; Moreau, Wayne M.; Angelopoulos, Marie; Magg, Christopher; Lawliss, Mark

    2000-07-01

    Recently, there has been significant interest in using CA resists for electron beam (E-beam) applications, including mask making, direct write, and projection printing. CA resists provide superior lithographic performance compared with traditional non-CA E-beam resists, in particular high contrast, resolution, and sensitivity. However, most commercially available CA resists raise concerns about airborne base contaminants and sensitivity to PAB and/or PEB temperatures. In this presentation, we discuss a new improved ketal resist system, referred to as KRS-XE, which exhibits excellent lithography, is robust toward airborne base, is compatible with 0.263 N TMAH aqueous developer, and exhibits a large PAB/PEB latitude. With the combination of a high-performance mask-making E-beam exposure tool, the high-kV shaped-beam system EL4+, and the KRS-XE resist, we have printed 75 nm line/space features with excellent profile control at a dose of 13 µC/cm2 at 75 kV. The shaped-beam vector-scan system used here provides unique capability in resolving small features and in throughput. Overhead in EL4+ limits the system's ability to fully exploit the sensitivity of the new resist for throughput. The EL5 system has sufficiently low overhead that it is projected to print a 4X, 16G DRAM mask with OPC in under 3 hours with the CA resist. We will discuss the throughput advantages of the next-generation EL5 system over the existing EL4+.

  5. A high-throughput fluorescence polarization assay for inhibitors of gyrase B.

    PubMed

    Glaser, Bryan T; Malerich, Jeremiah P; Duellman, Sarah J; Fong, Julie; Hutson, Christopher; Fine, Richard M; Keblansky, Boris; Tang, Mary J; Madrid, Peter B

    2011-02-01

    DNA gyrase, a type II topoisomerase that introduces negative supercoils into DNA, is a validated antibacterial drug target. The holoenzyme is composed of 2 subunits, gyrase A (GyrA) and gyrase B (GyrB), which form a functional A(2)B(2) heterotetramer required for bacterial viability. A novel fluorescence polarization (FP) assay has been developed and optimized to detect inhibitors that bind to the adenosine triphosphate (ATP) binding domain of GyrB. Guided by the crystal structure of the natural product novobiocin bound to GyrB, a novel novobiocin-Texas Red probe (Novo-TRX) was designed and synthesized for use in a high-throughput FP assay. The binding kinetics of the interaction of Novo-TRX with GyrB from Francisella tularensis has been characterized, as well as the effect of common buffer additives on the interaction. The assay was developed into a 21-µL, 384-well assay format and has been validated for use in high-throughput screening against a collection of Food and Drug Administration-approved compounds. The assay performed with an average Z' factor of 0.80 and was able to identify GyrB inhibitors from a screening library.
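
    The Z' factor quoted above is a standard assay-quality statistic; the sketch below shows how it is computed from positive- and negative-control wells, using made-up fluorescence polarization readings rather than data from the study:

        import statistics

        def z_prime(positives, negatives):
            """Z' factor for assay quality (Zhang et al., 1999)."""
            mu_p, mu_n = statistics.mean(positives), statistics.mean(negatives)
            sd_p, sd_n = statistics.stdev(positives), statistics.stdev(negatives)
            return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

        # Illustrative FP readings (mP), not data from the study:
        pos = [180, 178, 182, 179, 181]   # probe fully bound (no inhibitor)
        neg = [60, 62, 58, 61, 59]        # probe displaced (saturating inhibitor)
        # Z' > 0.5 indicates an excellent assay; the study reports an average Z' of 0.80.
        print(round(z_prime(pos, neg), 2))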

  6. Proteomic Analysis of Metabolic Responses to Biofuels and Chemicals in Photosynthetic Cyanobacteria.

    PubMed

    Sun, T; Chen, L; Zhang, W

    2017-01-01

    Recent progress in various "omics" technologies has enabled quantitative measurement of biological molecules in a high-throughput manner. Among them, high-throughput proteomics is a rapidly advancing field that offers a new means to quantify metabolic changes at the protein level, which has significantly facilitated our understanding of cellular processes such as protein synthesis, posttranslational modification, and degradation in response to environmental perturbations. Cyanobacteria are autotrophic prokaryotes that perform oxygenic photosynthesis and have recently attracted significant attention as a promising alternative to traditional biomass-based "microbial cell factories" for producing green fuels and chemicals. However, early studies have shown that low tolerance to toxic biofuels and chemicals is a major hurdle to further improving the productivity of cyanobacterial production systems. To address this issue, the metabolic responses of cyanobacterial cells to toxic end-products, and their regulation, need to be defined. In this chapter, we discuss recent progress in interpreting cyanobacterial responses to biofuels and chemicals using high-throughput proteomics approaches, aiming to provide insights and guidelines on how to enhance tolerance and productivity of biofuels or chemicals in renewable cyanobacterial systems in the future. © 2017 Elsevier Inc. All rights reserved.

  7. Cytopathological image analysis using deep-learning networks in microfluidic microscopy.

    PubMed

    Gopakumar, G; Hari Babu, K; Mishra, Deepak; Gorthi, Sai Siva; Sai Subrahmanyam, Gorthi R K

    2017-01-01

    Cytopathologic testing is one of the most critical steps in the diagnosis of diseases, including cancer. However, the task is laborious and demands skill, and the associated high cost and low throughput have drawn considerable interest to automating the testing process. Several neural network architectures have been designed to provide human expertise to machines. In this paper, we explore the feasibility of using deep-learning networks for cytopathologic analysis by performing the classification of three important unlabeled, unstained leukemia cell lines (K562, MOLT, and HL60). The cell images used in the classification are captured using a low-cost, high-throughput cell imaging technique: microfluidics-based imaging flow cytometry. We demonstrate that, without any conventional fine segmentation followed by explicit feature extraction, the proposed deep-learning algorithms effectively classify the coarsely localized cell lines. We show that the designed deep belief network as well as the deeply pretrained convolutional neural network outperform conventionally used decision systems, which is important in the medical domain, where the availability of labeled data for training is limited. We hope that our work enables the development of a clinically significant high-throughput microfluidic microscopy-based tool for disease screening/triaging, especially in resource-limited settings.

  8. High-throughput two-dimensional root system phenotyping platform facilitates genetic analysis of root growth and development.

    PubMed

    Clark, Randy T; Famoso, Adam N; Zhao, Keyan; Shaff, Jon E; Craft, Eric J; Bustamante, Carlos D; McCouch, Susan R; Aneshansley, Daniel J; Kochian, Leon V

    2013-02-01

    High-throughput phenotyping of root systems requires a combination of specialized techniques and adaptable plant growth, root imaging and software tools. A custom phenotyping platform was designed to capture images of whole root systems, and novel software tools were developed to process and analyse these images. The platform and its components are adaptable to a wide range of root phenotyping studies using diverse growth systems (hydroponics, paper pouches, gel and soil) involving several plant species, including, but not limited to, rice, maize, sorghum, tomato and Arabidopsis. The RootReader2D software tool is free and publicly available and was designed with both user-guided and automated features that increase flexibility and enhance efficiency when measuring root growth traits from specific roots or entire root systems during large-scale phenotyping studies. To demonstrate the unique capabilities and high-throughput capacity of this phenotyping platform for studying root systems, genome-wide association studies on rice (Oryza sativa) and maize (Zea mays) root growth were performed and root traits related to aluminium (Al) tolerance were analysed on the parents of the maize nested association mapping (NAM) population. © 2012 Blackwell Publishing Ltd.

  9. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    PubMed

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments along with the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model that are stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
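
    As a sketch of the testing step, the snippet below computes a two-tailed empirical p-value for an observed neutrality statistic against a null distribution of the same statistic obtained from coalescent simulations; the Gaussian stand-in for the simulated values is purely illustrative (DnaSAM itself drives the ms program):

        import random

        def empirical_p_value(observed, simulated):
            """Two-tailed empirical p-value of an observed statistic (e.g., Tajima's D)
            against values simulated under a user-specified null model."""
            n = len(simulated)
            lower = sum(s <= observed for s in simulated) / n
            upper = sum(s >= observed for s in simulated) / n
            return 2 * min(lower, upper)

        # Stand-in for statistics computed from ms coalescent simulations:
        null_distribution = [random.gauss(0.0, 1.0) for _ in range(10000)]
        print(empirical_p_value(-1.8, null_distribution))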

  10. Towards high-throughput automated targeted femtosecond laser-based transfection of adherent cells

    NASA Astrophysics Data System (ADS)

    Antkowiak, Maciej; Torres-Mapa, Maria Leilani; Gunn-Moore, Frank; Dholakia, Kishan

    2011-03-01

    Femtosecond laser induced cell membrane poration has proven to be an attractive alternative to the classical methods of drug and gene delivery. It is a selective, sterile, non-contact technique that offers a highly localized operation, low toxicity and consistent performance. However, its broader application still requires the development of robust, high-throughput and user-friendly systems. We present a system capable of unassisted enhanced targeted optoinjection and phototransfection of adherent mammalian cells with a femtosecond laser. We demonstrate the advantages of a dynamic diffractive optical element, namely a spatial light modulator (SLM) for precise three dimensional positioning of the beam. It enables the implementation of a "point-and-shoot" system in which using the software interface a user simply points at the cell and a predefined sequence of precisely positioned doses can be applied. We show that irradiation in three axial positions alleviates the problem of exact beam positioning on the cell membrane and doubles the number of viably optoinjected cells when compared with a single dose. The presented system enables untargeted raster scan irradiation which provides transfection of adherent cells at the throughput of 1 cell per second.

  11. High-Throughput Fabrication of Flexible and Transparent All-Carbon Nanotube Electronics.

    PubMed

    Chen, Yong-Yang; Sun, Yun; Zhu, Qian-Bing; Wang, Bing-Wei; Yan, Xin; Qiu, Song; Li, Qing-Wen; Hou, Peng-Xiang; Liu, Chang; Sun, Dong-Ming; Cheng, Hui-Ming

    2018-05-01

    This study reports a simple and effective technique for the high-throughput fabrication of flexible all-carbon nanotube (CNT) electronics using a photosensitive dry film instead of traditional liquid photoresists. A 10 in. sized photosensitive dry film is laminated onto a flexible substrate by a roll-to-roll technology, and a 5 µm pattern resolution of the resulting CNT films is achieved for the construction of flexible and transparent all-CNT thin-film transistors (TFTs) and integrated circuits. The fabricated TFTs exhibit a desirable electrical performance including an on-off current ratio of more than 10^5, a carrier mobility of 33 cm^2 V^-1 s^-1, and a small hysteresis. The standard deviations of on-current and mobility are, respectively, 5% and 2% of the average value, demonstrating the excellent reproducibility and uniformity of the devices, which allows constructing a large noise margin inverter circuit with a voltage gain of 30. This study indicates that a photosensitive dry film is very promising for the low-cost, fast, reliable, and scalable fabrication of flexible and transparent CNT-based integrated circuits, and opens up opportunities for future high-throughput CNT-based printed electronics.

  12. Arabidopsis Seed Content QTL Mapping Using High-Throughput Phenotyping: The Assets of Near Infrared Spectroscopy

    PubMed Central

    Jasinski, Sophie; Lécureuil, Alain; Durandet, Monique; Bernard-Moulin, Patrick; Guerche, Philippe

    2016-01-01

    Seed storage compounds are of crucial importance for human diet, feed, and industrial uses. In oleo-proteaginous species like rapeseed, seed oil and protein are the qualitative determinants that confer economic value on the harvested seed. To date, although the biosynthesis pathways of oil and storage proteins are rather well known, the factors that determine how these types of reserves are partitioned in seeds remain to be identified. With the aim of implementing a quantitative genetics approach requiring the phenotyping of hundreds of plants, our first objective was to establish near-infrared reflectance spectroscopy (NIRS) predictive equations to estimate oil, protein, carbon, and nitrogen content in Arabidopsis seed at high throughput. Our results demonstrate that NIRS is a powerful, non-destructive, high-throughput method to assess the content of these four major components in Arabidopsis seed. With this tool in hand, we analyzed Arabidopsis natural variation for these four components and showed that they all display a wide range of variation. Finally, NIRS was used to map QTL for these four traits using seeds from the Arabidopsis thaliana Ct-1 × Col-0 recombinant inbred line population. Some QTL co-localized with QTL previously identified, but others mapped to chromosomal regions never before associated with such traits. This paper illustrates the usefulness of NIRS predictive equations for accurate high-throughput phenotyping of Arabidopsis seed content, opening new perspectives in gene identification following QTL mapping and genome-wide association studies. PMID:27891138
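
    NIRS calibration of this kind is commonly done with partial least squares regression; the sketch below, using synthetic spectra and scikit-learn rather than the authors' chemometrics software, illustrates fitting and cross-validating such a predictive equation:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Synthetic stand-ins: 200 seed samples x 700 NIR wavelengths, plus oil content (%).
        spectra = rng.normal(size=(200, 700))
        oil = spectra[:, 100] * 2.0 + spectra[:, 350] + rng.normal(scale=0.1, size=200)

        # Fit a PLS "predictive equation" and estimate its accuracy by cross-validation.
        model = PLSRegression(n_components=10)
        print(cross_val_score(model, spectra, oil, scoring="r2", cv=5).mean())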

  13. SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis.

    PubMed

    Tanabata, Takanari; Shibaya, Taeko; Hori, Kiyosumi; Ebana, Kaworu; Yano, Masahiro

    2012-12-01

    Seed shape and size are among the most important agronomic traits because they affect yield and market price. To obtain accurate seed size data, a large number of measurements are needed because there is little difference in size among seeds from one plant. To promote genetic analysis and selection for seed shape in plant breeding, efficient, reliable, high-throughput seed phenotyping methods are required. We developed SmartGrain software for high-throughput measurement of seed shape. This software uses a new image analysis method to reduce the time taken in the preparation of seeds and in image capture. Outlines of seeds are automatically recognized from digital images, and several shape parameters, such as seed length, width, area, and perimeter length, are calculated. To validate the software, we performed a quantitative trait locus (QTL) analysis for rice (Oryza sativa) seed shape using backcrossed inbred lines derived from a cross between japonica cultivars Koshihikari and Nipponbare, which showed small differences in seed shape. SmartGrain removed areas of awns and pedicels automatically, and several QTLs were detected for six shape parameters. The allelic effect of a QTL for seed length detected on chromosome 11 was confirmed in advanced backcross progeny; the cv Nipponbare allele increased seed length and, thus, seed weight. High-throughput measurement with SmartGrain reduced sampling error and made it possible to distinguish between lines with small differences in seed shape. SmartGrain could accurately recognize seed not only of rice but also of several other species, including Arabidopsis (Arabidopsis thaliana). The software is free to researchers.
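
    A minimal sketch of contour-based seed measurement in the spirit of SmartGrain (this is not the SmartGrain code itself): threshold the image, find each seed outline, and derive length, width, area and perimeter from the contour, here on a synthetic one-seed image:

        import cv2
        import numpy as np

        # Synthetic test image: one white "seed" (an ellipse) on a black background.
        img = np.zeros((200, 300), dtype=np.uint8)
        cv2.ellipse(img, (150, 100), (60, 25), 30, 0, 360, 255, -1)

        _, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

        for c in contours:
            (cx, cy), (w, h), angle = cv2.minAreaRect(c)
            length, width = max(w, h), min(w, h)
            print({"length": round(length, 1), "width": round(width, 1),
                   "area": cv2.contourArea(c), "perimeter": round(cv2.arcLength(c, True), 1)})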

  14. A solid-phase glycosyltransferase assay for high-throughput screening in drug discovery research.

    PubMed

    Donovan, R S; Datti, A; Baek, M G; Wu, Q; Sas, I J; Korczak, B; Berger, E G; Roy, R; Dennis, J W

    1999-10-01

    Glycosyltransferases mediate changes in glycosylation patterns which, in turn, may affect the function of glycoproteins and/or glycolipids and, further downstream, processes of development, differentiation, transformation and cell-cell recognition. Such enzymes, therefore, represent valid targets for drug discovery. We have developed a solid-phase glycosyltransferase assay for use in a robotic high-throughput format. Carbohydrate acceptors coupled covalently to polyacrylamide are coated onto 96-well plastic plates. The glycosyltransferase reaction is performed with recombinant enzymes and radiolabeled sugar-nucleotide donor at 37 degrees C, followed by washing, addition of scintillation counting fluid, and measurement of radioactivity using a 96-well beta-counter. Glycopolymer construction and coating of the plastic plates, enzyme and substrate concentrations, and linearity with time were optimized using recombinant Core 2 beta1-6-N-acetylglucosaminyltransferase (Core 2 GlcNAc-T). This enzyme catalyzes a rate-limiting reaction for expression of polylactosamine and the selectin ligand sialyl-Lewis(x) in O-glycans. A glycopolymer acceptor for beta1-6-N-acetylglucosaminyltransferase V was also designed and shown to be effective in the solid-phase assay. In a high-throughput screen of a microbial extract library, the coefficient of variance for positive controls was 9.4%, and high concordance for hit validation was observed between the Core 2 GlcNAc-T solid-phase assay and a standard solution-phase assay. The solid-phase assay format, which can be adapted for a variety of glycosyltransferase enzymes, allowed a 5-6 fold increase in throughput compared to the corresponding solution-phase assay.

  15. High-speed spectral nanocytology for early cancer screening

    PubMed Central

    Subramanian, Hariharan; Maneval, Charles D.; White, Craig A.; Levenson, Richard M.; Backman, Vadim

    2013-01-01

    High-throughput partial wave spectroscopy (HTPWS) is introduced as a high-speed spectral nanocytology technique that utilizes the field effect of carcinogenesis to perform minimally invasive cancer screening on at-risk populations. HTPWS uses fully automated hardware and an acousto-optic tunable filter to scan slides at low magnification, to select cells, and to rapidly acquire spectra at each spatial pixel in a cell between 450 and 700 nm, completing measurements of 30 cells in 40 min. Statistical quantitative analysis on the size and density of intracellular nanostructures extracted from the spectra at each pixel in a cell yields the diagnostic biomarker, disorder strength (Ld). Linear correlation between Ld and the length scale of nanostructures was measured in phantoms with R^2 = 0.93. Diagnostic sensitivity was demonstrated by measuring significantly higher Ld from a human colon cancer cell line (HT29 control vector) than a less aggressive variant (epidermal growth factor receptor knockdown). Clinical diagnostic performance for lung cancer screening was tested on 23 patients, yielding a significant difference in Ld between smokers and cancer patients, p = 0.02 and effect size = 1.00. The high-throughput performance, nanoscale sensitivity, and diagnostic sensitivity make HTPWS a potentially clinically relevant modality for risk stratification of the large populations at risk of developing cancer. PMID:24193949

  16. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system viz. the Advanced Microscale Bioreactor (ambr15(TM) ), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.
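
    Matching volumetric sparge rates across scales, as mentioned above, means holding the gas flow per liquid volume (vvm) constant; the snippet below illustrates the conversion to absolute gas flows, with an illustrative vvm value that is not taken from the study:

        def sparge_rate_l_per_min(vvm, working_volume_l):
            """Gas flow (L/min) needed to hold a given vvm
            (gas volumes per liquid volume per minute)."""
            return vvm * working_volume_l

        VVM = 0.02  # illustrative value, not taken from the study
        for scale, volume_l in [("ambr15", 0.015), ("bench", 5.0), ("manufacturing", 15000.0)]:
            print(f"{scale:>13}: {sparge_rate_l_per_min(VVM, volume_l):.4f} L/min")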

  17. A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data

    PubMed Central

    Morton, Elizabeth; Lamitina, Todd

    2010-01-01

    Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
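
    A minimal pandas sketch of the kind of gating and ratio extraction the COPAquant/COPAmulti tools perform; the column names (TOF, EXT, Green) and gate values are assumptions for illustration, not the actual COPAS file format or the authors' MATLAB code:

        import pandas as pd

        # Hypothetical columns: time of flight (size), extinction, and green fluorescence.
        data = pd.DataFrame({
            "TOF":   [120, 45, 300, 510, 260],
            "EXT":   [200, 30, 410, 700, 350],
            "Green": [15, 2, 90, 160, 60],
        })

        # Gate out debris/small objects, then normalize fluorescence by object size.
        gated = data[(data["TOF"] > 100) & (data["EXT"] > 100)].copy()
        gated["Green_per_TOF"] = gated["Green"] / gated["TOF"]
        print(gated[["TOF", "Green", "Green_per_TOF"]])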

  18. Continuous cell introduction and rapid dynamic lysis for high-throughput single-cell analysis on microfluidic chips with hydrodynamic focusing.

    PubMed

    Xu, Chun-Xiu; Yin, Xue-Feng

    2011-02-04

    A chip-based microfluidic system for high-throughput single-cell analysis is described. The system was integrated with continuous introduction of individual cells, rapid dynamic lysis, capillary electrophoretic (CE) separation and laser induced fluorescence (LIF) detection. A cross microfluidic chip with one sheath-flow channel located on each side of the sampling channel was designed. The labeled cells were hydrodynamically focused by sheath-flow streams and sequentially introduced into the cross section of the microchip under hydrostatic pressure generated by adjusting liquid levels in the reservoirs. Combined with the electric field applied on the separation channel, the aligned cells were driven into the separation channel and rapidly lysed within 33 ms at the entry of the separation channel by Triton X-100 added in the sheath-flow solution. The maximum rate for introducing individual cells into the separation channel was about 150 cells/min. The introduction of sheath-flow streams also significantly reduced the concentration of phosphate-buffered saline (PBS) injected into the separation channel along with single cells, thus reducing Joule heating during electrophoretic separation. The performance of this microfluidic system was evaluated by analysis of reduced glutathione (GSH) and reactive oxygen species (ROS) in single erythrocytes. A throughput of 38 cells/min was obtained. The proposed method is simple and robust for high-throughput single-cell analysis, allowing for analysis of cell populations of considerable size to generate results with statistical significance. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Measuring molecular biomarkers in epidemiologic studies: laboratory techniques and biospecimen considerations.

    PubMed

    Erickson, Heidi S

    2012-09-28

    The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.

  20. High-Throughput Analysis of Dynamic Gene Expression Associated with Sleep Deprivation and Recovery Sleep in the Mouse Brain

    DTIC Science & Technology

    2006-12-01

    (DTIC report documentation page only; no abstract available. Performing organization: Allen Institute for Brain Science, Seattle, WA 98103; contact: edl@alleninstitute.org.)

  1. RoCoMAR: robots' controllable mobility aided routing and relay architecture for mobile sensor networks.

    PubMed

    Le, Duc Van; Oh, Hoon; Yoon, Seokhoon

    2013-07-05

    In practical deployments, mobile sensor networks (MSNs) suffer from low performance due to high node mobility, time-varying wireless channel properties, and obstacles between communicating nodes. In order to tackle the problem of low network performance and provide a desired end-to-end data transfer quality, in this paper we propose a novel ad hoc routing and relaying architecture, namely RoCoMAR (Robots' Controllable Mobility Aided Routing), that uses robotic nodes' controllable mobility. RoCoMAR repeatedly performs a link reinforcement process with the objective of maximizing the network throughput, in which the link with the lowest quality on the path is identified and replaced with high-quality links by placing a robotic node as a relay at an optimal position. The robotic node resigns as a relay if the objective is achieved or no more gain can be obtained with a new relay. Once placed as a relay, the robotic node performs adaptive link maintenance by adjusting its position according to the movements of regular nodes. The simulation results show that RoCoMAR outperforms existing ad hoc routing protocols for MSNs in terms of network throughput and end-to-end delay.
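
    The core of the link-reinforcement step can be sketched as follows: identify the lowest-quality link on the current path and place the robotic relay between its endpoints. The path, coordinates and quality values below are invented for illustration, and the midpoint placement is a simplification of the paper's optimal-position computation:

        # Find the weakest link on a path and propose a relay position at its midpoint.
        path = ["A", "B", "C", "D"]
        positions = {"A": (0, 0), "B": (40, 10), "C": (120, 15), "D": (160, 40)}
        link_quality = {("A", "B"): 0.9, ("B", "C"): 0.35, ("C", "D"): 0.8}

        weakest = min(zip(path, path[1:]), key=lambda link: link_quality[link])
        (x1, y1), (x2, y2) = positions[weakest[0]], positions[weakest[1]]
        relay_position = ((x1 + x2) / 2, (y1 + y2) / 2)
        print(weakest, relay_position)  # -> ('B', 'C') (80.0, 12.5)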

  2. S-MART, a software toolbox to aid RNA-Seq data analysis.

    PubMed

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data onto a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool that performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the whole biology community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour for most queries, even for gigabytes of data. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses of their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci.

  3. S-MART, A Software Toolbox to Aid RNA-seq Data Analysis

    PubMed Central

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data onto a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool that performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the whole biology community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour for most queries, even for gigabytes of data. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses of their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci. PMID:21998740

  4. RoCoMAR: Robots' Controllable Mobility Aided Routing and Relay Architecture for Mobile Sensor Networks

    PubMed Central

    Van Le, Duc; Oh, Hoon; Yoon, Seokhoon

    2013-01-01

    In practical deployments, mobile sensor networks (MSNs) suffer from low performance due to high node mobility, time-varying wireless channel properties, and obstacles between communicating nodes. In order to tackle the problem of low network performance and provide a desired end-to-end data transfer quality, in this paper we propose a novel ad hoc routing and relaying architecture, namely RoCoMAR (Robots' Controllable Mobility Aided Routing), that uses robotic nodes' controllable mobility. RoCoMAR repeatedly performs a link reinforcement process with the objective of maximizing the network throughput, in which the link with the lowest quality on the path is identified and replaced with high-quality links by placing a robotic node as a relay at an optimal position. The robotic node resigns as a relay if the objective is achieved or no more gain can be obtained with a new relay. Once placed as a relay, the robotic node performs adaptive link maintenance by adjusting its position according to the movements of regular nodes. The simulation results show that RoCoMAR outperforms existing ad hoc routing protocols for MSNs in terms of network throughput and end-to-end delay. PMID:23881134

  5. Throughput of Coded Optical CDMA Systems with AND Detectors

    NASA Astrophysics Data System (ADS)

    Memon, Kehkashan A.; Umrani, Fahim A.; Umrani, A. W.; Umrani, Naveed A.

    2012-09-01

    Conventional detection techniques used in optical code-division multiple access (OCDMA) systems are not optimal and result in poor bit error rate performance. This paper analyzes the coded performance of optical CDMA systems with AND detectors for enhanced throughput efficiencies and improved error rate performance. The results show that the use of AND detectors significantly improve the performance of an optical channel.

  6. A highly efficient, high-throughput lipidomics platform for the quantitative detection of eicosanoids in human whole blood.

    PubMed

    Song, Jiao; Liu, Xuejun; Wu, Jiejun; Meehan, Michael J; Blevitt, Jonathan M; Dorrestein, Pieter C; Milla, Marcos E

    2013-02-15

    We have developed an ultra-performance liquid chromatography-multiple reaction monitoring/mass spectrometry (UPLC-MRM/MS)-based, high-content, high-throughput platform that enables simultaneous profiling of multiple lipids produced ex vivo in human whole blood (HWB) on treatment with calcium ionophore and its modulation with pharmacological agents. HWB samples were processed in a 96-well plate format compatible with high-throughput sample processing instrumentation. We employed a scheduled MRM (sMRM) method, with a triple-quadrupole mass spectrometer coupled to a UPLC system, to measure absolute amounts of 122 distinct eicosanoids using deuterated internal standards. In a 6.5-min run, we resolved and detected with high sensitivity (lower limit of quantification in the range of 0.4-460 pg) all targeted analytes from a very small HWB sample (2.5 μl). Approximately 90% of the analytes exhibited a dynamic range exceeding 1000. We also developed a tailored software package that dramatically sped up the overall data quantification and analysis process with superior consistency and accuracy. Matrix effects from HWB and precision of the calibration curve were evaluated using this newly developed automation tool. This platform was successfully applied to the global quantification of changes on all 122 eicosanoids in HWB samples from healthy donors in response to calcium ionophore stimulation. Copyright © 2012 Elsevier Inc. All rights reserved.
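
    Quantification against deuterated internal standards typically uses a calibration curve of analyte-to-internal-standard area ratio versus spiked amount; the sketch below shows that generic calculation with invented numbers and is not the tailored software package described above:

        import numpy as np

        # Generic stable-isotope-dilution quantification: fit a line of analyte/IS area
        # ratio vs. analyte amount from calibration standards, then invert it for a sample.
        cal_amount_pg = np.array([1, 10, 50, 100, 500])        # spiked analyte per well
        cal_ratio = np.array([0.02, 0.21, 1.05, 2.1, 10.4])    # analyte area / deuterated IS area
        slope, intercept = np.polyfit(cal_amount_pg, cal_ratio, 1)

        sample_ratio = 3.4
        print((sample_ratio - intercept) / slope, "pg")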

  7. Combinatorial chemoenzymatic synthesis and high-throughput screening of sialosides.

    PubMed

    Chokhawala, Harshal A; Huang, Shengshu; Lau, Kam; Yu, Hai; Cheng, Jiansong; Thon, Vireak; Hurtado-Ziola, Nancy; Guerrero, Juan A; Varki, Ajit; Chen, Xi

    2008-09-19

    Although the vital roles of structures containing sialic acid in biomolecular recognition are well documented, limited information is available on how sialic acid structural modifications, sialyl linkages, and the underlying glycan structures affect the binding or the activity of sialic acid-recognizing proteins and related downstream biological processes. A novel combinatorial chemoenzymatic method has been developed for the highly efficient synthesis of biotinylated sialosides containing different sialic acid structures and different underlying glycans in 96-well plates from biotinylated sialyltransferase acceptors and sialic acid precursors. By transferring the reaction mixtures to NeutrAvidin-coated plates and assaying for the yields of enzymatic reactions using lectins recognizing sialyltransferase acceptors but not the sialylated products, the biotinylated sialoside products can be directly used, without purification, for high-throughput screening to quickly identify the ligand specificity of sialic acid-binding proteins. For a proof-of-principle experiment, 72 biotinylated alpha2,6-linked sialosides were synthesized in 96-well plates from 4 biotinylated sialyltransferase acceptors and 18 sialic acid precursors using a one-pot three-enzyme system. High-throughput screening assays performed in NeutrAvidin-coated microtiter plates show that whereas Sambucus nigra Lectin binds to alpha2,6-linked sialosides with high promiscuity, human Siglec-2 (CD22) is highly selective for a number of sialic acid structures and the underlying glycans in its sialoside ligands.

  8. High-Throughput, Adaptive FFT Architecture for FPGA-Based Spaceborne Data Processors

    NASA Technical Reports Server (NTRS)

    NguyenKobayashi, Kayla; Zheng, Jason X.; He, Yutao; Shah, Biren N.

    2011-01-01

    Exponential growth in microelectronics technology such as field-programmable gate arrays (FPGAs) has enabled high-performance spaceborne instruments with increasing onboard data processing capabilities. As a commonly used digital signal processing (DSP) building block, the fast Fourier transform (FFT) has been of great interest for onboard data processing applications, which need to strike a reasonable balance between high performance (throughput, block size, etc.) and low resource usage (power, silicon footprint, etc.). It is also desirable for a single design to be reusable and adaptable to instruments with different requirements. The Multi-Pass Wide Kernel FFT (MPWK-FFT) architecture was developed, in which the high-throughput benefits of the parallel FFT structure and the low resource usage of Singleton's single-butterfly method are exploited. The result is a wide-kernel, multipass, adaptive FFT architecture. The 32K-point MPWK-FFT architecture includes 32 radix-2 butterflies, 64 FIFOs to store the real inputs, 64 FIFOs to store the imaginary inputs, complex twiddle-factor storage, and FIFO logic to route the outputs to the correct FIFO. The inputs are stored sequentially in the FIFOs, and the outputs of each butterfly are written sequentially first into the even FIFO, then the odd FIFO. Because of the order in which the outputs are written, the depth of the even FIFOs (768 each) is 1.5 times that of the odd FIFOs (512 each). The total memory needed for data storage, assuming that each sample is 36 bits, is 2.95 Mbits. The twiddle factors are stored in internal ROM inside the FPGA for fast access; the total memory required to store them is 589.9 Kbits. This FFT structure combines the high throughput of parallel FFT kernels with the low resource usage of multi-pass FFT kernels and the desired adaptability. Space instrument missions that need onboard FFT capabilities, such as the proposed DESDynI, SWOT (Surface Water Ocean Topography), and Europa sounding radar missions, would greatly benefit from this technology with significant reductions in non-recurring cost and risk.
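
    The data-storage figure quoted above follows directly from the FIFO counts and depths; a quick arithmetic check, assuming (as stated) 36-bit samples and that half of the 128 data FIFOs are the deeper "even" FIFOs:

        # Arithmetic check of the storage figures quoted above.
        even_fifos, odd_fifos = 64, 64      # half of the 128 data FIFOs are "even", half "odd"
        even_depth, odd_depth = 768, 512    # even FIFOs are 1.5x deeper than odd FIFOs
        bits_per_sample = 36

        total_words = even_fifos * even_depth + odd_fifos * odd_depth
        total_mbits = total_words * bits_per_sample / 1e6
        print(total_words, total_mbits)     # 81920 words -> about 2.95 Mbits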

  9. Modeling and Simulation Reliable Spacecraft On-Board Computing

    NASA Technical Reports Server (NTRS)

    Park, Nohpill

    1999-01-01

    The proposed project will investigate modeling- and simulation-driven testing and fault tolerance schemes for spacecraft on-board computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within a wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space segment capacity, and distance-insensitive cost. To realize these capabilities, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast and cost-effective on-board computing system, which has been known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay) and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before a fault tolerance scheme is employed. Testing and fault tolerance strategies should be driven by accurate performance models (i.e., throughput, delay, reliability and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module and a module for fault tolerance, all of which interact through a central graphical user interface.

  10. PLAStiCC: Predictive Look-Ahead Scheduling for Continuous dataflows on Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumbhare, Alok; Simmhan, Yogesh; Prasanna, Viktor K.

    2014-05-27

    Scalable stream processing and continuous dataflow systems are gaining traction with the rise of big data due to the need for processing high velocity data in near real time. Unlike batch processing systems such as MapReduce and workflows, static scheduling strategies fall short for continuous dataflows due to the variations in the input data rates and the need for sustained throughput. The elastic resource provisioning of cloud infrastructure is valuable to meet the changing resource needs of such continuous applications. However, multi-tenant cloud resources introduce yet another dimension of performance variability that impacts the application’s throughput. In this paper we propose PLAStiCC, an adaptive scheduling algorithm that balances resource cost and application throughput using a prediction-based look-ahead approach. It not only addresses variations in the input data rates but also the underlying cloud infrastructure. In addition, we also propose several simpler static scheduling heuristics that operate in the absence of accurate performance prediction model. These static and adaptive heuristics are evaluated through extensive simulations using performance traces obtained from public and private IaaS clouds. Our results show an improvement of up to 20% in the overall profit as compared to the reactive adaptation algorithm.

  11. Design space exploration of high throughput finite field multipliers for channel coding on Xilinx FPGAs

    NASA Astrophysics Data System (ADS)

    de Schryver, C.; Weithoffer, S.; Wasenmüller, U.; Wehn, N.

    2012-09-01

    Channel coding is a standard technique in all wireless communication systems. In addition to the typically employed methods like convolutional coding, turbo coding or low density parity check (LDPC) coding, algebraic codes are used in many cases. For example, outer BCH coding is applied in the DVB-S2 standard for satellite TV broadcasting. A key operation for BCH and the related Reed-Solomon codes are multiplications in finite fields (Galois Fields), where extension fields of prime fields are used. A lot of architectures for multiplications in finite fields have been published over the last decades. This paper examines four different multiplier architectures in detail that offer the potential for very high throughputs. We investigate the implementation performance of these multipliers on FPGA technology in the context of channel coding. We study the efficiency of the multipliers with respect to area, frequency and throughput, as well as configurability and scalability. The implementation data of the fully verified circuits are provided for a Xilinx Virtex-4 device after place and route.
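
    The central operation being benchmarked is multiplication in an extension field GF(2^m); a bit-serial software reference model is sketched below. The GF(2^8) polynomial used in the example is the familiar AES polynomial, chosen only for illustration (BCH coding in DVB-S2 uses a larger field with a different polynomial):

        def gf_mult(a, b, poly, m):
            """Multiply two elements of GF(2^m) given the irreducible polynomial `poly`
            (with bit m set). Software reference model of a bit-serial datapath."""
            result = 0
            for _ in range(m):
                if b & 1:
                    result ^= a
                b >>= 1
                a <<= 1
                if a & (1 << m):
                    a ^= poly
            return result

        # GF(2^8) with the polynomial x^8 + x^4 + x^3 + x + 1 (0x11B):
        print(hex(gf_mult(0x57, 0x83, 0x11B, 8)))  # -> 0xc1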

  12. High-throughput Molecular Simulations of MOFs for CO2 Separation: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Erucar, Ilknur; Keskin, Seda

    2018-02-01

    Metal organic frameworks (MOFs) have emerged as great alternatives to traditional nanoporous materials for CO2 separation applications. MOFs are porous materials that are formed by self-assembly of transition metals and organic ligands. The most important advantage of MOFs over well-known porous materials is the possibility to generate multiple materials with varying structural properties and chemical functionalities by changing the combination of metal centers and organic linkers during the synthesis. This leads to a large diversity of materials with various pore sizes and shapes that can be efficiently used for CO2 separations. Since the number of synthesized MOFs has already reached several thousand, experimental investigation of each MOF at the lab scale is not practical. High-throughput computational screening of MOFs is a great opportunity to identify the best materials for CO2 separation and to gain molecular-level insights into the structure-performance relationships. This type of knowledge can be used to design new materials with the desired structural features that can lead to extraordinarily high CO2 selectivities. In this mini-review, we focus on developments in high-throughput molecular simulations of MOFs for CO2 separations. After reviewing the current studies on this topic, we discuss the opportunities and challenges in the field and address the potential future developments.

  13. A Simple, High-Throughput Assay for Fragile X Expanded Alleles Using Triple Repeat Primed PCR and Capillary Electrophoresis

    PubMed Central

    Lyon, Elaine; Laver, Thomas; Yu, Ping; Jama, Mohamed; Young, Keith; Zoccoli, Michael; Marlowe, Natalia

    2010-01-01

    Population screening has been proposed for Fragile X syndrome to identify premutation carrier females and affected newborns. We developed a PCR-based assay capable of quickly detecting the presence or absence of an expanded FMR1 allele with high sensitivity and specificity. This assay combines a triplet repeat primed PCR with high-throughput automated capillary electrophoresis. We evaluated assay performance using archived samples sent for Fragile X diagnostic testing representing a range of Fragile X CGG-repeat expansions. Two hundred five previously genotyped samples were tested with the new assay. Data were analyzed for the presence of a trinucleotide “ladder” extending beyond 55 repeats, which was set as a cut-off to identify expanded FMR1 alleles. We identified expanded FMR1 alleles in 132 samples (59 premutation, 71 full mutation, 2 mosaics) and normal FMR1 alleles in 73 samples. We found 100% concordance with previous results from PCR and Southern blot analyses. In addition, we show feasibility of using this assay with DNA extracted from dried-blood spots. Using a single PCR combined with high-throughput fragment analysis on the automated capillary electrophoresis instrument, we developed a rapid and reproducible PCR-based laboratory assay that meets many of the requirements for a first-tier test for population screening. PMID:20431035

  14. Application of extrinsic fluorescence spectroscopy for the high throughput formulation screening of aluminum-adjuvanted vaccines.

    PubMed

    Ausar, Salvador F; Chan, Judy; Hoque, Warda; James, Olive; Jayasundara, Kavisha; Harper, Kevin

    2011-02-01

    High-throughput screening (HTS) of excipients for proteins in solution can be achieved by several analytical techniques. The screening of stabilizers for proteins adsorbed onto adjuvants, however, can be difficult because of the limited number of techniques that can measure the stability of adsorbed proteins in a high-throughput mode. Here, we demonstrate that extrinsic fluorescence spectroscopy can be successfully applied to study the physical stability of adsorbed antigens at low concentrations in 96-well plates, using a real-time polymerase chain reaction (RT-PCR) instrument. HTS was performed on three adjuvanted pneumococcal proteins as model antigens in the presence of a standard library of stabilizers. Aluminum hydroxide appeared to decrease the stability of all three proteins at relatively high and low pH values, showing a bell-shaped curve as the pH was increased from 5 to 9 with maximum stability near neutral pH. Nonspecific stabilizers such as mono- and disaccharides could increase the conformational stability of the antigens. In addition, excipients that increased the melting temperature of the adsorbed antigens could improve antigenicity and chemical stability. To the best of our knowledge, this is the first report describing an HTS technology amenable to low concentrations of antigens adsorbed onto aluminum-containing adjuvants. Copyright © 2010 Wiley-Liss, Inc.
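
    Melting temperatures from extrinsic-fluorescence melt curves are often read off as the maximum of the first derivative dF/dT; the sketch below applies that generic analysis to a synthetic unfolding curve and is not the authors' analysis pipeline:

        import numpy as np

        # Generic thermal-shift analysis: take the apparent melting temperature
        # as the maximum of the first derivative dF/dT of the melt curve.
        temps = np.arange(25, 96, 1.0)
        tm_true = 62.0
        fluorescence = 1 / (1 + np.exp(-(temps - tm_true) / 2.0))  # synthetic unfolding curve

        dF_dT = np.gradient(fluorescence, temps)
        print("apparent Tm:", temps[np.argmax(dF_dT)])  # ~62 C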

  15. A high-throughput liquid bead array-based screening technology for Bt presence in GMO manipulation.

    PubMed

    Fu, Wei; Wang, Huiyu; Wang, Chenguang; Mei, Lin; Lin, Xiangmei; Han, Xueqing; Zhu, Shuifang

    2016-03-15

    The number of species and the planting area of genetically modified organisms (GMOs) have grown rapidly during the past ten years. For the purposes of GMO inspection, quarantine and manipulation, we have devised a high-throughput Bt-based GMO screening method based on a liquid bead array. This novel method relies on direct competitive recognition between biotinylated antibodies and bead-coupled antigens, detecting Bt presence in samples containing Bt Cry1Aa, Bt Cry1Ab, Bt Cry1Ac, Bt Cry1Ah, Bt Cry1B, Bt Cry1C, Bt Cry1F, Bt Cry2A, Bt Cry3 or Bt Cry9C. Our method has wide GMO species coverage, so that more than 90% of commercialized GMO species worldwide can be identified. After optimization and validation of specificity, sensitivity, repeatability and applicability, the method shows high specificity and a quantification sensitivity of 10-50 ng/mL. We then assessed more than 1800 samples from the field and the food market to demonstrate the capacity of our method to perform high-throughput screening for GMO manipulation. Our method offers an applicable platform for further inspection of and research on GMO plants. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Blood group genotyping: from patient to high-throughput donor screening.

    PubMed

    Veldhuisen, B; van der Schoot, C E; de Haas, M

    2009-10-01

    Blood group antigens, present on the cell membrane of red blood cells and platelets, can be defined either serologically or predicted based on the genotypes of genes encoding for blood group antigens. At present, the molecular basis of many antigens of the 30 blood group systems and 17 human platelet antigens is known. In many laboratories, blood group genotyping assays are routinely used for diagnostics in cases where patient red cells cannot be used for serological typing due to the presence of auto-antibodies or after recent transfusions. In addition, DNA genotyping is used to support (un)-expected serological findings. Fetal genotyping is routinely performed when there is a risk of alloimmune-mediated red cell or platelet destruction. In case of patient blood group antigen typing, it is important that a genotyping result is quickly available to support the selection of donor blood, and high-throughput of the genotyping method is not a prerequisite. In addition, genotyping of blood donors will be extremely useful to obtain donor blood with rare phenotypes, for example lacking a high-frequency antigen, and to obtain a fully typed donor database to be used for a better matching between recipient and donor to prevent adverse transfusion reactions. Serological typing of large cohorts of donors is a labour-intensive and expensive exercise and hampered by the lack of sufficient amounts of approved typing reagents for all blood group systems of interest. Currently, high-throughput genotyping based on DNA micro-arrays is a very feasible method to obtain a large pool of well-typed blood donors. Several systems for high-throughput blood group genotyping are developed and will be discussed in this review.

  17. An efficient and high-throughput protocol for Agrobacterium-mediated transformation based on phosphomannose isomerase positive selection in Japonica rice (Oryza sativa L.).

    PubMed

    Duan, Yongbo; Zhai, Chenguang; Li, Hao; Li, Juan; Mei, Wenqian; Gui, Huaping; Ni, Dahu; Song, Fengshun; Li, Li; Zhang, Wanggen; Yang, Jianbo

    2012-09-01

    A number of Agrobacterium-mediated rice transformation systems have been developed and are widely used in numerous laboratories and research institutes. However, those systems generally employ antibiotics such as kanamycin and hygromycin, or herbicides, as selectable agents, and are used for small-scale experiments. To address the high-throughput production of transgenic rice plants via Agrobacterium-mediated transformation, and to address public concern over antibiotic markers, we developed a comprehensive and efficient protocol, covering explant preparation through to the identification of low-copy events by real-time PCR analysis before transplanting to the field, for high-throughput production of transgenic plants of the Japonica rice varieties Wanjing97 and Nipponbare using the Escherichia coli phosphomannose isomerase gene (pmi) as a selectable marker. Transformation frequencies (TF) of 54.8% for Wanjing97 and 47.5% for Nipponbare were achieved in one round of selection on 7.5 or 12.5 g/L mannose supplemented with 5 g/L sucrose. High-throughput transformation, from inoculation to transplanting of low-copy events, was accomplished within 55-60 days. Moreover, Taqman assay data from a large number of transformants showed low-copy rates of 45.2% in Wanjing97 and 31.5% in Nipponbare, and the transformants are fertile and follow Mendelian segregation ratios. This protocol enables genome-wide functional annotation of open reading frames and utilization of agronomically important genes in rice with reduced public concern over selectable markers. We describe a comprehensive protocol for the large-scale production of transgenic Japonica rice plants using a non-antibiotic selectable agent in a simplified, cost- and labor-saving manner.

  18. Steroid Profiling by Gas Chromatography–Mass Spectrometry and High Performance Liquid Chromatography–Mass Spectrometry for Adrenal Diseases

    PubMed Central

    McDonald, Jeffrey G.; Matthew, Susan

    2012-01-01

    The ability to measure steroid hormone concentrations in blood and urine specimens is central to the diagnosis and proper treatment of adrenal diseases. The traditional approach has been to assay each steroid hormone, precursor, or metabolite using individual aliquots of serum, each with a separate immunoassay. For complex diseases, such as congenital adrenal hyperplasia and adrenocortical cancer, in which the assay of several steroids is essential for management, this approach is time consuming and costly, in addition to using large amounts of serum. Gas chromatography/mass spectrometry profiling of steroid metabolites in urine has been employed for many years but only in a small number of specialized laboratories and suffers from slow throughput. The advent of commercial high-performance liquid chromatography instruments coupled to tandem mass spectrometers offers the potential for medium- to high-throughput profiling of serum steroids using small quantities of sample. Here, we review the physical principles of mass spectrometry, the instrumentation used for these techniques, the terminology used in this field and applications to steroid analysis. PMID:22170384

  19. 40 CFR Table 9 to Subpart Eeee of... - Continuous Compliance With Operating Limits-High Throughput Transfer Racks

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 CFR, Protection of Environment, 2010-07-01: Table 9 to Subpart EEEE of Part 63—Continuous Compliance With Operating Limits—High Throughput Transfer Racks. As stated in §§ 63.2378(a) and (b...

  20. Air Ground Data Link VHF Airline Communications and Reporting System (ACARS) Preliminary Test Report

    DOT National Transportation Integrated Search

    1995-02-01

    An effort was conducted to determine actual ground-to-air and air-to-ground performance of the Airline Communications and Reporting System (ACARS), Very High Frequency (VHF) Data Link System. Parameters of system throughput, error rates, and a...

  1. AOP-informed assessment of endocrine disruption in freshwater crustaceans

    EPA Science Inventory

    To date, most research aimed at developing more efficient and cost-effective methods to predict toxicity has focused on human biology. However, there is also a need for effective high-throughput tools to predict toxicity to other species that perform critical ecosystem functio...

  2. Activity profiles of 676 ToxCast Phase II compounds in 231 biochemical high-throughput screening assays

    EPA Science Inventory

    Understanding potential health risks posed by environmental chemicals is a significant challenge, given the large numbers of diverse chemicals with generally uncharacterized exposures, mechanisms and toxicities. The present study is a performance evaluation and critical analysis...

  3. Profiling optimization for big data transfer over dedicated channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yun, D.; Wu, Qishi; Rao, Nageswara S

    The transfer of big data is increasingly supported by dedicated channels in high-performance networks, where transport protocols play an important role in maximizing application-level throughput and link utilization. The performance of transport protocols largely depends on their control parameter settings, but it is prohibitively time consuming to conduct an exhaustive search in a large parameter space to find the best set of parameter values. We propose FastProf, a stochastic approximation-based transport profiler, to quickly determine the optimal operational zone of a given data transfer protocol/method over dedicated channels. We implement and test the proposed method using both emulations based on real-life performance measurements and experiments over physical connections with short (2 ms) and long (380 ms) delays. Both the emulation and experimental results show that FastProf significantly reduces the profiling overhead while achieving a level of end-to-end throughput comparable with the exhaustive search-based approach.
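
    As an illustration of the stochastic-approximation idea behind FastProf, the sketch below searches two hypothetical transport parameters (parallel streams and buffer size) with a generic SPSA-style update; measure_throughput is an invented stand-in for a timed transfer, and the gain schedules and bounds are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def measure_throughput(params):
        """Hypothetical stand-in for one timed transfer; returns Gb/s with measurement noise."""
        streams, buf_mb = params
        ideal = 9.5 * np.exp(-((streams - 8) / 6) ** 2) * np.exp(-((np.log2(buf_mb) - 2) / 2) ** 2)
        return ideal + np.random.normal(0.0, 0.3)

    def spsa_maximize(x0, n_iter=60, a=1.0, c=0.5):
        """Simultaneous-perturbation stochastic approximation: two probes per iteration."""
        x = np.array(x0, dtype=float)
        for k in range(1, n_iter + 1):
            ak, ck = a / k ** 0.602, c / k ** 0.101            # common SPSA gain schedules
            delta = np.random.choice([-1.0, 1.0], size=x.shape)
            gain = measure_throughput(x + ck * delta) - measure_throughput(x - ck * delta)
            x = np.clip(x + ak * gain / (2 * ck * delta), [1, 1], [64, 64])
        return x

    print("estimated optimal (streams, buffer MB):", np.round(spsa_maximize([2, 1])))
    ```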

  4. Sophia: A Expedient UMLS Concept Extraction Annotator.

    PubMed

    Divita, Guy; Zeng, Qing T; Gundlapalli, Adi V; Duvall, Scott; Nebeker, Jonathan; Samore, Matthew H

    2014-01-01

    An opportunity exists for meaningful concept extraction and indexing from large corpora of clinical notes in the Veterans Affairs (VA) electronic medical record. Currently available tools such as MetaMap, cTAKES and HITex do not scale up to address this big data need. Sophia, a rapid UMLS concept extraction annotator, was developed to fulfill a mandate and address extraction where high throughput is needed while preserving performance. We report on the development, testing and benchmarking of Sophia against MetaMap and cTAKES. Sophia demonstrated improved recall as compared to cTAKES and MetaMap (0.71 vs 0.66 and 0.38). The overall f-score was similar to cTAKES and an improvement over MetaMap (0.53 vs 0.57 and 0.43). With regard to speed of processing records, we noted Sophia to be severalfold faster than cTAKES and the scaled-out MetaMap service. Sophia offers a viable alternative for high-throughput information extraction tasks.
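
    The recall and F-score figures quoted above follow the standard precision/recall definitions; a minimal sketch of the arithmetic, using made-up concept counts rather than the study's data, is:

    ```python
    def precision_recall_f1(tp, fp, fn):
        """Compute precision, recall, and F1 from true/false positive and false negative counts."""
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        return precision, recall, f1

    # Placeholder counts for one annotator scored against a reference standard (not the study's data)
    p, r, f = precision_recall_f1(tp=710, fp=960, fn=290)
    print(f"precision={p:.2f}  recall={r:.2f}  f1={f:.2f}")
    ```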

  5. Microbial Community in a Biofilter for Removal of Low Load Nitrobenzene Waste Gas

    PubMed Central

    Zhai, Jian; Wang, Zhu; Shi, Peng; Long, Chao

    2017-01-01

    To improve biofilter performance, the microbial community of a biofilter must be clearly defined. In this study, the performance of a lab-scale polyurethane biofilter for treating waste gas with low loads of nitrobenzene (NB) (< 20 g m-3 h-1) was investigated when using different empty bed residence times (EBRT) (64, 55.4 and 34 s, respectively). In addition, the variations of the bacterial community in the biofilm on the longitudinal distribution of the biofilters were analysed by using Illumina MiSeq high-throughput sequencing. The results showed that NB waste gas was successfully degraded in the biofilter. High-throughput sequencing data suggested that the phylum Actinobacteria and genus Rhodococcus played important roles in the degradation of NB. The variations of the microbial community were attributed to the different intermediate degradation products of NB in each layer. The strains identified in this study were potential candidates for purifying waste gas effluents containing NB. PMID:28114416

  6. 384 hanging drop arrays give excellent Z-factors and allow versatile formation of co-culture spheroids.

    PubMed

    Hsiao, Amy Y; Tung, Yi-Chung; Qu, Xianggui; Patel, Lalit R; Pienta, Kenneth J; Takayama, Shuichi

    2012-05-01

    We previously reported the development of a simple, user-friendly, and versatile 384 hanging drop array plate for 3D spheroid culture and the importance of utilizing 3D cellular models in anti-cancer drug sensitivity testing. The 384 hanging drop array plate allows for high-throughput capabilities and offers significant improvements over existing 3D spheroid culture methods. To allow for practical 3D cell-based high-throughput screening and enable broader use of the plate, we characterize the robustness of the 384 hanging drop array plate in terms of assay performance and demonstrate the versatility of the plate. We find that the 384 hanging drop array plate performance is robust in fluorescence- and colorimetric-based assays through Z-factor calculations. Finally, we demonstrate different plate capabilities and applications, including: spheroid transfer and retrieval for Janus spheroid formation, sequential addition of cells for concentric layer patterning of different cell types, and culture of a wide variety of cell types. Copyright © 2011 Wiley Periodicals, Inc.
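
    The Z-factor used above to gauge assay robustness has a standard closed form (Zhang et al., 1999); a small sketch computing it from positive- and negative-control readings, with synthetic values rather than the plate data from this study:

    ```python
    import numpy as np

    def z_factor(pos, neg):
        """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
        pos, neg = np.asarray(pos, float), np.asarray(neg, float)
        return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

    # Synthetic fluorescence readings for control drops on a 384-well hanging drop plate
    rng = np.random.default_rng(0)
    live = rng.normal(10000, 600, 32)   # positive controls (viable spheroids)
    dead = rng.normal(1500, 250, 32)    # negative controls (killed spheroids)
    print(f"Z' = {z_factor(live, dead):.2f}  (assays with Z' > 0.5 are generally considered excellent)")
    ```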

  7. 384 Hanging Drop Arrays Give Excellent Z-factors and Allow Versatile Formation of Co-culture Spheroids

    PubMed Central

    Hsiao, Amy Y.; Tung, Yi-Chung; Qu, Xianggui; Patel, Lalit R.; Pienta, Kenneth J.; Takayama, Shuichi

    2012-01-01

    We previously reported the development of a simple, user-friendly, and versatile 384 hanging drop array plate for 3D spheroid culture and the importance of utilizing 3D cellular models in anti-cancer drug sensitivity testing. The 384 hanging drop array plate allows for high-throughput capabilities and offers significant improvements over existing 3D spheroid culture methods. To allow for practical 3D cell-based high-throughput screening and enable broader use of the plate, we characterize the robustness of the 384 hanging drop array plate in terms of assay performance and demonstrate the versatility of the plate. We find that the 384 hanging drop array plate performance is robust in fluorescence- and colorimetric-based assays through z-factor calculations. Finally, we demonstrate different plate capabilities and applications, including: spheroid transfer and retrieval for Janus spheroid formation, sequential addition of cells for concentric layer patterning of different cell types, and culture of a wide variety of cell types. PMID:22161651

  8. An automated compound screening for anti-aging effects on the function of C. elegans sensory neurons.

    PubMed

    Bazopoulou, Daphne; Chaudhury, Amrita R; Pantazis, Alexandros; Chronis, Nikos

    2017-08-24

    Discovery of molecular targets or compounds that alter neuronal function can lead to therapeutic advances that ameliorate age-related neurodegenerative pathologies. Currently, there is a lack of in vivo screening technologies for the discovery of compounds that affect the age-dependent neuronal physiology. Here, we present a high-throughput, microfluidic-based assay for automated manipulation and on-chip monitoring and analysis of stimulus-evoked calcium responses of intact C. elegans at various life stages. First, we successfully applied our technology to quantify the effects of aging and age-related genetic and chemical factors in the calcium transients of the ASH sensory neuron. We then performed a large-scale screen of a library of 107 FDA-approved compounds to identify hits that prevented the age-dependent functional deterioration of ASH. The robust performance of our assay makes it a valuable tool for future high-throughput applications based on in vivo functional imaging.

  9. Sophia: A Expedient UMLS Concept Extraction Annotator

    PubMed Central

    Divita, Guy; Zeng, Qing T; Gundlapalli, Adi V.; Duvall, Scott; Nebeker, Jonathan; Samore, Matthew H.

    2014-01-01

    An opportunity exists for meaningful concept extraction and indexing from large corpora of clinical notes in the Veterans Affairs (VA) electronic medical record. Currently available tools such as MetaMap, cTAKES and HITex do not scale up to address this big data need. Sophia, a rapid UMLS concept extraction annotator, was developed to fulfill a mandate and address extraction where high throughput is needed while preserving performance. We report on the development, testing and benchmarking of Sophia against MetaMap and cTAKES. Sophia demonstrated improved recall as compared to cTAKES and MetaMap (0.71 vs 0.66 and 0.38). The overall f-score was similar to cTAKES and an improvement over MetaMap (0.53 vs 0.57 and 0.43). With regard to speed of processing records, we noted Sophia to be severalfold faster than cTAKES and the scaled-out MetaMap service. Sophia offers a viable alternative for high-throughput information extraction tasks. PMID:25954351

  10. Optimization of a micro-scale, high throughput process development tool and the demonstration of comparable process performance and product quality with biopharmaceutical manufacturing processes.

    PubMed

    Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J

    2017-07-14

    In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench-scale and clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), the comparable product quality results at all scales make this tool an appropriate scale model for enabling purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Accelerating the design of solar thermal fuel materials through high throughput simulations.

    PubMed

    Liu, Yun; Grossman, Jeffrey C

    2014-12-10

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.

  12. A low cost and high throughput magnetic bead-based immuno-agglutination assay in confined droplets.

    PubMed

    Teste, Bruno; Ali-Cherif, Anaïs; Viovy, Jean Louis; Malaquin, Laurent

    2013-06-21

    Although passive immuno-agglutination assays consist of one-step and simple procedures, they are usually not adapted for high-throughput analyses and they require expensive and bulky equipment for quantitation steps. Here we demonstrate a low cost, multimodal and high-throughput immuno-agglutination assay that relies on a combination of magnetic beads (MBs), droplet microfluidics and magnetic tweezers. Antibody-coated MBs were used as a capture support in the homogeneous phase. Following the immune interaction, water-in-oil droplets containing MBs and analytes were generated and transported in Teflon tubing. When passing in between magnetic tweezers, the MBs contained in the droplets were magnetically confined in order to enhance the agglutination rate and kinetics. When the magnetic field is released, the internal recirculation flows in the droplet induce shear forces that favor MB redispersion. In the presence of the analyte, the system preserves specific interactions and MBs stay in the aggregated state, while in the case of a non-specific analyte, redispersion of particles occurs. The analyte quantitation procedure relies on the MB redispersion rate within the droplet. The influence of different parameters, such as magnetic field intensity, flow rate and MB concentration, on the agglutination performance has been investigated and optimized. Although the immuno-agglutination assay described in this work may not compete with enzyme linked immunosorbent assay (ELISA) in terms of sensitivity, it offers major advantages regarding reagent consumption (analysis is performed in sub-microliter droplets) and platform cost, which makes analyses very cheap. Moreover, the fully automated analysis procedure provides reproducible analyses with throughput well above that of existing technologies. We demonstrated the detection of biotinylated alkaline phosphatase in 100 nL sample volumes with an analysis rate of 300 assays per hour and a limit of detection of 100 pM.

  13. The combination of gas-phase fluorophore technology and automation to enable high-throughput analysis of plant respiration.

    PubMed

    Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K

    2017-01-01

    Mitochondrial respiration in the dark (Rdark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of Rdark is essential for agronomic and ecological studies. However, the methods currently used to measure Rdark in plant tissues are typically low throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry to determine the accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of Rdark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of Rdark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant Rdark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute Rdark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided values of Rdark similar to those of the IRGA instrument most commonly employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of Rdark on multiple samples simultaneously, irrespective of plant or tissue type.

  14. Simultaneous Measurements of Auto-Immune and Infectious Disease Specific Antibodies Using a High Throughput Multiplexing Tool

    PubMed Central

    Asati, Atul; Kachurina, Olga; Kachurin, Anatoly

    2012-01-01

    Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high-throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bio-Plex/Luminex platform. In this report, we demonstrate that the ganglioside high-throughput multiplexing tool is robust, highly specific and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high-throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided the added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for assaying both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high-throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders. PMID:22952605

  15. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    NASA Astrophysics Data System (ADS)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
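
    The mode decision that the simplified bitrate estimation feeds is, at its core, a rate-distortion comparison J = D + λ·R across candidate SAO modes. The sketch below shows that selection step only; the candidate modes, distortion values, bit counts, and Lagrange multiplier are illustrative placeholders, not the paper's hardware algorithm.

    ```python
    def rd_cost(distortion, bits, lam):
        """Rate-distortion cost J = D + lambda * R used for mode decisions."""
        return distortion + lam * bits

    # Illustrative candidates for one coding tree block: (mode, SSE distortion, estimated bits)
    candidates = [
        ("SAO off",           5200, 1),
        ("band offset",       3100, 38),
        ("edge offset 0deg",  2600, 30),
        ("edge offset 90deg", 2900, 30),
    ]

    lam = 45.0  # Lagrange multiplier, normally derived from the quantization parameter
    best = min(candidates, key=lambda c: rd_cost(c[1], c[2], lam))
    print("selected SAO mode:", best[0])
    ```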

  16. High-performance web services for querying gene and variant annotation.

    PubMed

    Xin, Jiwen; Mark, Adam; Afrasiabi, Cyrus; Tsueng, Ginger; Juchler, Moritz; Gopal, Nikhil; Stupp, Gregory S; Putman, Timothy E; Ainscough, Benjamin J; Griffith, Obi L; Torkamani, Ali; Whetzel, Patricia L; Mungall, Christopher J; Mooney, Sean D; Su, Andrew I; Wu, Chunlei

    2016-05-06

    Efficient tools for data management and integration are essential for many aspects of high-throughput biology. In particular, annotations of genes and human genetic variants are commonly used but highly fragmented across many resources. Here, we describe MyGene.info and MyVariant.info, high-performance web services for querying gene and variant annotation information. These web services are currently accessed more than three million times per month. They also demonstrate a generalizable cloud-based model for organizing and querying biological annotation information. MyGene.info and MyVariant.info are provided as high-performance web services, accessible at http://mygene.info and http://myvariant.info . Both are offered free of charge to the research community.
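
    As a usage illustration, both services answer plain HTTP queries with JSON; the sketch below assumes the publicly documented /v3/query (MyGene.info) and /v1/query (MyVariant.info) endpoints and example identifiers, which may change over time.

    ```python
    import requests

    # Look up the CDK2 gene symbol on MyGene.info (assumes the /v3/query endpoint)
    genes = requests.get(
        "https://mygene.info/v3/query",
        params={"q": "symbol:CDK2", "species": "human", "fields": "symbol,name,entrezgene"},
        timeout=10,
    ).json()
    print(genes["hits"][0])

    # Look up a dbSNP identifier on MyVariant.info (assumes the /v1/query endpoint)
    variants = requests.get(
        "https://myvariant.info/v1/query",
        params={"q": "dbsnp.rsid:rs58991260"},
        timeout=10,
    ).json()
    print(variants.get("total"), "matching variant record(s)")
    ```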

  17. Understanding GPU Power. A Survey of Profiling, Modeling, and Simulation Methods

    DOE PAGES

    Bridges, Robert A.; Imam, Neena; Mintz, Tiffany M.

    2016-09-01

    Modern graphics processing units (GPUs) have complex architectures that admit exceptional performance and energy efficiency for high-throughput applications. Though GPUs consume large amounts of power, their use for high-throughput applications facilitates state-of-the-art energy efficiency and performance. Consequently, continued development relies on understanding their power consumption. Our work is a survey of GPU power modeling and profiling methods, with increased detail on noteworthy efforts. Moreover, as direct measurement of GPU power is necessary for model evaluation and parameter initiation, internal and external power sensors are discussed. Hardware counters, which are low-level tallies of hardware events, share strong correlation to power use and performance. Statistical correlation between power and performance counters has yielded worthwhile GPU power models, yet the complexity inherent to GPU architectures presents new hurdles for power modeling. Developments and challenges of counter-based GPU power modeling are discussed. Often building on the counter-based models, research efforts for GPU power simulation, which make power predictions from input code and hardware knowledge, provide opportunities for optimization in programming or architectural design. Noteworthy strides in power simulations for GPUs are included along with their performance or functional simulator counterparts when appropriate. Lastly, possible directions for future research are discussed.
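
    Counter-based power models of the kind surveyed here are often ordinary linear regressions from counter activity to measured power; the sketch below fits such a model to synthetic counter data (the counters, weights, and idle power are invented, not taken from any real GPU).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic per-interval hardware-counter activity: [instructions, DRAM accesses, FP ops]
    counters = rng.uniform(0.0, 1.0, size=(200, 3))
    true_weights = np.array([60.0, 110.0, 45.0])     # watts contributed per unit of activity
    idle_power = 35.0
    measured_power = idle_power + counters @ true_weights + rng.normal(0.0, 3.0, 200)

    # Fit a linear power model P ~ w0 + w . counters by least squares
    X = np.column_stack([np.ones(len(counters)), counters])
    coef, *_ = np.linalg.lstsq(X, measured_power, rcond=None)
    print("fitted idle power (W):", round(coef[0], 1))
    print("fitted per-counter weights (W):", np.round(coef[1:], 1))
    ```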

  18. Isotonic Regression Based-Method in Quantitative High-Throughput Screenings for Genotoxicity

    PubMed Central

    Fujii, Yosuke; Narita, Takeo; Tice, Raymond Richard; Takeda, Shunich

    2015-01-01

    Quantitative high-throughput screenings (qHTSs) for genotoxicity are conducted as part of comprehensive toxicology screening projects. The most widely used method is to compare the dose-response data of a wild-type and DNA repair gene knockout mutants, using model-fitting to the Hill equation (HE). However, this method performs poorly when the observed viability does not fit the equation well, as frequently happens in qHTS. More capable methods must be developed for qHTS where large data variations are unavoidable. In this study, we applied an isotonic regression (IR) method and compared its performance with HE under multiple data conditions. When dose-response data were suitable to draw HE curves with upper and lower asymptotes and experimental random errors were small, HE was better than IR, but when random errors were big, there was no difference between HE and IR. However, when the drawn curves did not have two asymptotes, IR showed better performance (p < 0.05, exact paired Wilcoxon test) with higher specificity (65% in HE vs. 96% in IR). In summary, IR performed similarly to HE when dose-response data were optimal, whereas IR clearly performed better in suboptimal conditions. These findings indicate that IR would be useful in qHTS for comparing dose-response data. PMID:26673567
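
    A minimal sketch of the two fitting strategies compared above, the four-parameter Hill equation via nonlinear least squares versus a model-free monotone (isotonic) fit, on synthetic dose-response data (the doses, noise level, and parameters are illustrative, not drawn from the qHTS assays):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from sklearn.isotonic import IsotonicRegression

    def hill(log_dose, bottom, top, log_ac50, slope):
        """Four-parameter Hill equation on log10 dose."""
        return bottom + (top - bottom) / (1.0 + 10.0 ** (slope * (log_ac50 - log_dose)))

    rng = np.random.default_rng(2)
    log_dose = np.linspace(-9, -4, 12)                      # log10 molar concentrations
    viability = hill(log_dose, 100, 20, -6.2, 1.3) + rng.normal(0, 8, log_dose.size)

    # Model-based fit (can mislead when the curve lacks clear upper/lower asymptotes)
    popt, _ = curve_fit(hill, log_dose, viability, p0=[100, 0, -6, 1], maxfev=10000)

    # Model-free monotone fit: assumes only that viability decreases with dose
    iso = IsotonicRegression(increasing=False)
    viability_iso = iso.fit_transform(log_dose, viability)

    print("Hill log10(AC50) estimate:", round(popt[2], 2))
    print("isotonic fitted responses:", np.round(viability_iso, 1))
    ```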

  19. Understanding GPU Power. A Survey of Profiling, Modeling, and Simulation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A.; Imam, Neena; Mintz, Tiffany M.

    Modern graphics processing units (GPUs) have complex architectures that admit exceptional performance and energy efficiency for high-throughput applications. Though GPUs consume large amounts of power, their use for high-throughput applications facilitates state-of-the-art energy efficiency and performance. Consequently, continued development relies on understanding their power consumption. Our work is a survey of GPU power modeling and profiling methods, with increased detail on noteworthy efforts. Moreover, as direct measurement of GPU power is necessary for model evaluation and parameter initiation, internal and external power sensors are discussed. Hardware counters, which are low-level tallies of hardware events, share strong correlation to power use and performance. Statistical correlation between power and performance counters has yielded worthwhile GPU power models, yet the complexity inherent to GPU architectures presents new hurdles for power modeling. Developments and challenges of counter-based GPU power modeling are discussed. Often building on the counter-based models, research efforts for GPU power simulation, which make power predictions from input code and hardware knowledge, provide opportunities for optimization in programming or architectural design. Noteworthy strides in power simulations for GPUs are included along with their performance or functional simulator counterparts when appropriate. Lastly, possible directions for future research are discussed.

  20. Combinatorial materials research applied to the development of new surface coatings VII: An automated system for adhesion testing

    NASA Astrophysics Data System (ADS)

    Chisholm, Bret J.; Webster, Dean C.; Bennett, James C.; Berry, Missy; Christianson, David; Kim, Jongsoo; Mayo, Bret; Gubbins, Nathan

    2007-07-01

    An automated, high-throughput adhesion workflow that enables pseudobarnacle adhesion and coating/substrate adhesion to be measured on coating patches arranged in an array format on 4 × 8 in.² panels was developed. The adhesion workflow consists of the following process steps: (1) application of an adhesive to the coating array; (2) insertion of panels into a clamping device; (3) insertion of aluminum studs into the clamping device and onto coating surfaces, aligned with the adhesive; (4) curing of the adhesive; and (5) automated removal of the aluminum studs. Validation experiments comparing data generated using the automated, high-throughput workflow to data obtained using conventional, manual methods showed that the automated system allows for accurate ranking of relative coating adhesion performance.

  1. Molecular profiling of single circulating tumor cells from lung cancer patients.

    PubMed

    Park, Seung-Min; Wong, Dawson J; Ooi, Chin Chun; Kurtz, David M; Vermesh, Ophir; Aalipour, Amin; Suh, Susie; Pian, Kelsey L; Chabon, Jacob J; Lee, Sang Hun; Jamali, Mehran; Say, Carmen; Carter, Justin N; Lee, Luke P; Kuschner, Ware G; Schwartz, Erich J; Shrager, Joseph B; Neal, Joel W; Wakelee, Heather A; Diehn, Maximilian; Nair, Viswam S; Wang, Shan X; Gambhir, Sanjiv S

    2016-12-27

    Circulating tumor cells (CTCs) are established cancer biomarkers for the "liquid biopsy" of tumors. Molecular analysis of single CTCs, which recapitulate primary and metastatic tumor biology, remains challenging because current platforms have limited throughput, are expensive, and are not easily translatable to the clinic. Here, we report a massively parallel, multigene-profiling nanoplatform to compartmentalize and analyze hundreds of single CTCs. After high-efficiency magnetic collection of CTC from blood, a single-cell nanowell array performs CTC mutation profiling using modular gene panels. Using this approach, we demonstrated multigene expression profiling of individual CTCs from non-small-cell lung cancer (NSCLC) patients with remarkable sensitivity. Thus, we report a high-throughput, multiplexed strategy for single-cell mutation profiling of individual lung cancer CTCs toward minimally invasive cancer therapy prediction and disease monitoring.

  2. Integrated crystal mounting and alignment system for high-throughput biological crystallography

    DOEpatents

    Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William F.; Yegian, Derek T.; Earnest, Thomas N.; Jaklevich, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.

    2007-09-25

    A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.

  3. Integrated crystal mounting and alignment system for high-throughput biological crystallography

    DOEpatents

    Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William; Yegian, Derek; Earnest, Thomas N.; Jaklevic, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.

    2005-07-19

    A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.

  4. A Microfluidic Platform for High-Throughput Multiplexed Protein Quantitation

    PubMed Central

    Volpetti, Francesca; Garcia-Cordero, Jose; Maerkl, Sebastian J.

    2015-01-01

    We present a high-throughput microfluidic platform capable of quantitating up to 384 biomarkers in 4 distinct samples by immunoassay. The microfluidic device contains 384 unit cells, which can be individually programmed with pairs of capture and detection antibody. Samples are quantitated in each unit cell by four independent MITOMI detection areas, allowing four samples to be analyzed in parallel for a total of 1,536 assays per device. We show that the device can be pre-assembled and stored for weeks at elevated temperature and we performed proof-of-concept experiments simultaneously quantitating IL-6, IL-1β, TNF-α, PSA, and GFP. Finally, we show that the platform can be used to identify functional antibody combinations by screening 64 antibody combinations requiring up to 384 unique assays per device. PMID:25680117

  5. Measurements of file transfer rates over dedicated long-haul connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; Settlemyer, Bradley W; Imam, Neena

    2016-01-01

    Wide-area file transfers are an integral part of several High-Performance Computing (HPC) scenarios. Dedicated network connections with high capacity, low loss rate and low competing traffic are increasingly being provisioned over current HPC infrastructures to support such transfers. To gain insights into these file transfers, we collected transfer rate measurements for Lustre and xfs file systems between dedicated multi-core servers over emulated 10 Gbps connections with round trip times (rtt) in the 0-366 ms range. Memory transfer throughput over these connections is measured using iperf, and file IO throughput on host systems is measured using xddprof. We consider two file system configurations: Lustre over an IB network, and xfs over SSD connected to the PCI bus. Files are transferred using xdd across these connections, and the transfer rates are measured, which indicate the need to jointly optimize the connection and host file IO parameters to achieve peak transfer rates. In particular, these measurements indicate that (i) the peak file transfer rate is lower than the peak connection and host IO throughput, in some cases reaching only 50% or less of those peaks, (ii) xdd request sizes that achieve peak throughput for host file IO do not necessarily lead to peak file transfer rates, and (iii) parallelism in host IO and TCP transport does not always improve the file transfer rates.

  6. High throughput light absorber discovery, Part 1: An algorithm for automated tauc analysis

    DOE PAGES

    Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.

    2016-09-23

    High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high-throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high-throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm for estimating a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
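
    The Tauc construction that the algorithm automates can be sketched simply: transform the spectrum to (αhν)^(1/n), fit the steep linear rise, and extrapolate to the energy axis. The code below is a bare-bones illustration for a direct-allowed transition (plotting variable (αhν)²) on synthetic data, with a crude 20-80% window standing in for the paper's expert-mimicking region selection.

    ```python
    import numpy as np

    # Synthetic spectrum for a direct-allowed absorber: alpha*h*nu proportional to sqrt(h*nu - Eg)
    energy = np.linspace(1.2, 3.2, 200)                    # photon energy, eV
    gap_true = 2.1
    alpha = np.where(energy > gap_true,
                     1.0e4 * np.sqrt(np.clip(energy - gap_true, 0.0, None)) / energy, 0.0)
    alpha += np.abs(np.random.default_rng(3).normal(0.0, 20.0, energy.size))  # measurement noise

    # Tauc variable for a direct-allowed transition: (alpha * h*nu)^2
    tauc = (alpha * energy) ** 2

    # Crude stand-in for the automated region selection: fit the 20-80% rise with a line
    mask = (tauc > 0.2 * tauc.max()) & (tauc < 0.8 * tauc.max())
    slope, intercept = np.polyfit(energy[mask], tauc[mask], 1)
    print(f"estimated direct band gap: {-intercept / slope:.2f} eV (true value {gap_true} eV)")
    ```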

  7. Ultra-High Throughput Synthesis of Nanoparticles with Homogeneous Size Distribution Using a Coaxial Turbulent Jet Mixer

    PubMed Central

    2015-01-01

    High-throughput production of nanoparticles (NPs) with controlled quality is critical for their clinical translation into effective nanomedicines for diagnostics and therapeutics. Here we report a simple and versatile coaxial turbulent jet mixer that can synthesize a variety of NPs at high throughput up to 3 kg/d, while maintaining the advantages of homogeneity, reproducibility, and tunability that are normally accessible only in specialized microscale mixing devices. The device fabrication does not require specialized machining and is easy to operate. As one example, we show reproducible, high-throughput formulation of siRNA-polyelectrolyte polyplex NPs that exhibit effective gene knockdown but exhibit significant dependence on batch size when formulated using conventional methods. The coaxial turbulent jet mixer can accelerate the development of nanomedicines by providing a robust and versatile platform for preparation of NPs at throughputs suitable for in vivo studies, clinical trials, and industrial-scale production. PMID:24824296

  8. Running High-Throughput Jobs on Peregrine | High-Performance Computing |

    Science.gov Websites

    unique name (using "name=") and use the task name to create a unique output file name. ... how many tasks to give to each worker at a time using the NITRO_COORD_OPTIONS environment variable. Finally, you start Nitro by executing launch_nitro.sh. Sample Nitro job script: To run a job using the ...

  9. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
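
    To make the idea of an XML encoding of spatiotemporal image knowledge concrete, the fragment below builds a toy event description with Python's standard library; the element and attribute names are hypothetical, since the actual cellular imaging markup language schema is not reproduced in this abstract.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical element and attribute names, for illustration only
    event = ET.Element("SpatiotemporalEvent", {"type": "mitosis"})
    cell = ET.SubElement(event, "Object", {"id": "cell-042", "class": "cell"})
    ET.SubElement(cell, "Location", {"x": "118.5", "y": "76.2", "z": "4.0", "frame": "37"})
    ET.SubElement(cell, "Location", {"x": "121.0", "y": "74.8", "z": "4.0", "frame": "38"})

    print(ET.tostring(event, encoding="unicode"))
    ```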

  10. Characterization of matrix effects in developing rugged high-throughput LC-MS/MS methods for bioanalysis.

    PubMed

    Li, Fumin; Wang, Jun; Jenkins, Rand

    2016-05-01

    There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.

  11. RAMICS: trainable, high-speed and biologically relevant alignment of high-throughput sequencing reads to coding DNA

    PubMed Central

    Wright, Imogen A.; Travers, Simon A.

    2014-01-01

    The challenge presented by high-throughput sequencing necessitates the development of novel tools for accurate alignment of reads to reference sequences. Current approaches focus on using heuristics to map reads quickly to large genomes, rather than generating highly accurate alignments in coding regions. Such approaches are, thus, unsuited for applications such as amplicon-based analysis and the realignment phase of exome sequencing and RNA-seq, where accurate and biologically relevant alignment of coding regions is critical. To facilitate such analyses, we have developed a novel tool, RAMICS, that is tailored to mapping large numbers of sequence reads to short lengths (<10 000 bp) of coding DNA. RAMICS utilizes profile hidden Markov models to discover the open reading frame of each sequence and aligns to the reference sequence in a biologically relevant manner, distinguishing between genuine codon-sized indels and frameshift mutations. This approach facilitates the generation of highly accurate alignments, accounting for the error biases of the sequencing machine used to generate reads, particularly at homopolymer regions. Performance improvements are gained through the use of graphics processing units, which increase the speed of mapping through parallelization. RAMICS substantially outperforms all other mapping approaches tested in terms of alignment quality while maintaining highly competitive speed performance. PMID:24861618

  12. Multicapillary SDS-gel electrophoresis for the analysis of fluorescently labeled mAb preparations: a high throughput quality control process for the production of QuantiPlasma and PlasmaScan mAb libraries.

    PubMed

    Székely, Andrea; Szekrényes, Akos; Kerékgyártó, Márta; Balogh, Attila; Kádas, János; Lázár, József; Guttman, András; Kurucz, István; Takács, László

    2014-08-01

    Molecular heterogeneity of mAb preparations is the result of various co- and post-translational modifications and of contaminants related to the production process. Changes in molecular composition result in alterations of functional performance; therefore, quality control and validation of therapeutic or diagnostic protein products is essential. A special case is the consistent production of mAb libraries (QuantiPlasma™ and PlasmaScan™) for proteome profiling, the quality control of which represents a challenge because of the high number of mAbs (>1000). Here, we devise a generally applicable multicapillary SDS-gel electrophoresis process for the analysis of fluorescently labeled mAb preparations for the high-throughput quality control of mAbs of the QuantiPlasma™ and PlasmaScan™ libraries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Quantitative High-Throughput Luciferase Screening in Identifying CAR Modulators.

    PubMed

    Lynch, Caitlin; Zhao, Jinghua; Wang, Hongbing; Xia, Menghang

    2016-01-01

    The constitutive androstane receptor (CAR, NR1I3) is responsible for the transcription of multiple drug metabolizing enzymes and transporters. There are two possible methods of activation for CAR, direct ligand binding and a ligand-independent mechanism, which makes this a unique nuclear receptor. Both of these mechanisms require translocation of CAR from the cytoplasm into the nucleus. Interestingly, CAR is constitutively active in immortalized cell lines due to the basal nuclear location of this receptor. This creates an important challenge for most in vitro assay models because immortalized cells cannot be used without inhibiting the high basal activity. In this book chapter, we describe in detail how to perform quantitative high-throughput screens to identify hCAR1 modulators through the employment of a double stable cell line. Using this line, we are able to identify activators, as well as deactivators, of the challenging nuclear receptor, CAR.

  14. An automated high throughput screening-compatible assay to identify regulators of stem cell neural differentiation.

    PubMed

    Casalino, Laura; Magnani, Dario; De Falco, Sandro; Filosa, Stefania; Minchiotti, Gabriella; Patriarca, Eduardo J; De Cesare, Dario

    2012-03-01

    The use of Embryonic Stem Cells (ESCs) holds considerable promise both for drug discovery programs and the treatment of degenerative disorders in regenerative medicine approaches. Nevertheless, the successful use of ESCs is still limited by the lack of efficient control of ESC self-renewal and differentiation capabilities. In this context, the possibility to modulate ESC biological properties and to obtain homogenous populations of correctly specified cells will help developing physiologically relevant screens, designed for the identification of stem cell modulators. Here, we developed a high throughput screening-suitable ESC neural differentiation assay by exploiting the Cell(maker) robotic platform and demonstrated that neural progenies can be generated from ESCs in complete automation, with high standards of accuracy and reliability. Moreover, we performed a pilot screening providing proof of concept that this assay allows the identification of regulators of ESC neural differentiation in full automation.

  15. High-throughput measurement of recombination rates and genetic interference in Saccharomyces cerevisiae.

    PubMed

    Raffoux, Xavier; Bourge, Mickael; Dumas, Fabrice; Martin, Olivier C; Falque, Matthieu

    2018-06-01

    Allelic recombination owing to meiotic crossovers is a major driver of genome evolution, as well as a key player for the selection of high-performing genotypes in economically important species. Therefore, we developed a high-throughput and low-cost method to measure recombination rates and crossover patterning (including interference) in large populations of the budding yeast Saccharomyces cerevisiae. Recombination and interference were analysed by flow cytometry, which allows time-consuming steps such as tetrad microdissection or spore growth to be avoided. Moreover, our method can also be used to compare recombination in wild-type vs. mutant individuals or in different environmental conditions, even if the changes in recombination rates are small. Furthermore, meiotic mutants often present recombination and/or pairing defects affecting spore viability but our method does not involve growth steps and thus avoids filtering out non-viable spores. Copyright © 2018 John Wiley & Sons, Ltd.

  16. On the tip of the tongue: learning typing and pointing with an intra-oral computer interface.

    PubMed

    Caltenco, Héctor A; Breidegard, Björn; Struijk, Lotte N S Andreasen

    2014-07-01

    To evaluate typing and pointing performance and improvement over time of four able-bodied participants using an intra-oral tongue-computer interface for computer control. A physically disabled individual may lack the ability to efficiently control standard computer input devices. There have been several efforts to produce and evaluate interfaces that provide individuals with physical disabilities the possibility to control personal computers. Training with the intra-oral tongue-computer interface was performed by playing games over 18 sessions. Skill improvement was measured through typing and pointing exercises at the end of each training session. Typing throughput improved from averages of 2.36 to 5.43 correct words per minute. Pointing throughput improved from averages of 0.47 to 0.85 bits/s. Target tracking performance, measured as relative time on target, improved from averages of 36% to 47%. Path following throughput improved from averages of 0.31 to 0.83 bits/s and decreased to 0.53 bits/s with more difficult tasks. Learning curves support the notion that the tongue can rapidly learn novel motor tasks. Typing and pointing performance of the tongue-computer interface is comparable to performances of other proficient assistive devices, which makes the tongue a feasible input organ for computer control. Intra-oral computer interfaces could provide individuals with severe upper-limb mobility impairments the opportunity to control computers and automatic equipment. Typing and pointing performance of the tongue-computer interface is comparable to performances of other proficient assistive devices, but does not cause fatigue easily and might be invisible to other people, which is highly prioritized by assistive device users. Combination of visual and auditory feedback is vital for a good performance of an intra-oral computer interface and helps to reduce involuntary or erroneous activations.
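
    Pointing throughputs in bits/s such as those reported above are conventionally derived from Fitts's law, dividing an index of difficulty by the movement time; the sketch below shows that arithmetic on made-up trials, not the study's measurements.

    ```python
    import math

    def fitts_throughput(distance, width, movement_time):
        """Throughput = index of difficulty / movement time, with ID = log2(D/W + 1) in bits."""
        index_of_difficulty = math.log2(distance / width + 1.0)
        return index_of_difficulty / movement_time

    # Made-up pointing trials: target distance (px), target width (px), movement time (s)
    trials = [(400, 40, 4.9), (300, 60, 3.4), (500, 30, 6.1)]
    rates = [fitts_throughput(d, w, t) for d, w, t in trials]
    print("mean pointing throughput: %.2f bits/s" % (sum(rates) / len(rates)))
    ```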

  17. Carbohydrate Microarray Technology Applied to High-Throughput Mapping of Plant Cell Wall Glycans Using Comprehensive Microarray Polymer Profiling (CoMPP).

    PubMed

    Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho

    2017-01-01

    Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.

  18. A Rapid Method for the Determination of Fucoxanthin in Diatom

    PubMed Central

    Wang, Li-Juan; Fan, Yong; Parsons, Ronald L.; Hu, Guang-Rong; Zhang, Pei-Yu

    2018-01-01

    Fucoxanthin is a natural pigment found in microalgae, especially diatoms and Chrysophyta. Recently, it has been shown to have anti-inflammatory, anti-tumor, and anti-obesity activity in humans. Phaeodactylum tricornutum is a diatom with high economic potential due to its high content of fucoxanthin and eicosapentaenoic acid. In order to improve fucoxanthin production, physical and chemical mutagenesis could be applied to generate mutants. An accurate and rapid method to assess the fucoxanthin content is a prerequisite for a high-throughput screen of mutants. In this work, the content of fucoxanthin in P. tricornutum was determined using spectrophotometry instead of high performance liquid chromatography (HPLC). This spectrophotometric method is easier and faster than liquid chromatography, and the standard error was less than 5% when compared to the HPLC results. The method can also be applied to other diatoms, with standard errors of 3-14.6%. It provides a high-throughput screening method for microalgae strains producing fucoxanthin. PMID:29361768
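
    Spectrophotometric quantification of this kind generally rests on a linear calibration of absorbance against standards of known concentration (e.g., verified by HPLC); the sketch below uses invented calibration points and an invented wavelength, not the published equation.

    ```python
    import numpy as np

    # Hypothetical calibration standards: fucoxanthin concentration (mg/L) vs absorbance (~445 nm)
    conc_std = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    abs_std = np.array([0.061, 0.118, 0.242, 0.479, 0.962])

    # Fit concentration as a linear function of absorbance
    slope, intercept = np.polyfit(abs_std, conc_std, 1)

    def fucoxanthin_mg_per_l(absorbance):
        return slope * absorbance + intercept

    sample_absorbance = 0.350
    print(f"estimated fucoxanthin: {fucoxanthin_mg_per_l(sample_absorbance):.2f} mg/L")
    ```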

  19. A space- and time-resolved single photon counting detector for fluorescence microscopy and spectroscopy

    PubMed Central

    Michalet, X.; Siegmund, O.H.W.; Vallerga, J.V.; Jelinsky, P.; Millaud, J.E.; Weiss, S.

    2017-01-01

    We have recently developed a wide-field photon-counting detector with high temporal and spatial resolution that is capable of high throughput (the H33D detector). Its design is based on a 25 mm diameter multi-alkali photocathode producing one photoelectron per detected photon; the photoelectrons are then multiplied up to 10⁷ times by a 3-microchannel-plate stack. The resulting electron cloud is proximity focused on a cross delay line anode, which allows the incident photon position to be determined with high accuracy. The imaging and fluorescence lifetime measurement performance of the H33D detector installed on a standard epifluorescence microscope will be presented. We compare it to that of standard single-molecule detectors such as single-photon avalanche photodiodes (SPADs) or electron-multiplying cameras using model samples (fluorescent beads, quantum dots and live cells). Finally, we discuss the design and applications of future generations of H33D detectors for single-molecule imaging and high-throughput study of biomolecular interactions. PMID:29479130

  20. A High-Content Live-Cell Viability Assay and Its Validation on a Diverse 12K Compound Screen.

    PubMed

    Chiaravalli, Jeanne; Glickman, J Fraser

    2017-08-01

    We have developed a new high-content cytotoxicity assay using live cells, called "ImageTOX." We used a high-throughput fluorescence microscope system, image segmentation software, and the combination of Hoechst 33342 and SYTO 17 to simultaneously score the relative size and the intensity of the nuclei, the nuclear membrane permeability, and the cell number in a 384-well microplate format. We then performed a screen of 12,668 diverse compounds and compared the results to a standard cytotoxicity assay. The ImageTOX assay identified similar sets of compounds to the standard cytotoxicity assay, while identifying more compounds having adverse effects on cell structure, earlier in treatment time. The ImageTOX assay uses inexpensive commercially available reagents and facilitates the use of live cells in toxicity screens. Furthermore, we show that we can measure the kinetic profile of compound toxicity in a high-content, high-throughput format, following the same set of cells over an extended period of time.

  1. Performance, throughput, and cost of in-home training for the Army Reserve: Using asynchronous computer conferencing as an alternative to resident training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hahn, H.A.; Ashworth, R.L. Jr.; Phelps, R.H.

    1990-01-01

    Asynchronous computer conferencing (ACC) was investigated as an alternative to resident training for the Army Reserve Component (RC). Specifically, the goals were to (1) evaluate the performance and throughput of ACC as compared with traditional Resident School instruction and (2) determine the cost-effectiveness of developing and implementing ACC. Fourteen RC students took a module of the Army Engineer Officer Advanced Course (EOAC) via ACC. Course topics included Army doctrine, technical engineering subjects, leadership, and presentation skills. Resident content was adapted for presentation via ACC. The programs of instruction for ACC and the equivalent resident course were identical; only the media used for presentation were changed. Performance on tests, homework, and practical exercises; self-assessments of learning; throughput; and cost data were the measures of interest. Comparison data were collected on RC students taking the course in residence. Results indicated that there were no performance differences between the two groups. Students taking the course via ACC perceived greater learning benefit than did students taking the course in residence. Resident throughput was superior to ACC throughput, both in terms of numbers of students completing and time to complete the course. In spite of this fact, however, ACC was more cost-effective than resident training.

  2. High-throughput analysis of sub-visible mAb aggregate particles using automated fluorescence microscopy imaging.

    PubMed

    Paul, Albert Jesuran; Bickel, Fabian; Röhm, Martina; Hospach, Lisa; Halder, Bettina; Rettich, Nina; Handrick, René; Herold, Eva Maria; Kiefer, Hans; Hesse, Friedemann

    2017-07-01

    Aggregation of therapeutic proteins is a major concern as aggregates lower the yield and can impact the efficacy of the drug as well as the patient's safety. It can occur in all production stages; thus, it is essential to perform a detailed analysis for protein aggregates. Several methods such as size exclusion high-performance liquid chromatography (SE-HPLC), light scattering, turbidity, light obscuration, and microscopy-based approaches are used to analyze aggregates. None of these methods allows determination of all types of higher molecular weight (HMW) species due to a limited size range. Furthermore, quantification and specification of different HMW species are often not possible. Moreover, automation is an emerging requirement with the advent of automated robotic laboratory systems. Hence, there is a need for a fast, high-throughput-compatible method that can detect a broad size range and enable quantification and classification. We describe a novel approach for the detection of aggregates in the size range 1 to 1000 μm, combining fluorescent dyes for protein aggregate labelling and automated fluorescence microscope imaging (aFMI). After appropriate selection of the dye and method optimization, our method enabled us to detect various types of HMW species of monoclonal antibodies (mAbs). Using 10 μmol L⁻¹ 4,4'-dianilino-1,1'-binaphthyl-5,5'-disulfonate (Bis-ANS) in combination with aFMI allowed the analysis of mAb aggregates induced by different stresses occurring during downstream processing, storage, and administration. Validation of our results was performed by SE-HPLC, UV-Vis spectroscopy, and dynamic light scattering. With this new approach, we could not only reliably detect different HMW species but also quantify and classify them in an automated manner. Our method achieves high-throughput requirements, and the selection of various fluorescent dyes enables a broad range of applications.

  3. High-throughput cultivation and screening platform for unicellular phototrophs.

    PubMed

    Tillich, Ulrich M; Wolter, Nick; Schulze, Katja; Kramer, Dan; Brödel, Oliver; Frohme, Marcus

    2014-09-16

    High-throughput cultivation and screening methods allow a parallel, miniaturized and cost-efficient processing of many samples. These methods, however, have not been generally established for phototrophic organisms such as microalgae or cyanobacteria. In this work we describe and test high-throughput methods with the model organism Synechocystis sp. PCC6803. The required technical automation for these processes was achieved with a Tecan Freedom Evo 200 pipetting robot. The cultivation was performed in 2.2 ml deepwell microtiter plates within a cultivation chamber outfitted with programmable shaking conditions, variable illumination, variable temperature, and an adjustable CO2 atmosphere. Each microtiter-well within the chamber functions as a separate cultivation vessel with reproducible conditions. The automated measurement of various parameters such as growth, full absorption spectrum, chlorophyll concentration, MALDI-TOF-MS, as well as a novel vitality measurement protocol, has already been established and can be monitored during cultivation. Measurement of growth parameters can be used as input for the system to allow for periodic automatic dilutions and therefore a semi-continuous cultivation of hundreds of cultures in parallel. The system also allows the automatic generation of mid- and long-term backups of cultures to repeat experiments or to retrieve strains of interest. The presented platform allows for high-throughput cultivation and screening of Synechocystis sp. PCC6803. The platform should be usable for many phototrophic microorganisms as is, and be adaptable for even more. A variety of analyses are already established and the platform is easily expandable both in quality, i.e. with further parameters to screen for additional targets, and in quantity, i.e. size or number of processed samples.
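
    The growth-triggered dilution step described above can be sketched as a simple decision rule; the OD values and the replace-with-fresh-medium policy shown here are assumptions, not the platform's actual scripts.

```python
def dilution_volume(od_measured, od_target, well_volume_ml=2.2):
    """Volume of culture to replace with fresh medium so OD falls to the target."""
    if od_measured <= od_target:
        return 0.0                      # no dilution needed yet
    keep_fraction = od_target / od_measured
    return well_volume_ml * (1.0 - keep_fraction)

# Example: an over-grown well at OD 1.2 diluted back to OD 0.4 in a 2.2 ml well
print(round(dilution_volume(od_measured=1.2, od_target=0.4), 3))   # ~1.467 ml
```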

  4. High-throughput screening of dye-ligands for chromatography.

    PubMed

    Kumar, Sunil; Punekar, Narayan S

    2014-01-01

    Dye-ligand-based chromatography has become popular after Cibacron Blue, the first reactive textile dye, found application for protein purification. Many other textile dyes have since been successfully used to purify a number of proteins and enzymes. While the exact nature of their interaction with target proteins is often unclear, dye-ligands are thought to mimic the structural features of their corresponding substrates, cofactors, etc. The dye-ligand affinity matrices are therefore considered pseudo-affinity matrices. In addition, dye-ligands may simply bind with proteins due to electrostatic, hydrophobic, and hydrogen-bonding interactions. Because of their low cost, ready availability, and structural stability, dye-ligand affinity matrices have gained much popularity. The choice of a large number of dye structures offers a range of matrices to be prepared and tested. When presented in the high-throughput screening mode, these dye-ligand matrices provide a formidable tool for protein purification. One could pick from the list of dye-ligands already available or build a systematic library of such structures for use. A high-throughput screen may be set up to choose the best dye-ligand matrix as well as ideal conditions for binding and elution for a given protein. The mode of operation could be either manual or automated. The technology is available to test the performance of dye-ligand matrices in small volumes in an automated liquid-handling workstation. Screening a systematic library of dye-ligand structures can help establish a structure-activity relationship. While the origins of dye-ligand chromatography lay in exploiting pseudo-affinity, it is now possible to design very specific biomimetic dye structures. High-throughput screening will be of value in this endeavor as well.

  5. WE-E-BRE-03: Biological Validation of a Novel High-Throughput Irradiator for Predictive Radiation Sensitivity Bioassays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, TL; Martin, JA; Shepard, AJ

    2014-06-15

    Purpose: The large dose-response variation in both tumor and normal cells between individual patients has led to the recent implementation of predictive bioassays of patient-specific radiation sensitivity in order to personalize radiation therapy. This exciting new clinical paradigm has led us to develop a novel high-throughput, variable dose-rate irradiator to accompany these efforts. Here we present the biological validation of this irradiator through the use of human cells as a relative dosimeter assessed by two metrics, DNA double-strand break repair pathway modulation and intercellular reactive oxygen species production. Methods: Immortalized human tonsillar epithelial cells were cultured in 96-well microtiter plates and irradiated in groups of eight wells to absorbed doses of 0, 0.5, 1, 2, 4, and 8 Gy. High-throughput immunofluorescent microscopy was used to detect γH2AX, a recruiter of the DNA double-strand break repair machinery. The same analysis was performed with the cells stained with CM-H2DCFDA, which produces a fluorescent adduct when exposed to reactive oxygen species during the irradiation cycle. Results: Irradiations of the immortalized human tonsillar epithelial cells at absorbed doses of 0, 0.5, 1, 2, 4, and 8 Gy produced excellent linearity in γH2AX and CM-H2DCFDA with R² values of 0.9939 and 0.9595, respectively. Single-cell gel electrophoresis experimentation for the detection of physical DNA double-strand breaks is ongoing. Conclusions: This work indicates significant potential for our high-throughput variable dose-rate irradiator for patient-specific predictive radiation sensitivity bioassays. This irradiator provides a powerful tool by increasing the efficiency and number of assay techniques available to help personalize radiation therapy.

  6. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hui

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalyst screening and single cell study. First, the author introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, the author demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, individual peaks in the fluorescence electropherograms were identified as serotonin released from the granular core on contact with the surrounding fluid.

  7. TreeMAC: Localized TDMA MAC protocol for real-time high-data-rate sensor networks

    USGS Publications Warehouse

    Song, W.-Z.; Huang, R.; Shirazi, B.; LaHusen, R.

    2009-01-01

    Earlier sensor network MAC protocols focus on energy conservation in low-duty cycle applications, while some recent applications involve real-time high-data-rate signals. This motivates us to design an innovative localized TDMA MAC protocol to achieve high throughput and low congestion in data collection sensor networks, besides energy conservation. TreeMAC divides a time cycle into frames and each frame into slots. A parent node determines the children's frame assignment based on their relative bandwidth demand, and each node calculates its own slot assignment based on its hop-count to the sink. This innovative 2-dimensional frame-slot assignment algorithm has the following nice theoretical properties. First, given any node, at any time slot, there is at most one active sender in its neighborhood (including itself). Second, the packet scheduling with TreeMAC is bufferless, which therefore minimizes the probability of network congestion. Third, the data throughput to the gateway is at least 1/3 of the optimum assuming reliable links. Our experiments on a 24-node testbed show that the TreeMAC protocol significantly improves network throughput, fairness, and energy efficiency compared to TinyOS's default CSMA MAC protocol and a recent TDMA MAC protocol, Funneling-MAC. Partial results of this paper were published in Song, Huang, Shirazi and Lahusen [W.-Z. Song, R. Huang, B. Shirazi, and R. Lahusen, TreeMAC: Localized TDMA MAC protocol for high-throughput and fairness in sensor networks, in: The 7th Annual IEEE International Conference on Pervasive Computing and Communications, PerCom, March 2009]. Our new contributions include analyses of TreeMAC's performance from various aspects, along with more implementation detail and additional evaluation. © 2009 Elsevier B.V.
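
    A minimal sketch of the two-dimensional frame-slot assignment idea described above, assuming a simple proportional split of a parent's frames and a slot derived from the hop count; this is illustrative, not the authors' implementation.

```python
def assign_frames(parent_frames, child_demands):
    """Split a parent's frames among its children roughly in proportion to
    their relative bandwidth demand (hypothetical proportional policy)."""
    total = sum(child_demands.values())
    assignment, cursor = {}, 0
    for child, demand in child_demands.items():
        share = max(1, round(len(parent_frames) * demand / total))
        assignment[child] = parent_frames[cursor:cursor + share]
        cursor += share
    return assignment

def slot_for(hop_count, slots_per_frame=3):
    """Each node derives its own slot from its hop count to the sink."""
    return hop_count % slots_per_frame

# A parent owning frames 0..5 with two children of unequal demand
print(assign_frames(list(range(6)), {"A": 2, "B": 1}))   # {'A': [0, 1, 2, 3], 'B': [4, 5]}
print(slot_for(hop_count=4))                             # slot 1
```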

  8. Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y; Grossman, JC

    2014-12-01

    Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identify possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy-density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.

  9. High-Throughput Screening of Chemical Effects on Steroidogenesis Using H295R Human Adrenocortical Carcinoma Cells.

    PubMed

    Karmaus, Agnes L; Toole, Colleen M; Filer, Dayne L; Lewis, Kenneth C; Martin, Matthew T

    2016-04-01

    Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2060 chemical samples on steroidogenesis via high-performance liquid chromatography followed by tandem mass spectrometry quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a 3 stage screening strategy. The first stage established the maximum tolerated concentration (MTC; ≥ 70% viability) per sample. The second stage quantified changes in hormone levels at the MTC whereas the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were prestimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2060 chemical samples evaluated, 524 samples were selected for 6-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into 5 distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A distinct pattern was observed between imidazole and triazole fungicides suggesting potentially distinct mechanisms of action. From a chemical testing and prioritization perspective, this assay platform provides a robust model for high-throughput screening of chemicals for effects on steroidogenesis. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
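
    The staged selection logic can be summarized in a short sketch; the 70% viability cutoff and the four-hormone criterion come from the abstract, while the data structures and function names are hypothetical.

```python
def max_tolerated_conc(viability_by_conc, cutoff=0.70):
    """Stage 1: highest tested concentration keeping viability >= 70%."""
    passing = [c for c, v in sorted(viability_by_conc.items()) if v >= cutoff]
    return passing[-1] if passing else None

def select_for_cr(hormone_hits_at_mtc, min_hormones=4):
    """Stages 2-3: flag samples significantly altering >= 4 of the 10 hormones
    at the MTC for 6-point concentration-response follow-up."""
    return {s for s, hits in hormone_hits_at_mtc.items() if hits >= min_hormones}

viability = {0.1: 0.98, 1.0: 0.91, 10.0: 0.74, 100.0: 0.35}   # conc (uM) -> viability
print(max_tolerated_conc(viability))                           # -> 10.0
print(select_for_cr({"chem_A": 5, "chem_B": 2}))               # -> {'chem_A'}
```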

  10. CA resist with high sensitivity and sub-100-nm resolution for advanced mask and device making

    NASA Astrophysics Data System (ADS)

    Kwong, Ranee W.; Huang, Wu-Song; Hartley, John G.; Moreau, Wayne M.; Robinson, Christopher F.; Angelopoulos, Marie; Magg, Christopher; Lawliss, Mark

    2000-07-01

    Recently, there has been significant interest in using CA resists for electron beam (E-Beam) applications including mask making, direct write, and projection printing. CA resists provide superior lithographic performance in comparison to traditional non-CA E-beam resists, in particular in contrast, resolution, and sensitivity. However, most of the commercially available CA resists raise concerns about airborne base contaminants and sensitivity to PAB and/or PEB temperatures. In this presentation, we will discuss a new improved ketal resist system referred to as KRS-XE, which exhibits excellent lithography, is robust toward airborne base, is compatible with 0.263 N TMAH aqueous developer, and exhibits a large PAB/PEB latitude. With the combination of a high-performance mask-making E-beam exposure tool, the high-kV (75 kV) shaped-beam system EL4+, and the KRS-XE resist, we have printed 75 nm line/space features with excellent profile control at a dose of 13 μC/cm² at 75 kV. The shaped-beam vector-scan system used here offers unique advantages both in resolving small features and in throughput. Overhead in EL4+ limits the system's ability to fully exploit the sensitivity of the new resist for throughput. The EL5 system, currently in the build phase, has sufficiently low overhead that it is projected to print a 4X, 16G DRAM mask with OPC in under 3 hours with the CA resist. We will discuss the throughput advantages of the next-generation EL5 system over the existing EL4+. In addition, we will show the resolution of KRS-XE down to 70 nm using the PREVAIL projection printing system.

  11. High-Throughput Screening of Chemical Effects on Steroidogenesis Using H295R Human Adrenocortical Carcinoma Cells

    PubMed Central

    Toole, Colleen M.; Filer, Dayne L.; Lewis, Kenneth C.; Martin, Matthew T.

    2016-01-01

    Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2060 chemical samples on steroidogenesis via high-performance liquid chromatography followed by tandem mass spectrometry quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a 3 stage screening strategy. The first stage established the maximum tolerated concentration (MTC; ≥ 70% viability) per sample. The second stage quantified changes in hormone levels at the MTC whereas the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were prestimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2060 chemical samples evaluated, 524 samples were selected for 6-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into 5 distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A distinct pattern was observed between imidazole and triazole fungicides suggesting potentially distinct mechanisms of action. From a chemical testing and prioritization perspective, this assay platform provides a robust model for high-throughput screening of chemicals for effects on steroidogenesis. PMID:26781511

  12. Fully-automated, high-throughput micro-computed tomography analysis of body composition enables therapeutic efficacy monitoring in preclinical models.

    PubMed

    Wyatt, S K; Barck, K H; Kates, L; Zavala-Solorio, J; Ross, J; Kolumam, G; Sonoda, J; Carano, R A D

    2015-11-01

    The ability to non-invasively measure body composition in mouse models of obesity and obesity-related disorders is essential for elucidating mechanisms of metabolic regulation and monitoring the effects of novel treatments. These studies aimed to develop a fully automated, high-throughput micro-computed tomography (micro-CT)-based image analysis technique for longitudinal quantitation of adipose, non-adipose and lean tissue as well as bone and demonstrate utility for assessing the effects of two distinct treatments. An initial validation study was performed in diet-induced obesity (DIO) and control mice on a vivaCT 75 micro-CT system. Subsequently, four groups of DIO mice were imaged pre- and post-treatment with an experimental agonistic antibody against fibroblast growth factor receptor 1 (anti-FGFR1, R1MAb1), a control immunoglobulin G antibody, a known anorectic antiobesity drug (rimonabant, SR141716), or solvent control. The body composition analysis technique was then ported to a faster micro-CT system (CT120) to markedly increase throughput as well as to evaluate the use of micro-CT image intensity for hepatic lipid content in DIO and control mice. Ex vivo chemical analysis and colorimetric analysis of the liver triglycerides were performed as the standard metrics for correlation with body composition and hepatic lipid status, respectively. Micro-CT-based body composition measures correlate with ex vivo chemical analysis metrics and enable distinction between DIO and control mice. R1MAb1 and rimonabant have differing effects on body composition as assessed by micro-CT. High-throughput body composition imaging is possible using a modified CT120 system. Micro-CT also provides a non-invasive assessment of hepatic lipid content. This work describes, validates and demonstrates utility of a fully automated image analysis technique to quantify in vivo micro-CT-derived measures of adipose, non-adipose and lean tissue, as well as bone. These body composition metrics highly correlate with standard ex vivo chemical analysis and enable longitudinal evaluation of body composition and therapeutic efficacy monitoring.
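
    A hypothetical sketch of intensity-threshold segmentation for body composition follows; the intensity ranges and voxel size are assumptions for illustration, not the published analysis parameters.

```python
import numpy as np

def body_composition(ct_volume, voxel_ml=1e-5,
                     adipose=(-300, -50), lean=(-49, 150), bone_min=300):
    """Bin voxels into adipose, lean and bone compartments by intensity range."""
    return {
        "adipose_ml": float(np.sum((ct_volume >= adipose[0]) & (ct_volume <= adipose[1]))) * voxel_ml,
        "lean_ml": float(np.sum((ct_volume >= lean[0]) & (ct_volume <= lean[1]))) * voxel_ml,
        "bone_ml": float(np.sum(ct_volume >= bone_min)) * voxel_ml,
    }

rng = np.random.default_rng(0)
synthetic = rng.integers(-400, 600, size=(20, 20, 20))   # stand-in for a CT volume
print(body_composition(synthetic))
```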

  13. High-throughput assessment of oxidative respiration in fish embryos: Advancing adverse outcome pathways for mitochondrial dysfunction.

    PubMed

    Souders, Christopher L; Liang, Xuefang; Wang, Xiaohong; Ector, Naomi; Zhao, Yuan H; Martyniuk, Christopher J

    2018-06-01

    Mitochondrial dysfunction is a prevalent molecular event that can result in multiple adverse outcomes. Recently, a novel high throughput method to assess metabolic capacity in fish embryos following exposure to chemicals has been adapted for environmental toxicology. Assessments of oxygen consumption rates using the Seahorse XF(e) 24/96 Extracellular Flux Analyzer (Agilent Technologies) can be used to garner insight into toxicant effects at early stages of development. Here we synthesize the current state of the science using high throughput metabolic profiling in zebrafish embryos, and present considerations for those wishing to adopt high throughput methods for mitochondrial bioenergetics into their research. Chemicals that have been investigated in zebrafish using this metabolic platform include herbicides (e.g. paraquat, diquat), industrial compounds (e.g. benzo-[a]-pyrene, tributyltin), natural products (e.g. quercetin), and anti-bacterial chemicals (i.e. triclosan). Some of these chemicals inhibit mitochondrial endpoints in the μM-mM range, and reduce basal respiration, maximum respiration, and spare capacity. We present a theoretical framework for how one can use mitochondrial performance data in zebrafish to categorize chemicals of concern and prioritize mitochondrial toxicants. Noteworthy is that our studies demonstrate that there can be considerable variation in basal respiration of untreated zebrafish embryos due to clutch-specific effects as well as individual variability, and basal oxygen consumption rates (OCR) can vary on average between 100 and 300 pmol/min/embryo. We also compare OCR between chorionated and dechorionated embryos, as both models are employed to test chemicals. After 24 h, dechorionated embryos remain responsive to mitochondrial toxicants, although they show a blunted response to the uncoupling agent carbonylcyanide-4-trifluoromethoxyphenylhydrazone (FCCP); dechorionated embryos are therefore a viable option for investigations into mitochondrial bioenergetics. We present an adverse outcome pathway framework that incorporates endpoints related to mitochondrial bioenergetics. High throughput bioenergetics assays conducted using whole embryos are expected to support adverse outcome pathways for mitochondrial dysfunction. Copyright © 2018 Elsevier B.V. All rights reserved.
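
    The basic bioenergetic arithmetic referenced above (maximal respiration from the FCCP response, spare capacity as the difference from basal) can be written out as a short sketch with illustrative values.

```python
def bioenergetic_summary(basal_ocr, fccp_ocr, non_mitochondrial_ocr=0.0):
    """Derive maximal respiration and spare capacity from OCR values
    (pmol O2/min per embryo); FCCP-stimulated OCR approximates the maximum."""
    maximal = fccp_ocr - non_mitochondrial_ocr
    spare_capacity = maximal - (basal_ocr - non_mitochondrial_ocr)
    return {"basal": basal_ocr, "maximal": maximal, "spare_capacity": spare_capacity}

# Basal OCR of untreated embryos reportedly varies around 100-300 pmol/min/embryo
print(bioenergetic_summary(basal_ocr=200.0, fccp_ocr=340.0))
```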

  14. A quantitative and high-throughput assay of human papillomavirus DNA replication.

    PubMed

    Gagnon, David; Fradet-Turcotte, Amélie; Archambault, Jacques

    2015-01-01

    Replication of the human papillomavirus (HPV) double-stranded DNA genome is accomplished by the two viral proteins E1 and E2 in concert with host DNA replication factors. HPV DNA replication is an established model of eukaryotic DNA replication and a potential target for antiviral therapy. Assays to measure the transient replication of HPV DNA in transfected cells have been developed, which rely on a plasmid carrying the viral origin of DNA replication (ori) together with expression vectors for E1 and E2. Replication of the ori-plasmid is typically measured by Southern blotting or PCR analysis of newly replicated DNA (i.e., DpnI digested DNA) several days post-transfection. Although extremely valuable, these assays have been difficult to perform in a high-throughput and quantitative manner. Here, we describe a modified version of the transient DNA replication assay that circumvents these limitations by incorporating a firefly luciferase expression cassette in cis of the ori. Replication of this ori-plasmid by E1 and E2 results in increased levels of firefly luciferase activity that can be accurately quantified and normalized to those of Renilla luciferase expressed from a control plasmid, thus obviating the need for DNA extraction, digestion, and analysis. We provide a detailed protocol for performing the HPV type 31 DNA replication assay in a 96-well plate format suitable for small-molecule screening and EC50 determinations. The quantitative and high-throughput nature of the assay should greatly facilitate the study of HPV DNA replication and the identification of inhibitors thereof.
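
    A hedged sketch of the dual-luciferase readout: the firefly signal from the ori-plasmid is normalized to Renilla from the control plasmid, and fold replication is taken relative to a background transfection lacking E1/E2. The background comparison shown here is an assumption about how the normalized values would be used, and the numbers are placeholders.

```python
def replication_signal(firefly, renilla):
    """Firefly (ori-plasmid) activity normalized to Renilla (control plasmid)."""
    return firefly / renilla

def fold_replication(sample_ff, sample_rl, background_ff, background_rl):
    """Fold increase over a hypothetical control transfection lacking E1/E2."""
    return replication_signal(sample_ff, sample_rl) / replication_signal(background_ff, background_rl)

print(round(fold_replication(sample_ff=9.2e5, sample_rl=4.1e4,
                             background_ff=6.0e4, background_rl=3.9e4), 1))   # ~14.6
```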

  15. Rapid analysis of aminoglycoside antibiotics in bovine tissues using disposable pipette extraction and ultrahigh performance liquid chromatography - tandem mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction at previously validated conditions, cleanup using disposable pip...

  16. EPA's ToxCast Program for Predicting Hazard and Prioritizing the Toxicity Testing of Environmental Chemicals

    EPA Science Inventory

    An alternative is to perform a set of relatively inexpensive and rapid high throughput screening (HTS) assays, derive signatures predictive of effects or modes of chemical toxicity from the HTS data, then use these predictions to prioritize chemicals for more detailed analysis. T...

  17. RootScan: Software for high-throughput analysis of root anatomical traits

    USDA-ARS?s Scientific Manuscript database

    RootScan is a program for semi-automated image analysis of anatomical phenes in root cross-sections. RootScan uses pixel value thresholds to separate the cross-section from its background and to visually dissect it into tissue regions. Area measurements and object counts are performed within various...

  18. Development and Implementation of High-Throughput SNP Genotyping in Barley

    USDA-ARS?s Scientific Manuscript database

    Approximately 22,000 SNPs were identified from barley ESTs and sequenced amplicons; 4,596 of them were tested for performance in three pilot phase Illumina GoldenGate assays. Pilot phase data from three barley doubled haploid mapping populations supported the production of an initial consensus map, ...

  19. 40 CFR Table 3 to Subpart Eeee of... - Operating Limits-High Throughput Transfer Racks

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40, Protection of Environment, Vol. 12 (revised 2010-07-01): Table 3 to Subpart EEEE of Part 63, Operating Limits for High Throughput Transfer Racks. As stated in § 63.2346(e), you must comply with the operating limits for existing...

  20. Studies of Several New Modifications of Aggressive Packet Combining to Achieve Higher Throughput, Based on Correction Capability of Disjoint Error Vectors

    NASA Astrophysics Data System (ADS)

    Chakraborty, Swarnendu Kumar; Goswami, Rajat Subhra; Bhunia, Chandan Tilak; Bhunia, Abhinandan

    2016-06-01

    The aggressive packet combining (APC) scheme is well established in the literature, and several modifications have been studied earlier for improving throughput. In this paper, three new modifications of APC are proposed. The performance of the proposed modified APC is studied by simulation and reported here. A hybrid scheme is also proposed for achieving higher throughput, and the disjoint factor of conventional APC is compared with that of the proposed schemes.

  1. The MaNGA integral field unit fiber feed system for the Sloan 2.5 m telescope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drory, N.; MacDonald, N.; Byler, N.

    2015-02-01

    We describe the design, manufacture, and performance of bare-fiber integral field units (IFUs) for the SDSS-IV survey Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) on the Sloan 2.5 m telescope at Apache Point Observatory. MaNGA is a luminosity-selected integral-field spectroscopic survey of 10^4 local galaxies covering 360–1030 nm at R∼2200. The IFUs have hexagonal dense packing of fibers with packing regularity of 3 μm (rms), and throughput of 96 ± 0.5% from 350 nm to 1 μm in the lab. Their sizes range from 19 to 127 fibers (3–7 hexagonal layers) using Polymicro FBP 120:132:150 μm core:clad:buffer fibers to reach a fill fraction of 56%. High throughput (and low focal-ratio degradation (FRD)) is achieved by maintaining the fiber cladding and buffer intact, ensuring excellent surface polish, and applying a multi-layer anti-reflection (AR) coating of the input and output surfaces. In operations on-sky, the IFUs show only an additional 2.3% FRD-related variability in throughput despite repeated mechanical stressing during plate plugging (however other losses are present). The IFUs achieve on-sky throughput 5% above the single-fiber feeds used in SDSS-III/BOSS, attributable to equivalent performance compared to single fibers and additional gains from the AR coating. The manufacturing process is geared toward mass-production of high-multiplex systems. The low-stress process involves a precision ferrule with a hexagonal inner shape designed to lead inserted fibers to settle in a dense hexagonal pattern. The ferrule ID is tapered at progressively shallower angles toward its tip and the final 2 mm are straight and only a few microns larger than necessary to hold the desired number of fibers. Our IFU manufacturing process scales easily to accommodate other fiber sizes and can produce IFUs with substantially larger fiber counts. To assure quality, automated testing in a simple and inexpensive system enables complete characterization of throughput and fiber metrology. Future applications include larger IFUs, higher fill factors with stripped buffer, de-cladding, and lenslet coupling.

  2. The MaNGA Integral Field Unit Fiber Feed System for the Sloan 2.5 m Telescope

    NASA Astrophysics Data System (ADS)

    Drory, N.; MacDonald, N.; Bershady, M. A.; Bundy, K.; Gunn, J.; Law, D. R.; Smith, M.; Stoll, R.; Tremonti, C. A.; Wake, D. A.; Yan, R.; Weijmans, A. M.; Byler, N.; Cherinka, B.; Cope, F.; Eigenbrot, A.; Harding, P.; Holder, D.; Huehnerhoff, J.; Jaehnig, K.; Jansen, T. C.; Klaene, M.; Paat, A. M.; Percival, J.; Sayres, C.

    2015-02-01

    We describe the design, manufacture, and performance of bare-fiber integral field units (IFUs) for the SDSS-IV survey Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) on the Sloan 2.5 m telescope at Apache Point Observatory. MaNGA is a luminosity-selected integral-field spectroscopic survey of 10^4 local galaxies covering 360-1030 nm at R∼2200. The IFUs have hexagonal dense packing of fibers with packing regularity of 3 μm (rms), and throughput of 96 ± 0.5% from 350 nm to 1 μm in the lab. Their sizes range from 19 to 127 fibers (3-7 hexagonal layers) using Polymicro FBP 120:132:150 μm core:clad:buffer fibers to reach a fill fraction of 56%. High throughput (and low focal-ratio degradation (FRD)) is achieved by maintaining the fiber cladding and buffer intact, ensuring excellent surface polish, and applying a multi-layer anti-reflection (AR) coating of the input and output surfaces. In operations on-sky, the IFUs show only an additional 2.3% FRD-related variability in throughput despite repeated mechanical stressing during plate plugging (however other losses are present). The IFUs achieve on-sky throughput 5% above the single-fiber feeds used in SDSS-III/BOSS, attributable to equivalent performance compared to single fibers and additional gains from the AR coating. The manufacturing process is geared toward mass-production of high-multiplex systems. The low-stress process involves a precision ferrule with a hexagonal inner shape designed to lead inserted fibers to settle in a dense hexagonal pattern. The ferrule ID is tapered at progressively shallower angles toward its tip and the final 2 mm are straight and only a few microns larger than necessary to hold the desired number of fibers. Our IFU manufacturing process scales easily to accommodate other fiber sizes and can produce IFUs with substantially larger fiber counts. To assure quality, automated testing in a simple and inexpensive system enables complete characterization of throughput and fiber metrology. Future applications include larger IFUs, higher fill factors with stripped buffer, de-cladding, and lenslet coupling.

  3. Compound Transfer by Acoustic Droplet Ejection Promotes Quality and Efficiency in Ultra-High-Throughput Screening Campaigns.

    PubMed

    Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H

    2016-02-01

    Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.

  4. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    PubMed

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, thus the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available which should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
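
    The shrinkage behavior discussed above can be illustrated with a conjugate Normal-Normal toy model; this is a sketch of the general idea, not the authors' adaptiveHM code (which is an R package).

```python
def shrink(obs_mean, obs_var, prior_mean, prior_var):
    """Posterior mean of a feature's effect under a Normal-Normal model:
    a weighted average of the observed mean and the prior mean."""
    w = prior_var / (prior_var + obs_var)     # weight given to the observed data
    return w * obs_mean + (1.0 - w) * prior_mean

# With few observations (large obs_var) the estimate is pulled strongly toward
# the prior; historical data could be used to set prior_mean/prior_var so that
# this pull (the shrinkage) does not over-correct genuine signals.
print(shrink(obs_mean=2.5, obs_var=1.0, prior_mean=0.0, prior_var=0.25))   # 0.5
```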

  5. Validation of high-throughput single cell analysis methodology.

    PubMed

    Devonshire, Alison S; Baradez, Marc-Olivier; Morley, Gary; Marshall, Damian; Foy, Carole A

    2014-05-01

    High-throughput quantitative polymerase chain reaction (qPCR) approaches enable profiling of multiple genes in single cells, bringing new insights to complex biological processes and offering opportunities for single cell-based monitoring of cancer cells and stem cell-based therapies. However, workflows with well-defined sources of variation are required for clinical diagnostics and testing of tissue-engineered products. In a study of neural stem cell lines, we investigated the performance of lysis, reverse transcription (RT), preamplification (PA), and nanofluidic qPCR steps at the single cell level in terms of efficiency, precision, and limit of detection. We compared protocols using a separate lysis buffer with cell capture directly in RT-PA reagent. The two methods were found to have similar lysis efficiencies, whereas the direct RT-PA approach showed improved precision. Digital PCR was used to relate preamplified template copy numbers to Cq values and reveal where low-quality signals may affect the analysis. We investigated the impact of calibration and data normalization strategies as a means of minimizing the impact of inter-experimental variation on gene expression values and found that both approaches can improve data comparability. This study provides validation and guidance for the application of high-throughput qPCR workflows for gene expression profiling of single cells. Copyright © 2014 Elsevier Inc. All rights reserved.
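
    For context, the reference-gene normalization underlying such single-cell qPCR workflows reduces to standard delta-Cq arithmetic, sketched below with placeholder values; this is not the study's specific calibration procedure.

```python
def relative_expression(cq_target, cq_reference, efficiency=2.0):
    """Expression of a target gene relative to a reference gene in one cell,
    assuming the same amplification efficiency for both assays."""
    return efficiency ** (cq_reference - cq_target)

print(round(relative_expression(cq_target=24.3, cq_reference=20.1), 4))   # ~0.0544
```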

  6. Improved Selection of Internal Transcribed Spacer-Specific Primers Enables Quantitative, Ultra-High-Throughput Profiling of Fungal Communities

    PubMed Central

    Bokulich, Nicholas A.

    2013-01-01

    Ultra-high-throughput sequencing (HTS) of fungal communities has been restricted by short read lengths and primer amplification bias, slowing the adoption of newer sequencing technologies to fungal community profiling. To address these issues, we evaluated the performance of several common internal transcribed spacer (ITS) primers and designed a novel primer set and work flow for simultaneous quantification and species-level interrogation of fungal consortia. Primer comparison and validation were performed in silico and by sequencing a “mock community” of mixed yeast species to explore the challenges of amplicon length and amplification bias for reconstructing defined yeast community structures. The amplicon size and distribution of this primer set are smaller than for all preexisting ITS primer sets, maximizing sequencing coverage of hypervariable ITS domains by very-short-amplicon, high-throughput sequencing platforms. This feature also enables the optional integration of quantitative PCR (qPCR) directly into the HTS preparatory work flow by substituting qPCR with these primers for standard PCR, yielding quantification of individual community members. The complete work flow described here, utilizing any of the qualified primer sets evaluated, can rapidly profile mixed fungal communities, and it capably reconstructed well-characterized beer and wine fermentation fungal communities. PMID:23377949

  7. ChemHTPS - A virtual high-throughput screening program suite for the chemical and materials sciences

    NASA Astrophysics Data System (ADS)

    Afzal, Mohammad Atif Faiz; Evangelista, William; Hachmann, Johannes

    The discovery of new compounds, materials, and chemical reactions with exceptional properties is key to addressing the grand challenges in innovation, energy, and sustainability. This process can be dramatically accelerated by means of the virtual high-throughput screening (HTPS) of large-scale candidate libraries. The resulting data can further be used to study the underlying structure-property relationships and thus facilitate rational design capability. This approach has been extensively used for many years in the drug discovery community. However, the lack of openly available virtual HTPS tools is limiting the use of these techniques in various other applications such as photovoltaics, optoelectronics, and catalysis. Thus, we developed ChemHTPS, a general-purpose, comprehensive and user-friendly suite that will allow users to efficiently perform large in silico modeling studies and high-throughput analyses in these applications. ChemHTPS also includes a massively parallel molecular library generator which offers a multitude of options to customize and restrict the scope of the enumerated chemical space and thus tailor it for the demands of specific applications. To streamline the non-combinatorial exploration of chemical space, we incorporate genetic algorithms into the framework. In addition to implementing smarter algorithms, we also focus on ease of use, workflow, and code integration to make this technology more accessible to the community.

  8. Clinical application of high throughput molecular screening techniques for pharmacogenomics

    PubMed Central

    Wiita, Arun P; Schrijver, Iris

    2011-01-01

    Genetic analysis is one of the fastest-growing areas of clinical diagnostics. Fortunately, as our knowledge of clinically relevant genetic variants rapidly expands, so does our ability to detect these variants in patient samples. Increasing demand for genetic information may necessitate the use of high throughput diagnostic methods as part of clinically validated testing. Here we provide a general overview of our current and near-future abilities to perform large-scale genetic testing in the clinical laboratory. First we review in detail molecular methods used for high throughput mutation detection, including techniques able to monitor thousands of genetic variants for a single patient or to genotype a single genetic variant for thousands of patients simultaneously. These methods are analyzed in the context of pharmacogenomic testing in the clinical laboratories, with a focus on tests that are currently validated as well as those that hold strong promise for widespread clinical application in the near future. We further discuss the unique economic and clinical challenges posed by pharmacogenomic markers. Our ability to detect genetic variants frequently outstrips our ability to accurately interpret them in a clinical context, carrying implications both for test development and introduction into patient management algorithms. These complexities must be taken into account prior to the introduction of any pharmacogenomic biomarker into routine clinical testing. PMID:23226057

  9. Validation of a high-throughput real-time polymerase chain reaction assay for the detection of capripoxviral DNA.

    PubMed

    Stubbs, Samuel; Oura, Chris A L; Henstock, Mark; Bowden, Timothy R; King, Donald P; Tuppurainen, Eeva S M

    2012-02-01

    Capripoxviruses, which are endemic in much of Africa and Asia, are the aetiological agents of economically devastating poxviral diseases in cattle, sheep and goats. The aim of this study was to validate a high-throughput real-time PCR assay for routine diagnostic use in a capripoxvirus reference laboratory. The performance of two previously published real-time PCR methods was compared using commercially available reagents, including the amplification kits recommended in the original publications. Furthermore, both manual and robotic extraction methods used to prepare template nucleic acid were evaluated using samples collected from experimentally infected animals. The optimised assay had an analytical sensitivity of at least 63 target DNA copies per reaction, displayed a greater diagnostic sensitivity compared to conventional gel-based PCR, detected capripoxviruses isolated from outbreaks around the world and did not amplify DNA from related viruses in the genera Orthopoxvirus or Parapoxvirus. The high-throughput robotic DNA extraction procedure did not adversely affect the sensitivity of the assay compared to manual preparation of PCR templates. This laboratory-based assay provides a rapid and robust method to detect capripoxviruses following suspicion of disease in endemic or disease-free countries. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  10. High-throughput design and optimization of fast lithium ion conductors by the combination of bond-valence method and density functional theory

    NASA Astrophysics Data System (ADS)

    Xiao, Ruijuan; Li, Hong; Chen, Liquan

    2015-09-01

    Looking for solid state electrolytes with fast lithium ion conduction is an important prerequisite for developing all-solid-state lithium secondary batteries. By combining simulation techniques at different levels of accuracy, e.g. the bond-valence (BV) method and density functional theory (DFT), a high-throughput design and optimization scheme is proposed for searching for fast lithium ion conductors as candidate solid state electrolytes for lithium rechargeable batteries. The screening of more than 1000 compounds is performed through the BV-based method, and the ability to predict a reliable tendency of the Li+ migration energy barriers is confirmed by comparing with the results from DFT calculations. β-Li3PS4 is taken as a model system to demonstrate the application of this combined method in optimizing the properties of solid electrolytes. By employing high-throughput DFT simulations on more than 200 structures of the doping derivatives of β-Li3PS4, the effects of doping on the ionic conductivities in this material are predicted by the BV calculations. The O-doping scheme is proposed as a promising way to improve the kinetic properties of this material, and the validity of the optimization is confirmed by first-principles molecular dynamics (FPMD) simulations.
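
    The two-tier funnel described above (a cheap BV pass over all candidates, followed by expensive DFT refinement on the survivors) can be sketched as follows; compound names, barrier values, and the cutoff are hypothetical placeholders.

```python
def bv_screen(bv_barriers, cutoff_ev=0.6):
    """Cheap bond-valence pass: keep compounds with low estimated Li+ barriers."""
    return [name for name, barrier in bv_barriers.items() if barrier < cutoff_ev]

def dft_rank(shortlist, dft_barriers):
    """Expensive DFT pass on the shortlist only; rank by computed barrier."""
    return sorted(((name, dft_barriers[name]) for name in shortlist), key=lambda x: x[1])

bv = {"Li3PS4": 0.35, "LiFeSO4F": 0.90, "Li2ZrO3": 0.55}      # made-up BV estimates (eV)
dft = {"Li3PS4": 0.30, "Li2ZrO3": 0.50}                       # made-up DFT barriers (eV)
print(dft_rank(bv_screen(bv), dft))   # [('Li3PS4', 0.3), ('Li2ZrO3', 0.5)]
```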

  11. Development of a high-throughput screening system for identification of novel reagents regulating DNA damage in human dermal fibroblasts.

    PubMed

    Bae, Seunghee; An, In-Sook; An, Sungkwan

    2015-09-01

    Ultraviolet (UV) radiation is a major inducer of skin aging and accumulated exposure to UV radiation increases DNA damage in skin cells, including dermal fibroblasts. In the present study, we developed a novel DNA repair regulating material discovery (DREAM) system for the high-throughput screening and identification of putative materials regulating DNA repair in skin cells. First, we established a modified lentivirus expressing the luciferase and hypoxanthine phosphoribosyl transferase (HPRT) genes. Then, human dermal fibroblast WS-1 cells were infected with the modified lentivirus and selected with puromycin to establish cells that stably expressed luciferase and HPRT (DREAM-F cells). The first step in the DREAM protocol was a 96-well-based screening procedure, involving the analysis of cell viability and luciferase activity after pretreatment of DREAM-F cells with reagents of interest and post-treatment with UVB radiation, and vice versa. In the second step, we validated certain effective reagents identified in the first step by analyzing the cell cycle, evaluating cell death, and performing HPRT-DNA sequencing in DREAM-F cells treated with these reagents and UVB. This DREAM system is scalable and forms a time-saving high-throughput screening system for identifying novel anti-photoaging reagents regulating DNA damage in dermal fibroblasts.

  12. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis

    PubMed Central

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S.

    2016-01-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, thus the classical ‘large p, small n’ problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available which should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package “adaptiveHM”, which is freely available from https://github.com/benliemory/adaptiveHM. PMID:28919931

  13. An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery

    PubMed Central

    Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing

    2010-01-01

    The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897

  14. Performance evaluation of throughput computing workloads using multi-core processors and graphics processors

    NASA Astrophysics Data System (ADS)

    Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.

    2017-11-01

    The current trend in processor manufacturing focuses on multi-core architectures rather than increasing the clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications, and big data have created huge demand for data processing activities, and such throughput-intensive applications inherently contain data-level parallelism, which is better suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API based programming.

  15. Design and implementation of a high performance network security processor

    NASA Astrophysics Data System (ADS)

    Wang, Haixin; Bai, Guoqiang; Chen, Hongyi

    2010-03-01

    The last few years have seen significant progress in the field of application-specific processors. One example is network security processors (NSPs) that perform various cryptographic operations specified by network security protocols and help to offload the computation-intensive burdens from network processors (NPs). This article presents a high performance NSP system architecture implementation intended for both internet protocol security (IPSec) and secure socket layer (SSL) protocol acceleration, which are widely employed in virtual private network (VPN) and e-commerce applications. The efficient dual one-way pipelined data transfer skeleton and optimised integration scheme of the heterogeneous parallel crypto engine arrays lead to a Gbps-rate NSP, which is programmable with domain-specific descriptor-based instructions. The descriptor-based control flow fragments large data packets and distributes them to the crypto engine arrays, which fully utilises the parallel computation resources and improves the overall system data throughput. A prototyping platform for this NSP design is implemented with a Xilinx XC3S5000-based FPGA chip set. Results show that the design gives a peak throughput for the IPSec ESP tunnel mode of 2.85 Gbps with over 2100 full SSL handshakes per second at a clock rate of 95 MHz.

  16. Performance Improvement in Geographic Routing for Vehicular Ad Hoc Networks

    PubMed Central

    Kaiwartya, Omprakash; Kumar, Sushil; Lobiyal, D. K.; Abdullah, Abdul Hanan; Hassan, Ahmed Nazar

    2014-01-01

    Geographic routing is one of the most investigated themes by researchers for reliable and efficient dissemination of information in Vehicular Ad Hoc Networks (VANETs). Recently, different Geographic Distance Routing (GEDIR) protocols have been suggested in the literature. These protocols focus on reducing the forwarding region towards destination to select the Next Hop Vehicles (NHV). Most of these protocols suffer from the problem of elevated one-hop link disconnection, high end-to-end delay and low throughput even at normal vehicle speed in high vehicle density environment. This paper proposes a Geographic Distance Routing protocol based on Segment vehicle, Link quality and Degree of connectivity (SLD-GEDIR). The protocol selects a reliable NHV using the criteria segment vehicles, one-hop link quality and degree of connectivity. The proposed protocol has been simulated in NS-2 and its performance has been compared with the state-of-the-art protocols: P-GEDIR, J-GEDIR and V-GEDIR. The empirical results clearly reveal that SLD-GEDIR has lower link disconnection and end-to-end delay, and higher throughput as compared to the state-of-the-art protocols. It should be noted that the performance of the proposed protocol is preserved irrespective of vehicle density and speed. PMID:25429415
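
    One way to read the NHV selection criteria above is as a weighted score over candidate neighbors; the sketch below is a hypothetical illustration, and the weights and scoring form are not the paper's actual formulation.

```python
def select_next_hop(neighbors, w_segment=0.4, w_link=0.4, w_degree=0.2):
    """Pick the neighbor with the best combined score of segment membership,
    one-hop link quality and (normalized) degree of connectivity."""
    def score(n):
        return (w_segment * n["in_segment"]
                + w_link * n["link_quality"]
                + w_degree * n["degree_norm"])
    return max(neighbors, key=score)["id"]

print(select_next_hop([
    {"id": "v1", "in_segment": 1, "link_quality": 0.7, "degree_norm": 0.4},
    {"id": "v2", "in_segment": 0, "link_quality": 0.9, "degree_norm": 0.8},
]))   # -> 'v1'
```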

  17. Performance improvement in geographic routing for Vehicular Ad Hoc Networks.

    PubMed

    Kaiwartya, Omprakash; Kumar, Sushil; Lobiyal, D K; Abdullah, Abdul Hanan; Hassan, Ahmed Nazar

    2014-11-25

    Geographic routing is one of the most investigated themes by researchers for reliable and efficient dissemination of information in Vehicular Ad Hoc Networks (VANETs). Recently, different Geographic Distance Routing (GEDIR) protocols have been suggested in the literature. These protocols focus on reducing the forwarding region towards destination to select the Next Hop Vehicles (NHV). Most of these protocols suffer from the problem of elevated one-hop link disconnection, high end-to-end delay and low throughput even at normal vehicle speed in high vehicle density environment. This paper proposes a Geographic Distance Routing protocol based on Segment vehicle, Link quality and Degree of connectivity (SLD-GEDIR). The protocol selects a reliable NHV using the criteria segment vehicles, one-hop link quality and degree of connectivity. The proposed protocol has been simulated in NS-2 and its performance has been compared with the state-of-the-art protocols: P-GEDIR, J-GEDIR and V-GEDIR. The empirical results clearly reveal that SLD-GEDIR has lower link disconnection and end-to-end delay, and higher throughput as compared to the state-of-the-art protocols. It should be noted that the performance of the proposed protocol is preserved irrespective of vehicle density and speed.

  18. HTS techniques for patch clamp-based ion channel screening - advances and economy.

    PubMed

    Farre, Cecilia; Fertig, Niels

    2012-06-01

    Ten years ago, the first publication appeared showing patch clamp recordings performed on a planar glass chip instead of using a conventional patch clamp pipette. "Going planar" proved to revolutionize ion channel drug screening as we know it, by allowing high quality measurements of ion channels and their effectors at a higher throughput and at the same time de-skilling the highly laborious technique. Over the years, platforms evolved in response to user requirements regarding experimental features, data handling plus storage, and suitable target diversity. This article gives a snapshot image of patch clamp-based ion channel screening with focus on platforms developed to meet requirements of high-throughput screening environments. The commercially available platforms are described, along with their benefits and drawbacks in ion channel drug screening. Automated patch clamp (APC) platforms allow faster investigation of a larger number of ion channel active compounds or cell clones than previously possible. Since patch clamp is the only method allowing direct, real-time measurements of ion channel activity, APC holds the promise of picking up high quality leads, where they otherwise would have been overseen using indirect methods. In addition, drug candidate safety profiling can be performed earlier in the drug discovery process, avoiding late-phase compound withdrawal due to safety liability issues, which is highly costly and inefficient.

  19. High throughput atmospheric pressure plasma-induced graft polymerization for identifying protein-resistant surfaces.

    PubMed

    Gu, Minghao; Kilduff, James E; Belfort, Georges

    2012-02-01

    Three critical aspects of searching for and understanding how to find surfaces highly resistant to protein adhesion are addressed here with specific application to synthetic membrane filtration. They include the (i) discovery of a series of previously unreported monomers from a large library of monomers with high protein resistance and subsequent low fouling characteristics for membrane ultrafiltration of protein-containing fluids, (ii) development of a new approach to investigate protein-resistant mechanisms from structure-property relationships, and (iii) adaptation of a new surface modification method, called atmospheric pressure plasma-induced graft polymerization (APP), together with a high throughput platform (HTP), for low-cost, vacuum-free synthesis of anti-fouling membranes. Several new high-performing chemistries comprising two polyethylene glycol (PEG) monomers, two amine monomers and one zwitterionic monomer were identified from a library (44 commercial monomers) of five different classes of monomers as strong protein-resistant monomers. Combining our analysis here, using the Hansen solubility parameters (HSP) approach, and data from the literature, we conclude that strong interactions with water (hydrogen bonding) and surface flexibility are necessary for producing the highest protein resistance. Superior protein-resistant surfaces and subsequent anti-fouling performance were obtained with the HTP-APP as compared with our earlier HTP-photo graft-induced polymerization (PGP). Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Algorithm for fast event parameters estimation on GEM acquired data

    NASA Astrophysics Data System (ADS)

    Linczuk, Paweł; Krawczyk, Rafał D.; Poźniak, Krzysztof T.; Kasprowicz, Grzegorz; Wojeński, Andrzej; Chernyshova, Maryna; Czarski, Tomasz

    2016-09-01

    We present a study of a software-hardware environment for developing fast, high-throughput, low-latency computation methods that can be used as a back-end in High Energy Physics (HEP) and other High Performance Computing (HPC) systems fed by large volumes of input from electronic sensor-based front-ends. The paper discusses and tests parallelization possibilities on Intel HPC solutions, with consideration of applications in Gas Electron Multiplier (GEM) measurement systems.

  1. Method validation for 243 pesticides and environmental contaminants in meats and poultry by tandem mass spectrometry coupled to low-pressure gas chromatography and ultra high-performance liquid chromatography

    USDA-ARS?s Scientific Manuscript database

    An easy and reliable high-throughput analysis method was developed and validated for 192 diverse pesticides and 51 environmental contaminants (13 PCB congeners, 14 PAHs, 7 PBDE congeners, and 17 novel flame retardants) in cattle, swine, and poultry muscle. Sample preparation was based on the “quick,...

  2. Robust high-performance nanoliter-volume single-cell multiple displacement amplification on planar substrates.

    PubMed

    Leung, Kaston; Klaus, Anders; Lin, Bill K; Laks, Emma; Biele, Justina; Lai, Daniel; Bashashati, Ali; Huang, Yi-Fei; Aniba, Radhouane; Moksa, Michelle; Steif, Adi; Mes-Masson, Anne-Marie; Hirst, Martin; Shah, Sohrab P; Aparicio, Samuel; Hansen, Carl L

    2016-07-26

    The genomes of large numbers of single cells must be sequenced to further understanding of the biological significance of genomic heterogeneity in complex systems. Whole genome amplification (WGA) of single cells is generally the first step in such studies, but is prone to nonuniformity that can compromise genomic measurement accuracy. Despite recent advances, robust performance in high-throughput single-cell WGA remains elusive. Here, we introduce droplet multiple displacement amplification (MDA), a method that uses commercially available liquid dispensing to perform high-throughput single-cell MDA in nanoliter volumes. The performance of droplet MDA is characterized using a large dataset of 129 normal diploid cells, and is shown to exceed previously reported single-cell WGA methods in amplification uniformity, genome coverage, and/or robustness. We achieve up to 80% coverage of a single-cell genome at 5× sequencing depth, and demonstrate excellent single-nucleotide variant (SNV) detection using targeted sequencing of droplet MDA product to achieve a median allelic dropout of 15%, and using whole genome sequencing to achieve false and true positive rates of 9.66 × 10(-6) and 68.8%, respectively, in a G1-phase cell. We further show that droplet MDA allows for the detection of copy number variants (CNVs) as small as 30 kb in single cells of an ovarian cancer cell line and as small as 9 Mb in two high-grade serous ovarian cancer samples using only 0.02× depth. Droplet MDA provides an accessible and scalable method for performing robust and accurate CNV and SNV measurements on large numbers of single cells.

  3. A High-Throughput Automated Microfluidic Platform for Calcium Imaging of Taste Sensing.

    PubMed

    Hsiao, Yi-Hsing; Hsu, Chia-Hsien; Chen, Chihchen

    2016-07-08

    The human enteroendocrine L cell line NCI-H716, expressing taste receptors and taste signaling elements, constitutes a unique model for the studies of cellular responses to glucose, appetite regulation, gastrointestinal motility, and insulin secretion. Targeting these gut taste receptors may provide novel treatments for diabetes and obesity. However, NCI-H716 cells are cultured in suspension and tend to form multicellular aggregates, preventing high-throughput calcium imaging due to interferences caused by laborious immobilization and stimulus delivery procedures. Here, we have developed an automated microfluidic platform that is capable of trapping more than 500 single cells into microwells with a loading efficiency of 77% within two minutes, delivering multiple chemical stimuli and performing calcium imaging with enhanced spatial and temporal resolutions when compared to bath perfusion systems. Results revealed the presence of heterogeneity in cellular responses to the type, concentration, and order of applied sweet and bitter stimuli. Sucralose and denatonium benzoate elicited robust increases in the intracellular Ca(2+) concentration. However, glucose evoked a rapid elevation of intracellular Ca(2+) followed by reduced responses to subsequent glucose stimulation. Using Gymnema sylvestre as a blocking agent for the sweet taste receptor confirmed that different taste receptors were utilized for sweet and bitter tastes. This automated microfluidic platform is cost-effective, easy to fabricate and operate, and may be generally applicable for high-throughput and high-content single-cell analysis and drug screening.

  4. Vortex coronagraphs for the Habitable Exoplanet Imaging Mission concept: theoretical performance and telescope requirements

    NASA Astrophysics Data System (ADS)

    Ruane, Garreth; Mawet, Dimitri; Mennesson, Bertrand; Jewell, Jeffrey; Shaklan, Stuart

    2018-01-01

    The Habitable Exoplanet Imaging Mission concept requires an optical coronagraph that provides deep starlight suppression over a broad spectral bandwidth, high throughput for point sources at small angular separation, and insensitivity to temporally varying, low-order aberrations. Vortex coronagraphs are a promising solution that performs optimally on off-axis, monolithic telescopes and may also be designed for segmented telescopes with minor losses in performance. We describe the key advantages of vortex coronagraphs on off-axis telescopes such as (1) unwanted diffraction due to aberrations is passively rejected in several low-order Zernike modes relaxing the wavefront stability requirements for imaging Earth-like planets from <10 to >100 pm rms, (2) stars with angular diameters >0.1 λ / D may be sufficiently suppressed, (3) the absolute planet throughput is >10 % , even for unfavorable telescope architectures, and (4) broadband solutions (Δλ / λ > 0.1) are readily available for both monolithic and segmented apertures. The latter make use of grayscale apodizers in an upstream pupil plane to provide suppression of diffracted light from amplitude discontinuities in the telescope pupil without inducing additional stroke on the deformable mirrors. We set wavefront stability requirements on the telescope, based on a stellar irradiance threshold set at an angular separation of 3 ± 0.5λ / D from the star, and discuss how some requirements may be relaxed by trading robustness to aberrations for planet throughput.

  5. On the design of turbo codes

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    In this article, we design new turbo codes that can achieve near-Shannon-limit performance. The design criterion for random interleavers is based on maximizing the effective free distance of the turbo code, i.e., the minimum output weight of codewords due to weight-2 input sequences. An upper bound on the effective free distance of a turbo code is derived. This upper bound can be achieved if the feedback connection of convolutional codes uses primitive polynomials. We review multiple turbo codes (parallel concatenation of q convolutional codes), which increase the so-called 'interleaving gain' as q and the interleaver size increase, and a suitable decoder structure derived from an approximation to the maximum a posteriori probability decision rule. We develop new rate 1/3, 2/3, 3/4, and 4/5 constituent codes to be used in the turbo encoder structure. These codes, with 2 to 32 states, are designed using primitive polynomials. The resulting turbo codes have rates b/n (b = 1, 2, 3, 4 and n = 2, 3, 4, 5, 6), and include random interleavers for better asymptotic performance. These codes are suitable for deep-space communications with low throughput and for near-Earth communications where high throughput is desirable. The performance of these codes is within 1 dB of the Shannon limit at a bit-error rate of 10^-6 for throughputs from 1/15 up to 4 bits/s/Hz.
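    For context on the "within 1 dB of the Shannon limit" figure, the minimum Eb/N0 for reliable transmission on an AWGN channel at spectral efficiency eta (bits/s/Hz) follows from eta = log2(1 + eta*Eb/N0), i.e. Eb/N0 >= (2^eta - 1)/eta. The short script below (an illustration added here, not part of the article) evaluates this bound over the quoted throughput range.

      import math

      def shannon_limit_ebno_db(eta: float) -> float:
          """Minimum Eb/N0 (dB) on an AWGN channel at spectral efficiency eta (bits/s/Hz)."""
          ebno = (2.0 ** eta - 1.0) / eta
          return 10.0 * math.log10(ebno)

      for eta in (1 / 15, 1 / 3, 1.0, 2.0, 4.0):
          print(f"eta = {eta:6.3f} bits/s/Hz -> Shannon limit Eb/N0 = {shannon_limit_ebno_db(eta):6.2f} dB")

      # As eta -> 0 the bound approaches ln(2), i.e. about -1.59 dB (the ultimate Shannon limit).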

  6. High-throughput measurements of the optical redox ratio using a commercial microplate reader.

    PubMed

    Cannon, Taylor M; Shah, Amy T; Walsh, Alex J; Skala, Melissa C

    2015-01-01

    There is a need for accurate, high-throughput, functional measures to gauge the efficacy of potential drugs in living cells. As an early marker of drug response in cells, cellular metabolism provides an attractive platform for high-throughput drug testing. Optical techniques can noninvasively monitor NADH and FAD, two autofluorescent metabolic coenzymes. The autofluorescent redox ratio, defined as the autofluorescence intensity of NADH divided by that of FAD, quantifies relative rates of cellular glycolysis and oxidative phosphorylation. However, current microscopy methods for redox ratio quantification are time-intensive and low-throughput, limiting their practicality in drug screening. Alternatively, high-throughput commercial microplate readers quickly measure fluorescence intensities for hundreds of wells. This study found that a commercial microplate reader can differentiate the receptor status of breast cancer cell lines (p < 0.05) based on redox ratio measurements without extrinsic contrast agents. Furthermore, microplate reader redox ratio measurements resolve response (p < 0.05) and lack of response (p > 0.05) in cell lines that are responsive and nonresponsive, respectively, to the breast cancer drug trastuzumab. These studies indicate that the microplate readers can be used to measure the redox ratio in a high-throughput manner and are sensitive enough to detect differences in cellular metabolism that are consistent with microscopy results.
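    Because the optical redox ratio used here is simply the NADH autofluorescence intensity divided by the FAD intensity in each well, the per-well computation can be sketched as below; the background values and plate layout are hypothetical, not the study's actual acquisition settings.

      import numpy as np

      def redox_ratio(nadh_intensity, fad_intensity, nadh_bg=0.0, fad_bg=0.0):
          """Optical redox ratio = background-subtracted NADH / FAD autofluorescence intensity."""
          nadh = np.asarray(nadh_intensity, dtype=float) - nadh_bg
          fad = np.asarray(fad_intensity, dtype=float) - fad_bg
          return nadh / fad

      # Hypothetical plate-reader intensities (arbitrary units), one row per cell line
      nadh = np.array([[1200, 1100, 1350], [900, 950, 1020]])
      fad = np.array([[800, 820, 790], [850, 870, 900]])

      ratios = redox_ratio(nadh, fad, nadh_bg=100, fad_bg=80)
      print(ratios.round(2))      # per-well redox ratios
      print(ratios.mean(axis=1))  # mean ratio per cell line, e.g. for comparing treatment response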

  7. Performance and Thermal Characterization of the NASA-300MS 20 kW Hall Effect Thruster

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hani; Huang, Wensheng; Haag, Thomas; Shastry, Rohit; Soulas, George; Smith, Timothy; Mikellides, Ioannis; Hofer, Richard

    2013-01-01

    NASA's Space Technology Mission Directorate is sponsoring the development of a high fidelity 15 kW-class long-life high performance Hall thruster for candidate NASA technology demonstration missions. An essential element of the development process is demonstration that incorporation of magnetic shielding on a 20 kW-class Hall thruster will yield significant improvements in the throughput capability of the thruster without any significant reduction in thruster performance. As such, NASA Glenn Research Center and the Jet Propulsion Laboratory collaborated on modifying the NASA-300M 20 kW Hall thruster to improve its propellant throughput capability. JPL and NASA Glenn researchers performed plasma numerical simulations with JPL's Hall2De and a commercially available magnetic modeling code that indicated that a significant enhancement in the throughput capability of the NASA-300M can be attained by modifying the thruster's magnetic circuit. This led to modifying the NASA-300M magnetic topology to a magnetically shielded topology. This paper presents performance evaluation results of the two NASA-300M magnetically shielded thruster configurations, designated 300MS and 300MS-2. The 300MS and 300MS-2 were operated at power levels between 2.5 and 20 kW at discharge voltages between 200 and 700 V. Discharge channel deposition from back-sputtered facility wall flux, and plasma potential and electron temperature measurements made on the inner and outer discharge channel surfaces, confirmed that magnetic shielding was achieved. Peak total thrust efficiency of 64% and total specific impulse of 3,050 sec were demonstrated with the 300MS-2 at 20 kW. Thermal characterization results indicate that the boron nitride discharge chamber wall temperatures are approximately 100 °C lower for the 300MS than for the NASA-300M at the same thruster operating discharge power.

  8. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.
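    The end product of the workflow is a gene expression matrix, i.e. a genes-by-samples table of quantified expression. The toy sketch below merges per-sample count files into such a matrix; the two-column TSV format and file names are assumptions for illustration, not the OSG-GEM specification.

      import csv
      from collections import defaultdict

      def build_gem(count_files):
          """Merge per-sample (gene, count) TSV files into one gene x sample matrix."""
          gem = defaultdict(dict)  # gene -> {sample: count}
          samples = []
          for sample, path in count_files.items():
              samples.append(sample)
              with open(path) as fh:
                  for gene, count in csv.reader(fh, delimiter="\t"):
                      gem[gene][sample] = float(count)
          return samples, gem

      def write_gem(samples, gem, out_path):
          """Write the matrix with genes as rows and samples as columns."""
          with open(out_path, "w", newline="") as fh:
              writer = csv.writer(fh, delimiter="\t")
              writer.writerow(["gene"] + samples)
              for gene in sorted(gem):
                  writer.writerow([gene] + [gem[gene].get(s, 0.0) for s in samples])

      # Hypothetical usage once per-sample quantification has finished:
      # samples, gem = build_gem({"sampleA": "sampleA_counts.tsv", "sampleB": "sampleB_counts.tsv"})
      # write_gem(samples, gem, "gem.tsv")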

  9. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  10. High-throughput, pooled sequencing identifies mutations in NUBPL and FOXRED1 in human complex I deficiency

    PubMed Central

    Calvo, Sarah E; Tucker, Elena J; Compton, Alison G; Kirby, Denise M; Crawford, Gabriel; Burtt, Noel P; Rivas, Manuel A; Guiducci, Candace; Bruno, Damien L; Goldberger, Olga A; Redman, Michelle C; Wiltshire, Esko; Wilson, Callum J; Altshuler, David; Gabriel, Stacey B; Daly, Mark J; Thorburn, David R; Mootha, Vamsi K

    2010-01-01

    Discovering the molecular basis of mitochondrial respiratory chain disease is challenging given the large number of both mitochondrial and nuclear genes involved. We report a strategy of focused candidate gene prediction, high-throughput sequencing, and experimental validation to uncover the molecular basis of mitochondrial complex I (CI) disorders. We created five pools of DNA from a cohort of 103 patients and then performed deep sequencing of 103 candidate genes to spotlight 151 rare variants predicted to impact protein function. We used confirmatory experiments to establish genetic diagnoses in 22% of previously unsolved cases, and discovered that defects in NUBPL and FOXRED1 can cause CI deficiency. Our study illustrates how large-scale sequencing, coupled with functional prediction and experimental validation, can reveal novel disease-causing mutations in individual patients. PMID:20818383

  11. A High-Throughput Screening Method for Identification of Inhibitors of the Deubiquitinating Enzyme USP14

    PubMed Central

    Lee, Byung-Hoon; Finley, Daniel; King, Randall W.

    2013-01-01

    Deubiquitinating enzymes (DUBs) reverse the process of ubiquitination, and number nearly 100 in humans. In principle, DUBs represent promising drug targets, as several of the enzymes have been implicated in human diseases. The isopeptidase activity of DUBs can be selectively inhibited by targeting the catalytic site with drug-like compounds. Notably, the mammalian 26S proteasome is associated with three major DUBs: RPN11, UCH37 and USP14. Because the ubiquitin ‘chain-trimming’ activity of USP14 can inhibit proteasome function, inhibitors of USP14 can stimulate proteasomal degradation. We recently established a high-throughput screening (HTS) method to discover small-molecule inhibitors specific for USP14. The protocols in this article cover the necessary procedures for preparing assay reagents, performing HTS for USP14 inhibitors, and carrying out post-HTS analysis. PMID:23788557

  12. New classes of piezoelectrics, ferroelectrics, and antiferroelectrics by first-principles high-throughput materials design

    NASA Astrophysics Data System (ADS)

    Bennett, Joseph

    2013-03-01

    Functional materials, such as piezoelectrics, ferroelectrics, and antiferroelectrics, exhibit large changes with applied fields and stresses. This behavior enables their incorporation into a wide variety of devices in technological fields such as energy conversion/storage and information processing/storage. Discovery of functional materials with improved performance or even new types of responses is thus not only a scientific challenge, but can have major impacts on society. In this talk I will review our efforts to uncover new families of functional materials using a combined crystallographic database/high-throughput first-principles approach. I will describe our work on the design and discovery of thousands of new functional materials, specifically the LiAlSi family as piezoelectrics, the LiGaGe family as ferroelectrics, and the MgSrSi family as antiferroelectrics.

  13. The main challenges that remain in applying high-throughput sequencing to clinical diagnostics.

    PubMed

    Loeffelholz, Michael; Fofanov, Yuriy

    2015-01-01

    Over the last 10 years, the quality, price and availability of high-throughput sequencing instruments have improved to the point that this technology may be close to becoming a routine tool in the diagnostic microbiology laboratory. Two groups of challenges, however, have to be resolved in order to move this powerful research technology into routine use in the clinical microbiology laboratory. The computational/bioinformatics challenges include data storage cost and privacy concerns, requiring analysis to be performed without access to cloud storage or expensive computational infrastructure. The logistical challenges include interpretation of complex results and acceptance and understanding of the advantages and limitations of this technology by the medical community. This article focuses on the approaches to address these challenges, such as file formats, algorithms, data collection, reporting and good laboratory practices.

  14. A high-throughput screen for single gene activities: isolation of apoptosis inducers.

    PubMed

    Albayrak, Timur; Grimm, Stefan

    2003-05-16

    We describe a novel genetic screen that is performed by transfecting every individual clone of an expression library into a separate population of cells in a high-throughput mode. The screen allows one to achieve a hitherto unattained sensitivity in expression cloning which was exploited in a first read-out to clone apoptosis-inducing genes. This led to the isolation of several genes whose proteins induce distinct phenotypes of apoptosis in 293T cells. One of the isolated genes is the tumor suppressor cytochrome b(L) (cybL), a component of the respiratory chain complex II, that diminishes the activity of this complex for apoptosis induction. This gene is more efficient and specific for causing cell death than a drug with the same activity. These results suggest further applications, both of the isolated genes and the screen.

  15. Biofuel metabolic engineering with biosensors.

    PubMed

    Morgan, Stacy-Anne; Nadler, Dana C; Yokoo, Rayka; Savage, David F

    2016-12-01

    Metabolic engineering offers the potential to renewably produce important classes of chemicals, particularly biofuels, at an industrial scale. DNA synthesis and editing techniques can generate large pathway libraries, yet identifying the best variants is slow and cumbersome. Traditionally, analytical methods like chromatography and mass spectrometry have been used to evaluate pathway variants, but such techniques cannot be performed with high throughput. Biosensors - genetically encoded components that actuate a cellular output in response to a change in metabolite concentration - are therefore a promising tool for rapid and high-throughput evaluation of candidate pathway variants. Applying biosensors can also dynamically tune pathways in response to metabolic changes, improving balance and productivity. Here, we describe the major classes of biosensors and briefly highlight recent progress in applying them to biofuel-related metabolic pathway engineering. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Quantitative secondary electron imaging for work function extraction at atomic level and layer identification of graphene

    PubMed Central

    Zhou, Yangbo; Fox, Daniel S; Maguire, Pierce; O’Connell, Robert; Masters, Robert; Rodenburg, Cornelia; Wu, Hanchun; Dapor, Maurizio; Chen, Ying; Zhang, Hongzhou

    2016-01-01

    Two-dimensional (2D) materials usually have a layer-dependent work function, which requires fast and accurate detection for the evaluation of their device performance. A detection technique with high throughput and high spatial resolution has not yet been explored. Using a scanning electron microscope, we have developed and implemented a quantitative analytical technique which allows effective extraction of the work function of graphene. This technique uses the secondary electron contrast and has nanometre-resolved layer information. The measurement of few-layer graphene flakes shows the variation of work function between graphene layers with a precision of less than 10 meV. It is expected that this technique will prove extremely useful for researchers in a broad range of fields due to its revolutionary throughput and accuracy. PMID:26878907

  17. A high-throughput approach to profile RNA structure.

    PubMed

    Delli Ponti, Riccardo; Marti, Stefanie; Armaos, Alexandros; Tartaglia, Gian Gaetano

    2017-03-17

    Here we introduce the Computational Recognition of Secondary Structure (CROSS) method to calculate the structural profile of an RNA sequence (single- or double-stranded state) at single-nucleotide resolution and without sequence length restrictions. We trained CROSS using data from high-throughput experiments such as Selective 2′-Hydroxyl Acylation analyzed by Primer Extension (SHAPE; Mouse and HIV transcriptomes) and Parallel Analysis of RNA Structure (PARS; Human and Yeast transcriptomes) as well as high-quality NMR/X-ray structures (PDB database). The algorithm uses primary structure information alone to predict experimental structural profiles with >80% accuracy, showing high performance on large RNAs such as Xist (17 900 nucleotides; Area Under the ROC Curve (AUC) of 0.75 on dimethyl sulfate (DMS) experiments). We integrated CROSS in thermodynamics-based methods to predict secondary structure and observed an increase in their predictive power by up to 30%. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
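    The accuracy and AUC figures quoted above come from comparing per-nucleotide structural scores against experimentally derived single-/double-stranded labels. A minimal sketch of that style of evaluation, using scikit-learn on made-up score and label arrays rather than CROSS output, is:

      import numpy as np
      from sklearn.metrics import accuracy_score, roc_auc_score

      # Hypothetical per-nucleotide data: a predicted structural score and an experimental label
      # (1 = double-stranded/structured, 0 = single-stranded), e.g. derived from SHAPE or PARS.
      scores = np.array([0.9, 0.8, 0.2, 0.4, 0.7, 0.1, 0.6, 0.3])
      labels = np.array([1, 1, 0, 0, 1, 0, 1, 0])

      auc = roc_auc_score(labels, scores)
      acc = accuracy_score(labels, (scores >= 0.5).astype(int))  # threshold the profile at 0.5
      print(f"AUC = {auc:.2f}, accuracy = {acc:.2%}")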

  18. Plastic straw: future of high-speed signaling

    NASA Astrophysics Data System (ADS)

    Song, Ha Il; Jin, Huxian; Bae, Hyeon-Min

    2015-11-01

    The ever-increasing demand for bandwidth triggered by mobile and video Internet traffic requires advanced interconnect solutions satisfying functional and economic constraints. A new interconnect called E-TUBE is proposed as a cost-and-power-effective all-electrical-domain wideband waveguide solution for high-speed high-volume short-reach communication links. The E-TUBE achieves an unprecedented level of performance in terms of bandwidth-per-carrier frequency, power, and density without requiring a precision manufacturing process, unlike conventional optical/waveguide solutions. The E-TUBE exhibits a frequency-independent loss-profile of 4 dB/m and has nearly 20-GHz bandwidth over the V band. A single-sideband signal transmission enabled by the inherent frequency response of the E-TUBE delivers twice the data throughput without any physical overhead compared to conventional radio frequency communication technologies. This new interconnect scheme would be attractive to parties interested in high throughput links, including, but not limited to, 100/400 Gbps chip-to-chip communications.

  19. Automation of Technology for Cancer Research.

    PubMed

    van der Ent, Wietske; Veneman, Wouter J; Groenewoud, Arwin; Chen, Lanpeng; Tulotta, Claudia; Hogendoorn, Pancras C W; Spaink, Herman P; Snaar-Jagalska, B Ewa

    2016-01-01

    Zebrafish embryos can be obtained for research purposes in large numbers at low cost and embryos develop externally in limited space, making them highly suitable for high-throughput cancer studies and drug screens. Non-invasive live imaging of various processes within the larvae is possible due to their transparency during development, and a multitude of available fluorescent transgenic reporter lines. To perform high-throughput studies, handling large amounts of embryos and larvae is required. With such a high number of individuals, even minute tasks may become time-consuming and arduous. In this chapter, an overview is given of the developments in the automation of various steps of large-scale zebrafish cancer research for discovering important cancer pathways and drugs for the treatment of human disease. The focus lies on various tools developed for cancer cell implantation, embryo handling and sorting, microfluidic systems for imaging and drug treatment, and image acquisition and analysis. Examples will be given of employment of these technologies within the fields of toxicology research and cancer research.

  20. Advanced phenotyping and phenotype data analysis for the study of plant growth and development

    PubMed Central

    Rahaman, Md. Matiur; Chen, Dijun; Gillani, Zeeshan; Klukas, Christian; Chen, Ming

    2015-01-01

    Due to an increase in the consumption of food, feed, fuel and to meet global food security needs for the rapidly growing human population, there is a necessity to breed high yielding crops that can adapt to the future climate changes, particularly in developing countries. To solve these global challenges, novel approaches are required to identify quantitative phenotypes and to explain the genetic basis of agriculturally important traits. These advances will facilitate the screening of germplasm with high performance characteristics in resource-limited environments. Recently, plant phenomics has offered and integrated a suite of new technologies, and we are on a path to improve the description of complex plant phenotypes. High-throughput phenotyping platforms have also been developed that capture phenotype data from plants in a non-destructive manner. In this review, we discuss recent developments of high-throughput plant phenotyping infrastructure including imaging techniques and corresponding principles for phenotype data analysis. PMID:26322060

  1. Enabling inspection solutions for future mask technologies through the development of massively parallel E-Beam inspection

    NASA Astrophysics Data System (ADS)

    Malloy, Matt; Thiel, Brad; Bunday, Benjamin D.; Wurm, Stefan; Jindal, Vibhu; Mukhtar, Maseeh; Quoi, Kathy; Kemen, Thomas; Zeidler, Dirk; Eberle, Anna Lena; Garbowski, Tomasz; Dellemann, Gregor; Peters, Jan Hendrik

    2015-09-01

    The new device architectures and materials being introduced for sub-10nm manufacturing, combined with the complexity of multiple patterning and the need for improved hotspot detection strategies, have pushed current wafer inspection technologies to their limits. In parallel, gaps in mask inspection capability are growing as new generations of mask technologies are developed to support these sub-10nm wafer manufacturing requirements. In particular, the challenges associated with nanoimprint and extreme ultraviolet (EUV) mask inspection require new strategies that enable fast inspection at high sensitivity. The tradeoffs between sensitivity and throughput for optical and e-beam inspection are well understood. Optical inspection offers the highest throughput and is the current workhorse of the industry for both wafer and mask inspection. E-beam inspection offers the highest sensitivity but has historically lacked the throughput required for widespread adoption in the manufacturing environment. It is unlikely that continued incremental improvements to either technology will meet tomorrow's requirements, and therefore a new inspection technology approach is required; one that combines the high-throughput performance of optical inspection with the high-sensitivity capabilities of e-beam inspection. To support the industry in meeting these challenges, SUNY Poly SEMATECH has evaluated disruptive technologies that can meet the requirements for high volume manufacturing (HVM), for both the wafer fab [1] and the mask shop. High-speed massively parallel e-beam defect inspection has been identified as the leading candidate for addressing the key gaps limiting today's patterned defect inspection techniques. As of late 2014 SUNY Poly SEMATECH completed a review, system analysis, and proof of concept evaluation of multiple e-beam technologies for defect inspection. A champion approach has been identified based on a multibeam technology from Carl Zeiss. This paper includes a discussion on the need for high-speed e-beam inspection and then provides initial imaging results from EUV masks and wafers from 61 and 91 beam demonstration systems. Progress towards high resolution and consistent intentional defect arrays (IDA) is also shown.

  2. Pre-amplification in the context of high-throughput qPCR gene expression experiment.

    PubMed

    Korenková, Vlasta; Scott, Justin; Novosadová, Vendula; Jindřichová, Marie; Langerová, Lucie; Švec, David; Šídová, Monika; Sjöback, Robert

    2015-03-11

    With the introduction of the first high-throughput qPCR instrument on the market, it became possible to perform thousands of reactions in a single run, compared to the previous hundreds. In a high-throughput reaction, only limited volumes of highly concentrated cDNA or DNA samples can be added. This constraint can be addressed by pre-amplification, which has become a part of the high-throughput experimental workflow. Here, we focus our attention on the limits of the specific target pre-amplification reaction and propose the optimal, general setup for gene expression experiments using the BioMark instrument (Fluidigm). For evaluating different pre-amplification factors, the following conditions were combined: four human blood samples from healthy donors and five transcripts having high to low expression levels; each cDNA sample was pre-amplified with four cycle numbers (15, 18, 21, and 24) and at five concentrations (equivalent to 0.078 ng, 0.32 ng, 1.25 ng, 5 ng, and 20 ng of total RNA). Factors identified as critical for the success of cDNA pre-amplification were the number of pre-amplification cycles, the total RNA concentration, and the type of gene. The selected pre-amplification reactions were further tested for optimal Cq distribution in a BioMark Array. The following concentrations combined with pre-amplification cycles were optimal for good quality samples: 20 ng of total RNA with 15 cycles of pre-amplification, 20x and 40x diluted; and 5 ng and 20 ng of total RNA with 18 cycles of pre-amplification, both 20x and 40x diluted. We set upper limits for the bulk gene expression experiment using the gene expression Dynamic Array and provided an easy-to-obtain tool for measuring pre-amplification success. We also showed that the variability introduced by the pre-amplification step in the reverse transcription-qPCR workflow is lower than the variability caused by the reverse transcription step.
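    A common way to express pre-amplification success (one standard metric, not necessarily the exact one used by the authors) is to compare the observed Cq shift with the ideal shift: each pre-amplification cycle should gain roughly one Cq, while each two-fold dilution of the pre-amplified product costs one Cq. The sketch below computes that deviation for hypothetical numbers.

      import math

      def expected_cq_shift(preamp_cycles: int, dilution_factor: float) -> float:
          """Ideal Cq gain: ~1 Cq per pre-amplification cycle minus log2 of the post-reaction dilution."""
          return preamp_cycles - math.log2(dilution_factor)

      def delta_delta_cq(cq_no_preamp: float, cq_preamp: float,
                         preamp_cycles: int, dilution_factor: float) -> float:
          """Deviation of the observed Cq shift from the ideal one; near 0 means unbiased pre-amplification."""
          observed_shift = cq_no_preamp - cq_preamp
          return observed_shift - expected_cq_shift(preamp_cycles, dilution_factor)

      # Hypothetical assay: 18 pre-amplification cycles, product diluted 20x before qPCR
      print(round(expected_cq_shift(18, 20), 2))           # ~13.68 cycles of expected gain
      print(round(delta_delta_cq(28.0, 14.5, 18, 20), 2))  # ~-0.18 -> close to unbiased for this gene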

  3. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    PubMed Central

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  4. An image analysis toolbox for high-throughput C. elegans assays

    PubMed Central

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.

    2012-01-01

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656

  5. High-throughput, image-based screening of pooled genetic variant libraries

    PubMed Central

    Emanuel, George; Moffitt, Jeffrey R.; Zhuang, Xiaowei

    2018-01-01

    Image-based, high-throughput screening of genetic perturbations will advance both biology and biotechnology. We report a high-throughput screening method that allows diverse genotypes and corresponding phenotypes to be imaged in numerous individual cells. We achieve genotyping by introducing barcoded genetic variants into cells and using massively multiplexed FISH to measure the barcodes. We demonstrated this method by screening mutants of the fluorescent protein YFAST, yielding brighter and more photostable YFAST variants. PMID:29083401

  6. Experimental Design for Combinatorial and High Throughput Materials Development

    NASA Astrophysics Data System (ADS)

    Cawse, James N.

    2002-12-01

    In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include:
    - High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space
    - Combinatorial Mapping of Polymer Blends Phase Behavior
    - Split-Plot Designs
    - Artificial Neural Networks in Catalyst Development
    - The Monte Carlo Approach to Library Design and Redesign
    This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.

  7. Real-time traffic sign detection and recognition

    NASA Astrophysics Data System (ADS)

    Herbschleb, Ernst; de With, Peter H. N.

    2009-01-01

    The continuous growth of imaging databases increasingly requires analysis tools for extraction of features. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images with a resolution up to 4,800×2,400 pixels. Because of the size of the database, high reliability as well as high throughput is required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining both color and specific spatial information. The first stage contains an area-limitation step which is performance-critical for both the detection rate and the overall processing time. The second stage locates suggestions for traffic signs using recently published feature processing. The third stage contains a validation step to enhance reliability of the algorithm. During this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput for line-of-sight images of 800×600 pixels is 35 Hz and for panorama images it is 4 Hz. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.
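    As a rough illustration only of a color-based area-limitation step (OpenCV-based; the hue and saturation thresholds are placeholders rather than the authors' values), a candidate mask for red-rimmed signs could be built like this:

      import cv2
      import numpy as np

      def red_sign_candidate_mask(bgr_image: np.ndarray) -> np.ndarray:
          """First-stage area limitation: keep pixels whose hue/saturation suggest a red sign rim."""
          hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
          # Red wraps around hue 0 in OpenCV's 0-179 hue scale, so combine two bands.
          lower = cv2.inRange(hsv, (0, 90, 60), (10, 255, 255))
          upper = cv2.inRange(hsv, (170, 90, 60), (179, 255, 255))
          mask = cv2.bitwise_or(lower, upper)
          # Morphological opening suppresses isolated noise pixels before the detection stage.
          kernel = np.ones((5, 5), np.uint8)
          return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

      # Example on a synthetic 800x600 frame (a real pipeline would stream line-of-sight images)
      frame = np.zeros((600, 800, 3), dtype=np.uint8)
      frame[100:200, 100:200] = (0, 0, 255)      # a pure-red square in BGR
      mask = red_sign_candidate_mask(frame)
      print(mask[150, 150], mask[10, 10])        # 255 inside the candidate region, 0 outside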

  8. A high-throughput microfluidic approach for 1000-fold leukocyte reduction of platelet-rich plasma

    NASA Astrophysics Data System (ADS)

    Xia, Hui; Strachan, Briony C.; Gifford, Sean C.; Shevkoplyas, Sergey S.

    2016-10-01

    Leukocyte reduction of donated blood products substantially reduces the risk of a number of transfusion-related complications. Current ‘leukoreduction’ filters operate by trapping leukocytes within specialized filtration material, while allowing desired blood components to pass through. However, the continuous release of inflammatory cytokines from the retained leukocytes, as well as the potential for platelet activation and clogging, are significant drawbacks of conventional ‘dead end’ filtration. To address these limitations, here we demonstrate our newly-developed ‘controlled incremental filtration’ (CIF) approach to perform high-throughput microfluidic removal of leukocytes from platelet-rich plasma (PRP) in a continuous flow regime. Leukocytes are separated from platelets within the PRP by progressively syphoning clarified PRP away from the concentrated leukocyte flowstream. Filtrate PRP collected from an optimally-designed CIF device typically showed a ~1000-fold (i.e. 99.9%) reduction in leukocyte concentration, while recovering >80% of the original platelets, at volumetric throughputs of ~1 mL/min. These results suggest that the CIF approach will enable users in many fields to now apply the advantages of microfluidic devices to particle separation, even for applications requiring macroscale flowrates.

  9. WIYN bench upgrade: a revitalized spectrograph

    NASA Astrophysics Data System (ADS)

    Bershady, M.; Barden, S.; Blanche, P.-A.; Blanco, D.; Corson, C.; Crawford, S.; Glaspey, J.; Habraken, S.; Jacoby, G.; Keyes, J.; Knezek, P.; Lemaire, P.; Liang, M.; McDougall, E.; Poczulp, G.; Sawyer, D.; Westfall, K.; Willmarth, D.

    2008-07-01

    We describe the redesign and upgrade of the versatile fiber-fed Bench Spectrograph on the WIYN 3.5m telescope. The spectrograph is fed by either the Hydra multi-object positioner or integral-field units (IFUs) at two other ports, and can be configured with an adjustable camera-collimator angle to use low-order and echelle gratings. The upgrade, including a new collimator, charge-coupled device (CCD) and modern controller, and volume-phase holographic gratings (VPHG), has a high performance-to-cost ratio by combining new technology with a system reconfiguration that optimizes throughput while utilizing as much of the existing instrument as possible. A faster, all-refractive collimator enhances throughput by 60%, nearly eliminates the slit-function due to vignetting, and improves image quality to maintain instrumental resolution. Two VPH gratings deliver twice the diffraction efficiency of existing surface-relief gratings: a 740 l/mm grating (float-glass and post-polished) used in 1st and 2nd order, and a large 3300 l/mm grating (spectral resolution comparable to the R2 echelle). The combination of collimator, high-quantum-efficiency (QE) CCD, and VPH gratings yields throughput gain factors of up to 3.5.

  10. High-Throughput Analysis and Automation for Glycomics Studies.

    PubMed

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also an expanding interest in the use of glycomics (for example, in Genome Wide Association Studies) to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  11. AI-augmented time stretch microscopy

    NASA Astrophysics Data System (ADS)

    Mahjoubfar, Ata; Chen, Claire L.; Lin, Jiahao; Jalali, Bahram

    2017-02-01

    Cell reagents used in biomedical analysis often change the behavior of the cells that they are attached to, inhibiting their native signaling. On the other hand, label-free cell analysis techniques have long been viewed as challenging, either due to insufficient accuracy from the limited features available or because of the low throughput sacrificed for improved precision. We present a recently developed artificial-intelligence-augmented microscope, which builds upon high-throughput time stretch quantitative phase imaging (TS-QPI) and deep learning to perform label-free cell classification with record high accuracy. Our system captures quantitative optical phase and intensity images simultaneously by frequency multiplexing, extracts multiple biophysical features of the individual cells from these fused images, and feeds these features into a supervised machine learning model for classification. The enhanced performance of our system compared to other label-free assays is demonstrated by classification of white blood T-cells versus colon cancer cells and of lipid-accumulating algal strains for biofuel production, amounting to as much as a five-fold reduction in inaccuracy. This system obtains the accuracy required in practical applications such as personalized drug development, while the cells remain intact and the throughput is not sacrificed. Here, we introduce a data acquisition scheme based on quadrature phase demodulation that enables uninterrupted storage of TS-QPI cell images. Our proof-of-principle demonstration is capable of saving 40 TB of cell images in about four hours, i.e. pictures of every single cell in 10 mL of a sample.
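    The classification step described here (per-cell biophysical features fed into a supervised model) can be mocked up with any standard classifier. The snippet below is a generic scikit-learn sketch on synthetic features, not the authors' deep-learning pipeline, and the feature names in the comments are placeholders.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)

      # Synthetic stand-ins for per-cell biophysical features extracted from fused phase and
      # intensity images (e.g. optical loss, phase shift, diameter, a refractive-index proxy).
      n_cells = 1000
      class0 = rng.normal(loc=0.0, scale=1.0, size=(n_cells, 4))
      class1 = rng.normal(loc=0.8, scale=1.0, size=(n_cells, 4))
      X = np.vstack([class0, class1])
      y = np.array([0] * n_cells + [1] * n_cells)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
      print(f"held-out accuracy: {clf.score(X_test, y_test):.2%}")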

  12. NASA's Evolutionary Xenon Thruster (NEXT) Project Qualification Propellant Throughput Milestone: Performance, Erosion, and Thruster Service Life Prediction After 450 kg

    NASA Technical Reports Server (NTRS)

    Herman, Daniel A.

    2010-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) program is tasked with significantly improving and extending the capabilities of the current state-of-the-art NSTAR thruster. The service life capability of the NEXT ion thruster is being assessed by thruster wear testing and life-modeling of critical thruster components, such as the ion optics and cathodes. The NEXT Long-Duration Test (LDT) was initiated to validate and qualify the NEXT thruster propellant throughput capability. The NEXT thruster completed the primary goal of the LDT; namely, to demonstrate the project qualification throughput of 450 kg by the end of calendar year 2009. The NEXT LDT has demonstrated 28,500 hr of operation and processed 466 kg of xenon throughput--more than double the throughput demonstrated by the NSTAR flight-spare. Thruster performance changes have been consistent with a priori predictions. Thruster erosion has been minimal and consistent with the thruster service life assessment, which predicts the first failure mode at greater than 750 kg throughput. The life-limiting failure mode for NEXT is predicted to be loss of structural integrity of the accelerator grid due to erosion by charge-exchange ions.
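    As a quick back-of-the-envelope check added here (not part of the report), the demonstrated throughput and run time imply an average xenon consumption rate of roughly 16 g per hour, and a wide margin to the predicted first-failure throughput:

      xenon_throughput_kg = 466.0   # xenon processed during the Long-Duration Test
      run_hours = 28_500.0          # demonstrated operating time

      avg_rate_g_per_h = xenon_throughput_kg * 1000.0 / run_hours
      avg_rate_mg_per_s = avg_rate_g_per_h * 1000.0 / 3600.0
      margin_kg = 750.0 - xenon_throughput_kg   # vs. predicted first-failure throughput

      print(f"average consumption: {avg_rate_g_per_h:.1f} g/h ({avg_rate_mg_per_s:.2f} mg/s)")
      print(f"margin to predicted first failure: {margin_kg:.0f} kg")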

  13. Deciphering the genomic targets of alkylating polyamide conjugates using high-throughput sequencing

    PubMed Central

    Chandran, Anandhakumar; Syed, Junetha; Taylor, Rhys D.; Kashiwazaki, Gengo; Sato, Shinsuke; Hashiya, Kaori; Bando, Toshikazu; Sugiyama, Hiroshi

    2016-01-01

    Chemically engineered small molecules targeting specific genomic sequences play an important role in drug development research. Pyrrole-imidazole polyamides (PIPs) are a group of molecules that can bind to the DNA minor-groove and can be engineered to target specific sequences. Their biological effects rely primarily on their selective DNA binding. However, the binding mechanism of PIPs at the chromatinized genome level is poorly understood. Herein, we report a method using high-throughput sequencing to identify the DNA-alkylating sites of PIP-indole-seco-CBI conjugates. High-throughput sequencing analysis of conjugate 2 showed highly similar DNA-alkylating sites on synthetic oligos (histone-free DNA) and on human genomes (chromatinized DNA context). To our knowledge, this is the first report identifying alkylation sites across genomic DNA by alkylating PIP conjugates using high-throughput sequencing. PMID:27098039

  14. Interference-Robust Transmission in Wireless Sensor Networks

    PubMed Central

    Han, Jin-Seok; Lee, Yong-Hwan

    2016-01-01

    Low-power wireless sensor networks (WSNs) operating in unlicensed spectrum bands may seriously suffer from interference from other coexisting radio systems, such as IEEE 802.11 wireless local area networks. In this paper, we consider the improvement of the transmission performance of low-power WSNs by adjusting the transmission rate and the payload size in response to the change of co-channel interference. We estimate the probability of transmission failure and the data throughput and then determine the payload size to maximize the throughput performance. We investigate that the transmission time maximizing the normalized throughput is not much affected by the transmission rate, but rather by the interference condition. We adjust the transmission rate and the transmission time in response to the change of the channel and interference condition, respectively. Finally, we verify the performance of the proposed scheme by computer simulation. The simulation results show that the proposed scheme significantly improves data throughput compared with conventional schemes while preserving energy efficiency even in the presence of interference. PMID:27854249
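    A minimal model of the payload-size optimization described above, assuming independent bit errors and a fixed per-packet header: the useful-payload fraction delivered per transmitted byte rises with payload size until packet losses dominate. The 13-byte header and 114-byte payload cap below are illustrative values, not parameters from the paper.

      def normalized_throughput(payload_bytes: int, header_bytes: int, ber: float) -> float:
          """Useful payload delivered per transmitted byte under independent bit errors."""
          total_bits = 8 * (payload_bytes + header_bytes)
          packet_success = (1.0 - ber) ** total_bits
          return payload_bytes * packet_success / (payload_bytes + header_bytes)

      def best_payload(header_bytes: int, ber: float, max_payload: int = 114) -> int:
          """Search the payload size maximizing normalized throughput."""
          return max(range(1, max_payload + 1),
                     key=lambda size: normalized_throughput(size, header_bytes, ber))

      for ber in (1e-5, 1e-4, 1e-3):   # heavier interference -> higher effective bit error rate
          size = best_payload(header_bytes=13, ber=ber)
          print(f"BER={ber:.0e}: best payload ~{size} bytes, "
                f"normalized throughput={normalized_throughput(size, 13, ber):.2f}")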

  15. Interference-Robust Transmission in Wireless Sensor Networks.

    PubMed

    Han, Jin-Seok; Lee, Yong-Hwan

    2016-11-14

    Low-power wireless sensor networks (WSNs) operating in unlicensed spectrum bands may seriously suffer from interference from other coexisting radio systems, such as IEEE 802.11 wireless local area networks. In this paper, we consider the improvement of the transmission performance of low-power WSNs by adjusting the transmission rate and the payload size in response to the change of co-channel interference. We estimate the probability of transmission failure and the data throughput and then determine the payload size to maximize the throughput performance. We investigate that the transmission time maximizing the normalized throughput is not much affected by the transmission rate, but rather by the interference condition. We adjust the transmission rate and the transmission time in response to the change of the channel and interference condition, respectively. Finally, we verify the performance of the proposed scheme by computer simulation. The simulation results show that the proposed scheme significantly improves data throughput compared with conventional schemes while preserving energy efficiency even in the presence of interference.

  16. Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.

    PubMed

    Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S

    1994-01-01

    The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.
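    Competition binding data of the kind produced by these microplate assays are typically summarized by fitting a sigmoidal displacement curve to estimate an IC50. The sketch below fits a four-parameter logistic to synthetic data; it is a generic illustration, not the authors' analysis procedure.

      import numpy as np
      from scipy.optimize import curve_fit

      def competition_curve(log_conc, top, bottom, log_ic50, hill):
          """Four-parameter logistic: % of tracer bound vs. log10 competitor concentration."""
          return bottom + (top - bottom) / (1.0 + 10.0 ** ((log_conc - log_ic50) * hill))

      # Synthetic displacement data: log10 [competitor, M] vs. % of [3H]-tracer bound
      log_conc = np.array([-10, -9.5, -9, -8.5, -8, -7.5, -7, -6.5, -6])
      bound = np.array([98, 96, 90, 75, 52, 30, 15, 8, 5], dtype=float)

      popt, _ = curve_fit(competition_curve, log_conc, bound, p0=[100.0, 0.0, -8.0, 1.0])
      top, bottom, log_ic50, hill = popt
      print(f"IC50 ~ {10 ** log_ic50:.2e} M, Hill slope ~ {hill:.2f}")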

  17. Matrix-vector multiplication using digital partitioning for more accurate optical computing

    NASA Technical Reports Server (NTRS)

    Gary, C. K.

    1992-01-01

    Digital partitioning offers a flexible means of increasing the accuracy of an optical matrix-vector processor. This algorithm can be implemented with the same architecture required for a purely analog processor, which gives optical matrix-vector processors the ability to perform high-accuracy calculations at speeds comparable with or greater than those of electronic computers, as well as the ability to perform analog operations at a much greater speed. Digital partitioning is compared with digital multiplication by analog convolution, residue number systems, and redundant number representation in terms of the size and the speed required for an equivalent throughput as well as in terms of the hardware requirements. Digital partitioning and digital multiplication by analog convolution are found to be the most efficient algorithms if coding time and hardware are considered, and the architecture for digital partitioning permits the use of analog computations to provide the greatest throughput for a single processor.
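    The abstract does not spell out the partitioning arithmetic, so the following is a hypothetical numerical illustration of the idea: split each matrix entry and vector element into low-precision base-B digits, form the digit-plane matrix-vector products (the part an analog optical processor would perform), and recombine them with the appropriate powers of B to recover the exact result.

      import numpy as np

      def to_digits(x, base, n_digits):
          """Decompose non-negative integers into n_digits base-`base` digits, least significant first."""
          digits, rem = [], x.copy()
          for _ in range(n_digits):
              digits.append(rem % base)
              rem //= base
          return digits

      def partitioned_matvec(A, v, base=4, n_digits=4):
          """Matrix-vector product recombined from low-precision digit-plane products."""
          A_digits = to_digits(A, base, n_digits)
          v_digits = to_digits(v, base, n_digits)
          result = np.zeros(A.shape[0], dtype=np.int64)
          for i, Ai in enumerate(A_digits):        # each partial product needs only low precision,
              for j, vj in enumerate(v_digits):    # which is what the analog optics would provide
                  result += (base ** (i + j)) * (Ai @ vj)
          return result

      rng = np.random.default_rng(1)
      A = rng.integers(0, 256, size=(3, 5))        # 8-bit entries need four base-4 digits
      v = rng.integers(0, 256, size=5)
      assert np.array_equal(partitioned_matvec(A, v), A @ v)  # exact despite low-precision partials
      print(partitioned_matvec(A, v))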

  18. High-throughput determination of urinary hexosamines for diagnosis of mucopolysaccharidoses by capillary electrophoresis and high-performance liquid chromatography.

    PubMed

    Coppa, Giovanni V; Galeotti, Fabio; Zampini, Lucia; Maccari, Francesca; Galeazzi, Tiziana; Padelia, Lucia; Santoro, Lucia; Gabrielli, Orazio; Volpi, Nicola

    2011-04-01

    Mucopolysaccharidoses (MPS) diagnosis is often delayed and irreversible organ damage can occur, making possible therapies less effective. This highlights the importance of early and accurate diagnosis. A high-throughput procedure for the simultaneous determination of glucosamine and galactosamine produced from urinary galactosaminoglycans and glucosaminoglycans by capillary electrophoresis (CE) and HPLC has been performed and validated in subjects affected by various MPS, including their mild and severe forms: Hurler and Hurler-Scheie, Hunter, Sanfilippo, Morquio, and Maroteaux-Lamy. Contrary to other analytical approaches, the present single analytical procedure, which is able to measure total abnormal amounts of urinary GAGs, high molecular mass, and related fragments, as well as specific hexosamines belonging to a group of GAGs, would be useful for possible application in their early diagnosis. After a rapid urine pretreatment, free hexosamines are generated by acidic hydrolysis, derivatized with 2-aminobenzoic acid, and separated by CE/UV in ∼10 min and by reverse-phase (RP)-HPLC with fluorescence detection in ∼21 min. The total content of hexosamines was found to be indicative of abnormal urinary excretion of GAGs in patients compared to the controls, and the galactosamine/glucosamine ratio was observed to be related to specific MPS syndromes in regard to both their mild and severe forms. As a consequence, important correlations between analytical response and clinical diagnosis and the severity of the disorders were observed. Furthermore, we can assume that the severity of the syndrome may be ascribed to the quantity of total GAGs, as high-molecular-mass polymers and fragments, accumulated in cells and directly excreted in the urine. Finally, due to the high-throughput nature of this approach and to the equipment commonly available in laboratories, this method is suitable for newborn screening in preventive public health programs for early detection of MPS disorders, diagnosis, and their treatment. Copyright © 2010 Elsevier Inc. All rights reserved.

  19. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  20. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  1. Large-scale high-throughput computer-aided discovery of advanced materials using cloud computing

    NASA Astrophysics Data System (ADS)

    Bazhirov, Timur; Mohammadi, Mohammad; Ding, Kevin; Barabash, Sergey

    Recent advances in cloud computing have made it possible to access large-scale computational resources completely on demand in a rapid and efficient manner. When combined with high-fidelity simulations, they serve as an alternative pathway to enable computational discovery and design of new materials through large-scale high-throughput screening. Here, we present a case study for a cloud platform implemented at Exabyte Inc. We perform calculations to screen lightweight ternary alloys for thermodynamic stability. Due to the lack of experimental data for most such systems, we rely on theoretical approaches based on first-principles pseudopotential density functional theory. We calculate the formation energies for a set of ternary compounds approximated by special quasirandom structures. During an example run we were able to scale to 10,656 CPUs within 7 minutes from the start and obtain results for 296 compounds within 38 hours. The results indicate that the ultimate formation enthalpy of ternary systems can be negative for some lightweight alloys, including Li and Mg compounds. We conclude that, compared to the traditional capital-intensive approach that requires investment in on-premises hardware resources, cloud computing is agile and cost-effective, yet scalable and delivers similar performance.
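
    As a minimal illustration of the screening criterion, the snippet below evaluates the formation energy of a candidate compound from total energies of the compound cell and its elemental reference phases; a negative value flags a thermodynamically favored alloy. All numerical values here are hypothetical placeholders, not results from the study.

      def formation_energy_per_atom(e_compound, n_atoms, elemental_refs, composition):
          """Formation energy per atom relative to elemental reference phases.

          e_compound     : total energy of the compound cell (eV)
          n_atoms        : number of atoms in that cell
          elemental_refs : {element: reference energy per atom (eV/atom)}
          composition    : {element: atom count of that element in the cell}
          """
          e_ref = sum(n * elemental_refs[el] for el, n in composition.items())
          return (e_compound - e_ref) / n_atoms

      # Hypothetical numbers for a 64-atom Li-Mg-Al special quasirandom structure:
      e_form = formation_energy_per_atom(
          e_compound=-160.0, n_atoms=64,
          elemental_refs={"Li": -1.90, "Mg": -1.51, "Al": -3.74},
          composition={"Li": 22, "Mg": 21, "Al": 21})
      print(f"Formation energy: {e_form:.3f} eV/atom")  # negative => stable against decomposition into the elements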

  2. High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  3. Evaluation of FPGA to PC feedback loop

    NASA Astrophysics Data System (ADS)

    Linczuk, Pawel; Zabolotny, Wojciech M.; Wojenski, Andrzej; Krawczyk, Rafal D.; Pozniak, Krzysztof T.; Chernyshova, Maryna; Czarski, Tomasz; Gaska, Michal; Kasprowicz, Grzegorz; Kowalska-Strzeciwilk, Ewa; Malinowski, Karol

    2017-08-01

    The paper presents an evaluation study of the performance of a data transmission subsystem that can be used in High Energy Physics (HEP) and other High-Performance Computing (HPC) systems. The test environment consisted of a Xilinx Artix-7 FPGA and a server-grade PC connected via a PCIe x4 Gen 2 bus. The DMA engine was based on the Xilinx DMA for PCI Express Subsystem controlled by the modified Xilinx XDMA kernel driver. The research focuses on the influence of the system configuration on the achievable throughput and latency of data transfers.
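
    A minimal host-side sketch of the kind of throughput and per-transfer latency measurement described above, assuming the XDMA driver's card-to-host character device is available; the device path, transfer size, and iteration count are assumptions, not values from the paper.

      import os, time

      DEV = "/dev/xdma0_c2h_0"     # assumed card-to-host channel; adjust to your setup
      CHUNK = 4 * 1024 * 1024      # bytes per read
      N_READS = 256

      fd = os.open(DEV, os.O_RDONLY)
      latencies, total = [], 0
      t0 = time.perf_counter()
      for _ in range(N_READS):
          t_read = time.perf_counter()
          data = os.read(fd, CHUNK)
          latencies.append(time.perf_counter() - t_read)
          total += len(data)
      elapsed = time.perf_counter() - t0
      os.close(fd)

      print(f"throughput: {total / elapsed / 1e6:.1f} MB/s")
      print(f"mean per-read latency: {1e6 * sum(latencies) / len(latencies):.1f} us")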

  4. High-throughput protein concentration and buffer exchange: comparison of ultrafiltration and ammonium sulfate precipitation.

    PubMed

    Moore, Priscilla A; Kery, Vladimir

    2009-01-01

    High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, ammonium sulfate precipitation is much less labor-intensive and time-consuming than ultrafiltration.
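
    For the precipitation route, the working calculation is how much solid ammonium sulfate to add to reach a target percent saturation. The sketch below uses a commonly tabulated approximation for 20 °C; the constants vary slightly between references, so treat the output as a starting point rather than the authors' exact protocol.

      def ammonium_sulfate_to_add(volume_l, s_initial, s_final):
          """Grams of solid ammonium sulfate needed to bring volume_l litres of
          solution from s_initial to s_final percent saturation (approximate, 20 °C)."""
          if not 0 <= s_initial < s_final <= 100:
              raise ValueError("require 0 <= s_initial < s_final <= 100")
          grams_per_litre = 533.0 * (s_final - s_initial) / (100.0 - 0.3 * s_final)
          return grams_per_litre * volume_l

      # e.g. bringing a 0.5 mL lysate aliquot from 0% to 80% saturation:
      print(f"{ammonium_sulfate_to_add(0.0005, 0, 80):.3f} g per well")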

  5. An Evaluation of One-Sided and Two-Sided Communication Paradigms on Relaxed-Ordering Interconnect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Khaled Z.; Hargrove, Paul H.; Iancu, Costin

    The Cray Gemini interconnect hardware provides multiple transfer mechanisms and out-of-order message delivery to improve communication throughput. In this paper we quantify the performance of one-sided and two-sided communication paradigms with respect to: 1) the optimal available hardware transfer mechanism, 2) message ordering constraints, 3) per node and per core message concurrency. In addition to using Cray native communication APIs, we use UPC and MPI micro-benchmarks to capture one- and two-sided semantics respectively. Our results indicate that relaxing the message delivery order can improve performance up to 4.6x when compared with strict ordering. When hardware allows it, high-level one-sided programming models can already take advantage of message reordering. Enforcing the ordering semantics of two-sided communication comes with a performance penalty. Furthermore, we argue that exposing out-of-order delivery at the application level is required for the next-generation programming models. Any ordering constraints in the language specifications reduce communication performance for small messages and increase the number of active cores required for peak throughput.
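
    The contrast between the two paradigms can be reproduced at a small scale with a standard MPI microbenchmark. The sketch below (mpi4py, run with mpiexec -n 2 ...) times one-sided Put epochs against matching Send/Recv pairs; it is an illustration of the measurement idea, not the benchmark suite used in the paper, and the message size and iteration count are arbitrary.

      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      n, iters = 1 << 20, 50                 # 1 Mi doubles per transfer
      buf = np.zeros(n, dtype='d')

      # --- one-sided: rank 0 Puts into a window exposed by rank 1 ---
      win = MPI.Win.Allocate(n * 8 if rank == 1 else 0, comm=comm)
      comm.Barrier()
      t0 = MPI.Wtime()
      for _ in range(iters):
          win.Fence()
          if rank == 0:
              win.Put(buf, 1)                # target rank 1, default target region
          win.Fence()
      t_one_sided = MPI.Wtime() - t0
      win.Free()

      # --- two-sided: explicit matching Send/Recv ---
      comm.Barrier()
      t0 = MPI.Wtime()
      for _ in range(iters):
          if rank == 0:
              comm.Send(buf, dest=1)
          elif rank == 1:
              comm.Recv(buf, source=0)
      t_two_sided = MPI.Wtime() - t0

      if rank == 0:
          gb = n * 8 * iters / 1e9
          print(f"one-sided: {gb / t_one_sided:.2f} GB/s   two-sided: {gb / t_two_sided:.2f} GB/s")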

  6. A Fully Automated Microfluidic Femtosecond Laser Axotomy Platform for Nerve Regeneration Studies in C. elegans

    PubMed Central

    Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% (236/350 cases), with an average processing time of 17.0±2.4 s per worm. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130

  7. A QoS Optimization Approach in Cognitive Body Area Networks for Healthcare Applications.

    PubMed

    Ahmed, Tauseef; Le Moullec, Yannick

    2017-04-06

    Wireless body area networks are increasingly featuring cognitive capabilities. This work deals with the emerging concept of cognitive body area networks. In particular, the paper addresses two important issues, namely spectrum sharing and interference. We propose methods for channel and power allocation. The former builds upon a reinforcement learning mechanism, whereas the latter is based on convex optimization. Furthermore, we also propose a mathematical channel model for off-body communication links in line with the IEEE 802.15.6 standard. Simulation results for a nursing home scenario show that the proposed approach yields the best performance in terms of throughput and QoS in dynamic environments. For example, in a highly demanding scenario our approach can provide throughput of up to 7 Mbps while satisfying the throughput QoS requirement 97.2% of the time on average. Simulation results also show that the power optimization algorithm reduces transmission power by approximately 4.5 dBm, thereby significantly reducing interference.
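
    As an illustration of the reinforcement-learning side of the proposal, the toy agent below learns which channel tends to be interference-free using a simple epsilon-greedy update; it is our own minimal sketch, not the paper's algorithm, and the channel statistics are invented.

      import random

      class ChannelAgent:
          """Epsilon-greedy learner: one value estimate per candidate channel."""
          def __init__(self, n_channels, epsilon=0.1, alpha=0.2):
              self.q = [0.0] * n_channels
              self.epsilon, self.alpha = epsilon, alpha

          def pick(self):
              if random.random() < self.epsilon:                       # explore
                  return random.randrange(len(self.q))
              return max(range(len(self.q)), key=self.q.__getitem__)   # exploit

          def update(self, channel, reward):
              # reward: 1.0 for an interference-free slot, 0.0 otherwise
              self.q[channel] += self.alpha * (reward - self.q[channel])

      # Simulated environment: channel 2 is free 90% of the time, the others 40%.
      free_prob = [0.4, 0.4, 0.9, 0.4]
      agent = ChannelAgent(n_channels=4)
      for _ in range(2000):
          ch = agent.pick()
          agent.update(ch, reward=1.0 if random.random() < free_prob[ch] else 0.0)
      print("learned channel values:", [round(v, 2) for v in agent.q])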

  8. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput make QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy: >90% in the G1 and G2 phases and >80% in the S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
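
    A minimal sketch of the embedding step: a matrix of per-cell biophysical phenotypes (stand-ins for the at-least-24 features extracted from the amplitude and phase images) is standardized and projected to two dimensions with t-SNE. The feature values are random placeholders; only the workflow shape mirrors the description above.

      import numpy as np
      from sklearn.manifold import TSNE

      rng = np.random.default_rng(0)
      n_cells, n_phenotypes = 5000, 24
      features = rng.normal(size=(n_cells, n_phenotypes))        # placeholder phenotypes

      # Standardize each phenotype, then embed for visual inspection of
      # cell-cycle progression (e.g. G1 -> S -> G2 trajectories).
      features = (features - features.mean(axis=0)) / features.std(axis=0)
      embedding = TSNE(n_components=2, perplexity=30, init="pca").fit_transform(features)
      print(embedding.shape)                                      # (5000, 2)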

  9. Building Scientific Confidence in Read-Across: Progress in using HT Data to inform Read-Across Performance (Toxicology Forum)

    EPA Science Inventory

    Presentation at the 41st Annual Winter Meeting of The Toxicology Forum - From Assay to Assessment: Incorporating High Throughput Strategies into Health and Safety Evaluations on Building Scientific Confidence in Read-Across: Progress in using HT Data to inform Read-Across Perfor...

  10. Strategies for integrating transcriptional profiling into high throughput toxicity testing (SOT Symposium Workshop presentation)

    EPA Science Inventory

    Presentation Description: The release of the National Research Council’s Report “Toxicity Testing in the 21st Century: A Vision and a Strategy” in 2007 initiated a broad-based movement in the toxicology community to re-think how toxicity testing and risk assessment are performed....

  11. Performances of multiprocessor multidisk architectures for continuous media storage

    NASA Astrophysics Data System (ADS)

    Gennart, Benoit A.; Messerli, Vincent; Hersch, Roger D.

    1996-03-01

    Multimedia interfaces increase the need for large image databases, capable of storing and reading streams of data with strict synchronicity and isochronicity requirements. In order to fulfill these requirements, we consider a parallel image server architecture which relies on arrays of intelligent disk nodes, each disk node being composed of one processor and one or more disks. This contribution analyzes, through bottleneck performance evaluation and simulation, the behavior of two multi-processor multi-disk architectures: a point-to-point architecture and a shared-bus architecture similar to current multiprocessor workstation architectures. We compare the two architectures on the basis of two multimedia algorithms: the compute-bound frame resizing by resampling and the data-bound disk-to-client stream transfer. The results suggest that the shared bus is a potential bottleneck despite its very high hardware throughput (400 Mbytes/s) and that an architecture with addressable local memories located close to their respective processors could partially remove this bottleneck. The point-to-point architecture is scalable and able to sustain high throughputs for simultaneous compute-bound and data-bound operations.
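
    A back-of-the-envelope model of the reported bottleneck: on a shared bus the aggregate stream throughput is capped by the bus bandwidth, whereas a point-to-point fabric scales with the node count. The per-node rate below is an invented figure; only the 400 Mbytes/s bus rate comes from the abstract.

      def aggregate_throughput(n_nodes, per_node_mb_s, bus_mb_s=400):
          """Aggregate streaming throughput (MB/s) under the two interconnect models."""
          shared_bus = min(n_nodes * per_node_mb_s, bus_mb_s)
          point_to_point = n_nodes * per_node_mb_s
          return shared_bus, point_to_point

      for n in (4, 16, 64):
          shared, p2p = aggregate_throughput(n, per_node_mb_s=20)
          print(f"{n:3d} nodes: shared bus {shared:5.0f} MB/s   point-to-point {p2p:5.0f} MB/s")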

  12. Low-Cost High-Precision PIAA Optics for High Contrast Imaging with Exo-Planet Coronagraphs

    NASA Technical Reports Server (NTRS)

    Balasubramanian, Kunjithapatham; Shaklan, Stuart B.; Pueyo, Laurent; Wilson, Daniel W.; Guyon, Olivier

    2010-01-01

    PIAA optics for high contrast imaging present challenges in manufacturing and testing due to their large surface departures from aspheric profiles at the aperture edges. With smaller form factors and consequent smaller surface deformations (<50 microns), fabrication of these mirrors with diamond turning followed by electron beam lithographic techniques becomes feasible. Though such a design reduces the system throughput to approximately 50%, it still provides good performance down to a 2 λ/D inner working angle. With new achromatic focal plane mask designs, the system performance can be further improved. We report on the design, expected performance, fabrication challenges, and initial assessment of such novel PIAA optics.

  13. Strong and oriented immobilization of single domain antibodies from crude bacterial lysates for high-throughput compatible cost-effective antibody array generation

    PubMed Central

    Even-Desrumeaux, Klervi; Baty, Daniel; Chames, Patrick

    2010-01-01

    Antibody microarrays are among the novel class of rapidly emerging proteomic technologies that will allow us to efficiently perform specific diagnoses and proteome analysis. Recombinant antibody fragments are especially suited for this approach, but their stability is often a limiting factor. Camelids produce functional antibodies devoid of light chains (HCAbs), of which the single N-terminal domain is fully capable of antigen binding. When produced as an independent domain, these so-called single domain antibody fragments (sdAbs) have several advantages for biotechnological applications thanks to their unique properties of size (15 kDa), stability, solubility, and expression yield. These features should allow sdAbs to outperform other antibody formats in a number of applications, notably as capture molecules for antibody arrays. In this study, we have produced antibody microarrays using direct and oriented immobilization of sdAbs produced in crude bacterial lysates to provide a proof of principle for a high-throughput-compatible array design. Several sdAb immobilization strategies have been explored. Immobilization of in vivo biotinylated sdAbs by direct spotting of bacterial lysate on streptavidin and sandwich detection was developed to achieve high sensitivity and specificity, whereas immobilization of “multi-tagged” sdAbs via anti-tag antibodies and a direct labeled-sample detection strategy was optimized for the design of high-density antibody arrays for high-throughput proteomics and identification of potential biomarkers. PMID:20859568

  14. A Comparison of the Performance and Application Differences Between Manual and Automated Patch-Clamp Techniques

    PubMed Central

    Yajuan, Xiao; Xin, Liang; Zhiyuan, Li

    2012-01-01

    The patch clamp technique is commonly used in electrophysiological experiments and offers direct insight into ion channel properties through the characterization of ion channel activity. This technique can be used to elucidate the interaction between a drug and a specific ion channel at different conformational states to understand the ion channel modulators’ mechanisms. The patch clamp technique is regarded as a gold standard for ion channel research; however, it suffers from low throughput and high personnel costs. In the last decade, the development of several automated electrophysiology platforms has greatly increased the screening throughput of whole-cell electrophysiological recordings. New advancements in automated patch clamp systems have aimed to provide high data quality, high content, and high throughput. However, due to the limitations noted above, automated patch clamp systems are not capable of replacing manual patch clamp systems in ion channel research. While automated patch clamp systems are useful for screening large numbers of compounds in cell lines that stably express high levels of ion channels, the manual patch clamp technique is still necessary for studying ion channel properties in some research areas and for specific cell types, including primary cells that have mixed cell types and differentiated cells that derive from induced pluripotent stem cells (iPSCs) or embryonic stem cells (ESCs). Therefore, further improvements in flexibility with regard to cell types and data quality will broaden the applications of the automated patch clamp systems in both academia and industry. PMID:23346269

  15. High-Throughput Fabrication of Ultradense Annular Nanogap Arrays for Plasmon-Enhanced Spectroscopy.

    PubMed

    Cai, Hongbing; Meng, Qiushi; Zhao, Hui; Li, Mingling; Dai, Yanmeng; Lin, Yue; Ding, Huaiyi; Pan, Nan; Tian, Yangchao; Luo, Yi; Wang, Xiaoping

    2018-06-13

    The confinement of light into nanometer-sized metallic nanogaps can lead to an extremely high field enhancement, resulting in dramatically enhanced absorption, emission, and surface-enhanced Raman scattering (SERS) of molecules embedded in nanogaps. However, low-cost, high-throughput, and reliable fabrication of ultra-high-dense nanogap arrays with precise control of the gap size still remains a challenge. Here, by combining colloidal lithography and atomic layer deposition technique, a reproducible method for fabricating ultra-high-dense arrays of hexagonal close-packed annular nanogaps over large areas is demonstrated. The annular nanogap arrays with a minimum diameter smaller than 100 nm and sub-1 nm gap width have been produced, showing excellent SERS performance with a typical enhancement factor up to 3.1 × 10^6 and a detection limit of 10^-11 M. Moreover, it can also work as a high-quality field enhancement substrate for studying two-dimensional materials, such as MoSe2. Our method provides an attractive approach to produce controllable nanogaps for enhanced light-matter interaction at the nanoscale.

  16. High-Flux, High Performance H2O2 Catalyst Bed for ISTAR

    NASA Technical Reports Server (NTRS)

    Ponzo, J.

    2005-01-01

    On NASA's ISTAR RBCC program, packaging and performance requirements exceeded traditional H2O2 catalyst bed capabilities. Aerojet refined a high-performance, monolithic 90% H2O2 catalyst bed previously developed and demonstrated. This approach to catalyst bed design and fabrication was an enabling technology for the ISTAR tri-fluid engine. The catalyst bed demonstrated 55 starts at throughputs greater than 0.60 lbm/s/sq in for a duration of over 900 seconds in a physical envelope approximately 1/4 that of traditional designs. The catalyst bed uses photoetched metal plates bonded into a single-piece monolithic structure. The precise control of the geometry and complete mixing result in a repeatable, quick-starting, high-performing catalyst bed. Three different beds were designed and tested, with the best-performing bed used for tri-fluid engine testing.

  17. Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.

    PubMed

    Yang, Darren; Wong, Wesley P

    2018-01-01

    We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.
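
    The force applied in a centrifuge-based assay follows directly from the spin speed, the rotor radius, and the buoyant mass of the bead tethered to the molecule. The helper below evaluates F = m_buoyant * omega^2 * r; the bead, fluid, and rotor parameters in the example are generic illustrative values, not the ones used in the protocol.

      import math

      def centrifugal_force_pN(bead_radius_m, bead_density, fluid_density, rpm, rotor_radius_m):
          """Centrifugal force (pN) on a tethered bead: buoyant mass times omega^2 times r."""
          volume = 4.0 / 3.0 * math.pi * bead_radius_m ** 3
          m_buoyant = volume * (bead_density - fluid_density)       # kg
          omega = 2.0 * math.pi * rpm / 60.0                        # rad/s
          return m_buoyant * omega ** 2 * rotor_radius_m * 1e12     # N -> pN

      # 1 um silica bead (2650 kg/m^3) in water, 3000 rpm, 10 cm effective radius:
      print(f"{centrifugal_force_pN(0.5e-6, 2650, 1000, 3000, 0.10):.1f} pN")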

  18. High throughput single cell counting in droplet-based microfluidics.

    PubMed

    Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie

    2017-05-02

    Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at high throughput, which we used to characterize cell encapsulation and cell viability during incubation in droplets.
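
    Counting statistics of this kind are usually interpreted against the Poisson model of random encapsulation: the expected fraction of droplets holding k cells depends only on the mean occupancy, set by the cell concentration and droplet volume. The sketch below evaluates that distribution for illustrative numbers; it is the standard textbook model, not a re-derivation of the paper's results.

      import math

      def encapsulation_distribution(cells_per_ml, droplet_volume_pl, k_max=4):
          """Fraction of droplets containing k = 0..k_max cells under Poisson loading."""
          lam = cells_per_ml * droplet_volume_pl * 1e-9   # mean cells per droplet (1 pL = 1e-9 mL)
          return {k: math.exp(-lam) * lam ** k / math.factorial(k) for k in range(k_max + 1)}

      # e.g. 1e6 cells/mL dispersed into 30 pL droplets (lambda = 0.03):
      for k, p in encapsulation_distribution(1e6, 30).items():
          print(f"P({k} cells per droplet) = {p:.4f}")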

  19. High-Throughput Sequencing of Germline and Tumor From Men with Early-Onset Metastatic Prostate Cancer

    DTIC Science & Technology

    2016-12-01

    Award number: W81XWH-13-1-0371. Dates covered: 30 Sep 2013 - 29 Sep 2016. The project applied high-throughput sequencing to germline and tumor samples from men presenting with metastatic prostate cancer at a young age (before age 60 years). Whole exome sequencing identified a panel of germline variants that have...

  20. Assessment of the cPAS-based BGISEQ-500 platform for metagenomic sequencing.

    PubMed

    Fang, Chao; Zhong, Huanzi; Lin, Yuxiang; Chen, Bing; Han, Mo; Ren, Huahui; Lu, Haorong; Luber, Jacob M; Xia, Min; Li, Wangsheng; Stein, Shayna; Xu, Xun; Zhang, Wenwei; Drmanac, Radoje; Wang, Jian; Yang, Huanming; Hammarström, Lennart; Kostic, Aleksandar D; Kristiansen, Karsten; Li, Junhua

    2018-03-01

    More extensive use of metagenomic shotgun sequencing in microbiome research relies on the development of high-throughput, cost-effective sequencing. Here we present a comprehensive evaluation of the performance of the new high-throughput sequencing platform BGISEQ-500 for metagenomic shotgun sequencing and compare its performance with that of 2 Illumina platforms. Using fecal samples from 20 healthy individuals, we evaluated the intra-platform reproducibility for metagenomic sequencing on the BGISEQ-500 platform in a setup comprising 8 library replicates and 8 sequencing replicates. Cross-platform consistency was evaluated by comparing 20 pairwise replicates on the BGISEQ-500 platform vs the Illumina HiSeq 2000 platform and the Illumina HiSeq 4000 platform. In addition, we compared the performance of the 2 Illumina platforms against each other. Using a newly developed overall-accuracy quality-control method, we obtained an average of 82.45 million high-quality reads (96.06% of raw reads) per sample on the BGISEQ-500 platform, with 90.56% of bases scoring Q30 or above. Quantitative analyses revealed extremely high reproducibility between BGISEQ-500 intra-platform replicates. Cross-platform replicates differed slightly more than intra-platform replicates, yet a high consistency was observed. Only a low percentage (2.02%-3.25%) of genes exhibited significant differences in relative abundance comparing the BGISEQ-500 and HiSeq platforms, with a bias toward genes with higher GC content being enriched on the HiSeq platforms. Our study provides the first set of performance metrics for human gut metagenomic sequencing data using BGISEQ-500. The high accuracy and technical reproducibility confirm the applicability of the new platform for metagenomic studies, though caution is still warranted when combining metagenomic data from different platforms.
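
    The headline quality metric above (fraction of bases at Q30 or better) is straightforward to recompute from raw reads. The sketch below scans a plain-text FASTQ file and reports that fraction; the file name and Phred offset are assumptions for illustration, not part of the study's pipeline.

      def q30_fraction(fastq_path, phred_offset=33):
          """Fraction of base calls with quality >= Q30 in an uncompressed FASTQ file."""
          q30 = total = 0
          with open(fastq_path) as fh:
              for i, line in enumerate(fh):
                  if i % 4 == 3:                       # every fourth line holds the qualities
                      quals = line.strip()
                      total += len(quals)
                      q30 += sum(1 for c in quals if ord(c) - phred_offset >= 30)
          return q30 / total if total else 0.0

      # print(f"{100 * q30_fraction('sample.fastq'):.2f}% of bases are Q30 or above")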
