Workshop Background and Summary of Webinars (IVIVE workshop)
Toxicokinetics (TK) provides a bridge between hazard and exposure by predicting tissue concentrations due to exposure. Higher throughput toxicokinetics (HTTK) appears to provide essential data to establish context for in vitro bioactivity data obtained through high throughput ...
Higher Throughput Calorimetry: Opportunities, Approaches and Challenges
Recht, Michael I.; Coyle, Joseph E.; Bruce, Richard H.
2010-01-01
Higher throughput thermodynamic measurements can provide value in structure-based drug discovery during fragment screening, hit validation, and lead optimization. Enthalpy can be used to detect and characterize ligand binding, and changes that affect the interaction of protein and ligand can sometimes be detected more readily from changes in the enthalpy of binding than from the corresponding free-energy changes or from protein-ligand structures. Newer, higher throughput calorimeters are being incorporated into the drug discovery process. Improvements in titration calorimeters come from extensions of a mature technology and face limitations in scaling. Conversely, array calorimetry, an emerging technology, shows promise for substantial improvements in throughput and material utilization, but improved sensitivity is needed. PMID:20888754
NASA Astrophysics Data System (ADS)
Xiong, Yanmei; Zhang, Yuyan; Rong, Pengfei; Yang, Jie; Wang, Wei; Liu, Dingbin
2015-09-01
We developed a simple high-throughput colorimetric assay to detect glucose based on the glucose oxidase (GOx)-catalysed enlargement of gold nanoparticles (AuNPs). Compared with the currently available glucose kit method, the AuNP-based assay provides higher clinical sensitivity at lower cost, indicating its great potential to be a powerful tool for clinical screening of glucose. Electronic supplementary information (ESI) available: Experimental section and additional figures. See DOI: 10.1039/c5nr03758a
Nanosurveyor: a framework for real-time data processing
Daurer, Benedikt J.; Krishnan, Hari; Perciano, Talita; ...
2017-01-31
Background: The ever-improving brightness of accelerator-based sources is enabling novel observations and discoveries with faster frame rates, larger fields of view, higher resolution, and higher dimensionality. Results: Here we present an integrated software/algorithmic framework designed to capitalize on high-throughput experiments through efficient kernels and load-balanced workflows that are scalable in design. We describe the streamlined processing pipeline of ptychography data analysis. Conclusions: The pipeline provides throughput, compression, and resolution, as well as rapid feedback to the microscope operators.
High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.
Chen, Yu-Chih; Yoon, Euisik
2017-01-01
Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. In contrast, microfluidic 3D culture allows precise control of cell microenvironments and provides higher throughput by orders of magnitude. In this chapter, we will look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.
NASA Technical Reports Server (NTRS)
Crozier, Stewart N.
1990-01-01
Random access signaling, which allows slotted packets to spill over into adjacent slots, is investigated. It is shown that sloppy-slotted ALOHA can always provide higher throughput than conventional slotted ALOHA. The degree of improvement depends on the timing error distribution. Throughput performance is presented for Gaussian timing error distributions, modified to include timing error corrections. A general channel capacity lower bound, independent of the specific timing error distribution, is also presented.
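For orientation, the classical single-channel baselines (standard textbook results, not taken from this report) are S_pure = G e^(-2G) for unslotted ALOHA and S_slotted = G e^(-G) for slotted ALOHA, peaking at 1/(2e) ≈ 0.18 and 1/e ≈ 0.37 at offered loads G = 0.5 and G = 1, respectively. The sloppy-slotted analysis replaces the hard slot boundary with a timing-error distribution, so the achievable throughput depends on that distribution and on the timing-error corrections applied.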
NASA Astrophysics Data System (ADS)
Chakraborty, Swarnendu Kumar; Goswami, Rajat Subhra; Bhunia, Chandan Tilak; Bhunia, Abhinandan
2016-06-01
Aggressive packet combining (APC) scheme is well established in the literature. Several modifications have been studied earlier for improving its throughput. In this paper, three new modifications of APC are proposed, and the performance of the proposed modified APC schemes is studied by simulation and reported here. A hybrid scheme is also proposed for achieving higher throughput, and the disjoint factor of conventional APC is compared with that of the proposed schemes.
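The core mechanism behind APC is bit-wise majority voting over multiple erroneous copies of the same packet. The sketch below is a simplified illustration of that idea in Python, assuming three received copies; it does not reproduce the specific modifications proposed in this paper.

def apc_majority_combine(copies):
    """Bit-wise majority vote over an odd number of received copies of a packet.

    copies: list of equal-length bit strings (e.g., ['1011', '1001', '1010']).
    Returns the combined packet in which each bit takes the majority value.
    """
    assert len(copies) % 2 == 1, "use an odd number of copies to avoid ties"
    n = len(copies[0])
    combined = []
    for i in range(n):
        ones = sum(int(c[i]) for c in copies)
        combined.append('1' if ones > len(copies) // 2 else '0')
    return ''.join(combined)

# Example: two copies have single-bit errors in different positions; the vote recovers the original.
print(apc_majority_combine(['10110010', '10100010', '00110010']))  # -> '10110010'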
Wake Vortex Systems Cost/Benefits Analysis
NASA Technical Reports Server (NTRS)
Crisp, Vicki K.
1997-01-01
The goals of cost/benefit assessments are to provide quantitative and qualitative data to aid in the decision-making process. Benefits derived from increased throughput (or decreased delays) are used to balance life-cycle costs. Packaging technologies together may provide greater gains (demonstrating a higher return on investment).
Asati, Atul; Kachurina, Olga; Kachurin, Anatoly
2012-01-01
Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high-throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bioplex/Luminex platform. In this report, we demonstrate that the ganglioside high-throughput multiplexing tool is robust, highly specific and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high-throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided an added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for the assay of both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high-throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders. PMID:22952605
A Formal Messaging Notation for Alaskan Aviation Data
NASA Technical Reports Server (NTRS)
Rios, Joseph L.
2015-01-01
Data exchange is an increasingly important aspect of the National Airspace System. While many data communication channels have become more capable of sending and receiving data at higher throughput rates, there is still a need to use communication channels with limited throughput efficiently. The limitation can be based on technological issues, financial considerations, or both. This paper provides a complete description of several important aviation weather data types in Abstract Syntax Notation format. By doing so, data providers can take advantage of Abstract Syntax Notation's ability to encode data in a highly compressed format. When data such as pilot weather reports, surface weather observations, and various weather predictions are compressed in such a manner, it allows for the efficient use of throughput-limited communication channels. This paper provides details on the Abstract Syntax Notation One (ASN.1) implementation for Alaskan aviation data and demonstrates its use on real-world aviation weather data samples; Alaska has sparse terrestrial data infrastructure, and data are often sent via relatively costly satellite channels.
Real-Time Aggressive Image Data Compression
1990-03-31
implemented with higher degrees of modularity, concurrency, and higher levels of machine intelligence, thereby providing higher data-throughput rates... Project Summary. Project Title: Real-Time Aggressive Image Data Compression. Principal Investigators: Dr. Yih-Fang Huang and Dr. Ruey-wen Liu. Institution... Summary: The objective of the proposed research is to develop reliable algorithms that can achieve aggressive image data compression (with a compression...
Wu, Zhenlong; Chen, Yu; Wang, Moran; Chung, Aram J
2016-02-07
Fluid inertia, which has conventionally been neglected in microfluidics, has been gaining much attention for particle and cell manipulation because inertia-based methods inherently provide simple, passive, precise and high-throughput characteristics. Particularly, the inertial approach has been applied to blood separation for various biomedical research studies mainly using spiral microchannels. For higher throughput, parallelization is essential; however, it is difficult to realize using spiral channels because of their large two-dimensional layouts. In this work, we present a novel inertial platform for continuous sheathless particle and blood cell separation in straight microchannels containing microstructures. Microstructures within straight channels exert secondary flows to manipulate particle positions similar to Dean flow in curved channels but with higher controllability. Through a balance between inertial lift force and microstructure-induced secondary flow, we deterministically position microspheres and cells based on their sizes to be separated downstream. Using our inertial platform, we successfully sorted microparticles and fractionized blood cells with high separation efficiencies, high purities and high throughputs. The inertial separation platform developed here can be operated to process diluted blood with a throughput of 10.8 mL min(-1) via radially arrayed single channels with one inlet and two rings of outlets.
Opportunistic data locality for end user data analysis
NASA Astrophysics Data System (ADS)
Fischer, M.; Heidecker, C.; Kuehn, E.; Quast, G.; Giffels, M.; Schnepf, M.; Heiss, A.; Petzold, A.
2017-10-01
With the increasing data volume of LHC Run2, user analyses are evolving towards increasing data throughput. This evolution translates to higher requirements for efficiency and scalability of the underlying analysis infrastructure. We approach this issue with a new middleware to optimise data access: a layer of coordinated caches transparently provides data locality for high-throughput analyses. We demonstrated the feasibility of this approach with a prototype used for analyses of the CMS working groups at KIT. In this paper, we present our experience both with the approach in general and with our prototype in particular.
Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng
2018-06-04
In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly for high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. Compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. Higher luminescence intensity was recorded from cell-retaining microwells than from the planar regions between the microwells, and this intensity was correlated with the content of intracellular glucose. The successful monitoring of intracellular glucose at single cells using this PDMS chip will provide an alternative strategy for high-throughput single-cell analysis.
The application of the high throughput sequencing technology in the transposable elements.
Liu, Zhen; Xu, Jian-hong
2015-09-01
High throughput sequencing technology has dramatically improved the efficiency of DNA sequencing, and decreased the costs to a great extent. Meanwhile, this technology usually has advantages of better specificity, higher sensitivity and accuracy. Therefore, it has been applied to the research on genetic variations, transcriptomics and epigenomics. Recently, this technology has been widely employed in the studies of transposable elements and has achieved fruitful results. In this review, we summarize the application of high throughput sequencing technology in the fields of transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfers as well as transposon tagging. We also briefly introduce the major common sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we envision the developing trends of high throughput sequencing technology, especially the third generation sequencing technology, and its application in transposon studies in the future, hopefully providing a comprehensive understanding and reference for related scientific researchers.
Spectrum Access In Cognitive Radio Using a Two-Stage Reinforcement Learning Approach
NASA Astrophysics Data System (ADS)
Raj, Vishnu; Dias, Irene; Tholeti, Thulasi; Kalyani, Sheetal
2018-02-01
With the advent of the 5th generation of wireless standards and an increasing demand for higher throughput, methods to improve the spectral efficiency of wireless systems have become very important. In the context of cognitive radio, a substantial increase in throughput is possible if the secondary user can make smart decisions regarding which channel to sense and when or how often to sense. Here, we propose an algorithm to not only select a channel for data transmission but also to predict how long the channel will remain unoccupied so that the time spent on channel sensing can be minimized. Our algorithm learns in two stages - a reinforcement learning approach for channel selection and a Bayesian approach to determine the optimal duration for which sensing can be skipped. Comparisons with other learning methods are provided through extensive simulations. We show that the number of sensing is minimized with negligible increase in primary interference; this implies that lesser energy is spent by the secondary user in sensing and also higher throughput is achieved by saving on sensing.
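As a rough illustration of the two-stage idea (selecting a channel, then deciding how long sensing can be skipped), the sketch below pairs an epsilon-greedy bandit over channels with a Beta-posterior estimate of each channel's per-slot busy probability. The reward definition, prior, and skip rule are illustrative assumptions, not the algorithm of this paper.

import random

class TwoStageAccess:
    def __init__(self, n_channels, eps=0.1):
        self.eps = eps
        self.value = [0.0] * n_channels       # running average reward per channel
        self.count = [0] * n_channels
        self.busy = [1] * n_channels          # Beta(1,1) prior: alpha counts busy observations
        self.idle = [1] * n_channels          #                  beta counts idle observations

    def select_channel(self):
        # Stage 1: epsilon-greedy channel selection
        if random.random() < self.eps:
            return random.randrange(len(self.value))
        return max(range(len(self.value)), key=lambda c: self.value[c])

    def update(self, ch, idle_observed):
        # Reward 1 if the channel was found idle (transmission possible), else 0
        r = 1.0 if idle_observed else 0.0
        self.count[ch] += 1
        self.value[ch] += (r - self.value[ch]) / self.count[ch]
        if idle_observed:
            self.idle[ch] += 1
        else:
            self.busy[ch] += 1

    def slots_to_skip(self, ch):
        # Stage 2: for per-slot busy probability p, the expected idle run length is (1 - p) / p;
        # use the posterior mean of p to decide how many sensing slots to skip.
        p = self.busy[ch] / (self.busy[ch] + self.idle[ch])
        return max(1, int((1 - p) / p))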
Collision Resolution Scheme with Offset for Improved Performance of Heterogeneous WLAN
NASA Astrophysics Data System (ADS)
Upadhyay, Raksha; Vyavahare, Prakash D.; Tokekar, Sanjiv
2016-03-01
CSMA/CA-based DCF of the 802.11 MAC layer employs a best-effort delivery model, in which all stations compete for channel access with the same priority. Heterogeneous conditions result in unfairness among stations and degradation in throughput; therefore, providing different priorities to different applications for the required quality of service in heterogeneous networks is a challenging task. This paper proposes a collision resolution scheme with a novel concept of introducing an offset, which is suitable for heterogeneous networks. Selection of a random value by a station for its contention with an offset results in a reduced probability of collision. An expression for the optimum value of the offset is also derived. Results show that the proposed scheme, when applied to heterogeneous networks, has improved throughput and fairness over the conventional scheme. Results also show that the proposed scheme exhibits higher throughput and fairness with reduced delay in homogeneous networks.
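A minimal Monte-Carlo sketch of the offset idea: each priority class draws its backoff from a window shifted by a class-specific offset, so classes rarely contend in the same slots and the collision probability drops. The window sizes and offsets below are arbitrary illustrative values, not the optimum derived in the paper.

import random

def collision_prob(stations, trials=100_000):
    """stations: list of (offset, cw) pairs; each draws backoff ~ U[offset, offset + cw)."""
    collisions = 0
    for _ in range(trials):
        slots = [off + random.randrange(cw) for off, cw in stations]
        winner = min(slots)
        if slots.count(winner) > 1:   # two or more stations transmit in the same slot
            collisions += 1
    return collisions / trials

same_priority = [(0, 16)] * 4                           # conventional: all share one window
with_offsets  = [(0, 16), (8, 16), (16, 16), (24, 16)]  # class-specific offsets
print(collision_prob(same_priority), collision_prob(with_offsets))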
Pfannkoch, Edward A; Stuff, John R; Whitecavage, Jacqueline A; Blevins, John M; Seely, Kathryn A; Moran, Jeffery H
2015-01-01
National Oceanic and Atmospheric Administration (NOAA) Method NMFS-NWFSC-59 2004 is currently used to quantitatively analyze seafood for polycyclic aromatic hydrocarbon (PAH) contamination, especially following events such as the Deepwater Horizon oil rig explosion that released millions of barrels of crude oil into the Gulf of Mexico. This method has limited throughput capacity; hence, alternative methods are necessary to meet analytical demands after such events. Stir bar sorptive extraction (SBSE) is an effective technique to extract trace PAHs in water, and the quick, easy, cheap, effective, rugged, and safe (QuEChERS) extraction strategy effectively extracts PAHs from complex food matrices. This study uses SBSE to concentrate PAHs and eliminate matrix interference from QuEChERS extracts of seafood, specifically oysters, fish, and shrimp. This method provides acceptable recovery (65-138%) and linear calibrations, and is sensitive (LOD = 0.02 ppb, LOQ = 0.06 ppb), while providing higher throughput and maintaining equivalency with NOAA 2004, as determined by analysis of NIST SRM 1974b mussel tissue.
Enhancing Bottom-up and Top-down Proteomic Measurements with Ion Mobility Separations
Baker, Erin Shammel; Burnum-Johnson, Kristin E.; Ibrahim, Yehia M.; ...
2015-07-03
Proteomic measurements with greater throughput, sensitivity and additional structural information enhance the in-depth characterization of complex mixtures and targeted studies with higher confidence. While liquid chromatography separation coupled with mass spectrometry (LC-MS) measurements have provided information on thousands of proteins in different sample types, the addition of another rapid separation stage providing structural information has many benefits for analyses. Technical advances in ion funnels and multiplexing have enabled ion mobility separations to be easily and effectively coupled with LC-MS proteomics to enhance the information content of measurements. Finally, herein, we report on applications illustrating increased sensitivity, throughput, and structural information by utilizing IMS-MS and LC-IMS-MS measurements for both bottom-up and top-down proteomics measurements.
Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules
Panzeri, Francesco
2017-01-01
We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions. PMID:28419142
High-throughput DNA extraction of forensic adhesive tapes.
Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes
2016-09-01
Tape-lifting has, since its introduction in the early 2000s, become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit for purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples.
A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting
Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.
2016-01-01
Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945
NASA Astrophysics Data System (ADS)
Takeda, Kazuaki; Kojima, Yohei; Adachi, Fumiyuki
Frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion can provide a better bit error rate (BER) performance than rake combining. However, residual inter-chip interference (ICI) is produced after MMSE-FDE and this degrades the BER performance. Recently, we showed that frequency-domain ICI cancellation can bring the BER performance close to the theoretical lower bound. To further improve the BER performance, the transmit antenna diversity technique is effective. Cyclic delay transmit diversity (CDTD) can increase the number of equivalent paths and hence achieve a large frequency diversity gain. Space-time transmit diversity (STTD) can obtain antenna diversity gain due to the space-time coding and achieve a better BER performance than CDTD. The objective of this paper is to show that the BER performance degradation of CDTD is mainly due to the residual ICI and that the introduction of ICI cancellation gives almost the same BER performance as STTD. This study provides a very important result that CDTD has a great advantage of providing a higher throughput than STTD. This is confirmed by computer simulation. The computer simulation results show that CDTD can achieve higher throughput than STTD when ICI cancellation is introduced.
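For reference, the per-subcarrier MMSE-FDE weight takes the standard form W(k) = H*(k) / (|H(k)|^2 + sigma^2/Es). The snippet below is a generic single-antenna illustration of this equalizer; it does not include the CDTD/STTD diversity processing or the ICI cancellation studied in the paper.

import numpy as np

def mmse_fde(received_block, channel_freq_response, noise_var, symbol_energy=1.0):
    """Frequency-domain MMSE equalization of one cyclic-prefixed block."""
    R = np.fft.fft(received_block)
    H = channel_freq_response                                       # FFT of the channel impulse response
    W = np.conj(H) / (np.abs(H) ** 2 + noise_var / symbol_energy)   # per-bin MMSE weights
    return np.fft.ifft(W * R)                                       # equalized time-domain block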
A comparison of high-throughput techniques for assaying circadian rhythms in plants.
Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony
2015-01-01
Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.
Mission Advantages of NEXT: Nasa's Evolutionary Xenon Thruster
NASA Technical Reports Server (NTRS)
Oleson, Steven; Gefert, Leon; Benson, Scott; Patterson, Michael; Noca, Muriel; Sims, Jon
2002-01-01
With the demonstration of the NSTAR propulsion system on the Deep Space One mission, the range of the Discovery class of NASA missions can now be expanded. NSTAR lacks, however, sufficient performance for many of the more challenging Office of Space Science (OSS) missions. Recent studies have shown that NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system is the best choice for many exciting potential OSS missions including outer planet exploration and inner solar system sample returns. The NEXT system provides the higher power, higher specific impulse, and higher throughput required by these science missions.
Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco
2016-02-09
Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96 well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability, as well as experimental throughput, simultaneously minimizing the needed hands-on time to a third. Thereby, the presented protocol meets the demands for the analysis of samples generated by the upcoming generation of devices for higher throughput phototrophic cultivation and thereby contributes to boosting the time efficiency for setting up algae lipid production processes.
Chen, Zhidan; Coy, Stephen L; Pannkuk, Evan L; Laiakis, Evagelia C; Fornace, Albert J; Vouros, Paul
2018-05-07
High-throughput methods to assess radiation exposure are a priority due to concerns that include nuclear power accidents, the spread of nuclear weapon capability, and the risk of terrorist attacks. Metabolomics, the assessment of small molecules in an easily accessible sample, is the most recent method to be applied for the identification of biomarkers of the biological radiation response with a useful dose-response profile. Profiling for biomarker identification is frequently done using an LC-MS platform which has limited throughput due to the time-consuming nature of chromatography. We present here a chromatography-free simplified method for quantitative analysis of seven metabolites in urine with radiation dose-response using urine samples provided from the Pannkuk et al. (2015) study of long-term (7-day) radiation response in nonhuman primates (NHP). The stable isotope dilution (SID) analytical method consists of sample preparation by strong cation exchange-solid phase extraction (SCX-SPE) to remove interferences and concentrate the metabolites of interest, followed by differential mobility spectrometry (DMS) ion filtration to select the ion of interest and reduce chemical background, followed by mass spectrometry (overall SID-SPE-DMS-MS). Since no chromatography is used, calibration curves were prepared rapidly, in under 2 h (including SPE) for six simultaneously analyzed radiation biomarkers. The seventh, creatinine, was measured separately after 2500× dilution. Creatinine plays a dual role, measuring kidney glomerular filtration rate (GFR), and indicating kidney damage at high doses. The current quantitative method using SID-SPE-DMS-MS provides throughput which is 7.5 to 30 times higher than that of LC-MS and provides a path to pre-clinical radiation dose estimation.
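Quantitation in a stable isotope dilution workflow reduces to reading an analyte/internal-standard response ratio off a linear calibration curve. The snippet below illustrates only that step, with made-up calibration points; the actual metabolites, levels, and curves of this study are not reproduced here.

import numpy as np

# Hypothetical calibration: analyte concentration (uM) vs. analyte/IS peak-area ratio
conc  = np.array([0.1, 0.5, 1.0, 5.0, 10.0])
ratio = np.array([0.02, 0.11, 0.21, 1.05, 2.08])

slope, intercept = np.polyfit(conc, ratio, 1)      # linear calibration fit

def quantify(sample_ratio):
    """Convert a measured area ratio back to concentration via the calibration line."""
    return (sample_ratio - intercept) / slope

print(round(quantify(0.55), 2))   # roughly 2.6 uM for a measured ratio of 0.55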
A Fair Contention Access Scheme for Low-Priority Traffic in Wireless Body Area Networks
Sajeel, Muhammad; Bashir, Faisal; Asfand-e-yar, Muhammad; Tauqir, Muhammad
2017-01-01
Recently, wireless body area networks (WBANs) have attracted significant consideration in ubiquitous healthcare. A number of medium access control (MAC) protocols, primarily derived from the superframe structure of the IEEE 802.15.4, have been proposed in literature. These MAC protocols aim to provide quality of service (QoS) by prioritizing different traffic types in WBANs. A contention access period (CAP) with high contention in priority-based MAC protocols can result in a higher number of collisions and retransmissions. During CAP, traffic classes with higher priority are dominant over low-priority traffic; this has led to starvation of low-priority traffic, thus adversely affecting WBAN throughput, delay, and energy consumption. Hence, this paper proposes a traffic-adaptive priority-based superframe structure that is able to reduce contention in the CAP period, and provides a fair chance for low-priority traffic. Simulation results in ns-3 demonstrate that the proposed MAC protocol, called traffic-adaptive priority-based MAC (TAP-MAC), achieves low energy consumption, high throughput, and low latency compared to the IEEE 802.15.4 standard, and the most recent priority-based MAC protocol, called priority-based MAC protocol (PA-MAC). PMID:28832495
Kavlock, Robert; Dix, David
2010-02-01
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.
Molecular Pathways: Extracting Medical Knowledge from High Throughput Genomic Data
Goldstein, Theodore; Paull, Evan O.; Ellis, Matthew J.; Stuart, Joshua M.
2013-01-01
High-throughput genomic data that measures RNA expression, DNA copy number, mutation status and protein levels provide us with insights into the molecular pathway structure of cancer. Genomic lesions (amplifications, deletions, mutations) and epigenetic modifications disrupt biochemical cellular pathways. While the number of possible lesions is vast, different genomic alterations may result in concordant expression and pathway activities, producing common tumor subtypes that share similar phenotypic outcomes. How can these data be translated into medical knowledge that provides prognostic and predictive information? First generation mRNA expression signatures such as Genomic Health's Oncotype DX already provide prognostic information, but do not provide therapeutic guidance beyond the current standard of care – which is often inadequate in high-risk patients. Rather than building molecular signatures based on gene expression levels, evidence is growing that signatures based on higher-level quantities such as from genetic pathways may provide important prognostic and diagnostic cues. We provide examples of how activities for molecular entities can be predicted from pathway analysis and how the composite of all such activities, referred to here as the “activitome,” help connect genomic events to clinical factors in order to predict the drivers of poor outcome. PMID:23430023
Diffraction efficiency of radially-profiled off-plane reflection gratings
NASA Astrophysics Data System (ADS)
Miles, Drew M.; Tutt, James H.; DeRoo, Casey T.; Marlowe, Hannah; Peterson, Thomas J.; McEntaffer, Randall L.; Menz, Benedikt; Burwitz, Vadim; Hartner, Gisela; Laubis, Christian; Scholze, Frank
2015-09-01
Future X-ray missions will require gratings with high throughput and high spectral resolution. Blazed off-plane reflection gratings are capable of meeting these demands. A blazed grating profile optimizes grating efficiency, providing higher throughput to one side of zero-order on the arc of diffraction. This paper presents efficiency measurements made in the 0.3 - 1.5 keV energy band at the Physikalisch-Technische Bundesanstalt (PTB) BESSY II facility for three holographically-ruled gratings, two of which are blazed. Each blazed grating was tested in both the Littrow configuration and anti-Littrow configuration in order to test the alignment sensitivity of these gratings with regard to throughput. This paper outlines the procedure of the grating experiment performed at BESSY II and discusses the resulting efficiency measurements across various energies. Experimental results are generally consistent with theory and demonstrate that the blaze does increase throughput to one side of zero-order. However, the total efficiency of the non-blazed, sinusoidal grating is greater than that of the blazed gratings, which suggests that the method of manufacturing these blazed profiles fails to produce facets with the desired level of precision. Finally, evidence of a successful blaze implementation from first diffraction results of prototype blazed gratings produced via a new fabrication technique at the University of Iowa is presented.
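For context, the commonly quoted conical (off-plane) grating relation is sin α + sin β = nλ / (d sin γ), where α and β are the azimuthal angles of the incident and diffracted rays on the arc of diffraction, d is the groove period, and γ is the half-angle of the cone the rays make with the groove direction; a blaze tilts the groove facets so that efficiency is concentrated toward one side of zero order. This is quoted here as a standard textbook relation for orientation rather than taken from the paper.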
Higher Throughput Toxicokinetics to Allow Extrapolation (EPA-Japan Bilateral EDSP meeting)
As part of "Ongoing EDSP Directions & Activities" I will present CSS research on high throughput toxicokinetics, including in vitro data and models to allow rapid determination of the real world doses that may cause endocrine disruption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combs, S.K.; Foust, C.R.; Qualls, A.L.
Pellet injection systems for the next-generation fusion devices, such as the proposed International Thermonuclear Experimental Reactor (ITER), will require feed systems capable of providing a continuous supply of hydrogen ice at high throughputs. A straightforward concept in which multiple extruder units operate in tandem has been under development at the Oak Ridge National Laboratory. A prototype with three large-volume extruder units has been fabricated and tested in the laboratory. In experiments, it was found that each extruder could provide volumetric ice flow rates of up to ~1.3 cm³/s (for ~10 s), which is sufficient for fueling fusion reactors at the gigawatt power level. With the three extruders of the prototype operating in sequence, a steady rate of ~0.33 cm³/s was maintained for a duration of 1 h. Even steady-state rates approaching the full ITER design value (~1 cm³/s) may be feasible with the prototype. However, additional extruder units (1-3) would facilitate operations at the higher throughputs and reduce the duty cycle of each unit. The prototype can easily accommodate steady-state pellet fueling of present large tokamaks or other near-term plasma experiments.
NASA Technical Reports Server (NTRS)
Dutta, S.
1983-01-01
Applications of laser-based processing techniques to solar cell metallization are discussed. Laser-assisted thermal or photolytic maskless deposition from organometallic vapors or solutions may provide a viable alternative to photovoltaic metallization systems currently in use. High power, defocused excimer lasers may be used in conjunction with masks as an alternative to direct laser writing to provide higher throughput. Repeated pulsing with excimer lasers may eliminate the need for secondary plating techniques for metal film buildup. A comparison between the thermal and photochemical deposition processes is made.
A high performance totally ordered multicast protocol
NASA Technical Reports Server (NTRS)
Montgomery, Todd; Whetten, Brian; Kaplan, Simon
1995-01-01
This paper presents the Reliable Multicast Protocol (RMP). RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service such as IP Multicasting. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communication load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These QoS guarantees are selectable on a per packet basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, an implicit naming service, mutually exclusive handlers for messages, and mutually exclusive locks. It has commonly been held that a large performance penalty must be paid in order to implement total ordering; RMP discounts this. On SparcStation 10's on a 1250 KB/sec Ethernet, RMP provides totally ordered packet delivery to one destination at 842 KB/sec throughput and with 3.1 ms packet latency. The performance stays roughly constant independent of the number of destinations. For two or more destinations on a LAN, RMP provides higher throughput than any protocol that does not use multicast or broadcast.
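RMP distributes the ordering responsibility among sites; as a minimal, generic illustration of how a totally ordered multicast can be built on an unordered datagram service, the sketch below uses a single fixed sequencer that assigns global sequence numbers (a deliberate simplification of RMP's symmetric, rotating design).

import itertools

class Sequencer:
    """Assigns a global order to multicast messages; members deliver in that order."""
    def __init__(self):
        self.next_seq = itertools.count()

    def order(self, msg):
        return (next(self.next_seq), msg)

class Member:
    def __init__(self):
        self.expected = 0
        self.pending = {}          # out-of-order messages held back until their turn

    def receive(self, seq, msg, deliver):
        self.pending[seq] = msg
        while self.expected in self.pending:       # deliver strictly in sequence-number order
            deliver(self.pending.pop(self.expected))
            self.expected += 1

# Usage: every sender forwards through the sequencer; every member delivers in the same order.
seq = Sequencer()
m = Member()
for s, payload in [seq.order("a"), seq.order("b"), seq.order("c")]:
    m.receive(s, payload, print)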
Use of high-throughput mass spectrometry to elucidate host pathogen interactions in Salmonella
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodland, Karin D.; Adkins, Joshua N.; Ansong, Charles
Capabilities in mass spectrometry are evolving rapidly, with recent improvements in sensitivity, data analysis, and, most important from the standpoint of this review, much higher throughput allowing analysis of many samples in a single day. This short review describes how these improvements in mass spectrometry can be used to dissect host-pathogen interactions using Salmonella as a model system. This approach enabled direct identification of the majority of annotated Salmonella proteins, quantitation of expression changes under various in vitro growth conditions, and new insights into virulence and expression of Salmonella proteins within host cells. One of the most significant findings is that a very high percentage of all annotated genes (>20%) in Salmonella are regulated post-transcriptionally. In addition, new and unexpected interactions have been identified for several Salmonella virulence regulators that involve protein-protein interactions, suggesting additional functions of these regulators in coordinating virulence expression. Overall, high-throughput mass spectrometry provides a new view of pathogen-host interactions, emphasizing the protein products and defining how protein interactions determine the outcome of infection.
Throughput increase by adjustment of the BARC drying time with coat track process
NASA Astrophysics Data System (ADS)
Brakensiek, Nickolas L.; Long, Ryan
2005-05-01
Throughput of a coater module within the coater track is related to the solvent evaporation rate from the material that is being coated. Evaporation rate is controlled by the spin dynamics of the wafer and airflow dynamics over the wafer. Balancing these effects is the key to achieving very uniform coatings across a flat unpatterned wafer. As today's coat tracks are being pushed to higher throughputs to match the scanner, the coat module throughput must be increased as well. For chemical manufacturers, the evaporation rate of the material depends on the solvent used. One measure of relative evaporation rates is to compare the flash points of solvents. The lower the flash point, the quicker the solvent will evaporate. It is possible to formulate products with these volatile solvents, although at a price. Shipping and manufacturing a more flammable product increase the chances of fire, thereby increasing insurance premiums. Also, the end user of these chemicals will have to take extra precautions in the fab and in storage of these more flammable chemicals. An alternative coat process is possible which would allow higher throughput in a distinct coat module without sacrificing safety. A tradeoff is required for this process, that being a more complicated coat process and a higher viscosity chemical. The coat process uses the fact that evaporation rate depends on the spin dynamics of the wafer by utilizing a series of spin speeds that first set the thickness of the material, followed by a high spin speed to remove the residual solvent. This new process can yield a throughput of over 150 wafers per hour (wph) given two coat modules. The thickness uniformity of less than 2 nm (3 sigma) is still excellent, while drying times are shorter than 10 seconds to achieve the 150 wph throughput targets.
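The throughput target translates directly into a per-module cycle-time budget: with two coat modules working in parallel, 150 wafers per hour means each module must turn a wafer around in no more than 2 x 3600 / 150 = 48 seconds, which is why trimming the residual-solvent drying step to under about 10 seconds matters. A one-line check of that arithmetic:

modules, target_wph = 2, 150
cycle_budget_s = modules * 3600 / target_wph   # seconds available per wafer per module
print(cycle_budget_s)                          # 48.0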
Identification of functional modules using network topology and high-throughput data.
Ulitsky, Igor; Shamir, Ron
2007-01-26
With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in analysis of high throughput data.
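A toy version of the idea, under the assumption of a simple greedy seed-and-expand search (the authors' actual algorithms are more sophisticated): start from an edge of the network and keep adding neighboring nodes while the average pairwise similarity inside the module stays above a threshold.

import itertools
import networkx as nx

def greedy_module(graph, similarity, seed_edge, threshold=0.7):
    """Grow a connected module from seed_edge while mean pairwise similarity stays high.

    graph: networkx Graph; similarity: dict mapping frozenset({u, v}) -> similarity score.
    """
    module = set(seed_edge)

    def mean_sim(nodes):
        pairs = [frozenset(p) for p in itertools.combinations(nodes, 2)]
        scores = [similarity[p] for p in pairs if p in similarity]
        return sum(scores) / len(scores) if scores else 0.0

    improved = True
    while improved:
        improved = False
        neighbors = {n for v in module for n in graph.neighbors(v)} - module
        for cand in neighbors:
            if mean_sim(module | {cand}) >= threshold:   # keep the module internally similar
                module.add(cand)
                improved = True
    return module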
High Throughput Screening For Hazard and Risk of Environmental Contaminants
High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...
High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Wei; Shabbir, Faizan; Gong, Chao
2015-04-13
We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.
Identifying apicoplast-targeting antimalarials using high-throughput compatible approaches
Ekland, Eric H.; Schneider, Jessica; Fidock, David A.
2011-01-01
Malarial parasites have evolved resistance to all previously used therapies, and recent evidence suggests emerging resistance to the first-line artemisinins. To identify antimalarials with novel mechanisms of action, we have developed a high-throughput screen targeting the apicoplast organelle of Plasmodium falciparum. Antibiotics known to interfere with this organelle, such as azithromycin, exhibit an unusual phenotype whereby the progeny of drug-treated parasites die. Our screen exploits this phenomenon by assaying for “delayed death” compounds that exhibit a higher potency after two cycles of intraerythrocytic development compared to one. We report a primary assay employing parasites with an integrated copy of a firefly luciferase reporter gene and a secondary flow cytometry-based assay using a nucleic acid stain paired with a mitochondrial vital dye. Screening of the U.S. National Institutes of Health Clinical Collection identified known and novel antimalarials including kitasamycin. This inexpensive macrolide, used for agricultural applications, exhibited an in vitro IC50 in the 50 nM range, comparable to the 30 nM activity of our control drug, azithromycin. Imaging and pharmacologic studies confirmed kitasamycin action against the apicoplast, and in vivo activity was observed in a murine malaria model. These assays provide the foundation for high-throughput campaigns to identify novel chemotypes for combination therapies to treat multidrug-resistant malaria.—Ekland, E. H., Schneider, J., Fidock, D. A. Identifying apicoplast-targeting antimalarials using high-throughput compatible approaches. PMID:21746861
NASA Technical Reports Server (NTRS)
Li, Jing; Hylton, Alan; Budinger, James; Nappier, Jennifer; Downey, Joseph; Raible, Daniel
2012-01-01
Due to its simplicity and robustness against wavefront distortion, pulse position modulation (PPM) with a photon counting detector has been seriously considered for long-haul optical wireless systems. This paper evaluates the dual-pulse case and compares it with the conventional single-pulse case. Analytical expressions for symbol error rate and bit error rate are first derived and numerically evaluated for the strong, negative-exponential turbulent atmosphere; and bandwidth efficiency and throughput are subsequently assessed. It is shown that, under a set of practical constraints including pulse width and pulse repetition frequency (PRF), dual-pulse PPM enables better channel utilization and hence a higher throughput than its single-pulse counterpart. This result is new and different from previous idealistic studies that showed multi-pulse PPM provided no essential information-theoretic gains over single-pulse PPM.
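The alphabet sizes show where the potential gain comes from: with M slots per frame, single-pulse PPM carries log2(M) bits per frame while dual-pulse PPM carries log2(C(M,2)) bits; whether this translates into higher throughput depends on the pulse-width and PRF constraints analyzed in the paper. A small check of the per-frame numbers (the frame size below is an arbitrary example, not taken from the paper):

from math import comb, log2

M = 16                              # slots per PPM frame (illustrative value)
single = log2(M)                    # 4.0 bits per frame
dual = log2(comb(M, 2))             # log2(120), about 6.9 bits per frame
print(single, round(dual, 2), round(dual / single, 2))   # about 1.73x more bits per frame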
TCP Throughput Profiles Using Measurements over Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata
Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
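In the spirit of the ramp-up/sustainment abstraction (a toy model, not the authors' formulation), average throughput over a fixed-length transfer can be approximated by charging a slow-start phase whose duration grows with RTT and crediting the rest of the transfer at full link rate; longer RTTs spend more of the transfer in ramp-up, which is what bends the profile.

def mean_throughput_bps(rtt_s, capacity_bps, duration_s, mss_bits=12_000, init_cwnd=10):
    """Toy ramp-up + sustainment model of a single TCP flow on a dedicated path."""
    bdp_bits = capacity_bps * rtt_s
    cwnd_bits = init_cwnd * mss_bits
    sent, t = 0.0, 0.0
    while cwnd_bits < bdp_bits and t + rtt_s <= duration_s:    # slow start: cwnd doubles each RTT
        sent += cwnd_bits
        cwnd_bits *= 2
        t += rtt_s
    sent += capacity_bps * max(0.0, duration_s - t)            # sustainment at link capacity
    return sent / duration_s

for rtt_ms in (10, 100, 200, 366):
    print(rtt_ms, round(mean_throughput_bps(rtt_ms / 1000, 10e9, 60) / 1e9, 2), "Gbps")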
Shimada, Tsutomu; Kelly, Joan; LaMarr, William A; van Vlies, Naomi; Yasuda, Eriko; Mason, Robert W.; Mackenzie, William; Kubaski, Francyne; Giugliani, Roberto; Chinen, Yasutsugu; Yamaguchi, Seiji; Suzuki, Yasuyuki; Orii, Kenji E.; Fukao, Toshiyuki; Orii, Tadao; Tomatsu, Shunji
2014-01-01
Mucopolysaccharidoses (MPS) are caused by deficiency of one of a group of specific lysosomal enzymes, resulting in excessive accumulation of glycosaminoglycans (GAGs). We previously developed GAG assay methods using liquid chromatography tandem mass spectrometry (LC-MS/MS); however, it takes 4–5 min per sample for analysis. For the large numbers of samples in a screening program, a more rapid process is desirable. The automated high-throughput mass spectrometry (HT-MS/MS) system (RapidFire) integrates a solid phase extraction robot to concentrate and desalt samples prior to direction into the MS/MS without chromatographic separation; thereby allowing each sample to be processed within ten seconds (enabling screening of more than one million samples per year). The aim of this study was to develop a higher throughput system to assay heparan sulfate (HS) using HT-MS/MS, and to compare its reproducibility, sensitivity and specificity with conventional LC-MS/MS. HS levels were measured in blood (plasma and serum) from control subjects and patients with MPS II, III, or IV and in dried blood spots (DBS) from newborn controls and patients with MPS I, II, or III. Results obtained from HT-MS/MS showed 1) that there was a strong correlation of levels of disaccharides derived from HS in blood, between those calculated using conventional LC-MS/MS and HT-MS/MS, 2) that levels of HS in blood were significantly elevated in patients with MPS II and III, but not in MPS IVA, 3) that the level of HS in patients with a severe form of MPS II was higher than that in an attenuated form, 4) that reduction of blood HS level was observed in MPS II patients treated with enzyme replacement therapy or hematopoietic stem cell transplantation, and 5) that levels of HS in newborn DBS were elevated in patients with MPS I, II or III, compared to control newborns. In conclusion, HT-MS/MS provides much higher throughput than LC-MS/MS-based methods with similar sensitivity and specificity in an HS assay, indicating that HT-MS/MS may be feasible for diagnosis, monitoring, and newborn screening of MPS. PMID:25092413
Asif, Muhammad; Guo, Xiangzhou; Zhang, Jing; Miao, Jungang
2018-04-17
Digital cross-correlation is central to many applications including but not limited to digital image processing, satellite navigation and remote sensing. With recent advancements in digital technology, the computational demands of such applications have increased enormously. In this paper we present a high-throughput digital cross-correlator capable of processing a 1-bit digitized stream at a rate of up to 2 GHz simultaneously on 64 channels, i.e., approximately 4 trillion correlation and accumulation operations per second. To achieve higher throughput, we focused on frequency-based partitioning of the design and tried to minimize and localize high-frequency operations. This correlator is designed for a Passive Millimeter Wave Imager intended for the detection of contraband items concealed on the human body. The goals are to increase the system bandwidth, achieve video rate imaging, improve sensitivity and reduce the size. The design methodology is detailed in subsequent sections, elaborating the techniques that enable high throughput. The design is verified for a Xilinx Kintex UltraScale device in simulation, and the implementation results are given in terms of device utilization and power consumption estimates. Our results show considerable improvements in throughput as compared to our baseline design, while the correlator successfully meets the functional requirements.
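For readers unfamiliar with 1-bit correlation, the sketch below is a minimal software model of the core operation (XNOR of sign bits followed by a popcount accumulation); it is not the FPGA architecture described above, and the bit patterns are made up.

```python
# Software model of 1-bit cross-correlation: each sample is a single sign bit,
# so the correlation of two streams reduces to XNOR (agreement detection)
# followed by counting the set bits and accumulating the result.
def one_bit_correlate(a_bits: int, b_bits: int, n_samples: int) -> int:
    """Return (#agreements - #disagreements) for two packed 1-bit streams,
    which is proportional to the correlation of the sign-quantized signals."""
    agreements = bin(~(a_bits ^ b_bits) & ((1 << n_samples) - 1)).count("1")
    return 2 * agreements - n_samples

# Two made-up 16-sample streams that differ in only two positions:
a = 0b1011001110001111
b = 0b1011001010011111
print(one_bit_correlate(a, b, 16))   # 12, i.e. 14 agreements vs 2 disagreements
```

A hardware correlator performs exactly this XNOR-and-count step in parallel across channels and lags, which is why the 1-bit quantization keeps the per-operation cost so low.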
Adaptive Packet Combining Scheme in Three State Channel Model
NASA Astrophysics Data System (ADS)
Saring, Yang; Bulo, Yaka; Bhunia, Chandan Tilak
2018-01-01
Two popular packet-combining-based error correction schemes are the Packet Combining (PC) scheme and the Aggressive Packet Combining (APC) scheme. Each has its own merits and demerits: PC offers better throughput than APC but suffers from a higher packet error rate. Because the wireless channel state changes continually, applying the SR ARQ, PC or APC scheme individually cannot deliver the desired levels of throughput. Better throughput can be achieved if the transmission scheme is chosen according to the channel condition; based on this approach, an adaptive packet combining scheme is proposed. The proposed scheme adapts to the channel condition, carrying out transmission using the PC, APC or SR ARQ scheme as appropriate. Experimentally, the error correction capability and throughput of the proposed scheme were observed to be significantly better than those of the SR ARQ, PC and APC schemes used individually.
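The sketch below illustrates the bit-wise majority voting that aggressive packet combining relies on; it is a toy model (three copies of a two-byte packet, independent bit errors assumed), not the full PC/APC/SR ARQ protocol logic of the proposed scheme.

```python
# Toy illustration of APC-style packet combining: the receiver keeps several
# erroneous copies of the same packet and recovers the packet by bit-wise
# majority voting across the copies. Packet contents and errors are invented.
def majority_combine(copies):
    """Bit-wise majority vote over an odd number of equal-length byte strings."""
    n = len(copies)
    out = bytearray(len(copies[0]))
    for i in range(len(copies[0])):
        for bit in range(8):
            ones = sum((c[i] >> bit) & 1 for c in copies)
            if ones * 2 > n:
                out[i] |= 1 << bit
    return bytes(out)

sent = b"\x5a\x3c"
received = [b"\x5a\x3d", b"\x7a\x3c", b"\x5a\x3c"]   # two copies carry single-bit errors
print(majority_combine(received) == sent)            # True: errors corrected
```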
High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes
USDA-ARS?s Scientific Manuscript database
High-throughput SNP genotyping provides a rapid way of developing resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A
2017-08-01
Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of <12%. Using isobaric tags for relative and absolute quantitation (iTRAQ) 4-plex, >4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.
Fast infrared chemical imaging with a quantum cascade laser.
Yeh, Kevin; Kenkel, Seth; Liu, Jui-Nung; Bhargava, Rohit
2015-01-06
Infrared (IR) spectroscopic imaging systems are a powerful tool for visualizing molecular microstructure of a sample without the need for dyes or stains. Table-top Fourier transform infrared (FT-IR) imaging spectrometers, the current established technology, can record broadband spectral data efficiently but require scanning the entire spectrum with a low throughput source. The advent of high-intensity, broadly tunable quantum cascade lasers (QCL) has now accelerated IR imaging but results in a fundamentally different type of instrument and approach, namely, discrete frequency IR (DF-IR) spectral imaging. While the higher intensity of the source provides a higher signal per channel, the absence of spectral multiplexing also provides new opportunities and challenges. Here, we couple a rapidly tunable QCL with a high performance microscope equipped with a cooled focal plane array (FPA) detector. Our optical system is conceptualized to provide optimal performance based on recent theory and design rules for high-definition (HD) IR imaging. Multiple QCL units are multiplexed together to provide spectral coverage across the fingerprint region (776.9 to 1904.4 cm-1) in our DF-IR microscope capable of broad spectral coverage, wide-field detection, and diffraction-limited spectral imaging. We demonstrate that the spectral and spatial fidelity of this system is at least as good as the best FT-IR imaging systems. Our configuration provides a speedup for equivalent spectral signal-to-noise ratio (SNR) compared to the best spectral quality from a high-performance linear array system that has 10-fold larger pixels. Compared to the fastest available HD FT-IR imaging system, we demonstrate scanning of large tissue microarrays (TMA) in three orders of magnitude less time per essential spectral frequency. These advances offer new opportunities for high throughput IR chemical imaging, especially for the measurement of cells and tissues.
High throughput toxicology programs, such as ToxCast and Tox21, have provided biological effects data for thousands of chemicals at multiple concentrations. Compared to traditional, whole-organism approaches, high throughput assays are rapid and cost-effective, yet they generall...
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...
Uplink Downlink Rate Balancing and Throughput Scaling in FDD Massive MIMO Systems
NASA Astrophysics Data System (ADS)
Bergel, Itsik; Perets, Yona; Shamai, Shlomo
2016-05-01
In this work we extend the concept of uplink-downlink rate balancing to frequency division duplex (FDD) massive MIMO systems. We consider a base station with a large number of antennas serving many single-antenna users. We first show that any unused capacity in the uplink can be traded off for higher throughput in the downlink in a system that uses either dirty paper (DP) coding or linear zero-forcing (ZF) precoding. We then also study the scaling of the system throughput with the number of antennas in cases of linear beamforming (BF) precoding, ZF precoding, and DP coding. We show that the downlink throughput is proportional to the logarithm of the number of antennas. While this logarithmic scaling is lower than the linear scaling of the rate in the uplink, it can still bring significant throughput gains. For example, we demonstrate through analysis and simulation that increasing the number of antennas from 4 to 128 will increase the throughput by more than a factor of 5. We also show that a logarithmic scaling of downlink throughput as a function of the number of receive antennas can be achieved even when the number of transmit antennas only increases logarithmically with the number of receive antennas.
Minimum Interference Channel Assignment Algorithm for Multicast in a Wireless Mesh Network.
Choi, Sangil; Park, Jong Hyuk
2016-12-02
Wireless mesh networks (WMNs) have been considered as one of the key technologies for the configuration of wireless machines since they emerged. In a WMN, wireless routers provide multi-hop wireless connectivity between hosts in the network and also allow them to access the Internet via gateway devices. Wireless routers are typically equipped with multiple radios operating on different channels to increase network throughput. Multicast is a form of communication that delivers data from a source to a set of destinations simultaneously. It is used in a number of applications, such as distributed games, distance education, and video conferencing. In this study, we address a channel assignment problem for multicast in multi-radio multi-channel WMNs. In a multi-radio multi-channel WMN, two nearby nodes will interfere with each other and cause a throughput decrease when they transmit on the same channel. Thus, an important goal for multicast channel assignment is to reduce the interference among networked devices. We have developed a minimum interference channel assignment (MICA) algorithm for multicast that accurately models the interference relationship between pairs of multicast tree nodes using the concept of the interference factor and assigns channels to tree nodes to minimize interference within the multicast tree. Simulation results show that MICA achieves higher throughput and lower end-to-end packet delay compared with an existing channel assignment algorithm named multi-channel multicast (MCM). In addition, MICA achieves much lower throughput variation among the destination nodes than MCM.
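As a rough illustration of the interference-minimizing idea (not the MICA algorithm itself, whose interference-factor model is only summarized above), the sketch below greedily assigns to each multicast tree node the channel least used by its interfering neighbours; the topology and channel numbers are invented.

```python
# Simplified greedy channel assignment for a multicast tree: each tree node
# picks the channel least used among its already-assigned interfering
# neighbours. This only sketches the interference-minimizing idea; MICA's
# actual interference factor and node ordering are more elaborate.
def assign_channels(tree_nodes, interferes, channels):
    """tree_nodes: node ids in tree order;
    interferes: dict node -> set of nodes within interference range;
    channels: list of available channel ids."""
    assignment = {}
    for node in tree_nodes:
        # Count how often each channel is already used by interfering nodes.
        load = {ch: 0 for ch in channels}
        for nbr in interferes.get(node, ()):
            if nbr in assignment:
                load[assignment[nbr]] += 1
        assignment[node] = min(channels, key=lambda ch: load[ch])
    return assignment

interference = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(assign_channels([1, 2, 3, 4], interference, channels=[36, 40, 44]))
# {1: 36, 2: 40, 3: 44, 4: 36} -- interfering neighbours end up on distinct channels
```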
Quantitative proteomics in cardiovascular research: global and targeted strategies
Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun
2014-01-01
Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501
A modified error correction protocol for CCITT signalling system no. 7 on satellite links
NASA Astrophysics Data System (ADS)
Kreuer, Dieter; Quernheim, Ulrich
1991-10-01
Comite Consultatif International des Telegraphe et Telephone (CCITT) Signalling System No. 7 (SS7) provides a level 2 error correction protocol particularly suited for links with propagation delays higher than 15 ms. Not being originally designed for satellite links, however, the so-called Preventive Cyclic Retransmission (PCR) method only performs well on satellite channels when traffic is low. A modified level 2 error control protocol, termed the Fix Delay Retransmission (FDR) method, is suggested, which performs better at high loads, thus providing a more efficient use of the limited carrier capacity. Both the PCR and the FDR methods are investigated by means of simulation, and results concerning throughput, queueing delay, and system delay are presented. The FDR method exhibits higher capacity and shorter delay than the PCR method.
High-throughput quantification of hydroxyproline for determination of collagen.
Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan
2011-10-15
An accurate and high-throughput assay for collagen is essential for collagen research and development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The time required for sample preparation using acid hydrolysis and neutralization prior to assay is what limits the current method for determining hydroxyproline. This work describes the conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline. Copyright © 2011 Elsevier Inc. All rights reserved.
A 0.13-µm implementation of 5 Gb/s and 3-mW folded parallel architecture for AES algorithm
NASA Astrophysics Data System (ADS)
Rahimunnisa, K.; Karthigaikumar, P.; Kirubavathy, J.; Jayakumar, J.; Kumar, S. Suresh
2014-02-01
A new architecture for encrypting and decrypting confidential data using the Advanced Encryption Standard algorithm is presented in this article. The structure combines a folded structure with a parallel architecture to increase throughput. The whole architecture achieves high throughput with low power. The proposed architecture is implemented in 0.13-µm complementary metal-oxide-semiconductor (CMOS) technology. The proposed structure is compared with different existing structures, and the results show that it achieves higher throughput and lower power than existing works.
Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho
2017-01-01
Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.
20170308 - Higher Throughput Toxicokinetics to Allow ...
As part of "Ongoing EDSP Directions & Activities" I will present CSS research on high throughput toxicokinetics, including in vitro data and models to allow rapid determination of the real world doses that may cause endocrine disruption. This is a presentation as part of the U.S. Environmental Protection Agency – Japan Ministry of the Environment 12th Bilateral Meeting on Endocrine Disruption Test Methods Development.
2017-12-11
provides ultra-low energy search operations. To improve throughput, the in-array pipeline scheme has been developed, allowing the MeTCAM to operate at a...controlled magnetic tunnel junction (VC-MTJ), which not only reduces cell area (thus achieving higher density) but also eliminates standby energy. This...Variations of the cell design are presented and evaluated. The results indicated a potential 90x improvement in the energy efficiency and a 50x
High-throughput sample adaptive offset hardware architecture for high-efficiency video coding
NASA Astrophysics Data System (ADS)
Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin
2018-03-01
A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method of rate-distortion cost calculation is proposed to reduce the computational complexity in the mode decision of SAO. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filters architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filters architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meet the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
GPU Lossless Hyperspectral Data Compression System
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh I.; Keymeulen, Didier; Kiely, Aaron B.; Klimesh, Matthew A.
2014-01-01
Hyperspectral imaging systems onboard aircraft or spacecraft can acquire large amounts of data, putting a strain on limited downlink and storage resources. Onboard data compression can mitigate this problem but may require a system capable of a high throughput. In order to achieve a high throughput with a software compressor, a graphics processing unit (GPU) implementation of a compressor was developed targeting the current state-of-the-art GPUs from NVIDIA(R). The implementation is based on the fast lossless (FL) compression algorithm reported in "Fast Lossless Compression of Multispectral-Image Data" (NPO- 42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which operates on hyperspectral data and achieves excellent compression performance while having low complexity. The FL compressor uses an adaptive filtering method and achieves state-of-the-art performance in both compression effectiveness and low complexity. The new Consultative Committee for Space Data Systems (CCSDS) Standard for Lossless Multispectral & Hyperspectral image compression (CCSDS 123) is based on the FL compressor. The software makes use of the highly-parallel processing capability of GPUs to achieve a throughput at least six times higher than that of a software implementation running on a single-core CPU. This implementation provides a practical real-time solution for compression of data from airborne hyperspectral instruments.
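To illustrate the prediction-plus-residual idea that low-complexity lossless predictive compressors such as FL/CCSDS 123 build on (the real predictor is an adaptive, multi-band filter; this single-band delta predictor is only a toy):

```python
# Minimal illustration of predictive lossless coding: predict each sample
# from its neighbour and keep only the (typically small) residuals, which
# can then be entropy-coded cheaply. The FL/CCSDS-123 predictor is adaptive
# and uses neighbouring spectral bands; this delta predictor is a toy.
samples = [101, 103, 104, 104, 107, 110, 111]
residuals = [samples[0]] + [s - p for p, s in zip(samples, samples[1:])]
print(residuals)            # [101, 2, 1, 0, 3, 3, 1] -- small values code cheaply

# Lossless reconstruction from the residuals:
recon = []
for r in residuals:
    recon.append(r if not recon else recon[-1] + r)
assert recon == samples
```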
Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong
2014-01-01
Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980
High-throughput analysis of yeast replicative aging using a microfluidic system
Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong
2015-01-01
Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cell, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317
Break-up of droplets in a concentrated emulsion flowing through a narrow constriction
NASA Astrophysics Data System (ADS)
Kim, Minkyu; Rosenfeld, Liat; Tang, Sindy; Tang Lab Team
2014-11-01
Droplet microfluidics has enabled a wide range of high throughput screening applications. Compared with other technologies such as robotic screening technology, droplet microfluidics has 1000 times higher throughput, which makes the technology one of the most promising platforms for the ultrahigh throughput screening applications. Few studies have considered the throughput of the droplet interrogation process, however. In this research, we show that the probability of break-up increases with increasing flow rate, entrance angle to the constriction, and size of the drops. Since single drops do not break at the highest flow rate used in the system, break-ups occur primarily from the interactions between highly packed droplets close to each other. Moreover, the probabilistic nature of the break-up process arises from the stochastic variations in the packing configuration. Our results can be used to calculate the maximum throughput of the serial interrogation process. For 40 pL-drops, the highest throughput with less than 1% droplet break-up was measured to be approximately 7,000 drops per second. In addition, the results are useful for understanding the behavior of concentrated emulsions in applications such as mobility control in enhanced oil recovery.
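A worked example of the volumetric rate implied by the quoted maximum interrogation throughput, assuming monodisperse 40 pL drops:

```python
# Implied volumetric flow at the quoted maximum interrogation rate of
# 7,000 drops per second with 40 pL drops (monodispersity assumed).
drops_per_s = 7_000
drop_volume_pl = 40
ul_per_s = drops_per_s * drop_volume_pl * 1e-6          # pL -> uL
print(f"{ul_per_s:.2f} uL/s (~{ul_per_s * 3600 / 1000:.2f} mL/h)")  # 0.28 uL/s, ~1.01 mL/h
```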
Fourier transform spectroscopy of cotton and cotton trash
USDA-ARS?s Scientific Manuscript database
Fourier Transform techniques have been shown to have higher signal-to-noise capabilities, higher throughput, negligible stray light, continuous spectra, and higher resolution. In addition, FT spectroscopy affords for frequencies in spectra to be measured all at once and more precise wavelength calib...
Analysis of Protein Expression in Cell Microarrays: A Tool for Antibody-based Proteomics
Andersson, Ann-Catrin; Strömberg, Sara; Bäckvall, Helena; Kampf, Caroline; Uhlen, Mathias; Wester, Kenneth; Pontén, Fredrik
2006-01-01
Tissue microarray (TMA) technology provides a possibility to explore protein expression patterns in a multitude of normal and disease tissues in a high-throughput setting. Although TMAs have been used for analysis of tissue samples, robust methods for studying in vitro cultured cell lines and cell aspirates in a TMA format have been lacking. We have adopted a technique to homogeneously distribute cells in an agarose gel matrix, creating an artificial tissue. This enables simultaneous profiling of protein expression in suspension- and adherent-grown cell samples assembled in a microarray. In addition, the present study provides an optimized strategy for the basic laboratory steps to efficiently produce TMAs. Presented modifications resulted in an improved quality of specimens and a higher section yield compared with standard TMA production protocols. Sections from the generated cell TMAs were tested for immunohistochemical staining properties using 20 well-characterized antibodies. Comparison of immunoreactivity in cultured dispersed cells and corresponding cells in tissue samples showed congruent results for all tested antibodies. We conclude that a modified TMA technique, including cell samples, provides a valuable tool for high-throughput analysis of protein expression, and that this technique can be used for global approaches to explore the human proteome. PMID:16957166
Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections
NASA Technical Reports Server (NTRS)
Tchorowski, Nicole; Murawski, Robert
2014-01-01
New space missions require higher data rates and new protocols to meet these requirements. These high data rate space communication links push the limitations of not only the space communication links, but of the ground communication networks and protocols which forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard protocol is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and the spacecraft has been emulated. The round trip time (RTT) delay for the continental NISN network has been shown to be up to 120 ms; as such, the SLE protocol was tested with network delays ranging from 0 ms to 200 ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reactions of both network tests to the delay conditions were recorded. Throughput for both of these links was set at 1.2 Gbps. The results show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced, while the base network throughput remains at the 1.2 Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high data rate links, as the link requires a constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station. In cases where queuing the data is not an option, such as during real-time transmissions, the SLE implementation cannot support high data rate communication.
Autotasked Performance in the NAS Workload: A Statistical Analysis
NASA Technical Reports Server (NTRS)
Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)
1998-01-01
A statistical analysis of the workload performance of a production-quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. Speedups for UNICOS 6 releases show consistent wall clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request fewer, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher-overhead jobs requesting 8 CPUs are penalized when compared to moderate-overhead jobs requesting 4 CPUs, thereby providing a charging incentive to NAS users to use autotasking in a manner that provides them with significantly improved turnaround while also maintaining system throughput.
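The following is a hypothetical charging rule, not the NAS formula (which the abstract does not give), illustrating the stated property that high-overhead 8-CPU jobs are penalized relative to moderate-overhead 4-CPU jobs:

```python
# Hypothetical charging rule: bill connect time (wall clock x CPUs requested)
# weighted inversely by measured parallel efficiency, so low-efficiency
# (high-overhead) wide jobs pay more per run than efficient narrower ones.
# This is an illustrative sketch, not the charging algorithm of the paper.
def charge(wall_clock_h, cpus, cpu_busy_h):
    efficiency = cpu_busy_h / (wall_clock_h * cpus)   # 1.0 = perfect scaling
    return wall_clock_h * cpus / efficiency           # charge units, penalises overhead

print(round(charge(wall_clock_h=1.0, cpus=4, cpu_busy_h=3.6), 1))  # 4.4: efficient 4-CPU job
print(round(charge(wall_clock_h=0.7, cpus=8, cpu_busy_h=4.0), 1))  # 7.8: high-overhead 8-CPU job
```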
Cao, K F; Zhang, H H; Han, H H; Song, Y; Bai, X L; Sun, H
2016-05-01
In this study, we comprehensively investigated the effect of dietary protein sources on the gut microbiome of weaned piglets fed diets comprising different protein sources, using high-throughput 16S rRNA gene-based Illumina MiSeq sequencing. A total of 48 healthy weaned piglets were allocated randomly to four treatments with 12 piglets in each group. The weaned piglets were fed with diets containing soybean meal (SBM), cottonseed meal (CSM), SBM and CSM (SC) or fish meal (FM). The intestinal content samples were taken from five segments of the small intestine. DNA was extracted from the samples and the V3-V4 regions of the 16S rRNA gene were amplified. The microbiota of the contents of the small intestine were very complex, including more than 4000 operational taxonomic units belonging to 32 different phyla. Four bacterial populations (i.e. Firmicutes, Proteobacteria, Bacteroidetes and Acidobacteria) were the most abundant bacterial groups. The genera Lactobacillus and Clostridium were found in slightly higher proportions in the groups with added CSM compared to the other groups. The proportion of reads assigned to the genus Escherichia/Shigella was much higher in the FM group. In conclusion, dietary protein source had significant effects on the small intestine microbiome of weaned piglets. Dietary protein sources have the potential to affect the small intestine microbiome of weaned piglets, which in turn will have a large impact on its metabolic capabilities and intestinal health. In this study, we successfully identified the microbiomes in the contents of the small intestine in the weaned piglets that were fed different protein source diets using high-throughput sequencing. This finding provides evidence for selecting an appropriate protein source in production settings. © 2016 The Society for Applied Microbiology.
Netest: A Tool to Measure the Maximum Burst Size, Available Bandwidth and Achievable Throughput
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Guojun; Tierney, Brian
2003-01-31
Distinguishing available bandwidth and achievable throughput is essential for improving network applications' performance. Achievable throughput is the throughput considering a number of factors such as network protocol, host speed, network path, and TCP buffer space, whereas available bandwidth only considers the network path. Without understanding this difference, trying to improve network applications' performance is like "blind men feeling the elephant" [4]. In this paper, we define and distinguish bandwidth and throughput, and debate which part of each is achievable and which is available. Also, we introduce and discuss a new concept, the Maximum Burst Size, that is crucial to network performance and bandwidth sharing. A tool, netest, is introduced to help users determine the available bandwidth and provide information for achieving better throughput while fairly sharing the available bandwidth, thus reducing misuse of the network.
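A minimal sketch of one reason achievable throughput can fall short of available bandwidth: a single TCP flow cannot exceed its socket buffer divided by the RTT. The numbers are hypothetical, and this is only one of the factors such a tool considers.

```python
# Achievable throughput of one TCP flow is bounded both by the path's
# available bandwidth and by the window (socket buffer) / RTT limit.
def achievable_throughput_mbps(available_bw_mbps, tcp_buffer_bytes, rtt_s):
    buffer_limit_mbps = tcp_buffer_bytes * 8 / rtt_s / 1e6
    return min(available_bw_mbps, buffer_limit_mbps)

# A 1 Gb/s path with a 64 KiB TCP buffer and 50 ms RTT is buffer-limited:
print(achievable_throughput_mbps(1000, 64 * 1024, 0.050))   # ~10.5 Mb/s
```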
High Throughput Transcriptomics @ USEPA (Toxicology ...
The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.
A Barcoding Strategy Enabling Higher-Throughput Library Screening by Microscopy.
Chen, Robert; Rishi, Harneet S; Potapov, Vladimir; Yamada, Masaki R; Yeh, Vincent J; Chow, Thomas; Cheung, Celia L; Jones, Austin T; Johnson, Terry D; Keating, Amy E; DeLoache, William C; Dueber, John E
2015-11-20
Dramatic progress has been made in the design and build phases of the design-build-test cycle for engineering cells. However, the test phase usually limits throughput, as many outputs of interest are not amenable to rapid analytical measurements. For example, phenotypes such as motility, morphology, and subcellular localization can be readily measured by microscopy, but analysis of these phenotypes is notoriously slow. To increase throughput, we developed microscopy-readable barcodes (MiCodes) composed of fluorescent proteins targeted to discernible organelles. In this system, a unique barcode can be genetically linked to each library member, making possible the parallel analysis of phenotypes of interest via microscopy. As a first demonstration, we MiCoded a set of synthetic coiled-coil leucine zipper proteins to allow an 8 × 8 matrix to be tested for specific interactions in micrographs consisting of mixed populations of cells. A novel microscopy-readable two-hybrid fluorescence localization assay for probing candidate interactions in the cytosol was also developed using a bait protein targeted to the peroxisome and a prey protein tagged with a fluorescent protein. This work introduces a generalizable, scalable platform for making microscopy amenable to higher-throughput library screening experiments, thereby coupling the power of imaging with the utility of combinatorial search paradigms.
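As a rough back-of-the-envelope on why organelle-targeted fluorescent barcodes scale well, the sketch below counts the combinations available when each of k discernible organelles can carry one of c fluorescent-protein colours or remain unlabelled; k and c are illustrative assumptions, not the actual MiCode design parameters.

```python
# Barcode capacity for an organelle-targeted fluorescent barcoding scheme:
# with k discernible organelles and c colours per organelle (plus the
# "unlabelled" state), the number of distinct barcodes is (c + 1) ** k.
def n_barcodes(organelles: int, colours: int) -> int:
    return (colours + 1) ** organelles

print(n_barcodes(organelles=4, colours=3))   # 256 combinations from 4 organelles, 3 colours
```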
Hierarchical Data Distribution Scheme for Peer-to-Peer Networks
NASA Astrophysics Data System (ADS)
Bhushan, Shashi; Dave, M.; Patel, R. B.
2010-11-01
In the past few years, peer-to-peer (P2P) networks have become an extremely popular mechanism for large-scale content sharing. P2P systems have focused on specific application domains (e.g. music files, video files) or on providing file-system-like capabilities. P2P is a powerful paradigm that provides a large-scale and cost-effective mechanism for data sharing, and a P2P system may also be used for storing data globally. Can a conventional database be implemented on a P2P system? Successful implementations of conventional databases on P2P systems are yet to be reported. In this paper we present a mathematical model for the replication of partitions and a hierarchy-based data distribution scheme for P2P networks. We also analyze the resource utilization and throughput of the P2P system with respect to availability when a conventional database is implemented over the P2P system with a variable query rate. Simulation results show that database partitions placed on peers with a higher availability factor perform better. The degradation index, throughput, and resource utilization are evaluated with respect to the availability factor.
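A minimal sketch of the kind of replication/availability trade-off such a model captures, assuming independent peer availability p and r replicas per partition (the actual model in the paper may differ):

```python
# A partition is reachable unless every one of its replicas sits on an
# offline peer; with independent peer availability p and r replicas,
# availability = 1 - (1 - p) ** r.
def partition_availability(p_peer_online: float, replicas: int) -> float:
    return 1.0 - (1.0 - p_peer_online) ** replicas

for r in (1, 2, 3, 4):
    print(r, round(partition_availability(0.7, r), 4))   # 0.7, 0.91, 0.973, 0.9919
```

This also shows why placing partitions on peers with a higher availability factor pays off: raising p has the same effect as adding replicas without the extra storage and update cost.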
Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)
High throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...
High-throughput sequencing methods to study neuronal RNA-protein interactions.
Ule, Jernej
2009-12-01
UV-cross-linking and RNase protection, combined with high-throughput sequencing, have provided global maps of RNA sites bound by individual proteins or ribosomes. Using a stringent purification protocol, UV-CLIP (UV-cross-linking and immunoprecipitation) was able to identify intronic and exonic sites bound by splicing regulators in mouse brain tissue. Ribosome profiling has been used to quantify ribosome density on budding yeast mRNAs under different environmental conditions. Post-transcriptional regulation in neurons requires high spatial and temporal precision, as is evident from the role of localized translational control in synaptic plasticity. It remains to be seen if the high-throughput methods can be applied quantitatively to study the dynamics of RNP (ribonucleoprotein) remodelling in specific neuronal populations during the neurodegenerative process. It is certain, however, that applications of new biochemical techniques followed by high-throughput sequencing will continue to provide important insights into the mechanisms of neuronal post-transcriptional regulation.
Blocking Filters with Enhanced Throughput for X-Ray Microcalorimetry
NASA Technical Reports Server (NTRS)
Grove, David; Betcher, Jacob; Hagen, Mark
2012-01-01
New and improved blocking filters, made of high-transmission polyimide support mesh, have been developed for microcalorimeters on several mission payloads; they can replace the nickel mesh used in previous blocking filter flight designs. To realize the resolution and signal sensitivity of today's X-ray microcalorimeters, significant improvements in the blocking filter stack are needed. Using high-transmission polyimide support mesh, it is possible to improve overall throughput on a typical microcalorimeter such as Suzaku's X-ray Spectrometer by 11%, compared to previous flight designs. Using polyimide to replace standard metal mesh means the mesh will be transparent to energies 3 keV and higher. Incorporating polyimide's advantageous strength-to-weight ratio, thermal stability, and transmission characteristics permits thinner filter materials, significantly enhancing throughput. A prototype contamination blocking filter for ASTRO-H has passed QT-level acoustic testing. Resistive traces can also be incorporated to provide decontamination capability to actively restore filter performance in orbit.
Qekwana, Daniel Nenene; Oguttu, James Wabwire; Venter, Dries; Odoi, Agricola
2016-01-01
Bovine Taenia saginata cysticercus infections (also called bovine cysticercosis or beef measles) are usually diagnosed in cattle only during post-mortem meat inspection. The aim of this study was to investigate the identification rates of these infections and to identify predictors/determinants of variation in the identification rates in abattoirs in Gauteng province, South Africa. Retrospective data for over 1.4 million cattle carcasses inspected in 26 abattoirs between January 2010 and December 2013 were used for the study. The identification rates (proportion of bovine Taenia saginata cysticercus positive carcasses) were computed and generalized estimating equations used to identify predictors/determinants of identification rates. The overall identification rate was 0.70% (95% CI: 0.45, 0.95). Significantly (p<0.05) lower rates were reported during summer (0.55%) than in other seasons. Some geographic areas reported significantly (p<0.05) higher rates than others. The identification rates in high throughput abattoirs were significantly (p<0.05) higher (RR: 9.4; 95% CI: 4.7-19.1) than in low throughput abattoirs. Similarly, the identification rates among animals from feedlots were significantly (p<0.05) higher (RR: 1.6; 95% CI: 1.7-3.5) than those from non-feedlot sources. No significant (p>0.05) association was identified between identification rates and either the number of meat inspectors per abattoir or the provider of inspection services. Although no significant association was found between identification rates and provider of inspection services, follow-up studies will need to be done to specifically investigate the potential conflict of interest arising from the fact that abattoir owners hire meat inspection services directly. Capture of abattoir surveillance data needs to include the farm address, and each case needs to be reported separately. Finally, information on the type of identified cysts (alive or calcified) needs to be collected to help better estimate the risk to consumers. This study provides useful baseline data to guide future studies, surveillance and control efforts.
Shahini, Mehdi; Yeow, John T W
2011-08-12
We report on the enhancement of electrical cell lysis using carbon nanotubes (CNTs). Electrical cell lysis systems are widely utilized in microchips as they are well suited to integration into lab-on-a-chip devices. However, cell lysis based on electrical mechanisms has high voltage requirements. Here, we demonstrate that by incorporating CNTs into microfluidic electrolysis systems, the required voltage for lysis is reduced by half and the lysis throughput at low voltages is improved by ten times, compared to non-CNT microchips. In our experiment, E. coli cells are lysed while passing through an electric field in a microchannel. Based on the lightning rod effect, the electric field strengthened at the tip of the CNTs enhances cell lysis at lower voltage and higher throughput. This approach enables easy integration of cell lysis with other on-chip high-throughput sample-preparation processes.
Searching for resistance genes to Bursaphelenchus xylophilus using high throughput screening.
Santos, Carla S; Pinheiro, Miguel; Silva, Ana I; Egas, Conceição; Vasconcelos, Marta W
2012-11-07
Pine wilt disease (PWD), caused by the pinewood nematode (PWN; Bursaphelenchus xylophilus), damages and kills pine trees and is causing serious economic damage worldwide. Although the ecological mechanism of infestation is well described, the plant's molecular response to the pathogen is not well known. This is due mainly to the lack of genomic information and the complexity of the disease. High throughput sequencing is now an efficient approach for detecting the expression of genes in non-model organisms, thus providing valuable information in spite of the lack of the genome sequence. In an attempt to unravel genes potentially involved in the pine defense against the pathogen, we hereby report the high throughput comparative sequence analysis of infested and non-infested stems of Pinus pinaster (very susceptible to PWN) and Pinus pinea (less susceptible to PWN). Four cDNA libraries from infested and non-infested stems of P. pinaster and P. pinea were sequenced in a full 454 GS FLX run, producing a total of 2,083,698 reads. The putative amino acid sequences encoded by the assembled transcripts were annotated according to Gene Ontology, to assign Pinus contigs into Biological Processes, Cellular Components and Molecular Functions categories. Most of the annotated transcripts corresponded to Picea genes (25.4-39.7%), whereas a smaller percentage matched Pinus genes (1.8-12.8%), probably a consequence of more public genomic information being available for Picea than for Pinus. The comparative transcriptome analysis showed that when P. pinaster was infested with PWN, the malate dehydrogenase gene, ABA- and water deficit stress-related genes, and PAR1 were highly expressed, while in PWN-infested P. pinea, the highly expressed genes were ricin B-related lectin and genes belonging to the SNARE and high mobility group families. Quantitative PCR experiments confirmed the differential gene expression between the two pine species. Defense-related genes triggered by nematode infestation were detected in both P. pinaster and P. pinea transcriptomes utilizing 454 pyrosequencing technology. P. pinaster showed higher abundance of genes related to transcriptional regulation, terpenoid secondary metabolism (including some with nematicidal activity) and pathogen attack. P. pinea showed higher abundance of genes related to oxidative stress and generally higher levels of expression of stress-responsive genes. This study provides essential information about the molecular defense mechanisms utilized by P. pinaster and P. pinea against PWN infestation and contributes to a better understanding of PWD.
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (R_dark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of R_dark is essential for agronomic and ecological studies. However, current methods used to measure R_dark in plant tissues are typically low-throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry, to determine accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of R_dark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10 to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of R_dark in 138 genotypes of wheat; and (2) quantification of rarely-assessed whole-plant R_dark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute R_dark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas-phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided similar values of R_dark to the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of R_dark on multiple samples simultaneously, irrespective of plant or tissue type.
Wang, Xixian; Ren, Lihui; Su, Yetian; Ji, Yuetong; Liu, Yaoping; Li, Chunyu; Li, Xunrong; Zhang, Yi; Wang, Wei; Hu, Qiang; Han, Danxiang; Xu, Jian; Ma, Bo
2017-11-21
Raman-activated cell sorting (RACS) has attracted increasing interest, yet throughput remains one major factor limiting its broader application. Here we present an integrated Raman-activated droplet sorting (RADS) microfluidic system for functional screening of live cells in a label-free and high-throughput manner, by employing the AXT-synthesizing industrial microalga Haematococcus pluvialis (H. pluvialis) as a model. Raman microspectroscopy analysis of individual cells is carried out prior to their microdroplet encapsulation, which is then directly coupled to DEP-based droplet sorting. To validate the system, H. pluvialis cells containing different levels of AXT were mixed and underwent RADS. The AXT-hyperproducing cells were sorted with an accuracy of 98.3%, an eight-fold enrichment ratio, and a throughput of ∼260 cells/min. Of the RADS-sorted cells, 92.7% remained alive and able to proliferate, which is equivalent to the unsorted cells. Thus, RADS achieves a much higher throughput than existing RACS systems, preserves the vitality of cells, and facilitates seamless coupling with downstream manipulations such as single-cell sequencing and cultivation.
High throughput imaging cytometer with acoustic focussing.
Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter
2015-10-31
We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.
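A quick consistency check on the quoted figures, assuming every bead is imaged exactly once:

```python
# At 80 frames per second, a throughput of 208,000 beads per second implies
# about 2,600 beads captured in each camera frame.
beads_per_second = 208_000
frames_per_second = 80
print(beads_per_second / frames_per_second)   # 2600.0 beads per frame
```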
Protocols and programs for high-throughput growth and aging phenotyping in yeast.
Jung, Paul P; Christian, Nils; Kay, Daniel P; Skupin, Alexander; Linster, Carole L
2015-01-01
In microorganisms, and more particularly in yeasts, a standard phenotyping approach consists of analyzing fitness by determining growth rates under different conditions. One growth assay that combines high throughput with high resolution involves the generation of growth curves from 96-well plate microcultivations in thermostated and shaking plate readers. To push the throughput of this method to the next level, we have adapted it in this study to the use of 384-well plates. The values of the extracted growth parameters (lag time, doubling time and yield of biomass) correlated well between experiments carried out in 384-well plates as compared to 96-well plates or batch cultures, validating the higher-throughput approach for phenotypic screens. The method is not restricted to the use of the budding yeast Saccharomyces cerevisiae, as shown by consistent results for other species selected from the Hemiascomycete class. Furthermore, we used the 384-well plate microcultivations to develop and validate a higher-throughput assay for yeast Chronological Life Span (CLS), a parameter that is still commonly determined by a cumbersome method based on counting "Colony Forming Units". To accelerate analysis of the large datasets generated by the described growth and aging assays, we developed the freely available software tools GATHODE and CATHODE. These tools allow for semi-automatic determination of growth parameters and CLS behavior from typical plate reader output files. The described protocols and programs will increase the time- and cost-efficiency of a number of yeast-based systems genetics experiments as well as various types of screens.
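For context on the growth parameters that tools like GATHODE extract, the following minimal Python sketch illustrates one common way to estimate lag time, doubling time and biomass yield from a single optical-density curve. It is not the authors' code; the function name, sliding-window approach and synthetic data are illustrative assumptions.

```python
import numpy as np

def growth_parameters(t_hours, od, window=5):
    """Estimate lag time, doubling time and yield from one OD growth curve.

    Illustrative only: a sliding-window linear fit of ln(OD) locates the
    maximum specific growth rate; the lag time is where that tangent crosses
    the initial ln(OD); the yield is taken as the maximum OD reached.
    """
    t = np.asarray(t_hours, dtype=float)
    ln_od = np.log(np.asarray(od, dtype=float))
    best_mu, best_i = 0.0, 0
    for i in range(len(t) - window):
        sl = slice(i, i + window)
        mu, _ = np.polyfit(t[sl], ln_od[sl], 1)   # slope = specific growth rate
        if mu > best_mu:
            best_mu, best_i = mu, i
    sl = slice(best_i, best_i + window)
    mu, intercept = np.polyfit(t[sl], ln_od[sl], 1)
    doubling_time = np.log(2) / mu                 # hours per doubling
    lag_time = (ln_od[0] - intercept) / mu         # tangent meets initial OD
    yield_biomass = float(np.max(od))              # maximum OD reached
    return lag_time, doubling_time, yield_biomass

# Example with synthetic logistic-like data
t = np.linspace(0, 24, 97)
od = 0.05 + 0.95 / (1 + np.exp(-(t - 10) / 1.5))
print(growth_parameters(t, od))
```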
I describe research on high throughput exposure and toxicokinetics. These tools provide context for data generated by high throughput toxicity screening to allow risk-based prioritization of thousands of chemicals.
Lin, Frank Yeong-Sung; Hsiao, Chiu-Han; Yen, Hong-Hsu; Hsieh, Yu-Jen
2013-01-01
One of the important applications in Wireless Sensor Networks (WSNs) is video surveillance, which includes the tasks of video data processing and transmission. Processing and transmission of image and video data in WSNs has attracted a lot of attention in recent years. Such networks are known as Wireless Visual Sensor Networks (WVSNs). WVSNs are distributed intelligent systems for collecting image or video data with unique performance, complexity, and quality of service challenges. WVSNs consist of a large number of battery-powered and resource-constrained camera nodes. End-to-end delay is a very important Quality of Service (QoS) metric for video surveillance applications in WVSNs. How to meet stringent delay QoS in resource-constrained WVSNs is a challenging issue that requires novel distributed and collaborative routing strategies. This paper proposes a Near-Optimal Distributed QoS Constrained (NODQC) routing algorithm to achieve an end-to-end route with lower delay and higher throughput. A Lagrangian Relaxation (LR)-based routing metric that considers the "system perspective" and "user perspective" is proposed to determine the near-optimal routing paths that satisfy end-to-end delay constraints with high system throughput. The empirical results show that the NODQC routing algorithm outperforms others in terms of higher system throughput with lower average end-to-end delay and delay jitter. This paper shows, for the first time, how the delay QoS can be met while achieving higher system throughput in stringently resource-constrained WVSNs.
Optimizing multi-dimensional high throughput screening using zebrafish
Truong, Lisa; Bugel, Sean M.; Chlebowski, Anna; Usenko, Crystal Y.; Simonich, Michael T.; Massey Simonich, Staci L.; Tanguay, Robert L.
2016-01-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. PMID:27453428
Xie, Yongchao; Wu, Bing; Zhang, Xu-Xiang; Yin, Jinbao; Mao, Liang; Hu, Maojie
2016-02-01
Graphene is a promising candidate as an antibacterial material owing to its bacterial toxicity. However, little information is available on the influence of graphene on gut microbiota. In this study, mice were exposed to graphene for 4 weeks, and high-throughput sequencing was applied to characterize the changes in the microbial community and antibiotic resistance genes (ARGs) in the mouse gut. The results showed that graphene exposure increased the biodiversity of the gut microbiota and changed its community structure. The 1 μg/d graphene exposure had a stronger influence on the gut microbiota than the 10 μg/d and 100 μg/d exposures, which might be due to greater aggregation of graphene at the higher doses. The influence of graphene on the gut microbiota might be attributed to graphene-induced oxidative stress and damage to cell membrane integrity. This was supported by the increase in the proportion of Gram-negative bacteria, whose outer membrane could reduce the membrane damage induced by graphene and make them more tolerant of graphene. Further, we found that graphene exposure significantly increased the abundance and types of ARGs, indicating a potential health risk of graphene. This study provides new insight into the health effects of graphene on gut microbiota. Copyright © 2015 Elsevier Ltd. All rights reserved.
Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis
NASA Technical Reports Server (NTRS)
Montgomery, Todd L.
1995-01-01
This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP refutes this belief. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LAN). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, the implementation model, and the verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peeler, D.; Edwards, T.
High-level waste (HLW) throughput (i.e., the amount of waste processed per unit of time) is primarily a function of two critical parameters: waste loading (WL) and melt rate. For the Defense Waste Processing Facility (DWPF), increasing HLW throughput would significantly reduce the overall mission life cycle costs for the Department of Energy (DOE). Significant increases in waste throughput have been achieved at DWPF since initial radioactive operations began in 1996. Key technical and operational initiatives that supported increased waste throughput included improvements in facility attainment, the Chemical Processing Cell (CPC) flowsheet, process control models and frit formulations. As a result of these key initiatives, DWPF increased WLs from a nominal 28% for Sludge Batch 2 (SB2) to approximately 34 to 38% for SB3 through SB6 while maintaining or slightly improving canister fill times. Although considerable improvements in waste throughput have been obtained, future contractual waste loading targets are nominally 40%, while canister production rates are also expected to increase (to a rate of 325 to 400 canisters per year). Although the implementation of bubblers has made a positive impact on increasing melt rate for recent sludge batches targeting WLs in the mid-30s, higher WLs will ultimately make the feeds to DWPF more challenging to process. Savannah River Remediation (SRR) recently requested the Savannah River National Laboratory (SRNL) to perform a paper study assessment using future sludge projections to evaluate whether the current Process Composition Control System (PCCS) algorithms would provide projected operating windows that allow future contractual WL targets to be met. More specifically, the objective of this study was to evaluate future sludge batch projections (based on Revision 16 of the HLW Systems Plan) with respect to projected operating windows using current PCCS models and associated constraints. Based on the assessments, the waste loading interval over which a glass system (i.e., a projected sludge composition with a candidate frit) is predicted to be acceptable can be defined (i.e., the projected operating window), which provides insight into the ability to meet future contractual WL obligations. In this study, future contractual WL obligations are assumed to be 40%, which is the goal after all flowsheet enhancements have been implemented to support DWPF operations. For a system to be considered acceptable, candidate frits must be identified that provide access to at least 40% WL while accounting for potential variation in the sludge resulting from differences in batch-to-batch transfers into the Sludge Receipt and Adjustment Tank (SRAT) and/or analytical uncertainties. In more general terms, this study will assess whether or not the current glass formulation strategy (based on the use of the Nominal and Variation Stage assessments) and current PCCS models will allow access to the compositional regions required to target higher WLs for future operations. Some of the key questions to be considered in this study include: (1) If higher WLs are attainable with current process control models, are the models valid in these compositional regions? If the higher-WL glass regions are outside current model development or validation ranges, is there existing data that could be used to demonstrate model applicability (or lack thereof)? If not, experimental data may be required to revise current models or serve as validation data with the existing models.
(2) Are there compositional trends in frit space that are required by the PCCS models to obtain access to these higher WLs? If so, are there potential issues with the compositions of the associated frits (e.g., limitations on the B2O3 and/or Li2O concentrations) as they are compared to model development/validation ranges or to the term 'borosilicate' glass? If limitations on the frit compositional range are realized, what is the impact of these restrictions on other glass properties, such as the ability to suppress nepheline formation or to influence melt rate? The model-based assessments being performed assume that the process control models are applicable over the glass compositional regions being evaluated. Although the glass compositional region of interest is ultimately defined by the specific frit, sludge, and WL interval used, there is no prescreening of these compositional regions with respect to the model development or validation ranges, which is consistent with current DWPF operations.
Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays
High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...
QPatch: the missing link between HTS and ion channel drug discovery.
Mathes, Chris; Friis, Søren; Finley, Michael; Liu, Yi
2009-01-01
The conventional patch clamp has long been considered the best approach for studying ion channel function and pharmacology. However, its low throughput has been a major hurdle to overcome for ion channel drug discovery. The recent emergence of higher throughput, automated patch clamp technology begins to break this bottleneck by providing medicinal chemists with high-quality, information-rich data in a more timely fashion. As such, these technologies have the potential to bridge a critical missing link between high-throughput primary screening and meaningful ion channel drug discovery programs. One of these technologies, the QPatch automated patch clamp system developed by Sophion Bioscience, records whole-cell ion channel currents from 16 or 48 individual cells in a parallel fashion. Here, we review the general applicability of the QPatch to studying a wide variety of ion channel types (voltage-/ligand-gated cationic/anionic channels) in various expression systems. The success rate of gigaseals, formation of the whole-cell configuration and usable cells ranged from 40-80%, depending on a number of factors including the cell line used, ion channel expressed, assay development or optimization time and expression level in these studies. We present detailed analyses of the QPatch features and results in case studies in which secondary screening assays were successfully developed for a voltage-gated calcium channel and a ligand-gated TRP channel. The increase in throughput compared to conventional patch clamp with the same cells was approximately 10-fold. We conclude that the QPatch, combining high data quality and speed with user friendliness and suitability for a wide array of ion channels, resides on the cutting edge of automated patch clamp technology and plays a pivotal role in expediting ion channel drug discovery.
Performance of TCP variants over LTE network
NASA Astrophysics Data System (ADS)
Nor, Shahrudin Awang; Maulana, Ade Novia
2016-08-01
One implementation of a wireless network is based on the mobile broadband technology Long Term Evolution (LTE). LTE offers a variety of advantages, especially in terms of access speed, capacity, architectural simplicity and ease of implementation, as well as the breadth of choice of the type of user equipment (UE) that can establish access. The majority of Internet connections in the world use TCP (Transmission Control Protocol) because of TCP's reliability in transmitting packets across the network. TCP's reliability lies in its ability to control congestion. TCP was originally designed for wired media, but LTE connects through a wireless medium that is less stable than wired media. A wide variety of TCP variants have been developed to produce better performance than their predecessors. In this study, we evaluate the performance of TCP NewReno and TCP Vegas based on simulation using network simulator version 2 (ns2). TCP performance is analyzed in terms of throughput, packet loss and end-to-end delay to evaluate the simulation. In comparing the performance of TCP NewReno and TCP Vegas, the simulation results show that the throughput of TCP NewReno is slightly higher than that of TCP Vegas, while TCP Vegas gives significantly better end-to-end delay and packet loss.
Novel screening techniques for ion channel targeting drugs
Obergrussberger, Alison; Stölzle-Feix, Sonja; Becker, Nadine; Brüggemann, Andrea; Fertig, Niels; Möller, Clemens
2015-01-01
Ion channels are integral membrane proteins that regulate the flux of ions across the cell membrane. They are involved in nearly all physiological processes, and malfunction of ion channels has been linked to many diseases. Until recently, high-throughput screening of ion channels was limited to indirect, e.g. fluorescence-based, readout technologies. In the past years, direct label-free biophysical readout technologies by means of electrophysiology have been developed. Planar patch-clamp electrophysiology provides a direct functional label-free readout of ion channel function in medium to high throughput. Further electrophysiology features, including temperature control and higher-throughput instruments, are continually being developed. Electrophysiological screening in a 384-well format has recently become possible. Advances in chip and microfluidic design, as well as in cell preparation and handling, have allowed challenging cell types to be studied by automated patch clamp. Assays measuring action potentials in stem cell-derived cardiomyocytes, relevant for cardiac safety screening, and neuronal cells, as well as a large number of different ion channels, including fast ligand-gated ion channels, have successfully been established by automated patch clamp. Impedance and multi-electrode array measurements are particularly suitable for studying cardiomyocytes and neuronal cells within their physiological network, and to address more complex physiological questions. This article discusses recent advances in electrophysiological technologies available for screening ion channel function and regulation. PMID:26556400
Schnoes, Alexandra M.; Ream, David C.; Thorman, Alexander W.; Babbitt, Patricia C.; Friedberg, Iddo
2013-01-01
The ongoing functional annotation of proteins relies upon the work of curators to capture experimental findings from the scientific literature and apply them to protein sequence and structure data. However, with the increasing use of high-throughput experimental assays, a small number of experimental studies dominate the functional protein annotations collected in databases. Here, we investigate just how prevalent the "few articles - many proteins" phenomenon is. We examine the experimentally validated annotation of proteins provided by several groups in the GO Consortium, and show that the distribution of proteins per published study is exponential, with 0.14% of articles providing the source of annotations for 25% of the proteins in the UniProt-GOA compilation. Since each of the dominant articles describes the use of an assay that can find only one function or a small group of functions, this leads to substantial biases in what we know about the function of many proteins. Mass spectrometry, microscopy and RNAi experiments dominate high-throughput experiments. Consequently, the functional information derived from these experiments is mostly of the subcellular location of proteins, and of the participation of proteins in embryonic developmental pathways. For some organisms, the information provided by different studies overlaps by a large amount. We also show that the information provided by high-throughput experiments is less specific than that provided by low-throughput experiments. Given the experimental techniques available, certain biases in protein function annotation due to high-throughput experiments are unavoidable. Knowing that these biases exist and understanding their characteristics and extent is important for database curators, developers of function annotation programs, and anyone who uses protein function annotation data to plan experiments. PMID:23737737
Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...
High-throughput screening, predictive modeling and computational embryology - Abstract
High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...
Zheng, Ji; Zhou, Zhenchao; Wei, Yuanyuan; Chen, Tao; Feng, Wanqiu; Chen, Hong
2018-05-01
The rapid expansion of human activity in a region can exacerbate the human health risks posed by antibiotic resistance genes (ARGs). Peri-urban ecosystems serve as the interface between urban and rural ecosystems, and investigations into the dissemination of ARGs in peri-urban areas provide a basic framework for tracking the spread of ARGs and potential mitigations. In this study, using high-throughput quantitative PCR and 16S rRNA gene high-throughput sequencing, the seasonal and geographical distributions of ARGs and their host bacterial communities were characterized in a peri-urban river. The abundance of ARGs downstream was 5.2-33.9 times higher than upstream, indicating distinct antibiotic resistance pollution in the areas where humans live. When samples were classified by nearby land use, the abundance of ARGs in samples near farmland and villages was 3.47-5.58 times higher than in the background, pointing to the high load in the river caused by farming and other human activities in the peri-urban areas. Based on the co-occurrence pattern revealed by network analysis, blaVEB and tetM were proposed as indicators of the ARGs clustered in the same module. Furthermore, seasonal variations in ARGs and the transport of bacterial communities were observed, and the effect of seasonal temperature on the dissemination of ARGs along the watershed was evaluated. The highest absolute abundance of ARGs occurred in summer (2.81 × 10^9 copies/L on average), and the trends in ARG abundance across the four seasons were similar to those in local air temperature. Linear discriminant analysis effect size (LEfSe) suggested that nine bacterial genera could serve as biomarkers for the corresponding seasons. Mobile genetic elements (MGEs) showed a significant positive correlation with ARGs (P < 0.01) and were also identified as the key factor driving ARG alteration. This study provides an overview of seasonal and geographical variations in ARG distribution in a peri-urban river and draws attention to controlling pollutants in peri-urban ecosystems. Copyright © 2018 Elsevier Ltd. All rights reserved.
[Current applications of high-throughput DNA sequencing technology in antibody drug research].
Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong
2012-03-01
Since the 2005 publication of a high-throughput DNA sequencing technology based on PCR carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.
NASA Astrophysics Data System (ADS)
Fang, Sheng-Po; Jao, PitFee; Senior, David E.; Kim, Kyoung-Tae; Yoon, Yong-Kyu
2017-12-01
High-throughput nanomanufacturing of photopatternable nanofibers and subsequent photopatterning is reported. For the production of high-density nanofibers, the tube nozzle electrospinning (TNE) process has been used, where an array of micronozzles on the sidewall of a plastic tube serve as spinnerets. By increasing the density of nozzles, the electric fields of adjacent nozzles confine the cone of electrospinning and give a higher density of nanofibers. With TNE, higher-density nozzles are easily achievable compared with metallic nozzles; for example, an inter-nozzle distance as small as 0.5 cm and an average semi-vertical repulsion angle of 12.28° for 8 nozzles were achieved. Nanofiber diameter distribution, mass throughput rate, and growth rate of nanofiber stacks in different operating conditions and with different numbers of nozzles, such as 2, 4 and 8 nozzles, and scalability with single- and double-tube configurations are discussed. Nanofibers made of SU-8, a photopatternable epoxy, were collected to a thickness of over 80 μm in 240 s of electrospinning, and a production rate of 0.75 g/h was achieved using the two-tube, eight-nozzle system, followed by photolithographic micropatterning. TNE is scalable to a large number of nozzles, and offers high-throughput production, plug-and-play capability with standard electrospinning equipment, and little waste of polymer.
Micro-patterned agarose gel devices for single-cell high-throughput microscopy of E. coli cells.
Priest, David G; Tanaka, Nobuyuki; Tanaka, Yo; Taniguchi, Yuichi
2017-12-21
High-throughput microscopy of bacterial cells has elucidated fundamental cellular processes including cellular heterogeneity and cell division homeostasis. Polydimethylsiloxane (PDMS)-based microfluidic devices provide advantages including precise positioning of cells and throughput, however device fabrication is time-consuming and requires specialised skills. Agarose pads are a popular alternative, however cells often clump together, which hinders single-cell quantitation. Here, we imprint agarose pads with micro-patterned 'capsules' to trap individual cells, and 'lines' to direct cellular growth outwards in a straight line. We implement this micro-patterning in multi-pad devices called CapsuleHotel and LineHotel for high-throughput imaging. CapsuleHotel provides ~65,000 capsule structures per mm² that isolate individual Escherichia coli cells. In contrast, LineHotel provides ~300 line structures per mm that direct the growth of micro-colonies. With CapsuleHotel, a quantitative single-cell dataset of ~10,000 cells across 24 samples can be acquired and analysed in under 1 hour. LineHotel allows tracking the growth of >10 micro-colonies across 24 samples simultaneously for up to 4 generations. These easy-to-use devices can be provided in kit format, and will accelerate discoveries in diverse fields ranging from microbiology to systems and synthetic biology.
Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.
Yang, Darren; Wong, Wesley P
2018-01-01
We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.
High throughput single cell counting in droplet-based microfluidics.
Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie
2017-05-02
Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at a high-throughput, used to characterize cell encapsulation and cell viability during incubation in droplets.
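Cell concentration in droplet experiments is often back-calculated from droplet occupancy under a Poisson-loading assumption; the sketch below illustrates that calculation with hypothetical counts and droplet volume, and is not the published analysis for this hemocytometer.

```python
import math

def cells_per_droplet_from_empties(n_droplets, n_empty):
    """Estimate mean occupancy lambda from the fraction of empty droplets.

    Under Poisson loading P(empty) = exp(-lambda), so
    lambda = -ln(n_empty / n_droplets).  Illustrative assumption only.
    """
    p_empty = n_empty / n_droplets
    return -math.log(p_empty)

def concentration_cells_per_ml(lam, droplet_volume_pl):
    """Convert mean occupancy to a bulk concentration in cells/mL."""
    droplet_volume_ml = droplet_volume_pl * 1e-9   # 1 pL = 1e-9 mL
    return lam / droplet_volume_ml

# Hypothetical counts: 10,000 droplets imaged, 6,065 empty
lam = cells_per_droplet_from_empties(n_droplets=10_000, n_empty=6_065)
print(lam)                                              # ~0.5 cells per droplet
print(concentration_cells_per_ml(lam, droplet_volume_pl=20.0))
```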
Optimizing multi-dimensional high throughput screening using zebrafish.
Truong, Lisa; Bugel, Sean M; Chlebowski, Anna; Usenko, Crystal Y; Simonich, Michael T; Simonich, Staci L Massey; Tanguay, Robert L
2016-10-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. Copyright © 2016 Elsevier Inc. All rights reserved.
Environmental Impact on Vascular Development Predicted by High Throughput Screening
Understanding health risks to embryonic development from exposure to environmental chemicals is a significant challenge given the diverse chemical landscape and paucity of data for most of these compounds. High throughput screening (HTS) in EPA’s ToxCastTM project provides vast d...
tcpl: The ToxCast Pipeline for High-Throughput Screening Data
Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPAToxCast program requires an efficient, transparent, and reproducible data pipeline.Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...
High-throughput screening, predictive modeling and computational embryology
High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...
High-throughput in vitro toxicity screening can provide an efficient way to identify potential biological targets for chemicals. However, relying on nominal assay concentrations may misrepresent potential in vivo effects of these chemicals due to differences in bioavailability, c...
High-Throughput Toxicokinetics (HTTK) R package (CompTox CoP presentation)
Toxicokinetics (TK) provides a bridge between HTS and HTE by predicting tissue concentrations due to exposure, but traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to determine range of effic...
Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.
2016-01-01
Quantification of the specificity of RNA binding proteins and RNA processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high throughput sequencing and internal competition kinetics to simultaneously monitor the processing rate constants of thousands of substrates by RNA processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and these are the main sources of imprecision in the quantified results when otherwise-optimized guidelines are followed. PMID:27296633
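The core of internal-competition kinetics is that, for substrates reacting in the same tube under parallel first-order kinetics, the ratio of the logarithms of their residual fractions equals the ratio of their rate constants. The sketch below applies that relation to hypothetical read fractions; it is not the published HTS-Kin pipeline, and the numbers are invented for illustration.

```python
import math

def relative_rate_constant(f_i_0, f_i_t, f_ref_0, f_ref_t, remaining_fraction):
    """k_i / k_ref from internal competition between co-incubated substrates.

    f_* are sequencing read fractions of each substrate in the residual
    (unreacted) pool before (t=0) and after (t) reaction; remaining_fraction
    is the overall fraction of substrate left at time t.  For parallel
    first-order reactions, ln(S_i(t)/S_i(0)) / ln(S_ref(t)/S_ref(0)) = k_i/k_ref.
    """
    s_i = f_i_t * remaining_fraction / f_i_0      # residual fraction of substrate i
    s_ref = f_ref_t * remaining_fraction / f_ref_0  # residual fraction of reference
    return math.log(s_i) / math.log(s_ref)

# Example: substrate i is depleted faster than the reference
print(relative_rate_constant(f_i_0=0.010, f_i_t=0.006,
                             f_ref_0=0.010, f_ref_t=0.012,
                             remaining_fraction=0.5))   # ~2.4-fold faster
```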
Khan, Arifa S; Vacante, Dominick A; Cassart, Jean-Pol; Ng, Siemon H S; Lambert, Christophe; Charlebois, Robert L; King, Kathryn E
Several nucleic-acid based technologies have recently emerged with capabilities for broad virus detection. One of these, high throughput sequencing, has the potential for novel virus detection because this method does not depend upon prior viral sequence knowledge. However, the use of high throughput sequencing for testing biologicals poses greater challenges as compared to other newly introduced tests due to its technical complexities and big data bioinformatics. Thus, the Advanced Virus Detection Technologies Users Group was formed as a joint effort by regulatory and industry scientists to facilitate discussions and provide a forum for sharing data and experiences using advanced new virus detection technologies, with a focus on high throughput sequencing technologies. The group was initiated as a task force that was coordinated by the Parenteral Drug Association and subsequently became the Advanced Virus Detection Technologies Interest Group to continue efforts for using new technologies for detection of adventitious viruses with broader participation, including international government agencies, academia, and technology service providers. © PDA, Inc. 2016.
Li, Fumin; Wang, Jun; Jenkins, Rand
2016-05-01
There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.
Park, Chanhun; Nam, Hee-Geun; Kim, Pung-Ho; Mun, Sungyong
2014-06-01
The removal of isoleucine from valine has been a key issue in the stage of valine crystallization, which is the final step in the valine production process in industry. To address this issue, a three-zone simulated moving-bed (SMB) process for the separation of valine and isoleucine has been developed previously. However, the previous process, which was based on a classical port-location mode, had some limitations in throughput and valine product concentration. In this study, a three-zone SMB process based on a modified port-location mode was applied to the separation of valine and isoleucine for the purpose of making a marked improvement in throughput and valine product concentration. Computer simulations and a lab-scale process experiment showed that the modified three-zone SMB for valine separation led to >65% higher throughput and >160% higher valine concentration compared to the previous three-zone SMB for the same separation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Cacouris, Theodore; Rao, Rajasekhar; Rokitski, Rostislav; Jiang, Rui; Melchior, John; Burfeindt, Bernd; O'Brien, Kevin
2012-03-01
Deep UV (DUV) lithography is being applied to pattern increasingly finer geometries, leading to solutions like double- and multiple-patterning. Such process complexities lead to higher costs due to the increasing number of steps required to produce the desired results. One of the consequences is that the lithography equipment needs to provide higher operating efficiencies to minimize the cost increases, especially for producers of memory devices that experience a rapid decline in sales prices of these products over time. In addition to having introduced higher power 193nm light sources to enable higher throughput, we previously described technologies that also enable: higher tool availability via advanced discharge chamber gas management algorithms; improved process monitoring via enhanced on-board beam metrology; and increased depth of focus (DOF) via light source bandwidth modulation. In this paper we will report on the field performance of these technologies with data that supports the desired improvements in on-wafer performance and operational efficiencies.
Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister
2014-05-01
The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.
High-Throughput Bit-Serial LDPC Decoder LSI Based on Multiple-Valued Asynchronous Interleaving
NASA Astrophysics Data System (ADS)
Onizawa, Naoya; Hanyu, Takahiro; Gaudet, Vincent C.
This paper presents a high-throughput bit-serial low-density parity-check (LDPC) decoder that uses an asynchronous interleaver. Since consecutive log-likelihood message values on the interleaver are similar, node computations are continuously performed by using the most recently arrived messages without significantly affecting bit-error rate (BER) performance. In the asynchronous interleaver, each message's arrival rate is based on the delay due to the wire length, so that the decoding throughput is not restricted by the worst-case latency, which results in a higher average rate of computation. Moreover, the use of a multiple-valued data representation makes it possible to multiplex control signals and data from mutual nodes, thus minimizing the number of handshaking steps in the asynchronous interleaver and eliminating the clock signal entirely. As a result, the decoding throughput becomes 1.3 times faster than that of a bit-serial synchronous decoder under a 90nm CMOS technology, at a comparable BER.
High-throughput countercurrent microextraction in passive mode.
Xie, Tingliang; Xu, Cong
2018-05-15
Although microextraction is much more efficient than conventional macroextraction, its practical application has been limited by low throughputs and difficulties in constructing robust countercurrent microextraction (CCME) systems. In this work, a robust CCME process was established based on a novel passive microextractor with four units without any moving parts. The passive microextractor has internal recirculation and can efficiently mix two immiscible liquids. The hydraulic characteristics as well as the extraction and back-extraction performance of the passive CCME were investigated experimentally. The recovery efficiencies of the passive CCME were 1.43-1.68 times larger than the best values achieved using cocurrent extraction. Furthermore, the total throughput of the passive CCME developed in this work was about one to three orders of magnitude higher than that of other passive CCME systems reported in the literature. Therefore, a robust CCME process with high throughputs has been successfully constructed, which may promote the application of passive CCME in a wide variety of fields.
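The recovery advantage of countercurrent staging over a single cocurrent contact can be illustrated with the textbook Kremser relation; the sketch below uses assumed extraction factors and stage counts, not parameters measured for the passive microextractor described here.

```python
def countercurrent_recovery(extraction_factor, n_stages):
    """Kremser equation: solute recovery after n ideal countercurrent stages.

    extraction_factor E = K * (solvent flow) / (feed flow).  Illustrative
    textbook relation, not a model of this specific microextractor.
    """
    e = extraction_factor
    if abs(e - 1.0) < 1e-9:
        return n_stages / (n_stages + 1.0)
    return (e ** (n_stages + 1) - e) / (e ** (n_stages + 1) - 1)

def cocurrent_recovery(extraction_factor):
    """A single equilibrium (cocurrent) contact recovers at most E / (1 + E)."""
    e = extraction_factor
    return e / (1.0 + e)

# Assumed extraction factors; four stages as in the device described above
for e in (1.0, 2.0):
    print(e, cocurrent_recovery(e), countercurrent_recovery(e, n_stages=4))
```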
NASA Technical Reports Server (NTRS)
Jandebeur, T. S.
1980-01-01
The effect of sample concentration on throughput and resolution in a modified continuous particle electrophoresis (CPE) system with flow in an upward direction is investigated. Maximum resolution is achieved at concentrations ranging from 2 × 10^8 cells/ml to 8 × 10^8 cells/ml. The widest peak separation is at 2 × 10^8 cells/ml; however, the sharpest peaks and least overlap between cell populations are at 8 × 10^8 cells/ml. Apparently as a result of improved electrophoresis cell performance due to coating the chamber with bovine serum albumin, changing the electrode membranes and rinse, and lowering buffer temperatures, the sedimentation effects associated with higher concentrations are diminished. Throughput, as measured by recovery of fixed cells, is diminished at the concentrations judged most likely to yield satisfactory resolution. The tradeoff appears to be improved recovery/throughput at the expense of resolution.
Toxicokinetics (TK) provides a bridge between toxicity and exposure assessment by predicting tissue concentrations due to exposure, however traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to...
High-throughput methods for characterizing the mechanical properties of coatings
NASA Astrophysics Data System (ADS)
Siripirom, Chavanin
The characterization of mechanical properties in a combinatorial and high-throughput workflow has been a bottleneck that reduced the speed of the materials development process. High-throughput characterization of mechanical properties was applied in this research in order to reduce the amount of sample handling and to accelerate the output. A puncture tester was designed and built to evaluate the toughness of materials using an innovative template design coupled with automation. The test is in the form of a circular free-film indentation. A single template contains 12 samples which are tested in a rapid serial approach. Next, the operational principles of a novel parallel dynamic mechanical-thermal analysis instrument were analyzed in detail for potential sources of errors. The test uses a model of a circular bilayer fixed-edge plate deformation. A total of 96 samples can be analyzed simultaneously, which provides a tremendous increase in efficiency compared with a conventional dynamic test. The modulus values determined by the system had considerable variation. The errors were observed and improvements to the system were made. A finite element analysis was used to analyze the accuracy given by the closed-form solution with respect to testing geometries, such as thicknesses of the samples. Good control of the sample thickness proved to be crucial to the accuracy and precision of the output. Then, an attempt was made to correlate the high-throughput experiments with conventional coating testing methods. Automated nanoindentation in dynamic mode was found to provide information on the near-surface modulus and could potentially correlate with the pendulum hardness test using the loss tangent component. Lastly, surface characterization of stratified siloxane-polyurethane coatings was carried out with X-ray photoelectron spectroscopy, Rutherford backscattering spectroscopy, transmission electron microscopy, and nanoindentation. The siloxane component segregates to the surface during curing. The distribution of siloxane as a function of thickness into the sample showed differences depending on the formulation parameters. The coatings with higher siloxane content near the surface were those found to perform well in field tests.
High throughput light absorber discovery, Part 1: An algorithm for automated tauc analysis
Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.
2016-09-23
High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high-throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high-throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
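A conventional Tauc analysis fits the linear region of (αhν)^n versus photon energy hν (n = 2 for direct-allowed, n = 1/2 for indirect-allowed transitions) and extrapolates to zero to estimate the gap. The minimal Python sketch below illustrates that idea on synthetic data; it is a simplistic stand-in, not the published automated algorithm, and the sliding-window heuristic and numbers are assumptions.

```python
import numpy as np

def tauc_band_gap(photon_ev, absorbance, n=2.0, fit_window=15):
    """Estimate a band gap by extrapolating the steepest linear region of a
    Tauc plot, (alpha*h*nu)**n vs h*nu.  Assumes absorbance ~ alpha."""
    x = np.asarray(photon_ev, dtype=float)
    y = (np.clip(absorbance, 0, None) * x) ** n
    best = None
    for i in range(len(x) - fit_window):
        sl = slice(i, i + fit_window)
        slope, intercept = np.polyfit(x[sl], y[sl], 1)
        if slope > 0 and (best is None or slope > best[0]):
            best = (slope, intercept)
    if best is None:
        raise ValueError("no rising linear region found")
    slope, intercept = best
    return -intercept / slope        # energy where the linear fit crosses zero

# Synthetic direct-gap-like spectrum with a gap near 2.1 eV
hv = np.linspace(1.5, 3.5, 200)
alpha = np.where(hv > 2.1, (hv - 2.1) ** 0.5 / hv, 0.0) + 0.01
print(tauc_band_gap(hv, alpha, n=2.0))
```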
2015-01-01
High-throughput production of nanoparticles (NPs) with controlled quality is critical for their clinical translation into effective nanomedicines for diagnostics and therapeutics. Here we report a simple and versatile coaxial turbulent jet mixer that can synthesize a variety of NPs at high throughput up to 3 kg/d, while maintaining the advantages of homogeneity, reproducibility, and tunability that are normally accessible only in specialized microscale mixing devices. The device fabrication does not require specialized machining and is easy to operate. As one example, we show reproducible, high-throughput formulation of siRNA-polyelectrolyte polyplex NPs that exhibit effective gene knockdown but exhibit significant dependence on batch size when formulated using conventional methods. The coaxial turbulent jet mixer can accelerate the development of nanomedicines by providing a robust and versatile platform for preparation of NPs at throughputs suitable for in vivo studies, clinical trials, and industrial-scale production. PMID:24824296
Quasi-random array imaging collimator
Fenimore, E.E.
1980-08-20
A hexagonally shaped quasi-random no-two-holes-touching imaging collimator. The quasi-random array imaging collimator eliminates contamination from small angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides for a self-supporting array increasing throughput by elimination of a substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction limited applications. Mosaicking is also disclosed for reducing fabrication effort.
Fenimore, E.E.
1980-08-22
A hexagonally shaped quasi-random no-two-holes-touching grid collimator. The quasi-random array grid collimator eliminates contamination from small angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides for a self-supporting array, increasing throughput by elimination of a substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction limited applications. Mosaicking is also disclosed for reducing fabrication effort.
Development and Application of a High Throughput Protein Unfolding Kinetic Assay
Wang, Qiang; Waterhouse, Nicklas; Feyijinmi, Olusegun; Dominguez, Matthew J.; Martinez, Lisa M.; Sharp, Zoey; Service, Rachel; Bothe, Jameson R.; Stollar, Elliott J.
2016-01-01
The kinetics of folding and unfolding underlie protein stability and quantification of these rates provides important insights into the folding process. Here, we present a simple high throughput protein unfolding kinetic assay using a plate reader that is applicable to the studies of the majority of 2-state folding proteins. We validate the assay by measuring kinetic unfolding data for the SH3 (Src Homology 3) domain from Actin Binding Protein 1 (AbpSH3) and its stabilized mutants. The results of our approach are in excellent agreement with published values. We further combine our kinetic assay with a plate reader equilibrium assay, to obtain indirect estimates of folding rates and use these approaches to characterize an AbpSH3-peptide hybrid. Our high throughput protein unfolding kinetic assays allow accurate screening of libraries of mutants by providing both kinetic and equilibrium measurements and provide a means for in-depth ϕ-value analyses. PMID:26745729
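For a two-state protein, an unfolding trace fits a single exponential, and combining the fitted unfolding rate with an equilibrium unfolding free energy yields an indirect folding-rate estimate via k_f = k_u·exp(ΔG_unf/RT). The sketch below demonstrates this on synthetic data; the function names and numbers are illustrative, this is not the authors' analysis code, and the combination assumes the kinetic and equilibrium measurements refer to the same conditions.

```python
import numpy as np
from scipy.optimize import curve_fit

R = 8.314e-3  # gas constant, kJ/(mol*K)

def single_exponential(t, amplitude, k_u, offset):
    """Signal change for a two-state unfolding reaction."""
    return amplitude * np.exp(-k_u * t) + offset

def estimate_folding_rate(k_u, delta_g_unfold_kj, temp_k=298.0):
    """Indirect folding rate via k_f = k_u * exp(dG_unf / RT) (simplifying
    assumption: both measurements refer to the same solution conditions)."""
    return k_u * np.exp(delta_g_unfold_kj / (R * temp_k))

# Synthetic unfolding trace: k_u = 0.05 s^-1 plus a little noise
t = np.linspace(0, 120, 121)
signal = 1.0 * np.exp(-0.05 * t) + 0.2
signal += np.random.default_rng(0).normal(0, 0.01, t.size)

popt, _ = curve_fit(single_exponential, t, signal, p0=(1.0, 0.1, 0.0))
amplitude, k_u, offset = popt
print(k_u, estimate_folding_rate(k_u, delta_g_unfold_kj=15.0))
```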
Fernandes, Richard; Carey, Conn; Hynes, James; Papkovsky, Dmitri
2013-01-01
The importance of food safety has resulted in a demand for a more rapid, high-throughput method for total viable count (TVC). The industry standard for TVC determination (ISO 4833:2003) is widely used but presents users with some drawbacks. The method is materials- and labor-intensive, requiring multiple agar plates per sample. More importantly, the method is slow, with 72 h typically required for a definitive result. Luxcel Biosciences has developed the GreenLight Model 960, a microtiter plate-based assay providing a rapid high-throughput method of aerobic bacterial load assessment through analysis of microbial oxygen consumption. Results are generated in 1-12 h, depending on microbial load. The mix and measure procedure allows rapid detection of microbial oxygen consumption and equates oxygen consumption to microbial load (CFU/g), providing a simple, sensitive means of assessing the microbial contamination levels in foods (1). As bacteria in the test sample grow and respire, they deplete O2, which is detected as an increase in the GreenLight probe signal above the baseline level (2). The time required to reach this increase in signal can be used to calculate the CFU/g of the original sample, based on a predetermined calibration. The higher the initial microbial load, the earlier this threshold is reached (1).
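Threshold-time (time-to-detection) assays of this kind are generally calibrated with a log-linear relation between the initial microbial load and the time needed to reach the signal threshold. The sketch below fits such a calibration to hypothetical data; it is illustrative only and does not reproduce the GreenLight kit's actual calibration.

```python
import numpy as np

# Hypothetical calibration samples: known loads (log10 CFU/g) vs time to reach
# the oxygen-probe signal threshold (hours).  Higher loads reach it sooner.
known_log_cfu = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
threshold_time_h = np.array([10.5, 8.4, 6.3, 4.2, 2.1])

# Fit log10(CFU/g) as a linear function of threshold time.
slope, intercept = np.polyfit(threshold_time_h, known_log_cfu, 1)

def estimate_cfu_per_g(sample_threshold_time_h):
    """Convert a sample's threshold time into an estimated load (CFU/g)."""
    return 10 ** (slope * sample_threshold_time_h + intercept)

print(estimate_cfu_per_g(5.0))   # an unknown sample reaching threshold at 5 h
```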
NASA Astrophysics Data System (ADS)
Loisel, G.; Lake, P.; Gard, P.; Dunham, G.; Nielsen-Weber, L.; Wu, M.; Norris, E.
2016-11-01
At Sandia National Laboratories, the x-ray generator Manson source model 5 was upgraded from 10 to 25 kV. The purpose of the upgrade is to drive higher characteristic photon energies with higher throughput. In this work we present characterization studies of the source size and the x-ray intensity when varying the source voltage for a series of K-, L-, and M-shell lines emitted from the Al, Y, and Au elements composing the anode. We used a 2-pinhole camera to measure the source size and an energy dispersive detector to monitor the spectral content and intensity of the x-ray source. As the voltage increases, the source size is significantly reduced and line intensity is increased for the three materials. We can take advantage of the smaller source size and higher source throughput to effectively calibrate the suite of Z Pulsed Power Facility crystal spectrometers.
Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...
Toxicokinetics (TK) provides a bridge between toxicity and exposure assessment by predicting tissue concentrations due to exposure. However traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to...
A recently developed, commercially available, open-air, surface sampling ion source for mass spectrometers provides individual analyses in several seconds. To realize its full throughput potential, an autosampler and field sample carrier were designed and built. The autosampler ...
NASA Astrophysics Data System (ADS)
Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun
2017-12-01
Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage and volume change of the bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
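Two of the bulk descriptors commonly evaluated in such screens are the average intercalation voltage computed from total energies and the volume change on lithiation; the sketch below evaluates both criteria using assumed, illustrative energies and volumes rather than values from this review.

```python
def average_voltage(e_lithiated, e_delithiated, n_li, e_li_metal=-1.90):
    """Average intercalation voltage (V vs Li/Li+) from total energies in eV.

    V = -[E(Li_n Host) - E(Host) - n*E(Li metal)] / n.  The energies here,
    including the Li metal reference, are assumed illustrative numbers.
    """
    return -(e_lithiated - e_delithiated - n_li * e_li_metal) / n_li

def volume_change_percent(v_lithiated, v_delithiated):
    """Relative volume change on lithiation, a common screening criterion."""
    return 100.0 * (v_lithiated - v_delithiated) / v_delithiated

v = average_voltage(e_lithiated=-45.2, e_delithiated=-39.5, n_li=1)
dv = volume_change_percent(v_lithiated=82.0, v_delithiated=78.0)
print(v, dv)   # e.g. keep candidates with a suitable voltage and small volume change
```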
Drug Discovery Algorithm for Cutaneous Leishmaniasis
Grogl, Max; Hickman, Mark; Ellis, William; Hudson, Thomas; Lazo, John S.; Sharlow, Elizabeth R.; Johnson, Jacob; Berman, Jonathan; Sciotti, Richard J.
2013-01-01
Cutaneous leishmaniasis is clinically widespread but lacks treatments that are effective and well tolerated. Because all present drugs have been grandfathered into clinical use, there are no examples of a pre-clinical product evaluation scheme that has led to new candidates for formal development. To provide oral agents for development targeting cutaneous leishmaniasis, we have implemented a discovery scheme that incorporates in vitro and in vivo testing of efficacy, toxicity, and pharmacokinetics/metabolism. Particular emphasis is placed on in vivo testing, progression from higher-throughput models to those with the most clinical relevance, and efficient use of resources. PMID:23390221
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
Monolithic amorphous silicon modules on continuous polymer substrate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimmer, D.P.
This report examines manufacturing monolithic amorphous silicon modules on a continuous polymer substrate. Module production costs can be reduced by increasing module performance, expanding production, and improving and modifying production processes. Material costs can be reduced by developing processes that use a 1-mil polyimide substrate and multilayers of low-cost material for the front encapsulant. Research to speed up a-Si and ZnO deposition rates is needed to improve throughputs. To keep throughput rates compatible with depositions, multibeam fiber optic delivery systems for laser scribing can be used. However, mechanical scribing systems promise even higher throughputs. Tandem cells and production experience can increase device efficiency and stability. Two alternative manufacturing processes are described: (1) wet etching and sheet handling and (2) wet etching and roll-to-roll fabrication.
Implicit Block ACK Scheme for IEEE 802.11 WLANs
Sthapit, Pranesh; Pyun, Jae-Young
2016-01-01
The throughput of the IEEE 802.11 standard is significantly bounded by the associated Medium Access Control (MAC) overhead. Because of this overhead, an upper limit on throughput exists even in situations where data rates are extremely high. Therefore, an overhead reduction is necessary to achieve higher throughput. The IEEE 802.11e amendment introduced the block ACK mechanism to reduce the number of control messages in the MAC. Although the block ACK scheme greatly reduces overhead, further improvements are possible. In this letter, we propose an implicit block ACK method that further reduces the overhead associated with IEEE 802.11e's block ACK scheme. Mathematical analysis results are presented for both the original protocol and the proposed scheme. A performance improvement of greater than 10% was achieved with the proposed implementation.
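As a rough illustration of why acknowledgement overhead caps throughput, the hedged sketch below models MAC efficiency when one ACK covers N data frames; all timing constants are assumed for illustration and are not taken from the letter.

```python
# Rough, illustrative model of MAC efficiency with per-frame ACKs versus a
# block-ACK-style scheme that acknowledges N frames at once. All timing
# constants are assumptions for illustration, not values from the letter.

def efficiency(n_frames_per_ack, payload_us=200.0, overhead_per_frame_us=50.0, ack_us=40.0):
    """Fraction of air time spent on payload when one ACK covers n frames."""
    useful = n_frames_per_ack * payload_us
    total = n_frames_per_ack * (payload_us + overhead_per_frame_us) + ack_us
    return useful / total

for n in (1, 8, 64):
    print(n, round(efficiency(n), 3))
# Efficiency rises toward payload/(payload + overhead) as n grows, which is why
# reducing ACK traffic (explicitly or implicitly) lifts the throughput ceiling.
```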
Tan, Yann-Chong; Blum, Lisa K; Kongpachith, Sarah; Ju, Chia-Hsin; Cai, Xiaoyong; Lindstrom, Tamsin M; Sokolove, Jeremy; Robinson, William H
2014-03-01
We developed a DNA barcoding method to enable high-throughput sequencing of the cognate heavy- and light-chain pairs of the antibodies expressed by individual B cells. We used this approach to elucidate the plasmablast antibody response to influenza vaccination. We show that >75% of the rationally selected plasmablast antibodies bind and neutralize influenza, and that antibodies from clonal families, defined by sharing both heavy-chain VJ and light-chain VJ sequence usage, do so most effectively. Vaccine-induced heavy-chain VJ regions contained on average >20 nucleotide mutations as compared to their predicted germline gene sequences, and some vaccine-induced antibodies exhibited higher binding affinities for hemagglutinins derived from prior years' seasonal influenza as compared to their affinities for the immunization strains. Our results show that influenza vaccination induces the recall of memory B cells that express antibodies that previously underwent affinity maturation against prior years' seasonal influenza, suggesting that 'original antigenic sin' shapes the antibody response to influenza vaccination. Published by Elsevier Inc.
Hu, Ning; Fang, Jiaru; Zou, Ling; Wan, Hao; Pan, Yuxiang; Su, Kaiqi; Zhang, Xi; Wang, Ping
2016-10-01
Cell-based bioassays are an effective way to assess compound toxicity via cell viability, but traditional label-based methods miss much information about cell growth because they rely on endpoint detection, and higher throughput is demanded to obtain dynamic information. Cell-based biosensor methods can monitor cell viability dynamically and continuously; however, this dynamic information is often ignored or seldom utilized in toxin and drug assessment. Here, we report a highly efficient, high-content cytotoxicity recording method based on dynamic and continuous cell-based impedance biosensor technology. The dynamic cell viability, inhibition ratio, and growth rate were derived from the dynamic response curves of the cell-based impedance biosensor. The results showed that the biosensor responds in a dose-dependent manner to the diarrhetic shellfish toxin okadaic acid, based on analysis of the dynamic cell viability and cell growth status. Moreover, the throughput of dynamic cytotoxicity assessment was compared between cell-based biosensor methods and label-based endpoint methods. This cell-based impedance biosensor provides a flexible, cost- and label-efficient platform for cell viability assessment in shellfish toxin screening.
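A minimal sketch of how dynamic viability metrics can be derived from impedance time courses is given below; the normalization used is a common convention and may differ from the authors' exact definitions, and all numbers are illustrative.

```python
import numpy as np

# Illustrative calculation of dynamic cell viability and inhibition ratio from
# impedance "cell index" time courses. The normalization below is a common
# convention; the paper's exact definitions may differ.

t_h = np.array([0, 6, 12, 24, 48])                 # hours after dosing (assumed)
ci_control = np.array([1.0, 1.6, 2.3, 3.4, 5.0])   # untreated wells (assumed)
ci_treated = np.array([1.0, 1.4, 1.7, 1.9, 2.0])   # toxin-treated wells (assumed)

viability = ci_treated / ci_control   # dynamic cell viability at each time point
inhibition = 1.0 - viability          # dynamic inhibition ratio

# Growth rate from a log-linear fit of cell index versus time (per hour).
growth_rate = np.polyfit(t_h, np.log(ci_treated), 1)[0]

print(np.round(viability, 2), np.round(inhibition, 2), round(growth_rate, 3))
```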
High-throughput 3D spheroid culture and drug testing using a 384 hanging drop array.
Tung, Yi-Chung; Hsiao, Amy Y; Allen, Steven G; Torisawa, Yu-suke; Ho, Mitchell; Takayama, Shuichi
2011-02-07
Culture of cells as three-dimensional (3D) aggregates can enhance in vitro tests for basic biological research as well as for therapeutics development. Such 3D culture models, however, are often more complicated, cumbersome, and expensive than two-dimensional (2D) cultures. This paper describes a 384-well format hanging drop culture plate that makes spheroid formation, culture, and subsequent drug testing on the obtained 3D cellular constructs as straightforward to perform and adapt to existing high-throughput screening (HTS) instruments as conventional 2D cultures. Using this platform, we show that drugs with different modes of action produce distinct responses in the physiological 3D cell spheroids compared to conventional 2D cell monolayers. Specifically, the anticancer drug 5-fluorouracil (5-FU) has higher anti-proliferative effects on 2D cultures, whereas the hypoxia-activated drug tirapazamine (TPZ) is more effective against 3D cultures. The multiplexed 3D hanging drop culture and testing plate provides an efficient way to obtain biological insights that are often lost in 2D platforms.
2010-01-01
Catalytic graphitization for 14C-accelerator mass spectrometry (14C-AMS) produced various forms of elemental carbon. Our high-throughput Zn reduction method (C/Fe = 1:5, 500 °C, 3 h) produced the AMS target of graphite-coated iron powder (GCIP), a mix of nongraphitic carbon and Fe3C. Crystallinity of the AMS targets of GCIP (nongraphitic carbon) was increased to turbostratic carbon by raising the C/Fe ratio from 1:5 to 1:1 and the graphitization temperature from 500 to 585 °C. The AMS target of GCIP containing turbostratic carbon had a large isotopic fractionation and a low AMS ion current. The AMS target of GCIP containing turbostratic carbon also yielded less accurate/precise 14C-AMS measurements because of the lower graphitization yield and lower thermal conductivity that were caused by the higher C/Fe ratio of 1:1. On the other hand, the AMS target of GCIP containing nongraphitic carbon had higher graphitization yield and better thermal conductivity over the AMS target of GCIP containing turbostratic carbon due to optimal surface area provided by the iron powder. Finally, graphitization yield and thermal conductivity were stronger determinants (over graphite crystallinity) for accurate/precise/high-throughput biological, biomedical, and environmental 14C-AMS applications such as absorption, distribution, metabolism, elimination (ADME), and physiologically based pharmacokinetics (PBPK) of nutrients, drugs, phytochemicals, and environmental chemicals. PMID:20163100
Test and Evaluation of WiMAX Performance Using Open-Source Modeling and Simulation Software Tools
2010-12-01
specific needs. For instance, one may seek to maximize the system throughput while maximizing the number of transmitted data packets with hard...seeking to maximize the throughput of the system (Yu 2008; Pishdad and Rabiee 2008; Piro et al. 2010; Wongthavarawat and Ganz 2003; Mohammadi, Akl, and...testing environment provides tools to allow for setting up and running test environments over multiple systems (buildbot) and provides classes to
Study of Material Densification of In718 in the Higher Throughput Parameter Regime
NASA Technical Reports Server (NTRS)
Cordner, Samuel
2016-01-01
Selective Laser Melting (SLM) is a powder bed fusion additive manufacturing process used increasingly in the aerospace industry to reduce the cost, weight, and fabrication time of complex propulsion components. Previous optimization studies for SLM using the Concept Laser M1 and M2 machines at NASA Marshall Space Flight Center have centered on machine default parameters. The objective of this project is to characterize how heat treatment affects density and porosity from a microscopic point of view. This is performed using higher-throughput parameters (a previously unexplored region of the manufacturing operating envelope for this application) to assess their effect on material consolidation. Density blocks were analyzed to explore the relationship between build parameters (laser power, scan speed, and hatch spacing) and material consolidation (assessed in terms of density and porosity). The study also considers the impact of post-processing, specifically hot isostatic pressing and heat treatment, as well as deposition pattern, on material consolidation in the higher energy parameter regime. Metallurgical evaluation of specimens will also be presented. This work will contribute to creating a knowledge base (understanding material behavior in all ranges of the AM equipment operating envelope) that is critical to transitioning AM from the custom low rate production sphere it currently occupies to the world of mass high rate production, where parts are fabricated at a rapid rate with confidence that they will meet or exceed all stringent functional requirements for spaceflight hardware. These studies will also provide important data on the sensitivity of material consolidation to process parameters that will inform the design and development of future flight articles using SLM.
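A standard way to compare such build-parameter sets is the volumetric energy density E = P/(v·h·t); whether the study uses this exact metric is not stated, so the sketch below is only an illustration with assumed parameter values.

```python
# A common way to compare SLM parameter sets is the volumetric energy density
# E = P / (v * h * t). Whether this exact metric was used in the study is not
# stated; the numbers below are purely illustrative.

def energy_density(power_W, scan_speed_mm_s, hatch_mm, layer_mm):
    """Volumetric energy density in J/mm^3."""
    return power_W / (scan_speed_mm_s * hatch_mm * layer_mm)

# Example: 370 W, 1800 mm/s, 0.09 mm hatch spacing, 0.03 mm layer (assumed values)
print(round(energy_density(370, 1800, 0.09, 0.03), 1), "J/mm^3")
```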
Searching for resistance genes to Bursaphelenchus xylophilus using high throughput screening
2012-01-01
Background Pine wilt disease (PWD), caused by the pinewood nematode (PWN; Bursaphelenchus xylophilus), damages and kills pine trees and is causing serious economic damage worldwide. Although the ecological mechanism of infestation is well described, the plant’s molecular response to the pathogen is not well known. This is due mainly to the lack of genomic information and the complexity of the disease. High throughput sequencing is now an efficient approach for detecting the expression of genes in non-model organisms, thus providing valuable information in spite of the lack of the genome sequence. In an attempt to unravel genes potentially involved in the pine defense against the pathogen, we hereby report the high throughput comparative sequence analysis of infested and non-infested stems of Pinus pinaster (very susceptible to PWN) and Pinus pinea (less susceptible to PWN). Results Four cDNA libraries from infested and non-infested stems of P. pinaster and P. pinea were sequenced in a full 454 GS FLX run, producing a total of 2,083,698 reads. The putative amino acid sequences encoded by the assembled transcripts were annotated according to Gene Ontology, to assign Pinus contigs into Biological Processes, Cellular Components and Molecular Functions categories. Most of the annotated transcripts corresponded to Picea genes-25.4-39.7%, whereas a smaller percentage, matched Pinus genes, 1.8-12.8%, probably a consequence of more public genomic information available for Picea than for Pinus. The comparative transcriptome analysis showed that when P. pinaster was infested with PWN, the genes malate dehydrogenase, ABA, water deficit stress related genes and PAR1 were highly expressed, while in PWN-infested P. pinea, the highly expressed genes were ricin B-related lectin, and genes belonging to the SNARE and high mobility group families. Quantitative PCR experiments confirmed the differential gene expression between the two pine species. Conclusions Defense-related genes triggered by nematode infestation were detected in both P. pinaster and P. pinea transcriptomes utilizing 454 pyrosequencing technology. P. pinaster showed higher abundance of genes related to transcriptional regulation, terpenoid secondary metabolism (including some with nematicidal activity) and pathogen attack. P. pinea showed higher abundance of genes related to oxidative stress and higher levels of expression in general of stress responsive genes. This study provides essential information about the molecular defense mechanisms utilized by P. pinaster and P. pinea against PWN infestation and contributes to a better understanding of PWD. PMID:23134679
Microbial forensics: fiber optic microarray subtyping of Bacillus anthracis
NASA Astrophysics Data System (ADS)
Shepard, Jason R. E.
2009-05-01
The past decade has seen increased development and subsequent adoption of rapid molecular techniques involving DNA analysis for detection of pathogenic microorganisms, also termed microbial forensics. The continued accumulation of microbial sequence information in genomic databases now better positions the field of high-throughput DNA analysis to proceed in a more manageable fashion. The potential to build off of these databases exists as technology continues to develop, which will enable more rapid, cost effective analyses. This wealth of genetic information, along with new technologies, has the potential to better address some of the current problems and solve the key issues involved in DNA analysis of pathogenic microorganisms. To this end, a high density fiber optic microarray has been employed, housing numerous DNA sequences simultaneously for detection of various pathogenic microorganisms, including Bacillus anthracis, among others. Each organism is analyzed with multiple sequences and can be sub-typed against other closely related organisms. For public health labs, real-time PCR methods have been developed as an initial preliminary screen, but culture and growth are still considered the gold standard. Technologies employing higher throughput than these standard methods are better suited to capitalize on the limitless potential garnered from the sequence information. Microarray analyses are one such format positioned to exploit this potential, and our array platform is reusable, allowing repetitive tests on a single array, providing an increase in throughput and decrease in cost, along with a certainty of detection, down to the individual strain level.
Characterizing ncRNAs in Human Pathogenic Protists Using High-Throughput Sequencing Technology
Collins, Lesley Joan
2011-01-01
ncRNAs are key genes in many human diseases including cancer and viral infection, as well as providing critical functions in pathogenic organisms such as fungi, bacteria, viruses, and protists. Until now the identification and characterization of ncRNAs associated with disease has been slow or inaccurate requiring many years of testing to understand complicated RNA and protein gene relationships. High-throughput sequencing now offers the opportunity to characterize miRNAs, siRNAs, small nucleolar RNAs (snoRNAs), and long ncRNAs on a genomic scale, making it faster and easier to clarify how these ncRNAs contribute to the disease state. However, this technology is still relatively new, and ncRNA discovery is not an application of high priority for streamlined bioinformatics. Here we summarize background concepts and practical approaches for ncRNA analysis using high-throughput sequencing, and how it relates to understanding human disease. As a case study, we focus on the parasitic protists Giardia lamblia and Trichomonas vaginalis, where large evolutionary distance has meant difficulties in comparing ncRNAs with those from model eukaryotes. A combination of biological, computational, and sequencing approaches has enabled easier classification of ncRNA classes such as snoRNAs, but has also aided the identification of novel classes. It is hoped that a higher level of understanding of ncRNA expression and interaction may aid in the development of less harsh treatment for protist-based diseases. PMID:22303390
SINA: accurate high-throughput multiple sequence alignment of ribosomal RNA genes.
Pruesse, Elmar; Peplies, Jörg; Glöckner, Frank Oliver
2012-07-15
In the analysis of homologous sequences, computation of multiple sequence alignments (MSAs) has become a bottleneck. This is especially troublesome for marker genes like the ribosomal RNA (rRNA), where millions of sequences are already publicly available and individual studies can easily produce hundreds of thousands of new sequences. Methods have been developed to cope with such numbers, but further improvements are needed to meet accuracy requirements. In this study, we present the SILVA Incremental Aligner (SINA) used to align the rRNA gene databases provided by the SILVA ribosomal RNA project. SINA uses a combination of k-mer searching and partial order alignment (POA) to maintain very high alignment accuracy while satisfying high throughput performance demands. SINA was evaluated in comparison with the commonly used high throughput MSA programs PyNAST and mothur. The three BRAliBase III benchmark MSAs could be reproduced with 99.3%, 97.6% and 96.1% accuracy. A larger benchmark MSA comprising 38,772 sequences could be reproduced with 98.9% and 99.3% accuracy using reference MSAs comprising 1000 and 5000 sequences. SINA was able to achieve higher accuracy than PyNAST and mothur in all performed benchmarks. Alignment of up to 500 sequences using the latest SILVA SSU/LSU Ref datasets as reference MSA is offered at http://www.arb-silva.de/aligner. This page also links to Linux binaries, user manual and tutorial. SINA is made available under a personal use license.
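The k-mer search stage can be illustrated with a toy sketch like the one below, which ranks reference sequences by the number of k-mers they share with a query; this is only a conceptual illustration, not SINA's implementation.

```python
# Toy illustration of k-mer based reference search, the first stage SINA uses
# before partial order alignment. This is not SINA's implementation, just the
# underlying idea: rank references by shared k-mers with the query.

def kmers(seq, k=8):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def rank_references(query, references, k=8):
    q = kmers(query, k)
    scores = {name: len(q & kmers(seq, k)) for name, seq in references.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

refs = {
    "ref1": "ACGTACGTGGCTAGCTAGGCTTACGATCGTACG",
    "ref2": "TTTTGGGGCCCCAAAATTTTGGGGCCCCAAAA",
}
print(rank_references("ACGTACGTGGCTAGCTAGGC", refs, k=8))
```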
An industrial engineering approach to laboratory automation for high throughput screening
Menke, Karl C.
2000-01-01
Across the pharmaceutical industry, there are a variety of approaches to laboratory automation for high throughput screening. At Sphinx Pharmaceuticals, the principles of industrial engineering have been applied to systematically identify and develop those automated solutions that provide the greatest value to the scientists engaged in lead generation. PMID:18924701
40 CFR 65.151 - Condensers used as control devices.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the design evaluation for storage vessels and low-throughput transfer rack controls. As provided in... control device on a Group 1 process vent or a high-throughput transfer rack with a condenser used as a... 40 Protection of Environment 16 2014-07-01 2014-07-01 false Condensers used as control devices. 65...
40 CFR 65.151 - Condensers used as control devices.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the design evaluation for storage vessels and low-throughput transfer rack controls. As provided in... control device on a Group 1 process vent or a high-throughput transfer rack with a condenser used as a... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Condensers used as control devices. 65...
40 CFR 65.151 - Condensers used as control devices.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the design evaluation for storage vessels and low-throughput transfer rack controls. As provided in... control device on a Group 1 process vent or a high-throughput transfer rack with a condenser used as a... 40 Protection of Environment 15 2011-07-01 2011-07-01 false Condensers used as control devices. 65...
Listen to the Urgent Sound of Drums: Major Challenges in African Higher Education
ERIC Educational Resources Information Center
Visser, Herman
2008-01-01
African higher education is currently facing tremendous challenges. The pressure and demand for access is huge. This is understandable against the background of traditionally low participation, low success and throughput rates, declining financial contributions from governments and donors, and critical pressures for efficiency, modernization,…
Zmijan, Robert; Jonnalagadda, Umesh S.; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn
2015-01-01
We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint. PMID:29456838
Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C
2016-01-01
Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative about biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput for such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort required to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, helped attract new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.
Ion channel drug discovery and research: the automated Nano-Patch-Clamp technology.
Brueggemann, A; George, M; Klau, M; Beckler, M; Steindl, J; Behrends, J C; Fertig, N
2004-01-01
Unlike the genomics revolution, which was largely enabled by a single technological advance (high throughput sequencing), rapid advancement in proteomics will require a broader effort to increase the throughput of a number of key tools for functional analysis of different types of proteins. In the case of ion channels, a class of (membrane) proteins of great physiological importance and potential as drug targets, the lack of adequate assay technologies is felt particularly strongly. The available, indirect, high throughput screening methods for ion channels clearly generate insufficient information. The best technology to study ion channel function and screen for compound interaction is the patch clamp technique, but patch clamping suffers from low throughput, which is not acceptable for drug screening. A first step towards a solution is presented here. The nano-patch-clamp technology, which is based on a planar, microstructured glass chip, enables automatic whole cell patch clamp measurements. The Port-a-Patch is an automated electrophysiology workstation, which uses planar patch clamp chips. This approach enables high quality and high content ion channel and compound evaluation on a one-cell-at-a-time basis. The presented automation of the patch process and its scalability to an array format are the prerequisites for any higher throughput electrophysiology instruments.
Telemetry Options for LDB Payloads
NASA Technical Reports Server (NTRS)
Stilwell, Bryan D.; Field, Christopher J.
2016-01-01
The Columbia Scientific Balloon Facility provides Telemetry and Command systems necessary for balloon operations and science support. There are various Line-Of-Sight (LOS) and Over-The-Horizon (OTH) systems and interfaces that provide communications to and from a science payload. This presentation will discuss the current data throughput options available and future capabilities that may be incorporated in the LDB Support Instrumentation Package (SIP) such as doubling the TDRSS data rate. We will also explore some new technologies that could potentially expand the data throughput of OTH communications.
Moore, Priscilla A; Kery, Vladimir
2009-01-01
High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are the high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, the ammonium sulfate precipitation is much less labor intensive and time consuming than the ultrafiltration.
Achieving Fair Throughput among TCP Flows in Multi-Hop Wireless Mesh Networks
NASA Astrophysics Data System (ADS)
Hou, Ting-Chao; Hsu, Chih-Wei
Previous research shows that the IEEE 802.11 DCF channel contention mechanism is not capable of providing throughput fairness among nodes in different locations of a wireless mesh network. The node nearest the gateway will always strive for the chance to transmit data, causing fewer transmission opportunities for the nodes farther from the gateway and resulting in starvation. Prior studies modify the DCF mechanism to address the fairness problem. This paper focuses on the fairness study when TCP flows are carried over wireless mesh networks. Without modifying lower-layer protocols, the current work identifies TCP parameters that impact throughput fairness and proposes adjusting those parameters to reduce frame collisions and improve throughput fairness. With the aid of mathematical formulation and ns2 simulations, this study finds that frame transmission from each node can be effectively controlled by properly setting the delayed ACK timer and using a suitable advertised window. The proposed method reduces frame collisions and greatly improves TCP throughput fairness.
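The mechanism can be illustrated with a back-of-envelope sketch: a flow's sending rate is roughly bounded by its advertised window divided by its round-trip time, so a small common advertised window limits how aggressively near-gateway flows occupy the channel. The RTTs, MSS, and window size below are assumptions, not values from the paper.

```python
# Illustrative sketch of why a suitable advertised window helps fairness:
# each flow's throughput is roughly bounded by awnd / RTT, so capping awnd
# limits how many packets near-gateway flows keep in flight, reducing channel
# contention. RTTs, MSS, and window size below are assumptions.

MSS = 1460  # bytes (assumed)

def capped_rate_bps(awnd_segments, rtt_s):
    """Upper bound on a TCP flow's rate imposed by the advertised window."""
    return awnd_segments * MSS * 8 / rtt_s

flows = {"1-hop": 0.010, "2-hop": 0.020, "3-hop": 0.030}  # assumed RTTs (s)
for name, rtt in flows.items():
    print(name, round(capped_rate_bps(4, rtt) / 1e6, 2), "Mbps cap with awnd = 4 segments")
# Capping awnd restrains the near-gateway flow's in-flight traffic, leaving
# transmission opportunities for the farther flows.
```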
Alvarez, Guillermo Dufort Y; Favaro, Federico; Lecumberry, Federico; Martin, Alvaro; Oliver, Juan P; Oreggioni, Julian; Ramirez, Ignacio; Seroussi, Gadiel; Steinfeld, Leonardo
2018-02-01
This work presents a wireless multichannel electroencephalogram (EEG) recording system featuring lossless and near-lossless compression of the digitized EEG signal. Two novel, low-complexity, efficient compression algorithms were developed and tested in a low-power platform. The algorithms were tested on six public EEG databases, comparing favorably with the best compression rates reported to date in the literature. In its lossless mode, the platform is capable of encoding and transmitting 59-channel EEG signals, sampled at 500 Hz and 16 bits per sample, at a current consumption of 337 µA per channel; this comes with a guarantee that the decompressed signal is identical to the sampled one. The near-lossless mode allows for significant energy savings and/or higher throughputs in exchange for a small guaranteed maximum per-sample distortion in the recovered signal. Finally, we address the tradeoff between computation cost and transmission savings by evaluating three alternatives: sending raw data or encoding with one of two compression algorithms that differ in complexity and compression performance. We observe that the higher the throughput (number of channels and sampling rate), the larger the benefits obtained from compression.
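The generic idea behind predictive near-lossless coding with a guaranteed per-sample error bound can be sketched as below; this illustrates the principle only and is not one of the two algorithms developed in the work.

```python
import numpy as np  # imported only to mirror typical signal-processing scripts; not required below

# Generic sketch of predictive near-lossless coding: encode quantized
# prediction residuals with a per-sample error bound delta (delta = 0 gives
# lossless coding). The paper's two algorithms are more sophisticated and
# lower-complexity than this illustration.

def encode(samples, delta=0):
    prev, residuals = 0, []
    for x in samples:
        q = int(round((x - prev) / (2 * delta + 1)))   # quantized residual
        residuals.append(q)
        prev = prev + q * (2 * delta + 1)              # track the decoder's state
    return residuals

def decode(residuals, delta=0):
    prev, out = 0, []
    for q in residuals:
        prev = prev + q * (2 * delta + 1)
        out.append(prev)
    return out

x = [100, 102, 105, 104, 110]
for d in (0, 2):
    r = encode(x, d)
    y = decode(r, d)
    assert all(abs(a - b) <= d for a, b in zip(x, y))  # guaranteed max distortion
    print(d, r, y)
```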
Spitzer, James D; Hupert, Nathaniel; Duckart, Jonathan; Xiong, Wei
2007-01-01
Community-based mass prophylaxis is a core public health operational competency, but staffing needs may overwhelm the local trained health workforce. Just-in-time (JIT) training of emergency staff and computer modeling of workforce requirements represent two complementary approaches to address this logistical problem. Multnomah County, Oregon, conducted a high-throughput point of dispensing (POD) exercise to test JIT training and computer modeling to validate POD staffing estimates. The POD had 84% non-health-care worker staff and processed 500 patients per hour. Post-exercise modeling replicated observed staff utilization levels and queue formation, including development and amelioration of a large medical evaluation queue caused by lengthy processing times and understaffing in the first half-hour of the exercise. The exercise confirmed the feasibility of using JIT training for high-throughput antibiotic dispensing clinics staffed largely by nonmedical professionals. Patient processing times varied over the course of the exercise, with important implications for both staff reallocation and future POD modeling efforts. Overall underutilization of staff revealed the opportunity for greater efficiencies and even higher future throughputs.
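The kind of staffing arithmetic such models rest on can be sketched as below: station utilization is the arrival rate times the mean service time divided by the number of staff, and a queue grows whenever it exceeds one. The service times and staff counts are assumptions, not exercise data.

```python
# Back-of-envelope staffing check of the kind POD models perform: with an
# arrival rate of 500 patients/hour, station utilization is
# rho = arrival_rate * service_time / staff. Service times and staff counts
# below are assumptions for illustration, not exercise data.

ARRIVALS_PER_HOUR = 500.0

def utilization(service_min, staff):
    return (ARRIVALS_PER_HOUR / 60.0) * service_min / staff

# e.g. a medical-evaluation station: 2-minute evaluations with 12 vs 20 staff
for staff in (12, 20):
    rho = utilization(2.0, staff)
    print(staff, "staff -> utilization", round(rho, 2), "(queue grows if > 1)")
# Utilization above 1 early in an exercise is consistent with a large
# evaluation queue forming and then dissipating once staff are reallocated.
```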
NASA Technical Reports Server (NTRS)
Abbe, Brian S.; Pinck, Deborah S.
1995-01-01
The Advanced Communications Technology Satellite (ACTS) Mobile Terminal (AMT) experiments have provided a terminal technology testbed for the evaluation of K- and Ka-band mobile satellite communications (satcom). Such a system could prove to be highly beneficial for many different commercial and government mobile satcom users. Combining ACTS' highly concentrated spotbeams with the smaller, higher-gain Ka-band antenna technology, results in a system design that can support a much higher throughput capacity than today's commercial configurations. To date, experiments in such diverse areas as emergency medical applications, enhanced Personal Communication Services (PCS), disaster recovery assistance, military applications, and general voice and data services have already been evaluated. Other applications that will be evaluated over the next year include telemedicine, ISDN, and television network return feed. Baseline AMT performance results will be presented, including Bit Error Rate (BER) curves and mobile propagation data characterizing the K- and Ka-band mobile satcom channel. In addition, observations from many of the application-specific experiments will also be provided.
Rizvi, Imran; Moon, Sangjun; Hasan, Tayyaba; Demirci, Utkan
2013-01-01
In vitro 3D cancer models that provide a more accurate representation of disease in vivo are urgently needed to improve our understanding of cancer pathology and to develop better cancer therapies. However, the development of 3D models based on manual ejection of cells from micropipettes suffers from inherent limitations such as poor control over cell density, limited repeatability, low throughput, and, in the case of coculture models, lack of reproducible control over the spatial distance between cell types (e.g., cancer and stromal cells). In this study, we build on a recently introduced 3D model in which human ovarian cancer (OVCAR-5) cells overlaid on Matrigel™ spontaneously form multicellular acini. We introduce a high-throughput automated cell printing system to bioprint a 3D coculture model using cancer cells and normal fibroblasts micropatterned on Matrigel™. Two cell types were patterned within a spatially controlled microenvironment (e.g., cell density, cell-cell distance) in a high-throughput and reproducible manner; both cell types remained viable during printing and continued to proliferate following patterning. This approach enables the miniaturization of an established macro-scale 3D culture model, would allow systematic investigation into the multiple unknown regulatory feedback mechanisms between tumor and stromal cells, and provides a tool for high-throughput drug screening. PMID:21298805
Ramanathan, Ragu; Ghosal, Anima; Ramanathan, Lakshmi; Comstock, Kate; Shen, Helen; Ramanathan, Dil
2018-05-01
We evaluated HPLC-high-resolution mass spectrometry (HPLC-HRMS) full-scan acquisition with polarity switching for increasing the throughput of a human in vitro cocktail drug-drug interaction assay. Microsomal incubates were analyzed using a high-resolution, high-mass-accuracy Q-Exactive mass spectrometer to collect integrated qualitative and quantitative (qual/quant) data. Within the assay, the positive-to-negative polarity switching HPLC-HRMS method allowed quantification of eight and two probe compounds in the positive and negative ionization modes, respectively, while monitoring for LOR and its metabolites. LOR inhibited CYP2C19 and showed higher activity for CYP2D6, CYP2E1 and CYP3A4. Overall, LC-HRMS-based nontargeted full-scan quantitation improved the throughput of the in vitro cocktail drug-drug interaction assay.
NASA Astrophysics Data System (ADS)
Watanabe, A.; Furukawa, H.
2018-04-01
The resolution of multichannel Fourier transform (McFT) spectroscopy is insufficient for many applications despite its extreme advantage of high throughput. We propose an improved configuration to realise both performance using a two-dimensional area sensor. For the spectral resolution, we obtained the interferogram of a larger optical path difference by shifting the area sensor without altering any optical components. The non-linear phase error of the interferometer was successfully corrected using a phase-compensation calculation. Warping compensation was also applied to realise a higher throughput to accumulate the signal between vertical pixels. Our approach significantly improved the resolution and signal-to-noise ratio by factors of 1.7 and 34, respectively. This high-resolution and high-sensitivity McFT spectrometer will be useful for detecting weak light signals such as those in non-invasive diagnosis.
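The phase-compensation step can be illustrated with a minimal sketch: estimate the smooth non-linear phase of the measured spectrum with a polynomial fit and rotate it out before taking the real part. The signal model below (Gaussian band, quadratic phase error) is an assumption for illustration only and is not the authors' exact procedure.

```python
import numpy as np

# Minimal sketch of phase compensation for a spectrum carrying a smooth
# non-linear phase error: fit the phase with a polynomial over the band where
# signal exists, then rotate it out so the band amplitude appears in the real
# part. The signal model is an assumption for illustration only.

freqs = np.linspace(0, 1, 512)                       # wavenumber axis (a.u.)
amplitude = np.exp(-((freqs - 0.4) / 0.03) ** 2)     # true band shape
phase_err = 3.0 * (freqs - 0.4) ** 2 + 0.5           # non-linear phase error
measured = amplitude * np.exp(1j * phase_err)        # complex spectrum after FFT

band = np.abs(measured) > 0.05                       # fit phase only where signal exists
coef = np.polyfit(freqs[band], np.unwrap(np.angle(measured[band])), 2)
corrected = (measured * np.exp(-1j * np.polyval(coef, freqs))).real

print(np.allclose(corrected[band], amplitude[band], atol=1e-3))   # True
```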
Adaptive and reliably acknowledged FSO communications
NASA Astrophysics Data System (ADS)
Fitz, Michael P.; Halford, Thomas R.; Kose, Cenk; Cromwell, Jonathan; Gordon, Steven
2015-05-01
Atmospheric turbulence causes the received signal intensity on free space optical (FSO) communication links to vary over time. Scintillation fades can stymie connectivity for milliseconds at a time. To approach the information-theoretic limits of communication in such time-varying channels, it is necessary either to code across extremely long blocks of data, thereby inducing unacceptable delays, or to vary the code rate according to the instantaneous channel conditions. We describe the design, laboratory testing, and over-the-air testing of an FSO modem that employs a protocol with adaptive coded modulation (ACM) and hybrid automatic repeat request. For links with fixed throughput, this protocol provides a 10 dB reduction in the required received signal-to-noise ratio (SNR); for links with fixed range, this protocol provides a greater than 3x increase in throughput. Independent U.S. Government tests demonstrate that our protocol effectively adapts the code rate to match the instantaneous channel conditions. The modem is able to provide throughputs in excess of 850 Mbps on links with ranges greater than 15 kilometers.
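A hedged sketch of the rate-adaptation logic in adaptive coded modulation is shown below: the transmitter picks the highest-rate mode whose SNR threshold the current estimate exceeds. The mode table is assumed for illustration and is not the modem's actual table.

```python
# Illustrative sketch of adaptive coded modulation: pick the highest-rate mode
# whose SNR threshold the current estimate exceeds; otherwise fall back and
# rely on HARQ retransmission. Thresholds and rates are assumptions, not the
# modem's actual mode table.

MODES = [  # (min SNR in dB, spectral efficiency in bits/symbol)
    (14.0, 4.0),
    (10.0, 3.0),
    (6.0, 2.0),
    (2.0, 1.0),
    (-2.0, 0.5),
]

def select_mode(snr_db):
    for threshold, rate in MODES:
        if snr_db >= threshold:
            return rate
    return None  # below the lowest threshold: defer and retransmit later

for snr in (16.0, 7.5, -5.0):  # e.g. clear air, mild fade, deep scintillation fade
    print(snr, "dB ->", select_mode(snr), "bits/symbol")
```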
USDA-ARS?s Scientific Manuscript database
High-throughput phenotyping platforms (HTPPs) provide novel opportunities to more effectively dissect the genetic basis of drought-adaptive traits. This genome-wide association study (GWAS) compares the results obtained with two Unmanned Aerial Vehicles (UAVs) and a ground-based platform used to mea...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...
Linking Emotional Intelligence to Achieve Technology Enhanced Learning in Higher Education
ERIC Educational Resources Information Center
Kruger, Janette; Blignaut, A. Seugnet
2013-01-01
Higher education institutions (HEIs) increasingly use technology-enhanced learning (TEL) environments (e.g. blended learning and e-learning) to improve student throughput and retention rates. As the demand for TEL courses increases, expectations rise for faculty to meet the challenge of using TEL effectively. The promises that TEL holds have not…
Lessons from high-throughput protein crystallization screening: 10 years of practical experience
JR, Luft; EH, Snell; GT, DeTitta
2011-01-01
Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073
Ultrafast Microfluidic Cellular Imaging by Optical Time-Stretch.
Lau, Andy K S; Wong, Terence T W; Shum, Ho Cheung; Wong, Kenneth K Y; Tsia, Kevin K
2016-01-01
There is an unmet need in biomedicine for measuring a multitude of parameters of individual cells (i.e., high content) in a large population efficiently (i.e., high throughput). This is particularly driven by the emerging interest in bringing Big-Data analysis into this arena, encompassing pathology, drug discovery, rare cancer cell detection, and emulsion microdroplet assays, to name a few. This momentum is particularly evident in recent advancements in flow cytometry. They include scaling of the number of measurable colors from the labeled cells and incorporation of imaging capability to access the morphological information of the cells. However, an unspoken predicament appears in the current technologies: higher content comes at the expense of lower throughput, and vice versa. For example, to access additional spatial information of individual cells, imaging flow cytometers only achieve an imaging throughput of ~1000 cells/s, orders of magnitude slower than non-imaging flow cytometers. In this chapter, we introduce an entirely new imaging platform, namely optical time-stretch microscopy, for ultrahigh speed and high contrast label-free single-cell imaging and analysis (in an ultrafast microfluidic flow up to 10 m/s) with an ultrafast imaging line-scan rate as high as tens of MHz. Based on this technique, not only can morphological information of the individual cells be obtained in an ultrafast manner, but quantitative evaluation of cellular information (e.g., cell volume, mass, refractive index, stiffness, membrane tension) at the nanometer scale based on the optical phase is also possible. The technology can also be integrated with conventional fluorescence measurements widely adopted in non-imaging flow cytometers. Therefore, in the long run, these two combinatorial and complementary measurement capabilities form an attractive platform for addressing the pressing need for expanding the "parameter space" in high-throughput single-cell analysis. This chapter provides the general guidelines for constructing the optical system for time-stretch imaging, fabrication and design of the microfluidic chip for ultrafast fluidic flow, as well as image acquisition and processing.
Robotic Patterning a Superhydrophobic Surface for Collective Cell Migration Screening.
Pang, Yonggang; Yang, Jing; Hui, Zhixin; Grottkau, Brian E
2018-04-01
Collective cell migration, in which cells migrate as a group, is fundamental in many biological and pathological processes. There is increasing interest in studying collective cell migration in high throughput. Cell scratching, insertion blockers, and gel-dissolving techniques are some methodologies used previously. However, these methods have the drawbacks of cell damage, substrate surface alteration, limitations in medium exchange, and solvent interference. The superhydrophobic surface, on which the water contact angle is greater than 150 degrees, has recently been utilized to generate patterned arrays. Independent cell culture areas can be generated on a substrate that functions the same as a conventional multiple-well plate. However, so far there has been no report on superhydrophobic patterning for the study of cell migration. In this study, we report the successful development of a robotically patterned superhydrophobic array for studying collective cell migration in high throughput. The array was developed on a rectangular single-well cell culture plate consisting of hydrophilic flat microwells separated by the superhydrophobic surface. The manufacturing process is robotic and includes patterning discrete protective masks onto the substrate using 3D printing, robotic spray coating of silica nanoparticles, robotic mask removal, robotic mini silicone blocker patterning, automatic cell seeding, and liquid handling. Compared with a standard 96-well plate, our system increases the throughput by 2.25-fold and generates a cell-free area in each well non-destructively. Our system also demonstrates higher efficiency than the conventional way of liquid handling using microwell plates, and shorter processing times than manual operation in migration assays. The superhydrophobic surface had no negative impact on cell viability. Using our system, we studied the collective migration of human umbilical vein endothelial cells and cancer cells using assays of endpoint quantification, dynamic cell tracking, and migration quantification following varied drug treatments. This system provides a versatile platform to study collective cell migration in high throughput for a broad range of applications.
Cai, Yingying; Xia, Miaomiao; Dong, Huina; Qian, Yuan; Zhang, Tongcun; Zhu, Beiwei; Wu, Jinchuan; Zhang, Dawei
2018-05-11
As a very important coenzyme in cell metabolism, vitamin B12 (cobalamin, VB12) has been widely used in the food and medicine fields. The complete biosynthesis of VB12 requires approximately 30 genes, but overexpression of these genes did not result in the expected increase of VB12 production. High-yield VB12-producing strains are usually obtained by mutagenesis treatments, so developing an efficient screening approach is urgently needed. With the help of engineered strains with varied capacities for VB12 production, a riboswitch library was constructed and screened, and the btuB element from Salmonella typhimurium was identified as the best regulatory device. A flow cytometry high-throughput screening system was developed based on the btuB riboswitch to identify positive mutants with high efficiency. Mutagenesis of Sinorhizobium meliloti (S. meliloti) was optimized using the novel atmospheric and room temperature plasma (ARTP) technique. Finally, the mutant S. meliloti MC5-2 was obtained and considered a candidate for industrial applications. After 7 days of cultivation on a rotary shaker at 30 °C, the VB12 titer of S. meliloti MC5-2 reached 156 ± 4.2 mg/L, which was 21.9% higher than that of the wild type strain S. meliloti 320 (128 ± 3.2 mg/L). The genome of S. meliloti MC5-2 was sequenced, and gene mutations were identified and analyzed. To our knowledge, this is the first time that a riboswitch element has been used in S. meliloti. The flow cytometry high-throughput screening system was successfully developed and a high-yield VB12-producing strain was obtained. The identified and analyzed gene mutations provide useful information for developing high-yield strains by metabolic engineering. Overall, this work provides a useful high-throughput screening method for developing high-VB12-yield strains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loisel, G., E-mail: gploise@sandia.gov; Lake, P.; Gard, P.
2016-11-15
At Sandia National Laboratories, the Manson model 5 x-ray generator source was upgraded from 10 to 25 kV. The purpose of the upgrade is to drive higher characteristic photon energies with higher throughput. In this work we present characterization studies of the source size and the x-ray intensity when varying the source voltage for a series of K-, L-, and M-shell lines emitted from the Al, Y, and Au elements composing the anode. We used a 2-pinhole camera to measure the source size and an energy dispersive detector to monitor the spectral content and intensity of the x-ray source. As the voltage increases, the source size is significantly reduced and line intensity is increased for the three materials. We can take advantage of the smaller source size and higher source throughput to effectively calibrate the suite of Z Pulsed Power Facility crystal spectrometers.
Prioritized retransmission in slotted all-optical packet-switched networks
NASA Astrophysics Data System (ADS)
Ghaffar Pour Rahbar, Akbar; Yang, Oliver
2006-12-01
We consider an all-optical slotted packet-switched network interconnected by a number of bufferless all-optical switches with contention-based operation. One approach to reducing the cost of the expensive contention resolution hardware is retransmission, in which each ingress switch keeps a copy of the transmitted traffic in an electronic buffer and retransmits whenever required. The conventional retransmission technique may need a high number of retransmissions before traffic passes through the network. This in turn may trigger retransmissions at a higher layer and reduce the network throughput. In this paper, we propose and analyze a simple but effective prioritized retransmission technique in which dropped traffic is prioritized when retransmitted from ingress switches so that core switches can process it with a higher priority. We present the analysis of both techniques in a multifiber network architecture and verify it via simulation to demonstrate that our proposed algorithm can limit the number of retransmissions significantly and can improve TCP throughput better than the conventional retransmission technique.
Diels–Alder reactions of myrcene using intensified continuous-flow reactors
Álvarez-Diéguez, Miguel Á; Kohl, Thomas M; Tsanaktsidis, John
2017-01-01
This work describes the Diels–Alder reaction of the naturally occurring substituted butadiene, myrcene, with a range of different naturally occurring and synthetic dienophiles. The synthesis of the Diels–Alder adduct from myrcene and acrylic acid, containing surfactant properties, was scaled-up in a plate-type continuous-flow reactor with a volume of 105 mL to a throughput of 2.79 kg of the final product per day. This continuous-flow approach provides a facile alternative scale-up route to conventional batch processing, and it helps to intensify the synthesis protocol by applying higher reaction temperatures and shorter reaction times. PMID:28228853
Pulsed laser activated cell sorter (PLACS) for high-throughput fluorescent mammalian cell sorting
NASA Astrophysics Data System (ADS)
Chen, Yue; Wu, Ting-Hsiang; Chung, Aram; Kung, Yu-Chung; Teitell, Michael A.; Di Carlo, Dino; Chiou, Pei-Yu
2014-09-01
We present a Pulsed Laser Activated Cell Sorter (PLACS) realized by exciting laser-induced cavitation bubbles in a PDMS microfluidic channel to create high speed liquid jets that deflect detected fluorescent samples for high speed sorting. Pulse-laser-triggered cavitation bubbles can expand within a few microseconds and provide a pressure higher than tens of MPa for fluid perturbation near the focused spot. This ultrafast switching mechanism has a complete on-off cycle of less than 20 μs. Two approaches have been utilized to achieve 3D sample focusing in PLACS. One relies on multilayer PDMS channels to provide 3D hydrodynamic sheath flows. It offers accurate timing control of fast (2 m s-1) passing particles so that synchronization with laser bubble excitation is possible, a critically important factor for high purity and high throughput sorting. PLACS with 3D hydrodynamic focusing is capable of sorting at 11,000 cells/sec with >95% purity, and 45,000 cells/sec with 45% purity using a single channel in a single step. We have also demonstrated 3D focusing using inertial flows in PLACS. This sheathless focusing approach requires 10 times lower initial cell concentration than sheath-based focusing and avoids severe sample dilution from high volume sheath flows. Inertial PLACS is capable of sorting at 10,000 particles sec-1 with >90% sort purity.
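One contributor to the purity/throughput trade-off can be illustrated with a simple Poisson coincidence estimate: a sort is unambiguous only if no second particle arrives within the actuation window. The sketch below uses the ~20 μs window mentioned in the abstract but assumed arrival statistics, and it is not the authors' analysis.

```python
import math

# Back-of-envelope Poisson model of the purity/throughput trade-off: a sort is
# "clean" only if no other particle arrives within the actuation window. The
# 20-microsecond window is taken from the abstract; the assumption of Poisson
# arrivals (and these example rates) is an illustration, not the paper's model.

def clean_sort_probability(rate_per_s, window_s=20e-6):
    """Probability that no additional (Poisson) arrival falls in the window."""
    return math.exp(-rate_per_s * window_s)

for rate in (11_000, 45_000):
    print(rate, "events/s ->", round(clean_sort_probability(rate), 3))
# Higher event rates make coincident arrivals in the switching window more
# likely, which is one reason purity falls as throughput is pushed up.
```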
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
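The split-and-merge pattern underlying this kind of tool can be sketched as below (run locally with a process pool for illustration); this shows the general idea of partitioning a scan into independent jobs and merging the results, not Condor-COPASI's actual code or the Condor job interface.

```python
# Generic split-and-merge pattern behind tools like Condor-COPASI: partition a
# large parameter scan into independent chunks, run each chunk as a separate
# job (here, local processes for illustration), and merge the outputs. This is
# a sketch of the idea, not Condor-COPASI's implementation.

from concurrent.futures import ProcessPoolExecutor

def simulate(k):
    """Stand-in for one model simulation at parameter value k."""
    return k, k ** 2  # pretend 'k**2' is the model output

def chunk(values, size):
    return [values[i:i + size] for i in range(0, len(values), size)]

def run_chunk(values):
    return [simulate(v) for v in values]

if __name__ == "__main__":
    scan = list(range(100))                       # full parameter scan
    with ProcessPoolExecutor() as pool:
        partial_results = pool.map(run_chunk, chunk(scan, 25))
    results = [r for part in partial_results for r in part]   # merge
    print(len(results), results[:3])
```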
Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao
2016-04-01
The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
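For orientation, the per-channel bit rate implied by the abstract's figures can be checked in a couple of lines (Python; both numbers are taken directly from the abstract, nothing else is assumed).

# Per-channel rate implied by 64 parallel generator units totalling 7.68 Gbps.
total_throughput_gbps = 7.68
channels = 64
per_channel_mbps = total_throughput_gbps * 1000 / channels
print(f"Each generation channel contributes ~{per_channel_mbps:.0f} Mbps")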
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and limited by low throughput, where only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, with its theoretical foundation, which provides an integrated solution to characterize thermal properties of materials with high throughput. By taking time-domain temperature information of spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase change systems (melting temperature and latent heat of fusion) and non-phase change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
High-throughput Titration of Luciferase-expressing Recombinant Viruses
Garcia, Vanessa; Krishnan, Ramya; Davis, Colin; Batenchuk, Cory; Le Boeuf, Fabrice; Abdelbary, Hesham; Diallo, Jean-Simon
2014-01-01
Standard plaque assays to determine infectious viral titers can be time consuming, are not amenable to a high volume of samples, and cannot be done with viruses that do not form plaques. As an alternative to plaque assays, we have developed a high-throughput titration method that allows for the simultaneous titration of a high volume of samples in a single day. This approach involves infection of the samples with a Firefly luciferase tagged virus, transfer of the infected samples onto an appropriate permissive cell line, subsequent addition of luciferin, reading of plates in order to obtain luminescence readings, and finally the conversion from luminescence to viral titers. The assessment of cytotoxicity using a metabolic viability dye can be easily incorporated in the workflow in parallel and provide valuable information in the context of a drug screen. This technique provides a reliable, high-throughput method to determine viral titers as an alternative to a standard plaque assay. PMID:25285536
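A common way to turn luminescence readings into titers is to interpolate against a standard curve of known titers, typically linear in log-log space. The sketch below shows a generic conversion of that form; the standard-curve values and the 5.0e4 RLU query are invented for illustration and are not from the published protocol.

import numpy as np

# Hypothetical standard curve: known titers (PFU/mL) vs. measured luminescence (RLU).
std_titer = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
std_rlu   = np.array([2.1e2, 1.9e3, 2.2e4, 2.0e5, 1.8e6])

# Fit a line in log-log space: log10(titer) = a * log10(RLU) + b.
a, b = np.polyfit(np.log10(std_rlu), np.log10(std_titer), 1)

def rlu_to_titer(rlu):
    """Convert a luminescence reading to an estimated titer via the standard curve."""
    return 10 ** (a * np.log10(rlu) + b)

print(f"Sample at 5.0e4 RLU ~ {rlu_to_titer(5.0e4):.2e} PFU/mL")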
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
High-throughput automatic defect review for 300mm blank wafers with atomic force microscope
NASA Astrophysics Data System (ADS)
Zandiatashbar, Ardavan; Kim, Byong; Yoo, Young-kook; Lee, Keibock; Jo, Ahjin; Lee, Ju Suk; Cho, Sang-Joon; Park, Sang-il
2015-03-01
While the feature size in lithography continuously becomes smaller, defect sizes on blank wafers become more comparable to device sizes. Defects with nm-scale characteristic size can be misclassified by automated optical inspection (AOI) and require post-processing for proper classification. The atomic force microscope (AFM) is known to provide high lateral resolution and, by mechanical probing, the highest vertical resolution among all techniques. However, its low throughput and limited tip life, in addition to the laborious effort of locating the defects, have been the major limitations of this technique. In this paper we introduce automatic defect review (ADR) AFM as a post-inspection metrology tool for defect study and classification of 300 mm blank wafers, overcoming the limitations stated above. The ADR AFM provides a high-throughput, high-resolution, and non-destructive means of obtaining 3D information for nm-scale defect review and classification.
Jacobs, K R; Guillemin, G J; Lovejoy, D B
2018-02-01
Kynurenine 3-monooxygenase (KMO) is a well-validated therapeutic target for the treatment of neurodegenerative diseases, including Alzheimer's disease (AD) and Huntington's disease (HD). This work reports a facile fluorescence-based KMO assay optimized for high-throughput screening (HTS) that achieves a throughput approximately 20-fold higher than the fastest KMO assay currently reported. The screen was run with excellent performance (average Z' value of 0.80) from 110,000 compounds across 341 plates and exceeded all statistical parameters used to describe a robust HTS assay. A subset of molecules was selected for validation by ultra-high-performance liquid chromatography, resulting in the confirmation of a novel hit with an IC50 comparable to that of the well-described KMO inhibitor Ro-61-8048. A medicinal chemistry program is currently underway to further develop our novel KMO inhibitor scaffolds.
Information-based management mode based on value network analysis for livestock enterprises
NASA Astrophysics Data System (ADS)
Liu, Haoqi; Lee, Changhoon; Han, Mingming; Su, Zhongbin; Padigala, Varshinee Anu; Shen, Weizheng
2018-01-01
With the development of computer and IT technologies, enterprise management has gradually become information-based. Moreover, due to poor technical competence and non-uniform management, most breeding enterprises show a lack of organisation in data collection and management. In addition, low levels of efficiency result in increasing production costs. This paper adopts 'struts2' in order to construct an information-based management system for standardised and normalised management of the production process in beef cattle breeding enterprises. We present a radio-frequency identification system and study multiple-tag anti-collision via a dynamic grouping ALOHA algorithm. The algorithm extends the existing ALOHA algorithm with improved dynamic packet grouping and is characterised by a high throughput rate. The new algorithm can reach a throughput 42% higher than that of the general ALOHA algorithm. As the number of tags changes, the system throughput remains relatively stable.
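The benefit of matching the frame size to the tag population can be appreciated with the textbook framed slotted ALOHA model, in which n tags sharing a frame of L slots give an expected efficiency of n/L · (1 − 1/L)^(n−1). The snippet below evaluates that expression; it illustrates the general principle only, not the paper's specific dynamic grouping algorithm.

def framed_aloha_efficiency(n_tags, n_slots):
    """Expected fraction of slots carrying exactly one tag reply (framed slotted ALOHA)."""
    p = 1.0 / n_slots
    return n_tags * p * (1.0 - p) ** (n_tags - 1)

n = 100  # assumed number of tags within read range
for slots in (32, 64, 128, 256):
    eff = framed_aloha_efficiency(n, slots)
    print(f"{n} tags, {slots:4d} slots -> efficiency {eff:.3f}")

Efficiency peaks near 1/e (~36.8%) when the frame size matches the tag count, which is why adapting the grouping to the tag population keeps throughput stable.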
Shot-noise limited throughput of soft x-ray ptychography for nanometrology applications
NASA Astrophysics Data System (ADS)
Koek, Wouter; Florijn, Bastiaan; Bäumer, Stefan; Kruidhof, Rik; Sadeghian, Hamed
2018-03-01
Due to its potential for high resolution and three-dimensional imaging, soft x-ray ptychography has received interest for nanometrology applications. We have analyzed the measurement time per unit area when using soft x-ray ptychography for various nanometrology applications, including mask inspection and wafer inspection, and are thus able to predict order-of-magnitude throughput figures. Here we show that for a typical measurement system, using a typical sampling strategy, and when aiming for 10-15 nm resolution, a wafer-based topology (2.5D) measurement is expected to take approximately 4 minutes per μm2, and a full three-dimensional measurement roughly 6 hours per μm2. Due to their much higher reflectivity, EUV masks can be measured considerably faster; a measurement speed of 0.1 seconds per μm2 is expected. However, such speeds do not allow for full wafer or mask inspection at industrially relevant throughput.
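Taking the per-area figures quoted above at face value, the time to inspect even a small field adds up quickly; the arithmetic below makes that explicit (Python; the rates come from the abstract, the 100 μm2 field is an arbitrary example).

# Illustrative inspection times from the per-area figures quoted in the abstract.
area_um2 = 100.0                     # arbitrary example field, in square micrometres

wafer_2p5d_min = 4.0 * area_um2      # 4 minutes per um^2 (2.5D topology)
wafer_3d_h = 6.0 * area_um2          # 6 hours per um^2 (full 3D)
mask_s = 0.1 * area_um2              # 0.1 seconds per um^2 (EUV mask)

print(f"2.5D wafer scan of {area_um2:.0f} um^2: {wafer_2p5d_min / 60:.1f} h")
print(f"3D wafer scan: {wafer_3d_h:.0f} h (~{wafer_3d_h / 24:.0f} days)")
print(f"EUV mask scan: {mask_s:.0f} s")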
WFC3 TV3 Testing: IR Channel Blue Leaks
NASA Astrophysics Data System (ADS)
Brown, Thomas R.
2008-03-01
A new IR detector (IR4; FPA165) is housed in WFC3 during the current campaign of thermal vacuum (TV) ground testing at GSFC. As part of these tests, we measured the IR channel throughput. Compared to the previous IR detectors, IR4 has much higher quantum efficiency at all wavelengths, particularly in the optical. The total throughput for the IR channel is still low in the optical, due to the opacity of the IR filters at these wavelengths, but there is a small wavelength region (~710-830 nm) where these filters do not offer as much blocking as needed to meet Contract End Item specifications. For this reason, the throughput measurements were extended into the blue to quantify the amount of blue leak in the narrow and medium IR bandpasses where a few percent of the measured flux could come from optical photons when observing hot sources. The results are tabulated here.
Ziolkowski, Pawel; Wambach, Matthias; Ludwig, Alfred; Mueller, Eckhard
2018-01-08
In view of the variety and complexity of thermoelectric (TE) material systems, combinatorial approaches to materials development come to the fore for identifying new promising compounds. The success of this approach is related to the availability and reliability of high-throughput characterization methods for identifying interrelations between materials structures and properties within the composition spread libraries. A meaningful characterization starts with determination of the Seebeck coefficient as a major feature of TE materials. Its measurement, and hence the accuracy and detectability of promising material compositions, may be strongly affected by thermal and electrical measurement conditions. This work illustrates the interrelated effects of the substrate material, the layer thickness, and spatial property distributions of thin film composition spread libraries, which are studied experimentally by local thermopower scans by means of the Potential and Seebeck Microprobe (PSM). The study is complemented by numerical evaluation. Material libraries of the half-Heusler compound system Ti-Ni-Sn were deposited on selected substrates (Si, AlN, Al2O3) by magnetron sputtering. Assuming homogeneous properties of a film, a significant decrease of the detected thermopower Sm can be expected on substrates with higher thermal conductivity, yielding an underestimation of the material's thermopower between 15% and 50%, according to FEM (finite element method) simulations. Substrates with poor thermal conduction provide better accuracy, with thermopower underestimates lower than 8%, but suffer from a lower spatial resolution. According to FEM simulations, local scanning of sharp thermopower peaks on substrates of low thermal conductivity is linked to an additional deviation of the measured thermopower of up to 70% compared to homogeneous films, which is 66% higher than for corresponding cases on substrates with higher thermal conductivity in this study.
Wang, Shanyun; Wang, Weidong; Liu, Lu; Zhuang, Linjie; Zhao, Siyan; Su, Yu; Li, Yixiao; Wang, Mengzi; Wang, Cheng; Xu, Liya; Zhu, Guibing
2018-05-24
Artificial microbial nitrogen (N) cycle hotspots in the plant-bed/ditch system were developed and investigated based on intact-core and slurry assay measurements using isotopic tracing technology, quantitative PCR and high-throughput sequencing. By increasing hydraulic retention time and periodically fluctuating the water level in heterogeneous riparian zones, hotspots of anammox, nitrification, denitrification, ammonium (NH4+) oxidation, nitrite (NO2-) oxidation, nitrate (NO3-) reduction and DNRA were all stimulated at the interface sediments, with the abundance and activity being about 1-3 orders of magnitude higher than those in nonhotspots. Isotopic pairing experiments revealed that in microbial hotspots, nitrite sources were higher than the sinks, and both NH4+ oxidation (55.8%) and NO3- reduction (44.2%) provided nitrite for anammox, which accounted for 43.0% of N-loss and 44.4% of NH4+ removal in riparian zones but did not involve nitrous oxide (N2O) emission risks. High-throughput analysis identified that bacterial quorum sensing mediated this anammox hotspot, with B. fulgida dominating the anammox community, but it was B. anammoxidans and Jettenia sp. that contributed more to anammox activity. In the nonhotspot zones, the NO2- source (dominated by NO3- reduction) was lower than the sink, limiting the effects on anammox. In situ N2O flux measurements showed that the microbial hotspot had a 27.1% lower N2O emission flux compared with the nonhotspot zones.
Yeaman, Grant R; Paul, Sudakshina; Nahirna, Iryna; Wang, Yongcheng; Deffenbaugh, Andrew E; Liu, Zi Lucy; Glenn, Kevin C
2016-06-22
In order to provide farmers with better and more customized alternatives to improve yields, combining multiple genetically modified (GM) traits into a single product (called stacked trait crops) is becoming prevalent. Trait protein expression levels are used to characterize new GM products and establish exposure limits, two important components of safety assessment. Developing a multiplexed immunoassay capable of measuring all trait proteins in the same sample allows for higher sample throughput and savings in both time and expense. Fluorescent (bead-based) multiplexed immunoassays (FMI) have gained wide acceptance in mammalian research and in clinical applications. In order to facilitate the measurement of stacked GM traits, we have developed and validated an FMI assay that can measure five different proteins (β-glucuronidase, neomycin phosphotransferase II, Cry1Ac, Cry2Ab2, and CP4 5-enolpyruvyl-shikimate-3-phosphate synthase) present in cotton leaf from a stacked trait product. Expression levels of the five proteins determined by FMI in cotton leaf tissues have been evaluated relative to expression levels determined by enzyme-linked immunosorbent assays (ELISAs) of the individual proteins and shown to be comparable. The FMI met characterization requirements similar to those used for ELISA. Therefore, it is reasonable to conclude that FMI results are equivalent to those determined by conventional individual ELISAs to measure GM protein expression levels in stacked trait products but with significantly higher throughput, reduced time, and more efficient use of resources.
Kresse, Stine H; Namløs, Heidi M; Lorenz, Susanne; Berner, Jeanne-Marie; Myklebost, Ola; Bjerkehagen, Bodil; Meza-Zepeda, Leonardo A
2018-01-01
Nucleic acid material of adequate quality is crucial for successful high-throughput sequencing (HTS) analysis. DNA and RNA isolated from archival FFPE material are frequently degraded and not readily amplifiable due to chemical damage introduced during fixation. To identify optimal nucleic acid extraction kits, DNA and RNA quantity, quality and performance in HTS applications were evaluated. DNA and RNA were isolated from five sarcoma archival FFPE blocks, using eight extraction protocols from seven kits from three different commercial vendors. For DNA extraction, the truXTRAC FFPE DNA kit from Covaris gave higher yields and better amplifiable DNA, but all protocols gave comparable HTS library yields using Agilent SureSelect XT and performed well in downstream variant calling. For RNA extraction, all protocols gave comparable yields and amplifiable RNA. However, for fusion gene detection using the Archer FusionPlex Sarcoma Assay, the truXTRAC FFPE RNA kit from Covaris and the Agencourt FormaPure kit from Beckman Coulter showed the highest percentage of unique read-pairs, providing higher complexity of HTS data and more frequent detection of recurrent fusion genes. truXTRAC simultaneous DNA and RNA extraction gave outputs similar to those of the individual protocols. These findings show that although successful HTS libraries could be generated in most cases, the different protocols gave variable quantity and quality of FFPE nucleic acid extraction. Selecting the optimal procedure is highly valuable and may generate results in specimens of borderline quality.
LOCATE: a mouse protein subcellular localization database
Fink, J. Lynn; Aturaliya, Rajith N.; Davis, Melissa J.; Zhang, Fasheng; Hanson, Kelly; Teasdale, Melvena S.; Kai, Chikatoshi; Kawai, Jun; Carninci, Piero; Hayashizaki, Yoshihide; Teasdale, Rohan D.
2006-01-01
We present here LOCATE, a curated, web-accessible database that houses data describing the membrane organization and subcellular localization of proteins from the FANTOM3 Isoform Protein Sequence set. Membrane organization is predicted by the high-throughput, computational pipeline MemO. The subcellular locations of selected proteins from this set were determined by a high-throughput, immunofluorescence-based assay and by manually reviewing >1700 peer-reviewed publications. LOCATE represents the first effort to catalogue the experimentally verified subcellular location and membrane organization of mammalian proteins using a high-throughput approach and provides localization data for ∼40% of the mouse proteome. It is available at . PMID:16381849
Jones, Jaime R; Neff, Linda J; Ely, Elizabeth K; Parker, Andrew M
2012-12-01
The Cities Readiness Initiative is a federally funded program designed to assist 72 metropolitan statistical areas (MSAs) in preparing to dispense life-saving medical countermeasures within 48 hours of a public health emergency. Beginning in 2008, the 72 MSAs were required to conduct 3 drills related to the distribution and dispensing of emergency medical countermeasures. The report describes the results of the first year of pilot data for medical countermeasure drills conducted by the MSAs. The MSAs were provided templates with key metrics for 5 functional elements critical for a successful dispensing campaign: personnel call down, site activation, facility setup, pick-list generation, and dispensing throughput. Drill submissions were compiled into single data sets for each of the 5 drills. Analyses were conducted to determine whether the measures were comparable across business and non-business hours. Descriptive statistics were computed for each of the key metrics identified in the 5 drills. Most drills were conducted on Mondays and Wednesdays during business hours (8:00 am-5:00 pm). The median completion time for the personnel call-down drill was 1 hour during business hours (n = 287) and 55 minutes during non-business hours (n = 136). Site-activation drills were completed in a median of 30 minutes during business hours and 5 minutes during non-business hours. Facility setup drills were completed more rapidly during business hours (75 minutes) compared with non-business hours (96 minutes). During business hours, pick lists were generated in a median of 3 minutes compared with 5 minutes during non-business hours. Aggregate results from the dispensing throughput drills demonstrated that the median observed throughput during business hours (60 people/h) was higher than that during non-business hours (43 people/h). The results of the analyses from this pilot sample of drill submissions provide a baseline for the determination of a national standard in operational capabilities for local jurisdictions to achieve in their planning efforts for a mass dispensing campaign during an emergency.
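The comparisons above are medians over drill records split by business and non-business hours; the sketch below shows one way such records might be summarized, using made-up example data in place of the actual MSA submissions.

from statistics import median

# Hypothetical drill records: (drill type, conducted during business hours?, completion minutes).
records = [
    ("call_down", True, 55), ("call_down", True, 70), ("call_down", False, 50),
    ("site_activation", True, 30), ("site_activation", False, 5),
    ("facility_setup", True, 75), ("facility_setup", False, 96),
]

def median_time(drill, business):
    """Median completion time for one drill type and one hours category."""
    times = [t for d, b, t in records if d == drill and b == business]
    return median(times) if times else float("nan")

for drill in ("call_down", "site_activation", "facility_setup"):
    print(drill, "business:", median_time(drill, True),
          "non-business:", median_time(drill, False))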
Lakshmi, Bhavana Sethu; Wang, Ruobing; Madhubala, Rentala
2014-06-24
Leishmaniasis is a neglected tropical disease caused by Leishmania species. It is a major health concern affecting 88 countries and threatening 350 million people globally. Unfortunately, there are no vaccines, and there are limitations associated with the current therapeutic regimens for leishmaniasis. The emerging cases of drug resistance further aggravate the situation, demanding rapid drug and vaccine development. The genome sequence of Leishmania provides access to novel genes that hold potential as chemotherapeutic targets or vaccine candidates. In this study, we selected 19 antigenic genes from about 8000 common Leishmania genes based on the Leishmania major and Leishmania infantum genome information available in the pathogen databases. Potential vaccine candidates thus identified were screened using an in vitro high-throughput immunological platform developed in the laboratory. Four candidate genes coding for tuzin, flagellar glycoprotein-like protein (FGP), phospholipase A1-like protein (PLA1) and potassium voltage-gated channel protein (K VOLT) showed a predominant protective Th1 response over disease-exacerbating Th2. We report the immunogenic properties and protective efficacy of one of the four antigens, tuzin, as a DNA vaccine against Leishmania donovani challenge. Our results show that administration of tuzin DNA protected BALB/c mice against L. donovani challenge and that protective immunity was associated with higher levels of IFN-γ and IL-12 production in comparison to IL-4 and IL-10. Our study presents a simple approach to rapidly identify potential vaccine candidates using the exhaustive information stored in the genome and an in vitro high-throughput immunological platform. Copyright © 2014. Published by Elsevier Ltd.
High-throughput measurements of the optical redox ratio using a commercial microplate reader.
Cannon, Taylor M; Shah, Amy T; Walsh, Alex J; Skala, Melissa C
2015-01-01
There is a need for accurate, high-throughput, functional measures to gauge the efficacy of potential drugs in living cells. As an early marker of drug response in cells, cellular metabolism provides an attractive platform for high-throughput drug testing. Optical techniques can noninvasively monitor NADH and FAD, two autofluorescent metabolic coenzymes. The autofluorescent redox ratio, defined as the autofluorescence intensity of NADH divided by that of FAD, quantifies relative rates of cellular glycolysis and oxidative phosphorylation. However, current microscopy methods for redox ratio quantification are time-intensive and low-throughput, limiting their practicality in drug screening. Alternatively, high-throughput commercial microplate readers quickly measure fluorescence intensities for hundreds of wells. This study found that a commercial microplate reader can differentiate the receptor status of breast cancer cell lines (p < 0.05) based on redox ratio measurements without extrinsic contrast agents. Furthermore, microplate reader redox ratio measurements resolve response (p < 0.05) and lack of response (p > 0.05) in cell lines that are responsive and nonresponsive, respectively, to the breast cancer drug trastuzumab. These studies indicate that the microplate readers can be used to measure the redox ratio in a high-throughput manner and are sensitive enough to detect differences in cellular metabolism that are consistent with microscopy results.
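The redox ratio itself is just a per-well intensity ratio, so a plate-reader export can be reduced to group comparisons along the lines of the sketch below (Python with NumPy/SciPy). The well intensities here are synthetic stand-ins, and the t-test simply mirrors the p < 0.05 comparisons described above rather than the authors' exact statistics.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic NADH and FAD intensities for two treatment groups (one value per well).
nadh_ctrl, fad_ctrl = rng.normal(1000, 50, 24), rng.normal(800, 40, 24)
nadh_drug, fad_drug = rng.normal(850, 50, 24), rng.normal(900, 40, 24)

redox_ctrl = nadh_ctrl / fad_ctrl   # optical redox ratio = NADH intensity / FAD intensity
redox_drug = nadh_drug / fad_drug

t, p = stats.ttest_ind(redox_ctrl, redox_drug)
print(f"control {redox_ctrl.mean():.2f}  treated {redox_drug.mean():.2f}  p = {p:.3g}")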
BiQ Analyzer HT: locus-specific analysis of DNA methylation by high-throughput bisulfite sequencing
Lutsik, Pavlo; Feuerbach, Lars; Arand, Julia; Lengauer, Thomas; Walter, Jörn; Bock, Christoph
2011-01-01
Bisulfite sequencing is a widely used method for measuring DNA methylation in eukaryotic genomes. The assay provides single-base pair resolution and, given sufficient sequencing depth, its quantitative accuracy is excellent. High-throughput sequencing of bisulfite-converted DNA can be applied either genome wide or targeted to a defined set of genomic loci (e.g. using locus-specific PCR primers or DNA capture probes). Here, we describe BiQ Analyzer HT (http://biq-analyzer-ht.bioinf.mpi-inf.mpg.de/), a user-friendly software tool that supports locus-specific analysis and visualization of high-throughput bisulfite sequencing data. The software facilitates the shift from time-consuming clonal bisulfite sequencing to the more quantitative and cost-efficient use of high-throughput sequencing for studying locus-specific DNA methylation patterns. In addition, it is useful for locus-specific visualization of genome-wide bisulfite sequencing data. PMID:21565797
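At the heart of locus-specific bisulfite analysis is a simple per-position count: cytosines that remain C were methylated, cytosines read as T were unmethylated and converted. The sketch below computes methylation levels from already-aligned reads under that simplification; it ignores the alignment, strand handling and quality filtering that a tool such as BiQ Analyzer HT performs, and the reads are toy data.

def methylation_levels(reference, reads):
    """Fraction methylated (C retained vs converted to T) at each reference C position.

    Assumes reads are already aligned to the reference with no gaps.
    """
    levels = {}
    for pos, base in enumerate(reference):
        if base != "C":
            continue
        c = sum(1 for r in reads if pos < len(r) and r[pos] == "C")
        t = sum(1 for r in reads if pos < len(r) and r[pos] == "T")
        if c + t:
            levels[pos] = c / (c + t)
    return levels

ref = "ACGTCGA"
reads = ["ATGTCGA", "ACGTTGA", "ACGTCGA"]   # toy bisulfite reads
print(methylation_levels(ref, reads))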
Foliar fungi of Betula pendula: impact of tree species mixtures and assessment methods
Nguyen, Diem; Boberg, Johanna; Cleary, Michelle; Bruelheide, Helge; Hönig, Lydia; Koricheva, Julia; Stenlid, Jan
2017-01-01
Foliar fungi of silver birch (Betula pendula) in an experimental Finnish forest were investigated across a gradient of tree species richness using molecular high-throughput sequencing and visual macroscopic assessment. We hypothesized that the molecular approach detects more fungal taxa than visual assessment, and that there is a relationship among the most common fungal taxa detected by both techniques. Furthermore, we hypothesized that the fungal community composition, diversity, and distribution patterns are affected by changes in tree diversity. Sequencing revealed greater diversity of fungi on birch leaves than the visual assessment method. One species showed a linear relationship between the methods. Species-specific variation in fungal community composition could be partially explained by tree diversity, though overall fungal diversity was not affected by tree diversity. Analysis of specific fungal taxa indicated tree diversity effects at the local neighbourhood scale, where the proportion of birch among neighbouring trees varied, but not at the plot scale. In conclusion, both methods may be used to determine tree diversity effects on the foliar fungal community. However, high-throughput sequencing provided higher resolution of the fungal community, while the visual macroscopic assessment detected functionally active fungal species. PMID:28150710
The impact of the condenser on cytogenetic image quality in digital microscope system.
Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong
2013-01-01
Optimizing the operational parameters of a digital microscope system is an important technique for acquiring high-quality cytogenetic images and facilitating the process of karyotyping, so that the efficiency and accuracy of diagnosis can be improved. This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Both theoretical analysis and experimental validation, through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens, were conducted. The results show that optimal image quality and a large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%-70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and more diagnostic information are obtained. As a result, the system shows higher working stability and fewer restrictions on the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high-throughput continuous image scanning. Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high-throughput continuous scanning microscopes in clinical practice.
Visser, Claas Willem; Kamperman, Tom; Karbaat, Lisanne P.; Lohse, Detlef; Karperien, Marcel
2018-01-01
Microfluidic chips provide unparalleled control over droplets and jets, which have advanced all natural sciences. However, microfluidic applications could be vastly expanded by increasing the per-channel throughput and directly exploiting the output of chips for rapid additive manufacturing. We unlock these features with in-air microfluidics, a new chip-free platform to manipulate microscale liquid streams in the air. By controlling the composition and in-air impact of liquid microjets by surface tension–driven encapsulation, we fabricate monodisperse emulsions, particles, and fibers with diameters of 20 to 300 μm at rates that are 10 to 100 times higher than chip-based droplet microfluidics. Furthermore, in-air microfluidics uniquely enables module-based production of three-dimensional (3D) multiscale (bio)materials in one step because droplets are partially solidified in-flight and can immediately be printed onto a substrate. In-air microfluidics is cytocompatible, as demonstrated by additive manufacturing of 3D modular constructs with tailored microenvironments for multiple cell types. Its in-line control, high throughput and resolution, and cytocompatibility make in-air microfluidics a versatile platform technology for science, industry, and health care. PMID:29399628
Farlora, Rodolfo; Araya-Garay, José; Gallardo-Escárate, Cristian
2014-06-01
Understanding the molecular underpinnings involved in the reproduction of the salmon louse is critical for designing novel strategies of pest management for this ectoparasite. However, genomic information on sex-related genes is still limited. In the present work, sex-specific gene transcription was revealed in the salmon louse Caligus rogercresseyi using high-throughput Illumina sequencing. A total of 30,191,914 and 32,292,250 high quality reads were generated for females and males, and these were de novo assembled into 32,173 and 38,177 contigs, respectively. Gene ontology analysis showed a pattern of higher expression in the female as compared to the male transcriptome. Based on our sequence analysis and known sex-related proteins, several genes putatively involved in sex differentiation, including Dmrt3, FOXL2, VASA, and FEM1, and other potentially significant candidate genes in C. rogercresseyi, were identified for the first time. In addition, the occurrence of SNPs in several differentially expressed contigs annotating for sex-related genes was found. This transcriptome dataset provides a useful resource for future functional analyses, opening new opportunities for sea lice pest control. Copyright © 2014 Elsevier B.V. All rights reserved.
Han, Xiaoping; Chen, Haide; Huang, Daosheng; Chen, Huidong; Fei, Lijiang; Cheng, Chen; Huang, He; Yuan, Guo-Cheng; Guo, Guoji
2018-04-05
Human pluripotent stem cells (hPSCs) provide powerful models for studying cellular differentiations and unlimited sources of cells for regenerative medicine. However, a comprehensive single-cell level differentiation roadmap for hPSCs has not been achieved. We use high throughput single-cell RNA-sequencing (scRNA-seq), based on optimized microfluidic circuits, to profile early differentiation lineages in the human embryoid body system. We present a cellular-state landscape for hPSC early differentiation that covers multiple cellular lineages, including neural, muscle, endothelial, stromal, liver, and epithelial cells. Through pseudotime analysis, we construct the developmental trajectories of these progenitor cells and reveal the gene expression dynamics in the process of cell differentiation. We further reprogram primed H9 cells into naïve-like H9 cells to study the cellular-state transition process. We find that genes related to hemogenic endothelium development are enriched in naïve-like H9. Functionally, naïve-like H9 show higher potency for differentiation into hematopoietic lineages than primed cells. Our single-cell analysis reveals the cellular-state landscape of hPSC early differentiation, offering new insights that can be harnessed for optimization of differentiation protocols.
NASA Astrophysics Data System (ADS)
Liu, Xiaoqin; Francis, Richard; Tobita, Kimimasa; Kim, Andy; Leatherbury, Linda; Lo, Cecilia W.
2013-02-01
Ultrasound biomicroscopy (UBM) is ideally suited for phenotyping fetal mice for congenital heart disease (CHD), as imaging can be carried out noninvasively to provide both hemodynamic and structural information essential for CHD diagnosis. Using the UBM (Vevo 2100; 40Hz) in conjunction with a clinical ultrasound system (Acuson Sequoia C512; 15Hz), we developed a two-step screening protocol to scan thousands of fetuses derived from ENU-mutagenized pedigrees. A wide spectrum of CHD was detected by the UBM and subsequently confirmed with follow-up necropsy and histopathology examination with episcopic fluorescence image capture. CHD observed included outflow anomalies, left/right heart obstructive lesions, septal/valvular defects and cardiac situs anomalies. Meanwhile, various extracardiac defects were found, such as polydactyly, craniofacial defects, exencephaly and omphalocele-cleft palate, most of which were associated with cardiac defects. Our analyses showed the UBM was better at assessing cardiac structure and blood flow profiles, while conventional ultrasound allowed higher-throughput low-resolution screening. Our study showed that integrating conventional clinical ultrasound imaging with the UBM for fetal mouse cardiovascular phenotyping can maximize the detection and recovery of CHD mutants.
NASA Astrophysics Data System (ADS)
Foronda, Augusto; Ohta, Chikara; Tamaki, Hisashi
Dirty paper coding (DPC) is a strategy to achieve the capacity region of multiple input multiple output (MIMO) downlink channels, and a DPC scheduler is throughput optimal if users are selected according to their queue states and current rates. However, DPC is difficult to implement in practical systems. One solution, the zero-forcing beamforming (ZFBF) strategy, has been proposed to achieve the same asymptotic sum rate capacity as DPC with an exhaustive search over the entire user set. Some suboptimal user group selection schedulers with reduced complexity based on the ZFBF strategy (ZFBF-SUS) and the proportional fair (PF) scheduling algorithm (PF-ZFBF) have also been proposed to enhance throughput and fairness among the users, respectively. However, they are not throughput optimal: fairness and throughput decrease if user queue lengths differ due to differing user channel quality. Therefore, we propose two different scheduling algorithms: a throughput-optimal scheduling algorithm (ZFBF-TO) and a reduced-complexity scheduling algorithm (ZFBF-RC). Both are based on the ZFBF strategy and, at every time slot, the scheduling algorithms have to select users based on user channel quality, user queue length and orthogonality among users. Moreover, the proposed algorithms have to produce the rate allocation and power allocation for the selected users based on a modified water filling method. We analyze the schedulers' complexity, and numerical results show that ZFBF-RC provides throughput and fairness improvements compared to the ZFBF-SUS and PF-ZFBF scheduling algorithms.
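Power allocation in such schedulers is typically a water-filling step over the effective channel gains of the selected users. The snippet below implements textbook water filling, not the paper's modified variant, just to show the basic mechanics; the gains and power budget are arbitrary example values.

import numpy as np

def water_filling(gains, total_power):
    """Classic water filling: p_i = max(0, mu - 1/g_i) with sum(p_i) = total_power."""
    inv = 1.0 / gains
    lo, hi = 0.0, total_power + inv.max()   # bracket the water level mu
    for _ in range(100):                    # bisection on mu
        mu = 0.5 * (lo + hi)
        power = np.maximum(0.0, mu - inv)
        if power.sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, 0.5 * (lo + hi) - inv)

gains = np.array([2.0, 1.0, 0.25, 0.1])   # effective channel gains of selected users
p = water_filling(gains, total_power=1.0)
print("per-user power:", p.round(3), "rates:", np.log2(1 + gains * p).round(3))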
NASA Astrophysics Data System (ADS)
Wang, Liping; Ji, Yusheng; Liu, Fuqiang
The integration of multihop relays with orthogonal frequency-division multiple access (OFDMA) cellular infrastructures can meet the growing demands for better coverage and higher throughput. Resource allocation in the OFDMA two-hop relay system is more complex than that in the conventional single-hop OFDMA system. With time division between transmissions from the base station (BS) and those from relay stations (RSs), fixed partitioning of the BS subframe and RS subframes can not adapt to various traffic demands. Moreover, single-hop scheduling algorithms can not be used directly in the two-hop system. Therefore, we propose a semi-distributed algorithm called ASP to adjust the length of every subframe adaptively, and suggest two ways to extend single-hop scheduling algorithms into multihop scenarios: link-based and end-to-end approaches. Simulation results indicate that the ASP algorithm increases system utilization and fairness. The max carrier-to-interference ratio (Max C/I) and proportional fairness (PF) scheduling algorithms extended using the end-to-end approach obtain higher throughput than those using the link-based approach, but at the expense of more overhead for information exchange between the BS and RSs. The resource allocation scheme using ASP and end-to-end PF scheduling achieves a tradeoff between system throughput maximization and fairness.
ac electroosmotic pumping induced by noncontact external electrodes.
Wang, Shau-Chun; Chen, Hsiao-Ping; Chang, Hsueh-Chia
2007-09-21
Electroosmotic (EO) pumps based on dc electroosmosis are plagued by bubble generation and other electrochemical reactions at the electrodes at voltages beyond 1 V for electrolytes. These disadvantages limit their throughput and offset their portability advantage over mechanical syringe or pneumatic pumps. ac electroosmotic pumps at high frequency (>100 kHz) circumvent the bubble problem by inducing polarization and slip velocity on embedded electrodes, but they require complex electrode designs to produce a net flow. We report a new high-throughput ac EO pump design based on induced polarization on the entire channel surface instead of just on the electrodes. Like dc EO pumps, our pump electrodes are outside of the load section and form a cm-long pump unit consisting of three circular reservoirs (3 mm in diameter) connected by a 1x1 mm channel. The field-induced polarization can produce an effective zeta potential exceeding 1 V and an ac slip velocity estimated at 1 mm/sec or higher, both one order of magnitude higher than earlier dc and ac pumps, giving rise to a maximum throughput of 1 μl/sec. Polarization over the entire channel surface, quadratic scaling with respect to the field, and high voltage at high frequency without electrode bubble generation are the reasons why the current pump is superior to earlier dc and ac EO pumps.
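The quoted maximum throughput follows directly from the slip velocity and the channel cross-section; the arithmetic below checks it using only the values given in the abstract.

# Volumetric throughput = slip velocity x channel cross-section.
slip_velocity = 1e-3          # m/s, ac slip velocity estimated in the abstract
cross_section = 1e-3 * 1e-3   # m^2, the 1 x 1 mm channel

flow_rate_m3_s = slip_velocity * cross_section
print(f"Maximum throughput ~ {flow_rate_m3_s * 1e9:.1f} uL/s")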
Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen
2015-04-15
High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well dimension to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, as sample consumption scales with resin volume and throughput scales with experiments per microplate, they are limited in cost and time savings. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and to resin volumes as small as 0.1 μL, is introduced. An HTS batch isotherm process is described, utilizing this new method in combination with optical sample volume quantification for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher-quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
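Batch isotherm screening of this kind usually ends in fitting a Langmuir-type model to bound-versus-free protein data and reporting bootstrap confidence bounds on the parameters. The sketch below shows that pattern on synthetic data; the model form, concentrations and noise level are generic assumptions, not values from the study.

import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k_d):
    """Langmuir isotherm: bound protein q as a function of free concentration c."""
    return q_max * c / (k_d + c)

rng = np.random.default_rng(1)
c = np.linspace(0.1, 10, 12)                              # free protein, mg/mL (synthetic)
q = langmuir(c, 50.0, 1.5) + rng.normal(0, 1.0, c.size)   # bound protein, mg/mL resin

popt, _ = curve_fit(langmuir, c, q, p0=(40, 1))

# Simple residual bootstrap for parameter confidence bounds.
resid = q - langmuir(c, *popt)
boot = []
for _ in range(500):
    q_star = langmuir(c, *popt) + rng.choice(resid, resid.size, replace=True)
    boot.append(curve_fit(langmuir, c, q_star, p0=popt)[0])
boot = np.array(boot)
print("q_max, K_d:", popt.round(2))
print("95% CI:", np.percentile(boot, [2.5, 97.5], axis=0).round(2))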
NASA Astrophysics Data System (ADS)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi
2010-06-01
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at existing high-throughput beams at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, fast-readout, high-gain CCD detector, and sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has a priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
High count-rate study of two TES x-ray microcalorimeters with different transition temperatures
NASA Astrophysics Data System (ADS)
Lee, Sang-Jun; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.; Sadleir, John E.; Smith, Stephen J.; Wassell, Edward J.
2017-10-01
We have developed transition-edge sensor (TES) microcalorimeter arrays with high count-rate capability and high energy resolution to carry out x-ray imaging spectroscopy observations of various astronomical sources and the Sun. We have studied the dependence of the energy resolution and throughput (fraction of processed pulses) on the count rate for such microcalorimeters with two different transition temperatures (Tc). Devices with both transition temperatures were fabricated within a single microcalorimeter array directly on top of a solid substrate where the thermal conductance of the microcalorimeter is dependent upon the thermal boundary resistance between the TES sensor and the dielectric substrate beneath. Because the thermal boundary resistance is highly temperature dependent, the two types of device with different Tcs had very different thermal decay times, approximately one order of magnitude different. In our earlier report, we achieved energy resolutions of 1.6 and 2.3 eV at 6 keV from lower- and higher-Tc devices, respectively, using a standard analysis method based on optimal filtering in the low flux limit. We have now measured the same devices at elevated x-ray fluxes ranging from 50 Hz to 1000 Hz per pixel. In the high flux limit, however, the standard optimal filtering scheme nearly breaks down because of x-ray pile-up. To achieve the highest possible energy resolution for a fixed throughput, we have developed an analysis scheme based on the so-called event grade method. Using the new analysis scheme, we achieved 5.0 eV FWHM with 96% throughput for 6 keV x-rays of 1025 Hz per pixel with the higher-Tc (faster) device, and 5.8 eV FWHM with 97% throughput with the lower-Tc (slower) device at 722 Hz.
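Event-grade analysis of the kind referred to above sorts pulses by how isolated they are in time and then applies different filters per grade. A minimal version of the grading step is sketched below; the grade names and time thresholds are arbitrary placeholders, not the instrument's actual settings.

import numpy as np

def grade_events(arrival_times_s, high_gap=5e-3, mid_gap=1e-3):
    """Label each pulse by the time gap to its nearest neighbour.

    'high' grade pulses are well isolated and would get the full optimal filter;
    'mid' and 'low' grades would use progressively shorter filters.
    """
    t = np.sort(arrival_times_s)
    gaps_prev = np.diff(t, prepend=-np.inf)
    gaps_next = np.diff(t, append=np.inf)
    nearest = np.minimum(gaps_prev, gaps_next)
    grades = np.where(nearest >= high_gap, "high",
             np.where(nearest >= mid_gap, "mid", "low"))
    return grades.tolist()

rng = np.random.default_rng(2)
times = np.cumsum(rng.exponential(1.0 / 1000.0, 20))   # ~1000 Hz Poisson arrivals
print(grade_events(times))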
Zhao, Meng-Meng; Du, Shan-Shan; Li, Qiu-Hong; Chen, Tao; Qiu, Hui; Wu, Qin; Chen, Shan-Shan; Zhou, Ying; Zhang, Yuan; Hu, Yang; Su, Yi-Liang; Shen, Li; Zhang, Fen; Weng, Dong; Li, Hui-Ping
2017-02-01
This study aims to use high-throughput 16S rRNA gene sequencing to examine the bacterial profile of lymph node biopsy samples of patients with sarcoidosis and to further verify the association between Propionibacterium acnes (P. acnes) and sarcoidosis. A total of 36 mediastinal lymph node biopsy specimens were collected from 17 cases of sarcoidosis, 8 of tuberculosis (TB group), and 11 of non-infectious lung diseases (control group). The V4 region of the bacterial 16S rRNA gene in the specimens was amplified and sequenced using the high-throughput sequencing platform MiSeq, and a bacterial profile was established. The data analysis software QIIME and Metastats were used to compare bacterial relative abundance in the three patient groups. Overall, 545 genera were identified; 38 showed significantly lower and 29 significantly higher relative abundance in the sarcoidosis group than in the TB and control groups (P < 0.01). P. acnes 16S rRNA was exclusively found in all 17 samples of the sarcoidosis group, whereas it was not detected in the TB and control groups. The relative abundance of P. acnes in the sarcoidosis group (0.16% ± 0.11%) was significantly higher than that in the TB (Metastats analysis: P = 0.0010, q = 0.0044) and control groups (Metastats analysis: P = 0.0010, q = 0.0038). The relative abundance of P. granulosum was only 0.0022% ± 0.0044% in the sarcoidosis group. P. granulosum 16S rRNA was not detected in the other two groups. High-throughput 16S rRNA gene sequencing appears to be a useful tool to investigate the bacterial profile of sarcoidosis specimens. The results suggest that P. acnes may be involved in sarcoidosis development.
Design of differential optical absorption spectroscopy long-path telescopes based on fiber optics.
Merten, André; Tschritter, Jens; Platt, Ulrich
2011-02-10
We present a new design principle for telescopes used in the spectral investigation of the atmosphere and the detection of atmospheric trace gases with the long-path differential optical absorption spectroscopy (DOAS) technique. A combination of emitting and receiving fibers in a single bundle replaces the commonly used coaxial Newton-type combination of receiving and transmitting telescopes. This greatly simplified setup offers higher light throughput and simpler adjustment and allows smaller instruments, which are easier to handle and more portable. The higher transmittance was verified by ray-tracing calculations, which predict a theoretical threefold improvement in signal intensity compared with the old setup. In practice, due to the easier alignment and higher stability, signal intensities up to a factor of 10 higher were found. In addition, the use of a fiber-optic light source provides a better spectral characterization of the light source, which results in a lower detection limit for trace gases studied with this instrument. This new design will greatly enhance the usability and the range of applications of active DOAS instruments.
Sobhani, R; McVicker, R; Spangenberg, C; Rosso, D
2012-01-01
In regions characterized by water scarcity, such as coastal Southern California, groundwater containing chromophoric dissolved organic matter is a viable source of water supply. In the coastal aquifer of Orange County in California, seawater intrusion driven by coastal groundwater pumping increased the concentration of bromide in extracted groundwater from 0.4 mg l⁻¹ in 2000 to over 0.8 mg l⁻¹ in 2004. Bromide, a precursor to bromate formation, is regulated by the USEPA and the California Department of Health as a potential carcinogen and therefore must be reduced to a level below 10 μg l⁻¹. This paper compares two processes for treatment of highly coloured groundwater: nanofiltration, and ozone injection coupled with biologically activated carbon. The requirement for bromate removal decreased the water production of the ozonation process to compensate for increased maintenance requirements, and required the adoption of catalytic carbon with an associated increase in capital and operating costs per unit volume. However, due to the absence of oxidant addition in nanofiltration, this process is not affected by bromide. We performed a process analysis and a comparative economic analysis of capital and operating costs for both technologies. Our results show that for the case studied in coastal Southern California, nanofiltration has higher throughput and lower specific capital and operating costs than ozone injection with biologically activated carbon. Ozone injection with biologically activated carbon, compared to nanofiltration, has 14% higher capital cost and 12% higher operating costs per unit water produced while operating at the initial throughput. Due to the reduced ozone concentration required to accommodate bromate reduction, the ozonation process throughput is reduced and the actual cost increase (per unit water produced) is 68% higher for capital cost and 30% higher for operations. Copyright © 2011 Elsevier Ltd. All rights reserved.
Leung, Elo; Huang, Amy; Cadag, Eithon; ...
2016-01-20
In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein FASTA data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well-annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA. PSAT is most appropriately applied in annotation of large protein FASTA sets that may or may not be associated with a single genome.
Boozer, Christina; Kim, Gibum; Cong, Shuxin; Guan, Hannwen; Londergan, Timothy
2006-08-01
Surface plasmon resonance (SPR) biosensors have enabled a wide range of applications in which researchers can monitor biomolecular interactions in real time. Owing to the fact that SPR can provide affinity and kinetic data, unique features in applications ranging from protein-peptide interaction analysis to cellular ligation experiments have been demonstrated. Although SPR has historically been limited by its throughput, new methods are emerging that allow for the simultaneous analysis of many thousands of interactions. When coupled with new protein array technologies, high-throughput SPR methods give users new and improved methods to analyze pathways, screen drug candidates and monitor protein-protein interactions.
Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor
2016-01-01
Following the burgeoning of genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain large numbers of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-prone step, particularly when working with hundreds of targets, automation of the primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis, as well as a Tm calculator for quick queries.
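Quick Tm estimates of the kind such a calculator returns are commonly based on the Wallace rule for short oligos or a GC-content formula for longer ones; the sketch below implements those two textbook approximations. HTP-OligoDesigner's actual method is not specified here, so treat this only as an illustration of the calculation, with an arbitrary example primer.

def tm_wallace(seq):
    """Wallace rule, reasonable for primers shorter than ~14 nt: 2*(A+T) + 4*(G+C)."""
    s = seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def tm_gc(seq):
    """GC-content approximation for longer primers: 64.9 + 41*(GC - 16.4)/N."""
    s = seq.upper()
    gc = s.count("G") + s.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(s)

primer = "ATGGTGAGCAAGGGCGAGGA"   # example primer sequence
print(f"{primer}: Wallace {tm_wallace(primer):.1f} C, GC-based {tm_gc(primer):.1f} C")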
NASA Astrophysics Data System (ADS)
Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.
2017-11-01
The current trend in processor manufacturing focuses on multi-core architectures rather than increasing the clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications and big data have created huge demand for data processing activities, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API based programming.
High Throughput Determination of Tetramine in Drinking ...
Report: The sampling and analytical procedure (SAP) presented herein describes a method for the high-throughput determination of tetramethylene disulfotetramine in drinking water by solid phase extraction and isotope dilution gas chromatography/mass spectrometry. This method, which will be included in the SAM, is expected to provide the Water Laboratory Alliance, as part of EPA's Environmental Response Laboratory Network, with a more reliable and faster means of analyte collection and measurement.
The Adverse Outcome Pathway (AOP) framework provides a systematic way to describe linkages between molecular and cellular processes and organism- or population-level effects. The current AOP assembly methods, however, are inefficient. Our goal is to generate computationally-pr...
Digital Microwave System Design Guide.
1984-02-01
traffic analysis is a continuous effort, setting parameters for subsequent stages of expansion after the system design is finished. 2.1.3 Quality of ... operational structure of the user for whom he is providing service. 2.2.3 Quality of Service. In digital communications, the basic performance parameter ... the basic interpretation of system performance is measured in terms of a single parameter, throughput. Throughput can be defined as the number of
Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter
2015-01-01
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.
2016-01-01
Because RNA-protein interactions play a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time) including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240
High-throughput selection for cellulase catalysts using chemical complementation.
Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W
2008-12-24
Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.
A High-throughput Selection for Cellulase Catalysts Using Chemical Complementation
Peralta-Yahya, Pamela; Carter, Brian T.; Lin, Hening; Tao, Haiyan; Cornish, Virginia W.
2010-01-01
Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Due to the large number of enzyme variants that selections can test compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity. PMID:19053460
Improving the quality of physician communication with rapid-throughput analysis and report cards.
Farrell, Michael H; Christopher, Stephanie A; La Pean Kirschner, Alison; Roedl, Sara J; O'Tool, Faith O; Ahmad, Nadia Y; Farrell, Philip M
2014-11-01
Problems with clinician-patient communication negatively impact newborn screening, genetics, and all of healthcare. Training programs teach communication, but educational methods are not feasible for entire populations of clinicians. To address this healthcare quality gap, we developed a Communication Quality Assurance intervention. Child health providers volunteered for a randomized controlled trial of assessment and a report card. Participants provided telephone counseling to a standardized parent regarding a newborn screening result showing heterozygous status for cystic fibrosis or sickle cell disease. Our rapid-throughput timeline allows individualized feedback within a week. Two encounters were recorded (baseline and after a random sample received the report card) and abstracted for four groups of communication quality indicators. 92 participants finished both counseling encounters within our rapid-throughput time limits. Participants randomized to receive the report card improved communication behaviors more than controls, including request for teach-back (p<0.01), opening behaviors (p=0.01), anticipate/validate emotion (p<0.001) and the ratio of explained to unexplained jargon words (p<0.03). The rapid-throughput report card is effective at improving specific communication behaviors. Communication can be taught, but this project shows how healthcare organizations can assure communication quality everywhere. Further implementation could improve newborn screening, genetics, and healthcare in general. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
High-throughput electrophysiological assays for voltage gated ion channels using SyncroPatch 768PE.
Li, Tianbo; Lu, Gang; Chiang, Eugene Y; Chernov-Rogan, Tania; Grogan, Jane L; Chen, Jun
2017-01-01
Ion channels regulate a variety of physiological processes and represent an important class of drug target. Among the many methods of studying ion channel function, patch clamp electrophysiology is considered the gold standard, providing the ultimate precision and flexibility. However, its utility in ion channel drug discovery is impeded by low throughput. Additionally, characterization of endogenous ion channels in primary cells remains technically challenging. In recent years, many automated patch clamp (APC) platforms have been developed to overcome these challenges, albeit with varying throughput, data quality and success rate. In this study, we utilized the SyncroPatch 768PE, one of the latest-generation APC platforms, which conducts parallel recording from two 384-well modules with giga-seal data quality, to push these two boundaries. By optimizing various cell patching parameters and a two-step voltage protocol, we developed a high-throughput APC assay for the voltage-gated sodium channel Nav1.7. By testing the IC50 values of a group of Nav1.7 reference compounds, this assay proved to be highly consistent with manual patch clamp (R > 0.9). In a pilot screening of 10,000 compounds, the success rate, defined by >500 MΩ seal resistance and >500 pA peak current, was 79%. The assay was robust, with a daily throughput of ~6,000 data points and a Z' factor of 0.72. Using the same platform, we also successfully recorded the endogenous voltage-gated potassium channel Kv1.3 in primary T cells. Together, our data suggest that SyncroPatch 768PE provides a powerful platform for ion channel research and drug discovery.
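The reported Z' factor of 0.72 follows the standard screening-window statistic of Zhang et al.; the snippet below shows that calculation on hypothetical positive- and negative-control peak currents (the numbers are illustrative, not taken from this study).

```python
import numpy as np

def z_prime(pos, neg):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg| (screening-window statistic)."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Hypothetical peak currents (pA): fully blocked wells vs. vehicle-control wells.
blocked = [-45, -60, -52, -48, -55]
vehicle = [-980, -1020, -1005, -950, -1010]
print(round(z_prime(blocked, vehicle), 2))  # values above ~0.5 indicate a robust assay window
```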
NASA Astrophysics Data System (ADS)
Smith, Geoffrey B.; Earp, Alan; Franklin, Jim B.; McCredie, Geoffrey
2001-11-01
Simple quantitative performance criteria are developed for translucent materials in terms of hemispherical visible transmittance and the angular spread of transmitted luminance, characterised by a half angle. The criteria are linked to applications in luminaires and skylights, with emphasis on maximising visible throughput while minimising glare. These basic criteria are also extended to substantial changes in the angle of incidence. Example data are provided showing that acrylic pigmented with spherical polymer particles can have a total hemispherical transmittance with weak thickness dependence, which is better than clear sheet, while the spread of transmitted light is quite thickness-sensitive and occurs over wider angles than with inorganic pigments. This combination means significantly fewer lamps can achieve specified lux levels with low glare, and smaller skylights can provide higher, more uniform daylight illuminance.
Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H
2016-02-01
Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.
Cognitive ergonomics of operational tools
NASA Astrophysics Data System (ADS)
Lüdeke, A.
2012-10-01
Control systems have become increasingly more powerful over the past decades. The availability of high data throughput and sophisticated graphical interactions has opened a variety of new possibilities. But has this helped to provide intuitive, easy to use applications to simplify the operation of modern large scale accelerator facilities? We will discuss what makes an application useful to operation and what is necessary to make a tool easy to use. We will show that even the implementation of a small number of simple application design rules can help to create ergonomic operational tools. The author is convinced that such tools do indeed help to achieve higher beam availability and better beam performance at accelerator facilities.
Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST)
Dowd, Scot E; Zaragoza, Joaquin; Rodriguez, Javier R; Oliver, Melvin J; Payton, Paxton R
2005-01-01
Background BLAST is one of the most common and useful tools for genetic research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy-to-use, fault-tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows-based machines across local area networks (LAN). W.ND-BLAST provides intuitive graphical user interfaces (GUIs) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high-throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components provide exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high-throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is freely downloadable; with registration the software is free, and installation, networking, and usage instructions are provided, as well as a support forum. PMID:15819992
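From the timings quoted in the abstract one can estimate the parallel speedup and efficiency of the 17-node run; a quick back-of-the-envelope check (efficiency defined as the usual speedup divided by node count):

```python
serial_min = 662.68      # runtime on one average machine (from the abstract)
distributed_min = 44.97  # runtime on 17 nodes, including slower machines (from the abstract)
nodes = 17

speedup = serial_min / distributed_min
efficiency = speedup / nodes
print(f"speedup ~{speedup:.1f}x, parallel efficiency ~{efficiency:.0%}")
# ~14.7x and ~87%; lower than the ~100% seen with 12 equal-class nodes,
# consistent with the 17-node pool including lower-performance machines.
```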
Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis
NASA Astrophysics Data System (ADS)
Mohamed Ismael, Hawa; Vandyck, George Kobina
The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput through the Doraleh Container Port in Djibouti by time series analysis. A selection of univariate forecasting models has been used, namely the Triple Exponential Smoothing Model, the Grey Model and the Linear Regression Model. By utilizing these three models and their combination, a forecast of container throughput through the Doraleh port was produced. A comparison of the forecasting results of the three models, in addition to the combination forecast, is then undertaken based on the commonly used evaluation criteria Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression Model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten (10) year forecast for container throughput at DCT has been made.
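The evaluation criteria named above are simple to compute; the sketch below fits a linear trend to hypothetical annual throughput figures (not the actual DCT data) and reports MAD and MAPE for that fit, along with an extrapolated forecast.

```python
import numpy as np

# Hypothetical annual container throughput (thousand TEU); illustrative only.
years = np.arange(2008, 2016)
actual = np.array([295, 340, 410, 452, 510, 575, 620, 690], dtype=float)

slope, intercept = np.polyfit(years, actual, 1)   # simple linear regression on time
fitted = slope * years + intercept

mad = np.mean(np.abs(actual - fitted))                      # Mean Absolute Deviation
mape = np.mean(np.abs((actual - fitted) / actual)) * 100.0  # Mean Absolute Percentage Error

print(f"MAD = {mad:.1f} thousand TEU, MAPE = {mape:.1f}%")
print(f"2018 forecast ~ {slope * 2018 + intercept:.0f} thousand TEU")
```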
Chabbert, Christophe D; Adjalley, Sophie H; Steinmetz, Lars M; Pelechano, Vicent
2018-01-01
Chromatin immunoprecipitation followed by sequencing (ChIP-Seq) or microarray hybridization (ChIP-on-chip) are standard methods for the study of transcription factor binding sites and histone chemical modifications. However, these approaches only allow profiling of a single factor or protein modification at a time. In this chapter, we present Bar-ChIP, a higher throughput version of ChIP-Seq that relies on the direct ligation of molecular barcodes to chromatin fragments. Bar-ChIP enables the concurrent profiling of multiple DNA-protein interactions and is therefore amenable to experimental scale-up, without the need for any robotic instrumentation.
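Because Bar-ChIP pools barcoded chromatin fragments before sequencing, reads must later be assigned back to their sample of origin. The sketch below shows the demultiplexing idea with hypothetical 6-nt barcodes and sample names; it is not the published Bar-ChIP analysis pipeline.

```python
# Hypothetical 6-nt sample barcodes; illustrative only.
barcodes = {"ACGTAC": "H3K4me3_rep1", "TGCATG": "H3K36me3_rep1"}

def demultiplex(reads):
    """Assign each read to a sample by its leading barcode and trim the barcode off."""
    assigned = {sample: [] for sample in barcodes.values()}
    unassigned = []
    for read in reads:
        sample = barcodes.get(read[:6])
        if sample:
            assigned[sample].append(read[6:])
        else:
            unassigned.append(read)
    return assigned, unassigned

reads = ["ACGTACGGATTACAGGCT", "TGCATGCCCGGGTTAACC", "NNNNNNAAAACCCC"]
print(demultiplex(reads))
```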
MoRu/Be multilayers for extreme ultraviolet applications
Bajt, Sasa C.; Wall, Mark A.
2001-01-01
High reflectance, low intrinsic roughness and low stress multilayer systems for extreme ultraviolet (EUV) lithography comprise amorphous MoRu layers and crystalline Be layers. Reflectance greater than 70% has been demonstrated for MoRu/Be multilayers with 50 bilayer pairs. The optical throughput of MoRu/Be multilayers can be 30-40% higher than that of Mo/Be multilayer coatings. The throughput can be improved using a diffusion barrier to make sharper interfaces. A capping layer on the top surface of the multilayer improves the long-term reflectance and EUV radiation stability of the multilayer by forming a very thin native oxide that is water resistant.
Liaskou, Evaggelia; Klemsdal Henriksen, Eva Kristine; Holm, Kristian; Kaveh, Fatemeh; Hamm, David; Fear, Janine; Viken, Marte K; Hov, Johannes Roksund; Melum, Espen; Robins, Harlan; Olweus, Johanna; Karlsen, Tom H; Hirschfield, Gideon M
2016-05-01
Hepatic T-cell infiltrates and a strong genetic human leukocyte antigen association represent characteristic features of various immune-mediated liver diseases. Conceptually the presence of disease-associated antigens is predicted to be reflected in T-cell receptor (TCR) repertoires. Here, we aimed to determine if disease-associated TCRs could be identified in the nonviral chronic liver diseases primary biliary cirrhosis (PBC), primary sclerosing cholangitis (PSC), and alcoholic liver disease (ALD). We performed high-throughput sequencing of the TCRβ chain complementarity-determining region 3 of liver-infiltrating T cells from PSC (n = 20), PBC (n = 10), and ALD (n = 10) patients, alongside genomic human leukocyte antigen typing. The frequency of TCRβ nucleotide sequences was significantly higher in PSC samples (2.53 ± 0.80, mean ± standard error of the mean) compared to PBC samples (1.13 ± 0.17, P < 0.0001) and ALD samples (0.62 ± 0.10, P < 0.0001). An average clonotype overlap of 0.85% was detected among PSC samples, significantly higher compared to the average overlap of 0.77% seen within the PBC (P = 0.024) and ALD groups (0.40%, P < 0.0001). From eight to 42 clonotypes were uniquely detected in each of the three disease groups (≥30% of the respective patient samples). Multiple, unique sequences using different variable family genes encoded the same amino acid clonotypes, providing additional support for antigen-driven selection. In PSC and PBC, disease-associated clonotypes were detected among patients with human leukocyte antigen susceptibility alleles. We demonstrate liver-infiltrating disease-associated clonotypes in all three diseases evaluated, and evidence for antigen-driven clonal expansions. Our findings indicate that differential TCR signatures, as determined by high-throughput sequencing, may represent an imprint of distinctive antigenic repertoires present in the different chronic liver diseases; this thereby opens up the prospect of studying disease-relevant T cells in order to better understand and treat liver disease. © 2015 by the American Association for the Study of Liver Diseases.
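The clonotype overlap percentages reported above compare the sets of CDR3 clonotypes shared between patient samples. The exact definition used in the study is not reproduced here, so the sketch below uses one common convention (shared clonotypes relative to the smaller repertoire) on made-up CDR3 sequences.

```python
def clonotype_overlap(a: set, b: set) -> float:
    """Pairwise overlap: shared clonotypes relative to the smaller repertoire (one common convention)."""
    return len(a & b) / min(len(a), len(b))

# Hypothetical CDR3 amino acid clonotypes from two liver-infiltrate samples.
sample1 = {"CASSLGQGNTEAFF", "CASSPGTGGYEQYF", "CASRRTGELFF"}
sample2 = {"CASSLGQGNTEAFF", "CASSIRSSYEQYF"}
print(f"overlap = {clonotype_overlap(sample1, sample2):.1%}")
```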
Capers, Patrice L.; Brown, Andrew W.; Dawson, John A.; Allison, David B.
2015-01-01
Background: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high throughput methods (e.g., search heuristics, crowdsourcing) has improved feasibility of large meta-research questions, but possibly at the cost of accuracy. Objective: To evaluate the use of double sampling combined with multiple imputation (DS + MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple consolidated standards of reporting trials guidelines for titles and abstracts. Methods: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT, human, abstract available, and English language (n = 322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. Results: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title = 1.00, abstract = 0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS + MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050–1.174 vs. DS + MI 1.082–1.151). As evidence of improved accuracy, DS + MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. Conclusion: Our results support our hypothesis that DS + MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of literature. PMID:25988135
Burdick, David B; Cavnor, Chris C; Handcock, Jeremy; Killcoyne, Sarah; Lin, Jake; Marzolf, Bruz; Ramsey, Stephen A; Rovira, Hector; Bressler, Ryan; Shmulevich, Ilya; Boyle, John
2010-07-14
High throughput sequencing has become an increasingly important tool for biological research. However, the existing software systems for managing and processing these data have not provided the flexible infrastructure that research requires. Existing software solutions provide static and well-established algorithms in a restrictive package. However as high throughput sequencing is a rapidly evolving field, such static approaches lack the ability to readily adopt the latest advances and techniques which are often required by researchers. We have used a loosely coupled, service-oriented infrastructure to develop SeqAdapt. This system streamlines data management and allows for rapid integration of novel algorithms. Our approach also allows computational biologists to focus on developing and applying new methods instead of writing boilerplate infrastructure code. The system is based around the Addama service architecture and is available at our website as a demonstration web application, an installable single download and as a collection of individual customizable services.
2010-01-01
Background High throughput sequencing has become an increasingly important tool for biological research. However, the existing software systems for managing and processing these data have not provided the flexible infrastructure that research requires. Results Existing software solutions provide static and well-established algorithms in a restrictive package. However as high throughput sequencing is a rapidly evolving field, such static approaches lack the ability to readily adopt the latest advances and techniques which are often required by researchers. We have used a loosely coupled, service-oriented infrastructure to develop SeqAdapt. This system streamlines data management and allows for rapid integration of novel algorithms. Our approach also allows computational biologists to focus on developing and applying new methods instead of writing boilerplate infrastructure code. Conclusion The system is based around the Addama service architecture and is available at our website as a demonstration web application, an installable single download and as a collection of individual customizable services. PMID:20630057
Thermodynamic Studies for Drug Design and Screening
Garbett, Nichola C.; Chaires, Jonathan B.
2012-01-01
Introduction A key part of drug design and development is the optimization of molecular interactions between an engineered drug candidate and its binding target. Thermodynamic characterization provides information about the balance of energetic forces driving binding interactions and is essential for understanding and optimizing molecular interactions. Areas covered This review discusses the information that can be obtained from thermodynamic measurements and how this can be applied to the drug development process. Current approaches for the measurement and optimization of thermodynamic parameters are presented, specifically higher throughput and calorimetric methods. Relevant literature for this review was identified in part by bibliographic searches for the period 2004–2011 using the Science Citation Index and PUBMED and the keywords listed below. Expert opinion The most effective drug design and development platform comes from an integrated process utilizing all available information from structural, thermodynamic and biological studies. Continuing evolution in our understanding of the energetic basis of molecular interactions and advances in thermodynamic methods for widespread application are essential to realize the goal of thermodynamically-driven drug design. Comprehensive thermodynamic evaluation is vital early in the drug development process to speed drug development towards an optimal energetic interaction profile while retaining good pharmacological properties. Practical thermodynamic approaches, such as enthalpic optimization, thermodynamic optimization plots and the enthalpic efficiency index, have now matured to provide proven utility in the design process. Improved throughput in calorimetric methods remains essential for even greater integration of thermodynamics into drug design. PMID:22458502
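To make the energetic quantities discussed above concrete, the sketch below converts a hypothetical dissociation constant into a standard binding free energy (ΔG° = RT ln Kd) and computes simple per-heavy-atom efficiency indices. The Kd, enthalpy and heavy-atom count are illustrative assumptions, and "enthalpic efficiency" here uses one common per-heavy-atom definition rather than the specific index from the review.

```python
import math

R = 8.314    # gas constant, J/(mol*K)
T = 298.15   # temperature, K

def delta_g(kd_molar: float) -> float:
    """Standard binding free energy from Kd: dG = RT ln(Kd), in J/mol."""
    return R * T * math.log(kd_molar)

kd = 1e-9             # 1 nM dissociation constant (hypothetical ligand)
heavy_atoms = 30      # hypothetical heavy-atom count
dH = -60.0            # measured binding enthalpy, kJ/mol (hypothetical)

dG = delta_g(kd) / 1000.0               # kJ/mol, roughly -51 kJ/mol for 1 nM at 298 K
ligand_eff = -dG / heavy_atoms          # ligand efficiency, kJ/mol per heavy atom
enthalpic_eff = -dH / heavy_atoms       # a per-heavy-atom enthalpic efficiency
print(f"dG = {dG:.1f} kJ/mol, LE = {ligand_eff:.2f}, EE = {enthalpic_eff:.2f}")
```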
The Effects of Transcranial Direct Current Stimulation (tDCS) on Multitasking Throughput Capacity
Nelson, Justin; McKinley, Richard A.; Phillips, Chandler; McIntire, Lindsey; Goodyear, Chuck; Kreiner, Aerial; Monforton, Lanie
2016-01-01
Background: Multitasking has become an integral attribute associated with military operations within the past several decades. As the amount of information that needs to be processed during these high level multitasking environments exceeds the human operators' capabilities, the information throughput capacity reaches an asymptotic limit. At this point, the human operator can no longer effectively process and respond to the incoming information resulting in a plateau or decline in performance. The objective of the study was to evaluate the efficacy of a non-invasive brain stimulation technique known as transcranial direct current stimulation (tDCS) applied to a scalp location over the left dorsolateral prefrontal cortex (lDLPFC) to improve information processing capabilities during a multitasking environment. Methods: The study consisted of 20 participants from Wright-Patterson Air Force Base (16 male and 4 female) with an average age of 31.1 (SD = 4.5). Participants were randomly assigned into two groups, each consisting of eight males and two females. Group one received 2 mA of anodal tDCS and group two received sham tDCS over the lDLPFC on their testing day. Results: The findings indicate that anodal tDCS significantly improves the participants' information processing capability resulting in improved performance compared to sham tDCS. For example, the multitasking throughput capacity for the sham tDCS group plateaued near 1.0 bits/s at the higher baud input (2.0 bits/s) whereas the anodal tDCS group plateaued near 1.3 bits/s. Conclusion: The findings provided new evidence that tDCS has the ability to augment and enhance multitasking capability in a human operator. Future research should be conducted to determine the longevity of the enhancement of transcranial direct current stimulation on multitasking performance, which has yet to be accomplished. PMID:27965553
The Effects of Transcranial Direct Current Stimulation (tDCS) on Multitasking Throughput Capacity.
Nelson, Justin; McKinley, Richard A; Phillips, Chandler; McIntire, Lindsey; Goodyear, Chuck; Kreiner, Aerial; Monforton, Lanie
2016-01-01
Background: Multitasking has become an integral attribute associated with military operations within the past several decades. As the amount of information that needs to be processed during these high level multitasking environments exceeds the human operators' capabilities, the information throughput capacity reaches an asymptotic limit. At this point, the human operator can no longer effectively process and respond to the incoming information resulting in a plateau or decline in performance. The objective of the study was to evaluate the efficacy of a non-invasive brain stimulation technique known as transcranial direct current stimulation (tDCS) applied to a scalp location over the left dorsolateral prefrontal cortex (lDLPFC) to improve information processing capabilities during a multitasking environment. Methods: The study consisted of 20 participants from Wright-Patterson Air Force Base (16 male and 4 female) with an average age of 31.1 (SD = 4.5). Participants were randomly assigned into two groups, each consisting of eight males and two females. Group one received 2 mA of anodal tDCS and group two received sham tDCS over the lDLPFC on their testing day. Results: The findings indicate that anodal tDCS significantly improves the participants' information processing capability resulting in improved performance compared to sham tDCS. For example, the multitasking throughput capacity for the sham tDCS group plateaued near 1.0 bits/s at the higher baud input (2.0 bits/s) whereas the anodal tDCS group plateaued near 1.3 bits/s. Conclusion: The findings provided new evidence that tDCS has the ability to augment and enhance multitasking capability in a human operator. Future research should be conducted to determine the longevity of the enhancement of transcranial direct current stimulation on multitasking performance, which has yet to be accomplished.
NASA Astrophysics Data System (ADS)
Lagus, Todd P.; Edd, Jon F.
2013-03-01
Most cell biology experiments are performed in bulk cell suspensions where cell secretions become diluted and mixed in a contiguous sample. Confinement of single cells to small, picoliter-sized droplets within a continuous phase of oil provides chemical isolation of each cell, creating individual microreactors where rare cell qualities are highlighted and otherwise undetectable signals can be concentrated to measurable levels. Recent work in microfluidics has yielded methods for the encapsulation of cells in aqueous droplets and hydrogels at kilohertz rates, creating the potential for millions of parallel single-cell experiments. However, commercial applications of high-throughput microdroplet generation and downstream sensing and actuation methods are still emerging for cells. Using fluorescence-activated cell sorting (FACS) as a benchmark for commercially available high-throughput screening, this focused review discusses the fluid physics of droplet formation, methods for cell encapsulation in liquids and hydrogels, sensors and actuators and notable biological applications of high-throughput single-cell droplet microfluidics.
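One piece of the fluid physics relevant to single-cell encapsulation, although not spelled out in the abstract, is that random cell loading into droplets follows Poisson statistics, which sets the trade-off between droplet occupancy and multi-cell events; a minimal illustration with an assumed mean occupancy:

```python
from math import exp, factorial

def poisson(k: int, lam: float) -> float:
    """P(k cells in a droplet) under random (Poisson) loading at mean occupancy lam."""
    return lam ** k * exp(-lam) / factorial(k)

lam = 0.1  # dilute loading often used to limit multi-cell droplets (assumed value)
singles = poisson(1, lam)
multiples = 1 - poisson(0, lam) - singles
print(f"single-cell droplets: {singles:.1%}, multi-cell droplets: {multiples:.2%}")
```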
High-Throughput Cloning and Expression Library Creation for Functional Proteomics
Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua
2013-01-01
The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047
Draveling, C; Ren, L; Haney, P; Zeisse, D; Qoronfleh, M W
2001-07-01
The revolution in genomics and proteomics is having a profound impact on drug discovery. Today's protein scientist demands a faster, easier, more reliable way to purify proteins. A new high-capacity, high-throughput technology has been developed at Perbio Sciences for affinity protein purification. This technology utilizes selected chromatography media that are dehydrated to form uniform aggregates. The SwellGel aggregates will instantly rehydrate upon addition of the protein sample, allowing purification and direct performance of multiple assays in a variety of formats. SwellGel technology has greater stability and is easier to handle than standard wet chromatography resins. The microplate format of this technology provides high-capacity, high-throughput features, recovering milligram quantities of protein suitable for high-throughput screening or biophysical/structural studies. Data will be presented applying SwellGel technology to recombinant 6x His-tagged protein and glutathione-S-transferase (GST) fusion protein purification. Copyright 2001 Academic Press.
Fujimori, Shigeo; Hirai, Naoya; Ohashi, Hiroyuki; Masuoka, Kazuyo; Nishikimi, Akihiko; Fukui, Yoshinori; Washio, Takanori; Oshikubo, Tomohiro; Yamashita, Tatsuhiro; Miyamoto-Sato, Etsuko
2012-01-01
Next-generation sequencing (NGS) has been applied to various kinds of omics studies, resulting in many biological and medical discoveries. However, high-throughput protein-protein interactome datasets derived from detection by sequencing are scarce, because protein-protein interaction analysis requires many cell manipulations to examine the interactions. The low reliability of the high-throughput data is also a problem. Here, we describe a cell-free display technology combined with NGS that can improve both the coverage and reliability of interactome datasets. The completely cell-free method gives a high-throughput and a large detection space, testing the interactions without using clones. The quantitative information provided by NGS reduces the number of false positives. The method is suitable for the in vitro detection of proteins that interact not only with the bait protein, but also with DNA, RNA and chemical compounds. Thus, it could become a universal approach for exploring the large space of protein sequences and interactome networks. PMID:23056904
RIPiT-Seq: A high-throughput approach for footprinting RNA:protein complexes
Singh, Guramrit; Ricci, Emiliano P.; Moore, Melissa J.
2013-01-01
Development of high-throughput approaches to map the RNA interaction sites of individual RNA binding proteins (RBPs) transcriptome-wide is rapidly transforming our understanding of post-transcriptional gene regulatory mechanisms. Here we describe a ribonucleoprotein (RNP) footprinting approach we recently developed for identifying occupancy sites of both individual RBPs and multi-subunit RNP complexes. RNA:protein immunoprecipitation in tandem (RIPiT) yields highly specific RNA footprints of cellular RNPs isolated via two sequential purifications; the resulting RNA footprints can then be identified by high-throughput sequencing (Seq). RIPiT-Seq is broadly applicable to all RBPs regardless of their RNA binding mode and thus provides a means to map the RNA binding sites of RBPs with poor inherent ultraviolet (UV) crosslinkability. Further, among current high-throughput approaches, RIPiT has the unique capacity to differentiate binding sites of RNPs with overlapping protein composition. It is therefore particularly suited for studying dynamic RNP assemblages whose composition evolves as gene expression proceeds. PMID:24096052
An improved high-throughput lipid extraction method for the analysis of human brain lipids.
Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett
2013-03-01
We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.
High-throughput cloning and expression library creation for functional proteomics.
Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua
2013-05-01
The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible, and reliable cloning systems. These collections of ORF clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial, we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator(TM) DNA Cloning System) and compare them side-by-side. We also report an example of high-throughput cloning study and its application in functional proteomics. This tutorial is part of the International Proteomics Tutorial Programme (IPTP12). © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela
2016-10-01
Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip with a micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.
High-throughput determination of structural phase diagram and constituent phases using GRENDEL
NASA Astrophysics Data System (ADS)
Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.
2015-11-01
Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. Converting this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the Graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. The Sunburst radial tree map is also demonstrated as a tool to visualize material structure-property relationships found through graph-based analysis.
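GRENDEL itself is a graph-based, physics-constrained algorithm and is not reproduced here. As a much simpler stand-in, the sketch below illustrates the generic endmember idea of factoring a library of diffraction-like patterns into a small set of non-negative basis patterns plus per-sample weights, using NMF on synthetic data (scikit-learn assumed available).

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
q = np.linspace(10, 60, 300)  # 2-theta grid, degrees (synthetic)

def peak(center, width=0.3):
    return np.exp(-0.5 * ((q - center) / width) ** 2)

# Three synthetic "constituent phase" patterns and 50 composition-spread samples.
phases = np.vstack([peak(22) + peak(35), peak(28) + peak(44), peak(31) + peak(52)])
weights = rng.dirichlet(np.ones(3), size=50)            # phase fractions per sample
library = weights @ phases + rng.normal(0, 0.005, (50, q.size)).clip(0)

model = NMF(n_components=3, init="nndsvda", max_iter=1000)
W = model.fit_transform(library)   # recovered per-sample phase weights
H = model.components_              # recovered endmember patterns
print(W.shape, H.shape)
```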
Large-Scale Biomonitoring of Remote and Threatened Ecosystems via High-Throughput Sequencing
Gibson, Joel F.; Shokralla, Shadi; Curry, Colin; Baird, Donald J.; Monk, Wendy A.; King, Ian; Hajibabaei, Mehrdad
2015-01-01
Biodiversity metrics are critical for assessment and monitoring of ecosystems threatened by anthropogenic stressors. Existing sorting and identification methods are too expensive and labour-intensive to be scaled up to meet management needs. Alternately, a high-throughput DNA sequencing approach could be used to determine biodiversity metrics from bulk environmental samples collected as part of a large-scale biomonitoring program. Here we show that both morphological and DNA sequence-based analyses are suitable for recovery of individual taxonomic richness, estimation of proportional abundance, and calculation of biodiversity metrics using a set of 24 benthic samples collected in the Peace-Athabasca Delta region of Canada. The high-throughput sequencing approach was able to recover all metrics with a higher degree of taxonomic resolution than morphological analysis. The reduced cost and increased capacity of DNA sequence-based approaches will finally allow environmental monitoring programs to operate at the geographical and temporal scale required by industrial and regulatory end-users. PMID:26488407
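Biodiversity metrics such as richness and the Shannon index are straightforward to compute once per-taxon counts have been recovered, whether morphologically or from sequence reads; a minimal sketch on hypothetical counts for one benthic sample:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical per-taxon specimen (or read) counts for one sample.
sample = [120, 45, 33, 8, 2, 1]
print(f"richness = {sum(1 for c in sample if c > 0)}, Shannon H' = {shannon_index(sample):.2f}")
```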
NASA Astrophysics Data System (ADS)
Taoka, Hidekazu; Kishiyama, Yoshihisa; Higuchi, Kenichi; Sawahashi, Mamoru
This paper presents comparisons between common and dedicated reference signals (RSs) for channel estimation in MIMO multiplexing using codebook-based precoding for orthogonal frequency division multiplexing (OFDM) radio access in the Evolved UTRA downlink with frequency division duplexing (FDD). We clarify the best RS structure for precoding-based MIMO multiplexing based on comparisons of the structures in terms of the achievable throughput taking into account the overhead of the common and dedicated RSs and the precoding matrix indication (PMI) signal. Based on extensive simulations on the throughput in 2-by-2 and 4-by-4 MIMO multiplexing with precoding, we clarify that channel estimation based on common RSs multiplied with the precoding matrix indicated by the PMI signal achieves higher throughput compared to that using dedicated RSs irrespective of the number of spatial multiplexing streams when the number of available precoding matrices, i.e., the codebook size, is less than approximately 16 and 32 for 2-by-2 and 4-by-4 MIMO multiplexing, respectively.
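Codebook-based precoding as described above amounts to the receiver evaluating each candidate precoding matrix against the estimated channel and feeding back the index (PMI) of the best one. The sketch below does this for a toy 2×2 case with an illustrative three-entry codebook; it is not the actual E-UTRA codebook or simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 10.0  # assumed linear SNR
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

# Toy codebook of unitary 2-Tx precoders (illustrative only).
codebook = [
    np.eye(2, dtype=complex),
    np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2),
    np.array([[1, 1], [1j, -1j]], dtype=complex) / np.sqrt(2),
]

def rate(H, F, snr, streams=2):
    """Spectral efficiency (bit/s/Hz) with precoder F and equal power per stream."""
    Q = (snr / streams) * H @ F @ F.conj().T @ H.conj().T
    return np.log2(np.linalg.det(np.eye(H.shape[0]) + Q).real)

rates = [rate(H, F, snr) for F in codebook]
pmi = int(np.argmax(rates))  # index the receiver would feed back as the PMI
print(pmi, [round(r, 2) for r in rates])
```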
Problems in processing Rheinische Braunkohle (soft coal) (in German)
DOE Office of Scientific and Technical Information (OSTI.GOV)
von Hartmann, G.B.
At Wesseling, difficulties were encountered with the hydrogenation of Rhine brown coal. The hydrogenation reaction was proceeding too rapidly at 600 atm pressure under relatively low temperature and throughput conditions. This caused a build-up of "caviar" deposits containing ash and asphalts. This flocculation of asphalt seemed to arise because the rapid reaction produced a liquid medium unable to hold the heavy asphalt particles in suspension. A stronger paraffinic character of the oil was also a result. To obtain practical, problem-free yields, throughput had to be increased (from 0.4 kg/liter/hr to more than 0.5), and temperature had to be increased (from 24.0 MV to 24.8 MV). Further, a considerable increase in sludge recycling was recommended. The Wesseling plant was unable to increase the temperature and throughput. However, more sludge was recycled, producing a paste better able to hold higher-molecular-weight particles in suspension. If this were not to solve the "caviar" deposit problems, further recommendations were suggested, including the addition of more heavy oil.
ERIC Educational Resources Information Center
Lindberg, Matti
2014-01-01
This study illustrates the differences between Finnish and British graduates in the higher education-to-work transition and related market mechanisms in the year 2000. Specifically, the differences between the Finnish and British students' academic careers and ability to find employment after graduation were evaluated in relation to the Finnish HE…
ac electroosmotic pumping induced by noncontact external electrodes
Wang, Shau-Chun; Chen, Hsiao-Ping; Chang, Hsueh-Chia
2007-01-01
Electroosmotic (EO) pumps based on dc electroosmosis are plagued by bubble generation and other electrochemical reactions at the electrodes at voltages beyond 1 V for electrolytes. These disadvantages limit their throughput and offset their portability advantage over mechanical syringe or pneumatic pumps. ac electroosmotic pumps at high frequency (>100 kHz) circumvent the bubble problem by inducing polarization and slip velocity on embedded electrodes, but they require complex electrode designs to produce a net flow. We report a new high-throughput ac EO pump design based on induced polarization on the entire channel surface instead of just on the electrodes. As in dc EO pumps, our pump electrodes are outside of the load section and form a cm-long pump unit consisting of three circular reservoirs (3 mm in diameter) connected by a 1×1 mm channel. The field-induced polarization can produce an effective zeta potential exceeding 1 V and an ac slip velocity estimated at 1 mm/sec or higher, both one order of magnitude higher than in earlier dc and ac pumps, giving rise to a maximum throughput of 1 μl/sec. Polarization over the entire channel surface, quadratic scaling with respect to the field, and high voltage at high frequency without electrode bubble generation are the reasons why the current pump is superior to earlier dc and ac EO pumps. PMID:19693362
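The quoted maximum throughput follows directly from the slip velocity and channel cross-section (Q = u·A); a quick consistency check using the figures given in the abstract:

```python
# Cross-check of the quoted maximum throughput: Q = slip velocity x channel cross-section.
slip_velocity = 1e-3          # m/s  (~1 mm/sec, from the abstract)
channel_area  = 1e-3 * 1e-3   # m^2  (1 mm x 1 mm channel, from the abstract)

Q = slip_velocity * channel_area   # m^3/s
print(f"{Q * 1e9:.1f} uL/s")       # 1e-9 m^3/s = 1 uL/s, matching the quoted ~1 uL/sec
```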
From cancer genomes to cancer models: bridging the gaps
Baudot, Anaïs; Real, Francisco X.; Izarzugaza, José M. G.; Valencia, Alfonso
2009-01-01
Cancer genome projects are now being expanded in an attempt to provide complete landscapes of the mutations that exist in tumours. Although the importance of cataloguing genome variations is well recognized, there are obvious difficulties in bridging the gaps between high-throughput resequencing information and the molecular mechanisms of cancer evolution. Here, we describe the current status of the high-throughput genomic technologies, and the current limitations of the associated computational analysis and experimental validation of cancer genetic variants. We emphasize how the current cancer-evolution models will be influenced by the high-throughput approaches, in particular through efforts devoted to monitoring tumour progression, and how, in turn, the integration of data and models will be translated into mechanistic knowledge and clinical applications. PMID:19305388
The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio).
Usui, Takuji; Noble, Daniel W A; O'Dea, Rose E; Fangmeier, Melissa L; Lagisz, Malgorzata; Hesselson, Daniel; Nakagawa, Shinichi
2018-01-01
Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34-0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes.
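The repeatability estimates quoted above (ICC = 0.34-0.41) correspond to an intra-class correlation over repeated trials; the sketch below computes a one-way ICC(1,1) on hypothetical repeated swim-speed measurements (the data are made up for illustration and the study's exact ICC variant may differ).

```python
import numpy as np

# Hypothetical maximal swim speeds (cm/s): rows = fish, columns = repeated trials.
data = np.array([
    [34.1, 36.0, 33.5],
    [28.7, 27.9, 30.2],
    [40.3, 38.8, 41.0],
    [31.5, 33.0, 32.1],
    [25.9, 27.4, 26.8],
])
n, k = data.shape
subject_means = data.mean(axis=1)
grand_mean = data.mean()

ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
ms_within = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))

icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)  # ICC(1,1)
print(round(icc, 2))
```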
Loeffler 4.0: Diagnostic Metagenomics.
Höper, Dirk; Wylezich, Claudia; Beer, Martin
2017-01-01
A new world of possibilities for "virus discovery" was opened up as high-throughput sequencing became available in the last decade. While metagenomic analysis was scientifically established before the start of the era of high-throughput sequencing, the availability of the first second-generation sequencers was the kick-off for diagnosticians to use sequencing for the detection of novel pathogens. Today, diagnostic metagenomics is becoming the standard procedure for the detection and genetic characterization of new viruses or novel virus variants. Here, we provide an overview of the technical considerations of high-throughput sequencing-based diagnostic metagenomics, together with selected examples of "virus discovery" for animal diseases or zoonoses and of metagenomics for food safety or basic veterinary research. © 2017 Elsevier Inc. All rights reserved.
Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan; ...
2016-09-23
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
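Automated Tauc analysis for a direct-gap material amounts to fitting the linear rise of (αhν)² against photon energy and extrapolating to zero; the sketch below does this on synthetic data with a gap placed near the reported 2.7 eV (illustrative only, not the authors' pipeline).

```python
import numpy as np

# Synthetic Tauc data for a direct-gap absorber near 2.7 eV (illustration only).
rng = np.random.default_rng(1)
E = np.linspace(2.0, 3.5, 200)                 # photon energy, eV
Eg_true = 2.7
tauc = np.clip(E - Eg_true, 0.0, None)          # (alpha*h*nu)^2 rises linearly above the gap
tauc += rng.normal(0.0, 0.01, E.size)           # measurement noise

# Fit the linear rise above the edge and extrapolate to zero to estimate Eg.
mask = tauc > 0.2 * tauc.max()
slope, intercept = np.polyfit(E[mask], tauc[mask], 1)
print(f"estimated band gap ~ {-intercept / slope:.2f} eV")
```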
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio)
Usui, Takuji; Noble, Daniel W.A.; O’Dea, Rose E.; Fangmeier, Melissa L.; Lagisz, Malgorzata; Hesselson, Daniel
2018-01-01
Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34–0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes. PMID:29372124
Cernak, Tim; Gesmundo, Nathan J; Dykstra, Kevin; Yu, Yang; Wu, Zhicai; Shi, Zhi-Cai; Vachal, Petr; Sperbeck, Donald; He, Shuwen; Murphy, Beth Ann; Sonatore, Lisa; Williams, Steven; Madeira, Maria; Verras, Andreas; Reiter, Maud; Lee, Claire Heechoon; Cuff, James; Sherer, Edward C; Kuethe, Jeffrey; Goble, Stephen; Perrotto, Nicholas; Pinto, Shirly; Shen, Dong-Ming; Nargund, Ravi; Balkovec, James; DeVita, Robert J; Dreher, Spencer D
2017-05-11
Miniaturization and parallel processing play an important role in the evolution of many technologies. We demonstrate the application of miniaturized high-throughput experimentation methods to resolve synthetic chemistry challenges on the frontlines of a lead optimization effort to develop diacylglycerol acyltransferase (DGAT1) inhibitors. Reactions were performed on ∼1 mg scale using glass microvials, providing a miniaturized high-throughput experimentation capability that was used to study a challenging SNAr reaction. The availability of robust synthetic chemistry conditions discovered in these miniaturized investigations enabled the development of structure-activity relationships that ultimately led to the discovery of soluble, selective, and potent inhibitors of DGAT1.
Suram, Santosh K; Newhouse, Paul F; Zhou, Lan; Van Campen, Douglas G; Mehta, Apurva; Gregoire, John M
2016-11-14
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. The strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko
2010-06-23
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
Quantifying protein-protein interactions in high throughput using protein domain microarrays.
Kaushansky, Alexis; Allen, John E; Gordus, Andrew; Stiffler, Michael A; Karp, Ethan S; Chang, Bryan H; MacBeath, Gavin
2010-04-01
Protein microarrays provide an efficient way to identify and quantify protein-protein interactions in high throughput. One drawback of this technique is that proteins show a broad range of physicochemical properties and are often difficult to produce recombinantly. To circumvent these problems, we have focused on families of protein interaction domains. Here we provide protocols for constructing microarrays of protein interaction domains in individual wells of 96-well microtiter plates, and for quantifying domain-peptide interactions in high throughput using fluorescently labeled synthetic peptides. As specific examples, we will describe the construction of microarrays of virtually every human Src homology 2 (SH2) and phosphotyrosine binding (PTB) domain, as well as microarrays of mouse PDZ domains, all produced recombinantly in Escherichia coli. For domains that mediate high-affinity interactions, such as SH2 and PTB domains, equilibrium dissociation constants (K(D)s) for their peptide ligands can be measured directly on arrays by obtaining saturation binding curves. For weaker binding domains, such as PDZ domains, arrays are best used to identify candidate interactions, which are then retested and quantified by fluorescence polarization. Overall, protein domain microarrays provide the ability to rapidly identify and quantify protein-ligand interactions with minimal sample consumption. Because entire domain families can be interrogated simultaneously, they provide a powerful way to assess binding selectivity on a proteome-wide scale and provide an unbiased perspective on the connectivity of protein-protein interaction networks.
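For the high-affinity domains described above, the saturation binding measurement amounts to fitting a one-site binding isotherm, F = Fmax·[L]/(KD + [L]), to fluorescence versus peptide concentration. A minimal sketch using SciPy, with made-up titration data, is shown below; it is not the authors' analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_site_binding(conc, f_max, kd):
    """Fluorescence signal for a single-site binding isotherm."""
    return f_max * conc / (kd + conc)

# Made-up titration: peptide concentrations (uM) and array fluorescence (a.u.).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
signal = np.array([95, 270, 800, 1900, 3800, 5600, 6700, 7100], dtype=float)

(f_max, kd), _ = curve_fit(one_site_binding, conc, signal, p0=[signal.max(), 1.0])
print(f"Fmax ~ {f_max:.0f} a.u., KD ~ {kd:.2f} uM")
```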
Ethoscopes: An open platform for high-throughput ethomics.
Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J; French, Alice S; Jamasb, Arian R; Gilestro, Giorgio F
2017-10-01
Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.
Kim, Sung-Hou [Moraga, CA; Kim, Rosalind [Moraga, CA; Jancarik, Jamila [Walnut Creek, CA
2012-01-31
An optimum solubility screen is provided in which a panel of buffers and many additives are used to obtain the most homogeneous and monodisperse protein condition for protein crystallization. The present methods are useful for proteins that aggregate and cannot be concentrated prior to setting up crystallization screens. A high-throughput method using the hanging-drop method with vapor diffusion equilibration and a panel of twenty-four buffers is further provided. Using the present methods, 14 poorly behaving proteins were screened; 11 showed markedly improved dynamic light scattering results that allowed the proteins to be concentrated, and 9 were crystallized.
Analysis of High-Throughput ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Zangar, Richard C.
Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
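One core step in ELISA microarray quantitation is fitting a standard curve to known antigen concentrations and inverting it for unknown samples; a four-parameter logistic (4PL) model is a common choice. The sketch below is a generic illustration with made-up standards and is not taken from the ProMAT tools referenced above.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

def invert_four_pl(y, bottom, top, ec50, hill):
    """Back-calculate concentration from a fitted 4PL curve."""
    return ec50 * ((top - bottom) / (y - bottom) - 1.0) ** (-1.0 / hill)

# Made-up standards: concentration (pg/mL) vs background-corrected intensity.
std_conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
std_signal = np.array([120, 200, 480, 1100, 2400, 3600, 4200], dtype=float)

params, _ = curve_fit(four_pl, std_conc, std_signal,
                      p0=[100, 4500, 50, 1.0],
                      bounds=(0, [1000, 10000, 1000, 5]))
unknown_signal = 1500.0
print(f"Estimated concentration: {invert_four_pl(unknown_signal, *params):.1f} pg/mL")
```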
High performance hybrid magnetic structure for biotechnology applications
Humphries, David E; Pollard, Martin J; Elkin, Christopher J
2005-10-11
The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides means for separation and other biotechnology applications involving holding, manipulation, or separation of magnetizable molecular structures and targets. Also disclosed are: a method of assembling the hybrid magnetic plates, a high throughput protocol featuring the hybrid magnetic structure, and other embodiments of the ferromagnetic pole shape, attachment and adapter interfaces for adapting the use of the hybrid magnetic structure for use with liquid handling and other robots for use in high throughput processes.
High performance hybrid magnetic structure for biotechnology applications
Humphries, David E.; Pollard, Martin J.; Elkin, Christopher J.
2006-12-12
The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides for separation and other biotechnology applications involving holding, manipulation, or separation of magnetic or magnetizable molecular structures and targets. Also disclosed are: a method of assembling the hybrid magnetic plates, a high throughput protocol featuring the hybrid magnetic structure, and other embodiments of the ferromagnetic pole shape, attachment and adapter interfaces for adapting the use of the hybrid magnetic structure for use with liquid handling and other robots for use in high throughput processes.
Toxicokinetic and Dosimetry Modeling Tools for Exposure ...
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster and can provide mechanistic insights on chemical action. However, extrapolating from in vitro chemical concentrations to target tissue or blood concentrations in vivo is fraught with uncertainties, and modeling is dependent upon pharmacokinetic variables not measured in in vitro assays. To address this need, new tools have been created for characterizing, simulating, and evaluating chemical toxicokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissue microdosimetry PK models relate whole-body chemical exposures to cell-scale concentrations. These tools rely on high-throughput in vitro measurements, and successful methods exist for pharmaceutical compounds that determine PK from limited in vitro measurements and chemical structure-derived property predictions. These high throughput (HT) methods provide a more rapid and less resource-intensive alternative to traditional PK model development. We have augmented these in vitro data with chemical structure-based descriptors and mechanistic tissue partitioning models to construct HTPBPK models for over three hundred environmental and pharmaceutical chemicals.
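In its simplest form, the in vitro-to-in vivo extrapolation described above reduces to a steady-state calculation: predict the steady-state plasma concentration per unit oral dose rate, then divide an in vitro active concentration by it to obtain an oral equivalent dose. The sketch below uses a common steady-state approximation combining renal and well-stirred hepatic clearance; all parameter values and the example AC50 are illustrative assumptions, not values from the text.

```python
def css_steady_state(dose_mg_per_kg_day, fub, clint_l_per_day_kg,
                     gfr_l_per_day_kg=2.5, q_liver_l_per_day_kg=30.0):
    """Steady-state plasma concentration (mg/L) for a constant oral dose rate.

    Combines renal clearance (GFR x fraction unbound) with well-stirred
    hepatic clearance scaled from intrinsic clearance. Defaults are rough,
    illustrative values, not those used in the cited work.
    """
    cl_hep = (q_liver_l_per_day_kg * fub * clint_l_per_day_kg) / (
        q_liver_l_per_day_kg + fub * clint_l_per_day_kg)
    cl_total = gfr_l_per_day_kg * fub + cl_hep
    return dose_mg_per_kg_day / cl_total

def oral_equivalent_dose(active_conc_mg_per_l, css_per_unit_dose):
    """Dose (mg/kg/day) predicted to reach a plasma level equal to the in vitro active concentration."""
    return active_conc_mg_per_l / css_per_unit_dose

# Illustrative chemical: fraction unbound 0.05, intrinsic clearance 20 L/day/kg.
css_1 = css_steady_state(1.0, fub=0.05, clint_l_per_day_kg=20.0)
ac50_mg_per_l = 10e-6 * 300.0 * 1000.0  # assumed 10 uM AC50, MW 300 g/mol -> mg/L
print(f"Css per 1 mg/kg/day: {css_1:.3f} mg/L")
print(f"Oral equivalent dose: {oral_equivalent_dose(ac50_mg_per_l, css_1):.2f} mg/kg/day")
```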
web cellHTS2: a web-application for the analysis of high-throughput screening data.
Pelz, Oliver; Gilsdorf, Moritz; Boutros, Michael
2010-04-12
The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
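Normalization steps of the kind offered by such screening-analysis tools typically place each plate on a common scale before hit calling; a plate-wise robust z-score (median and MAD) is one standard option. The snippet below is a generic illustration of that step, not code from cellHTS2, and the plate data are synthetic.

```python
import numpy as np

def robust_z_scores(plate):
    """Robust z-score normalization of one screening plate (rows x columns).

    Centers on the plate median and scales by the median absolute deviation,
    a common alternative to mean/SD when plates contain strong hits.
    """
    plate = np.asarray(plate, dtype=float)
    median = np.nanmedian(plate)
    mad = np.nanmedian(np.abs(plate - median)) * 1.4826  # consistency factor
    return (plate - median) / mad

# Toy 384-well plate of RNAi readouts with a few strong "hits".
rng = np.random.default_rng(0)
plate = rng.normal(1000, 50, size=(16, 24))
plate[3, 5] = 400.0   # putative hit
plate[10, 17] = 350.0

z = robust_z_scores(plate)
hits = np.argwhere(z < -5)  # wells far below the plate median
print(hits)
```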
Web server for priority ordered multimedia services
NASA Astrophysics Data System (ADS)
Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund
2001-10-01
In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for continuous media (CM) services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an autoregressive moving average (ARMA) based traffic-shaping circuit before being transmitted through the network.
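The priority-ordered service just described (admin read/write first, down to CM and Web writes) is essentially a priority queue in front of the disk and transmission schedulers. The sketch below illustrates that ordering only; it is not the authors' server code, and the priority labels are simply those listed in the abstract.

```python
import heapq
import itertools

# Lower number = served first, following the ordering given in the abstract.
PRIORITY = {
    "admin_rw": 0,
    "hot_cm_web_multicast": 1,
    "cm_read": 2,
    "web_read": 3,
    "cm_write": 4,
    "web_write": 5,
}

class PriorityRequestQueue:
    """FIFO within a priority level, strict ordering across levels."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserves arrival order

    def submit(self, kind, payload):
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._counter), kind, payload))

    def next_request(self):
        _, _, kind, payload = heapq.heappop(self._heap)
        return kind, payload

q = PriorityRequestQueue()
q.submit("web_read", "GET /index.html")
q.submit("cm_read", "stream movie-42")
q.submit("admin_rw", "update hot-page list")

for _ in range(3):
    print(q.next_request())
# served order: admin_rw, cm_read, web_read
```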
Wu, Nicholas C.; Young, Arthur P.; Al-Mawsawi, Laith Q.; Olson, C. Anders; Feng, Jun; Qi, Hangfei; Luan, Harding H.; Li, Xinmin; Wu, Ting-Ting
2014-01-01
Viral proteins often display several functions which require multiple assays to dissect their genetic basis. Here, we describe a systematic approach to screen for loss-of-function mutations that confer a fitness disadvantage under a specified growth condition. Our methodology was achieved by genetically monitoring a mutant library under two growth conditions, with and without interferon, by deep sequencing. We employed a molecular tagging technique to distinguish true mutations from sequencing error. This approach enabled us to identify mutations that were negatively selected against, in addition to those that were positively selected for. Using this technique, we identified loss-of-function mutations in the influenza A virus NS segment that were sensitive to type I interferon in a high-throughput fashion. Mechanistic characterization further showed that a single substitution, D92Y, resulted in the inability of NS to inhibit RIG-I ubiquitination. The approach described in this study can be applied under any specified condition for any virus that can be genetically manipulated. IMPORTANCE: Traditional genetics focuses on a single genotype-phenotype relationship, whereas high-throughput genetics permits phenotypic characterization of numerous mutants in parallel. High-throughput genetics often involves monitoring of a mutant library with deep sequencing. However, deep sequencing suffers from a high error rate (∼0.1 to 1%), which is usually higher than the occurrence frequency for individual point mutations within a mutant library. Therefore, only mutations that confer a fitness advantage can be identified with confidence due to an enrichment in the occurrence frequency. In contrast, it is impossible to identify deleterious mutations using most next-generation sequencing techniques. In this study, we have applied a molecular tagging technique to distinguish true mutations from sequencing errors. It enabled us to identify mutations that underwent negative selection, in addition to mutations that experienced positive selection. This study provides a proof of concept by screening for loss-of-function mutations on the influenza A virus NS segment that are involved in its anti-interferon activity. PMID:24965464
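The selection readout in deep-sequencing screens of this kind is commonly summarized as a relative fitness: the frequency of each mutation under the selective condition (here, interferon) divided by its frequency in the control passage. The sketch below illustrates that calculation on made-up, error-corrected counts; it is not the authors' pipeline.

```python
import math

def relative_fitness(counts_selected, counts_control, pseudocount=0.5):
    """Log2 ratio of mutation frequencies with vs. without selection.

    counts_* map mutation -> read count after error correction with molecular tags.
    A pseudocount avoids division by zero for mutations lost under selection.
    """
    total_sel = sum(counts_selected.values())
    total_ctl = sum(counts_control.values())
    fitness = {}
    for mut in counts_control:
        f_sel = (counts_selected.get(mut, 0) + pseudocount) / total_sel
        f_ctl = (counts_control[mut] + pseudocount) / total_ctl
        fitness[mut] = math.log2(f_sel / f_ctl)
    return fitness

# Made-up counts: D92Y is depleted when interferon is present.
control = {"WT": 50000, "D92Y": 900, "K41R": 850}
with_ifn = {"WT": 52000, "D92Y": 60, "K41R": 780}
for mut, w in relative_fitness(with_ifn, control).items():
    print(f"{mut}: log2 fitness {w:+.2f}")
```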
Revealing cancer subtypes with higher-order correlations applied to imaging and omics data.
Graim, Kiley; Liu, Tiffany Ting; Achrol, Achal S; Paull, Evan O; Newton, Yulia; Chang, Steven D; Harsh, Griffith R; Cordero, Sergio P; Rubin, Daniel L; Stuart, Joshua M
2017-03-31
Patient stratification to identify subtypes with different disease manifestations, severity, and expected survival time is a critical task in cancer diagnosis and treatment. While stratification approaches using various biomarkers (including high-throughput gene expression measurements) for patient-to-patient comparisons have been successful in elucidating previously unseen subtypes, there remains an untapped potential of incorporating various genotypic and phenotypic data to discover novel or improved groupings. Here, we present HOCUS, a unified analytical framework for patient stratification that uses a community detection technique to extract subtypes out of sparse patient measurements. HOCUS constructs a patient-to-patient network from similarities in the data and iteratively groups and reconstructs the network into higher order clusters. We investigate the merits of using higher-order correlations to cluster samples of cancer patients in terms of their associations with survival outcomes. In an initial test of the method, the approach identifies cancer subtypes in mutation data of glioblastoma, ovarian, breast, prostate, and bladder cancers. In several cases, HOCUS provides an improvement over using the molecular features directly to compare samples. Application of HOCUS to glioblastoma images reveals a size and location classification of tumors that improves over human expert-based stratification. Subtypes based on higher order features can reveal comparable or distinct groupings. The distinct solutions can provide biologically- and treatment-relevant solutions that are just as significant as solutions based on the original data.
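The general idea underlying this kind of stratification, building a patient-to-patient similarity network and extracting groups with a community detection step, can be illustrated in a few lines. The sketch below is a simplified, generic illustration (k-nearest-neighbour cosine similarity plus modularity-based communities) and is not the HOCUS algorithm; the synthetic "subtypes" are made up.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def patient_similarity_graph(features, k=3):
    """Build a k-nearest-neighbour patient network from a feature matrix.

    features: (n_patients x n_features) binary mutation calls or expression values.
    Edges connect each patient to its k most similar peers (cosine similarity).
    """
    x = np.asarray(features, dtype=float)
    norms = np.linalg.norm(x, axis=1, keepdims=True) + 1e-12
    sim = (x / norms) @ (x / norms).T
    np.fill_diagonal(sim, -np.inf)  # exclude self-similarity
    g = nx.Graph()
    g.add_nodes_from(range(len(x)))
    for i, row in enumerate(sim):
        for j in np.argsort(row)[-k:]:
            g.add_edge(i, int(j), weight=float(row[j]))
    return g

rng = np.random.default_rng(1)
# Two synthetic "subtypes" with different mutation profiles.
group_a = rng.binomial(1, [0.8, 0.7, 0.1, 0.1, 0.1], size=(10, 5))
group_b = rng.binomial(1, [0.1, 0.1, 0.8, 0.7, 0.2], size=(10, 5))
g = patient_similarity_graph(np.vstack([group_a, group_b]))
communities = greedy_modularity_communities(g, weight="weight")
print([sorted(c) for c in communities])
```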
NASA Technical Reports Server (NTRS)
Lee, Paul U.; Smith, Nancy M.; Bienert, Nancy; Brasil, Connie; Buckley, Nathan; Chevalley, Eric; Homola, Jeffrey; Omar, Faisal; Parke, Bonny; Yoo, Hyo-Sang
2016-01-01
LaGuardia (LGA) departure delay was identified by the stakeholders and subject matter experts as a significant bottleneck in the New York metropolitan area. Departure delay at LGA is primarily due to dependency between LGA's arrival and departure runways: LGA departures cannot begin takeoff until arrivals have cleared the runway intersection. If one-in one-out operations are not maintained and a significant arrival-to-departure imbalance occurs, the departure backup can persist through the rest of the day. At NASA Ames Research Center, a solution called "Departure-sensitive Arrival Spacing" (DSAS) was developed to maximize the departure throughput without creating significant delays in the arrival traffic. The concept leverages Terminal Sequencing and Spacing (TSS) operations that create and manage the arrival schedule to the runway threshold and adds an interface enhancement to the traffic manager's timeline to provide the ability to manually adjust inter-arrival spacing to build precise gaps for multiple departures between arrivals. A more complete solution would include a TSS algorithm enhancement that could automatically build these multi-departure gaps. With this set of capabilities, inter-arrival spacing could be controlled for optimal departure throughput. The concept was prototyped in a human-in-the-loop (HITL) simulation environment so that operational requirements such as coordination procedures, timing and magnitude of TSS schedule adjustments, and display features for Tower, TRACON and Traffic Management Unit could be determined. A HITL simulation was conducted in August 2014 to evaluate the concept in terms of feasibility, controller workload impact, and potential benefits. Three conditions were tested: a Baseline condition without scheduling, a TSS condition that schedules the arrivals to the runway threshold, and a TSS+DSAS condition that adjusts the arrival schedule to maximize departure throughput. The results showed that during high arrival demand periods, departure throughput could be incrementally increased under the TSS and TSS+DSAS conditions without compromising the arrival throughput. The concept, operational procedures, and summary results were originally published in ATM2015, but detailed results were omitted. This paper expands on the earlier paper to provide detailed results on throughput, conformance, safety, flight time/distance, etc., offering extra insight into the feasibility and the potential benefits of the concept.
Web-based visual analysis for high-throughput genomics
2013-01-01
Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
NASA Astrophysics Data System (ADS)
Li, Wei; Huang, Zhitong; Li, Haoyue; Ji, Yuefeng
2018-04-01
Visible light communication (VLC) is a promising candidate for short-range broadband access due to its integration of advantages for both optical communication and wireless communication, whereas multi-user access is a key problem because of intra-cell and inter-cell interference. In addition, the non-flat channel effect results in higher losses for users in high frequency bands, which leads to unfair service quality among users. To solve those issues, we propose a power adaptive multi-filter carrierless amplitude and phase access (PA-MF-CAPA) scheme; in the first step of this scheme, the MF-CAPA scheme utilizing multiple filters as different CAP dimensions is used to realize multi-user access. The character of orthogonality among the filters in different dimensions can mitigate the effect of intra-cell and inter-cell interference. Moreover, the MF-CAPA scheme provides different channels modulated on the same frequency bands, which further increases the transmission rate. Then, the power adaptive procedure based on the MF-CAPA scheme is presented to realize quality fairness. As demonstrated in our experiments, the MF-CAPA scheme yields an improved throughput compared with the multi-band CAP access scheme, and the PA-MF-CAPA scheme enhances the quality fairness and further improves the throughput compared with the MF-CAPA scheme.
Li, Bing; Ju, Feng; Cai, Lin; Zhang, Tong
2015-09-01
The broad-spectrum profile of bacterial pathogens and their fate in sewage treatment plants (STPs) were investigated using a high-throughput sequencing based metagenomic approach. This novel approach could provide a unified platform to standardize bacterial pathogen detection and enable direct comparison among different samples. In total, 113 bacterial pathogen species were detected in eight samples including influent, effluent, activated sludge (AS), biofilm, and anaerobic digestion sludge, with abundances ranging from 0.000095% to 4.89%. Among these 113 bacterial pathogens, 79 species were reported in STPs for the first time. Specifically, compared to AS in bulk mixed liquor, more pathogen species and higher total abundance were detected in the upper foaming layer of AS. This suggests that the foaming layer of AS might pose a greater threat to onsite workers and citizens in the surrounding areas of STPs because pathogens in the foaming layer are easily transferred into air and can cause infections. The high removal efficiency (98.0%) of total bacterial pathogens suggests that the AS treatment process is effective in removing most bacterial pathogens. Remarkable similarities of bacterial pathogen compositions between influent and the human gut indicated that bacterial pathogen profiles in influents could well reflect the average bacterial pathogen communities of urban resident guts within the STP catchment area.
Deep Space Optical Link ARQ Performance Analysis
NASA Technical Reports Server (NTRS)
Clare, Loren; Miles, Gregory
2016-01-01
Substantial advancements have been made toward the use of optical communications for deep space exploration missions, promising a much higher volume of data to be communicated in comparison with present-day Radio Frequency (RF) based systems. One or more ground-based optical terminals are assumed to communicate with the spacecraft. Both short-term and long-term link outages will arise due to weather at the ground station(s), space platform pointing stability, and other effects. To mitigate these outages, an Automatic Repeat Query (ARQ) retransmission method is assumed, together with a reliable back channel for acknowledgement traffic. Specifically, the Licklider Transmission Protocol (LTP) is used, which is a component of the Disruption-Tolerant Networking (DTN) protocol suite that is well suited for high bandwidth-delay product links subject to disruptions. We provide an analysis of envisioned deep space mission scenarios and quantify buffering, latency and throughput performance, using a simulation in which long-term weather effects are modeled with a Gilbert-Elliott Markov chain, short-term outages occur as a Bernoulli process, and scheduled outages arising from geometric visibility or operational constraints are represented. We find that both short- and long-term effects impact throughput, but long-term weather effects dominate buffer sizing and overflow losses as well as latency performance.
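The outage model described above, a two-state Gilbert-Elliott chain for long-term weather availability combined with a Bernoulli process for short outages, is straightforward to simulate. The sketch below is a generic illustration with made-up transition and outage probabilities; it is not the mission simulation itself.

```python
import random

def simulate_link(n_slots, p_good_to_bad=0.01, p_bad_to_good=0.05,
                  p_short_outage=0.02, seed=42):
    """Per-slot link availability under a Gilbert-Elliott weather model.

    The chain alternates between a GOOD and a BAD state (long-term weather);
    a slot in the GOOD state can still be lost to an independent short outage.
    Transition and outage probabilities here are illustrative assumptions.
    """
    rng = random.Random(seed)
    state_good = True
    available = []
    for _ in range(n_slots):
        if state_good and rng.random() < p_good_to_bad:
            state_good = False
        elif not state_good and rng.random() < p_bad_to_good:
            state_good = True
        available.append(state_good and rng.random() >= p_short_outage)
    return available

slots = simulate_link(100_000)
print(f"Link availability: {sum(slots) / len(slots):.1%}")
```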
2018-01-01
Effect-directed analysis (EDA) is a commonly used approach for effect-based identification of endocrine disruptive chemicals in complex (environmental) mixtures. However, for routine toxicity assessment of, for example, water samples, current EDA approaches are considered time-consuming and laborious. We achieved faster EDA and identification by downscaling of sensitive cell-based hormone reporter gene assays and increasing fractionation resolution to allow testing of smaller fractions with reduced complexity. The high-resolution EDA approach is demonstrated by analysis of four environmental passive sampler extracts. Downscaling of the assays to a 384-well format allowed analysis of 64 fractions in triplicate (or 192 fractions without technical replicates) without affecting sensitivity compared to the standard 96-well format. Through a parallel exposure method, agonistic and antagonistic androgen and estrogen receptor activity could be measured in a single experiment following a single fractionation. From 16 selected candidate compounds, identified through nontargeted analysis, 13 could be confirmed chemically and 10 were found to be biologically active, of which the most potent nonsteroidal estrogens were identified as oxybenzone and piperine. The increased fractionation resolution and the higher throughput that downscaling provides allow for future application in routine high-resolution screening of large numbers of samples in order to accelerate identification of (emerging) endocrine disruptors. PMID:29547277
From astronomy and telecommunications to biomedicine
NASA Astrophysics Data System (ADS)
Behr, Bradford B.; Baker, Scott A.; Bismilla, Yusuf; Cenko, Andrew T.; DesRoches, Brandon; Hajian, Arsen R.; Meade, Jeffrey T.; Nitkowski, Arthur; Preston, Kyle J.; Schmidt, Bradley S.; Sherwood-Droz, Nicolás; Slaa, Jared
2015-03-01
Photonics is an inherently interdisciplinary endeavor, as technologies and techniques invented or developed in one scientific field are often found to be applicable to other fields or disciplines. We present two case studies in which optical spectroscopy technologies originating from stellar astrophysics and optical telecommunications multiplexing have been successfully adapted for biomedical applications. The first case involves a design concept called the High Throughput Virtual Slit, or HTVS, which provides high spectral resolution without the throughput inefficiency typically associated with a narrow spectrometer slit. HTVS-enhanced spectrometers have been found to significantly improve the sensitivity and speed of fiber-fed Raman analysis systems, and the method is now being adapted for hyperspectral imaging for medical and biological sensing. The second example of technology transfer into biomedicine centers on integrated optics, in which optical waveguides are fabricated on to silicon substrates in a substantially similar fashion as integrated circuits in computer chips. We describe an architecture referred to as OCTANE which implements a small and robust "spectrometer-on-a-chip" which is optimized for optical coherence tomography (OCT). OCTANE-based OCT systems deliver three-dimensional imaging resolution at the micron scale with greater stability and lower cost than equivalent conventional OCT approaches. Both HTVS and OCTANE enable higher precision and improved reliability under environmental conditions that are typically found in a clinical or laboratory setting.
Wavelength Scanning with a Tilting Interference Filter for Glow-Discharge Elemental Imaging.
Storey, Andrew P; Ray, Steven J; Hoffmann, Volker; Voronov, Maxim; Engelhard, Carsten; Buscher, Wolfgang; Hieftje, Gary M
2017-06-01
Glow discharges have long been used for depth profiling and bulk analysis of solid samples. In addition, over the past decade, several methods of obtaining lateral surface elemental distributions have been introduced, each with its own strengths and weaknesses. Challenges for each of these techniques are acceptable optical throughput and added instrumental complexity. Here, these problems are addressed with a tilting-filter instrument. A pulsed glow discharge is coupled to an optical system comprising an adjustable-angle tilting filter, collimating and imaging lenses, and a gated, intensified charge-coupled device (CCD) camera, which together provide surface elemental mapping of solid samples. The tilting-filter spectrometer is instrumentally simpler, produces less image distortion, and achieves higher optical throughput than a monochromator-based instrument, but has a much more limited tunable spectral range and poorer spectral resolution. As a result, the tilting-filter spectrometer is limited to single-element or two-element determinations, and only when the target spectral lines fall within an appropriate spectral range and can be spectrally discerned. Spectral interferences that result from heterogeneous impurities can be flagged and overcome by observing the spatially resolved signal response across the available tunable spectral range. The instrument has been characterized and evaluated for the spatially resolved analysis of glow-discharge emission from selected but representative samples.
NASA Astrophysics Data System (ADS)
Tsai, H. Y.; Gao, B. Z.; Yang, S. F.; Li, C. S.; Fuh, C. Bor
2014-01-01
This paper presents the use of fluorescent biofunctional nanoparticles (10-30 nm) to detect alpha-fetoprotein (AFP) in a thin-channel magnetic immunoassay. We used an AFP model biomarker and s-shaped deposition zones to test the proposed detection method. The results show that detection using fluorescent biofunctional nanoparticles has a higher throughput than that of the functional microparticles used in previous experiments on affinity reactions. The proposed method takes about 3 min (versus 150 min for the previous method) to detect 100 samples. The proposed method is useful for screening biomarkers in clinical applications, and can reduce the run time for sandwich immunoassays to less than 20 min. The detection limits (0.06 pg/ml) and linear ranges (0.068 pg/ml-0.68 ng/ml) of AFP using fluorescent biofunctional nanoparticles are the same as those using functional microparticles within experimental errors. This detection limit is substantially lower and the linear range is considerably wider than those of enzyme-linked immunosorbent assay (ELISA) and other sandwich immunoassay methods. The differences between this method and an ELISA in AFP measurements of serum samples were less than 12%. The proposed method provides simple, fast, and sensitive detection with a high throughput for biomarkers.
The Impact of the Condenser on Cytogenetic Image Quality in Digital Microscope System
Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong
2013-01-01
Background: Optimizing operational parameters of the digital microscope system is an important technique to acquire high quality cytogenetic images and facilitate the process of karyotyping so that the efficiency and accuracy of diagnosis can be improved. Objective: This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Methods: Both theoretical analysis and experimental validations through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens were conducted. Results: The results show that the optimal image quality and large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set as 60%–70% of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and less restriction for the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Conclusions: Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high throughput continuous scanning microscopes in clinical practice. PMID:23676284
A novel LTE scheduling algorithm for green technology in smart grid.
Hindia, Mohammad Nour; Reza, Ahmed Wasif; Noordin, Kamarul Ariffin; Chayon, Muhammad Hasibur Rashid
2015-01-01
Smart grid (SG) application is being used nowadays to meet the demand of increasing power consumption. SG application is considered as a perfect solution for combining renewable energy resources and electrical grid by means of creating a bidirectional communication channel between the two systems. In this paper, three SG applications applicable to renewable energy system, namely, distribution automation (DA), distributed energy system-storage (DER) and electrical vehicle (EV), are investigated in order to study their suitability in Long Term Evolution (LTE) network. To compensate the weakness in the existing scheduling algorithms, a novel bandwidth estimation and allocation technique and a new scheduling algorithm are proposed. The technique allocates available network resources based on application's priority, whereas the algorithm makes scheduling decision based on dynamic weighting factors of multi-criteria to satisfy the demands (delay, past average throughput and instantaneous transmission rate) of quality of service. Finally, the simulation results demonstrate that the proposed mechanism achieves higher throughput, lower delay and lower packet loss rate for DA and DER as well as provide a degree of service for EV. In terms of fairness, the proposed algorithm shows 3%, 7 % and 9% better performance compared to exponential rule (EXP-Rule), modified-largest weighted delay first (M-LWDF) and exponential/PF (EXP/PF), respectively.
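Schedulers in this family (M-LWDF, EXP/PF, and the proposed algorithm) typically rank users each transmission interval by a metric that combines the instantaneous achievable rate, past average throughput, and head-of-line delay. The snippet below illustrates one such weighted metric with made-up weights and traffic values; it is not the authors' algorithm.

```python
def scheduling_metric(inst_rate, avg_throughput, hol_delay, max_delay,
                      w_rate=1.0, w_fair=1.0, w_delay=1.0):
    """Rank value for one user in the current scheduling interval.

    inst_rate      : instantaneous achievable rate (bit/s) from channel feedback
    avg_throughput : exponentially smoothed past throughput (bit/s)
    hol_delay      : head-of-line packet delay (s)
    max_delay      : delay budget of the application class (s)
    Weights are illustrative, not the values used in the paper.
    """
    proportional_fair = (inst_rate / max(avg_throughput, 1.0)) ** w_fair
    urgency = (hol_delay / max_delay) ** w_delay
    return w_rate * proportional_fair * (1.0 + urgency)

# Made-up traffic snapshot for the three smart-grid application classes.
users = {
    "DA_alarm":   scheduling_metric(2e6, 1e6, 0.08, 0.10),  # near its delay budget
    "DER_report": scheduling_metric(4e6, 3e6, 0.02, 0.30),
    "EV_update":  scheduling_metric(6e6, 5e6, 0.01, 1.00),
}
print(max(users, key=users.get))  # user scheduled first this interval
```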
A Novel LTE Scheduling Algorithm for Green Technology in Smart Grid
Hindia, Mohammad Nour; Reza, Ahmed Wasif; Noordin, Kamarul Ariffin; Chayon, Muhammad Hasibur Rashid
2015-01-01
Smart grid (SG) application is being used nowadays to meet the demand of increasing power consumption. SG application is considered as a perfect solution for combining renewable energy resources and electrical grid by means of creating a bidirectional communication channel between the two systems. In this paper, three SG applications applicable to renewable energy system, namely, distribution automation (DA), distributed energy system-storage (DER) and electrical vehicle (EV), are investigated in order to study their suitability in Long Term Evolution (LTE) network. To compensate the weakness in the existing scheduling algorithms, a novel bandwidth estimation and allocation technique and a new scheduling algorithm are proposed. The technique allocates available network resources based on application’s priority, whereas the algorithm makes scheduling decision based on dynamic weighting factors of multi-criteria to satisfy the demands (delay, past average throughput and instantaneous transmission rate) of quality of service. Finally, the simulation results demonstrate that the proposed mechanism achieves higher throughput, lower delay and lower packet loss rate for DA and DER as well as provide a degree of service for EV. In terms of fairness, the proposed algorithm shows 3%, 7 % and 9% better performance compared to exponential rule (EXP-Rule), modified-largest weighted delay first (M-LWDF) and exponential/PF (EXP/PF), respectively. PMID:25830703
The stabilisation of purified, reconstituted P-glycoprotein by freeze drying with disaccharides.
Heikal, Adam; Box, Karl; Rothnie, Alice; Storm, Janet; Callaghan, Richard; Allen, Marcus
2009-02-01
The drug efflux pump P-glycoprotein (P-gp) (ABCB1) confers multidrug resistance, a major cause of failure in the chemotherapy of tumours, exacerbated by a shortage of potent and selective inhibitors. A high throughput assay using purified P-gp to screen and characterise potential inhibitors would greatly accelerate their development. However, long-term stability of purified reconstituted ABCB1 can only be reliably achieved with storage at -80 degrees C. For example, at 20 degrees C, the activity of ABCB1 was abrogated with a half-life of <1 day. The aim of this investigation was to stabilise purified, reconstituted ABCB1 to enable storage at higher temperatures and thereby enable design of a high throughput assay system. The ABCB1 purification procedure was optimised to allow successful freeze drying by substitution of glycerol with the disaccharides trehalose or maltose. Addition of disaccharides resulted in ATPase activity being retained immediately following lyophilisation with no significant difference between the two disaccharides. However, during storage trehalose preserved ATPase activity for several months regardless of the temperature (e.g. 60% retention at 150 days), whereas ATPase activity in maltose purified P-gp was affected by both storage time and temperature. The data provide an effective mechanism for the production of resilient purified, reconstituted ABCB1.
Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang
2017-04-01
Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible methodologies compared with the next-generation sequencing or ChIP-based methods. However, the PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by the multiplex primer interactions. The detection throughput cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which showed that ME-qPCR sensitivity is higher than the original qPCR. The absolute limit of detection for ME-qPCR could achieve levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract For the first-step amplification, four primers (A, B, C, and D) have been added into the reaction volume. In this manner, four kinds of amplicons have been generated. All of these four amplicons could be regarded as the target of second-step PCR. For the second-step amplification, three parallels have been taken for the final evaluation. After the second evaluation, the final amplification curves and melting curves have been achieved.
High Throughput Determination of Critical Human Dosing ...
High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data into predicted human equivalent doses that can be linked with biologically relevant exposure scenarios. Thus, HTTK provides essential data for risk prioritization for thousands of chemicals that lack TK data. One critical HTTK parameter that can be measured in vitro is the unbound fraction of a chemical in plasma (Fub). However, for chemicals that bind strongly to plasma, Fub is below the limits of detection (LOD) for high throughput analytical chemistry, and therefore cannot be quantified. A novel method for quantifying Fub was implemented for 85 strategically selected chemicals: measurement of Fub was attempted at 10%, 30%, and 100% of physiological plasma concentrations using rapid equilibrium dialysis assays. Varying plasma concentrations instead of chemical concentrations makes high throughput analytical methodology more likely to be successful. Assays at 100% plasma concentration were unsuccessful for 34 chemicals. For 12 of these 34 chemicals, Fub could be quantified at 10% and/or 30% plasma concentrations; these results imply that the assay failure at 100% plasma concentration was caused by plasma protein binding for these chemicals. Assay failure for the remaining 22 chemicals may
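When binding is measured in diluted plasma, as described above, the apparent unbound fraction must be corrected back to the undiluted case. A commonly used correction, not stated in this abstract, is fu = (1/D) / ((1/fu,apparent − 1) + 1/D), where D is the dilution factor; the sketch and example values below are illustrative only.

```python
def undiluted_fub(fub_apparent, dilution_factor):
    """Correct a fraction unbound measured in diluted plasma back to 100% plasma.

    fub_apparent    : unbound fraction measured in the diluted plasma
    dilution_factor : e.g. 10 for an assay run at 10% physiological plasma
    This is the common dilution adjustment; the cited study may differ in detail.
    """
    d = float(dilution_factor)
    return (1.0 / d) / ((1.0 / fub_apparent - 1.0) + 1.0 / d)

# Illustrative values: a highly bound chemical measurable only at 10% plasma.
fu_at_10pct = 0.05  # apparent unbound fraction in 10% plasma
print(f"Estimated Fub in undiluted plasma: {undiluted_fub(fu_at_10pct, 10):.4f}")
```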
High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.
Liu, Guangbo; Lanham, Clayton; Buchan, J Ross; Kaplan, Matthew E
2017-01-01
Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium Acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long established method. However, a reliable and optimized high throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high throughput molecular and/or genetic analysis of yeast.
Multi-shot PROPELLER for high-field preclinical MRI
Pandit, Prachi; Qi, Yi; Story, Jennifer; King, Kevin F.; Johnson, G. Allan
2012-01-01
With the development of numerous mouse models of cancer, there is a tremendous need for an appropriate imaging technique to study the disease evolution. High-field T2-weighted imaging using PROPELLER MRI meets this need. The 2-shot PROPELLER technique presented here provides (a) high spatial resolution, (b) high contrast resolution, and (c) rapid and non-invasive imaging, which enables high-throughput, longitudinal studies in free-breathing mice. Unique data collection and reconstruction makes this method robust against motion artifacts. The 2-shot modification introduced here retains more high-frequency information and provides higher SNR than conventional single-shot PROPELLER, making this sequence feasible at high fields, where signal loss is rapid. Results are shown in a liver metastases model to demonstrate the utility of this technique in one of the more challenging regions of the mouse, which is the abdomen. PMID:20572138
Multishot PROPELLER for high-field preclinical MRI.
Pandit, Prachi; Qi, Yi; Story, Jennifer; King, Kevin F; Johnson, G Allan
2010-07-01
With the development of numerous mouse models of cancer, there is a tremendous need for an appropriate imaging technique to study the disease evolution. High-field T(2)-weighted imaging using PROPELLER (Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction) MRI meets this need. The two-shot PROPELLER technique presented here provides (a) high spatial resolution, (b) high contrast resolution, and (c) rapid and noninvasive imaging, which enables high-throughput, longitudinal studies in free-breathing mice. Unique data collection and reconstruction makes this method robust against motion artifacts. The two-shot modification introduced here retains more high-frequency information and provides higher signal-to-noise ratio than conventional single-shot PROPELLER, making this sequence feasible at high fields, where signal loss is rapid. Results are shown in a liver metastases model to demonstrate the utility of this technique in one of the more challenging regions of the mouse, which is the abdomen.
Kueseng, Pamornrat; Pawliszyn, Janusz
2013-11-22
A new thin-film, carboxylated multiwalled carbon nanotubes/polydimethylsiloxane (MWCNTs-COOH/PDMS) coating was developed for a 96-blade solid-phase microextraction (SPME) system followed by high performance liquid chromatography with ultraviolet detection (HPLC-UV). The method provided good extraction efficiency (64-90%) for three spiked levels, with relative standard deviations (RSD) ≤ 6%, and detection limits between 1 and 2 μg/L for three phenolic compounds. The MWCNTs-COOH/PDMS 96-blade SPME system presents advantages over traditional methods due to its simplicity of use, easy coating preparation, low cost and high sample throughput (2.1 min per sample). The developed coating is reusable for a minimum of 110 extractions with good extraction efficiency. The coating provided higher extraction efficiency (3-8 times greater) than pure PDMS coatings.
A scalable silicon photonic chip-scale optical switch for high performance computing systems.
Yu, Runxiang; Cheung, Stanley; Li, Yuliang; Okamoto, Katsunari; Proietti, Roberto; Yin, Yawei; Yoo, S J B
2013-12-30
This paper discusses the architecture and provides performance studies of a silicon photonic chip-scale optical switch for scalable interconnect network in high performance computing systems. The proposed switch exploits optical wavelength parallelism and wavelength routing characteristics of an Arrayed Waveguide Grating Router (AWGR) to allow contention resolution in the wavelength domain. Simulation results from a cycle-accurate network simulator indicate that, even with only two transmitter/receiver pairs per node, the switch exhibits lower end-to-end latency and higher throughput at high (>90%) input loads compared with electronic switches. On the device integration level, we propose to integrate all the components (ring modulators, photodetectors and AWGR) on a CMOS-compatible silicon photonic platform to ensure a compact, energy efficient and cost-effective device. We successfully demonstrate proof-of-concept routing functions on an 8 × 8 prototype fabricated using foundry services provided by OpSIS-IME.
NASA Astrophysics Data System (ADS)
Aldhaibani, Jaafar A.; Ahmad, R. B.; Yahya, A.; Azeez, Suzan A.
2015-05-01
Wireless multi-hop relay networks have become very important technologies in mobile communications. These networks ensure high throughput and coverage extension at low cost. The poor capacity at cell edges is not enough to meet the growing demand for high capacity and throughput irrespective of the user's placement in the cellular network. In this paper we propose an optimal placement of the relay node that provides the maximum achievable rate at users and enhances the throughput and coverage in the cell edge region. The proposed scheme is based on the outage probability at users and takes into account the interference between nodes. Numerical analyses along with simulation results indicate an improvement of approximately 40% in capacity for users at the cell edge relative to the overall cell capacity.
Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping
2016-09-01
There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.
Istepanian, R S H; Philip, N
2005-01-01
In this paper we describe some of the optimisation issues relevant to the requirements of high throughput of medical data and video streaming traffic in 3G wireless environments. In particular we present a challenging 3G mobile health care application that requires a demanding 3G medical data throughput. We also describe the 3G QoS requirements of the mObile Tele-Echography ultra-Light rObot system (OTELO), which is designed to provide seamless 3G connectivity for real-time ultrasound medical video streams and diagnosis from a remote site (robotic and patient station) manipulated by an expert side (specialists) that is controlling the robotic scanning operation and presenting a real-time feedback diagnosis using 3G wireless communication links.
High-density plasma deposition manufacturing productivity improvement
NASA Astrophysics Data System (ADS)
Olmer, Leonard J.; Hudson, Chris P.
1999-09-01
High Density Plasma (HDP) deposition provides a means to deposit high quality dielectrics meeting submicron gap fill requirements. But, compared to traditional PECVD processing, HDP is relatively expensive due to the higher capital cost of the equipment. In order to keep processing costs low, it became necessary to maximize the wafer throughput of HDP processing without degrading the film properties. The approach taken was to optimize the post-deposition microwave in-situ clean efficiency. A regression model, based on actual data, indicated that the number of wafers processed before a chamber clean was the dominant factor. Furthermore, a design change in the ceramic hardware surrounding the electrostatic chuck provided thermal isolation, resulting in an enhanced clean rate of the chamber process kit. An infra-red detector located in the chamber exhaust line provided a means to endpoint the clean, and in-film particle data confirmed the infra-red results. The combination of increased chamber clean frequency, optimized clean time, and improved process-kit hardware raised wafer throughput without degrading the film properties.
A Precision Metrology System for the Hubble Space Telescope Wide Field Camera 3 Instrument
NASA Technical Reports Server (NTRS)
Toland, Ronald W.
2003-01-01
The Wide Field Camera 3 (WFC3) instrument for the Hubble Space Telescope (HST) will replace the current Wide Field and Planetary Camera 2 (WFPC2). By providing higher throughput and sensitivity than WFPC2, and operating from the near-IR to the near-UV, WFC3 will once again bring the performance of HST above that from ground-based observatories. Crucial to the integration of the WFC3 optical bench is a pair of 2-axis cathetometers used to view targets which cannot be seen by other means when the bench is loaded into its enclosure. The setup and calibration of these cathetometers is described, along with results from a comparison of the cathetometer system with other metrology techniques.
Baculovirus expression system and method for high throughput expression of genetic material
Clark, Robin; Davies, Anthony
2001-01-01
The present invention provides novel recombinant baculovirus expression systems for expressing foreign genetic material in a host cell. Such expression systems are readily adapted to an automated method for expressing foreign genetic material in a high throughput manner. In other aspects, the present invention features a novel automated method for determining the function of foreign genetic material by transfecting the same into a host by way of the recombinant baculovirus expression systems according to the present invention.
High throughput protein production screening
Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA
2009-09-08
Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods, and for detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages, and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.
Morgan, Martin; Anders, Simon; Lawrence, Michael; Aboyoun, Patrick; Pagès, Hervé; Gentleman, Robert
2009-01-01
Summary: ShortRead is a package for input, quality assessment, manipulation and output of high-throughput sequencing data. ShortRead is provided in the R and Bioconductor environments, allowing ready access to additional facilities for advanced statistical analysis, data transformation, visualization and integration with diverse genomic resources. Availability and Implementation: This package is implemented in R and available at the Bioconductor web site; the package contains a ‘vignette’ outlining typical work flows. Contact: mtmorgan@fhcrc.org PMID:19654119
Dashboard visualizations: Supporting real-time throughput decision-making.
Franklin, Amy; Gantela, Swaroop; Shifarraw, Salsawit; Johnson, Todd R; Robinson, David J; King, Brent R; Mehta, Amit M; Maddow, Charles L; Hoot, Nathan R; Nguyen, Vickie; Rubio, Adriana; Zhang, Jiajie; Okafor, Nnaemeka G
2017-07-01
Providing timely and effective care in the emergency department (ED) requires the management of individual patients as well as the flow and demands of the entire department. Strategic changes to work processes, such as adding a flow coordination nurse or a physician in triage, have demonstrated improvements in throughput times. However, such global strategic changes do not address the real-time, often opportunistic workflow decisions of individual clinicians in the ED. We believe that real-time representation of the status of the entire emergency department and each patient within it through information visualizations will better support clinical decision-making in-the-moment and provide for rapid intervention to improve ED flow. This notion is based on previous work where we found that clinicians' workflow decisions were often based on an in-the-moment local perspective, rather than a global perspective. Here, we discuss the challenges of designing and implementing visualizations for ED through a discussion of the development of our prototype Throughput Dashboard and the potential it holds for supporting real-time decision-making. Copyright © 2017. Published by Elsevier Inc.
High-throughput and reliable protocols for animal microRNA library cloning.
Xiao, Caide
2011-01-01
MicroRNAs are short single-stranded RNA molecules (18-25 nucleotides). Because of their ability to silence gene expression, they can be used to diagnose and treat tumors. Experimental construction of microRNA libraries is the most important step in identifying microRNAs from animal tissues. Although many commercial kits with specialized protocols exist for constructing microRNA libraries, this chapter provides reliable, high-throughput, and affordable protocols for microRNA library construction. The high-throughput capability of our protocols comes from a dual-concentration (3% and 15%, 1.5 mm thick) polyacrylamide gel electrophoresis (PAGE), which can directly extract microRNA-sized RNAs from up to 400 μg of total RNA (enough for two microRNA libraries). The reliability of our protocols is assured by a third PAGE, which selects PCR products of microRNA-sized RNAs ligated with 5' and 3' linkers using a miRCat™ kit. A MathCAD program is also provided to automatically search for short RNAs inserted between the 5' and 3' linkers in thousands of sequencing text files.
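The linker-based insert search that the protocol automates can be pictured with a short script. The sketch below is only a schematic of that step: the linker sequences, read data, and length window are illustrative assumptions, not the published MathCAD routine.

```python
import re

LINKER5 = "CTACGTACGT"   # assumed 5' linker sequence (placeholder)
LINKER3 = "AGGTCAGGTC"   # assumed 3' linker sequence (placeholder)
PATTERN = re.compile(f"{LINKER5}([ACGTN]{{18,25}}){LINKER3}")

def find_inserts(reads):
    """Return candidate 18-25 nt inserts flanked by both linkers."""
    hits = []
    for read in reads:
        match = PATTERN.search(read)
        if match:
            hits.append(match.group(1))
    return hits

reads = ["NNNCTACGTACGTTCACAGTGGCTAAGTTCTGCAGGTCAGGTCNNN"]
print(find_inserts(reads))   # ['TCACAGTGGCTAAGTTCTGC']
```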
Experimental and Study Design Considerations for Uncovering Oncometabolites.
Haznadar, Majda; Mathé, Ewy A
2017-01-01
Metabolomics as a field has gained attention due to its potential for biomarker discovery, in part because it directly reflects disease phenotype and is the downstream effect of posttranslational modifications. The field provides a "top-down," integrated view of biochemistry in complex organisms, as opposed to the traditional "bottom-up" approach that aims to analyze networks of interactions between genes, proteins, and metabolites. It also allows for the detection of thousands of endogenous metabolites in various clinical biospecimens in a high-throughput manner, including tissue and biofluids such as blood and urine. Of note, because biological fluid samples can be collected relatively easily, the time-dependent fluctuations of metabolites can be readily studied in detail. In this chapter, we aim to provide an overview of (1) analytical methods that are currently employed in the field, and (2) study design concepts that should be considered prior to conducting high-throughput metabolomics studies. While widely applicable, the concepts presented here apply mainly to high-throughput untargeted studies that aim to identify metabolite biomarkers associated with a particular human disease.
Ethoscopes: An open platform for high-throughput ethomics
Geissmann, Quentin; Garcia Rodriguez, Luis; Beckwith, Esteban J.; French, Alice S.; Jamasb, Arian R.
2017-01-01
Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope. PMID:29049280
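To make the feedback-loop mode concrete, the toy sketch below polls a per-fly movement score and triggers a stimulus after sustained quiescence. The threshold, frame counts, and dummy tracker are illustrative assumptions; the actual ethoscope software (available at the URL above) implements this differently.

```python
import random

ACTIVITY_THRESHOLD = 0.5    # assumed movement score below which a fly counts as quiescent
QUIET_FRAMES = 10           # assumed number of consecutive quiet frames before stimulating

def dummy_tracker(n_flies=4):
    """Stand-in for the real-time video tracker: one movement score per fly."""
    return [random.random() for _ in range(n_flies)]

def run_feedback(n_frames=100, n_flies=4):
    quiet = [0] * n_flies
    for frame in range(n_frames):
        for fly, score in enumerate(dummy_tracker(n_flies)):
            quiet[fly] = quiet[fly] + 1 if score < ACTIVITY_THRESHOLD else 0
            if quiet[fly] >= QUIET_FRAMES:
                print(f"frame {frame}: stimulus -> fly {fly}")  # would drive Arduino/GPIO hardware
                quiet[fly] = 0

run_feedback()
```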
Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif
2008-03-01
High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services: the first level provides tools for extracting spatiotemporal knowledge from image sets, and the second level provides high-level knowledge management and reasoning services. We then present the cellular imaging markup language, an extensible markup language (XML)-based language for modeling biological images and representing spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
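As an illustration of what an XML description of a spatiotemporal imaging event might look like, the sketch below builds a small cell-division event with Python's standard library. The element and attribute names are assumptions chosen for illustration only, not the schema presented in the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical markup for a division event observed between two frames.
event = ET.Element("CellEvent", type="division")
ET.SubElement(event, "Cell", id="cell-42", frame="17", x="128", y="64")    # parent cell
ET.SubElement(event, "Cell", id="cell-42a", frame="18", x="120", y="60")   # daughter 1
ET.SubElement(event, "Cell", id="cell-42b", frame="18", x="136", y="68")   # daughter 2
ET.SubElement(event, "Time", start="t17", end="t18")

print(ET.tostring(event, encoding="unicode"))
```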
Jun, Young Jin; Park, Sung Hyeon; Woo, Seong Ihl
2014-12-08
A combinatorial high-throughput optical screening method was developed to find the optimum composition of highly active Pd-based catalysts for the cathode of the hybrid Li-air battery. Pd alone, at one-third the cost of Pt, has difficulty replacing Pt; therefore, the integration of other metals was investigated to improve its performance toward the oxygen reduction reaction (ORR). Among the binary Pd-based catalysts, the Pd-Ir derived catalysts showed higher ORR performance than the other Pd-based binary combinations. The composition of 88:12 at. % (Pd:Ir) showed the highest ORR activity at the cathode of the hybrid Li-air battery. The prepared Pd(88)Ir(12)/C catalyst showed a current density of -2.58 mA cm(-2) at 0.8 V (vs RHE), around 30% higher than that of Pd/C (-1.97 mA cm(-2)). When the prepared Pd(88)Ir(12)/C catalyst was applied to the hybrid Li-air battery, the polarization of the cell was reduced and the energy efficiency of the cell was about 30% higher than that of the cell with Pd/C.
High throughput DNA damage quantification of human tissue with home-based collection device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costes, Sylvain V.; Tang, Jonathan; Yannone, Steven M.
Kits, methods, and systems for providing a subject with information regarding the state of the subject's DNA damage. Collection, processing, and analysis of samples are also described.
High Count-Rate Study of Two TES X-Ray Microcalorimeters With Different Transition Temperatures
NASA Technical Reports Server (NTRS)
Lee, Sang-Jun; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.;
2017-01-01
We have developed transition-edge sensor (TES) microcalorimeter arrays with high count-rate capability and high energy resolution to carry out x-ray imaging spectroscopy observations of various astronomical sources and the Sun. We have studied the dependence of the energy resolution and throughput (fraction of processed pulses) on the count rate for such microcalorimeters with two different transition temperatures T(sub c). Devices with both transition temperatures were fabricated within a single microcalorimeter array directly on top of a solid substrate, where the thermal conductance of the microcalorimeter depends on the thermal boundary resistance between the TES sensor and the dielectric substrate beneath. Because the thermal boundary resistance is highly temperature dependent, the two types of device with different T(sub c) values had very different thermal decay times, differing by approximately one order of magnitude. In our earlier report, we achieved energy resolutions of 1.6 and 2 eV at 6 keV from the lower and higher T(sub c) devices, respectively, using a standard analysis method based on optimal filtering in the low-flux limit. We have now measured the same devices at elevated x-ray fluxes ranging from 50 Hz to 1000 Hz per pixel. In the high-flux limit, however, the standard optimal filtering scheme nearly breaks down because of x-ray pile-up. To achieve the highest possible energy resolution for a fixed throughput, we have developed an analysis scheme based on the so-called event-grade method. Using the new analysis scheme, we achieved 5.0 eV FWHM with 96% throughput for 6 keV x-rays at 1025 Hz per pixel with the higher T(sub c) (faster) device, and 5.8 eV FWHM with 97% throughput with the lower T(sub c) (slower) device at 722 Hz.
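The idea behind an event-grade analysis can be sketched as follows: pulses well separated from their neighbours receive the long, high-resolution filter, while crowded pulses receive a shorter filter so they still contribute to throughput. This is a toy illustration with assumed filter lengths and simulated Poisson arrivals, not the authors' processing code.

```python
import numpy as np

LONG_FILTER_MS = 10.0    # assumed record length giving the best energy resolution
SHORT_FILTER_MS = 2.0    # assumed shorter filter used for piled-up events

def grade_events(arrival_ms):
    """Assign a grade to each pulse from its separation to the nearest neighbours."""
    arrival_ms = np.sort(np.asarray(arrival_ms))
    gaps_before = np.diff(arrival_ms, prepend=-np.inf)
    gaps_after = np.diff(arrival_ms, append=np.inf)
    return np.where((gaps_before > LONG_FILTER_MS) & (gaps_after > LONG_FILTER_MS),
                    "high",
                    np.where((gaps_before > SHORT_FILTER_MS) & (gaps_after > SHORT_FILTER_MS),
                             "mid", "reject"))

rng = np.random.default_rng(1)
times = np.cumsum(rng.exponential(1.0, 200))   # ~1000 Hz Poisson arrivals, in ms
grades = grade_events(times)
print({g: int((grades == g).sum()) for g in np.unique(grades)})
```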
Handheld Fluorescence Microscopy based Flow Analyzer.
Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva
2016-03-01
Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive and time-consuming procedure. The article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating the high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. Motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
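Computational deblurring of motion-smeared frames can be done, for example, with frequency-domain Wiener deconvolution against a linear motion point-spread function. The sketch below is a generic illustration of that idea on a synthetic image; it is not the analyzer's actual processing pipeline.

```python
import numpy as np

def motion_psf(length, shape):
    """Horizontal box point-spread function of given length, embedded in `shape`."""
    psf = np.zeros(shape)
    psf[0, :length] = 1.0 / length          # blur along the flow direction
    return psf

def wiener_deblur(image, psf, k=1e-2):
    """Frequency-domain Wiener deconvolution with noise-to-signal ratio k."""
    H = np.fft.fft2(psf)
    G = np.fft.fft2(image)
    F = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F))

# Toy example: a small bright "bead", blurred and then restored.
img = np.zeros((64, 64)); img[30:34, 30:34] = 1.0
psf = motion_psf(8, img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
restored = wiener_deblur(blurred, psf)
```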
A rapid enzymatic assay for high-throughput screening of adenosine-producing strains
Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei
2015-01-01
Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutrition components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
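Converting plate-reader absorbances into adenosine concentrations typically goes through a standard curve. The sketch below shows that step with a simple linear fit; the absorbance values and concentrations are illustrative assumptions rather than data from the assay.

```python
import numpy as np

# Standard curve: indophenol absorbance at known adenosine concentrations (illustrative values).
conc_uM = np.array([0, 25, 50, 100, 200, 400])
a_std   = np.array([0.05, 0.11, 0.17, 0.29, 0.55, 1.04])

slope, intercept = np.polyfit(conc_uM, a_std, 1)   # linear fit: A = slope * c + intercept

def adenosine_from_absorbance(a):
    """Convert measured absorbance values into adenosine concentrations (uM)."""
    return (np.asarray(a) - intercept) / slope

# 96-well plate read-out for screened strains (illustrative values).
plate = np.array([0.42, 0.77, 0.33])
print(adenosine_from_absorbance(plate).round(1))
```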
Klukas, Christian; Chen, Dijun; Pape, Jean-Michel
2014-01-01
High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present the Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate strongly with our manually measured data, with correlation coefficients of up to 0.98 and 0.95, respectively. In summary, IAP provides a comprehensive set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818
toxoMine: an integrated omics data warehouse for Toxoplasma gondii systems biology research
Rhee, David B.; Croken, Matthew McKnight; Shieh, Kevin R.; Sullivan, Julie; Micklem, Gos; Kim, Kami; Golden, Aaron
2015-01-01
Toxoplasma gondii (T. gondii) is an obligate intracellular parasite that must monitor for changes in the host environment and respond accordingly; however, it is still not fully known which genetic or epigenetic factors are involved in regulating virulence traits of T. gondii. There are on-going efforts to elucidate the mechanisms regulating the stage transition process via the application of high-throughput epigenomics, genomics and proteomics techniques. Given the range of experimental conditions and the typical yield from such high-throughput techniques, a new challenge arises: how to effectively collect, organize and disseminate the generated data for subsequent data analysis. Here, we describe toxoMine, which provides a powerful interface to support sophisticated integrative exploration of high-throughput experimental data and metadata, providing researchers with a more tractable means toward understanding how genetic and/or epigenetic factors play a coordinated role in determining pathogenicity of T. gondii. As a data warehouse, toxoMine allows integration of high-throughput data sets with public T. gondii data. toxoMine is also able to execute complex queries involving multiple data sets with straightforward user interaction. Furthermore, toxoMine allows users to define their own parameters during the search process that gives users near-limitless search and query capabilities. The interoperability feature also allows users to query and examine data available in other InterMine systems, which would effectively augment the search scope beyond what is available to toxoMine. toxoMine complements the major community database ToxoDB by providing a data warehouse that enables more extensive integrative studies for T. gondii. Given all these factors, we believe it will become an indispensable resource to the greater infectious disease research community. Database URL: http://toxomine.org PMID:26130662
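Because toxoMine is built on InterMine, programmatic queries should be possible through the standard InterMine client. The sketch below assumes the `intermine` Python package together with a guessed service URL, query class, and organism constraint; treat all of these names as placeholders to be adapted against the live service.

```python
from intermine.webservice import Service

service = Service("http://toxomine.org/toxomine/service")   # assumed service endpoint
query = service.new_query("Gene")                           # illustrative query class
query.add_view("primaryIdentifier", "symbol", "organism.name")
query.add_constraint("organism.name", "=", "Toxoplasma gondii", code="A")  # illustrative constraint

for i, row in enumerate(query.rows()):
    if i >= 10:          # peek at the first few rows only
        break
    print(row)
```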
OptoDyCE: Automated system for high-throughput all-optical dynamic cardiac electrophysiology
NASA Astrophysics Data System (ADS)
Klimas, Aleksandra; Yu, Jinzhu; Ambrosi, Christina M.; Williams, John C.; Bien, Harold; Entcheva, Emilia
2016-02-01
In the last two decades, less than 30% of drug withdrawals from the market were due to cardiac toxicity, where unintended interactions with ion channels disrupt the heart's normal electrical function. Consequently, all new drugs must undergo preclinical testing for cardiac liability, adding to an already expensive and lengthy process. Recognition that proarrhythmic effects often result from drug action on multiple ion channels demonstrates a need for integrative and comprehensive measurements. Additionally, patient-specific therapies relying on emerging technologies employing stem-cell derived cardiomyocytes (e.g. induced pluripotent stem-cell-derived cardiomyocytes, iPSC-CMs) require better screening methods to become practical. However, a high-throughput, cost-effective approach for cellular cardiac electrophysiology has not been feasible. Optical techniques for manipulation and recording provide a contactless means of dynamic, high-throughput testing of cells and tissues. Here, we consider the requirements for all-optical electrophysiology for drug testing, and we implement and validate OptoDyCE, a fully automated system for all-optical cardiac electrophysiology. We demonstrate the high-throughput capabilities using multicellular samples in 96-well format by combining optogenetic actuation with simultaneous fast high-resolution optical sensing of voltage or intracellular calcium. The system can also be implemented using iPSC-CMs and other cell types by delivery of optogenetic drivers, or through the modular use of dedicated light-sensitive somatic cells in conjunction with non-modified cells. OptoDyCE provides a truly modular and dynamic screening system, capable of fully-automated acquisition of high-content information integral for improved discovery and development of new drugs and biologics, as well as providing a means of better understanding of electrical disturbances in the heart.
Quality Control for Ambient Sampling of PCDD/PCDF from Open Combustion Sources
Both long duration (> 6 h) and high temperature (up to 139 °C) sampling efforts were conducted using ambient air sampling methods to determine if either high volume throughput or higher-than-ambient sampling temperatures resulted in loss of target polychlorinated dibenzodioxins/d...
Murlidhar, Vasudha; Zeinali, Mina; Grabauskiene, Svetlana; Ghannad-Rezaie, Mostafa; Wicha, Max S; Simeone, Diane M; Ramnath, Nithya; Reddy, Rishindra M; Nagrath, Sunitha
2014-12-10
Circulating tumor cells (CTCs) are believed to play an important role in metastasis, a process responsible for the majority of cancer-related deaths. However, their rarity in the bloodstream makes microfluidic isolation complex and time-consuming. Additionally, the low processing speeds can be a hindrance to obtaining higher yields of CTCs, limiting their potential use as biomarkers for early diagnosis. Here, a high-throughput microfluidic technology, the OncoBean Chip, is reported. It employs radial flow that introduces a varying shear profile across the device, enabling efficient affinity-based cell capture at high flow rates. The recovery from whole blood is validated with the cancer cell lines H1650 and MCF7, achieving a mean efficiency >80% at a throughput of 10 mL h(-1), in contrast to the flow rate of 1 mL h(-1) standardly reported with other microfluidic devices. Cells are recovered with a viability rate of 93% at these high speeds, increasing the ability to use captured CTCs for downstream analysis. Broad clinical application is demonstrated using comparable flow rates with blood specimens obtained from breast, pancreatic, and lung cancer patients. Comparable CTC numbers are recovered in all the samples at the two flow rates, demonstrating the ability of the technology to perform at high throughput. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Simulation and Optimization of an Astrophotonic Reformatter
NASA Astrophysics Data System (ADS)
Anagnos, Th; Harris, R. J.; Corrigan, M. K.; Reeves, A. P.; Townson, M. J.; MacLachlan, D. G.; Thomson, R. R.; Morris, T. J.; Schwab, C.; Quirrenbach, A.
2018-05-01
Image slicing is a powerful technique in astronomy. It allows the instrument designer to reduce the slit width of the spectrograph, increasing spectral resolving power whilst retaining throughput. Conventionally this is done using bulk optics, such as mirrors and prisms; more recently, however, astrophotonic components known as photonic lanterns (PLs) and photonic reformatters have also been used. These devices reformat the multimode (MM) input light from a telescope into single-mode (SM) outputs, which can then be re-arranged to suit the spectrograph. The photonic dicer (PD) is one such device, designed to reduce the dependence of spectrograph size on telescope aperture and eliminate modal noise. We simulate the PD, optimising the throughput and geometrical design using Soapy and BeamProp. The simulated device shows a transmission between 8 and 20%, depending upon the type of adaptive optics (AO) correction applied, matching the experimental results well. We also investigate our idealised model of the PD and show that the barycentre of the slit varies only slightly with time, meaning that the modal noise contribution is very low compared to conventional fibre systems. We further optimise our model device for both higher throughput and reduced modal noise. This device improves throughput by 6.4% and reduces the movement of the slit output by 50%, further improving stability. This shows the importance of properly simulating such devices, including atmospheric effects. Our work complements recent work in the field and is essential for optimising future photonic reformatters.
SDN based millimetre wave radio over fiber (RoF) network
NASA Astrophysics Data System (ADS)
Amate, Ahmed; Milosavljevic, Milos; Kourtessis, Pandelis; Robinson, Matthew; Senior, John M.
2015-01-01
This paper introduces software-defined, millimeter-wave (mm-Wave) networks with Radio over Fiber (RoF) for the delivery of the gigabit connectivity required for fifth-generation (5G) mobile. The network enables an effective open-access system, allowing infrastructure owners to manage and lease the infrastructure to service providers through new unbundled business models. Exploiting the inherent benefits of RoF, complete base station functionalities are centralized at the edges of the metro and aggregation network, leaving remote radio heads (RRHs) with only tunable filtering and amplification. A Software Defined Network (SDN) Central Controller (SCC) is responsible for managing resources across several mm-Wave Radio Access Networks (RANs), providing a global view of the several network segments. This ensures flexible resource allocation for reduced overall latency and increased throughput. The SDN-based mm-Wave RAN also allows for inter-edge-node communication; therefore, certain packets can be routed between different RANs supported by the same edge node, reducing latency. System-level simulations of the complete network have shown significant improvement of the overall throughput and SINR for wireless users by providing effective resource allocation and coordination among interfering cells. A new Coordinated Multipoint (CoMP) algorithm exploiting the benefits of the SCC global network view for reduced delay in control message exchange is presented, accounting for a minimum packet delay and limited Channel State Information (CSI) in a Long Term Evolution-Advanced (LTE-A), Cloud RAN (CRAN) configuration. The algorithm does not require detailed CSI feedback from UEs but rather uses the UE location (determined by the eNB) as the required parameter. UE throughput in the target sector is represented using a Cumulative Distribution Function (CDF). The drawn characteristics suggest a significant 60% improvement in UE cell-edge throughput following the application, in the coordinating cells, of the new CoMP algorithm. Results also show a further improvement of 36% in cell-edge UE throughput when eNBs are centralized in a CRAN backhaul architecture. The SINR distribution of UEs in the cooperating cells has also been evaluated using a box plot. As expected, UEs with CoMP perform better, demonstrating an increase of over 2 dB at the median between the transmission scenarios.
Design of portable ultraminiature flow cytometers for medical diagnostics
NASA Astrophysics Data System (ADS)
Leary, James F.
2018-02-01
Design of portable microfluidic flow/image cytometry devices for measurements in the field (e.g. initial medical diagnostics) requires careful design in terms of power requirements and weight to allow for realistic portability. True portability with high-throughput microfluidic systems also requires sampling systems without the need for sheath hydrodynamic focusing both to avoid the need for sheath fluid and to enable higher volumes of actual sample, rather than sheath/sample combinations. Weight/power requirements dictate use of super-bright LEDs with top-hat excitation beam architectures and very small silicon photodiodes or nanophotonic sensors that can both be powered by small batteries. Signal-to-noise characteristics can be greatly improved by appropriately pulsing the LED excitation sources and sampling and subtracting noise in between excitation pulses. Microfluidic cytometry also requires judicious use of small sample volumes and appropriate statistical sampling by microfluidic cytometry or imaging for adequate statistical significance to permit real-time (typically in less than 15 minutes) initial medical decisions for patients in the field. This is not something conventional cytometry traditionally worries about, but is very important for development of small, portable microfluidic devices with small-volume throughputs. It also provides a more reasonable alternative to conventional tubes of blood when sampling geriatric and newborn patients for whom a conventional peripheral blood draw can be problematical. Instead one or two drops of blood obtained by pin-prick should be able to provide statistically meaningful results for use in making real-time medical decisions without the need for blood fractionation, which is not realistic in the doctor's office or field.
Pandey, Udai Bhan
2011-01-01
The common fruit fly, Drosophila melanogaster, is a well studied and highly tractable genetic model organism for understanding molecular mechanisms of human diseases. Many basic biological, physiological, and neurological properties are conserved between mammals and D. melanogaster, and nearly 75% of human disease-causing genes are believed to have a functional homolog in the fly. In the discovery process for therapeutics, traditional approaches employ high-throughput screening for small molecules that is based primarily on in vitro cell culture, enzymatic assays, or receptor binding assays. The majority of positive hits identified through these types of in vitro screens, unfortunately, are found to be ineffective and/or toxic in subsequent validation experiments in whole-animal models. New tools and platforms are needed in the discovery arena to overcome these limitations. The incorporation of D. melanogaster into the therapeutic discovery process holds tremendous promise for an enhanced rate of discovery of higher quality leads. D. melanogaster models of human diseases provide several unique features such as powerful genetics, highly conserved disease pathways, and very low comparative costs. The fly can effectively be used for low- to high-throughput drug screens as well as in target discovery. Here, we review the basic biology of the fly and discuss models of human diseases and opportunities for therapeutic discovery for central nervous system disorders, inflammatory disorders, cardiovascular disease, cancer, and diabetes. We also provide information and resources for those interested in pursuing fly models of human disease, as well as those interested in using D. melanogaster in the drug discovery process. PMID:21415126
Goodman, Corey W.; Major, Heather J.; Walls, William D.; Sheffield, Val C.; Casavant, Thomas L.; Darbro, Benjamin W.
2016-01-01
Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high throughput, low cost, analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. PMID:25595567
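The threshold-calibration step can be pictured with a generic per-probe ROC analysis in which calls from the higher-resolution array serve as truth labels. The sketch below uses synthetic data and scikit-learn; it is an illustration of the concept, not the CNV-ROC implementation.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
truth = rng.integers(0, 2, 1000)                           # 1 = CNV confirmed over the probe
log2_ratio = rng.normal(0.0, 0.15, 1000) + 0.4 * truth     # duplicated probes shift the ratio upward

fpr, tpr, thresholds = roc_curve(truth, log2_ratio)
best = np.argmax(tpr - fpr)                                # Youden's J as an operating point
print(f"AUC = {auc(fpr, tpr):.3f}, suggested log2-ratio threshold = {thresholds[best]:.3f}")
```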
Savino, Maria; Seripa, Davide; Gallo, Antonietta P; Garrubba, Maria; D'Onofrio, Grazia; Bizzarro, Alessandra; Paroni, Giulia; Paris, Francesco; Mecocci, Patrizia; Masullo, Carlo; Pilotto, Alberto; Santini, Stefano A
2011-01-01
Recent studies investigating the single cytochrome P450 (CYP) 2D6 allele *2A reported an association with the response to drug treatments. More genetic data can be obtained, however, by high-throughput-based technologies. The aim of this study was the high-throughput analysis of CYP2D6 polymorphisms to evaluate its effectiveness in identifying patient responders/non-responders to CYP2D6-metabolized drugs. We also compared our results with those previously obtained with the standard analysis of the CYP2D6 *2A allele. Sixty blood samples from patients treated with CYP2D6-metabolized drugs, previously genotyped for the CYP2D6*2A allele, were analyzed for CYP2D6 polymorphisms with the AutoGenomics INFINITI CYP4502D6-I assay on the AutoGenomics INFINITI analyzer. A higher frequency of mutated alleles was observed in responder than in non-responder patients (75.38% vs 43.48%; p = 0.015). Thus, the presence of a mutated CYP2D6 allele was associated with a response to CYP2D6-metabolized drugs (OR = 4.044; 1.348-12.154). No difference was observed in the distribution of allele *2A (p = 0.320). The high-throughput genetic analysis of CYP2D6 polymorphisms better discriminates responders/non-responders than the standard analysis of the CYP2D6 *2A allele. A high-throughput genetic assay of CYP2D6 may be useful to identify patients with different clinical responses to CYP2D6-metabolized drugs.
Mass Transfer Testing of a 12.5-cm Rotor Centrifugal Contactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. H. Meikrantz; T. G. Garn; J. D. Law
2008-09-01
TRUEX mass transfer tests were performed using a single-stage, commercially available 12.5 cm centrifugal contactor and stable cerium (Ce) and europium (Eu). Test conditions included throughputs ranging from 2.5 to 15 Lpm and rotor speeds of 1750 and 2250 rpm. Ce and Eu extraction forward distribution coefficients ranged from 13 to 19. The first and second stage strip back-distributions were 0.5 to 1.4 and 0.002 to 0.004, respectively, throughout the dynamic test conditions studied. Visual carryover of aqueous entrainment in all organic phase samples was estimated at <0.1%, and organic carryover into all aqueous phase samples was about ten times less. Mass transfer efficiencies of ≥98% for both Ce and Eu in the extraction section were obtained over the entire range of test conditions. The first strip stage mass transfer efficiencies ranged from 75 to 93%, trending higher with increasing throughput. Second stage mass transfer was greater than 99% in all cases. Increasing the rotor speed from 1750 to 2250 rpm had no significant effect on efficiency for all throughputs tested.
Liu, Chao; Xue, Chundong; Chen, Xiaodong; Shan, Lei; Tian, Yu; Hu, Guoqing
2015-06-16
Viscoelasticity-induced particle migration has recently received increasing attention due to its ability to provide high-quality focusing over a wide range of flow rates. However, its application has been limited to the low-throughput regime since particles can defocus as the flow rate increases. Using an engineered carrier medium with constant, low viscosity and strong elasticity, we improve the sample flow rates to one order of magnitude higher than those in existing studies. Utilizing differential focusing of particles of different sizes, we present sheathless particle/cell separation in simple straight microchannels that possess excellent parallelizability for further throughput enhancement. The present method can be implemented over a wide range of particle/cell sizes and flow rates. We successfully separate small particles from larger particles, MCF-7 cells from red blood cells (RBCs), and Escherichia coli (E. coli) bacteria from RBCs in different straight microchannels. The proposed method could broaden the applications of viscoelastic microfluidic devices to particle/cell separation due to the enhanced sample throughput and simple channel design.
Optimisation of wavelength modulated Raman spectroscopy: towards high throughput cell screening.
Praveen, Bavishna B; Mazilu, Michael; Marchington, Robert F; Herrington, C Simon; Riches, Andrew; Dholakia, Kishan
2013-01-01
In the field of biomedicine, Raman spectroscopy is a powerful technique to discriminate between normal and cancerous cells. However, the strong background signal from the sample and the instrumentation affects the efficiency of this discrimination technique. Wavelength Modulated Raman Spectroscopy (WMRS) can suppress the background in the Raman spectra. In this study we demonstrate a systematic approach for optimizing the various parameters of WMRS to reduce the acquisition time for potential applications such as higher-throughput cell screening. The signal-to-noise ratio (SNR) of the Raman bands depends on the modulation amplitude, time constant and total acquisition time. We observed that the sampling rate does not influence the SNR of the Raman bands if three or more wavelengths are sampled. With these optimised WMRS parameters, we increased the throughput in the binary classification of normal human urothelial cells and bladder cancer cells by reducing the total acquisition time to 6 s, which is significantly lower than the acquisition times previously required for discrimination between similar cell types.
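The background-suppression principle of WMRS can be illustrated with synthetic spectra: as the excitation wavelength is stepped, the Raman peaks shift while the broad fluorescence background stays put, so removing the per-pixel mean across modulation steps cancels the background. The sketch below is a toy demonstration of that idea, not the authors' processing.

```python
import numpy as np

pixels = np.arange(1000)
background = 500 * np.exp(-((pixels - 400) / 600.0) ** 2)   # broad, static fluorescence background
shifts = [-2, -1, 0, 1, 2]                                  # modulation steps, in pixels

def raman_peak(centre):
    """Narrow Raman band that tracks the excitation wavelength."""
    return 40 * np.exp(-((pixels - centre) / 3.0) ** 2)

frames = np.array([background + raman_peak(520 + s) for s in shifts])
modulated = frames - frames.mean(axis=0)    # background cancels, shifting peaks survive
print(modulated.std(axis=1).round(2))       # residual (Raman) signal per modulation step
```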
Chow, Nancy A; Lindsley, Mark D; McCotter, Orion Z; Kangiser, Dave; Wohrle, Ron D; Clifford, Wayne R; Yaglom, Hayley D; Adams, Laura E; Komatsu, Kenneth; Durkin, Michelle M; Baker, Rocky J; Shubitz, Lisa F; Derado, Gordana; Chiller, Tom M; Litvintseva, Anastasia P
2017-01-01
Coccidioides is a soil-dwelling fungus that causes coccidioidomycosis, a disease also known as Valley fever, which affects humans and a variety of animal species. Recent findings of Coccidioides in new, unexpected areas of the United States have demonstrated the need for a better understanding of its geographic distribution. Large serological studies on animals could provide important information on the geographic distribution of this pathogen. To facilitate such studies, we used protein A/G, a recombinant protein that binds IgG antibodies from a variety of mammalian species, to develop an enzyme immunoassay (EIA) that detects IgG antibodies against Coccidioides in a highly sensitive and high-throughput manner. We showed the potential of this assay to be adapted to multiple animal species by testing a collection of serum and/or plasma samples from dogs, mice, and humans with or without confirmed coccidioidomycosis. We then evaluated the performance of the assay in dogs, using sera from dogs residing in a highly endemic area, and found seropositivity rates significantly higher than those in dogs of non-endemic areas. We further evaluated the specificity of the assay in dogs infected with other fungal pathogens known to cross-react with Coccidioides. Finally, we used the assay to perform a cross-sectional serosurvey investigating dogs from Washington, a state in which infection with Coccidioides has recently been documented. In summary, we have developed a Coccidioides EIA for the detection of antibodies in canines that is more sensitive and has higher throughput than currently available methods, and by testing this assay in mice and humans, we have shown a proof of principle of its adaptability for other animal species.
Image Harvest: an open-source platform for high-throughput plant image processing and analysis.
Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal
2016-05-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software package for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang
2014-03-05
RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
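As a rough picture of what the adapter-removal and read-counting steps automate, the sketch below trims an assumed 3' adapter from small-RNA reads and tallies the resulting sequences. The adapter and reads are placeholders; eRNA itself wraps established tools rather than this code.

```python
from collections import Counter

ADAPTER = "TGGAATTCTCGGGTGCCAAGG"   # assumed small-RNA 3' adapter (placeholder)

def trim_adapter(read, adapter=ADAPTER, min_overlap=8):
    """Remove the 3' adapter by searching for progressively shorter adapter prefixes."""
    for k in range(len(adapter), min_overlap - 1, -1):
        idx = read.find(adapter[:k])
        if idx != -1:
            return read[:idx]
    return read

reads = ["TCACAGTGGCTAAGTTCTGCTGGAATTCTCGGGTG",
         "TCACAGTGGCTAAGTTCTGCTGGAATTCTCGGGTGCCAAGG"]
counts = Counter(trim_adapter(r) for r in reads)
print(counts.most_common(5))   # [('TCACAGTGGCTAAGTTCTGC', 2)]
```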
A versatile toolkit for high throughput functional genomics with Trichoderma reesei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuster, Andre; Bruno, Kenneth S.; Collett, James R.
2012-01-02
The ascomycete fungus Trichoderma reesei (anamorph of Hypocrea jecorina) is a biotechnological workhorse and currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. RESULTS: Aiming at high-efficiency and high-throughput methods, we present here a construction kit for gene knock-out in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high-throughput generation of gene knock-outs, we constructed vectors using yeast-mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. CONCLUSIONS: Using this strategy and the materials provided, high-throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently, this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second-generation biofuel production.
Exceptionally fast water desalination at complete salt rejection by pristine graphyne monolayers.
Xue, Minmin; Qiu, Hu; Guo, Wanlin
2013-12-20
Desalination that produces clean freshwater from seawater holds the promise of solving the global water shortage for drinking, agriculture and industry. However, conventional desalination technologies such as reverse osmosis and thermal distillation involve large amounts of energy consumption, and the semipermeable membranes widely used in reverse osmosis face the challenge to provide a high throughput at high salt rejection. Here we find by comprehensive molecular dynamics simulations and first principles modeling that pristine graphyne, one of the graphene-like one-atom-thick carbon allotropes, can achieve 100% rejection of nearly all ions in seawater including Na(+), Cl(-), Mg(2+), K(+) and Ca(2+), at an exceptionally high water permeability about two orders of magnitude higher than those for commercial state-of-the-art reverse osmosis membranes at a salt rejection of ~98.5%. This complete ion rejection by graphyne, independent of the salt concentration and the operating pressure, is revealed to be originated from the significantly higher energy barriers for ions than for water. This intrinsic specialty of graphyne should provide a new possibility for the efforts to alleviate the global shortage of freshwater and other environmental problems.
Hybrid scheduling mechanisms for Next-generation Passive Optical Networks based on network coding
NASA Astrophysics Data System (ADS)
Zhao, Jijun; Bai, Wei; Liu, Xin; Feng, Nan; Maier, Martin
2014-10-01
Network coding (NC) integrated into Passive Optical Networks (PONs) is regarded as a promising solution to achieve higher throughput and energy efficiency. To efficiently support multimedia traffic under this new transmission mode, novel NC-based hybrid scheduling mechanisms for Next-generation PONs (NG-PONs) including energy management, time slot management, resource allocation, and Quality-of-Service (QoS) scheduling are proposed in this paper. First, we design an energy-saving scheme that is based on Bidirectional Centric Scheduling (BCS) to reduce the energy consumption of both the Optical Line Terminal (OLT) and Optical Network Units (ONUs). Next, we propose an intra-ONU scheduling and an inter-ONU scheduling scheme, which takes NC into account to support service differentiation and QoS assurance. The presented simulation results show that BCS achieves higher energy efficiency under low traffic loads, clearly outperforming the alternative NC-based Upstream Centric Scheduling (UCS) scheme. Furthermore, BCS is shown to provide better QoS assurance.
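The throughput gain from network coding in a PON can be seen in the classic inter-ONU exchange: the OLT broadcasts the XOR of two ONUs' packets in a single downstream slot, and each ONU recovers the other's packet using its own copy. The toy sketch below illustrates only this coding idea, not the scheduling mechanisms proposed in the paper.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

pkt_onu1 = b"frame from ONU1 "   # illustrative 16-byte payloads
pkt_onu2 = b"frame from ONU2 "

coded = xor_bytes(pkt_onu1, pkt_onu2)   # one coded downstream frame instead of two

# Each ONU already holds its own upstream packet, so it can decode the other's.
assert xor_bytes(coded, pkt_onu1) == pkt_onu2
assert xor_bytes(coded, pkt_onu2) == pkt_onu1
```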
A catalog of putative adverse outcome pathways (AOPs) that ...
A number of putative AOPs for several distinct MIEs of thyroid disruption have been formulated for amphibian metamorphosis and fish swim bladder inflation. These have been entered into the AOP knowledgebase on the OECD WIKI. The EDSP has been actively advancing high-throughput screening for chemical activity toward estrogen, androgen and thyroid targets. However, it has been recently identified that coverage for thyroid-related targets is lagging behind estrogen and androgen assay coverage. As thyroid-related medium-high throughput assays are actively being developed for inclusion in the ToxCast chemical screening program, a parallel effort is underway to characterize putative adverse outcome pathways (AOPs) specific to these thyroid-related targets. This effort is intended to provide biological and ecological context that will enhance the utility of ToxCast high throughput screening data for hazard identification.
A high-throughput assay for DNA topoisomerases and other enzymes, based on DNA triplex formation.
Burrell, Matthew R; Burton, Nicolas P; Maxwell, Anthony
2010-01-01
We have developed a rapid, high-throughput assay for measuring the catalytic activity (DNA supercoiling or relaxation) of topoisomerase enzymes that is also capable of monitoring the activity of other enzymes that alter the topology of DNA. The assay utilises intermolecular triplex formation to resolve supercoiled and relaxed forms of DNA, the principle being that a negatively supercoiled plasmid forms an intermolecular triplex with an immobilised oligonucleotide more efficiently than the relaxed form. The assay provides a number of advantages over the standard gel-based methods, including greater speed of analysis, reduced sample handling, better quantitation and improved reliability and accuracy of output data. The assay is performed in microtitre plates and can be adapted to high-throughput screening of libraries of potential inhibitors of topoisomerases, including bacterial DNA gyrase.
tcpl: the ToxCast pipeline for high-throughput screening data.
Filer, Dayne L; Kothiya, Parth; Setzer, R Woodrow; Judson, Richard S; Martin, Matthew T
2017-02-15
Large high-throughput screening (HTS) efforts are widely used in drug development and chemical toxicity screening. Wide use and integration of these data can benefit from an efficient, transparent and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform for efficiently storing, normalizing and dose-response modeling of large high-throughput and high-content chemical screening data. The novel dose-response modeling algorithm has been tested against millions of diverse dose-response series, and robustly fits data with outliers and cytotoxicity-related signal loss. tcpl is freely available on the Comprehensive R Archive Network under the GPL-2 license. martin.matt@epa.gov. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
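The kind of concentration-response modeling such a pipeline automates can be sketched with a generic Hill fit. The example below uses SciPy on made-up data; it is not tcpl's own fitting algorithm (tcpl is an R package with its own model set), just an illustration of the underlying idea.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ac50, slope):
    """Hill (log-logistic) concentration-response curve."""
    return top / (1.0 + (ac50 / conc) ** slope)

# Illustrative single-chemical concentration series (uM) and normalized responses (%).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30, 100])
resp = np.array([1, 3, 2, 8, 20, 45, 72, 88, 95])

params, _ = curve_fit(hill, conc, resp, p0=[100, 1.0, 1.0], maxfev=5000)
top, ac50, slope = params
print(f"top={top:.1f}%  AC50={ac50:.2f} uM  Hill slope={slope:.2f}")
```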
Accounting Artifacts in High-Throughput Toxicity Assays.
Hsieh, Jui-Hua
2016-01-01
Compound activity identification is the primary goal in high-throughput screening (HTS) assays. However, assay artifacts, both systematic (e.g., compound auto-fluorescence) and nonsystematic (e.g., noise), complicate activity interpretation. In addition to the traditional potency parameter, the half-maximal effect concentration (EC50), other activity parameters (e.g., point of departure, POD) can be derived from HTS data for activity profiling. A data analysis pipeline has been developed to handle the artifacts and to provide compound activity characterization with either binary or continuous metrics. This chapter outlines the steps in the pipeline using the Tox21 glucocorticoid receptor (GR) β-lactamase assays as examples, including the formats used to identify either agonists or antagonists, as well as the counter-screen assays for identifying artifacts. The steps can be applied to other lower-throughput assays with concentration-response data.
Resolving the Evolution of Extant and Extinct Ruminants With High-Throughput Phylogenomics
USDA-ARS?s Scientific Manuscript database
The Pecorans (higher ruminants) are believed to have rapidly speciated in the Mid-Eocene, resulting in five distinct extant families: Antilocapridae, Giraffidae, Moschidae, Cervidae, and Bovidae. Due to the rapid radiation, the Pecoran phylogeny has proven difficult to resolve and eleven of the fift...
The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...
Maintaining Momentum toward Graduation: OER and the Course Throughput Rate
ERIC Educational Resources Information Center
Hilton, John, III; Fischer, Lane; Wiley, David; Williams, Linda
2016-01-01
"Open Educational Resources" (OER) have the potential to replace traditional textbooks in higher education. Previous studies indicate that use of OER results in high student and faculty satisfaction, lower costs, and similar or better educational outcomes. In this case study, we compared students using traditional textbooks with those…
NEW TECHNOLOGIES TO SOLVE OLD PROBLEMS AND ADDRESS ISSUES IN RISK ASSESSMENT
Appropriate utilization of data is an ongoing concern of the regulated industries and the agencies charged with assessing safety or risk. An area of current interest is the possibility that toxicogenomics will enhance our ability to develop higher or high-throughput models for pr...
Impact of Higher Natural Gas Prices on Local Distribution Companies and Residential Customers
2007-01-01
This report examines some of the problems faced by natural gas consumers as a result of increasing heating bills in recent years and problems associated with larger amounts of uncollectible revenue and lower throughput for the local distribution companies (LDCs) supplying the natural gas.
Yoshii, Yukie; Furukawa, Takako; Waki, Atsuo; Okuyama, Hiroaki; Inoue, Masahiro; Itoh, Manabu; Zhang, Ming-Rong; Wakizaka, Hidekatsu; Sogawa, Chizuru; Kiyono, Yasushi; Yoshii, Hiroshi; Fujibayashi, Yasuhisa; Saga, Tsuneo
2015-05-01
Anti-cancer drug development typically utilizes high-throughput screening with two-dimensional (2D) cell culture. However, 2D culture induces cellular characteristics different from tumors in vivo, resulting in inefficient drug development. Here, we report an innovative high-throughput screening system using nanoimprinting 3D culture to simulate in vivo conditions, thereby facilitating efficient drug development. We demonstrated that cell line-based nanoimprinting 3D screening can more efficiently select drugs that effectively inhibit cancer growth in vivo as compared to 2D culture. Metabolic responses after treatment were assessed using positron emission tomography (PET) probes, and revealed similar characteristics between the 3D spheroids and in vivo tumors. Further, we developed an advanced method to adopt cancer cells from patient tumor tissues for high-throughput drug screening with nanoimprinting 3D culture, which we termed Cancer tissue-Originated Uniformed Spheroid Assay (COUSA). This system identified drugs that were effective in xenografts of the original patient tumors. Nanoimprinting 3D spheroids showed low permeability and formation of hypoxic regions inside, similar to in vivo tumors. Collectively, the nanoimprinting 3D culture provides an easy-to-handle, high-throughput drug screening system, which allows for efficient drug development by mimicking the tumor environment. The COUSA system could be a useful platform for drug development with patient cancer cells. Copyright © 2015 Elsevier Ltd. All rights reserved.
Performance Assessment of the Digital Array Scanned Interferometer (DASI) Concept
NASA Technical Reports Server (NTRS)
Katzberg, Stephen J.; Statham, Richard B.
1996-01-01
Interferometers are known to have higher throughput than grating spectrometers for the same resolvance. The digital array scanned interferometer (DASI) has been proposed as an instrument that can capitalize on the superior throughput of the interferometer and, simultaneously, be adapted to imaging. The DASI is not the first implementation of the dual purpose concept, but it is one that has made several claims of major performance superiority, and it has been developed into a complete instrument. This paper reviews the DASI concept, summarizes its claims, and gives an assessment of how well the claims are justified. It is shown that the claims of signal-to-noise ratio superiority and operational simplicity are realized only modestly, if at all.
Microelectroporation device for genomic screening
Perroud, Thomas D.; Renzi, Ronald F.; Negrete, Oscar; Claudnic, Mark R.
2014-09-09
We have developed a microelectroporation device that combines microarrays of oligonucleotides, microfluidic channels, and electroporation for cell transfection and high-throughput screening applications (e.g. RNA interference screens). Microarrays allow the deposition of thousands of different oligonucleotides in microscopic spots. Microfluidic channels and microwells enable efficient loading of cells into the device and prevent cross-contamination between different oligonucleotide spots. Electroporation allows optimal transfection of nucleic acids into cells (especially hard-to-transfect cells such as primary cells) by minimizing cell death while maximizing transfection efficiency. This invention has the advantages of higher throughput and lower cost compared to conventional screening technologies, while preventing cross-contamination. Moreover, this device does not require bulky robotic liquid handling equipment and is inherently safer given that it is a closed system.
High-throughput ultraviolet photoacoustic microscopy with multifocal excitation
NASA Astrophysics Data System (ADS)
Imai, Toru; Shi, Junhui; Wong, Terence T. W.; Li, Lei; Zhu, Liren; Wang, Lihong V.
2018-03-01
Ultraviolet photoacoustic microscopy (UV-PAM) is a promising intraoperative tool for surgical margin assessment (SMA), one that can provide label-free histology-like images with high resolution. In this study, using a microlens array and a one-dimensional (1-D) array ultrasonic transducer, we developed a high-throughput multifocal UV-PAM (MF-UV-PAM). Our new system achieved a 1.6 ± 0.2 μm lateral resolution and produced images 40 times faster than the previously developed point-by-point scanning UV-PAM. MF-UV-PAM provided a readily comprehensible photoacoustic image of a mouse brain slice with specific absorption contrast in ~16 min, highlighting cell nuclei. Individual cell nuclei could be clearly resolved, showing its practical potential for intraoperative SMA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PANDOLFI, RONALD; KUMAR, DINESH; VENKATAKRISHNAN, SINGANALLUR
Xi-CAM aims to provide a community-driven platform for multimodal analysis in synchrotron science. The platform core provides a robust plugin infrastructure for extensibility, allowing continuing development to simply add further functionality. Current modules include tools for characterization with (GI)SAXS, Tomography, and XAS. This will continue to serve as a development base as algorithms for multimodal analysis develop. Seamless remote data access, visualization and analysis are key elements of Xi-CAM, and will become critical to synchrotron data infrastructure as expectations for future data volume and acquisition rates rise with continuously increasing throughputs. The highly interactive design elements of Xi-CAM will similarly support a generation of users who depend on immediate data quality feedback during high-throughput or burst acquisition modes.
2005-09-01
This research explores the need for a high-throughput, high-speed network for use in a network-centric wartime environment and how commercial ... Automated Digital Network System (ADNS).
Extended length microchannels for high density high throughput electrophoresis systems
Davidson, James C.; Balch, Joseph W.
2000-01-01
High throughput electrophoresis systems which provide extended well-to-read distances on smaller substrates, thus compacting the overall systems. The electrophoresis systems utilize a high density array of microchannels for electrophoresis analysis with extended read lengths. The microchannel geometry can be used individually or in conjunction to increase the effective length of a separation channel while minimally impacting the packing density of channels. One embodiment uses sinusoidal microchannels, while another embodiment uses plural microchannels interconnected by a via. The extended channel systems can be applied to virtually any type of channel confined chromatography.
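The geometric idea behind the sinusoidal embodiment can be illustrated numerically: a channel following a sine path packs a much longer effective separation length into the same straight footprint. The amplitude and wavelength used below are hypothetical values chosen for illustration, not figures from the patent.

```python
# Numerical arc length of a sinusoidal channel vs. its straight footprint.
import numpy as np

def arc_length(L=50.0, A=2.0, wavelength=4.0, n=200000):
    """Arc length (mm) of y = A*sin(2*pi*x/wavelength) over footprint L (mm)."""
    x = np.linspace(0.0, L, n)
    y = A * np.sin(2 * np.pi * x / wavelength)
    return np.sum(np.hypot(np.diff(x), np.diff(y)))

L = 50.0  # straight footprint, mm
print(f"effective length: {arc_length(L):.1f} mm vs footprint {L} mm")
```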
Three applications of backscatter x-ray imaging technology to homeland defense
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2005-05-01
A brief review of backscatter x-ray imaging and a description of three systems currently applying it to homeland defense missions (BodySearch, ZBV and ZBP). These missions include detection of concealed weapons, explosives and contraband on personnel, in vehicles and large cargo containers. An overview of the x-ray imaging subsystems is provided as well as sample images from each system. Key features such as x-ray safety, throughput and detection are discussed. Recent trends in operational modes are described that facilitate 100% inspection at high throughput chokepoints.
Hupert, Mateusz L; Jackson, Joshua M; Wang, Hong; Witek, Małgorzata A; Kamande, Joyce; Milowsky, Matthew I; Whang, Young E; Soper, Steven A
2014-10-01
Microsystem-based technologies are providing new opportunities in the area of in vitro diagnostics due to their ability to provide process automation enabling point-of-care operation. As an example, microsystems used for the isolation and analysis of circulating tumor cells (CTCs) from complex, heterogeneous samples in an automated fashion with improved recoveries and selectivity are providing new opportunities for this important biomarker. Unfortunately, many of the existing microfluidic systems lack the throughput capabilities and/or are too expensive to manufacture to warrant their widespread use in clinical testing scenarios. Here, we describe a disposable, all-polymer, microfluidic system for the high-throughput (HT) isolation of CTCs directly from whole blood inputs. The device employs an array of high aspect ratio (HAR), parallel, sinusoidal microchannels (25 µm × 150 µm; W × D; AR = 6.0) with walls covalently decorated with anti-EpCAM antibodies to provide affinity-based isolation of CTCs. Channel width, which is similar to an average CTC diameter (12-25 µm), plays a critical role in maximizing the probability of cell/wall interactions and allows for achieving high CTC recovery. The extended channel depth allows for increased throughput at the optimized flow velocity (2 mm/s in a microchannel); maximizes cell recovery, and prevents clogging of the microfluidic channels during blood processing. Fluidic addressing of the microchannel array with a minimal device footprint is provided by large cross-sectional area feed and exit channels poised orthogonal to the network of the sinusoidal capillary channels (so-called Z-geometry). Computational modeling was used to confirm uniform addressing of the channels in the isolation bed. Devices with various numbers of parallel microchannels ranging from 50 to 320 have been successfully constructed. Cyclic olefin copolymer (COC) was chosen as the substrate material due to its superior properties during UV-activation of the HAR microchannels surfaces prior to antibody attachment. Operation of the HT-CTC device has been validated by isolation of CTCs directly from blood secured from patients with metastatic prostate cancer. High CTC sample purities (low number of contaminating white blood cells, WBCs) allowed for direct lysis and molecular profiling of isolated CTCs.
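A back-of-the-envelope calculation, using only the channel geometry, flow velocity and channel count quoted above, shows the kind of volumetric throughput such a parallel array implies; the numbers below are illustrative arithmetic, not reported device performance.

```python
# Volumetric throughput implied by the stated geometry (illustrative only).
width_m, depth_m = 25e-6, 150e-6     # channel cross-section (m)
velocity_m_s = 2e-3                  # 2 mm/s linear flow velocity
n_channels = 320                     # largest device variant mentioned

per_channel_m3_s = width_m * depth_m * velocity_m_s
total_ml_h = per_channel_m3_s * n_channels * 1e6 * 3600   # m^3 -> mL, s -> h
print(f"{total_ml_h:.1f} mL of blood per hour")           # ~8.6 mL/h
```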
Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y
2014-07-08
The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulations in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools provide the flexibility to SpheroidSizer in dealing with various types of spheroids and diverse quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
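The volume step can be illustrated with the common spheroid approximation V = (π/6) × length × width², computed from the measured major and minor axial lengths; this mirrors the general idea rather than SpheroidSizer's actual implementation.

```python
# Prolate-spheroid volume estimate from major/minor axial lengths (sketch only).
import math

def spheroid_volume(major_um, minor_um):
    """Volume (um^3) assuming a prolate spheroid: V = pi/6 * L * W^2."""
    return math.pi / 6.0 * major_um * minor_um ** 2

print(f"{spheroid_volume(500, 400):.3e} um^3")   # hypothetical 500 x 400 um spheroid
```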
Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz
2018-01-01
High-throughput technologies generate a considerable amount of data, which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expression, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp).
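For readers who prefer scripting, the kind of filter-and-merge task described above can also be sketched in a few lines of pandas; this only illustrates the task, not HTDP itself, and the file and column names are hypothetical.

```python
# Filter one delimited table by a criterion, then merge with another (sketch).
import pandas as pd

variants = pd.read_csv("variants.tsv", sep="\t")    # e.g. a VCF-derived table
coverage = pd.read_csv("coverage.tsv", sep="\t")    # e.g. per-gene coverage

filtered = variants[variants["quality"] >= 30]       # external filtering criterion
merged = filtered.merge(coverage, on="gene", how="inner")
merged.to_csv("merged_filtered.tsv", sep="\t", index=False)
```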
Bałut, Magdalena; Buckley, Patrick G.; Ochocka, J. Renata; Bartoszewski, Rafał; Crossman, David K.; Messiaen, Ludwine M.; Piotrowski, Arkadiusz
2018-01-01
High-throughput technologies generate a considerable amount of data, which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expression, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp). PMID:29432475
Mason, Annaliese S; Zhang, Jing; Tollenaere, Reece; Vasquez Teuber, Paula; Dalton-Morgan, Jessica; Hu, Liyong; Yan, Guijun; Edwards, David; Redden, Robert; Batley, Jacqueline
2015-09-01
Germplasm collections provide an extremely valuable resource for breeders and researchers. However, misclassification of accessions by species often hinders the effective use of these collections. We propose that use of high-throughput genotyping tools can provide a fast, efficient and cost-effective way of confirming species in germplasm collections, as well as providing valuable genetic diversity data. We genotyped 180 Brassicaceae samples sourced from the Australian Grains Genebank across the recently released Illumina Infinium Brassica 60K SNP array. Of these, 76 were provided on the basis of suspected misclassification and another 104 were sourced independently from the germplasm collection. Presence of the A- and C-genomes combined with principal components analysis clearly separated Brassica rapa, B. oleracea, B. napus, B. carinata and B. juncea samples into distinct species groups. Several lines were further validated using chromosome counts. Overall, 18% of samples (32/180) were misclassified on the basis of species. Within these 180 samples, 23/76 (30%) supplied on the basis of suspected misclassification were misclassified, and 9/105 (9%) of the samples randomly sourced from the Australian Grains Genebank were misclassified. Surprisingly, several individuals were also found to be the product of interspecific hybridization events. The SNP (single nucleotide polymorphism) array proved effective at confirming species, and provided useful information related to genetic diversity. As similar genomic resources become available for different crops, high-throughput molecular genotyping will offer an efficient and cost-effective method to screen germplasm collections worldwide, facilitating more effective use of these valuable resources by breeders and researchers. © 2015 John Wiley & Sons Ltd.
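The clustering idea behind the species assignment can be sketched as a principal components analysis of a sample × SNP allele-dosage matrix, where samples of the same species are expected to group together; the data below are random placeholders rather than Brassica 60K calls.

```python
# PCA of a genotype matrix via SVD (illustrative placeholder data).
import numpy as np

rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(180, 5000)).astype(float)  # 180 samples x 5000 SNPs

X = genotypes - genotypes.mean(axis=0)       # centre each SNP column
U, S, Vt = np.linalg.svd(X, full_matrices=False)
pcs = U[:, :2] * S[:2]                       # sample coordinates on PC1/PC2
print(pcs.shape)                             # (180, 2): points that would be plotted by species
```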
An Overview of the HST Advanced Camera for Surveys' On-orbit Performance
NASA Astrophysics Data System (ADS)
Hartig, G. F.; Ford, H. C.; Illingworth, G. D.; Clampin, M.; Bohlin, R. C.; Cox, C.; Krist, J.; Sparks, W. B.; De Marchi, G.; Martel, A. R.; McCann, W. J.; Meurer, G. R.; Sirianni, M.; Tsvetanov, Z.; Bartko, F.; Lindler, D. J.
2002-05-01
The Advanced Camera for Surveys (ACS) was installed in the HST on 7 March 2002 during the fourth servicing mission to the observatory, and is now beginning science operations. The ACS provides HST observers with a considerably more sensitive, higher-resolution camera with wider field and polarimetric, coronagraphic, low-resolution spectrographic and solar-blind FUV capabilities. We review selected results of the early verification and calibration program, comparing the achieved performance with the advertised specifications. Emphasis is placed on the optical characteristics of the camera, including image quality, throughput, geometric distortion and stray-light performance. More detailed analyses of various aspects of the ACS performance are presented in other papers at this meeting. This work was supported by a NASA contract and a NASA grant.
Phase-Sensitive Surface Plasmon Resonance Sensors: Recent Progress and Future Prospects
Deng, Shijie; Wang, Peng; Yu, Xinglong
2017-01-01
Surface plasmon resonance (SPR) is an optical sensing technique that is capable of performing real-time, label-free and high-sensitivity monitoring of molecular interactions. SPR biosensors can be divided according to their operating principles into angle-, wavelength-, intensity- and phase-interrogated devices. With their complex optical configurations, phase-interrogated SPR sensors generally provide higher sensitivity and throughput, and have thus recently emerged as prominent biosensing devices. To date, several methods have been developed for SPR phase interrogation, including heterodyne detection, polarimetry, shear interferometry, spatial phase modulation interferometry and temporal phase modulation interferometry. This paper summarizes the fundamentals of phase-sensitive SPR sensing, reviews the available methods for phase interrogation of these sensors, and discusses the future prospects for and trends in the development of this technology. PMID:29206182
Zhang, Le; Lawson, Ken; Yeung, Bernice; Wypych, Jette
2015-01-06
A purity method based on capillary zone electrophoresis (CZE) has been developed for the separation of isoforms of a highly glycosylated protein. The separation was found to be driven by the number of sialic acids attached to each isoform. The method has been characterized using orthogonal assays and shown to have excellent specificity, precision and accuracy. We have demonstrated the CZE method is a useful in-process assay to support cell culture and purification development of this glycoprotein. Compared to isoelectric focusing (IEF), the CZE method provides more quantitative results and higher sample throughput with excellent accuracy, qualities that are required for process development. In addition, the CZE method has been applied in the stability testing of purified glycoprotein samples.
Possibilities for serial femtosecond crystallography sample delivery at future light sources
Chavas, L. M. G.; Gumprecht, L.; Chapman, H. N.
2015-01-01
Serial femtosecond crystallography (SFX) uses X-ray pulses from free-electron laser (FEL) sources that can outrun radiation damage and thereby overcome long-standing limits in the structure determination of macromolecular crystals. Intense X-ray FEL pulses of sufficiently short duration allow the collection of damage-free data at room temperature and give the opportunity to study irreversible time-resolved events. SFX may open the way to determine the structure of biological molecules that fail to crystallize readily into large well-diffracting crystals. Taking advantage of FELs with high pulse repetition rates could lead to short measurement times of just minutes. Automated delivery of sample suspensions for SFX experiments could potentially give rise to a much higher rate of obtaining complete measurements than at today's third generation synchrotron radiation facilities, as no crystal alignment or complex robotic motions are required. This capability will also open up extensive time-resolved structural studies. New challenges arise from the resulting high rate of data collection, and in providing reliable sample delivery. Various developments for fully automated high-throughput SFX experiments are being considered for evaluation, including new implementations for a reliable yet flexible sample environment setup. Here, we review the different methods developed so far that best achieve sample delivery for X-ray FEL experiments and present some considerations towards the goal of high-throughput structure determination with X-ray FELs. PMID:26798808
Modeling congenital disease and inborn errors of development in Drosophila melanogaster
Moulton, Matthew J.; Letsou, Anthea
2016-01-01
Fly models that faithfully recapitulate various aspects of human disease and human health-related biology are being used for research into disease diagnosis and prevention. Established and new genetic strategies in Drosophila have yielded numerous substantial successes in modeling congenital disorders or inborn errors of human development, as well as neurodegenerative disease and cancer. Moreover, although our ability to generate sequence datasets continues to outpace our ability to analyze these datasets, the development of high-throughput analysis platforms in Drosophila has provided access through the bottleneck in the identification of disease gene candidates. In this Review, we describe both the traditional and newer methods that are facilitating the incorporation of Drosophila into the human disease discovery process, with a focus on the models that have enhanced our understanding of human developmental disorders and congenital disease. Enviable features of the Drosophila experimental system, which make it particularly useful in facilitating the much anticipated move from genotype to phenotype (understanding and predicting phenotypes directly from the primary DNA sequence), include its genetic tractability, the low cost for high-throughput discovery, and a genome and underlying biology that are highly evolutionarily conserved. In embracing the fly in the human disease-gene discovery process, we can expect to speed up and reduce the cost of this process, allowing experimental scales that are not feasible and/or would be too costly in higher eukaryotes. PMID:26935104
Han, Bomie; Higgs, Richard E
2008-09-01
High-throughput HPLC-mass spectrometry (HPLC-MS) is routinely used to profile biological samples for potential protein markers of disease, drug efficacy and toxicity. The discovery technology has advanced to the point where translating hypotheses from proteomic profiling studies into clinical use is the bottleneck to realizing the full potential of these approaches. The first step in this translation is the development and analytical validation of a higher throughput assay with improved sensitivity and selectivity relative to typical profiling assays. Multiple reaction monitoring (MRM) assays are an attractive approach for this stage of biomarker development given their improved sensitivity and specificity, the speed at which the assays can be developed and the quantitative nature of the assay. While the profiling assays are performed with ion trap mass spectrometers, MRM assays are traditionally developed in quadrupole-based mass spectrometers. Development of MRM assays from the same instrument used in the profiling analysis enables a seamless and rapid transition from hypothesis generation to validation. This report provides guidelines for rapidly developing an MRM assay using the same mass spectrometry platform used for profiling experiments (typically ion traps) and reviews methodological and analytical validation considerations. The analytical validation guidelines presented are drawn from existing practices on immunological assays and are applicable to any mass spectrometry platform technology.
Emerging Genomic Tools for Legume Breeding: Current Status and Future Prospects
Pandey, Manish K.; Roorkiwal, Manish; Singh, Vikas K.; Ramalingam, Abirami; Kudapa, Himabindu; Thudi, Mahendar; Chitikineni, Anu; Rathore, Abhishek; Varshney, Rajeev K.
2016-01-01
Legumes play a vital role in ensuring global nutritional food security and improving soil quality through nitrogen fixation. Accelerated genetic gains are required to meet the demand of an ever-increasing global population. In recent years, rapid developments have been witnessed in legume genomics due to advancements in next-generation sequencing (NGS) and high-throughput genotyping technologies. Reference genome sequences for many legume crops have been reported in the last 5 years. The availability of the draft genome sequences and re-sequencing of elite genotypes for several important legume crops have made it possible to identify structural variations at large scale. Availability of large-scale genomic resources and low-cost and high-throughput genotyping technologies are enhancing the efficiency and resolution of genetic mapping and marker-trait association studies. Most importantly, deployment of molecular breeding approaches has resulted in the development of improved lines in some legume crops such as chickpea and groundnut. In order to support genomics-driven crop improvement at a fast pace, the deployment of breeder-friendly genomics and decision support tools appears to be critical in breeding programs in developing countries. This review provides an overview of emerging genomics and informatics tools/approaches that will be the key driving force for accelerating genomics-assisted breeding and ultimately ensuring nutritional and food security in developing countries. PMID:27199998
Hu, Xiaolong; Liu, Gang; Shafer, Aaron B. A.; Wei, Yuting; Zhou, Juntong; Lin, Shaobi; Wu, Haibin; Zhou, Mi; Hu, Defu; Liu, Shuqiang
2017-01-01
The gut ecosystem is characterized by dynamic and reciprocal interactions between the host and bacteria. Although characterizing microbiota for herbivores has become recognized as an important tool for gauging species health, no study to date has investigated the bacterial communities and evaluated the age-related bacterial dynamics of musk deer. Moreover, gastrointestinal diseases have been hypothesized to be a limiting factor of population growth in captive musk deer. Here, high-throughput sequencing of the bacterial 16S rRNA gene was used to profile the fecal bacterial communities in juvenile and adult alpine and forest musk deer. The two musk deer species harbored similar bacterial communities at the phylum level, whereas the key genera for the two species were distinct. The bacterial communities were dominated by Firmicutes and Bacteroidetes, with the bacterial diversity being higher in forest musk deer. The Firmicutes to Bacteroidetes ratio also increased from juvenile to adult, while the bacterial diversity, within-group and between-group similarity, all increased with age. This work serves as the first sequence-based analysis of variation in bacterial communities within and between musk deer species, and demonstrates how the gut microbial community dynamics vary among closely related species and shift with age. As gastrointestinal diseases have been observed in captive populations, this study provides valuable data that might benefit captive management and future reintroduction programs. PMID:28421061
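Two of the summary statistics mentioned above, the Firmicutes:Bacteroidetes ratio and a diversity index, are simple to compute from a per-sample count table; the counts below are hypothetical, and Shannon diversity is used only as a generic example of a diversity measure.

```python
# Toy per-sample summary statistics for a phylum-level count table.
import math

counts = {"Firmicutes": 5200, "Bacteroidetes": 3100,
          "Proteobacteria": 900, "Actinobacteria": 400}   # hypothetical reads

fb_ratio = counts["Firmicutes"] / counts["Bacteroidetes"]

total = sum(counts.values())
shannon = -sum((n / total) * math.log(n / total) for n in counts.values())
print(f"F/B ratio = {fb_ratio:.2f}, Shannon H' = {shannon:.2f}")
```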
Fast and Flexible Successive-Cancellation List Decoders for Polar Codes
NASA Astrophysics Data System (ADS)
Hashemi, Seyyed Ali; Condo, Carlo; Gross, Warren J.
2017-11-01
Polar codes have gained a significant amount of attention during the past few years and have been selected as a coding scheme for the next generation of mobile broadband standards. Among decoding schemes, successive-cancellation list (SCL) decoding provides a reasonable trade-off between the error-correction performance and hardware implementation complexity when used to decode polar codes, at the cost of limited throughput. The simplified SCL (SSCL) and its extension SSCL-SPC increase the speed of decoding by removing redundant calculations when encountering particular information and frozen bit patterns (rate one and single parity check codes), while keeping the error-correction performance unaltered. In this paper, we improve SSCL and SSCL-SPC by proving that the list size imposes a specific number of bit estimations required to decode rate one and single parity check codes. Thus, the number of estimations can be limited while guaranteeing exactly the same error-correction performance as if all bits of the code were estimated. We call the new decoding algorithms Fast-SSCL and Fast-SSCL-SPC. Moreover, we show that the number of bit estimations in a practical application can be tuned to achieve desirable speed, while keeping the error-correction performance almost unchanged. Hardware architectures implementing both algorithms are then described and implemented: it is shown that our design can achieve 1.86 Gb/s throughput, higher than the best state-of-the-art decoders.
The iPlant collaborative: cyberinfrastructure for enabling data to discovery for the life sciences
USDA-ARS's Scientific Manuscript database
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning m...
New Applications for the Testing and Visualization of Wireless Networks
NASA Technical Reports Server (NTRS)
Griffin, Robert I.; Cauley, Michael A.; Pleva, Michael A.; Seibert, Marc A.; Lopez, Isaac
2005-01-01
Traditional techniques for examining wireless networks use physical link characteristics such as Signal-to-Noise (SNR) ratios to assess the performance of wireless networks. Such measurements may not be reliable indicators of available bandwidth. This work describes two new software applications developed at NASA Glenn Research Center for the investigation of wireless networks. GPSIPerf combines measurements of Transmission Control Protocol (TCP) throughput with Global Positioning System (GPS) coordinates to give users a map of wireless bandwidth for outdoor environments where a wireless infrastructure has been deployed. GPSIPerfView combines the data provided by GPSIPerf with high-resolution digital elevation maps (DEM) to help users visualize and assess the impact of elevation features on wireless networks in a given sample area. These applications were used to examine TCP throughput in several wireless network configurations at desert field sites near Hanksville, Utah during May of 2004. Use of GPSIPerf and GPSIPerfView provides a geographically referenced picture of the extent and deterioration of TCP throughput in tested wireless network configurations. GPSIPerf results from field-testing in Utah suggest that it can be useful in assessing other wireless network architectures, and may be useful to future human-robotic exploration missions.
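Conceptually, pairing throughput measurements with position reduces to reading (latitude, longitude, throughput) records and summarizing them spatially; the sketch below is a loose illustration of that idea with a hypothetical CSV layout, not the GPSIPerf source.

```python
# Read geo-referenced throughput samples and summarize coverage (sketch only).
import csv

points = []
with open("throughput_gps.csv", newline="") as fh:          # columns: lat,lon,mbps
    for row in csv.DictReader(fh):
        points.append((float(row["lat"]), float(row["lon"]), float(row["mbps"])))

# Simple coverage summary: fraction of sampled locations above a usable rate.
usable = sum(1 for _, _, mbps in points if mbps >= 1.0)
print(f"{usable}/{len(points)} locations with >= 1 Mb/s TCP throughput")
```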
Lontis, Eugen R; Lund, Morten E; Christensen, Henrik V; Bentsen, Bo; Gaihede, Michael; Caltenco, Hector A; Andreasen Struijk, Lotte N S
2010-01-01
The typing performance of a full-alphabet keyboard and a joystick-type mouse (with on-screen keyboard) provided by a wireless integrated tongue control system (TCS) has been investigated. Speed and accuracy were measured in the form of a throughput, defined as true correct words per minute [cwpm]. Training character sequences were typed in a dedicated interface that provided visual feedback of activated sensors, a map of the associated alphabet, and the task character. Testing sentences were typed in Word, with limited visual feedback, using non-predictive typing (a map of characters in alphabetic order associated to sensors) and predictive typing (LetterWise) for the TCS keyboard, and non-predictive typing for the TCS mouse. Two subjects participated for four and three consecutive days, respectively, with two sessions per day. Maximal throughputs of 2.94 and 2.46, and of 2.06 and 1.68 [cwpm], were obtained with the TCS keyboard by subjects 1 and 2 using predictive and non-predictive typing, respectively. Maximal throughputs of 2.09 and 1.71 [cwpm] were obtained with the TCS mouse by subjects 1 and 2, respectively.
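A minimal sketch of a correct-words-per-minute style metric is shown below: only correctly typed characters are counted, converted to words at the conventional five characters per word, and divided by elapsed minutes; the exact cwpm definition used in the study may differ.

```python
# Toy correct-words-per-minute calculation (assumed definition, for illustration).
def cwpm(typed, target, seconds):
    correct = sum(1 for a, b in zip(typed, target) if a == b)
    return (correct / 5.0) / (seconds / 60.0)

print(round(cwpm("the quick brown fpx", "the quick brown fox", seconds=90), 2))  # 2.4
```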
High-Throughput Sequencing Reveals Principles of Adeno-Associated Virus Serotype 2 Integration
Janovitz, Tyler; Klein, Isaac A.; Oliveira, Thiago; Mukherjee, Piali; Nussenzweig, Michel C.; Sadelain, Michel
2013-01-01
Viral integrations are important in human biology, yet genome-wide integration profiles have not been determined for many viruses. Adeno-associated virus (AAV) infects most of the human population and is a prevalent gene therapy vector. AAV integrates into the human genome with preference for a single locus, termed AAVS1. However, the genome-wide integration of AAV has not been defined, and the principles underlying this recombination remain unclear. Using a novel high-throughput approach, integrant capture sequencing, nearly 12 million AAV junctions were recovered from a human cell line, providing five orders of magnitude more data than were previously available. Forty-five percent of integrations occurred near AAVS1, and several thousand novel integration hotspots were identified computationally. Most of these occurred in genes, with dozens of hotspots targeting known oncogenes. Viral replication protein binding sites (RBS) and transcriptional activity were major factors favoring integration. In a first for eukaryotic viruses, the data reveal a unique asymmetric integration profile with distinctive directional orientation of viral genomes. These studies provide a new understanding of AAV integration biology through the use of unbiased high-throughput data acquisition and bioinformatics. PMID:23720718
Automated Purification of Recombinant Proteins: Combining High-throughput with High Yield
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Chiann Tso; Moore, Priscilla A.; Auberry, Deanna L.
2006-05-01
Protein crystallography, mapping protein interactions and other approaches of current functional genomics require not only purifying large numbers of proteins but also obtaining sufficient yield and homogeneity for downstream high-throughput applications. There is a need for the development of robust automated high-throughput protein expression and purification processes to meet these requirements. We developed and compared two alternative workflows for automated purification of recombinant proteins based on expression of bacterial genes in Escherichia coli: first, a filtration separation protocol based on expression of 800 ml E. coli cultures followed by filtration purification using Ni2+-NTA™ Agarose (Qiagen); second, a smaller-scale magnetic separation method based on expression in 25 ml cultures of E. coli followed by 96-well purification on MagneHis™ Ni2+ Agarose (Promega). Both workflows provided comparable average yields of about 8 µg of purified protein per unit of OD at 600 nm of bacterial culture. We discuss advantages and limitations of the automated workflows, which can provide proteins more than 90% pure in the range of 100 µg–45 mg per purification run, as well as strategies for optimization of these protocols.
Aryee, Martin J.; Jaffe, Andrew E.; Corrada-Bravo, Hector; Ladd-Acosta, Christine; Feinberg, Andrew P.; Hansen, Kasper D.; Irizarry, Rafael A.
2014-01-01
Motivation: The recently released Infinium HumanMethylation450 array (the ‘450k’ array) provides a high-throughput assay to quantify DNA methylation (DNAm) at ∼450 000 loci across a range of genomic features. Although less comprehensive than high-throughput sequencing-based techniques, this product is more cost-effective and promises to be the most widely used DNAm high-throughput measurement technology over the next several years. Results: Here we describe a suite of computational tools that incorporate state-of-the-art statistical techniques for the analysis of DNAm data. The software is structured to easily adapt to future versions of the technology. We include methods for preprocessing, quality assessment and detection of differentially methylated regions from the kilobase to the megabase scale. We show how our software provides a powerful and flexible development platform for future methods. We also illustrate how our methods empower the technology to make discoveries previously thought to be possible only with sequencing-based methods. Availability and implementation: http://bioconductor.org/packages/release/bioc/html/minfi.html. Contact: khansen@jhsph.edu; rafa@jimmy.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24478339
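Although minfi itself is an R/Bioconductor package, the core quantity it summarizes can be illustrated in a few lines: the beta value β = M/(M + U + offset) computed from methylated and unmethylated probe intensities. The intensities below are hypothetical, and the offset of 100 is the commonly used default rather than anything specific to this paper.

```python
# Beta-value computation for Infinium methylation intensities (illustrative only).
import numpy as np

M = np.array([12000.0, 300.0, 5000.0])   # hypothetical methylated intensities
U = np.array([800.0, 9000.0, 5200.0])    # hypothetical unmethylated intensities

beta = M / (M + U + 100.0)               # offset of 100 is the common default
print(np.round(beta, 3))                 # values near 1 = methylated, near 0 = unmethylated
```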
High-Throughput Light Sheet Microscopy for the Automated Live Imaging of Larval Zebrafish
NASA Astrophysics Data System (ADS)
Baker, Ryan; Logan, Savannah; Dudley, Christopher; Parthasarathy, Raghuveer
The zebrafish is a model organism with a variety of useful properties; it is small and optically transparent, it reproduces quickly, it is a vertebrate, and there are a large variety of transgenic animals available. Because of these properties, the zebrafish is well suited to study using a variety of optical technologies including light sheet fluorescence microscopy (LSFM), which provides high-resolution three-dimensional imaging over large fields of view. Research progress, however, is often not limited by optical techniques but instead by the number of samples one can examine over the course of an experiment, which in the case of light sheet imaging has so far been severely limited. Here we present an integrated fluidic circuit and microscope which provides rapid, automated imaging of zebrafish using several imaging modes, including LSFM, Hyperspectral Imaging, and Differential Interference Contrast Microscopy. Using this system, we show that we can increase our imaging throughput by a factor of 10 compared to previous techniques. We also show preliminary results visualizing zebrafish immune response, which is sensitive to gut microbiota composition, and which shows a strong variability between individuals that highlights the utility of high throughput imaging. National Science Foundation, Award No. DBI-1427957.
Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B
2009-03-01
The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.
Microarray Detection of Duplex and Triplex DNA Binders with DNA-Modified Gold Nanoparticles
Lytton-Jean, Abigail K. R.; Han, Min Su; Mirkin, Chad A.
2008-01-01
We have designed a chip-based assay, using microarray technology, for determining the relative binding affinities of duplex and triplex DNA binders. This assay combines the high discrimination capabilities afforded by DNA-modified Au nanoparticles with the high-throughput capabilities of DNA microarrays. The detection and screening of duplex DNA binders are important because these molecules, in many cases, are potential anticancer agents as well as toxins. Triplex DNA binders are also promising drug candidates. These molecules, in conjunction with triplex forming oligonucleotides, could potentially be used to achieve control of gene expression by interfering with transcription factors that bind to DNA. Therefore, the ability to screen for these molecules in a high-throughput fashion could dramatically improve the drug screening process. The assay reported here provides excellent discrimination between strong, intermediate, and weak duplex and triplex DNA binders in a high-throughput fashion. PMID:17614366
Advancements in zebrafish applications for 21st century toxicology.
Garcia, Gloria R; Noyes, Pamela D; Tanguay, Robert L
2016-05-01
The zebrafish model is the only available high-throughput vertebrate assessment system, and it is uniquely suited for studies of in vivo cell biology. A sequenced and annotated genome has revealed a large degree of evolutionary conservation in comparison to the human genome. Due to our shared evolutionary history, the anatomical and physiological features of fish are highly homologous to humans, which facilitates studies relevant to human health. In addition, zebrafish provide a very unique vertebrate data stream that allows researchers to anchor hypotheses at the biochemical, genetic, and cellular levels to observations at the structural, functional, and behavioral level in a high-throughput format. In this review, we will draw heavily from toxicological studies to highlight advances in zebrafish high-throughput systems. Breakthroughs in transgenic/reporter lines and methods for genetic manipulation, such as the CRISPR-Cas9 system, will be comprised of reports across diverse disciplines. Copyright © 2016 Elsevier Inc. All rights reserved.
Advancements in zebrafish applications for 21st century toxicology
Garcia, Gloria R.; Noyes, Pamela D.; Tanguay, Robert L.
2016-01-01
The zebrafish model is the only available high-throughput vertebrate assessment system, and it is uniquely suited for studies of in vivo cell biology. A sequenced and annotated genome has revealed a large degree of evolutionary conservation in comparison to the human genome. Due to our shared evolutionary history, the anatomical and physiological features of fish are highly homologous to humans, which facilitates studies relevant to human health. In addition, zebrafish provide a very unique vertebrate data stream that allows researchers to anchor hypotheses at the biochemical, genetic, and cellular levels to observations at the structural, functional, and behavioral level in a high-throughput format. In this review, we will draw heavily from toxicological studies to highlight advances in zebrafish high-throughput systems. Breakthroughs in transgenic/reporter lines and methods for genetic manipulation, such as the CRISPR-Cas9 system, will be comprised of reports across diverse disciplines. PMID:27016469
A high-throughput exploration of magnetic materials by using structure predicting methods
NASA Astrophysics Data System (ADS)
Arapan, S.; Nieves, P.; Cuesta-López, S.
2018-02-01
We study the capability of a structure predicting method based on a genetic/evolutionary algorithm for a high-throughput exploration of magnetic materials. We use the USPEX and VASP codes to predict stable and generate low-energy meta-stable structures for a set of representative magnetic structures comprising intermetallic alloys, oxides, interstitial compounds, and systems containing rare-earth elements, and for both ferromagnetic and antiferromagnetic ordering. We have modified the interface between the USPEX and VASP codes to improve the performance of structural optimization as well as to perform calculations in a high-throughput manner. We show that exploring the structure phase space with a structure predicting technique reveals large sets of low-energy metastable structures, which not only improve currently existing databases, but also may provide understanding and solutions to stabilize and synthesize magnetic materials suitable for permanent magnet applications.
High throughput ion-channel pharmacology: planar-array-based voltage clamp.
Kiss, Laszlo; Bennett, Paul B; Uebele, Victor N; Koblan, Kenneth S; Kane, Stefanie A; Neagle, Brad; Schroeder, Kirk
2003-02-01
Technological advances often drive major breakthroughs in biology. Examples include PCR, automated DNA sequencing, confocal/single photon microscopy, AFM, and voltage/patch-clamp methods. The patch-clamp method, first described nearly 30 years ago, was a major technical achievement that permitted voltage-clamp analysis (membrane potential control) of ion channels in most cells and revealed a role for channels in unimagined areas. Because of the high information content, voltage clamp is the best way to study ion-channel function; however, throughput is too low for drug screening. Here we describe a novel breakthrough planar-array-based HT patch-clamp technology developed by Essen Instruments capable of voltage-clamping thousands of cells per day. This technology provides greater than two orders of magnitude increase in throughput compared with the traditional voltage-clamp techniques. We have applied this method to study the hERG K(+) channel and to determine the pharmacological profile of QT prolonging drugs.
History, applications, and challenges of immune repertoire research.
Liu, Xiao; Wu, Jinghua
2018-02-27
The diversity of T and B cells in terms of their receptor sequences is enormous in the vertebrate immune system and provides broad protection against the vast diversity of pathogens. The immune repertoire is defined as the sum of the T cell receptors and B cell receptors (also named immunoglobulins) that make up the organism's adaptive immune system. Before the emergence of high-throughput sequencing, studies of the immune repertoire were limited by underdeveloped methodologies, since it was impossible to capture the whole picture with low-throughput tools. Massively parallel sequencing technology is perfectly suited to immune repertoire research. In this article, we review the history of immune repertoire studies, in terms of both technologies and research applications. In particular, we discuss several challenges in this field and highlight the efforts to develop potential solutions in the era of high-throughput sequencing of the immune repertoire.
High-speed zero-copy data transfer for DAQ applications
NASA Astrophysics Data System (ADS)
Pisani, Flavio; Cámpora Pérez, Daniel Hugo; Neufeld, Niko
2015-05-01
The LHCb Data Acquisition (DAQ) will be upgraded in 2020 to a trigger-free readout. In order to achieve this goal we will need to connect around 500 nodes with a total network capacity of 32 Tb/s. To get such a high network capacity we are testing zero-copy technology in order to maximize the theoretical link throughput without adding excessive CPU and memory bandwidth overhead, leaving free resources for data processing resulting in less power, space and money used for the same result. We develop a modular test application which can be used with different transport layers. For the zero-copy implementation we choose the OFED IBVerbs API because it can provide low level access and high throughput. We present throughput and CPU usage measurements of 40 GbE solutions using Remote Direct Memory Access (RDMA), for several network configurations to test the scalability of the system.
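The scale of the requirement can be made concrete with simple arithmetic: spreading the quoted 32 Tb/s aggregate capacity over roughly 500 nodes gives the per-node rate that the tested transport layers have to sustain. This is only a rough division of the stated figures, not a statement about the final network design.

```python
# Rough per-node bandwidth implied by the quoted aggregate capacity.
aggregate_tbps = 32.0
nodes = 500
per_node_gbps = aggregate_tbps * 1000 / nodes
print(f"~{per_node_gbps:.0f} Gb/s per node")   # ~64 Gb/s
```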
Rotem, Asaf; Janzer, Andreas; Izar, Benjamin; Ji, Zhe; Doench, John G.; Garraway, Levi A.; Struhl, Kevin
2015-01-01
Colony formation in soft agar is the gold-standard assay for cellular transformation in vitro, but it is unsuited for high-throughput screening. Here, we describe an assay for cellular transformation that involves growth in low attachment (GILA) conditions and is strongly correlated with the soft-agar assay. Using GILA, we describe high-throughput screens for drugs and genes that selectively inhibit or increase transformation, but not proliferation. Such molecules are unlikely to be found through conventional drug screening, and they include kinase inhibitors and drugs for noncancer diseases. In addition to known oncogenes, the genetic screen identifies genes that contribute to cellular transformation. Lastly, we demonstrate the ability of Food and Drug Administration-approved noncancer drugs to selectively kill ovarian cancer cells derived from patients with chemotherapy-resistant disease, suggesting this approach may provide useful information for personalized cancer treatment. PMID:25902495
Rotem, Asaf; Janzer, Andreas; Izar, Benjamin; Ji, Zhe; Doench, John G; Garraway, Levi A; Struhl, Kevin
2015-05-05
Colony formation in soft agar is the gold-standard assay for cellular transformation in vitro, but it is unsuited for high-throughput screening. Here, we describe an assay for cellular transformation that involves growth in low attachment (GILA) conditions and is strongly correlated with the soft-agar assay. Using GILA, we describe high-throughput screens for drugs and genes that selectively inhibit or increase transformation, but not proliferation. Such molecules are unlikely to be found through conventional drug screening, and they include kinase inhibitors and drugs for noncancer diseases. In addition to known oncogenes, the genetic screen identifies genes that contribute to cellular transformation. Lastly, we demonstrate the ability of Food and Drug Administration-approved noncancer drugs to selectively kill ovarian cancer cells derived from patients with chemotherapy-resistant disease, suggesting this approach may provide useful information for personalized cancer treatment.
[Weighted gene co-expression network analysis in biomedicine research].
Liu, Wei; Li, Li; Ye, Hua; Tu, Wei
2017-11-25
High-throughput biological technologies are now widely applied in biology and medicine, allowing scientists to monitor thousands of parameters simultaneously in a specific sample. However, it is still an enormous challenge to mine useful information from high-throughput data. The emergence of network biology provides deeper insights into complex bio-systems and reveals the modularity in tissue/cellular networks. Correlation networks are increasingly used in bioinformatics applications. The weighted gene co-expression network analysis (WGCNA) tool can detect clusters of highly correlated genes. Therefore, we systematically reviewed the application of WGCNA in the study of disease diagnosis, pathogenesis and other related fields. First, we introduced the principle, workflow, advantages and disadvantages of WGCNA. Second, we presented the application of WGCNA in disease, physiology, drug, evolution and genome annotation studies. Then, we indicated the application of WGCNA to newly developed high-throughput methods. We hope this review will help to promote the application of WGCNA in biomedicine research.
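The first step of a WGCNA-style analysis, building a weighted adjacency matrix from gene-gene correlations with a soft-threshold power, can be sketched as follows; this shows the unsigned adjacency only, on placeholder data, while the WGCNA R package adds topological overlap, hierarchical clustering and module detection.

```python
# Unsigned weighted adjacency a_ij = |cor(x_i, x_j)|^beta (illustrative data).
import numpy as np

rng = np.random.default_rng(1)
expr = rng.normal(size=(20, 50))             # 20 samples x 50 genes (placeholder)

cor = np.corrcoef(expr, rowvar=False)        # gene-gene correlation matrix
beta = 6                                     # typical soft-threshold power
adjacency = np.abs(cor) ** beta
connectivity = adjacency.sum(axis=0) - 1.0   # per-gene connectivity, excluding self
print(connectivity[:5])
```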
Spotsizer: High-throughput quantitative analysis of microbial growth.
Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg
2016-10-01
Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
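For orientation, a bare-bones colony-size measurement might look like the sketch below: threshold a grayscale plate image and report per-colony areas with scikit-image. This is not Spotsizer's algorithm, and the image file name and the assumption that colonies are brighter than the background are hypothetical.

```python
# Minimal colony-area measurement from a plate image (illustrative only).
import numpy as np
from skimage import io, filters, measure

img = io.imread("plate.png", as_gray=True)
mask = img > filters.threshold_otsu(img)      # assumes colonies brighter than agar
labels = measure.label(mask)
areas = [r.area for r in measure.regionprops(labels)]
print(f"{len(areas)} colonies, median area {np.median(areas):.0f} px")
```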
Quigley, Lisa; O'Sullivan, Orla; Beresford, Tom P.; Ross, R. Paul; Fitzgerald, Gerald F.
2012-01-01
Here, high-throughput sequencing was employed to reveal the highly diverse bacterial populations present in 62 Irish artisanal cheeses and, in some cases, associated cheese rinds. Using this approach, we revealed the presence of several genera not previously associated with cheese, including Faecalibacterium, Prevotella, and Helcococcus and, for the first time, detected the presence of Arthrobacter and Brachybacterium in goats' milk cheese. Our analysis confirmed many previously observed patterns, such as the dominance of typical cheese bacteria, the fact that the microbiota of raw and pasteurized milk cheeses differ, and that the level of cheese maturation has a significant influence on Lactobacillus populations. It was also noted that cheeses containing adjunct ingredients had lower proportions of Lactococcus species. It is thus apparent that high-throughput sequencing-based investigations can provide valuable insights into the microbial populations of artisanal foods. PMID:22685131
Ching, Travers; Zhu, Xun; Garmire, Lana X
2018-04-01
Artificial neural networks (ANN) are computing architectures with many interconnections of simple neural-inspired computing elements, and have been applied to biomedical fields such as imaging analysis and diagnosis. We have developed a new ANN framework called Cox-nnet to predict patient prognosis from high throughput transcriptomics data. In 10 TCGA RNA-Seq data sets, Cox-nnet achieves the same or better predictive accuracy compared to other methods, including Cox proportional hazards regression (with LASSO, ridge, and minimax concave penalty), Random Forests Survival and CoxBoost. Cox-nnet also reveals richer biological information, at both the pathway and gene levels. The outputs from the hidden layer nodes provide an alternative approach for survival-sensitive dimension reduction. In summary, we have developed a new method for accurate and efficient prognosis prediction on high throughput data, with functional biological insights. The source code is freely available at https://github.com/lanagarmire/cox-nnet.
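The distinctive training signal in a survival network such as Cox-nnet is the Cox partial likelihood rather than a standard regression loss. Below is a minimal numpy sketch of that negative log partial likelihood (Breslow-style handling of ties); it illustrates the objective only and is not the authors' implementation.

```python
import numpy as np

def neg_log_partial_likelihood(theta, time, event):
    """Cox partial-likelihood loss used to train survival networks.

    theta: model output (log hazard ratio) per patient
    time:  observed or censoring time per patient
    event: 1 if the event was observed, 0 if censored
    """
    order = np.argsort(-time)               # sort by descending time
    theta, event = theta[order], event[order]
    # cumulative log-sum-exp gives log sum(exp(theta)) over each risk set
    log_risk = np.logaddexp.accumulate(theta)
    return -np.sum((theta - log_risk)[event.astype(bool)])
```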
Zhong, Qing; Rüschoff, Jan H.; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J.; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J.; Rupp, Niels J.; Fankhauser, Christian; Buhmann, Joachim M.; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A.; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C.; Jochum, Wolfram; Wild, Peter J.
2016-01-01
Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (ISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility. PMID:27052161
Short-read, high-throughput sequencing technology for STR genotyping
Bornman, Daniel M.; Hester, Mark E.; Schuetter, Jared M.; Kasoji, Manjula D.; Minard-Smith, Angela; Barden, Curt A.; Nelson, Scott C.; Godbold, Gene D.; Baker, Christine H.; Yang, Boyu; Walther, Jacquelyn E.; Tornes, Ivan E.; Yan, Pearlly S.; Rodriguez, Benjamin; Bundschuh, Ralf; Dickens, Michael L.; Young, Brian A.; Faith, Seth A.
2013-01-01
DNA-based methods for human identification principally rely upon genotyping of short tandem repeat (STR) loci. Electrophoretic-based techniques for variable-length classification of STRs are universally utilized, but are limited in that they have relatively low throughput and do not yield nucleotide sequence information. High-throughput sequencing technology may provide a more powerful instrument for human identification, but is not currently validated for forensic casework. Here, we present a systematic method to perform high-throughput genotyping analysis of the Combined DNA Index System (CODIS) STR loci using short-read (150 bp) massively parallel sequencing technology. Open source reference alignment tools were optimized to evaluate PCR-amplified STR loci using a custom designed STR genome reference. Evaluation of this approach demonstrated that the 13 CODIS STR loci and amelogenin (AMEL) locus could be accurately called from individual and mixture samples. Sensitivity analysis showed that as few as 18,500 reads, aligned to an in silico referenced genome, were required to genotype an individual (>99% confidence) for the CODIS loci. The power of this technology was further demonstrated by identification of variant alleles containing single nucleotide polymorphisms (SNPs) and the development of quantitative measurements (reads) for resolving mixed samples. PMID:25621315
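The genotype-calling step described above ultimately reduces to counting reads that align to allele-specific STR references and reporting the best-supported allele or alleles. A hedged sketch of such a caller follows; the locus and allele names, read-count threshold and heterozygote ratio are hypothetical, not values from the study.

```python
from collections import Counter

def call_str_genotype(allele_read_counts, min_reads=50, het_ratio=0.3):
    """Call a genotype at one STR locus from reads aligned to allele references.

    allele_read_counts: e.g. {"D18S51_14": 812, "D18S51_16": 790, ...}
    The locus is called heterozygous if the second allele carries at least
    `het_ratio` of the top allele's reads; thresholds are illustrative.
    """
    counts = Counter(allele_read_counts)
    (a1, n1), *rest = counts.most_common(2)
    if n1 < min_reads:
        return None                      # insufficient coverage to call
    if rest and rest[0][1] >= het_ratio * n1:
        return sorted([a1, rest[0][0]])  # heterozygous call
    return [a1, a1]                      # homozygous call

print(call_str_genotype({"D18S51_14": 812, "D18S51_16": 790, "D18S51_12": 9}))
```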
Bond, Thomas E H; Sorenson, Alanna E; Schaeffer, Patrick M
2017-12-01
Biotin protein ligase (BirA) has been identified as an emerging drug target in Mycobacterium tuberculosis due to its essential metabolic role. Indeed, it is the only enzyme capable of covalently attaching biotin onto the biotin carboxyl carrier protein subunit of the acetyl-CoA carboxylase. Despite recent interest in this protein, there is still a gap in cost-effective high-throughput screening assays for rapid identification of mycobacterial BirA-targeting inhibitors. We present for the first time the cloning, expression and purification of mycobacterial GFP-tagged BirA and its application for the development of a high-throughput assay building on the principle of differential scanning fluorimetry of GFP-tagged proteins. The data obtained in this study reveal how biotin and ATP significantly increase the thermal stability (ΔTm = +16.5 °C) of M. tuberculosis BirA and lead to formation of a high affinity holoenzyme complex (Kobs = 7.7 nM). The new findings and mycobacterial BirA high-throughput assay presented in this work could provide an efficient platform for future anti-tubercular drug discovery campaigns. Copyright © 2017 Elsevier GmbH. All rights reserved.
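The ΔTm readout of a differential scanning fluorimetry assay is obtained by fitting a sigmoidal transition to each melt curve and comparing fitted midpoints. A minimal scipy sketch of that fitting step is shown below; the Boltzmann model and starting parameters are generic assumptions, not the assay's published analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(T, bottom, top, Tm, slope):
    """Sigmoidal unfolding transition of a fluorescence melt curve."""
    return bottom + (top - bottom) / (1.0 + np.exp((Tm - T) / slope))

def fit_tm(temps, fluorescence):
    p0 = [fluorescence.min(), fluorescence.max(), np.median(temps), 2.0]
    popt, _ = curve_fit(boltzmann, temps, fluorescence, p0=p0, maxfev=10000)
    return popt[2]  # fitted Tm

# apo and biotin/ATP-bound curves would be fitted separately, then
# delta_Tm = fit_tm(T, holo_signal) - fit_tm(T, apo_signal)
```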
NASA Astrophysics Data System (ADS)
Pfeiffer, Hans
1999-12-01
Projection reduction exposure with variable axis immersion lenses (PREVAIL) represents the high throughput e-beam projection approach to next generation lithography (NGL), which IBM is pursuing in cooperation with Nikon Corporation as an alliance partner. This paper discusses the challenges and accomplishments of the PREVAIL project. The supreme challenge facing all e-beam lithography approaches has been and still is throughput. Since the throughput of e-beam projection systems is severely limited by the available optical field size, the key to success is the ability to overcome this limitation. The PREVAIL technique overcomes field-limiting off-axis aberrations through the use of variable axis lenses, which electronically shift the optical axis simultaneously with the deflected beam, so that the beam effectively remains on axis. The resist images obtained with the proof-of-concept (POC) system demonstrate that PREVAIL effectively eliminates off-axis aberrations affecting both the resolution and placement accuracy of pixels. As part of the POC system a high emittance gun has been developed to provide uniform illumination of the patterned subfield, and to fill the large numerical aperture projection optics designed to significantly reduce beam blur caused by Coulomb interaction.
High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila.
Chiaraviglio, Lucius; Kirby, James E
2015-12-01
Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
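The "high-resolution dose-response analysis" mentioned above typically amounts to fitting a four-parameter logistic (Hill) curve to per-concentration growth readouts and reporting the midpoint (IC50/EC50). A generic scipy sketch follows; the concentrations and responses are made-up illustration data, not results from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

def fit_ic50(conc, response):
    p0 = [response.min(), response.max(), np.median(conc), 1.0]
    popt, _ = curve_fit(four_pl, conc, response, p0=p0, maxfev=10000)
    return popt[2]

# illustrative numbers: concentration in µg/ml, response = growth signal
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
resp = np.array([98, 95, 80, 45, 15, 6, 4], dtype=float)
print(f"IC50 ~ {fit_ic50(conc, resp):.2f} µg/ml")
```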
Lens-free shadow image based high-throughput continuous cell monitoring technique.
Jin, Geonsoo; Yoo, In-Hwa; Pack, Seung Pil; Yang, Ji-Woon; Ha, Un-Hwan; Paek, Se-Hwan; Seo, Sungkyu
2012-01-01
A high-throughput continuous cell monitoring technique which does not require any labeling reagents or destruction of the specimen is demonstrated. More than 6000 human alveolar epithelial A549 cells are monitored for up to 72 h simultaneously and continuously with a single digital image within a cost and space effective lens-free shadow imaging platform. In an experiment performed within a custom built incubator integrated with the lens-free shadow imaging platform, the cell nucleus division process could be successfully characterized by calculating the signal-to-noise ratios (SNRs) and the shadow diameters (SDs) of the cell shadow patterns. The versatile nature of this platform also enabled a single cell viability test followed by live cell counting. This study is the first to show that the lens-free shadow imaging technique can provide continuous cell monitoring without any staining/labeling reagents or destruction of the specimen. This high-throughput continuous cell monitoring technique based on lens-free shadow imaging may be widely utilized as a compact, low-cost, and high-throughput cell monitoring tool in the fields of drug and food screening or cell proliferation and viability testing. Copyright © 2012 Elsevier B.V. All rights reserved.
De Diego, Nuria; Fürst, Tomáš; Humplík, Jan F; Ugena, Lydia; Podlešáková, Kateřina; Spíchal, Lukáš
2017-01-01
High-throughput plant phenotyping platforms provide new possibilities for automated, fast scoring of several plant growth and development traits, followed over time using non-invasive sensors. Using Arabidopsis as a model offers important advantages for high-throughput screening with the opportunity to extrapolate the results obtained to other crops of commercial interest. In this study we describe the development of a highly reproducible high-throughput Arabidopsis in vitro bioassay established using our OloPhen platform, suitable for analysis of rosette growth in multi-well plates. This method was successfully validated on the example of a multivariate analysis of Arabidopsis rosette growth in different salt concentrations and the interaction with varying nutritional composition of the growth medium. Several traits such as changes in the rosette area, relative growth rate, survival rate and homogeneity of the population are scored using fully automated RGB imaging and subsequent image analysis. The assay can be used for fast screening of the biological activity of chemical libraries, phenotypes of transgenic or recombinant inbred lines, or to search for potential quantitative trait loci. It is especially valuable for selecting genotypes or growth conditions that improve plant stress tolerance.
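One of the scored traits, relative growth rate, is conventionally computed from projected rosette areas at two time points as RGR = (ln A2 − ln A1)/(t2 − t1). A small sketch with purely illustrative numbers:

```python
import numpy as np

def relative_growth_rate(area_t1, area_t2, t1, t2):
    """Classical RGR from projected rosette areas at two times (days):
    RGR = (ln A2 - ln A1) / (t2 - t1), in day^-1."""
    return (np.log(area_t2) - np.log(area_t1)) / (t2 - t1)

# e.g. a rosette growing from 12 mm^2 to 30 mm^2 over 4 days
print(relative_growth_rate(12.0, 30.0, 0.0, 4.0))   # ~0.23 per day
```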
Cellular resolution functional imaging in behaving rats using voluntary head restraint
Scott, Benjamin B.; Brody, Carlos D.; Tank, David W.
2013-01-01
High-throughput operant conditioning systems for rodents provide efficient training on sophisticated behavioral tasks. Combining these systems with technologies for cellular resolution functional imaging would provide a powerful approach to study neural dynamics during behavior. Here we describe an integrated two-photon microscope and behavioral apparatus that allows cellular resolution functional imaging of cortical regions during epochs of voluntary head restraint. Rats were trained to initiate periods of restraint up to 8 seconds in duration, which provided the mechanical stability necessary for in vivo imaging while allowing free movement between behavioral trials. A mechanical registration system repositioned the head to within a few microns, allowing the same neuronal populations to be imaged on each trial. In proof-of-principle experiments, calcium dependent fluorescence transients were recorded from GCaMP-labeled cortical neurons. In contrast to previous methods for head restraint, this system can also be incorporated into high-throughput operant conditioning systems. PMID:24055015
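Calcium-dependent fluorescence transients such as those recorded here are usually reported as ΔF/F relative to a baseline estimate. A minimal sketch using a low-percentile baseline (one common convention, not necessarily the one used in this study):

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=20):
    """Convert a raw GCaMP fluorescence trace into dF/F.

    The baseline F0 is taken as a low percentile of the trace, a common
    (but not the only) convention for slowly drifting baselines.
    """
    f0 = np.percentile(trace, baseline_percentile)
    return (trace - f0) / f0
```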
Pharmacological profiling of the TRPV3 channel in recombinant and native assays.
Grubisha, Olivera; Mogg, Adrian J; Sorge, Jessica L; Ball, Laura-Jayne; Sanger, Helen; Ruble, Cara L A; Folly, Elizabeth A; Ursu, Daniel; Broad, Lisa M
2014-05-01
Transient receptor potential vanilloid subtype 3 (TRPV3) is implicated in nociception and certain skin conditions. As such, it is an attractive target for pharmaceutical research. Understanding of endogenous TRPV3 function and pharmacology remains elusive as selective compounds and native preparations utilizing higher throughput methodologies are lacking. In this study, we developed medium-throughput recombinant and native cellular assays to assess the detailed pharmacological profile of human, rat and mouse TRPV3 channels. Medium-throughput cellular assays were developed using a Ca(2+) -sensitive dye and a fluorescent imaging plate reader. Human and rat TRPV3 pharmacology was examined in recombinant cell lines, while the mouse 308 keratinocyte cell line was used to assess endogenous TRPV3 activity. A recombinant rat TRPV3 cellular assay was successfully developed after solving a discrepancy in the published rat TRPV3 protein sequence. A medium-throughput, native, mouse TRPV3 keratinocyte assay was also developed and confirmed using genetic approaches. Whereas the recombinant human and rat TRPV3 assays exhibited similar agonist and antagonist profiles, the native mouse assay showed important differences, namely, TRPV3 activity was detected only in the presence of potentiator or during agonist synergy. Furthermore, the native assay was more sensitive to block by some antagonists. Our findings demonstrate similarities but also notable differences in TRPV3 pharmacology between recombinant and native systems. These findings offer insights into TRPV3 function and these assays should aid further research towards developing TRPV3 therapies. © 2013 The British Pharmacological Society.
2016-09-13
FUTURE WORK: As future work, we recommend the enhancement and further optimization of the T2 CUDA Library by using more powerful cards. The Titan X... card is expected to be available in the summer of 2016. The card will have higher throughput and bandwidth than the latest Tesla K40 and Titan X and
Open Access, Retention and Throughput at the Central University of Technology
ERIC Educational Resources Information Center
de Beer, K. J.
2006-01-01
The most debatable question in higher education today is: Why first "open access" to promote massification and now "capping" to restrict learner intake? (cf. SA Media Information 2004). Concerning the management of this difficult and extremely sensitive issue, the Central University of Technology, Free State (CUT) has come a long…
Breeding for improved potato nutrition: High amylose starch potatoes show promise as fiber source
USDA-ARS's Scientific Manuscript database
Potato starch is composed of approximately 75% amylopectin and 25% amylose. We are interested in breeding for higher amylose content, which would increase the fiber content of potato and decrease glycemic index. In order to make progress in a breeding program, we have developed a high throughput ass...
Tacchi, Luca; Larragoite, Erin; Salinas, Irene
2013-01-01
J chain is a small polypeptide responsible for immunoglobulin (Ig) polymerization and transport of Igs across mucosal surfaces in higher vertebrates. We identified a J chain in dipnoid fish, the African lungfish (Protopterus dolloi) by high throughput sequencing of the transcriptome. P. dolloi J chain is 161 aa long and contains six of the eight Cys residues present in mammalian J chain. Phylogenetic studies place the lungfish J chain closer to tetrapod J chain than to the coelacanth or nurse shark sequences. J chain expression occurs in all P. dolloi immune tissues examined and it increases in the gut and kidney in response to an experimental bacterial infection. Double fluorescent in-situ hybridization shows that 88.5% of IgM+ cells in the gut co-express J chain, a significantly higher percentage than in the pre-pyloric spleen. Importantly, J chain expression is not restricted to the B-cell compartment since gut epithelial cells also express J chain. These results improve our current view of J chain from a phylogenetic perspective. PMID:23967082
Parametric study on mixing process in an in-plane spiral micromixer utilizing chaotic advection.
Vatankhah, Parham; Shamloo, Amir
2018-08-31
Recent advances in the field of microfabrication have made the application of high-throughput microfluidics feasible. Mixing, which is an essential part of any miniaturized standalone system, remains the key challenge. This paper proposes a geometrically simple micromixer for efficient mixing in high-throughput microfluidic devices. The proposed micromixer utilizes a curved (spiral) microchannel to induce chaotic advection and enhance the mixing process. It is shown that, in terms of mixing, the spiral microchannel is more efficient than a straight microchannel. The pressure drop in the spiral microchannel is only slightly higher than that in the straight microchannel. It is found that mixing in the spiral microchannel improves with increasing inlet velocity, unlike in the straight microchannel. It is also found that the initial radius of the spiral microchannel plays a prominent role in enhancing the mixing process. Among the different cross sections studied, the square cross section yields the highest mixing quality. Copyright © 2018 Elsevier B.V. All rights reserved.
Wang, Taojun; Zhao, Liang; Sun, Yanan; Ren, Fazheng; Chen, Shanbin; Zhang, Hao; Guo, Huiyuan
2016-11-01
Changes in the microbiota of lamb were investigated under vacuum packaging (VP) and under 20% CO2/80% N2 (LC), 60% CO2/40% N2 (MC), and 100% CO2 (HC) modified atmosphere packaging (MAP) during chilled storage. Viable counts were monitored, and the total microbial communities were assessed by high-throughput sequencing. The starting community had the highest microbial diversity, after which Lactococcus and Carnobacterium spp. outcompeted the other taxa during the 28-day storage. The relative abundances of Brochothrix spp. in the LC atmosphere were much higher than those of the other groups on days 7 and 28. The inhibitory effect of the MAP environments on microbial growth was positively correlated with the CO2 concentration. The HC atmosphere inhibited microbial growth and delayed changes in the microbial community composition, extending the lamb's shelf life by approximately 7 days compared with the VP atmosphere. Lamb packaged in the VP atmosphere had a more desirable colour but a higher weight loss than lamb packaged in the MAP atmospheres. Copyright © 2016 Elsevier Ltd. All rights reserved.
Optima MDxt: A high throughput 335 keV mid-dose implanter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eisner, Edward; David, Jonathan; Justesen, Perry
2012-11-06
The continuing demand for both energy purity and implant angle control along with high wafer throughput drove the development of the Axcelis Optima MDxt mid-dose ion implanter. The system utilizes electrostatic scanning, an electrostatic parallelizing lens and an electrostatic energy filter to produce energetically pure beams with high angular integrity. Based on field proven components, the Optima MDxt beamline architecture offers the high beam currents possible with singly charged species including arsenic at energies up to 335 keV as well as large currents from multiply charged species at energies extending over 1 MeV. Conversely, the excellent energy filtering capability allows high currents at low beam energies, since it is safe to utilize large deceleration ratios. This beamline is coupled with the >500 WPH capable endstation technology used on the Axcelis Optima XEx high energy ion implanter. The endstation includes in-situ angle measurements of the beam in order to maintain excellent beam-to-wafer implant angle control in both the horizontal and vertical directions. The Optima platform control system provides a new-generation dose control system that assures excellent dosimetry and charge control. This paper will describe the features and technologies that allow the Optima MDxt to provide superior process performance at the highest wafer throughput, and will provide examples of the process performance achievable.
Microarray-Based Gene Expression Analysis for Veterinary Pathologists: A Review.
Raddatz, Barbara B; Spitzbarth, Ingo; Matheis, Katja A; Kalkuhl, Arno; Deschl, Ulrich; Baumgärtner, Wolfgang; Ulrich, Reiner
2017-09-01
High-throughput, genome-wide transcriptome analysis is now commonly used in all fields of life science research and is on the cusp of medical and veterinary diagnostic application. Transcriptomic methods such as microarrays and next-generation sequencing generate enormous amounts of data. The pathogenetic expertise acquired from understanding of general pathology provides veterinary pathologists with a profound background, which is essential in translating transcriptomic data into meaningful biological knowledge, thereby leading to a better understanding of underlying disease mechanisms. The scientific literature concerning high-throughput data-mining techniques usually addresses mathematicians or computer scientists as the target audience. In contrast, the present review provides the reader with a clear and systematic basis from a veterinary pathologist's perspective. Therefore, the aims are (1) to introduce the reader to the necessary methodological background; (2) to introduce the sequential steps commonly performed in a microarray analysis including quality control, annotation, normalization, selection of differentially expressed genes, clustering, gene ontology and pathway analysis, analysis of manually selected genes, and biomarker discovery; and (3) to provide references to publically available and user-friendly software suites. In summary, the data analysis methods presented within this review will enable veterinary pathologists to analyze high-throughput transcriptome data obtained from their own experiments, supplemental data that accompany scientific publications, or public repositories in order to obtain a more in-depth insight into underlying disease mechanisms.
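Of the sequential analysis steps listed above, the selection of differentially expressed genes with multiple-testing control is the most formulaic. A minimal sketch follows, assuming an already normalized log2 expression matrix, a two-group design, per-gene t-tests and Benjamini-Hochberg FDR control (one common recipe among several, not the only workflow the review covers).

```python
import numpy as np
from scipy import stats

def differential_expression(expr, group_a, group_b, fdr=0.05):
    """Per-gene two-sample t-test with Benjamini-Hochberg FDR control.

    expr: genes x samples matrix of normalized log2 intensities
    group_a, group_b: column indices of the two sample groups
    Returns indices of genes called differentially expressed.
    """
    _, p = stats.ttest_ind(expr[:, group_a], expr[:, group_b], axis=1)
    order = np.argsort(p)
    m = len(p)
    bh_line = fdr * (np.arange(1, m + 1) / m)
    passed = p[order] <= bh_line
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    return order[:k]     # BH: reject the k smallest p-values
```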
A high throughput mechanical screening device for cartilage tissue engineering.
Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L
2014-06-27
Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.
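The primary readout of such mechanical screening is typically a compressive modulus estimated as the slope of the stress-strain response. A hedged numpy sketch follows; the strain window, units and the assumption of a simple unconfined-compression geometry are illustrative choices, not the device's actual calibration.

```python
import numpy as np

def compressive_modulus(force_N, displacement_mm, area_mm2, thickness_mm,
                        strain_window=(0.05, 0.15)):
    """Estimate a compressive modulus (MPa) from a force-displacement ramp.

    stress = force / area, strain = displacement / thickness; the modulus
    is the slope of stress vs. strain over the chosen strain window.
    """
    stress = np.asarray(force_N) / area_mm2            # N/mm^2 == MPa
    strain = np.asarray(displacement_mm) / thickness_mm
    lo, hi = strain_window
    sel = (strain >= lo) & (strain <= hi)               # needs >= 2 points
    slope, _ = np.polyfit(strain[sel], stress[sel], 1)
    return slope
```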
NASA Astrophysics Data System (ADS)
Close, Dan; Webb, James; Ripp, Steven; Patterson, Stacey; Sayler, Gary
2012-06-01
Traditionally, human toxicant bioavailability screening has been forced to proceed in either a high throughput fashion using prokaryotic or lower eukaryotic targets with minimal applicability to humans, or in a more expensive, lower throughput manner that uses fluorescent or bioluminescent human cells to directly provide human bioavailability data. While these efforts are often sufficient for basic scientific research, they prevent the rapid and remote identification of potentially toxic chemicals required for modern biosecurity applications. To merge the advantages of high throughput, low cost screening regimens with the direct bioavailability assessment of human cell line use, we re-engineered the bioluminescent bacterial luciferase gene cassette to function autonomously (without exogenous stimulation) within human cells. Optimized cassette expression provides for fully endogenous bioluminescent production, allowing continuous, real time monitoring of the bioavailability and toxicology of various compounds in an automated fashion. To assess the functionality of this system, two sets of bioluminescent human cells were developed. The first was programmed to suspend bioluminescent production upon toxicological challenge to mimic the non-specific detection of a toxicant. The second induced bioluminescence upon detection of a specific compound to demonstrate autonomous remote target identification. These cells were capable of responding to μM concentrations of the toxicant n-decanal, and allowed for continuous monitoring of cellular health throughout the treatment process. Induced bioluminescence was generated through treatment with doxycycline and was detectable upon dosage at a 100 ng/ml concentration. These results demonstrate that leveraging autonomous bioluminescence allows for low-cost, high throughput direct assessment of toxicant bioavailability.
Bioconductor | Informatics Technology for Cancer Research (ITCR)
Bioconductor provides tools for the analysis and comprehension of high-throughput genomic data. R/Bioconductor will be enhanced to meet the increasing complexity of multiassay cancer genomics experiments.
NASA Astrophysics Data System (ADS)
Kruse, Daniele Francesco
2014-06-01
Physics data stored in CERN tapes is quickly reaching the 100 PB milestone. Tape is an ever-changing technology that is still following Moore's law in terms of capacity. This means we can store every year more and more data on the same number of tapes. However, this doesn't come for free: the first obvious cost is the new higher capacity media. The second, less known cost is related to moving the data from the old tapes to the new ones. This activity is what we call repack. Repack is vital for any large tape user: without it, one would have to buy more tape libraries and more floor space and, eventually, data on old, no longer supported tapes would become unreadable and be lost forever. In this paper we describe the challenge of repacking 115 PB before LHC data taking starts in the beginning of 2015. This process will have to run concurrently with the existing experiment tape activities, and therefore needs to be as transparent as possible for users. Making sure that this works out seamlessly implies careful planning of the resources and the various policies for sharing them fairly and conveniently. To tackle this problem we need to fully exploit the speed and throughput of our modern tape drives. This involves proper dimensioning and configuration of the disk arrays and all the links between them and the tape servers, i.e. the machines responsible for managing the tape drives. It is also equally important to provide tools to improve the efficiency with which we use our tape libraries. The new repack setup we deployed has on average increased tape drive throughput by 80%, allowing them to perform closer to their design specifications. This improvement in turn means a 48% decrease in the number of drives needed to achieve the required throughput to complete the full repack on time.
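The resource-planning argument in this abstract can be made concrete with a back-of-envelope estimate of how many tape drives a repack campaign needs. Apart from the 115 PB target stated above, the drive throughput, efficiency and time window below are assumed, illustrative numbers, not CERN's actual figures.

```python
def drives_needed(volume_pb, months, drive_mb_per_s, efficiency=0.8):
    """Rough sizing of a tape repack campaign.

    volume_pb: data to migrate, in petabytes
    months: wall-clock time available
    drive_mb_per_s: nominal throughput of one tape drive
    efficiency: fraction of nominal throughput actually sustained
    """
    seconds = months * 30 * 24 * 3600
    bytes_total = volume_pb * 1e15
    per_drive = drive_mb_per_s * 1e6 * efficiency * seconds
    # each byte is read from an old tape and written to a new one,
    # so the moved volume passes through drives twice
    return 2 * bytes_total / per_drive

print(round(drives_needed(115, 12, 250)))   # ~37 drives under these assumptions
```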
Demonstration of electronic design automation flow for massively parallel e-beam lithography
NASA Astrophysics Data System (ADS)
Brandt, Pieter; Belledent, Jérôme; Tranquillin, Céline; Figueiro, Thiago; Meunier, Stéfanie; Bayle, Sébastien; Fay, Aurélien; Milléquant, Matthieu; Icard, Beatrice; Wieland, Marco
2014-07-01
For proximity effect correction in 5 keV e-beam lithography, three elementary building blocks exist: dose modulation, geometry (size) modulation, and background dose addition. Combinations of these three methods are quantitatively compared in terms of throughput impact and process window (PW). In addition, overexposure in combination with negative bias results in PW enhancement at the cost of throughput. In proximity effect correction by overexposure (PEC-OE), the entire layout is set to a fixed dose and geometry sizes are adjusted. In PEC-dose-to-size (PEC-DTS), both dose and geometry sizes are locally optimized. In PEC-background (PEC-BG), a background dose is added to correct the long-range part of the point spread function. In single e-beam tools (Gaussian or shaped-beam), throughput heavily depends on the number of shots. In raster-scan tools such as MAPPER Lithography's FLX 1200 (MATRIX platform) this is not the case; instead of pattern density, the maximum local dose on the wafer limits throughput. The smallest considered half-pitch is 28 nm, which may be considered the 14-nm node for Metal-1 and the 10-nm node for the Via-1 layer, achieved in a single exposure with e-beam lithography. For typical 28-nm-hp Metal-1 layouts, it was shown that dose latitudes (the size of the process window) of around 10% are realizable with available PEC methods. For 28-nm-hp Via-1 layouts this is even higher, at 14% and up. When the layouts do not reach the highest densities (up to 10∶1 in this study), PEC-BG and PEC-OE provide the capability to trade throughput for dose latitude. At the highest densities, PEC-DTS is required for proximity correction, as this method adjusts both geometry edges and doses and will reduce the dose at the densest areas. For 28-nm-hp line critical dimension (CD), hole and dot CD, and line-end edge placement error, the data path errors are typically 0.9, 1.0 and 0.7 nm (3σ) or below, respectively. There is no clear data path performance difference between the investigated PEC methods. After the simulations, the methods were successfully validated in exposures on a MAPPER pre-alpha tool. The 28-nm half-pitch Metal-1 and Via-1 layouts show good performance in resist that coincides with the simulation results. Exposures of soft-edge stitched layouts show that beam-to-beam position errors up to ±7 nm, as specified for the FLX 1200, have no noticeable impact on CD. The research leading to these results has been performed in the frame of the industrial collaborative consortium IMAGINE.
iPSC-derived neurons as a higher-throughput readout for autism: Promises and pitfalls
Prilutsky, Daria; Palmer, Nathan P.; Smedemark-Margulies, Niklas; Schlaeger, Thorsten M.; Margulies, David M.; Kohane, Isaac S.
2014-01-01
The elucidation of disease etiologies and establishment of robust, scalable, high-throughput screening assays for autism spectrum disorders (ASDs) have been impeded by both inaccessibility of disease-relevant neuronal tissue and the genetic heterogeneity of the disorder. Neuronal cells derived from induced pluripotent stem cells (iPSCs) from autism patients may circumvent these obstacles and serve as relevant cell models. To date, derived cells are characterized and screened by assessing their neuronal phenotypes. These characterizations are often etiology-specific or lack reproducibility and stability. In this manuscript, we present an overview of efforts to study iPSC-derived neurons as a model for autism, and we explore the plausibility of gene expression profiling as a reproducible and stable disease marker. PMID:24374161
Sameshima, Tomoya; Miyahisa, Ikuo; Homma, Misaki; Aikawa, Katsuji; Hixon, Mark S; Matsui, Junji
2014-12-15
Identification of inhibitors for protein-protein interactions (PPIs) from high-throughput screening (HTS) is challenging due to the weak affinity of primary hits. We present a hit validation strategy for PPI inhibitors using a quantitative ligand displacement assay. From an HTS for Bcl-xL/Mcl-1 inhibitors, we obtained a hit candidate, I1, which potentially forms a reactive Michael acceptor, I2, inhibiting Bcl-xL/Mcl-1 through covalent modification. We confirmed rapid reversible and competitive binding of I1 with a probe peptide, suggesting non-covalent binding. The advantages of our approach over biophysical assays include simplicity, higher throughput, low protein consumption and universal application to PPIs including insoluble membrane proteins. Copyright © 2014 Elsevier Ltd. All rights reserved.
Identifying genes that extend life span using a high-throughput screening system.
Chen, Cuiying; Contreras, Roland
2007-01-01
We developed a high-throughput functional genomic screening system that allows identification of genes prolonging lifespan in the baker's yeast Saccharomyces cerevisiae. The method is based on isolating yeast mother cells with a higher than average number of cell divisions as indicated by the number of bud scars on their surface. Fluorescently labeled wheat germ agglutinin (WGA) was used for specific staining of chitin, a major component of bud scars. The critical new steps in our bud-scar-sorting system are the use of small microbeads, which allows successive rounds of purification and regrowth of the mother cells (M-cell), and utilization of flow cytometry to sort and isolate cells with a longer lifespan based on the number of bud scars specifically labeled with WGA.
Healy, B J; van der Merwe, D; Christaki, K E; Meghzifene, A
2017-02-01
Medical linear accelerators (linacs) and cobalt-60 machines are both mature technologies for external beam radiotherapy. A comparison is made between these two technologies in terms of infrastructure and maintenance, dosimetry, shielding requirements, staffing, costs, security, patient throughput and clinical use. Infrastructure and maintenance are more demanding for linacs due to the complex electric componentry. In dosimetry, a higher beam energy, modulated dose rate and smaller focal spot size mean that it is easier to create an optimised treatment with a linac for conformal dose coverage of the tumour while sparing healthy organs at risk. In shielding, the requirements for a concrete bunker are similar for cobalt-60 machines and linacs but extra shielding and protection from neutrons are required for linacs. Staffing levels can be higher for linacs and more staff training is required for linacs. Life cycle costs are higher for linacs, especially multi-energy linacs. Security is more complex for cobalt-60 machines because of the high activity radioactive source. Patient throughput can be affected by source decay for cobalt-60 machines but poor maintenance and breakdowns can severely affect patient throughput for linacs. In clinical use, more complex treatment techniques are easier to achieve with linacs, and the availability of electron beams on high-energy linacs can be useful for certain treatments. In summary, there is no simple answer to the question of the choice of either cobalt-60 machines or linacs for radiotherapy in low- and middle-income countries. In fact a radiotherapy department with a combination of technologies, including orthovoltage X-ray units, may be an option. Local needs, conditions and resources will have to be factored into any decision on technology taking into account the characteristics of both forms of teletherapy, with the primary goal being the sustainability of the radiotherapy service over the useful lifetime of the equipment. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
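The throughput point about source decay can be quantified: cobalt-60 activity halves roughly every 5.27 years, so the beam-on time needed per fraction grows by the inverse of the decay factor. A small sketch with illustrative numbers (not figures from the article):

```python
import math

COBALT60_HALF_LIFE_YEARS = 5.27

def beam_on_time(initial_minutes, years_since_source_install):
    """Beam-on time needed to deliver the same dose as the source decays.

    Output falls as exp(-ln2 * t / T_half), so the required time rises
    by the inverse factor. Numbers are illustrative only.
    """
    decay = math.exp(-math.log(2) * years_since_source_install
                     / COBALT60_HALF_LIFE_YEARS)
    return initial_minutes / decay

print(round(beam_on_time(2.0, 5.0), 2))   # ~3.86 min after five years
```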
Developing science gateways for drug discovery in a grid environment.
Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra
2016-01-01
Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly we are faced with an increasing problem to provide the research community of life sciences with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service, applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.
Microfluidic cell chips for high-throughput drug screening
Chi, Chun-Wei; Ahmed, AH Rezwanuddin; Dereli-Korkut, Zeynep; Wang, Sihong
2016-01-01
The current state of screening methods for drug discovery is still riddled with several inefficiencies. Although some widely used high-throughput screening platforms may enhance the drug screening process, their cost and oversimplification of cell–drug interactions pose a translational difficulty. Microfluidic cell-chips resolve many issues found in conventional HTS technology, providing benefits such as reduced sample quantity and integration of 3D cell culture physically more representative of the physiological/pathological microenvironment. In this review, we introduce the advantages of microfluidic devices in drug screening, and outline the critical factors which influence device design, highlighting recent innovations and advances in the field including a summary of commercialization efforts on microfluidic cell chips. Future perspectives of microfluidic cell devices are also provided based on considerations of present technological limitations and translational barriers. PMID:27071838
Purushothama, Shobha; Dysinger, Mark; Chen, Yao; Österlund, Karolina; Mora, Johanna; Chunyk, Allison Given; Peloquin, Russ
2018-02-01
This manuscript aims to provide insights and updates on emerging technologies from a throughput and multiplexing perspective and to update readers on changes in previously reported technologies. The technologies discussed range from nascent (ultrasensitive Cira, Intellicyt®, Dynaxi and Captsure™) to the more established (Ella and SQIDlite™). For the nascent technologies, there was an emphasis on user interviews and reviews, where available, to help provide an unbiased view to our readers. For the Ella, a review of published user data as well as author and other user experiences is summarized. Due to their emergent nature, all the technologies described are applicable in the early drug development stage, may require an upfront investment of capital and may not perform as expected.
Predicting Novel Bulk Metallic Glasses via High-Throughput Calculations
NASA Astrophysics Data System (ADS)
Perim, E.; Lee, D.; Liu, Y.; Toher, C.; Gong, P.; Li, Y.; Simmons, W. N.; Levy, O.; Vlassak, J.; Schroers, J.; Curtarolo, S.
Bulk metallic glasses (BMGs) are materials which may combine key properties of crystalline metals, such as high hardness, with others typical of plastics, such as easy processability. However, the cost of the known BMGs poses a significant obstacle for the development of applications, which has led to a long search for novel, economically viable BMGs. The emergence of high-throughput DFT calculations, such as the library provided by the AFLOWLIB consortium, has provided new tools for materials discovery. We have used these data to develop a new glass-forming descriptor combining structural factors with thermodynamics in order to quickly screen through a large number of alloy systems in the AFLOWLIB database, identifying the most promising systems and the optimal compositions for glass formation. National Science Foundation (DMR-1436151, DMR-1435820, DMR-1436268).
Solar Confocal interferometers for Sub-Picometer-Resolution Spectral Filters
NASA Technical Reports Server (NTRS)
Gary, G. Allen; Pietraszewski, Chris; West, Edward A.; Dines, Terence C.
2007-01-01
The confocal Fabry-Perot interferometer allows sub-picometer spectral resolution of Fraunhofer line profiles. Such high spectral resolution is needed to keep pace with the higher spatial resolution of the new set of large-aperture solar telescopes. The line-of-sight spatial resolution derived for line profile inversions would then track the improvements of the transverse spatial scale provided by the larger apertures. In particular, profile inversion allows improved velocity and magnetic field gradients to be determined independent of multiple line analysis using different energy levels and ions. The confocal interferometer's unique properties allow a simultaneous increase in both etendue and spectral power. The higher throughput for the interferometer provides significant decrease in the aperture, which is important in spaceflight considerations. We have constructed and tested two confocal interferometers. A slow-response thermal-controlled interferometer provides a stable system for laboratory investigation, while a piezoelectric interferometer provides a rapid response for solar observations. In this paper we provide design parameters, show construction details, and report on the laboratory test for these interferometers. The field of view versus aperture for confocal interferometers is compared with other types of spectral imaging filters. We propose a multiple etalon system for observing with these units using existing planar interferometers as pre-filters. The radiometry for these tests established that high spectral resolution profiles can be obtained with imaging confocal interferometers. These sub-picometer spectral data of the photosphere in both the visible and near-infrared can provide important height variation information. However, at the diffraction-limited spatial resolution of the telescope, the spectral data is photon starved due to the decreased spectral passband.
Goodman, Corey W; Major, Heather J; Walls, William D; Sheffield, Val C; Casavant, Thomas L; Darbro, Benjamin W
2015-04-01
Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high-throughput, low-cost analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operating characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as the log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs, which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. Copyright © 2015 Elsevier Inc. All rights reserved.
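The ROC-based calibration described above can be sketched as a simple threshold sweep against per-probe truth labels derived from the higher-resolution array, choosing the cutoff that maximizes Youden's J. This is a generic illustration of the idea, not the CNV-ROC implementation.

```python
import numpy as np

def calibrate_threshold(log2_ratios, truth, thresholds=None):
    """Pick a |log2 ratio| cutoff for CNV calling by ROC analysis.

    log2_ratios: per-probe values from the lower-resolution array
    truth: per-probe 0/1 labels derived from the higher-resolution array
    Returns (best_threshold, sensitivity, specificity) at maximum Youden's J.
    """
    scores = np.abs(np.asarray(log2_ratios))
    if thresholds is None:
        thresholds = np.linspace(scores.min(), scores.max(), 200)
    truth = np.asarray(truth, dtype=bool)
    best = (None, 0.0, 0.0, -1.0)
    for t in thresholds:
        called = scores >= t
        tpr = np.mean(called[truth]) if truth.any() else 0.0
        fpr = np.mean(called[~truth]) if (~truth).any() else 0.0
        j = tpr - fpr                      # Youden's J statistic
        if j > best[3]:
            best = (t, tpr, 1.0 - fpr, j)
    return best[:3]
```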
Sensitive high-throughput screening for the detection of reducing sugars.
Mellitzer, Andrea; Glieder, Anton; Weis, Roland; Reisinger, Christoph; Flicker, Karlheinz
2012-01-01
The exploitation of renewable resources for the production of biofuels relies on efficient processes for the enzymatic hydrolysis of lignocellulosic materials. The development of enzymes and strains for these processes requires reliable and fast activity-based screening assays. Additionally, these assays are also required to operate on the microscale and on the high-throughput level. Herein, we report the development of a highly sensitive reducing-sugar assay in a 96-well microplate screening format. The assay is based on the formation of osazones from reducing sugars and para-hydroxybenzoic acid hydrazide. By using this sensitive assay, the enzyme loads and conversion times during lignocellulose hydrolysis can be reduced, thus allowing higher throughput. The assay is about five times more sensitive than the widely applied dinitrosalicylic acid based assay and can reliably detect reducing sugars down to 10 μM. The assay-specific variation over one microplate was determined for three different lignocellulolytic enzymes and ranges from 2 to 8%. Furthermore, the assay was combined with a microscale cultivation procedure for the activity-based screening of Pichia pastoris strains expressing functional Thermomyces lanuginosus xylanase A, Trichoderma reesei β-mannanase, or T. reesei cellobiohydrolase 2. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High-throughput Cloning and Expression of Integral Membrane Proteins in Escherichia coli
Bruni, Renato
2014-01-01
Recently, several structural genomics centers have been established and a remarkable number of three-dimensional structures of soluble proteins have been solved. For membrane proteins, the number of structures solved has been significantly trailing those for their soluble counterparts, not least because over-expression and purification of membrane proteins is a much more arduous process. By using high throughput technologies, a large number of membrane protein targets can be screened simultaneously and a greater number of expression and purification conditions can be employed, leading to a higher probability of successfully determining the structure of membrane proteins. This unit describes the cloning, expression and screening of membrane proteins using high throughput methodologies developed in our laboratory. Basic Protocol 1 deals with the cloning of inserts into expression vectors by ligation-independent cloning. Basic Protocol 2 describes the expression and purification of the target proteins on a miniscale. Lastly, for the targets that express at the miniscale, basic protocols 3 and 4 outline the methods employed for the expression and purification of targets at the midi-scale, as well as a procedure for detergent screening and identification of detergent(s) in which the target protein is stable. PMID:24510647
NASA Astrophysics Data System (ADS)
Eigenbrot, Arthur D.; Bershady, Matthew A.; Wood, Corey M.
2012-09-01
We present measurements of how multimode fiber focal-ratio degradation (FRD) and throughput vary with levels of fiber surface polish from 60 to 0.5 micron grit. Measurements used full-beam and laser injection methods at wavelengths between 0.4 and 0.8 microns on 17 meter lengths of Polymicro FBP 300 and 400 μm core fiber. Full-beam injection probed input focal-ratios between f/3 and f/13.5, while laser injection allowed us to isolate FRD at discrete injection angles up to 17 degrees (f/1.6 marginal ray). We find (1) FRD effects decrease as grit size decreases, with the largest gains in beam quality occurring at grit sizes above 5 μm; (2) total throughput increases as grit size decreases, reaching 90% at 790 nm with the finest polishing levels; (3) total throughput is higher at redder wavelengths for coarser polishing grit, indicating surface-scattering as the primary source of loss. We also quantify the angular dependence of FRD as a function of polishing level. Our results indicate that a commonly adopted micro-bending model for FRD is a poor descriptor of the observed phenomenon.
Shirotani, Keiro; Futakawa, Satoshi; Nara, Kiyomitsu; Hoshi, Kyoka; Saito, Toshie; Tohyama, Yuriko; Kitazume, Shinobu; Yuasa, Tatsuhiko; Miyajima, Masakazu; Arai, Hajime; Kuno, Atsushi; Narimatsu, Hisashi; Hashimoto, Yasuhiro
2011-01-01
We have established high-throughput lectin-antibody ELISAs to measure different glycans on transferrin (Tf) in cerebrospinal fluid (CSF) using lectins and an anti-transferrin antibody (TfAb). Lectin blot and precipitation analysis of CSF revealed that PVL (Psathyrella velutina lectin) bound unique N-acetylglucosamine-terminated N-glycans on “CSF-type” Tf whereas SSA (Sambucus sieboldiana agglutinin) bound α2,6-N-acetylneuraminic acid-terminated N-glycans on “serum-type” Tf. PVL-TfAb ELISA of 0.5 μL CSF samples detected “CSF-type” Tf but not “serum-type” Tf whereas SSA-TfAb ELISA detected “serum-type” Tf but not “CSF-type” Tf, demonstrating the specificity of the lectin-TfAb ELISAs. In idiopathic normal pressure hydrocephalus (iNPH), a senile dementia associated with ventriculomegaly, amounts of the SSA-reactive Tf were significantly higher than in non-iNPH patients, indicating that Tf glycan analysis by the high-throughput lectin-TfAb ELISAs could become a practical diagnostic tool for iNPH. The lectin-antibody ELISAs of CSF proteins might be useful for diagnosis of other neurological diseases. PMID:21876827
Future technologies for monitoring HIV drug resistance and cure.
Parikh, Urvi M; McCormick, Kevin; van Zyl, Gert; Mellors, John W
2017-03-01
Sensitive, scalable and affordable assays are critically needed for monitoring the success of interventions for preventing, treating and attempting to cure HIV infection. This review evaluates current and emerging technologies that are applicable for both surveillance of HIV drug resistance (HIVDR) and characterization of HIV reservoirs that persist despite antiretroviral therapy and are obstacles to curing HIV infection. Next-generation sequencing (NGS) has the potential to be adapted into high-throughput, cost-efficient approaches for HIVDR surveillance and monitoring during continued scale-up of antiretroviral therapy and rollout of preexposure prophylaxis. Similarly, improvements in PCR and NGS are resulting in higher throughput single genome sequencing to detect intact proviruses and to characterize HIV integration sites and clonal expansions of infected cells. Current population genotyping methods for resistance monitoring are high cost and low throughput. NGS, combined with simpler sample collection and storage matrices (e.g. dried blood spots), has considerable potential to broaden global surveillance and patient monitoring for HIVDR. Recent adaptations of NGS to identify integration sites of HIV in the human genome and to characterize the integrated HIV proviruses are likely to facilitate investigations of the impact of experimental 'curative' interventions on HIV reservoirs.
In Vivo Small Animal Imaging using Micro-CT and Digital Subtraction Angiography
Badea, C.T.; Drangova, M.; Holdsworth, D.W.; Johnson, G.A.
2009-01-01
Small animal imaging has a critical role in phenotyping, drug discovery, and in providing a basic understanding of mechanisms of disease. Translating imaging methods from humans to small animals is not an easy task. The purpose of this work is to review in vivo X-ray-based small animal imaging, with a focus on in vivo micro-computed tomography (micro-CT) and digital subtraction angiography (DSA). We present the principles, technologies, image quality parameters and types of applications. We show that both methods can be used not only to provide morphological, but also functional information, such as cardiac function estimation or perfusion. Compared to other modalities, X-ray-based imaging is usually regarded as being able to provide higher throughput at lower cost and adequate resolution. The limitations are usually associated with the relatively poor contrast mechanisms and potential radiation damage due to ionizing radiation, although the use of contrast agents and careful design of studies can address these limitations. We hope that the information will effectively address how X-ray-based imaging can be exploited for successful in vivo preclinical imaging. PMID:18758005
Removal of central obscuration and spiders for coronagraphy
NASA Astrophysics Data System (ADS)
Abe, L.; Nishikawa, J.; Murakami, N.; Tamura, M.
2006-06-01
We present a method to remove the central obscuration and spiders, or any kind of geometry inside a telescope pupil. The technique relies on the combination of a first focal plane diffracting mask, and a complex amplitude pupil mask. In this combination, the central obscuration and eventual spider arm patterns in the re-imaged pupil (after the diffracting mask) are filled with coherent light. Adding an appropriate complex amplitude pupil mask allows virtually any kind of pupil shaping (in both amplitude and/or phase). We show that the obtained output pupil can feed a high-efficiency coronagraph (of any kind) with a very reasonable overall throughput and good performance even when considering pointing errors. In this paper, we specifically assess the performance of this technique when using apodized entrance pupils. This technique is relevant for ground-based telescopes anticipating the advent of higher-order (so-called ExAO) adaptive optics systems providing very high Strehl ratios. Some feasibility points are also discussed.
Tools Fit for Chemical Risk Prioritization (EC JRC presentation)
We would like to know more about the risk posed by thousands of chemicals in the environment – which are most worthy of further study? High throughput screening (HTS) provides a path forward for identifying potential hazard. Exposure and dosimetry provide real world context to ha...
HiTC: exploration of high-throughput ‘C’ experiments
Servant, Nicolas; Lajoie, Bryan R.; Nora, Elphège P.; Giorgetti, Luca; Chen, Chong-Jian; Heard, Edith; Dekker, Job; Barillot, Emmanuel
2012-01-01
Summary: The R/Bioconductor package HiTC facilitates the exploration of high-throughput 3C-based data. It allows users to import and export ‘C’ data, to transform, normalize, annotate and visualize interaction maps. The package operates within the Bioconductor framework and thus offers new opportunities for future development in this field. Availability and implementation: The R package HiTC is available from the Bioconductor website. A detailed vignette provides additional documentation and help for using the package. Contact: nicolas.servant@curie.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22923296
Canela, Andrés; Vera, Elsa; Klatt, Peter; Blasco, María A
2007-03-27
A major limitation of studies of the relevance of telomere length to cancer and age-related diseases in human populations and to the development of telomere-based therapies has been the lack of suitable high-throughput (HT) assays to measure telomere length. We have developed an automated HT quantitative telomere FISH platform, HT quantitative FISH (Q-FISH), which allows the quantification of telomere length as well as percentage of short telomeres in large human sample sets. We show here that this technique provides the accuracy and sensitivity to uncover associations between telomere length and human disease.
Forecasting Ecological Genomics: High-Tech Animal Instrumentation Meets High-Throughput Sequencing
Shafer, Aaron B. A.; Northrup, Joseph M.; Wikelski, Martin; Wittemyer, George; Wolf, Jochen B. W.
2016-01-01
Recent advancements in animal tracking technology and high-throughput sequencing are rapidly changing the questions and scope of research in the biological sciences. The integration of genomic data with high-tech animal instrumentation comes as a natural progression of traditional work in ecological genetics, and we provide a framework for linking the separate data streams from these technologies. Such a merger will elucidate the genetic basis of adaptive behaviors like migration and hibernation and advance our understanding of fundamental ecological and evolutionary processes such as pathogen transmission, population responses to environmental change, and communication in natural populations. PMID:26745372
Razali, Haslina; O'Connor, Emily; Drews, Anna; Burke, Terry; Westerdahl, Helena
2017-07-28
High-throughput sequencing enables high-resolution genotyping of extremely duplicated genes. 454 amplicon sequencing (454) has become the standard technique for genotyping the major histocompatibility complex (MHC) genes in non-model organisms. However, Illumina MiSeq amplicon sequencing (MiSeq), which offers a much higher read depth, is now superseding 454. The aim of this study was to quantitatively and qualitatively evaluate the performance of MiSeq in relation to 454 for genotyping MHC class I alleles using a house sparrow (Passer domesticus) dataset with pedigree information. House sparrows provide a good study system for this comparison as their MHC class I genes have been studied previously and, consequently, we had prior expectations concerning the number of alleles per individual. We found that 454 and MiSeq performed equally well in genotyping amplicons with low diversity, i.e. amplicons from individuals that had fewer than 6 alleles. Although there was a higher rate of failure in the 454 dataset in resolving amplicons with higher diversity (6-9 alleles), the same genotypes were identified by both 454 and MiSeq in 98% of cases. We conclude that low diversity amplicons are equally well genotyped using either 454 or MiSeq, but the higher coverage afforded by MiSeq can lead to this approach outperforming 454 in amplicons with higher diversity.
He, Ji; Dai, Xinbin; Zhao, Xuechun
2007-01-01
Background BLAST searches are widely used for sequence alignment. The search results are commonly adopted for various functional and comparative genomics tasks such as annotating unknown sequences, investigating gene models and comparing two sequence sets. Advances in sequencing technologies pose challenges for high-throughput analysis of large-scale sequence data. A number of programs and hardware solutions exist for efficient BLAST searching, but there is a lack of generic software solutions for mining and personalized management of the results. Systematically reviewing the results and identifying information of interest remains tedious and time-consuming. Results Personal BLAST Navigator (PLAN) is a versatile web platform that helps users to carry out various personalized pre- and post-BLAST tasks, including: (1) query and target sequence database management, (2) automated high-throughput BLAST searching, (3) indexing and searching of results, (4) filtering results online, (5) managing results of personal interest in favorite categories, (6) automated sequence annotation (such as NCBI NR and ontology-based annotation). PLAN integrates, by default, the Decypher hardware-based BLAST solution provided by Active Motif Inc. with a greatly improved efficiency over conventional BLAST software. BLAST results are visualized by spreadsheets and graphs and are full-text searchable. BLAST results and sequence annotations can be exported, in part or in full, in various formats including Microsoft Excel and FASTA. Sequences and BLAST results are organized in projects, the data publication levels of which are controlled by the registered project owners. In addition, all analytical functions are provided to public users without registration. Conclusion PLAN has proved a valuable addition to the community for automated high-throughput BLAST searches, and, more importantly, for knowledge discovery, management and sharing based on sequence alignment results. The PLAN web interface is platform-independent, easily configurable and capable of comprehensive expansion, and user-intuitive. PLAN is freely available to academic users at http://bioinfo.noble.org/plan/. The source code for local deployment is provided under free license. Full support on system utilization, installation, configuration and customization is provided to academic users. PMID:17291345
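As an illustration of the kind of post-BLAST filtering PLAN automates (item 4 in the list above), the minimal Python sketch below filters standard tabular BLAST output (-outfmt 6) by e-value and percent identity. The column layout is the standard BLAST+ tabular format; the thresholds and file name are illustrative assumptions, and this is not PLAN's own code.

```python
import csv

# Standard BLAST+ -outfmt 6 column order (assumed here; not PLAN-specific).
FIELDS = ["qseqid", "sseqid", "pident", "length", "mismatch", "gapopen",
          "qstart", "qend", "sstart", "send", "evalue", "bitscore"]

def filter_hits(path, max_evalue=1e-5, min_identity=90.0):
    """Yield BLAST hits passing simple e-value / identity thresholds."""
    with open(path, newline="") as handle:
        for row in csv.reader(handle, delimiter="\t"):
            hit = dict(zip(FIELDS, row))
            if float(hit["evalue"]) <= max_evalue and float(hit["pident"]) >= min_identity:
                yield hit

if __name__ == "__main__":
    # 'blast_results.tsv' is a hypothetical output file from a -outfmt 6 BLAST run.
    for hit in filter_hits("blast_results.tsv"):
        print(hit["qseqid"], hit["sseqid"], hit["pident"], hit["evalue"])
```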
Single-cell genomics for the masses
Tringe, Susannah G.
2017-07-12
In this issue of Nature Biotechnology, Lan et al. describe a new tool in the toolkit for studying uncultivated microbial communities, enabling orders of magnitude higher single cell genome throughput than previous methods. This is achieved by a complex droplet microfluidics workflow encompassing steps from physical cell isolation through genome sequencing, producing tens of thousands of low-coverage genomes from individual cells.
Work Integrated Learning for Engineering Qualifications: A Spanner in the Works?
ERIC Educational Resources Information Center
Mutereko, Sybert; Wedekind, Volker
2016-01-01
Work-integrated learning (WIL) has been identified as a way of equipping graduates with attributes that make them work-ready. Many higher education institutions (HEIs) require their students to complete a compulsory workplace learning (WPL) form of WIL. The complications of WPL can affect HEIs' student throughput. To address this challenge, a…
ERIC Educational Resources Information Center
Ellery, K.
2011-01-01
Gross participation and throughput rates in higher education institutions in South Africa indicate an inequitable and poorly functioning system. This interpretive study argues for an approach that enhances epistemological and ontological access and examines how an intervention that includes an overt approach in dealing with the nature of science,…
Nanoparticles (NPs) may translocate to the brain following inhalation or oral exposures, yet higher throughput methods to screen NPs for potential neurotoxicity are lacking. The present study examined effects of 5 CeO2 (5-1288 nm) and 4 TiO2 (6-142 nm) NPs and microparticles (M...
Wilkinson, Samuel L.; John, Shibu; Walsh, Roddy; Novotny, Tomas; Valaskova, Iveta; Gupta, Manu; Game, Laurence; Barton, Paul J R.; Cook, Stuart A.; Ware, James S.
2013-01-01
Background Molecular genetic testing is recommended for diagnosis of inherited cardiac disease, to guide prognosis and treatment, but access is often limited by cost and availability. Recently introduced high-throughput bench-top DNA sequencing platforms have the potential to overcome these limitations. Methodology/Principal Findings We evaluated two next-generation sequencing (NGS) platforms for molecular diagnostics. The protein-coding regions of six genes associated with inherited arrhythmia syndromes were amplified from 15 human samples using parallelised multiplex PCR (Access Array, Fluidigm), and sequenced on the MiSeq (Illumina) and Ion Torrent PGM (Life Technologies). Overall, 97.9% of the target was sequenced adequately for variant calling on the MiSeq, and 96.8% on the Ion Torrent PGM. Regions missed tended to be of high GC-content, and most were problematic for both platforms. Variant calling was assessed using 107 variants detected using Sanger sequencing: within adequately sequenced regions, variant calling on both platforms was highly accurate (Sensitivity: MiSeq 100%, PGM 99.1%. Positive predictive value: MiSeq 95.9%, PGM 95.5%). At the time of the study the Ion Torrent PGM had a lower capital cost and individual runs were cheaper and faster. The MiSeq had a higher capacity (requiring fewer runs), with reduced hands-on time and simpler laboratory workflows. Both provide significant cost and time savings over conventional methods, even allowing for adjunct Sanger sequencing to validate findings and sequence exons missed by NGS. Conclusions/Significance MiSeq and Ion Torrent PGM both provide accurate variant detection as part of a PCR-based molecular diagnostic workflow, and provide alternative platforms for molecular diagnosis of inherited cardiac conditions. Though there were performance differences at this throughput, platforms differed primarily in terms of cost, scalability, protocol stability and ease of use. Compared with current molecular genetic diagnostic tests for inherited cardiac arrhythmias, these NGS approaches are faster, less expensive, and yet more comprehensive. PMID:23861798
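For reference, the sensitivity and positive predictive value quoted above follow the usual definitions against the Sanger-confirmed variant calls (generic definitions, with TP, FP and FN denoting true-positive, false-positive and false-negative calls):

```latex
\mathrm{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\mathrm{PPV} = \frac{TP}{TP + FP}.
```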
Direct write electron beam lithography: a historical overview
NASA Astrophysics Data System (ADS)
Pfeiffer, Hans C.
2010-09-01
Maskless pattern generation capability in combination with practically limitless resolution made probe-forming electron beam systems attractive tools in the semiconductor fabrication process. However, serial exposure of pattern elements with a scanning beam is a slow process, and throughput presented a key challenge in electron beam lithography from the beginning. To meet this challenge, imaging concepts with increasing exposure efficiency have been developed, projecting ever larger numbers of pixels in parallel. This evolution started in the 1960s with the SEM-type Gaussian beam systems writing one pixel at a time directly on wafers. During the 1970s IBM pioneered the concept of shaped beams containing multiple pixels, which led to higher throughput and an early success of e-beam direct write (EBDW) in large scale manufacturing of semiconductor chips. EBDW in a mix-and-match approach with optical lithography provided unique flexibility in part number management and cycle time reduction and proved extremely cost effective in IBM's Quick-Turn-Around-Time (QTAT) facilities. But shaped beams did not keep pace with Moore's law because of limitations imposed by the physics of charged particles: Coulomb interactions between beam electrons cause image blur and consequently limit beam current and throughput. A new technology approach was needed. Physically separating beam electrons into multiple beamlets to reduce Coulomb interaction led to the development of massively parallel projection of pixels. Electron projection lithography (EPL) - a mask based imaging technique emulating optical steppers - was pursued during the 1990s by Bell Labs with SCALPEL and by IBM with PREVAIL in partnership with Nikon. In 2003 Nikon shipped the first NCR-EB1A e-beam stepper based on the PREVAIL technology to Selete. It exposed pattern segments containing 10 million pixels in a single shot and represented the first successful demonstration of massively parallel pixel projection. However, the window of opportunity for EPL had closed with the quick implementation of immersion lithography, and the interest of the industry has since shifted back to maskless lithography (ML2). This historical overview of EBDW will highlight opportunities and limitations of the technology with particular focus on technical challenges facing the current ML2 development efforts in Europe and the US. A brief status report and risk assessment of the ML2 approaches will be provided.
Nanoimprint system development and status for high volume semiconductor manufacturing
NASA Astrophysics Data System (ADS)
Hiura, Hiromi; Takabayashi, Yukio; Takashima, Tsuneo; Emoto, Keiji; Choi, Jin; Schumaker, Phil
2016-10-01
Imprint lithography has been shown to be an effective technique for replication of nano-scale features. Jet and Flash Imprint Lithography* (J-FIL*) involves the field-by-field deposition and exposure of a low viscosity resist deposited by jetting technology onto the substrate. The patterned mask is lowered into the fluid, which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are many criteria that determine whether a particular technology is ready for wafer manufacturing. For imprint lithography, recent attention has been given to the areas of overlay, throughput, defectivity, and mask replication. This paper reviews progress in these critical areas. Recent demonstrations have proven that mix-and-match overlay of less than 5 nm can be achieved. Further reductions require a higher order correction system. Modeling and experimental data are presented which provide a path towards reducing the overlay errors to less than 3 nm. Throughput is mainly impacted by the fill time of the relief images on the mask. Improvement in resist materials provides a solution that allows 15 wafers per hour per station, or a tool throughput of 60 wafers per hour. Defectivity and mask life play a significant role relative to meeting the cost of ownership (CoO) requirements in the production of semiconductor devices. Hard particles on a wafer or mask create the possibility of inducing a permanent defect on the mask that can impact device yield and mask life. By using material methods to reduce particle shedding and by introducing an air curtain system, the lifetime of both the master mask and the replica mask can be extended. In this work, we report results that demonstrate a path towards achieving mask lifetimes of better than 1000 wafers. Finally, on the mask side, a new replication tool, the FPA-1100NR2, is introduced. Mask replication is required for nanoimprint lithography (NIL), and criteria that are crucial to the success of a replication platform include both particle control and IP accuracy. In particular, by improving the specifications on the mask chuck, residual errors of only 1 nm can be realized.
Lossless compression algorithm for REBL direct-write e-beam lithography system
NASA Astrophysics Data System (ADS)
Cramer, George; Liu, Hsin-I.; Zakhor, Avideh
2010-03-01
Future lithography systems must produce microchips with smaller feature sizes, while maintaining throughputs comparable to those of today's optical lithography systems. This places stringent constraints on the effective data throughput of any maskless lithography system. In recent years, we have developed a datapath architecture for direct-write lithography systems, and have shown that compression plays a key role in reducing throughput requirements of such systems. Our approach integrates a low complexity hardware-based decoder with the writers, in order to decompress a compressed data layer in real time on the fly. In doing so, we have developed a spectrum of lossless compression algorithms for integrated circuit layout data to provide a tradeoff between compression efficiency and hardware complexity, the latest of which is Block Golomb Context Copy Coding (Block GC3). In this paper, we present a modified version of Block GC3 called Block RGC3, specifically tailored to the REBL direct-write E-beam lithography system. Two characteristic features of the REBL system are a rotary stage resulting in arbitrarily-rotated layout imagery, and E-beam corrections prior to writing the data, both of which present significant challenges to lossless compression algorithms. Together, these effects reduce the effectiveness of both the copy and predict compression methods within Block GC3. Similar to Block GC3, our newly proposed technique Block RGC3, divides the image into a grid of two-dimensional "blocks" of pixels, each of which copies from a specified location in a history buffer of recently-decoded pixels. However, in Block RGC3 the number of possible copy locations is significantly increased, so as to allow repetition to be discovered along any angle of orientation, rather than horizontal or vertical. Also, by copying smaller groups of pixels at a time, repetition in layout patterns is easier to find and take advantage of. As a side effect, this increases the total number of copy locations to transmit; this is combated with an extra region-growing step, which enforces spatial coherence among neighboring copy locations, thereby improving compression efficiency. We characterize the performance of Block RGC3 in terms of compression efficiency and encoding complexity on a number of rotated Metal 1, Poly, and Via layouts at various angles, and show that Block RGC3 provides higher compression efficiency than existing lossless compression algorithms, including JPEG-LS, ZIP, BZIP2, and Block GC3.
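To make the copy-based prediction step concrete, the following Python sketch shows a toy version of the block-copy search at the heart of Block GC3/RGC3-style coders: for each small pixel block it finds the offset into already-decoded pixels that best reproduces the block. The block size, search window, causality rule and scoring here are illustrative assumptions; the actual Block RGC3 additionally supports copies along arbitrary orientations, region growing over copy locations, and entropy coding of the residuals.

```python
import numpy as np

def best_copy_offset(image, y, x, block=4, search=8):
    """Toy block-copy search: return the (dy, dx) offset into already-decoded
    pixels that best predicts the block at (y, x), plus the mismatch count."""
    target = image[y:y + block, x:x + block]
    best_offset, best_err = None, block * block + 1
    for dy in range(0, search + 1):
        for dx in range(-search, search + 1):
            sy, sx = y - dy, x - dx
            if sy < 0 or sx < 0 or sx + block > image.shape[1]:
                continue
            # Candidate must lie entirely in the causal (already decoded) region:
            # fully above the current block row, or fully to its left.
            if not (sy + block <= y or sx + block <= x):
                continue
            cand = image[sy:sy + block, sx:sx + block]
            err = int(np.count_nonzero(cand != target))
            if err < best_err:
                best_offset, best_err = (dy, dx), err
    return best_offset, best_err

if __name__ == "__main__":
    # Synthetic periodic "layout" pattern, 64 x 64 binary pixels.
    img = ((np.indices((64, 64)).sum(axis=0) // 8) % 2).astype(np.uint8)
    print(best_copy_offset(img, 32, 32))
```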
Optical detection of metastatic cancer cells using a scanned laser pico-projection system
NASA Astrophysics Data System (ADS)
Huang, Chih-Ling; Chiu, Wen-Tai; Lo, Yu-Lung; Chuang, Chin-Ho; Chen, Yu-Bin; Chang, Shu-Jing; Ke, Tung-Ting; Cheng, Hung-Chi; Wu, Hua-Lin
2015-03-01
Metastasis is responsible for 90% of all cancer-related deaths in humans. As a result, reliable techniques for detecting metastatic cells are urgently required. Although various techniques have been proposed for metastasis detection, they are generally capable of detecting metastatic cells only once migration has already occurred. Accordingly, the present study proposes an optical method for physical characterization of metastatic cancer cells using a scanned laser pico-projection system (SLPP). The validity of the proposed method is demonstrated using five pairs of cancer cell lines and two pairs of non-cancer cell lines treated by IPTG induction in order to mimic normal cells with oncogene overexpression. The results show that for all of the considered cell lines, the SLPP speckle contrast of the high-metastatic cells is significantly higher than that of the low-metastatic cells. As a result, the speckle contrast measurement provides a reliable means of distinguishing quantitatively between low- and high-metastatic cells of the same origin. Compared to existing metastasis detection methods, the proposed SLPP approach has many advantages, including a higher throughput, a lower cost, a larger sample size and a more reliable diagnostic performance. As a result, it provides a highly promising solution for physical characterization of metastatic cancer cells in vitro.
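The discriminating metric here, speckle contrast, is conventionally defined as the ratio of the standard deviation to the mean of the detected speckle intensity (a standard definition, not specific to the SLPP instrument):

```latex
C = \frac{\sigma_I}{\langle I \rangle}.
```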
Next generation PET data acquisition architectures
NASA Astrophysics Data System (ADS)
Jones, W. F.; Reed, J. H.; Everman, J. L.; Young, J. W.; Seese, R. D.
1997-06-01
New architectures for higher performance data acquisition in PET are proposed. Improvements are demanded primarily by three areas of advancing PET state of the art. First, larger detector arrays such as the Hammersmith ECAT® EXACT HR++ exceed the addressing capacity of 32-bit coincidence event words. Second, better scintillators (LSO) make depth-of-interaction (DOI) and time-of-flight (TOF) operation more practical. Third, fully optimized single photon attenuation correction requires higher rates of data collection. New technologies which enable the proposed third generation Real Time Sorter (RTS III) include: (1) 80 Mbyte/sec Fibre Channel RAID disk systems, (2) PowerPC on both VMEbus and PCI Local bus, and (3) quadruple interleaved DRAM controller designs. Data acquisition flexibility is enhanced through a wider 64-bit coincidence event word. PET methodology support includes DOI (6 bits), TOF (6 bits), multiple energy windows (6 bits), 512×512 sinogram indexes (18 bits), and 256 crystal rings (16 bits). Throughput of 10 M events/sec is expected for list-mode data collection as well as both on-line and replay histogramming. Fully efficient list-mode storage for each PET application is provided by real-time bit packing of only the active event word bits. Real-time circuits provide DOI rebinning.
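As an illustration of how a 64-bit coincidence event word can carry the fields listed above, the Python sketch below packs and unpacks one possible layout (DOI 6 bits, TOF 6 bits, energy windows 6 bits, two 9-bit sinogram indexes, two 8-bit crystal-ring indexes, leaving 12 spare bits). The abstract does not give the actual bit assignment of the RTS III word, so the field order, names and helper functions are assumptions for illustration only.

```python
# One possible field layout for a 64-bit coincidence event word.
# Field widths follow the abstract; ordering and positions are assumptions.
FIELDS = [
    ("doi",      6),  # depth-of-interaction bin
    ("tof",      6),  # time-of-flight bin
    ("energy",   6),  # energy-window flags
    ("sino_r",   9),  # sinogram radial index (512 -> 9 bits)
    ("sino_phi", 9),  # sinogram angular index (512 -> 9 bits)
    ("ring_a",   8),  # crystal ring of first photon (256 rings)
    ("ring_b",   8),  # crystal ring of second photon
]

def pack_event(**values):
    """Pack named fields into a single 64-bit integer (LSB-first layout)."""
    word, shift = 0, 0
    for name, width in FIELDS:
        v = values.get(name, 0)
        if not 0 <= v < (1 << width):
            raise ValueError(f"{name}={v} does not fit in {width} bits")
        word |= v << shift
        shift += width
    return word

def unpack_event(word):
    """Inverse of pack_event."""
    out, shift = {}, 0
    for name, width in FIELDS:
        out[name] = (word >> shift) & ((1 << width) - 1)
        shift += width
    return out

if __name__ == "__main__":
    w = pack_event(doi=3, tof=12, energy=1, sino_r=255, sino_phi=300,
                   ring_a=17, ring_b=200)
    print(hex(w), unpack_event(w))
```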
A new approach to pattern metrology
NASA Astrophysics Data System (ADS)
Ausschnitt, Christopher P.
2004-05-01
We describe an approach to pattern metrology that enables the simultaneous determination of critical dimensions, overlay and film thickness. A single optical system captures nonzero- and zero-order diffracted signals from illuminated grating targets, as well as unpatterned regions of the surrounding substrate. Differential targets provide in situ dimensional calibration. CD target signals are analyzed to determine average dimension, profile attributes, and effective dose and defocus. In turn, effective dose and defocus determine all CDs pre-correlated to the dose and focus settings of the exposure tool. Overlay target signals are analyzed to determine the relative reflectivity of the layer pair and the overlay error between them. Compared to commercially available pattern metrology (SEM, optical microscopy, AFM, scatterometry and schnitzlometry), our approach promises improved signal-to-noise, higher throughput and smaller targets. We have dubbed this optical chimera MOXIE (Metrology Of eXtremely Irrational Exuberance).
NASA Astrophysics Data System (ADS)
Nicolardi, Simone; Giera, Martin; Kooijman, Pieter; Kraj, Agnieszka; Chervet, Jean-Pierre; Deelder, André M.; van der Burgt, Yuri E. M.
2013-12-01
Particularly in the field of middle- and top-down peptide and protein analysis, disulfide bridges can severely hinder fragmentation and thus impede sequence analysis (coverage). Here we present an on-line/electrochemistry/ESI-FTICR-MS approach, which was applied to the analysis of the primary structure of oxytocin, containing one disulfide bridge, and of hepcidin, containing four disulfide bridges. The presented workflow provided up to 80 % (on-line) conversion of disulfide bonds in both peptides. With minimal sample preparation, such reduction resulted in a higher number of peptide backbone cleavages upon CID or ETD fragmentation, and thus yielded improved sequence coverage. The cycle times, including electrode recovery, were rapid and, therefore, might very well be coupled with liquid chromatography for protein or peptide separation, which has great potential for high-throughput analysis.
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.
2002-02-01
Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8×6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coatings performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.
Zeming, Kerwin Kwek; Salafi, Thoriq; Chen, Chia-Hung; Zhang, Yong
2016-01-01
The deterministic lateral displacement (DLD) method has been used extensively for particle separation in microfluidic devices in recent years owing to its high resolution and robust separation. DLD has shown versatility across a wide spectrum of applications, sorting microparticles ranging from parasites and blood cells to bacteria and DNA. However, the DLD model is designed for spherical particles, so efficient separation of blood cells is challenging due to their non-uniform shape and size. Moreover, separation in the sub-micron regime requires the gap size of DLD systems to be reduced, which exponentially increases the device resistance and greatly reduces throughput. This paper shows how a simple application of asymmetrical DLD gap sizes, achieved by changing the ratio of the lateral gap (GL) to the downstream gap (GD), enables efficient separation of RBCs without greatly restricting throughput. This method reduces the need for challenging fabrication of DLD pillars and provides new insight into the current DLD model. The separation shows an increase in DLD critical-diameter resolution (separating smaller particles) and increased selectivity for non-spherical RBCs. The RBCs separate better than with the standard DLD model with symmetrical gap sizes. This method can be applied to separate non-spherical bacteria or sub-micron particles to enhance throughput and DLD resolution. PMID:26961061
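For context, the critical diameter referred to above is usually estimated, for conventional symmetric DLD arrays, from the widely used empirical relation of Davis; the asymmetric gaps studied here deliberately depart from this baseline, so the formula is only the standard reference point:

```latex
D_c \approx 1.4\, G\, \varepsilon^{0.48}, \qquad \varepsilon = \frac{1}{N},
```

where G is the pillar gap and ε the row-shift fraction of an array with periodicity N.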
Re-engineering adenovirus vector systems to enable high-throughput analyses of gene function.
Stanton, Richard J; McSharry, Brian P; Armstrong, Melanie; Tomasec, Peter; Wilkinson, Gavin W G
2008-12-01
With the enhanced capacity of bioinformatics to interrogate extensive banks of sequence data, more efficient technologies are needed to test gene function predictions. Replication-deficient recombinant adenovirus (Ad) vectors are widely used in expression analysis since they provide for extremely efficient expression of transgenes in a wide range of cell types. To facilitate rapid, high-throughput generation of recombinant viruses, we have re-engineered an adenovirus vector (designated AdZ) to allow single-step, directional gene insertion using recombineering technology. Recombineering allows for direct insertion into the Ad vector of PCR products, synthesized sequences, or oligonucleotides encoding shRNAs without requirement for a transfer vector. Vectors were optimized for high-throughput applications by making them "self-excising" through incorporating the I-SceI homing endonuclease into the vector, removing the need to linearize vectors prior to transfection into packaging cells. AdZ vectors allow genes to be expressed in their native form or with strep, V5, or GFP tags. Insertion of tetracycline operators downstream of the human cytomegalovirus major immediate early (HCMV MIE) promoter permits silencing of transgenes in helper cells expressing the tet repressor, thus making the vector compatible with the cloning of toxic gene products. The AdZ vector system is robust, straightforward, and suited to both sporadic and high-throughput applications.
Verdirame, Maria; Veneziano, Maria; Alfieri, Anna; Di Marco, Annalise; Monteagudo, Edith; Bonelli, Fabio
2010-03-11
Turbulent Flow Chromatography (TFC) is a powerful approach for on-line extraction in bioanalytical studies. It improves sensitivity and reduces sample preparation time, two factors that are of primary importance in drug discovery. In this paper the application of the ARIA system to the analytical support of in vivo pharmacokinetics (PK) and in vitro drug metabolism studies is described, with an emphasis on high-throughput optimization. For PK studies, a comparison between acetonitrile plasma protein precipitation (APPP) and TFC was carried out. Our optimized TFC methodology gave better S/N ratios and a lower limit of quantification (LOQ) than conventional procedures. A robust and high throughput analytical method to support hepatocyte metabolic stability screening of new chemical entities was developed by hyphenation of TFC with mass spectrometry. An in-loop dilution injection procedure was implemented to overcome one of the main issues when using TFC, namely the early elution of hydrophilic compounds, which results in low recoveries. A comparison between off-line solid phase extraction (SPE) and TFC was also carried out, and recovery, sensitivity (LOQ), matrix effect and robustness were evaluated. The use of two parallel columns in the configuration of the system provided a further increase of the throughput. Copyright 2009 Elsevier B.V. All rights reserved.
Application of High-Throughput In Vitro Assays for Risk-Based ...
Multiple drivers shape the types of human-health assessments performed on chemicals by U.S. EPA, resulting in chemical assessments that are “fit-for-purpose”, ranging from prioritization for further testing to full risk assessments. Layered on top of the diverse assessment needs are the resource-intensive nature of traditional toxicological studies used to test chemicals and the lack of toxicity information on many chemicals. To address these challenges, the Agency initiated the ToxCast program to screen thousands of chemicals across hundreds of high-throughput screening assays in concentration-response format. One of the findings of the project has been that the majority of chemicals interact with multiple biological targets within a narrow concentration range and the extent of interactions increases rapidly near the concentration causing cytotoxicity. This means that application of high-throughput in vitro assays to chemical assessments will need to identify both the relative selectivity with which chemicals interact with biological targets and the concentration at which these interactions perturb signaling pathways. The integrated analyses will be used to both define a point-of-departure for comparison with human exposure estimates and identify which chemicals may benefit from further studies in a mode-of-action or adverse outcome pathway framework. The application of new technologies in a risk-based, tiered manner provides flexibility in matching throughput and cos
PCR cycles above routine numbers do not compromise high-throughput DNA barcoding results.
Vierna, J; Doña, J; Vizcaíno, A; Serrano, D; Jovani, R
2017-10-01
High-throughput DNA barcoding has become essential in ecology and evolution, but some technical questions still remain. Increasing the number of PCR cycles above the routine 20-30 cycles is a common practice when working with old-type specimens, which provide only small amounts of DNA, or when facing annealing issues with the primers. However, increasing the number of cycles can raise the number of artificial mutations due to polymerase errors. In this work, we sequenced 20 COI libraries on the Illumina MiSeq platform. Libraries were prepared with 40, 45, 50, 55, and 60 PCR cycles from four individuals belonging to four species of four genera of cephalopods. We found no relationship between the number of PCR cycles and the number of mutations despite using a nonproofreading polymerase. Moreover, even when using a high number of PCR cycles, the resulting number of mutations was low enough not to be an issue in the context of high-throughput DNA barcoding (but may still remain an issue in DNA metabarcoding due to chimera formation). We conclude that the common practice of increasing the number of PCR cycles should not negatively impact the outcome of a high-throughput DNA barcoding study in terms of the occurrence of point mutations.
Evaluation of High-Throughput Chemical Exposure Models ...
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer parent chemical exposures from biomonitoring measurements and forward models to predict multi-pathway exposures from chemical use information and/or residential media concentrations. Here, both forward and reverse modeling methods are used to characterize the relationship between matched near-field environmental (air and dust) and biomarker measurements. Indoor air, house dust, and urine samples from a sample of 120 females (aged 60 to 80 years) were analyzed. In the measured data, 78% of the residential media measurements (across 80 chemicals) and 54% of the urine measurements (across 21 chemicals) were censored, i.e. below the limit of quantification (LOQ). Because of the degree of censoring, we applied a Bayesian approach to impute censored values for 69 chemicals having at least 15% of measurements above LOQ. This resulted in 10 chemicals (5 phthalates, 5 pesticides) with matched air, dust, and urine metabolite measurements. The population medians of indoor air and dust concentrations were compared to population median exposures inferred from urine metabolites concentrations using a high-throughput reverse-dosimetry approach. Median air and dust concentrations were found to be correl
Bharat, Amrita; Blanchard, Jan E.; Brown, Eric D.
2014-01-01
The synthesis of ribosomes is an essential process, which is aided by a variety of transacting factors in bacteria. Among these is a group of GTPases essential for bacterial viability and emerging as promising targets for new antibacterial agents. Herein, we describe a robust high-throughput screening process for inhibitors of one such GTPase, the Escherichia coli EngA protein. The primary screen employed an assay of phosphate production in 384-well density. Reaction conditions were chosen to maximize sensitivity for the discovery of competitive inhibitors while maintaining a strong signal amplitude and low noise. In a pilot screen of 31,800 chemical compounds, 44 active compounds were identified. Further, we describe the elimination of non-specific inhibitors that were detergent-sensitive or reactive as well as those that interfered with the high-throughput phosphate assay. Four inhibitors survived these common counter-screens for non-specificity but these chemicals were also inhibitors of the unrelated enzyme dihydrofolate reductase, suggesting that they too were promiscuously active. The high-throughput screen of the EngA protein described here provides a meticulous pilot study in the search for specific inhibitors of GTPases involved in ribosome biogenesis. PMID:23606650
Tipton, Jeremiah D; Tran, John C; Catherman, Adam D; Ahlf, Dorothy R; Durbin, Kenneth R; Lee, Ji Eun; Kellie, John F; Kelleher, Neil L; Hendrickson, Christopher L; Marshall, Alan G
2012-03-06
Current high-throughput top-down proteomic platforms provide routine identification of proteins less than 25 kDa with 4-D separations. This short communication reports the application of technological developments over the past few years that improve protein identification and characterization for masses greater than 25 kDa. Advances in separation science have allowed increased numbers of proteins to be identified, especially by nanoliquid chromatography (nLC) prior to mass spectrometry (MS) analysis. Further, a goal of high-throughput top-down proteomics is to extend the mass range for routine nLC MS analysis up to 80 kDa because gene sequence analysis predicts that ~70% of the human proteome is transcribed to be less than 80 kDa. Normally, large proteins greater than 50 kDa are identified and characterized by top-down proteomics through fraction collection and direct infusion at relatively low throughput. Further, other MS-based techniques provide top-down protein characterization, however at low resolution for intact mass measurement. Here, we present analysis of standard (up to 78 kDa) and whole cell lysate proteins by Fourier transform ion cyclotron resonance mass spectrometry (nLC electrospray ionization (ESI) FTICR MS). The separation platform reduced the complexity of the protein matrix so that, at 14.5 T, proteins from whole cell lysate up to 72 kDa are baseline mass resolved on a nano-LC chromatographic time scale. Further, the results document routine identification of proteins at improved throughput based on accurate mass measurement (less than 10 ppm mass error) of precursor and fragment ions for proteins up to 50 kDa.
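The "less than 10 ppm mass error" criterion quoted above corresponds to the usual relative mass-accuracy definition:

```latex
\mathrm{mass\ error\ (ppm)} = \frac{m_{\mathrm{measured}} - m_{\mathrm{theoretical}}}{m_{\mathrm{theoretical}}} \times 10^{6}.
```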
Fast and Adaptive Lossless Onboard Hyperspectral Data Compression System
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh I.; Keymeulen, Didier; Kimesh, Matthew A.
2012-01-01
Modern hyperspectral imaging systems are able to acquire far more data than can be downlinked from a spacecraft. Onboard data compression helps to alleviate this problem, but requires a system capable of power efficiency and high throughput. Software solutions have limited throughput performance and are power-hungry. Dedicated hardware solutions can provide both high throughput and power efficiency, while taking the load off of the main processor. Thus a hardware compression system was developed. The implementation uses a field-programmable gate array (FPGA). The implementation is based on the fast lossless (FL) compression algorithm reported in Fast Lossless Compression of Multispectral-Image Data (NPO-42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which achieves excellent compression performance and has low complexity. This algorithm performs predictive compression using an adaptive filtering method, and uses adaptive Golomb coding. The implementation also packetizes the coded data. The FL algorithm is well suited for implementation in hardware. In the FPGA implementation, one sample is compressed every clock cycle, which makes for a fast and practical realtime solution for space applications. Benefits of this implementation are: 1) The underlying algorithm achieves a combination of low complexity and compression effectiveness that exceeds that of techniques currently in use. 2) The algorithm requires no training data or other specific information about the nature of the spectral bands for a fixed instrument dynamic range. 3) Hardware acceleration provides a throughput improvement of 10 to 100 times vs. the software implementation. A prototype of the compressor is available in software, but it runs at a speed that does not meet spacecraft requirements. The hardware implementation targets the Xilinx Virtex IV FPGAs, and makes the use of this compressor practical for Earth satellites as well as beyond-Earth missions with hyperspectral instruments.
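The FL algorithm referenced above combines adaptive linear prediction with adaptive Golomb coding of the prediction residuals. As a concrete, software-only illustration of the entropy-coding half, the Python sketch below Rice-codes mapped residuals with a fixed parameter k; the adaptive parameter selection, the predictor itself, and the packetization of the FPGA implementation are not shown, and the bit-string output is purely for readability.

```python
def zigzag(residual):
    """Map signed residuals to non-negative integers (0, -1, 1, -2, ... -> 0, 1, 2, 3, ...)."""
    return 2 * residual if residual >= 0 else -2 * residual - 1

def rice_encode(values, k):
    """Rice code (Golomb code with m = 2**k) a sequence of non-negative integers.
    Returns the codeword stream as a '0'/'1' string for illustration."""
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits.append("1" * q + "0")            # unary-coded quotient, '0' terminator
        if k:
            bits.append(format(r, f"0{k}b"))  # k-bit binary remainder
    return "".join(bits)

if __name__ == "__main__":
    residuals = [0, -1, 2, 3, -2, 0, 1, -4]
    print(rice_encode([zigzag(r) for r in residuals], k=2))
```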
Parriot, Sandi; Hudson, Thomas H.; Lang, Thierry; Ngundam, Franklyn; Leed, Susan; Sena, Jenell; Harris, Michael; O'Neil, Michael; Sciotti, Richard; Read, Lisa; Lecoeur, Herve; Grogl, Max
2017-01-01
In any drug discovery and development effort, a reduction in the time of the lead optimization cycle is critical to decrease the time to license and reduce costs. In addition, ethical guidelines call for the more ethical use of animals to minimize the number of animals used and decrease their suffering. Therefore, any effort to develop drugs to treat cutaneous leishmaniasis requires multiple tiers of in vivo testing that start with higher-throughput efficacy assessments and progress to lower-throughput models with the most clinical relevance. Here, we describe the validation of a high-throughput, first-tier, noninvasive model of lesion suppression that uses an in vivo optical imaging technology for the initial screening of compounds. A strong correlation between luciferase activity and the parasite load at up to 18 days postinfection was found. This correlation allows the direct assessment of the effects of drug treatment on parasite burden. We demonstrate that there is a strong correlation between drug efficacy measured on day 18 postinfection and the suppression of lesion size by day 60 postinfection, which allows us to reach an accurate conclusion on drug efficacy in only 18 days. Compounds demonstrating a significant reduction in the bioluminescence signal compared to that in control animals can be tested in lower-throughput, more definitive tests of lesion cure in BALB/c mice and Golden Syrian hamsters (GSH) using Old World and New World parasites. PMID:28137819
Marinković, Aleksandar; Mih, Justin D.; Park, Jin-Ah; Liu, Fei
2012-01-01
Lung fibroblast functions such as matrix remodeling and activation of latent transforming growth factor-β1 (TGF-β1) are associated with expression of the myofibroblast phenotype and are directly linked to fibroblast capacity to generate force and deform the extracellular matrix. However, the study of fibroblast force-generating capacities through methods such as traction force microscopy is hindered by low throughput and time-consuming procedures. In this study, we made detailed improvements to methods for higher-throughput traction measurements on polyacrylamide hydrogels, using gel-surface-bound fluorescent beads to permit autofocusing and automated displacement mapping, and transduction of fibroblasts with a fluorescent label to streamline cell boundary identification. Together these advances substantially improve the throughput of traction microscopy and allow us to efficiently compute the forces exerted by lung fibroblasts on substrates spanning the stiffness range present in normal and fibrotic lung tissue. Our results reveal that lung fibroblasts dramatically alter the forces they transmit to the extracellular matrix as its stiffness changes, with very low forces generated on matrices as compliant as normal lung tissue. Moreover, exogenous TGF-β1 selectively accentuates tractions on stiff matrices, mimicking fibrotic lung, but not on physiological stiffness matrices, despite equivalent changes in Smad2/3 activation. Taken together, these results demonstrate a pivotal role for matrix mechanical properties in regulating baseline and TGF-β1-stimulated contraction of lung fibroblasts and suggest that stiff fibrotic lung tissue may promote myofibroblast activation through contractility-driven events, whereas normal lung tissue compliance may protect against such feedback amplification of fibroblast activation. PMID:22659883
Alexander, Crispin G.; Wanner, Randy; Johnson, Christopher M.; Breitsprecher, Dennis; Winter, Gerhard; Duhr, Stefan; Baaske, Philipp; Ferguson, Neil
2014-01-01
Chemical denaturant titrations can be used to accurately determine protein stability. However, data acquisition is typically labour intensive, has low throughput and is difficult to automate. These factors, combined with high protein consumption, have limited the adoption of chemical denaturant titrations in commercial settings. Thermal denaturation assays can be automated, sometimes with very high throughput. However, thermal denaturation assays are incompatible with proteins that aggregate at high temperatures and large extrapolation of stability parameters to physiological temperatures can introduce significant uncertainties. We used capillary-based instruments to measure chemical denaturant titrations by intrinsic fluorescence and microscale thermophoresis. This allowed higher throughput, consumed several hundred-fold less protein than conventional, cuvette-based methods yet maintained the high quality of the conventional approaches. We also established efficient strategies for automated, direct determination of protein stability at a range of temperatures via chemical denaturation, which has utility for characterising stability for proteins that are difficult to purify in high yield. This approach may also have merit for proteins that irreversibly denature or aggregate in classical thermal denaturation assays. We also developed procedures for affinity ranking of protein–ligand interactions from ligand-induced changes in chemical denaturation data, and proved the principle for this by correctly ranking the affinity of previously unreported peptide–PDZ domain interactions. The increased throughput, automation and low protein consumption of protein stability determinations afforded by using capillary-based methods to measure denaturant titrations, can help to revolutionise protein research. We believe that the strategies reported are likely to find wide applications in academia, biotherapeutic formulation and drug discovery programmes. PMID:25262836
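The stability parameters extracted from such titrations typically come from the standard two-state linear extrapolation analysis (a generic model, not specific to the capillary instruments described here):

```latex
\Delta G_{\mathrm{unf}}([D]) = \Delta G_{\mathrm{H_2O}} - m\,[D], \qquad
C_m = \frac{\Delta G_{\mathrm{H_2O}}}{m},
```

where [D] is the denaturant concentration, m the denaturant dependence of the unfolding free energy, and C_m the transition midpoint.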
Incorporating High-Throughput Exposure Predictions with ...
We previously integrated dosimetry and exposure with high-throughput screening (HTS) to enhance the utility of ToxCast™ HTS data by translating in vitro bioactivity concentrations to oral equivalent doses (OEDs) required to achieve these levels internally. These OEDs were compared against regulatory exposure estimates, providing an activity-to-exposure ratio (AER) useful for a risk-based ranking strategy. As ToxCast™ efforts expand (i.e., Phase II) beyond food-use pesticides towards a wider chemical domain that lacks exposure and toxicity information, prediction tools become increasingly important. In this study, in vitro hepatic clearance and plasma protein binding were measured to estimate OEDs for a subset of Phase II chemicals. OEDs were compared against high-throughput (HT) exposure predictions generated using probabilistic modeling and Bayesian approaches generated by the U.S. EPA ExpoCast™ program. This approach incorporated chemical-specific use and national production volume data with biomonitoring data to inform the exposure predictions. This HT exposure modeling approach provided predictions for all Phase II chemicals assessed in this study whereas estimates from regulatory sources were available for only 7% of chemicals. Of the 163 chemicals assessed in this study, three or 13 chemicals possessed AERs <1 or <100, respectively. Diverse bioactivity across a range of assays and concentrations was also noted across the wider chemical space su
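The risk-based ranking metric used above reduces to a simple ratio of the bioactivity-derived dose to the exposure estimate, so that smaller values flag chemicals whose predicted exposures approach bioactive doses:

```latex
\mathrm{AER} = \frac{\mathrm{OED}}{\text{estimated exposure}}.
```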
2010-01-01
Background: The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. Results: In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. Conclusion: High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data. PMID:20122245
QoS-aware integrated fiber-wireless standard compliant architecture based on XGPON and EDCA
NASA Astrophysics Data System (ADS)
Kaur, Ravneet; Srivastava, Anand
2018-01-01
The converged Fiber-Wireless (FiWi) broadband access network is a promising candidate that is reliable, robust, cost-efficient, ubiquitous and capable of providing large amounts of bandwidth. To meet the ever-increasing bandwidth requirements, it has become crucial to investigate the performance issues that arise with the deployment of the next-generation Passive Optical Network (PON) and its integration with various wireless technologies. Apart from providing high-speed internet access for mass use, this combined architecture aims to enable delivery of high-quality and effective e-services in different categories including health, education, finance, banking, agriculture and e-government. In this work, we present an integrated architecture of 10-Gigabit-capable PON (XG-PON) and Enhanced Distributed Channel Access (EDCA) that combines the benefits of both technologies to meet the QoS demands of subscribers. Performance evaluation of the standards-compliant hybrid network is carried out using the discrete-event Network Simulator-3 (NS-3), and results are reported in terms of throughput, average delay, average packet loss rate and fairness index. Per-class throughput signifies the effectiveness of QoS distribution, whereas aggregate throughput indicates effective utilization of the wireless channel. To the best of our knowledge, this work has not been reported previously.
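The abstract does not state which fairness index is reported; assuming Jain's index, the usual choice for per-class throughput fairness, the calculation is as simple as the sketch below (throughput values are hypothetical).

```python
# Sketch: Jain's fairness index over per-class throughputs (an assumption;
# the paper only says "fairness index" without naming it). Values hypothetical.
def jain_fairness(throughputs):
    """Returns a value in (0, 1]; 1 means all classes get equal throughput."""
    n = len(throughputs)
    s = sum(throughputs)
    return (s * s) / (n * sum(x * x for x in throughputs))

# Hypothetical per-traffic-class throughputs in Mb/s (e.g. voice, video, best effort)
print(jain_fairness([12.0, 35.0, 48.0]))   # ~0.82 -> unequal sharing
print(jain_fairness([30.0, 30.0, 30.0]))   # 1.0  -> perfectly fair
```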
Devailly, Guillaume; Mantsoki, Anna; Joshi, Anagha
2016-11-01
Better protocols and decreasing costs have made high-throughput sequencing experiments now accessible even to small experimental laboratories. However, comparing one or few experiments generated by an individual lab to the vast amount of relevant data freely available in the public domain might be limited due to lack of bioinformatics expertise. Though several tools, including genome browsers, allow such comparison at a single gene level, they do not provide a genome-wide view. We developed Heat*seq, a web-tool that allows genome-scale comparison of high-throughput experiments (chromatin immuno-precipitation followed by sequencing, RNA-sequencing and Cap Analysis of Gene Expression) provided by a user to the data in the public domain. Heat*seq currently contains over 12 000 experiments across diverse tissues and cell types in human, mouse and drosophila. Heat*seq displays interactive correlation heatmaps, with an ability to dynamically subset datasets to contextualize user experiments. High quality figures and tables are produced and can be downloaded in multiple formats. Web application: http://www.heatstarseq.roslin.ed.ac.uk/ Source code: https://github.com/gdevailly CONTACT: Guillaume.Devailly@roslin.ed.ac.uk or Anagha.Joshi@roslin.ed.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
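A minimal sketch of the kind of genome-wide comparison Heat*seq performs is given below, assuming the user and public experiments have already been summarised into per-region signal vectors (the web tool does this preprocessing internally); the data here are synthetic, not drawn from the Heat*seq collection.

```python
# Sketch of the genome-scale comparison idea: correlate a user experiment
# against public experiments over a common set of regions, then draw a
# heatmap. Assumes signals are already binned into per-region vectors;
# all data below are synthetic.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n_regions = 5000
public = {f"public_{i}": rng.lognormal(size=n_regions) for i in range(4)}
user = public["public_0"] * rng.lognormal(0, 0.3, n_regions)  # resembles public_0

names = ["user"] + list(public)
signals = np.vstack([user] + list(public.values()))
corr = np.corrcoef(np.log1p(signals))        # Pearson on log-scaled signal

plt.imshow(corr, vmin=0, vmax=1, cmap="viridis")
plt.xticks(range(len(names)), names, rotation=45, ha="right")
plt.yticks(range(len(names)), names)
plt.colorbar(label="correlation")
plt.tight_layout()
plt.show()
```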
Seok, Junhee; Kaushal, Amit; Davis, Ronald W; Xiao, Wenzhong
2010-01-18
The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data.
NASA Astrophysics Data System (ADS)
Simpson, R. A.; Davis, D. E.
1982-09-01
This paper describes techniques to detect submicron pattern defects on optical photomasks with an enhanced direct-write, electron-beam lithographic tool. EL-3 is a third generation, shaped spot, electron-beam lithography tool developed by IBM to fabricate semiconductor devices and masks. This tool is being upgraded to provide 100% inspection of optical photomasks for submicron pattern defects, which are subsequently repaired. Fixed-size overlapped spots are stepped over the mask patterns while a signal derived from the back-scattered electrons is monitored to detect pattern defects. Inspection does not require pattern recognition because the inspection scan patterns are derived from the original design data. The inspection spot is square and larger than the minimum defect to be detected, to improve throughput. A new registration technique provides the beam-to-pattern overlay required to locate submicron defects. The "guard banding" of inspection shapes prevents mask and system tolerances from producing false alarms that would occur should the spots be mispositioned such that they only partially covered a shape being inspected. A rescanning technique eliminates noise-related false alarms and significantly improves throughput. Data are accumulated during inspection and processed offline, as required for defect repair. EL-3 will detect 0.5 µm pattern defects at throughputs compatible with mask manufacturing.
Mac Kinnon, Michael; Heydarzadeh, Zahra; Doan, Quy; Ngo, Cuong; Reed, Jeff; Brouwer, Jacob
2018-05-17
Accurate quantification of methane emissions from the natural gas system is important for establishing greenhouse gas inventories and understanding cause and effect for reducing emissions. Current carbon intensity methods generally assume methane emissions are proportional to gas throughput so that increases in gas consumption yield linear increases in emitted methane. However, emissions sources are diverse and many are not proportional to throughput. Insights into the causal drivers of system methane emissions, and how system-wide changes affect such drivers are required. The development of a novel cause-based methodology to assess marginal methane emissions per unit of fuel consumed is introduced. The carbon intensities of technologies consuming natural gas are critical metrics currently used in policy decisions for reaching environmental goals. For example, the low-carbon fuel standard in California uses carbon intensity to determine incentives provided. Current methods generally assume methane emissions from the natural gas system are completely proportional to throughput. The proposed cause-based marginal emissions method will provide a better understanding of the actual drivers of emissions to support development of more effective mitigation measures. Additionally, increasing the accuracy of carbon intensity calculations supports the development of policies that can maximize the environmental benefits of alternative fuels, including reducing greenhouse gas emissions.
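The distinction between throughput-proportional and cause-based accounting can be illustrated with a toy model that splits system emissions into a fixed component and a throughput-proportional component; the split and all numbers below are hypothetical, not the study's inventory.

```python
# Toy illustration of marginal vs. average methane intensity. The split of
# emissions into fixed and throughput-proportional components, and all
# numbers, are hypothetical.
def system_emissions(throughput):
    fixed = 50.0                        # e.g. leaks that occur regardless of flow
    proportional = 0.02 * throughput    # e.g. losses that scale with gas moved
    return fixed + proportional        # methane emitted (arbitrary mass units)

q, dq = 10_000.0, 1.0                   # current gas throughput, small increment
average_intensity = system_emissions(q) / q
marginal_intensity = (system_emissions(q + dq) - system_emissions(q)) / dq

print(f"average:  {average_intensity:.4f} per unit gas")
print(f"marginal: {marginal_intensity:.4f} per unit gas")
# A purely throughput-proportional model would make these equal; the
# cause-based view shows marginal < average in this toy case.
```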
Kwak, Jihoon; Genovesio, Auguste; Kang, Myungjoo; Hansen, Michael Adsett Edberg; Han, Sung-Jun
2015-01-01
Genotoxicity testing is an important component of toxicity assessment. As illustrated by the European registration, evaluation, authorization, and restriction of chemicals (REACH) directive, it concerns all the chemicals used in industry. The commonly used in vivo mammalian tests appear to be ill adapted to tackle the large compound sets involved, due to throughput, cost, and ethical issues. The somatic mutation and recombination test (SMART) represents a more scalable alternative, since it uses Drosophila, which develops faster and requires less infrastructure. Despite these advantages, the manual scoring of the hairs on Drosophila wings required for the SMART limits its usage. To overcome this limitation, we have developed an automated SMART readout. It consists of automated imaging, followed by an image analysis pipeline that measures individual wing genotoxicity scores. Finally, we have developed a wing score-based dose-dependency approach that can provide genotoxicity profiles. We have validated our method using 6 compounds, obtaining profiles almost identical to those obtained from manual measures, even for low-genotoxicity compounds such as urethane. The automated SMART, with its faster and more reliable readout, fulfills the need for a high-throughput in vivo test. The flexible imaging strategy we describe and the analysis tools we provide should facilitate the optimization and dissemination of our methods. PMID:25830368
Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik
2017-11-03
High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.
Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz
2017-01-01
High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup. PMID:29099791
Mass spectrometry-driven drug discovery for development of herbal medicine.
Zhang, Aihua; Sun, Hui; Wang, Xijun
2018-05-01
Herbal medicine (HM) has made a major contribution to the drug discovery process with regard to identifying product compounds. Currently, more attention has been focused on drug discovery from natural compounds of HM. Despite the rapid advancement of modern analytical techniques, drug discovery is still a difficult and lengthy process. Fortunately, mass spectrometry (MS) can provide us with useful structural information for drug discovery and has been recognized as a sensitive, rapid, and high-throughput technology for advancing drug discovery from HM in the post-genomic era. It is essential to develop an efficient, high-quality, high-throughput screening method integrated with an MS platform for early screening of candidate drug molecules from natural products. We have developed a new chinmedomics strategy reliant on MS that is capable of capturing candidate molecules and facilitating the identification of novel chemical structures in the early phase; chinmedomics-guided natural product discovery based on MS may provide an effective tool that addresses challenges in early screening of effective constituents of herbs against disease. This critical review covers the use of MS with related techniques and methodologies for natural product discovery, biomarker identification, and determination of mechanisms of action. It also highlights high-throughput chinmedomics screening methods suitable for lead compound discovery, illustrated by recent successes. © 2016 Wiley Periodicals, Inc.
Oguntimein, Gbekeloluwa B; Rodriguez, Miguel; Dumitrache, Alexandru; Shollenberger, Todd; Decker, Stephen R; Davison, Brian H; Brown, Steven D
2018-02-01
To develop and prototype a high-throughput microplate assay to assess anaerobic microorganisms and lignocellulosic biomasses in a rapid, cost-effective screen for consolidated bioprocessing potential. The Clostridium thermocellum parent Δhpt strain deconstructed Avicel to cellobiose and glucose, and generated lactic acid, formic acid, acetic acid and ethanol as fermentation products in titers and ratios similar to larger-scale fermentations, confirming the suitability of a plate-based method for C. thermocellum growth studies. C. thermocellum strain LL1210, with gene deletions in the key central metabolic pathways, produced higher ethanol titers in the Consolidated Bioprocessing (CBP) plate assay for both Avicel and switchgrass fermentations when compared to the Δhpt strain. A prototype microplate assay system is developed that will facilitate high-throughput bioprospecting for new lignocellulosic biomass types, genetic variants and new microbial strains for bioethanol production.
Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping
2017-05-01
To develop an efficient, cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased the throughput of the samples compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.
Development of High-Content Orthopoxvirus Infectivity and Neutralization Assays
Gates, Irina; Olson, Victoria; Smith, Scott; Patel, Nishi; Damon, Inger; Karem, Kevin
2015-01-01
Currently, a number of assays measure Orthopoxvirus neutralization with serum from individuals vaccinated against smallpox. In addition to the traditional plaque reduction neutralization test (PRNT), newer higher-throughput assays are based on neutralization of recombinant vaccinia virus expressing reporter genes such as β-galactosidase or green fluorescent protein. These methods could not be used to evaluate neutralization of variola virus, since genetic manipulations of this virus are prohibited by international agreements. Currently, PRNT is the assay of choice to measure neutralization of variola virus. However, PRNT assays are time consuming, labor intensive, and require considerable volumes of serum sample for testing. Here, we describe the development of a high-throughput, cell-based imaging assay that can be used to measure neutralization and characterize replication kinetics of various Orthopoxviruses, including variola, vaccinia, monkeypox, and cowpox. PMID:26426117
Improved Breast Cancer Detection Using a Novel In Situ Method to Visualize Clonality.
1998-07-01
photosensitivity and laser driven systems are encouraging, and suggest the possibility of high-throughput systems. Biolithography may thus provide new opportunities for molecular diagnostics of solid tumors.
Zhang, Xin; Zhang, Xiaomei; Xu, Guoqiang; Zhang, Xiaojuan; Shi, Jinsong; Xu, Zhenghong
2018-05-03
L-Serine is widely used in the pharmaceutical, food, and cosmetics industries. Although direct fermentative production of L-serine from sugar in Corynebacterium glutamicum has been achieved, the L-serine yield remains relatively low. In this study, atmospheric and room temperature plasma (ARTP) mutagenesis was used to improve the L-serine yield based on engineered C. glutamicum ΔSSAAI strain. Subsequently, we developed a novel high-throughput screening method using a biosensor constructed based on NCgl0581, a transcriptional factor specifically responsive to L-serine, so that L-serine concentration within single cell of C. glutamicum can be monitored via fluorescence-activated cell sorting (FACS). Novel L-serine-producing mutants were isolated from a large library of mutagenized cells. The mutant strain A36-pDser was screened from 1.2 × 10⁵ cells, and the magnesium ion concentration in the medium was optimized specifically for this mutant. C. glutamicum A36-pDser accumulated 34.78 g/L L-serine with a yield of 0.35 g/g sucrose, which were 35.9 and 66.7% higher than those of the parent C. glutamicum ΔSSAAI-pDser strain, respectively. The L-serine yield achieved in this mutant was the highest of all reported L-serine-producing strains of C. glutamicum. Moreover, the whole-genome sequencing identified 11 non-synonymous mutations of genes associated with metabolic and transport pathways, which might be responsible for the higher L-serine production and better cell growth in C. glutamicum A36-pDser. This study explored an effective mutagenesis strategy and reported a novel high-throughput screening method for the development of L-serine-producing strains.
Yong, Michelle K; Cameron, Paul U; Spelman, Tim; Elliott, Julian H; Fairley, Christopher K; Boyle, Jeffrey; Miyamasu, Misato; Lewin, Sharon R
2016-01-01
HIV infection is characterised by persistent immune dysfunction of both the adaptive and innate immune responses. The aim of this study was to evaluate these responses using a novel high throughput assay in healthy controls and HIV-infected individuals prior to and following anti-retroviral treatment (ART). Cross-sectional study. Whole blood was assessed using the QuantiFERON Monitor® (QFM) assay containing adaptive and innate immunostimulants. Interferon (IFN)-γ levels (IU/mL) were measured by enzyme-linked immunosorbent assay (ELISA). We recruited HIV-infected participants (n = 20 off ART and viremic; n = 59 on suppressive ART) and HIV-uninfected controls (n = 229). Median IFN-γ production was significantly higher in HIV-infected participants compared to controls (512 vs 223 IU/ml, p<0.0001), but within the HIV-infected participants there was no difference between those on or off ART (median 512 vs 593 IU/ml, p = 0.94). Amongst the HIV-infected participants, IFN-γ production was higher in individuals with CD4 count >350 compared to <350 cells/μL (561 vs 259 IU/ml, p = 0.02) and in males compared to females (542 vs 77 IU/ml, p = 0.04). There were no associations between IFN-γ production and age, plasma HIV RNA, nadir CD4 count or duration of HIV infection. Using a multivariable analysis, neither CD4 nor sex were independently predictive of IFN-γ production. Using a high throughput assay which assesses both adaptive and innate immune function, we showed elevated IFN-γ production in HIV-infected patients both on and off ART. Further research is warranted to determine if changes in QuantiFERON Monitor® are associated with clinical outcomes.
Characterization and screening of IgG binding to the neonatal Fc receptor
Neuber, Tobias; Frese, Katrin; Jaehrling, Jan; Jäger, Sebastian; Daubert, Daniela; Felderer, Karin; Linnemann, Mechthild; Höhne, Anne; Kaden, Stefan; Kölln, Johanna; Tiller, Thomas; Brocks, Bodo; Ostendorp, Ralf; Pabst, Stefan
2014-01-01
The neonatal Fc receptor (FcRn) protects immunoglobulin G (IgG) from degradation and increases the serum half-life of IgG, thereby contributing to a higher concentration of IgG in the serum. Because altered FcRn binding may result in a reduced or prolonged half-life of IgG molecules, it is advisable to characterize Fc receptor binding of therapeutic antibody lead candidates prior to the start of pre-clinical and clinical studies. In this study, we characterized the interactions between FcRn of different species (human, cynomolgus monkey, mouse and rat) and nine IgG molecules from different species and isotypes with common variable heavy (VH) and variable light chain (VL) domains. Binding was analyzed at acidic and neutral pH using surface plasmon resonance (SPR) and biolayer interferometry (BLI). Furthermore, we transferred the well-accepted, but low throughput SPR-based method for FcRn binding characterization to the BLI-based Octet platform to enable a higher sample throughput allowing the characterization of FcRn binding already during early drug discovery phase. We showed that the BLI-based approach is fit-for-purpose and capable of discriminating between IgG molecules with significant differences in FcRn binding affinities. Using this high-throughput approach we investigated FcRn binding of 36 IgG molecules that represented all VH/VL region combinations available in the fully human, recombinant antibody library Ylanthia®. Our results clearly showed normal FcRn binding profiles for all samples. Hence, the variations among the framework parts, complementarity-determining region (CDR) 1 and CDR2 of the fragment antigen binding (Fab) domain did not significantly change FcRn binding. PMID:24802048
Under-sampling in a Multiple-Channel Laser Vibrometry System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corey, Jordan
2007-03-01
Laser vibrometry is a technique used to detect vibrations on objects using the interference of coherent light with itself. Most vibrometry systems process only one target location at a time, but processing multiple locations simultaneously provides improved detection capabilities. Traditional laser vibrometry systems employ oversampling to sample the incoming modulated-light signal; however, as the number of channels increases in these systems, certain issues arise, such as higher computational cost, excessive heat, increased power requirements, and increased component cost. This thesis describes a novel approach to laser vibrometry that utilizes undersampling to control the undesirable issues associated with over-sampled systems. Undersampling allows significantly fewer samples to represent the modulated-light signals, which offers several advantages in the overall system design. These advantages include an improvement in thermal efficiency, lower processing requirements, and a higher immunity to the relative intensity noise inherent in laser vibrometry applications. A unique feature of this implementation is the use of a parallel architecture to increase the overall system throughput. This parallelism is realized using a hierarchical multi-channel architecture based on off-the-shelf programmable logic devices (PLDs).
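A small numerical illustration of the bandpass-undersampling idea (not the thesis hardware design) is given below: a narrowband signal riding on a carrier well above half the sampling rate aliases to a predictable lower frequency, so the vibration-induced modulation survives with far fewer samples. All frequencies are hypothetical.

```python
# Numerical illustration of bandpass undersampling (illustrative only, not
# the thesis design). A narrowband signal on a high carrier is sampled well
# below the carrier's Nyquist rate; it aliases to a predictable lower
# frequency and the modulation is preserved.
import numpy as np

fc = 40e6          # carrier frequency after optical mixing (Hz), hypothetical
fs = 9e6           # undersampling rate: far below 2*fc but > 2*bandwidth
fm = 50e3          # vibration-induced modulation tone (Hz), hypothetical

n = np.arange(4096)
t = n / fs
samples = np.cos(2 * np.pi * (fc + fm) * t)     # sampled directly at fs

# Predicted alias of the carrier: fold fc into [0, fs/2)
k = round(fc / fs)
f_alias = abs(fc - k * fs)                      # 40e6 - 4*9e6 = 4 MHz

spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(n.size, 1 / fs)
peak = freqs[np.argmax(spectrum)]
print(f"predicted alias: {f_alias + fm:.0f} Hz, observed peak: {peak:.0f} Hz")
```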
Rigter, Tessel; Henneman, Lidewij; Kristoffersson, Ulf; Hall, Alison; Yntema, Helger G; Borry, Pascal; Tönnies, Holger; Waisfisz, Quinten; Elting, Mariet W; Dondorp, Wybo J; Cornel, Martina C
2013-01-01
High-throughput nucleotide sequencing (often referred to as next-generation sequencing; NGS) is increasingly being chosen as a diagnostic tool for cases of expected but unresolved genetic origin. When exploring a higher number of genetic variants, there is a higher chance of detecting unsolicited findings. The consequential increased need for decisions on disclosure of these unsolicited findings poses a challenge for the informed consent procedure. This article discusses the ethical and practical dilemmas encountered when contemplating informed consent for NGS in diagnostics from a multidisciplinary point of view. By exploring recent similar experiences with unsolicited findings in other settings, an attempt is made to describe what can be learned so far for implementing NGS in standard genetic diagnostics. The article concludes with a set of points to consider in order to guide decision-making on the extent of return of results in relation to the mode of informed consent. We hereby aim to provide a sound basis for developing guidelines for optimizing the informed consent procedure. PMID:23784691
Dong, Jian-Jun; Li, Qing-Liang; Yin, Hua; Zhong, Cheng; Hao, Jun-Guang; Yang, Pan-Fei; Tian, Yu-Hong; Jia, Shi-Ru
2014-10-15
Sensory evaluation is regarded as a necessary procedure to ensure a reproducible quality of beer. Meanwhile, high-throughput analytical methods provide a powerful tool to analyse various flavour compounds, such as higher alcohols and esters. In this study, the relationship between flavour compounds and sensory evaluation was established by non-linear models such as partial least squares (PLS), genetic algorithm back-propagation neural network (GA-BP), and support vector machine (SVM). It was shown that SVM with a Radial Basis Function (RBF) kernel had better prediction accuracy for both the calibration set (94.3%) and the validation set (96.2%) than the other models. Relatively lower prediction abilities were observed for GA-BP (52.1%) and PLS (31.7%). In addition, the kernel function of SVM played an essential role in model training, as the prediction accuracy of SVM with a polynomial kernel function was only 32.9%. As a powerful multivariate statistics method, SVM holds great potential to assess beer quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
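For readers unfamiliar with the modelling step, the sketch below shows the kind of RBF-kernel SVM regression compared in the study, implemented with scikit-learn on synthetic data; the actual work used measured flavour-compound concentrations and panel scores, and its hyperparameters are not reproduced here.

```python
# Sketch of the modelling approach compared above: an RBF-kernel support
# vector machine mapping flavour-compound levels to a sensory score.
# Data are synthetic; hyperparameters are illustrative, not the paper's.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(120, 6))          # 6 flavour-compound levels
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_tr, y_tr)
print("validation R^2:", round(model.score(X_te, y_te), 3))
```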
Biodegradable microsphere-mediated cell perforation in microfluidic channel using femtosecond laser
NASA Astrophysics Data System (ADS)
Ishii, Atsuhiro; Ariyasu, Kazumasa; Mitsuhashi, Tatsuki; Heinemann, Dag; Heisterkamp, Alexander; Terakawa, Mitsuhiro
2016-05-01
The use of small particles has expanded the capability of ultrashort pulsed laser optoinjection technology toward simultaneous treatment of multiple cells. The microfluidic platform is one of the attractive systems that has obtained synergy with laser-based technology for cell manipulation, including optoinjection. We have demonstrated the delivery of molecules into suspended, flowing cells in a microfluidic channel by using biodegradable polymer microspheres and a near-infrared femtosecond laser pulse. The use of polylactic-co-glycolic acid microspheres realized not only a higher optoinjection ratio than polylactic acid microspheres but also avoided optical damage to the microfluidic chip, which is attributable to its higher optical intensity enhancement at the localized spot under a microsphere. Interestingly, optoinjection ratios to the nucleus differed between adhered cells and suspended cells. The use of biodegradable polymer microspheres provides high-throughput optoinjection; i.e., multiple cells can be treated in a short time, which is promising for various applications in cell analysis, drug delivery, and ex vivo gene transfection to bone marrow cells and stem cells without concerns about residual microspheres.
NASA Technical Reports Server (NTRS)
Prater, Tracie
2016-01-01
Selective Laser Melting (SLM) is a powder bed fusion additive manufacturing process used increasingly in the aerospace industry to reduce the cost, weight, and fabrication time for complex propulsion components. SLM stands poised to revolutionize propulsion manufacturing, but there are a number of technical questions that must be addressed in order to achieve rapid, efficient fabrication and ensure adequate performance of parts manufactured using this process in safety-critical flight applications. Previous optimization studies for SLM using the Concept Laser M1 and M2 machines at NASA Marshall Space Flight Center have centered on machine default parameters. The objective of this work is to characterize the impact of higher throughput parameters (a previously unexplored region of the manufacturing operating envelope for this application) on material consolidation. In phase I of this work, density blocks were analyzed to explore the relationship between build parameters (laser power, scan speed, hatch spacing, and layer thickness) and material consolidation (assessed in terms of as-built density and porosity). Phase II additionally considers the impact of post-processing, specifically hot isostatic pressing and heat treatment, as well as deposition pattern on material consolidation in the same higher energy parameter regime considered in the phase I work. Density and microstructure represent the "first-gate" metrics for determining the adequacy of the SLM process in this parameter range and, as a critical initial indicator of material quality, will factor into a follow-on DOE that assesses the impact of these parameters on mechanical properties. This work will contribute to creating a knowledge base (understanding material behavior in all ranges of the AM equipment operating envelope) that is critical to transitioning AM from the custom low rate production sphere it currently occupies to the world of mass high rate production, where parts are fabricated at a rapid rate with confidence that they will meet or exceed all stringent functional requirements for spaceflight hardware. These studies will also provide important data on the sensitivity of material consolidation to process parameters that will inform the design and development of future flight articles using SLM.
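The four build parameters listed above (laser power, scan speed, hatch spacing, layer thickness) are commonly collapsed into a volumetric energy density when comparing SLM parameter sets; whether this study used that exact metric is not stated, so the sketch below should be read as the standard textbook definition with hypothetical values rather than the study's parameters.

```python
# Standard volumetric energy density used to compare SLM parameter sets
# (common in the literature; not necessarily the metric used in this study).
# Parameter values below are hypothetical.
def energy_density(power_w, scan_speed_mm_s, hatch_mm, layer_mm):
    """E = P / (v * h * t), in J/mm^3."""
    return power_w / (scan_speed_mm_s * hatch_mm * layer_mm)

# Nominal vs. higher-throughput parameter set (hypothetical numbers)
print(energy_density(180, 600, 0.09, 0.03))    # ~111 J/mm^3
print(energy_density(370, 1800, 0.12, 0.06))   # ~28.5 J/mm^3 -> porosity risk
```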
Crystal Symmetry Algorithms in a High-Throughput Framework for Materials
NASA Astrophysics Data System (ADS)
Taylor, Richard
The high-throughput framework AFLOW that has been developed and used successfully over the last decade is improved to include fully-integrated software for crystallographic symmetry characterization. The standards used in the symmetry algorithms conform to the conventions and prescriptions given in the International Tables for Crystallography (ITC). A standard cell choice with standard origin is selected, and the space group, point group, Bravais lattice, crystal system, lattice system, and representative symmetry operations are determined. Following the conventions of the ITC, the Wyckoff sites are also determined and their labels and site symmetry are provided. The symmetry code makes no assumptions about the input cell orientation, origin, or reduction and has been integrated into the AFLOW high-throughput framework for materials discovery by adding to the existing code base and making use of existing classes and functions. The software is written in object-oriented C++ for flexibility and reuse. A performance analysis and an examination of the algorithms' scaling with cell size and symmetry are also reported.
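AFLOW's symmetry module itself is C++ and is not reproduced here; as an illustration of the same kind of ITC-convention space-group determination, the snippet below uses the independent spglib Python bindings on a simple test cell.

```python
# Illustration of ITC-convention space-group determination on a simple cell,
# using the independent spglib library (this is not the AFLOW C++ code).
import spglib

# bcc iron: conventional cubic cell with two atoms (fractional coordinates)
lattice = [[2.87, 0, 0], [0, 2.87, 0], [0, 0, 2.87]]
positions = [[0.0, 0.0, 0.0], [0.5, 0.5, 0.5]]
numbers = [26, 26]

cell = (lattice, positions, numbers)
print(spglib.get_spacegroup(cell, symprec=1e-5))   # e.g. "Im-3m (229)"
```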
Mathematical and Computational Modeling in Complex Biological Systems
Li, Wenyang; Zhu, Xiaoliang
2017-01-01
The biological processes and molecular functions involved in cancer progression remain difficult to understand for biologists and clinical doctors. Recent developments in high-throughput technologies urge systems biology to achieve more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the development of high-throughput technologies and systemic modeling of biological processes in cancer research. In this review, we first studied several typical mathematical modeling approaches of biological systems at different scales and analyzed in depth their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling were summarized. To conclude, this review provides an update of important solutions using computational modeling approaches in systems biology. PMID:28386558
Urasaki, Yasuyo; Fiscus, Ronald R; Le, Thuc T
2016-04-01
We describe an alternative approach to classifying fatty liver by profiling protein post-translational modifications (PTMs) with high-throughput capillary isoelectric focusing (cIEF) immunoassays. Four strains of mice were studied, with fatty livers induced by different causes, such as ageing, genetic mutation, acute drug usage, and high-fat diet. Nutrient-sensitive PTMs of a panel of 12 liver metabolic and signalling proteins were simultaneously evaluated with cIEF immunoassays, using nanograms of total cellular protein per assay. Changes to liver protein acetylation, phosphorylation, and O-N-acetylglucosamine glycosylation were quantified and compared between normal and diseased states. Fatty liver tissues could be distinguished from one another by distinctive protein PTM profiles. Fatty liver is currently classified by morphological assessment of lipid droplets, without identifying the underlying molecular causes. In contrast, high-throughput profiling of protein PTMs has the potential to provide molecular classification of fatty liver. Copyright © 2016 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.
High-Throughput Assessment of Cellular Mechanical Properties.
Darling, Eric M; Di Carlo, Dino
2015-01-01
Traditionally, cell analysis has focused on using molecular biomarkers for basic research, cell preparation, and clinical diagnostics; however, new microtechnologies are enabling evaluation of the mechanical properties of cells at throughputs that make them amenable to widespread use. We review the current understanding of how the mechanical characteristics of cells relate to underlying molecular and architectural changes, describe how these changes evolve with cell-state and disease processes, and propose promising biomedical applications that will be facilitated by the increased throughput of mechanical testing: from diagnosing cancer and monitoring immune states to preparing cells for regenerative medicine. We provide background about techniques that laid the groundwork for the quantitative understanding of cell mechanics and discuss current efforts to develop robust techniques for rapid analysis that aim to implement mechanophenotyping as a routine tool in biomedicine. Looking forward, we describe additional milestones that will facilitate broad adoption, as well as new directions not only in mechanically assessing cells but also in perturbing them to passively engineer cell state.
Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian
2005-01-01
This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead-time before commencement of large-scale bioreactor experiments. This therefore greatly simplifies the optimisation process and allows the use of liquid-handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.