CFTLB: a novel cross-layer fault tolerant and load balancing protocol for WMN
NASA Astrophysics Data System (ADS)
Krishnaveni, N. N.; Chitra, K.
2017-12-01
A wireless mesh network (WMN) forms a wireless backbone for multi-hop transmission among routers and clients over an extensible coverage area. To improve the throughput of WMNs with multiple gateways (GWs), several issues must be addressed, including GW selection, load balancing and frequent link failures caused by dynamic obstacles and channel interference. This paper presents a novel cross-layer fault tolerant and load balancing (CFTLB) protocol to overcome these issues in WMNs. Initially, the neighbouring GWs are searched and their channel loads are calculated; the GW with the least channel load, estimated at the arrival of a new node, is selected. Under high loading, the proposed algorithm finds alternate GWs and calculates their channel availability: if the current load on the serving GW is high, another GW is identified, channel availability is calculated, channel switching is initiated and communication with the mesh client is established effectively. The hashing technique used in CFTLB verifies the status of packets, and the protocol achieves better router average throughput, overall throughput and average channel access time, as well as lower end-to-end delay, communication overhead and average data loss in the channel, compared with the existing protocols.
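The gateway-selection step described in this abstract lends itself to a compact illustration. The sketch below is hypothetical (the paper gives no pseudocode); the `Gateway` structure, the `channel_load` field and the load threshold are invented names used only for illustration.

```python
from dataclasses import dataclass

@dataclass
class Gateway:
    gw_id: str
    channel_load: float      # e.g. fraction of channel capacity in use
    available_channels: int  # channels free for switching

def select_gateway(gateways, load_threshold=0.8):
    """Pick the gateway with the least channel load; if even that one is
    above the threshold, fall back to an alternate with free channels."""
    best = min(gateways, key=lambda g: g.channel_load)
    if best.channel_load <= load_threshold:
        return best
    # High-load case: look for an alternate gateway with channel availability.
    alternates = [g for g in gateways if g.available_channels > 0 and g is not best]
    return min(alternates, key=lambda g: g.channel_load) if alternates else best

# Example: a new node arriving sees three candidate gateways.
gws = [Gateway("gw1", 0.9, 2), Gateway("gw2", 0.4, 1), Gateway("gw3", 0.7, 0)]
print(select_gateway(gws).gw_id)  # -> gw2
```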
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly, Ryan T.; Wang, Chenchen; Rausch, Sarah J.
2014-07-01
A hybrid microchip/capillary CE system was developed to allow unbiased and lossless sample loading and high throughput repeated injections. This new hybrid CE system consists of a polydimethylsiloxane (PDMS) microchip sample injector featuring a pneumatic microvalve that separates a sample introduction channel from a short sample loading channel and a fused silica capillary separation column that connects seamlessly to the sample loading channel. The sample introduction channel is pressurized such that when the pneumatic microvalve opens briefly, a variable-volume sample plug is introduced into the loading channel. A high voltage for CE separation is continuously applied across the loading channel and the fused silica capillary separation column. Analytes are rapidly separated in the fused silica capillary with high resolution. High sensitivity MS detection after CE separation is accomplished via a sheathless CE/ESI-MS interface. The performance evaluation of the complete CE/ESI-MS platform demonstrated that reproducible sample injection with well controlled sample plug volumes could be achieved by using the PDMS microchip injector. The absence of band broadening from microchip to capillary indicated a minimum dead volume at the junction. The capabilities of the new CE/ESI-MS platform in performing high throughput and quantitative sample analyses were demonstrated by the repeated sample injection without interrupting an ongoing separation and a good linear dependence of the total analyte ion abundance on the sample plug volume using a mixture of peptide standards. The separation efficiency of the new platform was also evaluated systematically at different sample injection times, flow rates and CE separation voltages.
Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.
Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin
2016-02-01
High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.
Adaptation to high throughput batch chromatography enhances multivariate screening.
Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried
2015-09-01
High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sakai, Kenichi; Obata, Kouki; Yoshikawa, Mayumi; Takano, Ryusuke; Shibata, Masaki; Maeda, Hiroyuki; Mizutani, Akihiko; Terada, Katsuhide
2012-10-01
The aim was to design a high drug loading formulation of a self-microemulsifying/micelle system. A poorly soluble model drug (CH5137291), 8 hydrophilic surfactants (HS), 10 lipophilic surfactants (LS), 5 oils, and PEG400 were used. A high loading formulation was designed by the following stepwise approach using a high-throughput formulation screening (HTFS) system: (1) an oil/solvent was selected based on the solubility of the drug; (2) a suitable HS for high loading was selected by screening emulsion/micelle size and phase stability in binary systems (HS, oil/solvent) with increasing loading levels; (3) an LS that formed a broad SMEDDS/micelle area on a phase diagram containing the HS and oil/solvent was selected by the same screenings; (4) an optimized formulation was selected by evaluating the loading capacity of the crystalline drug. The aqueous solubility behavior and oral absorption (Beagle dog) of the optimized formulation were compared with conventional formulations (jet-milled, PEG400). As the optimized formulation, d-α-tocopheryl polyoxyethylene 1000 succinic ester:PEG400 = 8:2 was selected, which achieved the target loading level (200 mg/mL). The formulation formed fine emulsions/micelles (49.1 nm), and generated and maintained a supersaturated state at a higher level than the conventional formulations. In the oral absorption test, the area under the plasma concentration-time curve of the optimized formulation was 16.5-fold higher than that of the jet-milled formulation. The high loading formulation designed by the stepwise approach using the HTFS system improved the oral absorption of the poorly soluble model drug.
NASA Astrophysics Data System (ADS)
Huntzinger, D. N.; McCray, J. E.; Siegrist, R.; Lowe, K.; VanCuyk, S.
2001-05-01
Sixteen one-dimensional column lysimeters have been developed to evaluate the influence of loading regime and infiltrative surface character on hydraulic performance in wastewater soil absorption systems. A duplicate design was utilized to evaluate two infiltrative surface conditions (gravel-free vs. gravel-laden) under four hydraulic loading regimes representative of possible field conditions. By loading the columns at rates of 25 to 200 cm/day, the 17 weeks of column operation reflect up to approximately 13 yrs of field operation (at 5 cm/day). Therefore, the cumulative mass throughput and infiltrative rate loss for each loading regime can be examined to determine the viability of accelerated loading as a means to compress the time scale of observation while still producing meaningful results for the field scale. During operation, the columns were loaded with septic tank effluent at a prescribed rate and routinely monitored for applied effluent composition, infiltration rate, time-dependent soil water content, water volume throughput, and percolate composition. Bromide tracer tests were completed prior to system startup and at weeks 2, 6, and 17 of system operation. Hydraulic characterization of the columns is based on measurements of the hydraulic loading rate, volumetric throughput, soil water content, and bromide breakthrough curves. Incipient ponding of wastewater developed during the 1st week of operation for columns loaded at the highest hydraulic rate (loading regimes 1 and 2), and during the 3rd and 6th weeks of operation for loading regimes 3 and 4, respectively. The bromide breakthrough curves exhibit later breakthrough and tailing as system life increases, indicating the development of spatial variability in hydraulic conductivity within the column and the development of a clogging zone at the infiltrative surface. Throughput is assessed for each loading regime to determine the infiltration rate loss versus days of operation. Loading regimes 1 and 2 approach a comparable long-term throughput rate of less than 20 cm/day, while loading regimes 3 and 4 reach a long-term throughput rate of less than 10 cm/day. These one-dimensional columns allow for the analysis of infiltrative rate loss and hydraulic behavior as a result of infiltrative surface character and loading regime.
High throughput chemical munitions treatment system
Haroldsen, Brent L [Manteca, CA; Stofleth, Jerome H [Albuquerque, NM; Didlake, Jr., John E.; Wu, Benjamin C-P [San Ramon, CA
2011-11-01
A new High-Throughput Explosive Destruction System is disclosed. The new system is comprised of two side-by-side detonation containment vessels, each comprising first and second halves, that feed into a single agent treatment vessel. Both detonation containment vessels further comprise a surrounding ventilation facility. Moreover, the detonation containment vessels are designed to separate into two half-shells, wherein one shell can be moved axially away from the fixed second half for ease of access and loading. The vessels are closed by means of surrounding, clam-shell-type locking seal mechanisms.
Architecture, component, and microbiome of biofilm involved in the fouling of membrane bioreactors.
Inaba, Tomohiro; Hori, Tomoyuki; Aizawa, Hidenobu; Ogata, Atsushi; Habe, Hiroshi
2017-01-01
Biofilm formation on the filtration membrane and the subsequent clogging of membrane pores (called biofouling) is one of the most persistent problems in membrane bioreactors for wastewater treatment and reclamation. Here, we investigated the structure and microbiome of fouling-related biofilms in the membrane bioreactor using non-destructive confocal reflection microscopy and high-throughput Illumina sequencing of 16S rRNA genes. Direct confocal reflection microscopy indicated that the thin biofilms were formed and maintained regardless of the increasing transmembrane pressure, which is a common indicator of membrane fouling, at low organic-loading rates. Their solid components were primarily extracellular polysaccharides and microbial cells. In contrast, high organic-loading rates resulted in a rapid increase in the transmembrane pressure and the development of the thick biofilms mainly composed of extracellular lipids. High-throughput sequencing revealed that the biofilm microbiomes, including major and minor microorganisms, substantially changed in response to the organic-loading rates and biofilm development. These results demonstrated for the first time that the architectures, chemical components, and microbiomes of the biofilms on fouled membranes were tightly associated with one another and differed considerably depending on the organic-loading conditions in the membrane bioreactor, emphasizing the significance of alternative indicators other than the transmembrane pressure for membrane biofouling.
A High-Throughput Processor for Flight Control Research Using Small UAVs
NASA Technical Reports Server (NTRS)
Klenke, Robert H.; Sleeman, W. C., IV; Motter, Mark A.
2006-01-01
There are numerous autopilot systems that are commercially available for small (<100 lbs) UAVs. However, they all share several key disadvantages for conducting aerodynamic research, chief amongst which is the fact that most utilize older, slower, 8- or 16-bit microcontroller technologies. This paper describes the development and testing of a flight control system (FCS) for small UAVs based on a modern, high throughput, embedded processor. In addition, this FCS platform contains user-configurable hardware resources in the form of a Field Programmable Gate Array (FPGA) that can be used to implement custom, application-specific hardware. This hardware can be used to off-load routine tasks, such as sensor data collection, from the FCS processor, thereby further increasing the computational throughput of the system.
Effect of solar loading on greenhouse containers used in transpiration efficiency screening
USDA-ARS?s Scientific Manuscript database
Earlier we described a simple high throughput method of screening sorghum for transpiration efficiency (TE). Subsequently it was observed that while results were consistent between lines exhibiting high and low TE, ranking between lines with similar TE was variable. We hypothesized that variable mic...
Nanosurveyor: a framework for real-time data processing
Daurer, Benedikt J.; Krishnan, Hari; Perciano, Talita; ...
2017-01-31
Background: The ever improving brightness of accelerator-based sources is enabling novel observations and discoveries with faster frame rates, larger fields of view, higher resolution, and higher dimensionality. Results: Here we present an integrated software/algorithmic framework designed to capitalize on high-throughput experiments through efficient kernels and load-balanced workflows that are scalable by design. We describe the streamlined processing pipeline for ptychography data analysis. Conclusions: The pipeline provides high throughput, compression, and resolution, as well as rapid feedback to the microscope operators.
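As an illustration of the load-balanced streaming idea described here, the following sketch shows a generic worker-pool pipeline in Python. It is not the Nanosurveyor code; `process_frame` is a placeholder for the real ptychography kernels.

```python
# Illustrative sketch of a load-balanced streaming pipeline: a queue of
# incoming detector frames is fanned out to worker processes, and reduced
# results are streamed back for live feedback.
import multiprocessing as mp

def process_frame(frame):
    # Placeholder kernel: in ptychography this would be e.g. cropping,
    # background subtraction and a reconstruction update for one frame.
    return sum(frame) / len(frame)

def run_pipeline(frames, n_workers=4):
    with mp.Pool(n_workers) as pool:
        # imap_unordered gives natural load balancing: idle workers pull
        # the next frame as soon as they finish the previous one.
        for result in pool.imap_unordered(process_frame, frames):
            yield result   # stream results back to the operator display

if __name__ == "__main__":
    fake_frames = [[i, i + 1, i + 2] for i in range(16)]
    print(list(run_pipeline(fake_frames)))
```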
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graf, Peter A.; Stewart, Gordon; Lackner, Matthew
Long-term fatigue loads for floating offshore wind turbines are hard to estimate because they require the evaluation of the integral of a highly nonlinear function over a wide variety of wind and wave conditions. Current design standards involve scanning over a uniform rectangular grid of metocean inputs (e.g., wind speed and direction and wave height and period), which becomes intractable in high dimensions as the number of required evaluations grows exponentially with dimension. Monte Carlo integration offers a potentially efficient alternative because it has theoretical convergence proportional to the inverse of the square root of the number of samples, which is independent of dimension. In this paper, we first report on the integration of the aeroelastic code FAST into NREL's systems engineering tool, WISDEM, and the development of a high-throughput pipeline capable of sampling from arbitrary distributions, running FAST on a large scale, and postprocessing the results into estimates of fatigue loads. Second, we use this tool to run a variety of studies aimed at comparing grid-based and Monte Carlo-based approaches to calculating long-term fatigue loads. We observe that for more than a few dimensions, the Monte Carlo approach can represent a large improvement in computational efficiency, but that as nonlinearity increases, the effectiveness of Monte Carlo is correspondingly reduced. The present work sets the stage for future research focusing on using advanced statistical methods for analysis of wind turbine fatigue as well as extreme loads.
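The grid-versus-Monte Carlo comparison at the heart of this abstract can be illustrated with a toy integrand. The sketch below is not the FAST/WISDEM pipeline; `toy_load` and the assumed Weibull/Rayleigh marginals merely stand in for the expensive aeroelastic model and the real metocean statistics.

```python
# Toy comparison of grid-based vs Monte Carlo estimation of an expected
# fatigue load over a metocean distribution, using the same budget of
# 100 "model" evaluations for each approach.
import numpy as np

rng = np.random.default_rng(0)

def toy_load(wind_speed, wave_height):
    # Hypothetical nonlinear damage-equivalent-load surrogate.
    return wind_speed ** 1.5 * (1.0 + 0.3 * wave_height ** 2)

wind_dist = lambda n: rng.weibull(2.0, n) * 10.0   # assumed marginal (m/s)
wave_dist = lambda n: rng.rayleigh(1.5, n)         # assumed marginal (m)

# Grid-based estimate: tensor-product grid, cells weighted by probability mass.
ws_edges, hs_edges = np.linspace(0, 30, 11), np.linspace(0, 8, 11)
ws_mid = 0.5 * (ws_edges[:-1] + ws_edges[1:])
hs_mid = 0.5 * (hs_edges[:-1] + hs_edges[1:])
pw = np.histogram(wind_dist(1_000_000), bins=ws_edges)[0] / 1e6
ph = np.histogram(wave_dist(1_000_000), bins=hs_edges)[0] / 1e6
W, H = np.meshgrid(ws_mid, hs_mid, indexing="ij")
grid_estimate = np.sum(toy_load(W, H) * np.outer(pw, ph))   # 100 evaluations

# Monte Carlo estimate with the same evaluation budget.
mc_estimate = toy_load(wind_dist(100), wave_dist(100)).mean()

print(f"grid: {grid_estimate:.1f}   monte carlo: {mc_estimate:.1f}")
```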
Fernandes, Richard; Carey, Conn; Hynes, James; Papkovsky, Dmitri
2013-01-01
The importance of food safety has resulted in a demand for a more rapid, high-throughput method for total viable count (TVC). The industry standard for TVC determination (ISO 4833:2003) is widely used but presents users with some drawbacks. The method is materials- and labor-intensive, requiring multiple agar plates per sample. More importantly, the method is slow, with 72 h typically required for a definitive result. Luxcel Biosciences has developed the GreenLight Model 960, a microtiter plate-based assay providing a rapid high-throughput method of aerobic bacterial load assessment through analysis of microbial oxygen consumption. Results are generated in 1-12 h, depending on microbial load. The mix and measure procedure allows rapid detection of microbial oxygen consumption and equates oxygen consumption to microbial load (CFU/g), providing a simple, sensitive means of assessing the microbial contamination levels in foods (1). As bacteria in the test sample grow and respire, they deplete O2, which is detected as an increase in the GreenLight probe signal above the baseline level (2). The time required to reach this increase in signal can be used to calculate the CFU/g of the original sample, based on a predetermined calibration. The higher the initial microbial load, the earlier this threshold is reached (1).
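The time-to-threshold principle described above can be illustrated with a short sketch. The log-linear calibration constants below are invented for illustration; actual instruments apply their own predetermined calibration between threshold time and CFU/g.

```python
# Illustrative sketch of the time-to-threshold principle: the higher the
# initial microbial load, the sooner the oxygen-probe signal crosses a
# threshold above baseline. The calibration below is assumed, not real.

# Hypothetical calibration: threshold_time_h = A - B * log10(CFU_per_g)
A, B = 14.0, 2.0

def cfu_from_threshold_time(t_threshold_h):
    """Invert the assumed calibration to estimate CFU/g from the time at
    which the probe signal rises above baseline."""
    return 10 ** ((A - t_threshold_h) / B)

for t in (2.0, 6.0, 10.0):
    print(f"threshold at {t:>4.1f} h  ->  ~{cfu_from_threshold_time(t):.1e} CFU/g")
```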
Wu, Shanshan; Wu, Siying; Yi, Zheyuan; Zeng, Fei; Wu, Weizhen; Qiao, Yuan; Zhao, Xingzhong; Cheng, Xing; Tian, Yanqing
2018-02-13
In this study, we developed fluorescent dual pH and oxygen sensors loaded in multi-well plates for in-situ and high-throughput monitoring of oxygen respiration and extracellular acidification during microbial cell growth for understanding metabolism. Biocompatible PHEMA-co-PAM materials were used as the hydrogel matrix. A polymerizable oxygen probe (OS2) derived from PtTFPP and a polymerizable pH probe (S2) derived from fluorescein were chemically conjugated into the matrix to prevent probe leaching from the matrix. Gels were allowed to cure directly on the bottom of 96-well plates at room temperature via redox polymerization. The influence of the matrix composition on the sensing behavior was investigated to obtain hydrogels with enough robustness for repeatable use and good sensitivity. Responses of the dual sensing hydrogels to dissolved oxygen (DO) and pH were studied. These dual oxygen-pH sensing plates were successfully used for microbial cell-based screening assays, which are based on the measurement of fluorescence intensity changes induced by cellular oxygen consumption and pH changes during microbial growth. This method may provide real-time monitoring of cellular respiration and acidification and a rapid kinetic assessment of multiple samples for cell viability as well as high-throughput drug screening. All of these assays can be carried out with a conventional plate reader.
Fang, Hui; Xiao, Qing; Wu, Fanghui; Floreancig, Paul E.; Weber, Stephen G.
2010-01-01
A high-throughput screening system for homogeneous catalyst discovery has been developed by integrating a continuous-flow capillary-based microreactor with ultra-high pressure liquid chromatography (UHPLC) for fast online analysis. Reactions are conducted in distinct and stable zones in a flow stream that allows for time and temperature regulation. UHPLC detection at high temperature allows high throughput online determination of substrate, product, and byproduct concentrations. We evaluated the efficacies of a series of soluble acid catalysts for an intramolecular Friedel-Crafts addition into an acyliminium ion intermediate within one day and with minimal material investment. The effects of catalyst loading, reaction time, and reaction temperature were also screened. This system exhibited high reproducibility for high-throughput catalyst screening and allowed several acid catalysts for the reaction to be identified. Major side products from the reactions were determined through off-line mass spectrometric detection. Er(OTf)3, the catalyst that showed optimal efficiency in the screening, was shown to be effective at promoting the cyclization reaction on a preparative scale. PMID:20666502
Microbial Community in a Biofilter for Removal of Low Load Nitrobenzene Waste Gas
Zhai, Jian; Wang, Zhu; Shi, Peng; Long, Chao
2017-01-01
To improve biofilter performance, the microbial community of a biofilter must be clearly defined. In this study, the performance of a lab-scale polyurethane biofilter for treating waste gas with low loads of nitrobenzene (NB) (<20 g m⁻³ h⁻¹) was investigated at different empty bed residence times (EBRT) (64, 55.4 and 34 s, respectively). In addition, the variations of the bacterial community in the biofilm along the longitudinal distribution of the biofilter were analysed by using Illumina MiSeq high-throughput sequencing. The results showed that NB waste gas was successfully degraded in the biofilter. High-throughput sequencing data suggested that the phylum Actinobacteria and the genus Rhodococcus played important roles in the degradation of NB. The variations of the microbial community were attributed to the different intermediate degradation products of NB in each layer. The strains identified in this study are potential candidates for purifying waste gas effluents containing NB. PMID:28114416
Optimization of throughput in semipreparative chiral liquid chromatography using stacked injection.
Taheri, Mohammadreza; Fotovati, Mohsen; Hosseini, Seyed-Kiumars; Ghassempour, Alireza
2017-10-01
An interesting mode of chromatography for the preparation of pure enantiomers from pure samples is stacked injection, a pseudocontinuous procedure. Maximum throughput and minimal production costs can be achieved by using the total chiral column length in this mode of chromatography. To maximize sample loading, the column is commonly loaded up to the point at which the bands of the two enantiomers just touch. Conventional equations show a direct correlation between touching-band loadability and the selectivity factor of the two enantiomers. The important question for one who wants to obtain the highest throughput is "How can different factors, including selectivity, resolution, run time, and sample loading, be optimized to save time without missing the touching-band resolution?" To answer this question, tramadol and propranolol were separated on cellulose 3,5-dimethyl phenyl carbamate as two pure racemic mixtures with low and high solubilities in the mobile phase, respectively. The mobile phase consisted of n-hexane with an alcohol modifier and diethylamine as the additive. A response surface methodology based on a central composite design was used to optimize the separation factors against the main responses. According to the stacked injection properties, two processes were investigated for maximizing throughput: one with a poorly soluble and another with a highly soluble racemic mixture. For each case, different optimization possibilities were inspected. It was revealed that resolution is a crucial response for separations of this kind. Peak area and run time are two critical parameters in the optimization of stacked injection for binary mixtures that have low solubility in the mobile phase. © 2017 Wiley Periodicals, Inc.
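The response-surface optimization referred to above can be sketched as follows. The factors, design points and response values are invented for illustration; the point is only to show how a second-order model fitted to screening data can be searched for the settings that maximize throughput subject to a resolution constraint.

```python
import numpy as np

# Face-centered central composite design (alpha = 1) in coded units:
# 8 factorial + 6 axial + 2 center points for three factors
# (e.g. modifier %, additive %, flow rate).
factorial = np.array([[i, j, k] for i in (-1, 1) for j in (-1, 1) for k in (-1, 1)])
axial = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                  [0, -1, 0], [0, 0, 1], [0, 0, -1]])
center = np.zeros((2, 3))
X = np.vstack([factorial, axial, center])

# Invented screening responses at the 16 design points.
resolution = np.array([2.1, 1.4, 2.6, 1.8, 1.9, 1.2, 2.4, 1.6,
                       1.5, 2.3, 1.7, 2.1, 1.8, 2.0, 2.0, 2.1])
throughput = np.array([5, 9, 4, 8, 7, 12, 6, 10,
                       11, 5, 7, 8, 9, 6, 8, 8], dtype=float)

def quad_terms(x):
    x1, x2, x3 = x.T
    return np.column_stack([np.ones(len(x)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

b_res = np.linalg.lstsq(quad_terms(X), resolution, rcond=None)[0]
b_thr = np.linalg.lstsq(quad_terms(X), throughput, rcond=None)[0]

# Search the coded design space: maximize predicted throughput with Rs >= 1.5.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
feasible = quad_terms(grid) @ b_res >= 1.5
best = grid[feasible][np.argmax((quad_terms(grid) @ b_thr)[feasible])]
print("best coded factor settings:", best)
```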
High-throughput microcoil NMR of compound libraries using zero-dispersion segmented flow analysis.
Kautz, Roger A; Goetzinger, Wolfgang K; Karger, Barry L
2005-01-01
An automated system for loading samples into a microcoil NMR probe has been developed using segmented flow analysis. This approach enhanced 2-fold the throughput of the published direct injection and flow injection methods, improved sample utilization 3-fold, and was applicable to high-field NMR facilities with long transfer lines between the sample handler and NMR magnet. Sample volumes of 2 μL (10-30 mM, approximately 10 μg) were drawn from a 96-well microtiter plate by a sample handler, then pumped to a 0.5-μL microcoil NMR probe as a queue of closely spaced "plugs" separated by an immiscible fluorocarbon fluid. Individual sample plugs were detected by their NMR signal and automatically positioned for stopped-flow data acquisition. The sample in the NMR coil could be changed within 35 s by advancing the queue. The fluorocarbon liquid wetted the wall of the Teflon transfer line, preventing the DMSO samples from contacting the capillary wall and thus reducing sample losses to below 5% after passage through the 3-m transfer line. With a wash plug of solvent between samples, sample-to-sample carryover was <1%. Significantly, the samples did not disperse into the carrier liquid during loading or during acquisitions of several days for trace analysis. For automated high-throughput analysis using a 16-second acquisition time, spectra were recorded at a rate of 1.5 min/sample and total deuterated solvent consumption was <0.5 mL (1 US dollar) per 96-well plate.
High-throughput microplate technique for enzymatic hydrolysis of lignocellulosic biomass.
Chundawat, Shishir P S; Balan, Venkatesh; Dale, Bruce E
2008-04-15
Several factors will influence the viability of a biochemical platform for manufacturing lignocellulosic based fuels and chemicals, for example, genetically engineering energy crops, reducing pre-treatment severity, and minimizing enzyme loading. Past research on biomass conversion has focused largely on acid based pre-treatment technologies that fractionate lignin and hemicellulose from cellulose. However, for alkaline based (e.g., AFEX) and other lower severity pre-treatments it becomes critical to co-hydrolyze cellulose and hemicellulose using an optimized enzyme cocktail. Lignocellulosics are appropriate substrates to assess hydrolytic activity of enzyme mixtures compared to conventional unrealistic substrates (e.g., filter paper, chromogenic, and fluorigenic compounds) for studying synergistic hydrolysis. However, there are few, if any, high-throughput lignocellulosic digestibility analytical platforms for optimizing biomass conversion. The 96-well Biomass Conversion Research Lab (BCRL) microplate method is a high-throughput assay to study digestibility of lignocellulosic biomass as a function of biomass composition, pre-treatment severity, and enzyme composition. The most suitable method for delivering milled biomass to the microplate was through multi-pipetting slurry suspensions. A rapid bio-enzymatic, spectrophotometric assay was used to determine fermentable sugars. The entire procedure was automated using a robotic pipetting workstation. Several parameters that affect hydrolysis in the microplate were studied and optimized (i.e., particle size reduction, slurry solids concentration, glucan loading, mass transfer issues, and time period for hydrolysis). The microplate method was optimized for crystalline cellulose (Avicel) and ammonia fiber expansion (AFEX) pre-treated corn stover. Copyright 2008 Wiley Periodicals, Inc.
Chebrolu, Kranthi K; Yousef, Gad G; Park, Ryan; Tanimura, Yoshinori; Brown, Allan F
2015-09-15
A high-throughput, robust and reliable method for the simultaneous analysis of five carotenoids, four chlorophylls and one tocopherol was developed for rapid screening of large sample populations to facilitate molecular biology and plant breeding. Separation was achieved for 10 known analytes and four unknown carotenoids in a significantly reduced run time of 10 min. The identity of the 10 analytes was confirmed by their UV-Vis absorption spectra. Quantification of tocopherol, carotenoids and chlorophylls was performed at 290 nm, 460 nm and 650 nm, respectively. In this report, two sub-2-μm-particle core-shell columns, Kinetex from Phenomenex (1.7 μm particle size, 12% carbon load) and Cortecs from Waters (1.6 μm particle size, 6.6% carbon load), were investigated and their separation efficiencies were evaluated. Peak resolutions were >1.5 for all analytes except chlorophyll-a' with the Cortecs column. The ruggedness of the method was evaluated on two identical but separate instruments, which produced a CV < 2 in peak retention times for nine out of the 10 analytes separated. Copyright © 2015 Elsevier B.V. All rights reserved.
Cai, Shaobo; Pourdeyhimi, Behnam; Loboa, Elizabeth G
2017-06-28
In this study, we report a high-throughput fabrication method at industrial pilot scale to produce a silver-nanoparticles-doped nanoclay-polylactic acid composite with a novel synergistic antibacterial effect. The obtained nanocomposite has a significantly lower affinity for bacterial adhesion, allowing the loading amount of silver nanoparticles to be tremendously reduced while maintaining satisfactory antibacterial efficacy at the material interface. This is a great advantage for many antibacterial applications in which cost is a consideration. Furthermore, unlike previously reported methods that require additional chemical reduction processes to produce the silver-nanoparticles-doped nanoclay, an in situ preparation method was developed in which silver nanoparticles were created simultaneously during the composite fabrication process by thermal reduction. This is the first report to show that altered material surface submicron structures created with the loading of nanoclay enables the creation of a nanocomposite with significantly lower affinity for bacterial adhesion. This study provides a promising scalable approach to produce antibacterial polymeric products with minimal changes to industry standard equipment, fabrication processes, or raw material input cost.
Apparatus for combinatorial screening of electrochemical materials
Kepler, Keith Douglas [Belmont, CA; Wang, Yu [Foster City, CA
2009-12-15
A high throughput combinatorial screening method and apparatus for the evaluation of electrochemical materials using a single voltage source (2) is disclosed wherein temperature changes arising from the application of an electrical load to a cell array (1) are used to evaluate the relative electrochemical efficiency of the materials comprising the array. The apparatus may include an array of electrochemical cells (1) that are connected to each other in parallel or in series, an electronic load (2) for applying a voltage or current to the electrochemical cells (1), and a device (3), external to the cells, for monitoring the relative temperature of each cell when the load is applied.
NASA Astrophysics Data System (ADS)
Mughal, A.; Newman, H.
2017-10-01
We review and demonstrate the design of efficient data transfer nodes (DTNs), from the perspective of the highest throughput over both local and wide area networks, as well as the highest performance per unit cost. A careful system-level design is required for the hardware, firmware, OS and software components. Furthermore, additional tuning of these components, and the identification and elimination of any remaining bottlenecks is needed once the system is assembled and commissioned, in order to obtain optimal performance. For high throughput data transfers, specialized software is used to overcome the traditional limits in performance caused by the OS, file system, file structures used, etc. Concretely, we will discuss and present the latest results using Fast Data Transfer (FDT), developed by Caltech. We present and discuss the design choices for three generations of Caltech DTNs. Their transfer capabilities range from 40 Gbps to 400 Gbps. Disk throughput is still the biggest challenge in the current generation of available hardware. However, new NVME drives combined with RDMA and a new NVME network fabric are expected to improve the overall data-transfer throughput and simultaneously reduce the CPU load on the end nodes.
Melter Throughput Enhancements for High-Iron HLW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruger, A. A.; Gan, Hoa; Joseph, Innocent
2012-12-26
This report describes work performed to develop and test new glass and feed formulations in order to increase glass melting rates in high waste loading glass formulations for HLW with high concentrations of iron. Testing was designed to identify glass and melter feed formulations that optimize waste loading and waste processing rate while meeting all processing and product quality requirements. The work included preparation and characterization of crucible melts to assess melt rate using a vertical gradient furnace system and to develop new formulations with enhanced melt rate. Testing evaluated the effects of waste loading on glass properties and the maximum waste loading that can be achieved. The results from crucible-scale testing supported subsequent DuraMelter 100 (DM100) tests designed to examine the effects of enhanced glass and feed formulations on waste processing rate and product quality. The DM100 was selected as the platform for these tests due to its extensive previous use in processing rate determination for various HLW streams and glass compositions.
Valiant load-balanced robust routing under hose model for WDM mesh networks
NASA Astrophysics Data System (ADS)
Zhang, Xiaoning; Li, Lemin; Wang, Sheng
2006-09-01
In this paper, we propose a Valiant Load-Balanced robust routing scheme for WDM mesh networks under the model of polyhedral uncertainty (i.e., the hose model), and the proposed routing scheme is implemented with a traffic grooming approach. Our objective is to maximize throughput under the hose model. A mathematical formulation of Valiant Load-Balanced robust routing is presented and three fast heuristic algorithms are also proposed. For implementing the Valiant Load-Balanced robust routing scheme in WDM mesh networks, a novel traffic-grooming algorithm called MHF (minimizing hop first) is proposed. We compare the three heuristic algorithms with the VPN tree under the hose model. Finally, simulation results demonstrate that MHF with the Valiant Load-Balanced robust routing scheme outperforms the traditional traffic-grooming algorithm in terms of throughput for uniform and non-uniform traffic matrices under the hose model.
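A minimal sketch of the two-phase Valiant load-balancing idea underlying the scheme (not the paper's MHF grooming algorithm): every demand is split evenly over all nodes as intermediates, so the carried load depends only on the hose (ingress/egress) bounds rather than on the exact traffic matrix.

```python
from collections import defaultdict
from itertools import permutations

def vlb_link_loads(nodes, demand, shortest_path):
    """demand[(s, d)] = traffic from s to d; shortest_path(a, b) -> node list."""
    n = len(nodes)
    load = defaultdict(float)
    for (s, d), t in demand.items():
        for m in nodes:                      # phase 1 + phase 2 via intermediate m
            for leg in (shortest_path(s, m), shortest_path(m, d)):
                for u, v in zip(leg, leg[1:]):
                    load[(u, v)] += t / n    # 1/N of the demand on each leg
    return dict(load)

# Toy 4-node ring routed clockwise (illustrative topology only).
ring = [0, 1, 2, 3]
def sp(a, b):
    path = [a]
    while path[-1] != b:
        path.append((path[-1] + 1) % 4)
    return path

demand = {(s, d): 1.0 for s, d in permutations(ring, 2)}
print(vlb_link_loads(ring, demand, sp))
```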
Ghose, Sanchayita; Nagrath, Deepak; Hubbard, Brian; Brooks, Clayton; Cramer, Steven M
2004-01-01
The effect of an alternate strategy employing two different flowrates during loading was explored as a means of increasing system productivity in Protein-A chromatography. The effect of such a loading strategy was evaluated using a chromatographic model that was able to accurately predict experimental breakthrough curves for this Protein-A system. A gradient-based optimization routine was carried out to establish the optimal loading conditions (initial and final flowrates and switching time). The two-step loading strategy (using a higher flowrate during the initial stages followed by a lower flowrate) was evaluated for an Fc-fusion protein and was found to result in significant improvements in process throughput. In an extension of this optimization routine, dynamic loading capacity and productivity were simultaneously optimized using a weighted objective function, and this result was compared to that obtained with a single flowrate. Again, the dual-flowrate strategy was found to be superior.
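A toy sketch of the dual-flow-rate loading optimization described above, assuming a simple logistic breakthrough model in place of the paper's validated chromatographic model; all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

CAP = 30.0       # hypothetical column capacity, mg
C_FEED = 1.0     # hypothetical feed concentration, mg/mL

def breakthrough_fraction(loaded_volume, flow):
    # Assumed form: breakthrough broadens and shifts earlier at higher flow.
    v50 = CAP / C_FEED * (1.0 - 0.02 * flow)
    return 1.0 / (1.0 + np.exp(-(loaded_volume - v50) / (2.0 + 0.3 * flow)))

def negative_productivity(x):
    f1, f2, t_switch = x                 # mL/min, mL/min, min
    t_total = 60.0                       # fixed loading window, min
    ts = np.linspace(0.0, t_total, 600)
    flows = np.where(ts < t_switch, f1, f2)
    vols = np.cumsum(flows) * (ts[1] - ts[0])
    captured = C_FEED * flows * (1.0 - breakthrough_fraction(vols, flows))
    bound = np.sum(captured) * (ts[1] - ts[0])   # mg bound during loading
    return -bound / t_total                      # maximize mg bound per minute

x0 = np.array([4.0, 1.0, 20.0])                  # fast load, slow load, switch time
res = minimize(negative_productivity, x0, method="Nelder-Mead",
               bounds=[(0.5, 6.0), (0.5, 6.0), (1.0, 59.0)])
print("optimal [flow1 mL/min, flow2 mL/min, switch min]:", np.round(res.x, 2))
```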
NASA Astrophysics Data System (ADS)
Wang, Fu; Liu, Bo; Zhang, Lijia; Jin, Feifei; Zhang, Qi; Tian, Qinghua; Tian, Feng; Rao, Lan; Xin, Xiangjun
2017-03-01
The wavelength-division multiplexing passive optical network (WDM-PON) is a potential technology to carry multiple services in an optical access network. However, it has the disadvantages of high cost and an immature technique for users. A software-defined WDM/time-division multiplexing PON was proposed to meet the requirements of high bandwidth, high performance, and multiple services. A reasonable and effective uplink dynamic bandwidth allocation algorithm was proposed. A controller with dynamic wavelength and slot assignment was introduced, and a different optical dynamic bandwidth management strategy was formulated flexibly for services of different priorities according to the network loading. The simulation compares the proposed algorithm with the interleaved polling with adaptive cycle time algorithm. The algorithm shows better performance in average delay, throughput, and bandwidth utilization. The results show that the delay is reduced to 62% and the throughput is improved by 35%.
A transmission imaging spectrograph and microfabricated channel system for DNA analysis.
Simpson, J W; Ruiz-Martinez, M C; Mulhern, G T; Berka, J; Latimer, D R; Ball, J A; Rothberg, J M; Went, G T
2000-01-01
In this paper we present the development of a DNA analysis system using a microfabricated channel device and a novel transmission imaging spectrograph which can be efficiently incorporated into a high throughput genomics facility for both sizing and sequencing of DNA fragments. The device contains 48 channels etched on a glass substrate. The channels are sealed with a flat glass plate which also provides a series of apertures for sample loading and contact with buffer reservoirs. Samples can be easily loaded in volumes up to 640 nL without band broadening because of an efficient electrokinetic stacking at the electrophoresis channel entrance. The system uses a dual laser excitation source and a highly sensitive charge-coupled device (CCD) detector allowing for simultaneous detection of many fluorescent dyes. The sieving matrices for the separation of single-stranded DNA fragments are polymerized in situ in denaturing buffer systems. Examples of separation of single-stranded DNA fragments up to 500 bases in length are shown, including accurate sizing of GeneCalling fragments, and sequencing samples prepared with a reduced amount of dye terminators. An increase in sample throughput has been achieved by color multiplexing.
Sensitive high-throughput screening for the detection of reducing sugars.
Mellitzer, Andrea; Glieder, Anton; Weis, Roland; Reisinger, Christoph; Flicker, Karlheinz
2012-01-01
The exploitation of renewable resources for the production of biofuels relies on efficient processes for the enzymatic hydrolysis of lignocellulosic materials. The development of enzymes and strains for these processes requires reliable and fast activity-based screening assays. Additionally, these assays are also required to operate on the microscale and on the high-throughput level. Herein, we report the development of a highly sensitive reducing-sugar assay in a 96-well microplate screening format. The assay is based on the formation of osazones from reducing sugars and para-hydroxybenzoic acid hydrazide. By using this sensitive assay, the enzyme loads and conversion times during lignocellulose hydrolysis can be reduced, thus allowing higher throughput. The assay is about five times more sensitive than the widely applied dinitrosalicylic acid based assay and can reliably detect reducing sugars down to 10 μM. The assay-specific variation over one microplate was determined for three different lignocellulolytic enzymes and ranges from 2 to 8%. Furthermore, the assay was combined with a microscale cultivation procedure for the activity-based screening of Pichia pastoris strains expressing functional Thermomyces lanuginosus xylanase A, Trichoderma reesei β-mannanase, or T. reesei cellobiohydrolase 2. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High-throughput flow alignment of barcoded hydrogel microparticles
Chapin, Stephen C.; Pregibon, Daniel C.
2010-01-01
Suspension (particle-based) arrays offer several advantages over conventional planar arrays in the detection and quantification of biomolecules, including the use of smaller sample volumes, more favorable probe-target binding kinetics, and rapid probe-set modification. We present a microfluidic system for the rapid alignment of multifunctional hydrogel microparticles designed to bear one or several biomolecule probe regions, as well as a graphical code to identify the embedded probes. Using high-speed imaging, we have developed and optimized a flow-through system that (1) allows for a high particle throughput, (2) ensures proper particle alignment for decoding and target quantification, and (3) can be reliably operated continuously without clogging. A tapered channel flanked by side focusing streams is used to orient the flexible, tablet-shaped particles into a well-ordered flow in the center of the channel. The effects of channel geometry, particle geometry, particle composition, particle loading density, and barcode design are explored to determine the best combination for eventual use in biological assays. Particles in the optimized system move at velocities of ~50 cm s⁻¹ and with throughputs of ~40 particles s⁻¹. Simple physical models and CFD simulations have been used to investigate flow behavior in the device. PMID:19823726
Choi, Gihoon; Hassett, Daniel J; Choi, Seokheun
2015-06-21
There is a large global effort to improve microbial fuel cell (MFC) techniques and advance their translational potential toward practical, real-world applications. Significant boosts in MFC performance can be achieved with the development of new techniques in synthetic biology that can regulate microbial metabolic pathways or control their gene expression. For these new directions, a high-throughput and rapid screening tool for microbial biopower production is needed. In this work, a 48-well, paper-based sensing platform was developed for the high-throughput and rapid characterization of the electricity-producing capability of microbes. 48 spatially distinct wells of a sensor array were prepared by patterning 48 hydrophilic reservoirs on paper with hydrophobic wax boundaries. This paper-based platform exploited the ability of paper to quickly wick fluid and promoted bacterial attachment to the anode pads, resulting in instant current generation upon loading of the bacterial inoculum. We validated the utility of our MFC array by studying how strategic genetic modifications impacted the electrochemical activity of various Pseudomonas aeruginosa mutant strains. Within just 20 minutes, we successfully determined the electricity generation capacity of eight isogenic mutants of P. aeruginosa. These efforts demonstrate that our MFC array displays highly comparable performance characteristics and identifies genes in P. aeruginosa that can trigger a higher power density.
Microengineering methods for cell-based microarrays and high-throughput drug-screening applications.
Xu, Feng; Wu, JinHui; Wang, ShuQi; Durmus, Naside Gozde; Gurkan, Umut Atakan; Demirci, Utkan
2011-09-01
Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.
ac electroosmotic pumping induced by noncontact external electrodes.
Wang, Shau-Chun; Chen, Hsiao-Ping; Chang, Hsueh-Chia
2007-09-21
Electroosmotic (EO) pumps based on dc electroosmosis are plagued by bubble generation and other electrochemical reactions at the electrodes at voltages beyond 1 V for electrolytes. These disadvantages limit their throughput and offset their portability advantage over mechanical syringe or pneumatic pumps. ac electroosmotic pumps operating at high frequency (>100 kHz) circumvent the bubble problem by inducing polarization and slip velocity on embedded electrodes [1], but they require complex electrode designs to produce a net flow. We report a new high-throughput ac EO pump design based on induced polarization over the entire channel surface instead of just the electrodes. Like dc EO pumps, our pump electrodes are outside of the load section and form a cm-long pump unit consisting of three circular reservoirs (3 mm in diameter) connected by a 1 x 1 mm channel. The field-induced polarization can produce an effective zeta potential exceeding 1 V and an ac slip velocity estimated at 1 mm/s or higher, both one order of magnitude higher than in earlier dc and ac pumps, giving rise to a maximum throughput of 1 μL/s. Polarization over the entire channel surface, quadratic scaling with respect to the field, and high voltage at high frequency without electrode bubble generation are the reasons why the current pump is superior to earlier dc and ac EO pumps.
NASA Astrophysics Data System (ADS)
McKisson, R. L.; Grantham, L. F.; Guon, J.; Recht, H. L.
1983-02-01
Results of an estimate of the waste management costs of the commercial high level waste from a 3000 metric ton per year reprocessing plant show that the judicious use of the ceramic waste form can save about $2 billion during a 20 year operating campaign relative to the use of the glass waste form. This assumes PWR fuel is processed and the waste is encapsulated in 0.305-m-diam canisters with ultimate emplacement in a BWIP-type horizontal-borehole repository. Waste loading and waste form density are the driving factors, in that the low waste loading (25%) and relatively low density (3.1 g/cm3) characteristic of the glass form require several times as many canisters to handle a given waste throughput as are needed for the ceramic waste form, whose waste loading capability exceeds 60% and whose waste density is nominally 5.2 g/cm3.
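A rough worked example of why waste loading and waste form density dominate the canister count. Only the loadings (25% vs. >60%) and densities (3.1 vs. 5.2 g/cm3) come from the abstract; the canister fill volume and the assumed HLW-oxide fraction per tonne of fuel are illustrative placeholders.

```python
# Illustrative arithmetic only: fill volume and oxide fraction are assumed.
FUEL_TONNES_PER_YEAR = 3000
HLW_OXIDE_KG_PER_TONNE = 30          # assumed ~3% of heavy metal as waste oxides
WASTE_KG_PER_YEAR = FUEL_TONNES_PER_YEAR * HLW_OXIDE_KG_PER_TONNE

def canisters_per_year(waste_loading, form_density_g_cm3, fill_volume_L=170.0):
    # 1 L of waste form at rho g/cm3 weighs rho kg, of which `waste_loading`
    # is actual waste oxide.
    waste_kg_per_canister = fill_volume_L * form_density_g_cm3 * waste_loading
    return WASTE_KG_PER_YEAR / waste_kg_per_canister

glass = canisters_per_year(0.25, 3.1)    # glass waste form
ceramic = canisters_per_year(0.60, 5.2)  # ceramic waste form
print(f"glass: ~{glass:.0f} canisters/yr, ceramic: ~{ceramic:.0f} canisters/yr, "
      f"ratio: {glass / ceramic:.1f}x")
```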
Microelectroporation device for genomic screening
Perroud, Thomas D.; Renzi, Ronald F.; Negrete, Oscar; Claudnic, Mark R.
2014-09-09
We have developed a microelectroporation device that combines microarrays of oligonucleotides, microfluidic channels, and electroporation for cell transfection and high-throughput screening applications (e.g. RNA interference screens). Microarrays allow the deposition of thousands of different oligonucleotides in microscopic spots. Microfluidic channels and microwells enable efficient loading of cells into the device and prevent cross-contamination between different oligonucleotide spots. Electroporation allows optimal transfection of nucleic acids into cells (especially hard-to-transfect cells such as primary cells) by minimizing cell death while maximizing transfection efficiency. This invention has the advantages of higher throughput and lower cost compared to conventional screening technologies, while preventing cross-contamination. Moreover, this device does not require bulky robotic liquid handling equipment and is inherently safer given that it is a closed system.
High-throughput determination of biochemical oxygen demand (BOD) by a microplate-based biosensor.
Pang, Hei-Leung; Kwok, Nga-Yan; Chan, Pak-Ho; Yeung, Chi-Hung; Lo, Waihung; Wong, Kwok-Yin
2007-06-01
The use of the conventional 5-day biochemical oxygen demand (BOD5) method in BOD determination is greatly hampered by its time-consuming sampling procedure and its technical difficulty in the handling of a large pool of wastewater samples. Thus, it is highly desirable to develop a fast and high-throughput biosensor for BOD measurements. This paper describes the construction of a microplate-based biosensor consisting of an organically modified silica (ORMOSIL) oxygen sensing film for high-throughput determination of BOD in wastewater. The ORMOSIL oxygen sensing film was prepared by reacting tetramethoxysilane with dimethyldimethoxysilane in the presence of the oxygen-sensitive dye tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) chloride. The silica composite formed a homogeneous, crack-free oxygen sensing film on polystyrene microtiter plates with high stability, and the embedded ruthenium dye interacted with the dissolved oxygen in wastewater according to the Stern-Volmer relation. The bacterium Stenotrophomonas maltophilia was loaded into the ORMOSIL/PVA composite (deposited on top of the oxygen sensing film) and used to metabolize the organic compounds in wastewater. This BOD biosensor was found to be able to determine the BOD values of wastewater samples within 20 min by monitoring the dissolved oxygen concentrations. Moreover, the BOD values determined by the BOD biosensor were in good agreement with those obtained by the conventional BOD5 method.
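For reference, the Stern-Volmer relation invoked above has the standard textbook form (written here generically, not with parameters specific to this sensor):

```latex
\frac{I_0}{I} = \frac{\tau_0}{\tau} = 1 + K_{SV}\,[\mathrm{O_2}]
```

where I0 and I (τ0 and τ) are the luminescence intensities (lifetimes) of the ruthenium dye in the absence and presence of oxygen, and KSV is the Stern-Volmer quenching constant; once KSV is calibrated, the measured intensity can be converted to a dissolved oxygen concentration.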
Measuring Sister Chromatid Cohesion Protein Genome Occupancy in Drosophila melanogaster by ChIP-seq.
Dorsett, Dale; Misulovin, Ziva
2017-01-01
This chapter presents methods to conduct and analyze genome-wide chromatin immunoprecipitation of the cohesin complex and the Nipped-B cohesin loading factor in Drosophila cells using high-throughput DNA sequencing (ChIP-seq). Procedures for isolation of chromatin, immunoprecipitation, and construction of sequencing libraries for the Ion Torrent Proton high throughput sequencer are detailed, and computational methods to calculate occupancy as input-normalized fold-enrichment are described. The results obtained by ChIP-seq are compared to those obtained by ChIP-chip (genomic ChIP using tiling microarrays), and the effects of sequencing depth on the accuracy are analyzed. ChIP-seq provides similar sensitivity and reproducibility as ChIP-chip, and identifies the same broad regions of occupancy. The locations of enrichment peaks, however, can differ between ChIP-chip and ChIP-seq, and low sequencing depth can splinter broad regions of occupancy into distinct peaks.
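A minimal sketch of input-normalized fold enrichment computed per genomic bin, as referenced above; this is a generic illustration, not the chapter's exact pipeline or parameters.

```python
import numpy as np

def fold_enrichment(ip_counts, input_counts, pseudocount=1.0):
    """ip_counts, input_counts: read counts per genomic bin (same binning)."""
    ip = np.asarray(ip_counts, float) + pseudocount
    inp = np.asarray(input_counts, float) + pseudocount
    ip_rate = ip / ip.sum()        # depth-normalize each library
    inp_rate = inp / inp.sum()
    return ip_rate / inp_rate      # >1 means enriched over input

ip = [12, 150, 8, 90, 11]          # e.g. cohesin ChIP reads in five bins
inp = [10, 14, 9, 12, 10]          # matched input (chromatin) reads
print(np.round(fold_enrichment(ip, inp), 2))
```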
Shinde, Aniketa; Guevarra, Dan; Liu, Guiji; ...
2016-08-23
An efficient photoanode is a prerequisite for a viable solar fuels technology. The challenges to realizing an efficient photoanode include the integration of a semiconductor light absorber and a metal oxide electrocatalyst to optimize corrosion protection, light trapping, hole transport, and photocarrier recombination sites. In order to efficiently explore metal oxide coatings, we employ a high throughput methodology wherein a uniform BiVO4 film is coated with 858 unique metal oxide coatings covering a range of metal oxide loadings and the full (Ni-Fe-Co-Ce)Ox pseudo-quaternary composition space. Photoelectrochemical characterization of the photoanodes reveals that specific combinations of metal oxide composition and loading provide up to a 13-fold increase in the maximum photoelectrochemical power generation for oxygen evolution in pH 13 electrolyte. Through mining of the high throughput data we identify composition regions that form improved interfaces with BiVO4. Of particular note, integrated photoanodes with catalyst compositions in the range Fe0.4-0.6Ce0.6-0.4Ox exhibit high interface quality and excellent photoelectrochemical power conversion. Furthermore, for scaled-up inkjet-printed electrodes and photoanodic electrodeposition of this composition on BiVO4, we can confirm the discovery and the synthesis-independent interface improvement of (Fe-Ce)Ox coatings on BiVO4.
NASA Technical Reports Server (NTRS)
Glaab, Patricia C.
2012-01-01
The first phase of this study investigated the amount of time a flight can be delayed or expedited within the Terminal Airspace using only speed changes. The Arrival Capacity Calculator analysis tool was used to predict the time adjustment envelope for standard descent arrivals and then for CDA arrivals. Results ranged from 0.77 to 5.38 minutes. STAR routes were configured for the ACES simulation, and a validation of the ACC results was conducted by comparing the maximum predicted time adjustments to those seen in ACES. The final phase investigated full runway-to-runway trajectories using ACES. The radial distance used by the arrival scheduler was incrementally increased from 50 to 150 nautical miles (nmi). The increased Planning Horizon radii allowed the arrival scheduler to arrange, path stretch, and speed-adjust flights to more fully load the arrival stream. The average throughput for the high-volume portion of the day increased from 30 aircraft per runway for the 50 nmi radius to 40 aircraft per runway for the 150 nmi radius for a traffic set representative of high-volume 2018 operations. The recommended radius for the arrival scheduler's Planning Horizon was found to be 130 nmi, which allowed more than 95% loading of the arrival stream.
High-Throughput Nanoindentation for Statistical and Spatial Property Determination
NASA Astrophysics Data System (ADS)
Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.
2018-04-01
Standard nanoindentation tests are "high throughput" compared to nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be significantly improved. These higher testing rates enable otherwise impractical studies requiring several thousands of indents, such as high-resolution property mapping and detailed statistical studies. However, care must be taken to avoid systematic errors in the measurement, including the choice of indentation depth/spacing to avoid overlap of plastic zones, pileup, and influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain rate sensitivity must also be considered. A review of these effects is given, with the emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique, including mapping of welds, microstructures, and composites with varying length scales, along with studying the effect of surface roughness on nominally homogeneous specimens, will be presented.
Townsend, Jared B.; Shaheen, Farzana; Liu, Ruiwu; Lam, Kit S.
2011-01-01
A method to efficiently immobilize and partition large quantities of microbeads in an array format in microfabricated polydimethylsiloxane (PDMS) cassette for high-throughput in situ releasable solution-phase cell-based screening of one-bead-one-compound (OBOC) combinatorial libraries is described. Commercially available Jeffamine triamine T-403 (∼440 Da) was derivatized such that two of its amino groups were protected by Fmoc and the remaining amino group capped with succinic anhydride to generate a carboxyl group. This resulting tri-functional hydrophilic polymer was then sequentially coupled two times to the outer layer of topologically segregated bilayer TentaGel (TG) beads with solid phase peptide synthesis chemistry, resulting in beads with increased loading capacity, hydrophilicity and porosity at the outer layer. We have found that such bead configuration can facilitate ultra high-throughput in situ releasable solution-phase screening of OBOC libraries. An encoded releasable OBOC small molecule library was constructed on Jeffamine derivatized TG beads with library compounds tethered to the outer layer via a disulfide linker and coding tags in the interior of the beads. Compound-beads could be efficiently loaded (5-10 minutes) into a 5 cm diameter Petri dish containing a 10,000-well PDMS microbead cassette, such that over 90% of the microwells were each filled with only one compound-bead. Jurkat T-lymphoid cancer cells suspended in Matrigel® were then layered over the microbead cassette to immobilize the compound-beads. After 24 hours of incubation at 37°C, dithiothreitol was added to trigger the release of library compounds. Forty-eight hours later, MTT reporter assay was used to identify regions of reduced cell viability surrounding each positive bead. From a total of about 20,000 beads screened, 3 positive beads were detected and physically isolated for decoding. A strong consensus motif was identified for these three positive compounds. These compounds were re-synthesized and found to be cytotoxic (IC50 50-150 μM) against two T-lymphoma cell lines and less so against the MDA-MB 231 breast cancer cell line. This novel ultra high-throughput OBOC releasable method can potentially be adapted to many existing 96- or 384-well solution-phase cell-based or biochemical assays. PMID:20593859
Fixed target matrix for femtosecond time-resolved and in situ serial micro-crystallography
Mueller, C.; Marx, A.; Epp, S. W.; Zhong, Y.; Kuo, A.; Balo, A. R.; Soman, J.; Schotte, F.; Lemke, H. T.; Owen, R. L.; Pai, E. F.; Pearson, A. R.; Olson, J. S.; Anfinrud, P. A.; Ernst, O. P.; Dwayne Miller, R. J.
2015-01-01
We present a crystallography chip enabling in situ room temperature crystallography at microfocus synchrotron beamlines and X-ray free-electron laser (X-FEL) sources. Compared to other in situ approaches, we observe extremely low background and high diffraction data quality. The chip design is robust and allows fast and efficient loading of thousands of small crystals. The ability to load a large number of protein crystals, at room temperature and with high efficiency, into prescribed positions enables high throughput automated serial crystallography with microfocus synchrotron beamlines. In addition, we demonstrate the application of this chip for femtosecond time-resolved serial crystallography at the Linac Coherent Light Source (LCLS, Menlo Park, California, USA). The chip concept enables multiple images to be acquired from each crystal, allowing differential detection of changes in diffraction intensities in order to obtain high signal-to-noise and fully exploit the time resolution capabilities of XFELs. PMID:26798825
Fixed target matrix for femtosecond time-resolved and in situ serial micro-crystallography.
Mueller, C; Marx, A; Epp, S W; Zhong, Y; Kuo, A; Balo, A R; Soman, J; Schotte, F; Lemke, H T; Owen, R L; Pai, E F; Pearson, A R; Olson, J S; Anfinrud, P A; Ernst, O P; Dwayne Miller, R J
2015-09-01
We present a crystallography chip enabling in situ room temperature crystallography at microfocus synchrotron beamlines and X-ray free-electron laser (X-FEL) sources. Compared to other in situ approaches, we observe extremely low background and high diffraction data quality. The chip design is robust and allows fast and efficient loading of thousands of small crystals. The ability to load a large number of protein crystals, at room temperature and with high efficiency, into prescribed positions enables high throughput automated serial crystallography with microfocus synchrotron beamlines. In addition, we demonstrate the application of this chip for femtosecond time-resolved serial crystallography at the Linac Coherent Light Source (LCLS, Menlo Park, California, USA). The chip concept enables multiple images to be acquired from each crystal, allowing differential detection of changes in diffraction intensities in order to obtain high signal-to-noise and fully exploit the time resolution capabilities of XFELs.
ac electroosmotic pumping induced by noncontact external electrodes
Wang, Shau-Chun; Chen, Hsiao-Ping; Chang, Hsueh-Chia
2007-01-01
Electroosmotic (EO) pumps based on dc electroosmosis are plagued by bubble generation and other electrochemical reactions at the electrodes at voltages beyond 1 V for electrolytes. These disadvantages limit their throughput and offset their portability advantage over mechanical syringe or pneumatic pumps. ac electroosmotic pumps at high frequency (>100 kHz) circumvent the bubble problem by inducing polarization and slip velocity on embedded electrodes, but they require complex electrode designs to produce a net flow. We report a new high-throughput ac EO pump design based on induced polarization on the entire channel surface instead of just on the electrodes. Like dc EO pumps, our pump electrodes are outside of the load section and form a cm-long pump unit consisting of three circular reservoirs (3 mm in diameter) connected by a 1×1 mm channel. The field-induced polarization can produce an effective zeta potential exceeding 1 V and an ac slip velocity estimated as 1 mm/sec or higher, both one order of magnitude higher than earlier dc and ac pumps, giving rise to a maximum throughput of 1 μl/sec. Polarization over the entire channel surface, quadratic scaling with respect to the field, and high voltage at high frequency without electrode bubble generation are the reasons why the current pump is superior to earlier dc and ac EO pumps. PMID:19693362
Young, Susan M; Curry, Mark S; Ransom, John T; Ballesteros, Juan A; Prossnitz, Eric R; Sklar, Larry A; Edwards, Bruce S
2004-03-01
HyperCyt is an automated sample-handling system for flow cytometry that uses air bubbles to separate samples sequentially introduced from multiwell plates by an autosampler. In a previously documented HyperCyt configuration, air-bubble-separated compounds in one sample line and a continuous stream of cells in another are mixed in-line for serial flow cytometric cell response analysis. To expand capabilities for high-throughput bioactive compound screening, the authors investigated using this system configuration in combination with automated cell sorting. Peptide ligands were sampled from a 96-well plate, mixed in-line with fluo-4-loaded, formyl peptide receptor-transfected U937 cells, and screened at a rate of 3 peptide reactions per minute with approximately 10,000 cells analyzed per reaction. Cell Ca(2+) responses were detected to as little as 10(-11) M peptide with no detectable carryover between samples at up to 10(-7) M peptide. After expansion in culture, cells sort-purified from the 10% highest responders exhibited enhanced sensitivity and more sustained responses to peptide. Thus, a highly responsive cell subset was isolated under high-throughput mixing and sorting conditions in which response detection capability spanned a 1000-fold range of peptide concentration. With single-cell readout systems for protein expression libraries, this technology offers the promise of screening millions of discrete compound interactions per day.
A High-Throughput Automated Microfluidic Platform for Calcium Imaging of Taste Sensing.
Hsiao, Yi-Hsing; Hsu, Chia-Hsien; Chen, Chihchen
2016-07-08
The human enteroendocrine L cell line NCI-H716, expressing taste receptors and taste signaling elements, constitutes a unique model for the studies of cellular responses to glucose, appetite regulation, gastrointestinal motility, and insulin secretion. Targeting these gut taste receptors may provide novel treatments for diabetes and obesity. However, NCI-H716 cells are cultured in suspension and tend to form multicellular aggregates, preventing high-throughput calcium imaging due to interferences caused by laborious immobilization and stimulus delivery procedures. Here, we have developed an automated microfluidic platform that is capable of trapping more than 500 single cells into microwells with a loading efficiency of 77% within two minutes, delivering multiple chemical stimuli and performing calcium imaging with enhanced spatial and temporal resolutions when compared to bath perfusion systems. Results revealed the presence of heterogeneity in cellular responses to the type, concentration, and order of applied sweet and bitter stimuli. Sucralose and denatonium benzoate elicited robust increases in the intracellular Ca(2+) concentration. However, glucose evoked a rapid elevation of intracellular Ca(2+) followed by reduced responses to subsequent glucose stimulation. Using Gymnema sylvestre as a blocking agent for the sweet taste receptor confirmed that different taste receptors were utilized for sweet and bitter tastes. This automated microfluidic platform is cost-effective, easy to fabricate and operate, and may be generally applicable for high-throughput and high-content single-cell analysis and drug screening.
A high throughput mechanical screening device for cartilage tissue engineering.
Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L
2014-06-27
Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.
Herington, Jennifer L.; Swale, Daniel R.; Brown, Naoko; Shelton, Elaine L.; Choi, Hyehun; Williams, Charles H.; Hong, Charles C.; Paria, Bibhash C.; Denton, Jerod S.; Reese, Jeff
2015-01-01
The uterine myometrium (UT-myo) is a therapeutic target for preterm labor, labor induction, and postpartum hemorrhage. Stimulation of intracellular Ca2+-release in UT-myo cells by oxytocin is a final pathway controlling myometrial contractions. The goal of this study was to develop a dual-addition assay for high-throughput screening of small-molecule compounds, which could regulate Ca2+-mobilization in UT-myo cells, and hence, myometrial contractions. Primary murine UT-myo cells in 384-well plates were loaded with a Ca2+-sensitive fluorescent probe, and then screened for inducers of Ca2+-mobilization and inhibitors of oxytocin-induced Ca2+-mobilization. The assay exhibited robust screening statistics (Z′ = 0.73), DMSO-tolerance, and was validated for high-throughput screening against 2,727 small molecules from the Spectrum, NIH Clinical I and II collections of well-annotated compounds. The screen revealed a hit-rate of 1.80% for agonist and 1.39% for antagonist compounds. Concentration-dependent responses of hit-compounds demonstrated an EC50 less than 10 μM for 21 hit-antagonist compounds, compared to only 7 hit-agonist compounds. Subsequent studies focused on hit-antagonist compounds. Based on the percent inhibition and functional annotation analyses, we selected 4 confirmed hit-antagonist compounds (benzbromarone, dipyridamole, fenoterol hydrobromide and nisoldipine) for further analysis. In an ex vivo isometric contractility assay, each compound significantly inhibited uterine contractility, with different potencies (IC50). Overall, these results demonstrate for the first time that high-throughput small-molecule screening of myometrial Ca2+-mobilization is an ideal primary approach for discovering modulators of uterine contractility. PMID:26600013
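For reference, the Z′ value quoted above is the standard plate-assay quality statistic Z′ = 1 − 3(σ_pos + σ_neg)/|μ_pos − μ_neg|. The short sketch below shows the calculation; the control readouts are hypothetical numbers, not data from this screen.

```python
# Z'-factor for a plate-based screening assay:
# 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|.
# The control readouts below are hypothetical and only illustrate the arithmetic.
import statistics

def z_prime(pos_controls, neg_controls):
    mu_p, mu_n = statistics.mean(pos_controls), statistics.mean(neg_controls)
    sd_p, sd_n = statistics.stdev(pos_controls), statistics.stdev(neg_controls)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)

pos = [980, 1010, 995, 1005, 990]   # e.g. stimulated control wells (hypothetical)
neg = [105, 98, 110, 102, 95]       # e.g. vehicle-only wells (hypothetical)
print(round(z_prime(pos, neg), 2))  # values above ~0.5 indicate a robust assay
```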
Weak partitioning chromatography for anion exchange purification of monoclonal antibodies.
Kelley, Brian D; Tobler, Scott A; Brown, Paul; Coffman, Jonathan L; Godavarti, Ranga; Iskra, Timothy; Switzer, Mary; Vunnum, Suresh
2008-10-15
Weak partitioning chromatography (WPC) is an isocratic chromatographic protein separation method performed under mobile phase conditions where a significant amount of the product protein binds to the resin, well in excess of typical flowthrough operations. The more stringent load and wash conditions lead to improved removal of more tightly binding impurities, although at the cost of a reduction in step yield. The step yield can be restored by extending the column load and incorporating a short wash at the end of the load stage. The use of WPC with anion exchange resins enables a two-column cGMP purification platform to be used for many different mAbs. The operating window for WPC can be easily established using high throughput batch-binding screens. Under conditions that favor very strong product binding, competitive effects from product binding can give rise to a reduction in column loading capacity. Robust performance of WPC anion exchange chromatography has been demonstrated in multiple cGMP mAb purification processes. Excellent clearance of host cell proteins, leached Protein A, DNA, high molecular weight species, and model virus has been achieved. (c) 2008 Wiley Periodicals, Inc.
Yang, Fan; Liu, Ruiwu; Kramer, Randall; Xiao, Wenwu; Jordan, Richard; Lam, Kit S
2012-12-01
Oral squamous cell carcinoma has a low five-year survival rate, which may be due to late detection and a lack of effective tumor-specific therapies. Using a high throughput drug discovery strategy termed one-bead one-compound combinatorial library, the authors identified six compounds with high binding affinity to different human oral squamous cell carcinoma cell lines but not to normal cells. Current work is under way to develop these ligands to oral squamous cell carcinoma specific imaging probes or therapeutic agents.
Optimal Base Station Density of Dense Network: From the Viewpoint of Interference and Load.
Feng, Jianyuan; Feng, Zhiyong
2017-09-11
Network densification is attracting increasing attention recently due to its ability to improve network capacity by spatial reuse and relieve congestion by offloading. However, excessive densification and aggressive offloading can also cause the degradation of network performance due to problems of interference and load. In this paper, with consideration of load issues, we study the optimal base station density that maximizes the throughput of the network. The expected link rate and the utilization ratio of the contention-based channel are derived as the functions of base station density using the Poisson Point Process (PPP) and Markov Chain. They reveal the rules of deployment. Based on these results, we obtain the throughput of the network and indicate the optimal deployment density under different network conditions. Extensive simulations are conducted to validate our analysis and show the substantial performance gain obtained by the proposed deployment scheme. These results can provide guidance for the network densification.
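Purely as an illustration of the density-optimization idea, and not the paper's PPP/Markov-chain derivation, the toy sweep below maximizes a made-up area-throughput model in which the per-link rate shrinks as interference grows with density. The functional form and all constants are assumptions.

```python
# Toy sweep over base-station density.  The throughput model below (per-link
# rate degraded by an interference term that grows with density, area
# throughput = density x rate) is a made-up illustration of the optimization
# idea, not the paper's PPP/Markov-chain analysis.
import math

def toy_area_throughput(density, snr=10.0, interference_coeff=1e-3):
    per_link_rate = math.log2(1.0 + snr / (1.0 + interference_coeff * density ** 2))
    return density * per_link_rate

densities = range(1, 301)
best = max(densities, key=toy_area_throughput)
print("toy optimal density:", best, "base stations per unit area")
```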
Analysis of high-throughput biological data using their rank values.
Dembélé, Doulaye
2018-01-01
High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature become more specialized and often require high computational resources. Here, we propose a new versatile method based on the data-ordering rank values. We use linear algebra and the Perron-Frobenius theorem, and extend a method presented earlier for searching for differentially expressed genes to the detection of recurrent copy number aberrations. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is to our knowledge the only one that applies to both gene expression profiling and cytogenetics data sets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the data set. Stability scores are also introduced as quality parameters. The performance and comparative analyses were carried out using real data sets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros.
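One reading of the rank-based one-sample t-test described above can be sketched in a few lines: convert each sample to fractional ranks and test whether a gene's mean rank departs from the mid-rank expected under no change. This is a rough illustration of the idea on toy data, not the fcros implementation.

```python
# Rough illustration of a rank-based one-sample test: each sample (column) is
# converted to fractional ranks, and for each gene (row) we test whether its
# mean fractional rank differs from 0.5.  This mimics the spirit of the
# rank-value approach; it is not the fcros package.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
expr = rng.normal(size=(1000, 6))   # 1000 genes x 6 samples (toy data)
expr[:20] += 2.0                    # spike in 20 "changed" genes

n_genes = expr.shape[0]
frac_ranks = stats.rankdata(expr, axis=0) / n_genes          # rank values in (0, 1]
t_stat, p_val = stats.ttest_1samp(frac_ranks, popmean=0.5, axis=1)

top = np.argsort(p_val)[:10]
print("top candidate genes:", top)
```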
Zhang, Xueyu; Zheng, Shaokui; Zhang, Hangyu; Duan, Shoupeng
2018-04-30
This study clarified the dominant nitrogen (N)-transformation pathway and the key ammonia-oxidizing microbial species at three loading levels during optimization of the anoxic/oxic (A/O) process for sewage treatment. Comprehensive N-transformation activity analysis showed that ammonia oxidization was performed predominantly by aerobic chemolithotrophic and heterotrophic ammonia oxidization, whereas N2 production was performed primarily by anoxic denitrification in the anoxic unit. The abundances of ammonia-oxidizing bacteria (AOB), nitrite-oxidizing bacteria, and anaerobic AOB in activated sludge reflected their activities on the basis of high-throughput sequencing data. AOB amoA gene clone libraries revealed that the predominant AOB species in sludge samples shifted from Nitrosomonas europaea (61% at the normal loading level) to Nitrosomonas oligotropha (58% and 81% at the two higher loading levels). Following isolation and sequencing, the predominant culturable heterotrophic AOB in sludge shifted from Agrobacterium tumefaciens (42% at the normal loading level) to Acinetobacter johnsonii (52% at the highest loading level). Copyright © 2018 Elsevier Ltd. All rights reserved.
An Investigation of DC-DC Converter Power Density Using Si and SiC MOSFETS
2010-05-07
submarine or small surface combatant, volumetric constraints quickly become extremely prohibitive. Dedicating generators for high power loads takes... thermal compounds were applied to the MOSFET-heat sink interface. For the Si APT26F120B2, MG Chemicals TC-450ML thermal epoxy was used to connect the... submarines, bus converter modules must be made optimally power dense in order to decrease volumetric requirements of the modules for a rated throughput
Brady, Mariea A; Vaze, Reva; Amin, Harsh D; Overby, Darryl R; Ethier, C Ross
2014-02-01
To recapitulate the in vivo environment and create neo-organoids that replace lost or damaged tissue requires the engineering of devices, which provide appropriate biophysical cues. To date, bioreactors for cartilage tissue engineering have focused primarily on biomechanical stimulation. There is a significant need for improved devices for articular cartilage tissue engineering capable of simultaneously applying multiple biophysical (electrokinetic and mechanical) stimuli. We have developed a novel high-throughput magneto-mechanostimulation bioreactor, capable of applying static and time-varying magnetic fields, as well as multiple and independently adjustable mechanical loading regimens. The device consists of an array of 18 individual stations, each of which uses contactless magnetic actuation and has an integrated Hall Effect sensing system, enabling the real-time measurements of applied field, force, and construct thickness, and hence, the indirect measurement of construct mechanical properties. Validation tests showed precise measurements of thickness, within 14 μm of gold standard calliper measurements; further, applied force was measured to be within 0.04 N of desired force over a half hour dynamic loading, which was repeatable over a 3-week test period. Finally, construct material properties measured using the bioreactor were not significantly different (p=0.97) from those measured using a standard materials testing machine. We present a new method for articular cartilage-specific bioreactor design, integrating combinatorial magneto-mechanostimulation, which is very attractive from functional and cost viewpoints.
NASA Astrophysics Data System (ADS)
Zhuang, Huidong; Zhang, Xiaodong
2013-08-01
In large tokamaks, disruption of high current plasma would damage plasma facing component surfaces (PFCs) or other inner components due to high heat load, electromagnetic force load and runaway electrons. It would also influence the subsequent plasma discharge due to production of impurities during disruptions. The avoidance and mitigation of disruptions are therefore essential for the next generation of tokamaks, such as ITER. Massive gas injection (MGI) is a promising method of disruption mitigation. A new fast valve has been developed successfully on EAST. The valve can be opened in 0.5 ms, and the duration of the open state is largely dependent on the gas pressure and capacitor voltage. The throughput of the valve can be adjusted from 0 mbar·L to 700 mbar·L by changing the capacitor voltage and gas pressure. The response time and throughput of the fast valve can meet the requirement of disruption mitigation on EAST. In the last round campaign of EAST and HT-7 in 2010, the fast valve operated successfully. He and Ar were used for the disruption mitigation on HT-7. By injecting the proper amount of gas, the current quench rate could be slowed down, and the impurity radiation was greatly enhanced. In elongated plasmas of EAST discharges, the experimental data are opposite to what was expected.
Improving Data Transfer Throughput with Direct Search Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaprakash, Prasanna; Morozov, Vitali; Kettimuthu, Rajkumar
2016-01-01
Improving data transfer throughput over high-speed long-distance networks has become increasingly difficult. Numerous factors such as nondeterministic congestion, dynamics of the transfer protocol, and multiuser and multitask source and destination endpoints, as well as interactions among these factors, contribute to this difficulty. A promising approach to improving throughput consists of using parallel streams at the application layer. We formulate and solve the problem of choosing the number of such streams from a mathematical optimization perspective. We propose the use of direct search methods, a class of easy-to-implement and lightweight mathematical optimization algorithms, to improve the performance of data transfers by dynamically adapting the number of parallel streams in a manner that does not require domain expertise, instrumentation, analytical models, or historic data. We apply our method to transfers performed with the GridFTP protocol, and illustrate the effectiveness of the proposed algorithm when used within Globus, a state-of-the-art data transfer tool, on production WAN links and servers. We show that when compared to user default settings our direct search methods can achieve up to 10x performance improvement under certain conditions. We also show that our method can overcome performance degradation due to external compute and network load on source endpoints, a common scenario at high performance computing facilities.
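As a sketch of the direct-search idea, not the authors' specific algorithm or the Globus/GridFTP interface, the snippet below runs a simple compass-style search over the integer number of parallel streams, treating measured throughput as a black box. The measure_throughput function is a hypothetical stand-in for timing a real transfer.

```python
# Minimal compass-style direct search over the number of parallel streams.
# measure_throughput is a hypothetical placeholder for timing a real transfer;
# the search logic only compares measured values and needs no model,
# instrumentation, or historical data.
import random

def measure_throughput(streams):
    # Placeholder: a noisy concave curve peaking near 12 streams.
    return -((streams - 12) ** 2) + 150 + random.uniform(-3, 3)

def direct_search(start=4, step=4, lo=1, hi=64, max_evals=20):
    best_s, best_t = start, measure_throughput(start)
    evals = 1
    while step >= 1 and evals < max_evals:
        improved = False
        for cand in (best_s - step, best_s + step):
            if lo <= cand <= hi and evals < max_evals:
                t = measure_throughput(cand)
                evals += 1
                if t > best_t:
                    best_s, best_t = cand, t
                    improved = True
        if not improved:
            step //= 2          # contract the step when no neighbor improves
    return best_s, best_t

print(direct_search())
```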
High-throughput microscopy must re-invent the microscope rather than speed up its functions
Oheim, M
2007-01-01
Knowledge gained from the revolutions in genomics and proteomics has helped to identify many of the key molecules involved in cellular signalling. Researchers, both in academia and in the pharmaceutical industry, now screen, at a sub-cellular level, where and when these proteins interact. Fluorescence imaging and molecular labelling combine to provide a powerful tool for real-time functional biochemistry with molecular resolution. However, they traditionally have been work-intensive, required trained personnel, and suffered from low throughput due to sample preparation, loading and handling. The need for speeding up microscopy is apparent from the tremendous complexity of cellular signalling pathways, the inherent biological variability, as well as the possibility that the same molecule plays different roles in different sub-cellular compartments. Research institutes and companies have teamed up to develop imaging cytometers of ever-increasing complexity. However, to truly go high-speed, sub-cellular imaging must free itself from the rigid framework of current microscopes. PMID:17603553
Inkjet formation of unilamellar lipid vesicles for cell-like encapsulation
Stachowiak, Jeanne C.; Richmond, David L.; Li, Thomas H.; Brochard-Wyart, Françoise
2010-01-01
Encapsulation of macromolecules within lipid vesicles has the potential to drive biological discovery and enable development of novel, cell-like therapeutics and sensors. However, rapid and reliable production of large numbers of unilamellar vesicles loaded with unrestricted and precisely-controlled contents requires new technologies that overcome size, uniformity, and throughput limitations of existing approaches. Here we present a high-throughput microfluidic method for vesicle formation and encapsulation using an inkjet printer at rates up to 200 Hz. We show how multiple high-frequency pulses of the inkjet’s piezoelectric actuator create a microfluidic jet that deforms a bilayer lipid membrane, controlling formation of individual vesicles. Variations in pulse number, pulse voltage, and solution viscosity are used to control the vesicle size. As a first step toward cell-like reconstitution using this method, we encapsulate the cytoskeletal protein actin and use co-encapsulated microspheres to track its polymerization into a densely entangled cytoskeletal network upon vesicle formation. PMID:19568667
Novel organosilicone materials and patterning techniques for nanoimprint lithography
NASA Astrophysics Data System (ADS)
Pina, Carlos Alberto
Nanoimprint Lithography (NIL) is a high-throughput patterning technique that allows the fabrication of nanostructures with great precision. It has been listed on the International Technology Roadmap for Semiconductors (ITRS) as a candidate technology for future-generation Si chip manufacturing. In nanoimprint lithography, a resist material, e.g. a thermoplastic polymer, is placed in contact with a mold and then mechanically deformed under an applied load to transfer the nano-features on the mold surface into the resist. The success of NIL relies heavily on the capability of fabricating nanostructures on different types of materials. Thus, a key factor for NIL implementation in industrial settings is the development of advanced materials suitable as the nanoimprint resist. This dissertation focuses on the engineering of new polymer materials suitable as NIL resists. A variety of silicone-based polymer precursors were synthesized and formulated for NIL applications. High-throughput and high-yield nanopatterning was successfully achieved. Furthermore, additional capabilities of the developed materials were explored for a range of NIL applications such as their use as flexible, UV-transparent stamps and silicon-compatible etching layers. Finally, new strategies were investigated to expand the potential of NIL. High-throughput, non-residual-layer imprinting was achieved with the newly developed resist materials. In addition, several strategies were designed for the precise control of nanoscale patterned structure sizes with multifunctional resist systems by post-imprinting modification of the pattern size. These developments provide NIL with a new set of tools for a variety of additional important applications.
NASA Astrophysics Data System (ADS)
Sartipi, Sina; Jansma, Harrie; Bosma, Duco; Boshuizen, Bart; Makkee, Michiel; Gascon, Jorge; Kapteijn, Freek
2013-12-01
Design and operation of a "six-flow fixed-bed microreactor" setup for Fischer-Tropsch synthesis (FTS) are described. The unit consists of feed and mixing, flow division, reaction, separation, and analysis sections. The reactor system is made of five heating blocks with individual temperature controllers, assuring an identical isothermal zone of at least 10 cm along six fixed-bed microreactor inserts (4 mm inner diameter). Such a lab-scale setup allows running six experiments in parallel, under equal feed composition, reaction temperature, and conditions of separation and analysis equipment. It permits separate collection of wax and liquid samples (from each flow line), allowing operation with high productivities of C5+ hydrocarbons. The latter is crucial for a complete understanding of FTS product compositions and represents an advantage over high-throughput setups with more than ten flows where such instrumental considerations lead to elevated equipment volume, cost, and operation complexity. The identical performance (of the six flows) under similar reaction conditions was assured by testing the same catalyst batch, loaded in all microreactors.
Qiu, Guanglei; Zhang, Sui; Srinivasa Raghavan, Divya Shankari; Das, Subhabrata; Ting, Yen-Peng
2016-11-01
This work uncovers an important feature of the forward osmosis membrane bioreactor (FOMBR) process: the decoupling of contaminant retention time (CRT) and hydraulic retention time (HRT). Based on this concept, the capability of the hybrid microfiltration-forward osmosis membrane bioreactor (MF-FOMBR) in achieving high-throughput treatment of municipal wastewater with enhanced phosphorus recovery was explored. High removal of TOC and NH4(+)-N (90% and 99%, respectively) was achieved with HRTs down to 47 min, with the treatment capacity increased by an order of magnitude. Reduced HRT did not affect phosphorus removal and recovery. As a result, the phosphorus recovery capacity was also increased by the same order. Reduced HRT resulted in increased system loading rates and thus elevated concentrations of mixed liquor suspended solids and increased membrane fouling. 454-pyrosequencing suggested the thriving of Bacteroidetes and Proteobacteria (especially Sphingobacteriales, Flavobacteriales and Thiothrix members), as well as the community succession and dynamics of ammonium-oxidizing and nitrite-oxidizing bacteria. Copyright © 2016 Elsevier Ltd. All rights reserved.
An Analysis of Green Propulsion Applied to NASA Missions
NASA Technical Reports Server (NTRS)
Cardiff, Eric H.; Mulkey, Henry W.; Baca, Caitlin E.
2014-01-01
The advantages of green propulsion for five mission classes are examined, including a Low Earth Orbit (LEO) mission (GPM), a Geosynchronous Earth Orbit (GEO) mission (SDO), a High Earth Orbit (HEO) mission (MMS), a lunar mission (LRO), and a planetary mission (MAVEN). The propellant mass benefits are considered for all five missions, as well as the effects on the tanks, propellant loading, thruster throughput, thermal considerations, and range requirements for both the AF-M315E and LMP-103S propellants.
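One way to see where a propellant mass benefit comes from is the Tsiolkovsky rocket equation: for a fixed delta-v and dry mass, a propellant delivered at higher specific impulse requires less mass. The sketch below compares two hypothetical Isp values; the numbers are placeholders for illustration, not figures from this mission analysis (which also accounts for tank, loading, thermal, and range effects).

```python
# Propellant mass for a fixed delta-v and dry mass via the Tsiolkovsky rocket
# equation: m_prop = m_dry * (exp(dv / (Isp * g0)) - 1).  The Isp values below
# are rough, hypothetical placeholders for a hydrazine baseline and a green
# monopropellant; they are not taken from the mission analysis above.
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_mass(delta_v, dry_mass, isp):
    return dry_mass * (math.exp(delta_v / (isp * G0)) - 1.0)

for name, isp in [("hydrazine-class", 220.0), ("green-monoprop-class", 250.0)]:
    m = propellant_mass(delta_v=300.0, dry_mass=500.0, isp=isp)
    print(f"{name}: {m:.1f} kg propellant for 300 m/s on a 500 kg dry mass")
```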
1985-08-01
Class high-speed containerships and their subsequent conversion to a cargo configuration specifically designed for rapid load-offload of military unit... Rough Terrain Forklift SLWT Side-Loadable Warping Tug ST Short Ton STON Short Ton SUROB Surf Observations T-ACS Auxiliary Crane Ship T-AKR Auxiliary Cargo... their delivery systems for container, breakbulk, and bulk liquid cargo, and to define the operating performance of the combined systems in a joint test
Durable silver thin film coating for diffraction gratings
Wolfe, Jesse D [Discovery Bay, CA; Britten, Jerald A [Oakley, CA; Komashko, Aleksey M [San Diego, CA
2006-05-30
A durable silver thin-film-coated non-planar optical element has been developed to replace gold as a material for fabricating such devices. Such a coating and the resultant optical element have increased efficiency, are resistant to tarnishing, can be easily stripped and re-deposited without modifying the underlying grating structure, improve the throughput and power loading of short pulse compressor designs for ultra-fast laser systems, and can be utilized in a variety of optical and spectrophotometric systems, particularly high-end spectrometers that require maximized efficiency.
Fixed target matrix for femtosecond time-resolved and in situ serial micro-crystallography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, C.; Marx, A.; Epp, S. W.
We present a crystallography chip enabling in situ room temperature crystallography at microfocus synchrotron beamlines and X-ray free-electron laser (X-FEL) sources. Compared to other in situ approaches, we observe extremely low background and high diffraction data quality. The chip design is robust and allows fast and efficient loading of thousands of small crystals. The ability to load a large number of protein crystals, at room temperature and with high efficiency, into prescribed positions enables high throughput automated serial crystallography with microfocus synchrotron beamlines. In addition, we demonstrate the application of this chip for femtosecond time-resolved serial crystallography at the Linac Coherent Light Source (LCLS, Menlo Park, California, USA). As a result, the chip concept enables multiple images to be acquired from each crystal, allowing differential detection of changes in diffraction intensities in order to obtain high signal-to-noise and fully exploit the time resolution capabilities of XFELs.
Fixed target matrix for femtosecond time-resolved and in situ serial micro-crystallography
Mueller, C.; Marx, A.; Epp, S. W.; ...
2015-08-18
We present a crystallography chip enabling in situ room temperature crystallography at microfocus synchrotron beamlines and X-ray free-electron laser (X-FEL) sources. Compared to other in situ approaches, we observe extremely low background and high diffraction data quality. The chip design is robust and allows fast and efficient loading of thousands of small crystals. The ability to load a large number of protein crystals, at room temperature and with high efficiency, into prescribed positions enables high throughput automated serial crystallography with microfocus synchrotron beamlines. In addition, we demonstrate the application of this chip for femtosecond time-resolved serial crystallography at the Linac Coherent Light Source (LCLS, Menlo Park, California, USA). As a result, the chip concept enables multiple images to be acquired from each crystal, allowing differential detection of changes in diffraction intensities in order to obtain high signal-to-noise and fully exploit the time resolution capabilities of XFELs.
NASA Astrophysics Data System (ADS)
Ito, Hiroaki; Murakami, Ryo; Sakuma, Shinya; Tsai, Chia-Hung Dylan; Gutsmann, Thomas; Brandenburg, Klaus; Pöschl, Johannes M. B.; Arai, Fumihito; Kaneko, Makoto; Tanaka, Motomu
2017-02-01
Large deformability of erythrocytes in microvasculature is a prerequisite to realize smooth circulation. We develop a novel tool for the three-step “Catch-Load-Launch” manipulation of a human erythrocyte based on an ultra-high speed position control by a microfluidic “robotic pump”. Quantification of the erythrocyte shape recovery as a function of loading time uncovered the critical time window for the transition between fast and slow recoveries. The comparison with erythrocytes under depletion of adenosine triphosphate revealed that the cytoskeletal remodeling over a whole cell occurs in 3 orders of magnitude longer timescale than the local dissociation-reassociation of a single spectrin node. Finally, we modeled septic conditions by incubating erythrocytes with endotoxin, and found that the exposure to endotoxin results in a significant delay in the characteristic transition time for cytoskeletal remodeling. The high speed manipulation of erythrocytes with a robotic pump technique allows for high throughput mechanical diagnosis of blood-related diseases.
Cornelissen, Marion; Gall, Astrid; Vink, Monique; Zorgdrager, Fokla; Binter, Špela; Edwards, Stephanie; Jurriaans, Suzanne; Bakker, Margreet; Ong, Swee Hoe; Gras, Luuk; van Sighem, Ard; Bezemer, Daniela; de Wolf, Frank; Reiss, Peter; Kellam, Paul; Berkhout, Ben; Fraser, Christophe; van der Kuyl, Antoinette C
2017-07-15
The BEEHIVE (Bridging the Evolution and Epidemiology of HIV in Europe) project aims to analyse nearly-complete viral genomes from >3000 HIV-1 infected Europeans using high-throughput deep sequencing techniques to investigate the virus genetic contribution to virulence. Following the development of a computational pipeline, including a new de novo assembler for RNA virus genomes, to generate larger contiguous sequences (contigs) from the abundance of short sequence reads that characterise the data, another area that determines genome sequencing success is the quality and quantity of the input RNA. A pilot experiment with 125 patient plasma samples was performed to investigate the optimal method for isolation of HIV-1 viral RNA for long amplicon genome sequencing. Manual isolation with the QIAamp Viral RNA Mini Kit (Qiagen) was superior over robotically extracted RNA using either the QIAcube robotic system, the mSample Preparation Systems RNA kit with automated extraction by the m2000sp system (Abbott Molecular), or the MagNA Pure 96 System in combination with the MagNA Pure 96 Instrument (Roche Diagnostics). We scored amplification of a set of four HIV-1 amplicons of ∼1.9, 3.6, 3.0 and 3.5 kb, and subsequent recovery of near-complete viral genomes. Subsequently, 616 BEEHIVE patient samples were analysed to determine factors that influence successful amplification of the genome in four overlapping amplicons using the QIAamp Viral RNA Kit for viral RNA isolation. Both low plasma viral load and high sample age (stored before 1999) negatively influenced the amplification of viral amplicons >3 kb. A plasma viral load of >100,000 copies/ml resulted in successful amplification of all four amplicons for 86% of the samples, this value dropped to only 46% for samples with viral loads of <20,000 copies/ml. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Diot, Alan; Hinks-Roberts, Alex; Lodge, Tiffany; Liao, Chunyan; Dombi, Eszter; Morten, Karl; Brady, Stefen; Fratter, Carl; Carver, Janet; Muir, Rebecca; Davis, Ryan; Green, Charlotte J; Johnston, Iain; Hilton-Jones, David; Sue, Carolyn; Mortiboys, Heather; Poulton, Joanna
2015-10-01
Mitophagy is a cellular mechanism for the recycling of mitochondrial fragments. This process is able to improve mitochondrial DNA (mtDNA) quality in heteroplasmic mtDNA disease, in which mutant mtDNA co-exists with normal mtDNA. In disorders where the load of mutant mtDNA determines disease severity it is likely to be an important determinant of disease progression. Measuring mitophagy is technically demanding. We used pharmacological modulators of autophagy to validate two techniques for quantifying mitophagy. First we used the IN Cell 1000 analyzer to quantify mitochondrial co-localisation with LC3-II positive autophagosomes. Unlike conventional fluorescence and electron microscopy, this high-throughput system is sufficiently sensitive to detect transient low frequency autophagosomes. Secondly, because mitophagy preferentially removes pathogenic heteroplasmic mtDNA mutants, we developed a heteroplasmy assay based on loss of m.3243A>G mtDNA, during culture conditions requiring oxidative metabolism ("energetic stress"). The effects of the pharmacological modulators on these two measures were consistent, confirming that the high throughput imaging output (autophagosomes co-localising with mitochondria) reflects mitochondrial quality control. To further validate these methods, we performed a more detailed study using metformin, the most commonly prescribed antidiabetic drug that is still sometimes used in Maternally Inherited Diabetes and Deafness (MIDD). This confirmed our initial findings and revealed that metformin inhibits mitophagy at clinically relevant concentrations, suggesting that it may have novel therapeutic uses. Copyright © 2015. Published by Elsevier Ltd.
Wallner, Jakob; Lhota, Gabriele; Schosserer, Markus; Vorauer-Uhl, Karola
2017-06-01
Non-fluidic bio-layer interferometry (BLI) has rapidly become a standard tool for monitoring almost all biomolecular interactions in a label-free, real-time and high-throughput manner. High-efficiency screening methods which measure the kinetics of liposomes with a variety of compounds require the immobilization of liposomes. In this work, a method is described for immobilizing liposomes for interaction studies, based on the biophysical principles of this biosensor platform. The immobilization approach includes the loading of DSPE-PEG(2000)-biotin-containing sterically stabilized micelles (SSMs) which are restructured in a buffer change step, resulting in an accessible substrate for liposome immobilization. Liposomes of varying composition and fluidity, at a concentration of 5 mM, were immobilized on the sensor surface by inserting the hydrophobic residues of the previously loaded SSMs. This proof of principle was carried out using Cytochrome C as a membrane-interacting model protein. The binding of Cytochrome C to the immobilized liposomes was demonstrated, and the derived kinetic and affinity constants were similar to values given in the literature. In order to obtain a detailed understanding of this surface, and to show the integrity of the liposomes, confocal fluorescence microscopy was used. Images of immobilized liposomes containing calcein in the aqueous core indicated intact vesicles. A combination of this simple liposome immobilization approach, the possibility of automation on BLI systems with high throughput within an acceptable timescale and excellent reproducibility makes this assay suitable for basic research as well as for industrial and regulatory applications. Copyright © 2017 Elsevier B.V. All rights reserved.
Leão, Erico; Montez, Carlos; Moraes, Ricardo; Portugal, Paulo; Vasques, Francisco
2017-01-01
The use of Wireless Sensor Network (WSN) technologies is an attractive option to support wide-scale monitoring applications, such as the ones that can be found in precision agriculture, environmental monitoring and industrial automation. The IEEE 802.15.4/ZigBee cluster-tree topology is a suitable topology to build wide-scale WSNs. Despite some of its known advantages, including timing synchronisation and duty-cycle operation, cluster-tree networks may suffer from severe network congestion problems due to the convergecast pattern of its communication traffic. Therefore, the careful adjustment of transmission opportunities (superframe durations) allocated to the cluster-heads is an important research issue. This paper proposes a set of proportional Superframe Duration Allocation (SDA) schemes, based on well-defined protocol and timing models, and on the message load imposed by child nodes (Load-SDA scheme), or by number of descendant nodes (Nodes-SDA scheme) of each cluster-head. The underlying reasoning is to adequately allocate transmission opportunities (superframe durations) and parametrize buffer sizes, in order to improve the network throughput and avoid typical problems, such as: network congestion, high end-to-end communication delays and discarded messages due to buffer overflows. Simulation assessments show how proposed allocation schemes may clearly improve the operation of wide-scale cluster-tree networks. PMID:28134822
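A minimal sketch of the proportional allocation idea follows: each cluster-head receives a share of a fixed time budget proportional to the message load of its child nodes (Load-SDA) or to its number of descendants (Nodes-SDA). The flat time budget and the minimum slot are simplifying assumptions; the actual schemes are tied to the IEEE 802.15.4 superframe structure and buffer parametrization.

```python
# Minimal sketch of proportional superframe-duration allocation: each
# cluster-head gets a share of a fixed time budget proportional to a weight
# (child message load for Load-SDA, descendant count for Nodes-SDA).  The flat
# time budget is an illustrative simplification of the 802.15.4 superframe
# structure.
def proportional_sda(weights, total_budget_ms=1000.0, min_slot_ms=10.0):
    total_weight = sum(weights.values())
    allocation = {}
    for head, w in weights.items():
        share = total_budget_ms * w / total_weight
        allocation[head] = max(share, min_slot_ms)   # never starve a cluster-head
    return allocation

# Load-SDA: weights are messages per beacon interval generated by child nodes.
load_weights = {"CH1": 40, "CH2": 15, "CH3": 5}
# Nodes-SDA: weights are the number of descendant nodes of each cluster-head.
node_weights = {"CH1": 12, "CH2": 6, "CH3": 2}

print(proportional_sda(load_weights))
print(proportional_sda(node_weights))
```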
Lehotay, Steven J; Han, Lijun; Sapozhnikova, Yelena
2016-01-01
This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography-tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. Cleanup efficiencies and breakthrough volumes using different mini-SPE sorbents were compared using avocado, salmon, pork loin, and kale as representative matrices. Optimum extract load volume was 300 µL for the 45 mg mini-cartridges containing 20/12/12/1 (w/w/w/w) anh. MgSO4/PSA (primary-secondary amine)/C18/CarbonX sorbents used in the final method. In method validation to demonstrate high-throughput capabilities and performance results, 230 spiked extracts of 10 different foods (apple, kiwi, carrot, kale, orange, black olive, wheat grain, dried basil, pork, and salmon) underwent automated mini-SPE cleanup and analysis over the course of 5 days. In all, 325 analyses for 54 pesticides and 43 environmental contaminants (3 analyzed together) were conducted using the 10 min LPGC-MS/MS method without changing the liner or retuning the instrument. An injection equivalent to merely 1 mg of sample achieved <5 ng g-1 limits of quantification. With the use of internal standards, method validation results showed that 91 of the 94 analytes including pairs achieved satisfactory results (70-120% recovery and RSD ≤ 25%) in the 10 tested food matrices (n = 160). Matrix effects were typically less than ±20%, mainly due to the use of analyte protectants, and minimal human review of software data processing was needed due to summation function integration of analyte peaks. This study demonstrated that the automated mini-SPE + LPGC-MS/MS method yielded accurate results in rugged, high-throughput operations with minimal labor and data review.
Yeo, David C; Wiraja, Christian; Zhou, Yingying; Tay, Hui Min; Xu, Chenjie; Hou, Han Wei
2015-09-23
Engineering cells with active-ingredient-loaded micro/nanoparticles is becoming increasingly popular for imaging and therapeutic applications. A critical yet inadequately addressed issue during its implementation concerns the significant number of particles that remain unbound following the engineering process, which inadvertently generate signals and impart transformative effects onto neighboring nontarget cells. Here we demonstrate that those unbound micro/nanoparticles remaining in solution can be efficiently separated from the particle-labeled cells by implementing a fast, continuous, and high-throughput Dean flow fractionation (DFF) microfluidic device. As proof-of-concept, we applied the DFF microfluidic device for buffer exchange to sort labeled suspension cells (THP-1) from unbound fluorescent dye and dye-loaded micro/nanoparticles. Compared to conventional centrifugation, the depletion efficiency of free dyes or particles was improved 20-fold and the mislabeling of nontarget bystander cells by free particles was minimized. The microfluidic device was adapted to further accommodate heterogeneous-sized mesenchymal stem cells (MSCs). Complete removal of unbound nanoparticles using DFF led to the usage of engineered MSCs without exerting off-target transformative effects on the functional properties of neighboring endothelial cells. Apart from its effectiveness in removing free particles, this strategy is also efficient and scalable. It could continuously process cell solutions with concentrations up to 10(7) cells·mL(-1) (cell densities commonly encountered during cell therapy) without observable loss of performance. Successful implementation of this technology is expected to pave the way for interference-free clinical application of micro/nanoparticle engineered cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruger, Albert A.
2013-07-01
The current estimates and glass formulation efforts have been conservative in terms of achievable waste loadings. These formulations have been specified to ensure that the glasses are homogeneous, contain essentially no crystalline phases, are processable in joule-heated, ceramic-lined melters and meet Hanford Tank Waste Treatment and Immobilization Plant (WTP) Contract terms. The WTP's overall mission will require the immobilization of tank waste compositions that are dominated by mixtures of aluminum (Al), chromium (Cr), bismuth (Bi), iron (Fe), phosphorus (P), zirconium (Zr), and sulphur (S) compounds as waste-limiting components. Glass compositions for these waste mixtures have been developed based upon previous experience and current glass property models. Recently, DOE has initiated a testing program to develop and characterize HLW glasses with higher waste loadings and higher throughput efficiencies. Results of this work have demonstrated the feasibility of increases in waste loading from about 25 wt% to 33-50 wt% (based on oxide loading) in the glass depending on the waste stream. In view of the importance of aluminum limited waste streams at Hanford (and also Savannah River), the ability to achieve high waste loadings without adversely impacting melt rates has the potential for enormous cost savings from reductions in canister count and the potential for schedule acceleration. Consequently, the potential return on the investment made in the development of these enhancements is extremely favorable. Glass compositions developed for one of the latest Hanford HLW projected compositions with sulphate concentrations high enough to limit waste loading have been successfully tested and show previously unreported tolerance for sulphate. Though a significant increase in waste loading for high-iron wastes has been achieved, the magnitude of the increase is not as substantial as those achieved for high-aluminum, high-chromium, high-bismuth or sulphur. Waste processing rate increases for high-iron streams as a combined effect of higher waste loadings and higher melt rates resulting from new formulations have been achieved. (author)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruger, Albert A.
2013-01-16
The current estimates and glass formulation efforts have been conservative in terms of achievable waste loadings. These formulations have been specified to ensure that the glasses are homogeneous, contain essentially no crystalline phases, are processable in joule-heated, ceramic-lined melters and meet Hanford Tank Waste Treatment and Immobilization Plant (WTP) Contract terms. The WTP's overall mission will require the immobilization of tank waste compositions that are dominated by mixtures of aluminum (Al), chromium (Cr), bismuth (Bi), iron (Fe), phosphorus (P), zirconium (Zr), and sulphur (S) compounds as waste-limiting components. Glass compositions for these waste mixtures have been developed based upon previous experience and current glass property models. Recently, DOE has initiated a testing program to develop and characterize HLW glasses with higher waste loadings and higher throughput efficiencies. Results of this work have demonstrated the feasibility of increases in waste loading from about 25 wt% to 33-50 wt% (based on oxide loading) in the glass depending on the waste stream. In view of the importance of aluminum limited waste streams at Hanford (and also Savannah River), the ability to achieve high waste loadings without adversely impacting melt rates has the potential for enormous cost savings from reductions in canister count and the potential for schedule acceleration. Consequently, the potential return on the investment made in the development of these enhancements is extremely favorable. Glass compositions developed for one of the latest Hanford HLW projected compositions with sulphate concentrations high enough to limit waste loading have been successfully tested and show previously unreported tolerance for sulphate. Though a significant increase in waste loading for high-iron wastes has been achieved, the magnitude of the increase is not as substantial as those achieved for high-aluminum, high-chromium, high-bismuth or sulphur. Waste processing rate increases for high-iron streams as a combined effect of higher waste loadings and higher melt rates resulting from new formulations have been achieved.
Evaluation of Advanced Polymers for Additive Manufacturing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rios, Orlando; Carter, William G.; Kutchko, Cindy
The goal of this Manufacturing Demonstration Facility (MDF) technical collaboration project between Oak Ridge National Laboratory (ORNL) and PPG Industries, Inc. (PPG) was to evaluate the feasibility of using conventional coatings chemistry and technology to build up material layer-by-layer. The PPG-ORNL study successfully demonstrated that polymeric coatings formulations may overcome many limitations of common thermoplastics used in additive manufacturing (AM), allow lightweight nozzle design for material deposition, and increase build rate. The materials effort focused on layer-by-layer deposition of coatings with each layer fusing together. The combination of materials and deposition results in an additively manufactured build that has sufficient mechanical properties to bear the load of additional layers, yet is capable of bonding across the z-layers to improve build direction strength. The formulation properties were tuned to enable a novel, high-throughput deposition method that is highly scalable, compatible with high loading of reinforcing fillers, and inherently low-cost.
Fast and Adaptive Lossless Onboard Hyperspectral Data Compression System
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh I.; Keymeulen, Didier; Kimesh, Matthew A.
2012-01-01
Modern hyperspectral imaging systems are able to acquire far more data than can be downlinked from a spacecraft. Onboard data compression helps to alleviate this problem, but requires a system capable of power efficiency and high throughput. Software solutions have limited throughput performance and are power-hungry. Dedicated hardware solutions can provide both high throughput and power efficiency, while taking the load off the main processor. Thus, a hardware compression system was developed. The implementation uses a field-programmable gate array (FPGA). The implementation is based on the fast lossless (FL) compression algorithm reported in Fast Lossless Compression of Multispectral-Image Data (NPO-42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which achieves excellent compression performance and has low complexity. This algorithm performs predictive compression using an adaptive filtering method, and uses adaptive Golomb coding. The implementation also packetizes the coded data. The FL algorithm is well suited for implementation in hardware. In the FPGA implementation, one sample is compressed every clock cycle, which makes for a fast and practical real-time solution for space applications. Benefits of this implementation are: 1) The underlying algorithm achieves a combination of low complexity and compression effectiveness that exceeds that of techniques currently in use. 2) The algorithm requires no training data or other specific information about the nature of the spectral bands for a fixed instrument dynamic range. 3) Hardware acceleration provides a throughput improvement of 10 to 100 times vs. the software implementation. A prototype of the compressor is available in software, but it runs at a speed that does not meet spacecraft requirements. The hardware implementation targets the Xilinx Virtex IV FPGAs, and makes the use of this compressor practical for Earth satellites as well as beyond-Earth missions with hyperspectral instruments.
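For readers unfamiliar with the ingredients named above, the sketch below illustrates predictive compression with Golomb-Rice coding of mapped residuals in Python. It is a simplified stand-in, not the FL algorithm or its FPGA realization; the previous-sample predictor and the fixed Rice parameter k are assumptions made for brevity.

```python
# Illustrative sketch of predictive compression with Golomb-Rice coding of mapped
# residuals; a simplified stand-in, not the FL algorithm or its FPGA realization.
# The previous-sample predictor and the fixed Rice parameter k are assumptions.

def golomb_rice_encode(value, k):
    """Rice-code a non-negative integer: unary quotient, '0' separator, k-bit remainder (k >= 1)."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "0{}b".format(k))

def map_residual(d):
    """Zig-zag map a signed residual to a non-negative integer."""
    return 2 * d if d >= 0 else -2 * d - 1

def compress_band(samples, k=4):
    """Predict each sample from its predecessor and Rice-code the mapped residuals."""
    bits, prev = [], 0
    for s in samples:
        bits.append(golomb_rice_encode(map_residual(s - prev), k))
        prev = s
    return "".join(bits)

print(compress_band([100, 102, 101, 105, 110, 108]))
```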
Bass, A L; Hinch, S G; Teffer, A K; Patterson, D A; Miller, K M
2017-04-01
Microparasites play an important role in the demography, ecology and evolution of Pacific salmonids. As salmon stocks continue to decline and the impacts of global climate change on fish populations become apparent, a greater understanding of microparasites in wild salmon populations is warranted. We used high-throughput, quantitative PCR (HT-qRT-PCR) to rapidly screen 82 adult Chinook salmon from five geographically or genetically distinct groups (mostly returning to tributaries of the Fraser River) for 45 microparasite taxa. We detected 20 microparasite species, four of which have not previously been documented in Chinook salmon, and four of which have not been previously detected in any salmonids in the Fraser River. Comparisons of microparasite load to blood plasma variables revealed some positive associations between Flavobacterium psychrophilum, Cryptobia salmositica and Ceratonova shasta and physiological indices suggestive of morbidity. We include a comparison of our findings for each microparasite taxon with previous knowledge of its distribution in British Columbia. © 2017 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Music, Denis; Geyer, Richard W.; Hans, Marcus
2016-07-01
To increase the thermoelectric efficiency and reduce the thermal fatigue upon cyclic heat loading, alloying of amorphous NbO2 with all 3d and 5d transition metals has systematically been investigated using density functional theory. It was found that Ta fulfills the key design criteria, namely, enhancement of the Seebeck coefficient and positive Cauchy pressure (ductility gauge). These quantum mechanical predictions were validated by assessing the thermoelectric and elastic properties on combinatorial thin films, which is a high-throughput approach. The maximum power factor is 2813 μW m-1 K-2 for the Ta/Nb ratio of 0.25, which is a hundredfold increment compared to pure NbO2 and exceeds many oxide thermoelectrics. Based on the elasticity measurements, the consistency between theory and experiment for the Cauchy pressure was attained within 2%. On the basis of the electronic structure analysis, these configurations can be perceived as metallic, which is consistent with low electrical resistivity and ductile behavior. Furthermore, a pronounced quantum confinement effect occurs, which is identified as the physical origin for the Seebeck coefficient enhancement.
NASA Astrophysics Data System (ADS)
Zhu, Feng; Akagi, Jin; Hall, Chris J.; Crosier, Kathryn E.; Crosier, Philip S.; Delaage, Pierre; Wlodkowic, Donald
2013-12-01
Drug discovery screenings performed on zebrafish embryos mirror, with a high level of accuracy, the tests usually performed on mammalian animal models, and the fish embryo toxicity assay (FET) is one of the most promising alternative approaches to acute ecotoxicity testing with adult fish. Notwithstanding this, conventional methods utilising 96-well microtiter plates and manual dispensing of fish embryos are very time-consuming. They rely on laborious and iterative manual pipetting that is a main source of analytical errors and low throughput. In this work, we present the development of a miniaturised and high-throughput Lab-on-a-Chip (LOC) platform for automation of FET assays. The 3D high-density LOC array was fabricated in poly(methyl methacrylate) (PMMA) transparent thermoplastic using infrared laser micromachining, while the off-chip interfaces were fabricated using additive manufacturing processes (FDM and SLA). The system's design facilitates rapid loading and immobilization of a large number of embryos in predefined clusters of traps during continuous microperfusion of drugs/toxins. It has been conceptually designed to seamlessly interface with both upright and inverted fluorescent imaging systems and also to directly interface with conventional microtiter plate readers that accept 96-well plates. We also present proof-of-concept interfacing with a high-speed imaging cytometer (Plate RUNNER HD®) capable of multispectral image acquisition with resolution of up to 8192 x 8192 pixels and depth of field of about 40 μm. Furthermore, we developed a miniaturized and self-contained analytical device interfaced with a miniaturized USB microscope. This system modification is capable of performing rapid imaging of multiple embryos at a low resolution for drug toxicity analysis.
Li, Bowei; Jiang, Lei; Xie, Hua; Gao, Yan; Qin, Jianhua; Lin, Bingcheng
2009-09-01
A micropump-actuated negative pressure pinched injection method is developed for parallel electrophoresis on a multi-channel LIF detection system. The system has a home-made device that can individually control 16-port solenoid valves and a high-voltage power supply. The excitation laser beam is distributed to the array of separation channels for detection. The hybrid glass-PDMS microfluidic chip comprises two common reservoirs, four separation channels coupled to their respective pneumatic micropumps, and two reference channels. Because pressure is used as the driving force, the proposed method has no sample bias effect during separation. Only one high-voltage supply is needed for separation, regardless of the number of channels, which is significant for high-throughput analysis, and the time for sample loading is shortened to 1 s. In addition, the integrated micropumps provide a versatile interface for coupling with other functional units to satisfy more complicated demands. The performance is verified by the separation of a DNA marker and Hepatitis B virus DNA samples, and this method is also expected to offer the throughput required for DNA analysis in the field of disease diagnosis.
Luan, Peng; Lee, Sophia; Paluch, Maciej; Kansopon, Joe; Viajar, Sharon; Begum, Zahira; Chiang, Nancy; Nakamura, Gerald; Hass, Philip E.; Wong, Athena W.; Lazar, Greg A.
2018-01-01
To rapidly find “best-in-class” antibody therapeutics, it has become essential to develop high throughput (HTP) processes that allow rapid assessment of antibodies for functional and molecular properties. Consequently, it is critical to have access to sufficient amounts of high quality antibody, to carry out accurate and quantitative characterization. We have developed automated workflows using liquid handling systems to conduct affinity-based purification either in batch or tip column mode. Here, we demonstrate the capability to purify >2000 antibodies per day from microscale (1 mL) cultures. Our optimized, automated process for human IgG1 purification using MabSelect SuRe resin achieves ∼70% recovery over a wide range of antibody loads, up to 500 µg. This HTP process works well for hybridoma-derived antibodies that can be purified by MabSelect SuRe resin. For rat IgG2a, which is often encountered in hybridoma cultures and is challenging to purify via an HTP process, we established automated purification with GammaBind Plus resin. Using these HTP purification processes, we can efficiently recover sufficient amounts of antibodies from mammalian transient or hybridoma cultures with quality comparable to conventional column purification. PMID:29494273
Duez, Julien; Carucci, Mario; Garcia-Barbazan, Irene; Corral, Matias; Perez, Oscar; Presa, Jesus Luis; Henry, Benoit; Roussel, Camille; Ndour, Papa Alioune; Rosa, Noemi Bahamontes; Sanz, Laura; Gamo, Francisco-Javier; Buffet, Pierre
2018-06-01
The mechanical retention of rigid erythrocytes in the spleen is central in major hematological diseases such as hereditary spherocytosis, sickle-cell disease and malaria. Here, we describe the use of microsphiltration (microsphere filtration) to assess erythrocyte deformability in hundreds to thousands of samples in parallel, by filtering them through microsphere layers in 384-well plates adapted for the discovery of compounds that stiffen Plasmodium falciparum gametocytes, with the aim of interrupting malaria transmission. Compound-exposed gametocytes are loaded into microsphiltration plates, filtered and then transferred to imaging plates for analysis. High-content imaging detects viable gametocytes upstream and downstream from filters and quantifies spleen-like retention. This screening assay takes 3-4 d. Unlike currently available methods used to assess red blood cell (RBC) deformability, microsphiltration enables high-throughput pharmacological screening (tens of thousands of compounds tested in a matter of months) and involves a cell mechanical challenge that induces a physiologically relevant dumbbell-shape deformation. It therefore directly assesses the ability of RBCs to cross inter-endothelial splenic slits in vivo. This protocol has potential applications in quality control for transfusion and in determination of phenotypic markers of erythrocytes in hematological diseases.
Strategic and Operational Plan for Integrating Transcriptomics ...
Plans for incorporating high throughput transcriptomics into the current high throughput screening activities at NCCT; the details are in the attached slide presentation, given at the OECD meeting on June 23, 2016.
High-Throughput Experimental Approach Capabilities | Materials Science | NREL
[Web-page excerpt describing NREL's high-throughput experimental approach capabilities, including combinatorial sputtering chambers for oxysulfide materials and for nitrides and oxynitrides (Combi-5), among other capabilities.]
NASA Astrophysics Data System (ADS)
Kumar, Vijay M.; Murthy, ANN; Chandrashekara, K.
2012-05-01
The production planning problem of a flexible manufacturing system (FMS) concerns decisions that have to be made before an FMS begins to produce parts according to a given production plan during an upcoming planning horizon. The main aspect of production planning deals with the machine loading problem, in which a subset of jobs to be manufactured is selected and their operations are assigned to the relevant machines. Such problems are not only combinatorial optimization problems, but also NP-hard, making it difficult to obtain satisfactory solutions using traditional optimization techniques. In this paper, an attempt has been made to address the machine loading problem with the objectives of minimizing system unbalance and maximizing throughput simultaneously, while satisfying the system constraints related to available machining time and tool slots, using a meta-hybrid heuristic technique based on a genetic algorithm and particle swarm optimization. The results reported in this paper demonstrate the model efficiency and examine the performance of the system with respect to measures such as throughput and system utilization.
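As a rough illustration of the machine loading formulation (not the paper's GA/PSO hybrid), the Python sketch below scores candidate job-to-machine assignments on system unbalance and throughput under machining-time and tool-slot constraints; the job and machine data, and the random-search loop standing in for the metaheuristic, are hypothetical.

```python
import random

# Toy machine-loading model (illustrative stand-in for the paper's GA/PSO hybrid):
# jobs have processing times, tool-slot needs and a throughput value; each machine
# has a time budget and a limited number of tool slots. All numbers are hypothetical.
JOBS = [  # (processing_time, tool_slots, throughput_value)
    (20, 1, 5), (35, 2, 8), (15, 1, 3), (40, 2, 9), (25, 1, 6),
]
MACHINES = [  # (available_time, tool_slots)
    (60, 3), (60, 3),
]

def evaluate(assignment):
    """Return (system_unbalance, throughput) or None if a constraint is violated."""
    used_time = [0] * len(MACHINES)
    used_slots = [0] * len(MACHINES)
    throughput = 0
    for job_index, machine in enumerate(assignment):
        if machine is None:                      # job not selected in this horizon
            continue
        time_needed, slots, value = JOBS[job_index]
        used_time[machine] += time_needed
        used_slots[machine] += slots
        throughput += value
    for used_t, used_s, (cap_t, cap_s) in zip(used_time, used_slots, MACHINES):
        if used_t > cap_t or used_s > cap_s:     # machining-time / tool-slot limits
            return None
    unbalance = sum(cap_t - used_t for used_t, (cap_t, _) in zip(used_time, MACHINES))
    return unbalance, throughput

best = None
for _ in range(2000):                            # random search in place of GA/PSO
    candidate = [random.choice([None] + list(range(len(MACHINES)))) for _ in JOBS]
    score = evaluate(candidate)
    if score and (best is None or (-score[0], score[1]) > (-best[1][0], best[1][1])):
        best = (candidate, score)

print("assignment:", best[0], "-> (unbalance, throughput):", best[1])
```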
Analysis of Container Yard Capacity In North TPK Using ARIMA Method
NASA Astrophysics Data System (ADS)
Sirajuddin; Cut Gebrina Hisbach, M.; Ekawati, Ratna; Ade Irman, SM
2018-03-01
The North container terminal, known as North TPK, is a container terminal located in the Indonesia Port Corporation area serving domestic container loading and unloading. It has 1006 ground slots with a total capacity of 5,544 TEUs, and its maximum container throughput is 539,616 TEUs/year. Container throughput in the North TPK is increasing year by year. In 2011-2012, the North TPK container throughput was 165,080 TEUs/year, and in 2015-2016 it reached 213,147 TEUs/year. To avoid congestion and prevent possible losses in the future, this paper analyzes the flow of containers and the level of Yard Occupation Ratio in the North TPK at Tanjung Priok Port. The method used is the Autoregressive Integrated Moving Average (ARIMA) model. ARIMA is a model that completely ignores independent variables in making forecasts. The ARIMA results show that in 2016-2017 the total container throughput reaches 234,006 TEUs/year with a field effectiveness of 43.4%, and in 2017-2018 the total container throughput reaches 249,417 TEUs/year with a field effectiveness of 46.2%.
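A minimal sketch of this kind of ARIMA forecast is shown below using statsmodels; only the 2011-2012 and 2015-2016 throughput figures and the 539,616 TEUs/year capacity come from the abstract, while the intermediate yearly values and the (1, 1, 1) model order are assumptions made for illustration.

```python
# Minimal ARIMA forecasting sketch; intermediate yearly values and the (1, 1, 1)
# order are hypothetical, not taken from the paper.
from statsmodels.tsa.arima.model import ARIMA

throughput = [165080, 178000, 190000, 201000, 213147]   # TEUs/year, 2011-12 .. 2015-16

fit = ARIMA(throughput, order=(1, 1, 1)).fit()
forecast = fit.forecast(steps=2)                         # next two operating years
print(forecast)

MAX_CAPACITY = 539616                                    # TEUs/year, from the abstract
for value in forecast:
    # ratio of forecast throughput to maximum yard capacity (cf. the 43.4% / 46.2% figures)
    print("yard occupancy ratio:", round(value / MAX_CAPACITY, 3))
```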
Argueta, Edwin; Shaji, Jeena; Gopalan, Arun; Liao, Peilin; Snurr, Randall Q; Gómez-Gualdrón, Diego A
2018-01-09
Metal-organic frameworks (MOFs) are porous crystalline materials with attractive properties for gas separation and storage. Their remarkable tunability makes it possible to create millions of MOF variations but creates the need for fast material screening to identify promising structures. Computational high-throughput screening (HTS) is a possible solution, but its usefulness is tied to accurate predictions of MOF adsorption properties. Accurate adsorption simulations often require an accurate description of electrostatic interactions, which depend on the electronic charges of the MOF atoms. HTS-compatible methods to assign charges to MOF atoms need to accurately reproduce electrostatic potentials (ESPs) and be computationally affordable, but current methods present an unsatisfactory trade-off between computational cost and accuracy. We illustrate a method to assign charges to MOF atoms based on ab initio calculations on MOF molecular building blocks. A library of building blocks with built-in charges is thus created and used by an automated MOF construction code to create hundreds of MOFs with charges "inherited" from the constituent building blocks. The molecular building block-based (MBBB) charges are similar to REPEAT charges (charges that reproduce ESPs obtained from ab initio calculations on crystallographic unit cells of nanoporous crystals), and thus similar predictions of adsorption loadings, heats of adsorption, and Henry's constants are obtained with either method. The presented results indicate that the MBBB method to assign charges to MOF atoms is suitable for use in computational high-throughput screening of MOFs for applications that involve adsorption of molecules such as carbon dioxide.
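The "inherited charge" idea can be pictured with the toy Python sketch below: each building block carries pre-computed per-atom charges, and an assembled framework simply collects them. The block names, atom labels and charge values are hypothetical and are not taken from the paper's library.

```python
# Illustrative sketch of the "inherited charge" idea: each building block carries
# pre-computed per-atom charges (e.g. from an ab initio ESP fit), and an assembled
# framework simply collects them. All names and values below are hypothetical.
BLOCK_LIBRARY = {
    "metal_node":     {"M": 1.30, "O_central": -1.60, "O_carboxylate": -0.65},
    "organic_linker": {"C_ring": -0.10, "H_ring": 0.12, "C_carboxylate": 0.70},
}

def assemble_charges(blocks):
    """Return (block, atom_label, charge, multiplicity) rows for a framework."""
    rows = []
    for name, multiplicity in blocks:
        for atom, charge in BLOCK_LIBRARY[name].items():
            rows.append((name, atom, charge, multiplicity))
    return rows

# A hypothetical framework built from one node and three linkers per formula unit.
for name, atom, charge, mult in assemble_charges([("metal_node", 1), ("organic_linker", 3)]):
    print(f"{name:15s} {atom:15s} q = {charge:+.2f}  (x{mult})")
```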
Prasad, Satendra; Wouters, Eloy R; Dunyach, Jean-Jacques
2015-08-18
Ion sampling from an electrospray ionization (ESI) source was improved by increasing the gas conductance of the MS inlet by 4.3-fold. Converting the gas throughput (Q) into a sensitivity improvement was dependent on ion desolvation and handling of the gas load. Desolvation was addressed by using a novel slot-shaped inlet that exhibited desolvation properties identical to the 0.58 mm i.d. capillary. An assay tailored for "small molecules" at high chromatographic flow rate (500 μL/min) yielded a compound-dependent 6.5- to 14-fold signal gain, while analysis at nano chromatographic flow rate (300 nL/min) showed a 2- to 3.5-fold improvement for doubly charged peptides. Improvement exceeding the Q (4.3-fold) at high chromatographic flow rate was explained by superior sampling of the spatially dispersed ion spray when using the slot-shaped capillary. Sensitivity improvement across a wide range of chromatographic flow rates confirmed no compromise in ion desolvation with the increase in Q. Another improvement was less overflow of gas into the mass analyzer from the foreline region, owing to the slot shape of the capillary. By doubling the roughing pump capacity and operating the electrodynamic ion funnel (EDIF) at ∼4 Torr, a single pumping stage was sufficient to handle the gas load. The transport of solvent clusters from the LC effluent into the mass analyzer was prevented by a "wavy shaped" transfer quadrupole and was compared with a benchmark approach that delivered ions orthogonally into a differentially pumped dual EDIF at comparable gas Q.
Characterization of a starch based desiccant wheel dehumidifier
NASA Astrophysics Data System (ADS)
Beery, Kyle Edward
Starch, cellulose, and hemicellulose have an affinity for water, and adsorb water vapor from air. Materials made from combinations of these biobased sugar polymers also have been found to possess adsorptive properties. An interesting possible application of these starch-based adsorbents is the desiccant wheel dehumidifier. The desiccant wheel dehumidifier is used in conjunction with a standard air conditioning system. In this process, ambient air is passed through a stationary section while a wheel packed with desiccant rotates through that section. The desiccant adsorbs humidity (latent load) from the air, and the air conditioning system then cools the air (sensible load). Several starch based adsorbents were developed and tested for adsorptive capacity in a new high throughput screening system. The best formulations from the high throughput screening system, also taking into account economic considerations and structural integrity, were considered for use in the desiccant wheel dehumidifier. A suitable adsorbent was chosen and formulated into a matrix structure for the desiccant wheel system. A prototype desiccant wheel system was constructed and the performance was investigated under varying regeneration temperatures and rotation speeds. The results from the experiments showed that the starch based desiccant wheel dehumidification system does transfer moisture from the inlet process stream to the outlet regeneration stream. The DESSIM model was modified for the starch based adsorbent and compared to the experimental results. Also, the results when the wheel parameters were varied were compared to the predicted results from the model. The results given by the starch based desiccant wheel system show the desired proof of concept.
A new WiMAX simulation model to investigate QoS with OPNET Modeler in a scheduling environment
NASA Astrophysics Data System (ADS)
Saini, Sanju; Saini, K. K.
2012-11-01
WiMAX stands for Worldwide Interoperability for Microwave Access. It is a major broadband wireless access technology based on the IEEE 802.16 standard. WiMAX provides innovative fixed as well as mobile platforms for broadband internet access anywhere, anytime, with different transmission modes. This paper presents a WiMAX simulation model designed with OPNET Modeler 14 to measure delay, load and throughput as performance factors. Several scheduling algorithms, such as FIFO, PQ and WFQ, are introduced to compare four types of scheduling service, each with its own QoS needs, using the OPNET Modeler support for Worldwide Interoperability for Microwave Access (WiMAX) networks. The results show approximately equal load and throughput, while the delay values vary among the different base stations. The simulation results indicate the correctness and effectiveness of this approach.
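To make the scheduler comparison concrete, the toy Python sketch below contrasts FIFO and strict priority (PQ) service orders on a handful of hypothetical packets; it illustrates the scheduling disciplines only, is not the OPNET model used in the paper, and omits WFQ for brevity.

```python
import heapq
from collections import deque

# Toy comparison of FIFO vs. strict priority (PQ) service order; packet data are
# hypothetical and this is not the paper's OPNET simulation.
packets = [  # (arrival_order, priority: 0 = highest, size_bytes)
    (0, 2, 1500), (1, 0, 200), (2, 1, 800), (3, 0, 200), (4, 2, 1500),
]

def fifo(pkts):
    """Serve packets strictly in arrival order."""
    q = deque(sorted(pkts))
    return [q.popleft() for _ in range(len(pkts))]

def priority_queue(pkts):
    """Always serve the highest-priority (lowest value) packet first."""
    heap = [(prio, arrival, size) for arrival, prio, size in pkts]
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(pkts))]

print("FIFO service order (arrivals):", [p[0] for p in fifo(packets)])
print("PQ service order (arrivals):  ", [p[1] for p in priority_queue(packets)])
```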
NASA Technical Reports Server (NTRS)
Brunstrom, Anna; Leutenegger, Scott T.; Simha, Rahul
1995-01-01
Traditionally, allocation of data in distributed database management systems has been determined by off-line analysis and optimization. This technique works well for static database access patterns, but is often inadequate for frequently changing workloads. In this paper we address how to dynamically reallocate data for partitionable distributed databases with changing access patterns. Rather than complicated and expensive optimization algorithms, a simple heuristic is presented and shown, via an implementation study, to improve system throughput by 30 percent in a local area network based system. Based on artificial wide area network delays, we show that dynamic reallocation can improve system throughput by a factor of two and a half for wide area networks. We also show that individual site load must be taken into consideration when reallocating data, and provide a simple policy that incorporates load in the reallocation decision.
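A minimal sketch of a load-aware reallocation heuristic in this spirit is given below; it is not the paper's policy, and the access counts, site loads and the 0.8 load cap are hypothetical values chosen only to show how load can gate a migration decision.

```python
# A minimal sketch (not the paper's exact heuristic) of load-aware reallocation:
# periodically move each data fragment to the site that accesses it most, unless
# that site is already overloaded. Access counts and loads are hypothetical.
access_counts = {            # fragment -> {site: accesses in the last window}
    "F1": {"A": 90, "B": 10},
    "F2": {"A": 20, "B": 80},
    "F3": {"A": 55, "B": 45},
}
site_load = {"A": 0.85, "B": 0.40}       # fraction of capacity currently used
placement = {"F1": "B", "F2": "A", "F3": "A"}
LOAD_CAP = 0.80                          # do not migrate onto a site above this load

def reallocate(placement, access_counts, site_load, load_cap=LOAD_CAP):
    new_placement = dict(placement)
    for frag, counts in access_counts.items():
        best_site = max(counts, key=counts.get)          # dominant accessor
        if best_site != placement[frag] and site_load[best_site] <= load_cap:
            new_placement[frag] = best_site              # migrate toward its accesses
    return new_placement

print(reallocate(placement, access_counts, site_load))
# F2 moves to B; F1 stays put because site A is above the load cap.
```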
A device for high-throughput monitoring of degradation in soft tissue samples.
Tzeranis, D S; Panagiotopoulos, I; Gkouma, S; Kanakaris, G; Georgiou, N; Vaindirlis, N; Vasileiou, G; Neidlin, M; Gkousioudi, A; Spitas, V; Macheras, G A; Alexopoulos, L G
2018-06-06
This work describes the design and validation of a novel device, the High-Throughput Degradation Monitoring Device (HDD), for monitoring the degradation of 24 soft tissue samples over incubation periods of several days inside a cell culture incubator. The device quantifies sample degradation by monitoring its deformation induced by a static gravity load. Initial instrument design and experimental protocol development focused on quantifying cartilage degeneration. Characterization of measurement errors, caused mainly by thermal transients and by translating the instrument sensor, demonstrated that HDD can quantify sample degradation with <6 μm precision and <10 μm temperature-induced errors. HDD capabilities were evaluated in a pilot study that monitored the degradation of fresh ex vivo human cartilage samples by collagenase solutions over three days. HDD could robustly resolve the effects of collagenase concentrations as small as 0.5 mg/ml. Careful sample preparation resulted in measurements that did not suffer from donor-to-donor variation (coefficient of variance <70%). Due to its unique combination of sample throughput, measurement precision, temporal sampling and experimental versatility, HDD provides a novel biomechanics-based experimental platform for quantifying the effects of proteins (cytokines, growth factors, enzymes, antibodies) or small molecules on the degradation of soft tissues or tissue engineering constructs. Thereby, HDD can complement established tools and in vitro models in important applications including drug screening and biomaterial development. Copyright © 2018 Elsevier Ltd. All rights reserved.
Shrink-induced single-cell plastic microwell array.
Lew, Valerie; Nguyen, Diep; Khine, Michelle
2011-12-01
The ability to interrogate and track single cells over time in a high-throughput format would provide critical information for fundamental biological understanding of processes and for various applications, including drug screening and toxicology. We have developed an ultrarapid and simple method to create single-cell wells of controllable diameter and depth with commodity shrink-wrap film and tape. Using a programmable CO2 laser, we cut hole arrays into the tape. The tape then serves as a shadow mask to selectively etch wells into commodity shrink-wrap film by O2 plasma. When the shrink-wrap film retracts upon briefly heating, high-aspect plastic microwell arrays with diameters down to 20 μm are readily achieved. We calibrated the loading procedure with fluorescent microbeads. Finally, we demonstrate the utility of the wells by loading fluorescently labeled single human embryonic stem cells into the wells. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
Wang, Jian; Evans, Julian R G
2005-01-01
This paper describes the design, construction, and operation of the London University Search Instrument (LUSI) which was recently commissioned to create and test combinatorial libraries of ceramic compositions. The instrument uses commercially available powders, milled as necessary to create thick-film libraries by ink-jet printing. Multicomponent mixtures are prepared by well plate reformatting of ceramic inks. The library tiles are robotically loaded into a flatbed furnace and, when fired, transferred to a 2-axis high-resolution measurement table fitted with a hot plate where measurements of, for example, optical or electrical properties can be made. Data are transferred to a dedicated high-performance computer. The possibilities for remote interrogation and search steering are discussed.
Quantum load balancing in ad hoc networks
NASA Astrophysics Data System (ADS)
Hasanpour, M.; Shariat, S.; Barnaghi, P.; Hoseinitabatabaei, S. A.; Vahid, S.; Tafazolli, R.
2017-06-01
This paper presents a novel approach to load balancing in ad hoc networks utilizing the properties of quantum game theory. This approach benefits from the instantaneous and information-less capability of entangled particles to synchronize the load balancing strategies in ad hoc networks. The quantum load balancing (QLB) algorithm proposed by this work is implemented on top of OLSR as the baseline routing protocol; its performance is analyzed against the baseline OLSR, and considerable gain is reported regarding some of the main QoS metrics such as delay and jitter. Furthermore, it is shown that the QLB algorithm provides a solid stability gain in terms of throughput, which stands as a proof of concept for the load balancing properties of the proposed theory.
High Throughput PBTK: Open-Source Data and Tools for ...
Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy
Parriot, Sandi; Hudson, Thomas H.; Lang, Thierry; Ngundam, Franklyn; Leed, Susan; Sena, Jenell; Harris, Michael; O'Neil, Michael; Sciotti, Richard; Read, Lisa; Lecoeur, Herve; Grogl, Max
2017-01-01
In any drug discovery and development effort, a reduction in the time of the lead optimization cycle is critical to decrease the time to license and reduce costs. In addition, ethical guidelines call for the more ethical use of animals to minimize the number of animals used and decrease their suffering. Therefore, any effort to develop drugs to treat cutaneous leishmaniasis requires multiple tiers of in vivo testing that start with higher-throughput efficacy assessments and progress to lower-throughput models with the most clinical relevance. Here, we describe the validation of a high-throughput, first-tier, noninvasive model of lesion suppression that uses an in vivo optical imaging technology for the initial screening of compounds. A strong correlation between luciferase activity and the parasite load at up to 18 days postinfection was found. This correlation allows the direct assessment of the effects of drug treatment on parasite burden. We demonstrate that there is a strong correlation between drug efficacy measured on day 18 postinfection and the suppression of lesion size by day 60 postinfection, which allows us to reach an accurate conclusion on drug efficacy in only 18 days. Compounds demonstrating a significant reduction in the bioluminescence signal compared to that in control animals can be tested in lower-throughput, more definitive tests of lesion cure in BALB/c mice and Golden Syrian hamsters (GSH) using Old World and New World parasites. PMID:28137819
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sartipi, Sina, E-mail: S.Sartipi@tudelft.nl, E-mail: J.Gascon@tudelft.nl; Jansma, Harrie; Bosma, Duco
2013-12-15
Design and operation of a “six-flow fixed-bed microreactor” setup for Fischer-Tropsch synthesis (FTS) is described. The unit consists of feed and mixing, flow division, reaction, separation, and analysis sections. The reactor system is made of five heating blocks with individual temperature controllers, assuring an identical isothermal zone of at least 10 cm along six fixed-bed microreactor inserts (4 mm inner diameter). Such a lab-scale setup allows running six experiments in parallel, under equal feed composition, reaction temperature, and conditions of separation and analysis equipment. It permits separate collection of wax and liquid samples (from each flow line), allowing operation with high productivities of C5+ hydrocarbons. The latter is crucial for a complete understanding of FTS product compositions and represents an advantage over high-throughput setups with more than ten flows, where such instrumental considerations lead to elevated equipment volume, cost, and operation complexity. The identical performance of the six flows under similar reaction conditions was assured by testing the same catalyst batch, loaded in all microreactors.
Li, Yubo; Zhang, Zhenzhu; Liu, Xinyu; Li, Aizhu; Hou, Zhiguo; Wang, Yuming; Zhang, Yanjun
2015-08-28
This study combines solid phase extraction (SPE) using 96-well plates with column-switching technology to construct a rapid and high-throughput method for the simultaneous extraction and non-targeted analysis of the small-molecule metabolome and lipidome based on ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry. This study first investigated the columns and analytical conditions for the small-molecule metabolome and lipidome, separated on HSS T3 and BEH C18 columns, respectively. Next, the loading capacity and actuation duration of the SPE were further optimized. Subsequently, SPE and column switching were used together to rapidly and comprehensively analyze the biological samples. The experimental results showed that the new analytical procedure had good precision and maintained sample stability (RSD<15%). The method was then satisfactorily applied to a wider analysis of the small-molecule metabolome and lipidome to test its throughput. The resulting method represents a new analytical approach for biological samples and a highly useful tool for research in metabolomics and lipidomics. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Music, Denis, E-mail: music@mch.rwth-aachen.de; Geyer, Richard W.; Hans, Marcus
2016-07-28
To increase the thermoelectric efficiency and reduce the thermal fatigue upon cyclic heat loading, alloying of amorphous NbO2 with all 3d and 5d transition metals has systematically been investigated using density functional theory. It was found that Ta fulfills the key design criteria, namely, enhancement of the Seebeck coefficient and positive Cauchy pressure (ductility gauge). These quantum mechanical predictions were validated by assessing the thermoelectric and elastic properties on combinatorial thin films, which is a high-throughput approach. The maximum power factor is 2813 μW m-1 K-2 for the Ta/Nb ratio of 0.25, which is a hundredfold increment compared to pure NbO2 and exceeds many oxide thermoelectrics. Based on the elasticity measurements, the consistency between theory and experiment for the Cauchy pressure was attained within 2%. On the basis of the electronic structure analysis, these configurations can be perceived as metallic, which is consistent with low electrical resistivity and ductile behavior. Furthermore, a pronounced quantum confinement effect occurs, which is identified as the physical origin for the Seebeck coefficient enhancement.
Highly scalable, closed-loop synthesis of drug-loaded, layer-by-layer nanoparticles.
Correa, Santiago; Choi, Ki Young; Dreaden, Erik C; Renggli, Kasper; Shi, Aria; Gu, Li; Shopsowitz, Kevin E; Quadir, Mohiuddin A; Ben-Akiva, Elana; Hammond, Paula T
2016-02-16
Layer-by-layer (LbL) self-assembly is a versatile technique from which multicomponent and stimuli-responsive nanoscale drug carriers can be constructed. Despite the benefits of LbL assembly, the conventional synthetic approach for fabricating LbL nanoparticles requires numerous purification steps that limit scale, yield, efficiency, and potential for clinical translation. In this report, we describe a generalizable method for increasing throughput with LbL assembly by using highly scalable, closed-loop diafiltration to manage intermediate purification steps. This method facilitates highly controlled fabrication of diverse nanoscale LbL formulations smaller than 150 nm composed from solid-polymer, mesoporous silica, and liposomal vesicles. The technique allows for the deposition of a broad range of polyelectrolytes that included native polysaccharides, linear polypeptides, and synthetic polymers. We also explore the cytotoxicity, shelf life and long-term storage of LbL nanoparticles produced using this approach. We find that LbL coated systems can be reliably and rapidly produced: specifically, LbL-modified liposomes could be lyophilized, stored at room temperature, and reconstituted without compromising drug encapsulation or particle stability, thereby facilitating large scale applications. Overall, this report describes an accessible approach that significantly improves the throughput of nanoscale LbL drug-carriers that show low toxicity and are amenable to clinically relevant storage conditions.
Automatic cassette to cassette radiant impulse processor
NASA Astrophysics Data System (ADS)
Sheets, Ronald E.
1985-01-01
Single wafer rapid annealing using high temperature isothermal processing has become increasingly popular in recent years. In addition to annealing, this process is also being investigated for silicide formation, passivation, glass reflow and alloying. Regardless of the application, there is a strong necessity to automate in order to maintain process control, repeatability, cleanliness and throughput. These requirements have been carefully addressed during the design and development of the Model 180 Radiant Impulse Processor, which is a totally automatic cassette to cassette wafer processing system. Process control and repeatability are maintained by a closed loop optical pyrometer system which maintains the wafer at the programmed temperature-time conditions. Programmed recipes containing up to 10 steps may be easily entered on the computer keyboard or loaded in from a recipe library stored on a standard 5 1/4″ floppy disk. Cold wall heating chamber construction, controlled environment (N2, Ar, forming gas) and quartz wafer carriers prevent contamination of the wafer during high temperature processing. Throughputs of 150-240 wafers per hour are achieved by quickly heating the wafer to temperature (450-1400°C) in 3-6 s with a high intensity, uniform (±1%) radiant flux of 100 W/cm2, a parallel wafer handling system and a wafer cool down stage.
Cache-Oblivious parallel SIMD Viterbi decoding for sequence search in HMMER.
Ferreira, Miguel; Roma, Nuno; Russo, Luis M S
2014-05-30
HMMER is a commonly used bioinformatics tool based on Hidden Markov Models (HMMs) to analyze and process biological sequences. One of its main homology engines is based on the Viterbi decoding algorithm, which was already highly parallelized and optimized using Farrar's striped processing pattern with Intel SSE2 instruction set extension. A new SIMD vectorization of the Viterbi decoding algorithm is proposed, based on an SSE2 inter-task parallelization approach similar to the DNA alignment algorithm proposed by Rognes. Besides this alternative vectorization scheme, the proposed implementation also introduces a new partitioning of the Markov model that allows a significantly more efficient exploitation of the cache locality. Such optimization, together with an improved loading of the emission scores, allows the achievement of a constant processing throughput, regardless of the innermost-cache size and of the dimension of the considered model. The proposed optimized vectorization of the Viterbi decoding algorithm was extensively evaluated and compared with the HMMER3 decoder to process DNA and protein datasets, proving to be a rather competitive alternative implementation. Being always faster than the already highly optimized ViterbiFilter implementation of HMMER3, the proposed Cache-Oblivious Parallel SIMD Viterbi (COPS) implementation provides a constant throughput and offers a processing speedup as high as two times faster, depending on the model's size.
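For reference, the plain log-space Viterbi recurrence that COPS vectorizes and partitions can be sketched as below with NumPy; this scalar version is illustrative only, and the tiny two-state model parameters are hypothetical rather than HMMER profile scores.

```python
import numpy as np

# Plain log-space Viterbi reference (the recurrence that COPS vectorizes and
# partitions for cache locality); the 2-state model here is hypothetical.
log_trans = np.log(np.array([[0.7, 0.3],
                             [0.4, 0.6]]))
log_emit  = np.log(np.array([[0.9, 0.1],     # state 0 emission probabilities
                             [0.2, 0.8]]))   # state 1 emission probabilities
log_start = np.log(np.array([0.5, 0.5]))
obs = [0, 1, 1, 0]                           # observed symbol indices

def viterbi(obs, log_start, log_trans, log_emit):
    score = log_start + log_emit[:, obs[0]]
    backpointers = []
    for o in obs[1:]:
        cand = score[:, None] + log_trans            # score of every (prev, next) pair
        backpointers.append(cand.argmax(axis=0))
        score = cand.max(axis=0) + log_emit[:, o]
    path = [int(score.argmax())]                     # trace back the best path
    for bp in reversed(backpointers):
        path.append(int(bp[path[-1]]))
    return list(reversed(path)), float(score.max())

print(viterbi(obs, log_start, log_trans, log_emit))
```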
Automating fruit fly Drosophila embryo injection for high throughput transgenic studies
NASA Astrophysics Data System (ADS)
Cornell, E.; Fisher, W. W.; Nordmeyer, R.; Yegian, D.; Dong, M.; Biggin, M. D.; Celniker, S. E.; Jin, J.
2008-01-01
To decipher and manipulate the 14 000 identified Drosophila genes, there is a need to inject a large number of embryos with transgenes. We have developed an automated instrument for high throughput injection of Drosophila embryos. It was built on an inverted microscope, equipped with a motorized xy stage, autofocus, a charge coupled device camera, and an injection needle mounted on a high speed vertical stage. A novel, micromachined embryo alignment device was developed to facilitate the arrangement of a large number of eggs. The control system included intelligent and dynamic imaging and analysis software and an embryo injection algorithm imitating a human operator. Once the injection needle and embryo slide are loaded, the software automatically images and characterizes each embryo and subsequently injects DNA into all suitable embryos. The ability to program needle flushing and monitor needle status after each injection ensures reliable delivery of biomaterials. Using this instrument, we performed a set of transformation injection experiments. The robot achieved injection speeds and transformation efficiencies comparable to those of a skilled human injector. Because it can be programmed to allow injection at various locations in the embryo, such as the anterior pole or along the dorsal or ventral axes, this system is also suitable for injection of general biochemicals, including drugs and RNAi.
High throughput optical scanner
Basiji, David A.; van den Engh, Gerrit J.
2001-01-01
A scanning apparatus is provided to obtain automated, rapid and sensitive scanning of substrate fluorescence, optical density or phosphorescence. The scanner uses a constant path length optical train, which enables the combination of a moving beam for high speed scanning with phase-sensitive detection for noise reduction, comprising a light source, a scanning mirror to receive light from the light source and sweep it across a steering mirror, a steering mirror to receive light from the scanning mirror and reflect it to the substrate, whereby it is swept across the substrate along a scan arc, and a photodetector to receive emitted or scattered light from the substrate, wherein the optical path length from the light source to the photodetector is substantially constant throughout the sweep across the substrate. The optical train can further include a waveguide or mirror to collect emitted or scattered light from the substrate and direct it to the photodetector. For phase-sensitive detection the light source is intensity modulated and the detector is connected to phase-sensitive detection electronics. A scanner using a substrate translator is also provided. For two dimensional imaging the substrate is translated in one dimension while the scanning mirror scans the beam in a second dimension. For a high throughput scanner, stacks of substrates are loaded onto a conveyor belt from a tray feeder.
Impact of Roadway Stormwater Runoff on Microbial Contamination in the Receiving Stream.
Wyckoff, Kristen N; Chen, Si; Steinman, Andrew J; He, Qiang
2017-09-01
Stormwater runoff from roadways has increasingly become a regulatory concern for water pollution control. Recent work has suggested roadway stormwater runoff as a potential source of microbial pollutants. The objective of this study was to determine the impact of roadway runoff on the microbiological quality of receiving streams. Microbiological quality of roadway stormwater runoff and the receiving stream was monitored during storm events with both cultivation-dependent fecal bacteria enumeration and cultivation-independent high-throughput sequencing techniques. Enumeration of total coliforms as a measure of fecal microbial pollution found consistently lower total coliform counts in roadway runoff than those in the stream water, suggesting that roadway runoff was not a major contributor of microbial pollutants to the receiving stream. Further characterization of the microbial community in the stormwater samples by 16S ribosomal RNA gene-based high-throughput amplicon sequencing revealed significant differences in the microbial composition of stormwater runoff from the roadways and the receiving stream. The differences in microbial composition between the roadway runoff and stream water demonstrate that roadway runoff did not appear to have a major influence on the stream in terms of microbiological quality. Thus, results from both fecal bacteria enumeration and high-throughput amplicon sequencing techniques were consistent that roadway stormwater runoff was not the primary contributor of microbial loading to the stream. Further studies of additional watersheds with distinct characteristics are needed to validate these findings. Understanding gained in this study could support the development of more effective strategies for stormwater management in sensitive watersheds. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Application of ToxCast High-Throughput Screening and ...
Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Distruptors Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenssis Distruptors
2008-03-01
Postgraduate School’s COASTS international field-testing and thesis research program. B. COASTS 2007 Indonesia, Malaysia, Singapore, Thailand, and the ... software tools available for monitoring and testing network throughput. One Dell laptop was loaded with the IxChariot console as shown in Figure 32 ... 91 J. LAPTOP COMPUTERS As mentioned in the previous section, one Dell laptop was loaded with the IxChariot console. Two additional laptop
Folmsbee, Martha
2015-01-01
Approximately 97% of filter validation tests result in the demonstration of absolute retention of the test bacteria, and thus sterile filter validation failure is rare. However, while Brevundimonas diminuta (B. diminuta) penetration of sterilizing-grade filters is rarely detected, the observation that some fluids (such as vaccines and liposomal fluids) may lead to an increased incidence of bacterial penetration of sterilizing-grade filters by B. diminuta has been reported. The goal of the following analysis was to identify important drivers of filter validation failure in these rare cases. The identification of these drivers will hopefully serve the purpose of assisting in the design of commercial sterile filtration processes with a low risk of filter validation failure for vaccine, liposomal, and related fluids. Filter validation data for low-surface-tension fluids was collected and evaluated with regard to the effect of bacterial load (CFU/cm(2)), bacterial load rate (CFU/min/cm(2)), volume throughput (mL/cm(2)), and maximum filter flux (mL/min/cm(2)) on bacterial penetration. The data set (∼1162 individual filtrations) included all instances of process-specific filter validation failures performed at Pall Corporation, including those using other filter media, but did not include all successful retentive filter validation bacterial challenges. It was neither practical nor necessary to include all filter validation successes worldwide (Pall Corporation) to achieve the goals of this analysis. The percentage of failed filtration events for the selected total master data set was 27% (310/1162). Because it is heavily weighted with penetration events, this percentage is considerably higher than the actual rate of failed filter validations, but, as such, facilitated a close examination of the conditions that lead to filter validation failure. In agreement with our previous reports, two of the significant drivers of bacterial penetration identified were the total bacterial load and the bacterial load rate. In addition to these parameters, another three possible drivers of failure were also identified: volume throughput, maximum filter flux, and pressure. Of the data for which volume throughput information was available, 24% (249/1038) of the filtrations resulted in penetration. However, for the volume throughput range of 680-2260 mL/cm(2), only 9 out of 205 bacterial challenges (∼4%) resulted in penetration. Of the data for which flux information was available, 22% (212/946) resulted in bacterial penetration. However, in the maximum filter flux range from 7 to 18 mL/min/cm(2), only one out of 121 filtrations (0.6%) resulted in penetration. A slight increase in filter failure was observed in filter bacterial challenges with a differential pressure greater than 30 psid. When designing a commercial process for the sterile filtration of a low-surface-tension fluid (or any other potentially high-risk fluid), targeting the volume throughput range of 680-2260 mL/cm(2) or flux range of 7-18 mL/min/cm(2), and maintaining the differential pressure below 30 psid, could significantly decrease the risk of validation filter failure. However, it is important to keep in mind that these are general trends described in this study and some test fluids may not conform to the general trends described here. 
Ultimately, it is important to evaluate both filterability and bacterial retention of the test fluid under proposed process conditions prior to finalizing the manufacturing process to ensure successful process-specific filter validation of low-surface-tension fluids. An overwhelming majority of process-specific filter validation (qualification) tests result in the demonstration of absolute retention of test bacteria by sterilizing-grade membrane filters. As such, process-specific filter validation failure is rare. However, while bacterial penetration of sterilizing-grade filters during process-specific filter validation is rarely detected, some fluids (such as vaccines and liposomal fluids) have been associated with an increased incidence of bacterial penetration. The goal of the following analysis was to identify important drivers of process-specific filter validation failure. The identification of these drivers will possibly serve to assist in the design of commercial sterile filtration processes with a low risk of filter validation failure. Filter validation data for low-surface-tension fluids was collected and evaluated with regard to bacterial concentration and rates, as well as filtered fluid volume and rate (Pall Corporation). The master data set (∼1160 individual filtrations) included all recorded instances of process-specific filter validation failures but did not include all successful filter validation bacterial challenge tests. This allowed for a close examination of the conditions that lead to process-specific filter validation failure. As previously reported, two significant drivers of bacterial penetration were identified: the total bacterial load (the total number of bacteria per filter) and the bacterial load rate (the rate at which bacteria were applied to the filter). In addition to these parameters, another three possible drivers of failure were also identified: volumetric throughput, filter flux, and pressure. When designing a commercial process for the sterile filtration of a low-surface-tension fluid (or any other penetrative-risk fluid), targeting the identified bacterial challenge loads, volume throughput, and corresponding flux rates could decrease, and possibly eliminate, the risk of validation filter failure. However, it is important to keep in mind that these are general trends described in this study and some test fluids may not conform to the general trends described here. Ultimately, it is important to evaluate both filterability and bacterial retention of the test fluid under proposed process conditions prior to finalizing the manufacturing process to ensure successful filter validation of low-surface-tension fluids. © PDA, Inc. 2015.
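The kind of binned failure-rate analysis described above can be sketched with pandas as follows; the records, column names and bin edges (apart from the 680-2260 mL/cm2 band mentioned in the text) are hypothetical and serve only to show the mechanics of the computation.

```python
import pandas as pd

# Hypothetical filtration records (not the Pall data set): compute the fraction of
# bacterial challenges with penetration within volume-throughput bands.
records = pd.DataFrame({
    "volume_throughput_mL_cm2": [150, 420, 700, 900, 1500, 2100, 2400, 3000],
    "penetration":              [1,   1,   0,   0,   0,    0,    1,    1],
})

bins = [0, 680, 2260, 10_000]            # middle band is the one reported as low-risk
records["throughput_band"] = pd.cut(records["volume_throughput_mL_cm2"], bins)
rate = records.groupby("throughput_band", observed=True)["penetration"].mean()
print(rate)                              # penetration rate per throughput band
```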
A single-stage optical load-balanced switch for data centers.
Huang, Qirui; Yeo, Yong-Kee; Zhou, Luying
2012-10-22
Load balancing is an attractive technique to achieve maximum throughput and optimal resource utilization in large-scale switching systems. However current electronic load-balanced switches suffer from severe problems in implementation cost, power consumption and scaling. To overcome these problems, in this paper we propose a single-stage optical load-balanced switch architecture based on an arrayed waveguide grating router (AWGR) in conjunction with fast tunable lasers. By reuse of the fast tunable lasers, the switch achieves both functions of load balancing and switching through the AWGR. With this architecture, proof-of-concept experiments have been conducted to investigate the feasibility of the optical load-balanced switch and to examine its physical performance. Compared to three-stage load-balanced switches, the reported switch needs only half of optical devices such as tunable lasers and AWGRs, which can provide a cost-effective solution for future data centers.
Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister
2014-05-01
The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.
High Throughput Screening For Hazard and Risk of Environmental Contaminants
High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...
NASA Astrophysics Data System (ADS)
Primeaux, Philip A.; Zhang, Bin; Zhang, Xiaoman; Miller, Jacob; Meng, W. J.; KC, Pratik; Moore, Arden L.
2017-02-01
Microscale fin array structures were replicated onto surfaces of aluminum 1100 and aluminum 6061 alloy (Al1100/Al6061) sheet metals through room-temperature instrumented roll molding. Aluminum-based micro fin arrays were replicated at room temperature, and the fabrication process is one with high throughput and low cost. One-dimensional (1D) micro fin arrays were made through one-pass rolling, while two-dimensional (2D) micro fin arrays were made by sequential 90° cross rolling with the same roller sleeve. For roll molding of 1D micro fins, fin heights greater than 600 µm were achieved and were shown to be proportional to the normal load force per feature width. At a given normal load force, the fin height was further shown to scale inversely with the hardness of the sheet metal. For sequential 90° cross rolling, morphologies of roll molded 2D micro fin arrays were examined, which provided clues to understand how plastic deformation occurred under cross rolling conditions. A series of pool boiling experiments on low profile Al micro fin array structures were performed within Novec 7100, a widely used commercial dielectric coolant. Results for both horizontal and vertical surface orientations show that roll molded Al micro fin arrays can increase heat flux at fixed surface temperature as compared to un-patterned Al sheet. The present results further suggest that many factors beyond just increased surface area can influence heat transfer performance, including surface finish and the important multiphase transport mechanisms in and around the fin geometry. These factors must also be considered when designing and optimizing micro fin array structures for heat transfer applications.
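The reported proportionalities can be summarized, with hedging, by the relation below, where h is the fin height, F_N the normal load force, w the feature width, H_sheet the sheet-metal hardness, and k an empirical prefactor that is an assumption (it is not given in the abstract).

```latex
% Hedged summary of the reported scaling: fin height grows with the normal load per
% feature width and decreases with sheet hardness; k is an assumed empirical prefactor.
h \;\propto\; \frac{F_N / w}{H_{\mathrm{sheet}}}
\qquad\Longrightarrow\qquad
h \approx k\,\frac{F_N}{w\,H_{\mathrm{sheet}}}
```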
High Throughput Transcriptomics: From screening to pathways
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
NASA Astrophysics Data System (ADS)
Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun
2017-12-01
Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun
2017-01-01
Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
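A toy version of such a screening flow is sketched below in Python: hypothetical candidate cathodes are filtered on bulk descriptors (capacity, voltage, volume change). The material names, property values and thresholds are all assumptions, and a real workflow would also apply the defect, surface/interface, doping and nanosize criteria discussed in the review.

```python
# Toy screening flow over hypothetical candidate cathodes; all names, values and
# thresholds are assumptions, not results from the review.
candidates = [
    {"name": "MatA", "capacity_mAh_g": 180, "voltage_V": 3.9, "volume_change_pct": 4.5},
    {"name": "MatB", "capacity_mAh_g": 120, "voltage_V": 3.2, "volume_change_pct": 2.0},
    {"name": "MatC", "capacity_mAh_g": 210, "voltage_V": 4.4, "volume_change_pct": 9.0},
]

def passes(material):
    """Apply hypothetical bulk-descriptor thresholds."""
    return (material["capacity_mAh_g"] >= 150
            and 3.0 <= material["voltage_V"] <= 4.5
            and material["volume_change_pct"] <= 6.0)

shortlist = [m["name"] for m in candidates if passes(m)]
print(shortlist)        # -> ['MatA']
```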
High Throughput Experimental Materials Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakutayev, Andriy; Perkins, John; Schwarting, Marcus
The mission of the High Throughput Experimental Materials Database (HTEM DB) is to enable discovery of new materials with useful properties by releasing large amounts of high-quality experimental data to the public. The HTEM DB contains information about materials obtained from high-throughput experiments at the National Renewable Energy Laboratory (NREL).
20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
Evaluation of Sequencing Approaches for High-Throughput Transcriptomics - (BOSC)
Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. The generation of high-throughput global gene expression...
High Throughput Determination of Critical Human Dosing Parameters (SOT)
High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...
High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)
High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e., reverse toxicokinetics or RTK) is used to convert high throughput in vitro toxicity screening (HTS) da...
Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos
Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...
Using ALFA for high throughput, distributed data transmission in the ALICE O2 system
NASA Astrophysics Data System (ADS)
Wegrzynek, A.;
2017-10-01
ALICE (A Large Ion Collider Experiment) is a heavy-ion detector designed to study the physics of strongly interacting matter (the Quark-Gluon Plasma) at the CERN LHC (Large Hadron Collider). ALICE has been successfully collecting physics data in Run 2 since spring 2015. In parallel, preparations for a major upgrade of the computing system, called O2 (Online-Offline) and scheduled for the Long Shutdown 2 in 2019-2020, are being made. One of the major requirements of the system is the capacity to transport data between the so-called FLPs (First Level Processors), equipped with readout cards, and the EPNs (Event Processing Nodes), which perform data aggregation, frame building and partial reconstruction. It is foreseen to have 268 FLPs dispatching data to 1500 EPNs with an average output of 20 Gb/s each. Overall, the O2 processing system will operate at terabits per second of throughput while handling millions of concurrent connections. The ALFA framework will standardize and handle software-related tasks such as readout, data transport, frame building, calibration, online reconstruction and more in the upgraded computing system. ALFA supports two data transport libraries: ZeroMQ and nanomsg. This paper discusses the efficiency of ALFA in terms of high-throughput data transport. The tests were performed with multiple FLPs pushing data to multiple EPNs. The transfer was done using push-pull communication patterns and two socket configurations: bind and connect. The set of benchmarks was prepared to obtain the most performant results on each hardware setup. The paper presents the measurement process and final results: data throughput combined with computing resource usage as a function of block size. The high number of nodes and connections in the final setup may cause race conditions that can lead to uneven load balancing and poor scalability. The performed tests allow us to validate whether the traffic is distributed evenly over all receivers; they also measure the behaviour of the network in saturation and evaluate scalability from a 1-to-1 to an N-to-M solution.
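For readers unfamiliar with the transport layer, the sketch below illustrates the push-pull pattern and the bind/connect socket configurations mentioned above, using the pyzmq bindings as a stand-in (an assumption for illustration; ALFA wraps ZeroMQ/nanomsg through its own API), together with the aggregate throughput implied by the quoted FLP and EPN counts.

```python
# Minimal push-pull sketch with pyzmq (assumed available); illustrative only,
# not the ALFA/FairMQ interface.
import zmq

ctx = zmq.Context()

def flp_sender(endpoint="tcp://*:5555"):
    sock = ctx.socket(zmq.PUSH)
    sock.bind(endpoint)            # "bind" socket configuration
    sock.send(b"timeframe-0001")   # one data block per send

def epn_receiver(endpoint="tcp://localhost:5555"):
    sock = ctx.socket(zmq.PULL)
    sock.connect(endpoint)         # "connect" socket configuration
    return sock.recv()

# Aggregate design throughput implied by the numbers quoted in the abstract:
print(268 * 20 / 1000, "Tb/s")     # 268 FLPs x 20 Gb/s each ~= 5.4 Tb/s
```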
RESULTS OF THE FY09 ENHANCED DOE HIGH LEVEL WASTE MELTER THROUGHPUT STUDIES AT SRNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, F.; Edwards, T.
2010-06-23
High-level waste (HLW) throughput (i.e., the amount of waste processed per unit time) is a function of two critical parameters: waste loading (WL) and melt rate. For the Waste Treatment and Immobilization Plant (WTP) at the Hanford Site and the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS), increasing HLW throughput would significantly reduce the overall mission life cycle costs for the Department of Energy (DOE). The objective of this task is to develop data, assess property models, and refine or develop the necessary models to support increased WL of HLW at SRS. It is a continuation of the studies initiated in FY07, but is under the specific guidance of a Task Change Request (TCR)/Work Authorization received from DOE headquarters (Project Number RV071301). Using the data generated in FY07, FY08 and historical data, two test matrices (60 glasses total) were developed at the Savannah River National Laboratory (SRNL) in order to generate data in broader compositional regions. These glasses were fabricated and characterized using chemical composition analysis, X-ray Diffraction (XRD), viscosity, liquidus temperature (TL) measurement and durability as defined by the Product Consistency Test (PCT). The results of this study are summarized below: (1) In general, the current durability model predicts the durabilities of higher waste loading glasses quite well. A few of the glasses exhibited poorer durability than predicted. (2) Some of the glasses exhibited anomalous behavior with respect to durability (normalized leachate for boron (NL [B])). The quenched samples of FY09EM21-02, -07 and -21 contained no nepheline or other wasteform-affecting crystals, but have unacceptable NL [B] values (> 10 g/L). The ccc sample of FY09EM21-07 has an NL [B] value that is more than one half the value of the quenched sample. These glasses also have lower concentrations of Al2O3 and SiO2. (3) Five of the ccc samples (EM-13, -14, -15, -29 and -30) completely crystallized with both magnetite and nepheline, and still had extremely low NL [B] values. These particular glasses have more CaO present than any of the other glasses in the matrix. It appears that while all of the glasses contain nepheline, the NL [B] values decrease as the CaO concentration increases from 2.3 wt% to 4.3 wt%. A different form of nepheline may be created at higher concentrations of CaO that does not significantly reduce glass durability. (4) The TL model appears to be under-predicting the measured values of higher waste loading glasses. Trends in TL with composition are not evident in the data from these studies. (5) A small number of glasses in the FY09 matrix have measured viscosities that are much lower than the viscosity range over which the current model was developed. The decrease in viscosity is due to a higher concentration of non-bridging oxygens (NBO). A high iron concentration is the cause of the increase in NBO. Durability, viscosity and TL data collected during FY07 and FY09 that specifically targeted higher waste loading glasses were compiled and assessed. It appears that additional data may be required to expand the coverage of the TL and viscosity models for higher waste loading glasses. In general, the compositional regions of the higher waste loading glasses are very different than those used to develop these models. On the other hand, the current durability model seems to be applicable to the new data. At this time, there is no evidence to modify this model; however, additional experimental studies should be conducted to determine the cause of the anomalous durability data.
Jung, Seung-Yong; Notton, Timothy; Fong, Erika; ...
2015-01-07
Particle sorting using acoustofluidics has enormous potential but widespread adoption has been limited by complex device designs and low throughput. Here, we report high-throughput separation of particles and T lymphocytes (600 μL min-1) by altering the net sonic velocity to reposition acoustic pressure nodes in a simple two-channel device. Finally, the approach is generalizable to other microfluidic platforms for rapid, high-throughput analysis.
Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun
2017-01-01
Abstract Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure–property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure–property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure–property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials. PMID:28458737
Experiences with http/WebDAV protocols for data access in high throughput computing
NASA Astrophysics Data System (ADS)
Bernabeu, Gerard; Martinez, Francisco; Acción, Esther; Bria, Arnau; Caubet, Marc; Delfino, Manuel; Espinal, Xavier
2011-12-01
In the past, access to remote storage was considered to be at least one order of magnitude slower than local disk access. Improvements in network technology now provide the alternative of using remote disks, and for such accesses one can today reach levels of throughput similar to or exceeding those of local disks. Common choices for access protocols in the WLCG collaboration are RFIO, [GSI]DCAP, GRIDFTP, XROOTD and NFS. The HTTP protocol is a promising alternative, as it is simple and lightweight. It also enables the use of standard technologies such as HTTP caching or load balancing, which can be used to improve service resilience and scalability or to boost performance for some use cases seen in HEP such as "hot files". WebDAV extensions allow writing data, giving it enough functionality to work as a remote access protocol. This paper shows our experiences with the WebDAV door for dCache, in terms of functionality and performance, applied to some of the HEP workflows in the LHC Tier1 at PIC.
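Part of the appeal of plain HTTP/WebDAV is that ordinary client tooling suffices. The sketch below uses the Python requests library against a hypothetical dCache WebDAV door (the endpoint, credential paths and file paths are made up for illustration) to show a ranged read of a "hot file" and a WebDAV PUT for writing.

```python
# Illustrative HTTP/WebDAV access sketch; endpoint and credential paths are
# hypothetical, not real PIC/dCache URLs.
import requests

base = "https://dcache.example.org:2880/pnfs/data"        # hypothetical WebDAV door
cert = ("usercert.pem", "userkey.pem")                     # hypothetical grid credentials

# Partial read of a hot file using a standard HTTP Range request (cacheable,
# load-balanceable with stock web infrastructure)
r = requests.get(f"{base}/run1234/events.root",
                 headers={"Range": "bytes=0-1048575"}, cert=cert)
chunk = r.content

# Write access via the WebDAV PUT method
with open("histos.root", "rb") as f:
    requests.put(f"{base}/user/histos.root", data=f, cert=cert)
```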
High-throughput screening (HTS) and modeling of the retinoid ...
Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.
Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)
High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...
High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes
USDA-ARS?s Scientific Manuscript database
High-throughput SNP genotyping provides a rapid way of developing resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...
Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)
Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons, National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA's ToxCast program utilizes a wide variety of high-throughput s...
A quantitative literature-curated gold standard for kinase-substrate pairs
2011-01-01
We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431
High-Throughput Industrial Coatings Research at The Dow Chemical Company.
Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T
2016-09-12
At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., prepare, process, and evaluate samples in parallel), and miniaturization (i.e., reduce sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, surface defects, among others have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.
Tiersch, Terrence R.; Yang, Huiping; Hu, E.
2011-01-01
With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs for high-throughput technology for model fishes, proposes approaches for technical development, and outlines future directions of this approach. PMID:21440666
Microfluidic Bead Suspension Hopper
2014-01-01
Many high-throughput analytical platforms, from next-generation DNA sequencing to drug discovery, rely on beads as carriers of molecular diversity. Microfluidic systems are ideally suited to handle and analyze such bead libraries with high precision and at minute volume scales; however, the challenge of introducing bead suspensions into devices before they sediment usually confounds microfluidic handling and analysis. We developed a bead suspension hopper that exploits sedimentation to load beads into a microfluidic droplet generator. A suspension hopper continuously delivered synthesis resin beads (17 μm diameter, 112,000 over 2.67 h) functionalized with a photolabile linker and pepstatin A into picoliter-scale droplets of an HIV-1 protease activity assay to model ultraminiaturized compound screening. Likewise, trypsinogen template DNA-coated magnetic beads (2.8 μm diameter, 176,000 over 5.5 h) were loaded into droplets of an in vitro transcription/translation system to model a protein evolution experiment. The suspension hopper should effectively remove any barriers to using suspensions as sample inputs, paving the way for microfluidic automation to replace robotic library distribution. PMID:24761972
Spatial distribution of traffic in a cellular mobile data network
NASA Astrophysics Data System (ADS)
Linnartz, J. P. M. G.
1987-02-01
The use of integral transforms of the probability density function for the received power to analyze the relation between the spatial distributions of offered and throughput packet traffic in a mobile radio network with Rayleigh fading channels and ALOHA multiple access was assessed. A method to obtain the spatial distribution of throughput traffic from a prescribed spatial distribution of offered traffic is presented. Incoherent and coherent addition of interference signals is considered. The channel behavior for heavy traffic loads is studied. In both the incoherent and coherent case, the spatial distribution of offered traffic required to ensure a prescribed spatially uniform throughput is synthesized numerically.
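The capture behaviour underlying this kind of analysis can be illustrated with a short Monte Carlo estimate; the exponential power model below (Rayleigh fading, incoherent power addition) and the 4:1 capture ratio are assumptions made for illustration, not the paper's transform-based derivation.

```python
# Illustrative Monte Carlo only: probability that a test packet is captured when
# n interferers transmit simultaneously, assuming Rayleigh fading (exponentially
# distributed received powers) and incoherent addition of interference.
import numpy as np

rng = np.random.default_rng(0)

def capture_probability(mean_test_power, mean_interferer_power, n_interferers,
                        capture_ratio=4.0, trials=100_000):
    p0 = rng.exponential(mean_test_power, trials)
    interference = rng.exponential(mean_interferer_power,
                                   (trials, n_interferers)).sum(axis=1)
    return np.mean(p0 > capture_ratio * interference)

# Example: test packet 6 dB stronger on average than each of two interferers
print(capture_probability(4.0, 1.0, 2))
```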
Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich
2006-01-01
The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be quite comparable with other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have speeded up our drug discovery process greatly.
High throughput vacuum chemical epitaxy
NASA Astrophysics Data System (ADS)
Fraas, L. M.; Malocsay, E.; Sundaram, V.; Baird, R. W.; Mao, B. Y.; Lee, G. Y.
1990-10-01
We have developed a vacuum chemical epitaxy (VCE) reactor which avoids the use of arsine and allows multiple wafers to be coated at one time. Our vacuum chemical epitaxy reactor closely resembles a molecular beam epitaxy system in that wafers are loaded into a stainless steel vacuum chamber through a load chamber. Also as in MBE, arsenic vapors are supplied as reactant by heating solid arsenic sources, thereby avoiding the use of arsine. However, in our VCE reactor, a large number of wafers are coated at one time in a vacuum system by the substitution of Group III alkyl sources for the elemental metal sources traditionally used in MBE. Higher wafer throughput results because in VCE, the metal-alkyl sources for Ga, Al, and dopants can be mixed at room temperature and distributed uniformly through a large-area injector to multiple substrates as a homogeneous array of mixed element molecular beams. The VCE reactor that we have built and that we shall describe here uniformly deposits films on 7-inch-diameter substrate platters. Each platter contains seven 2-inch or three 3-inch diameter wafers. The load chamber contains up to nine platters. The vacuum chamber is equipped with two VCE growth zones and two arsenic ovens, one per growth zone. Finally, each oven has a 1 kg arsenic capacity. As of this writing, mirror-smooth GaAs films have been grown at up to 4 μm/h growth rate on multiple wafers with good thickness uniformity. The background doping is p-type with a typical hole concentration and mobility of 1 × 10^16/cm^3 and 350 cm^2/V·s. This background doping level is low enough for the fabrication of MESFETs, solar cells, and photocathodes as well as other types of devices. We have fabricated MESFET devices using VCE-grown epi wafers with peak extrinsic transconductance as high as 210 mS/mm for a threshold voltage of -3 V and a 0.6 μm gate length. We have also recently grown AlGaAs epi layers with up to 80% aluminum using TEAl as the aluminum alkyl source. The AlGaAs layer thickness and aluminum content uniformity appear excellent.
The Effect of Polymeric Nanoparticles on Biocompatibility of Carrier Red Blood Cells
Pan, Daniel; Vargas-Morales, Omayra; Zern, Blaine; Anselmo, Aaron C.; Gupta, Vivek; Zakrewsky, Michael; Mitragotri, Samir; Muzykantov, Vladimir
2016-01-01
Red blood cells (RBCs) can be used for vascular delivery of encapsulated or surface-bound drugs and carriers. Coupling to RBC prolongs circulation of nanoparticles (NP, 200 nm spheres, a conventional model of polymeric drug delivery carrier) enabling their transfer to the pulmonary vasculature without provoking overt RBC elimination. However, little is known about more subtle and potentially harmful effects of drugs and drug carriers on RBCs. Here we devised high-throughput in vitro assays to determine the sensitivity of loaded RBCs to osmotic stress and other damaging insults that they may encounter in vivo (e.g. mechanical, oxidative and complement insults). Sensitivity of these tests is inversely proportional to RBC concentration in suspension and our results suggest that mouse RBCs are more sensitive to damaging factors than human RBCs. Loading RBCs by NP at 1:50 ratio did not affect RBCs, while 10–50 fold higher NP load accentuated RBC damage by mechanical, osmotic and oxidative stress. This extensive loading of RBC by NP also leads to RBCs agglutination in buffer; however, addition of albumin diminished this effect. These results provide a template for analyses of the effects of diverse cargoes loaded on carrier RBCs and indicate that: i) RBCs can tolerate carriage of NP at doses providing loading of millions of nanoparticles per microliter of blood; ii) tests using protein-free buffers and mouse RBCs may overestimate adversity that may be encountered in humans. PMID:27003833
Tsiliyannis, Christos Aristeides
2013-09-01
Hazardous waste incinerators (HWIs) differ substantially from thermal power facilities, since instead of maximizing energy production with the minimum amount of fuel, they aim at maximizing throughput. Variations in quantity or composition of received waste loads may significantly diminish HWI throughput (the decisive profit factor), from its nominal design value. A novel formulation of combustion balance is presented, based on linear operators, which isolates the wastefeed vector from the invariant combustion stoichiometry kernel. Explicit expressions for the throughput are obtained, in terms of incinerator temperature, fluegas heat recuperation ratio and design parameters, for an arbitrary number of wastes, based on fundamental principles (mass and enthalpy balances). The impact of waste variations, of recuperation ratio and of furnace temperature is explicitly determined. It is shown that in the presence of waste uncertainty, the throughput may be a decreasing or increasing function of incinerator temperature and recuperation ratio, depending on the sign of a dimensionless parameter related only to the uncertain wastes. The dimensionless parameter is proposed as a sharp a' priori waste 'fingerprint', determining the necessary increase or decrease of manipulated variables (recuperation ratio, excess air, auxiliary fuel feed rate, auxiliary air flow) in order to balance the HWI and maximize throughput under uncertainty in received wastes. A 10-step procedure is proposed for direct application subject to process capacity constraints. The results may be useful for efficient HWI operation and for preparing hazardous waste blends. Copyright © 2013 Elsevier Ltd. All rights reserved.
TCP Throughput Profiles Using Measurements over Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata
Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
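A minimal numerical sketch of the ramp-up/sustainment abstraction is given below; the doubling-per-RTT ramp and the parameter values are assumed for illustration rather than taken from the measured profiles, but they reproduce the qualitative point that well-sustained throughput decays only slowly with RTT for large transfers.

```python
# Assumed functional form (not the paper's fitted model): slow-start doubles the
# sending rate each RTT until the peak rate is reached, then the transfer is
# sustained at the peak rate.
def mean_throughput_gbps(transfer_gbit, rtt_s, peak_gbps, init_window_gbit=8e-5):
    sent, elapsed = 0.0, 0.0
    rate = init_window_gbit / rtt_s                     # initial slow-start rate
    while sent < transfer_gbit and rate < peak_gbps:    # ramp-up phase
        sent += rate * rtt_s
        elapsed += rtt_s
        rate = min(2.0 * rate, peak_gbps)               # window doubles every RTT
    if sent < transfer_gbit:                            # sustainment phase
        elapsed += (transfer_gbit - sent) / peak_gbps
    return transfer_gbit / elapsed

for rtt_ms in (1, 50, 150, 366):                        # RTT range from the abstract
    print(rtt_ms, "ms:", round(mean_throughput_gbps(800.0, rtt_ms / 1000.0, 10.0), 2), "Gb/s")
```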
High throughput toxicology programs, such as ToxCast and Tox21, have provided biological effects data for thousands of chemicals at multiple concentrations. Compared to traditional, whole-organism approaches, high throughput assays are rapid and cost-effective, yet they generall...
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
NASA Astrophysics Data System (ADS)
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.
NASA Astrophysics Data System (ADS)
He, Yang; Geng, Yanquan; Yan, Yongda; Luo, Xichun
2017-09-01
We show that an atomic force microscope (AFM) tip-based dynamic plowing lithography (DPL) approach can be used to fabricate nanoscale pits with high throughput. The method relies on scratching with a relatively large speed over a sample surface in tapping mode, which is responsible for the separation distance of adjacent pits. Scratching tests are carried out on a poly(methyl methacrylate) (PMMA) thin film using a diamond-like carbon coating tip. Results show that 100 μm/s is the critical value of the scratching speed. When the scratching speed is greater than 100 μm/s, pit structures can be generated. In contrast, nanogrooves can be formed with speeds less than the critical value. Because of the difficulty of breaking the molecular chain of glass-state polymer with an applied high-frequency load and low-energy dissipation in one interaction of the tip and the sample, one pit requires 65-80 penetrations to be achieved. Subsequently, the forming process of the pit is analyzed in detail, including three phases: elastic deformation, plastic deformation, and climbing over the pile-up. In particular, 4800-5800 pits can be obtained in 1 s using this proposed method. Both experiments and theoretical analysis are presented that fully determine the potential of this proposed method to fabricate pits efficiently.
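The quoted pit rate follows directly from the tapping dynamics; a back-of-envelope check is sketched below, where the drive frequency is an assumed typical value (it is not stated in the abstract) and only the penetrations-per-pit and scratching speed come from the text.

```python
# Back-of-envelope consistency check; tapping_frequency_hz is an assumption.
tapping_frequency_hz = 350e3          # assumed typical tapping-mode drive frequency
penetrations_per_pit = 70             # mid-range of the reported 65-80
scratch_speed_um_s = 100.0            # the reported critical scratching speed

pit_rate = tapping_frequency_hz / penetrations_per_pit   # pits formed per second
pit_spacing_nm = scratch_speed_um_s / pit_rate * 1000     # separation of adjacent pits

print(round(pit_rate), "pits/s")      # ~5000, consistent with the reported 4800-5800
print(round(pit_spacing_nm), "nm")    # ~20 nm spacing at the critical speed
```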
He, Yang; Geng, Yanquan; Yan, Yongda; Luo, Xichun
2017-09-22
We show that an atomic force microscope (AFM) tip-based dynamic plowing lithography (DPL) approach can be used to fabricate nanoscale pits with high throughput. The method relies on scratching with a relatively large speed over a sample surface in tapping mode, which is responsible for the separation distance of adjacent pits. Scratching tests are carried out on a poly(methyl methacrylate) (PMMA) thin film using a diamond-like carbon coating tip. Results show that 100 μm/s is the critical value of the scratching speed. When the scratching speed is greater than 100 μm/s, pit structures can be generated. In contrast, nanogrooves can be formed with speeds less than the critical value. Because of the difficulty of breaking the molecular chain of glass-state polymer with an applied high-frequency load and low-energy dissipation in one interaction of the tip and the sample, one pit requires 65-80 penetrations to be achieved. Subsequently, the forming process of the pit is analyzed in detail, including three phases: elastic deformation, plastic deformation, and climbing over the pile-up. In particular, 4800-5800 pits can be obtained in 1 s using this proposed method. Both experiments and theoretical analysis are presented that fully determine the potential of this proposed method to fabricate pits efficiently.
Cache-Oblivious parallel SIMD Viterbi decoding for sequence search in HMMER
2014-01-01
Background HMMER is a commonly used bioinformatics tool based on Hidden Markov Models (HMMs) to analyze and process biological sequences. One of its main homology engines is based on the Viterbi decoding algorithm, which was already highly parallelized and optimized using Farrar’s striped processing pattern with Intel SSE2 instruction set extension. Results A new SIMD vectorization of the Viterbi decoding algorithm is proposed, based on an SSE2 inter-task parallelization approach similar to the DNA alignment algorithm proposed by Rognes. Besides this alternative vectorization scheme, the proposed implementation also introduces a new partitioning of the Markov model that allows a significantly more efficient exploitation of the cache locality. Such optimization, together with an improved loading of the emission scores, allows the achievement of a constant processing throughput, regardless of the innermost-cache size and of the dimension of the considered model. Conclusions The proposed optimized vectorization of the Viterbi decoding algorithm was extensively evaluated and compared with the HMMER3 decoder to process DNA and protein datasets, proving to be a rather competitive alternative implementation. Being always faster than the already highly optimized ViterbiFilter implementation of HMMER3, the proposed Cache-Oblivious Parallel SIMD Viterbi (COPS) implementation provides a constant throughput and offers a processing speedup as high as two times faster, depending on the model’s size. PMID:24884826
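For orientation, a plain scalar log-space Viterbi decoder is sketched below as a reference point for the vectorized kernel discussed above; it is an illustrative implementation over a toy HMM, not HMMER or COPS code.

```python
# Log-space Viterbi decoding for a small HMM (illustrative reference only).
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """obs: sequence of symbol indices; matrices are already in log space."""
    n_states = log_start.shape[0]
    T = len(obs)
    score = np.full((T, n_states), -np.inf)
    back = np.zeros((T, n_states), dtype=int)
    score[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        cand = score[t - 1][:, None] + log_trans          # scores via each predecessor
        back[t] = np.argmax(cand, axis=0)                 # best predecessor per state
        score[t] = cand[back[t], np.arange(n_states)] + log_emit[:, obs[t]]
    path = [int(np.argmax(score[-1]))]
    for t in range(T - 1, 0, -1):                         # backtrack
        path.append(int(back[t, path[-1]]))
    return list(reversed(path)), float(np.max(score[-1]))

# Tiny two-state, three-symbol example
log_start = np.log([0.6, 0.4])
log_trans = np.log([[0.7, 0.3], [0.4, 0.6]])
log_emit  = np.log([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2, 2], log_start, log_trans, log_emit))
```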
[Current applications of high-throughput DNA sequencing technology in antibody drug research].
Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong
2012-03-01
Since the publication in 2005 of a high-throughput DNA sequencing technology based on PCR reactions carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.
Routing optimization in networks based on traffic gravitational field model
NASA Astrophysics Data System (ADS)
Liu, Longgeng; Luo, Guangchun
2017-04-01
For research on the gravitational field routing mechanism on complex networks, we further analyze the gravitational effect of paths. In this study, we introduce the concept of path confidence degree to evaluate the unblocked reliability of paths, which takes the traffic state of all nodes on the path into account as a whole. On this basis, we propose an improved gravitational field routing protocol considering all the nodes' gravities on the path and the path confidence degree. In order to evaluate the transmission performance of the routing strategy, an order parameter is introduced to measure the network throughput by the critical value of the phase transition from a free-flow phase to a jammed phase, and the betweenness centrality is used to evaluate the transmission performance and traffic congestion of the network. Simulation results show that compared with the shortest-path routing strategy and the previous gravitational field routing strategy, the proposed algorithm improves the network throughput considerably and effectively balances the traffic load within the network, with all nodes in the network utilized highly efficiently. As long as γ ≥ α, the transmission performance reaches its maximum and remains unchanged for different α and γ, which ensures that the proposed routing protocol is highly efficient and stable.
FPGA Implementation of Stereo Disparity with High Throughput for Mobility Applications
NASA Technical Reports Server (NTRS)
Villalpando, Carlos Y.; Morfopolous, Arin; Matthies, Larry; Goldberg, Steven
2011-01-01
High speed stereo vision can allow unmanned robotic systems to navigate safely in unstructured terrain, but the computational cost can exceed the capacity of typical embedded CPUs. In this paper, we describe an end-to-end stereo computation co-processing system optimized for fast throughput that has been implemented on a single Virtex 4 LX160 FPGA. This system is capable of operating on images from a 1024 x 768 3CCD (true RGB) camera pair at 15 Hz. Data enters the FPGA directly from the cameras via Camera Link and is rectified, pre-filtered and converted into a disparity image all within the FPGA, incurring no CPU load. Once complete, a rectified image and the final disparity image are read out over the PCI bus, for a bandwidth cost of 68 MB/sec. Within the FPGA there are 4 distinct algorithms: Camera Link capture, bilinear rectification, bilateral subtraction pre-filtering and the Sum of Absolute Differences (SAD) disparity. Each module will be described in brief along with the data flow and control logic for the system. The system was successfully fielded on Carnegie Mellon University's National Robotics Engineering Center (NREC) Crusher system during extensive field trials in 2007 and 2008 and is being implemented for other surface mobility systems at JPL.
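The final SAD stage can be prototyped in a few lines of NumPy as a software reference for the FPGA module; the window size and disparity range below are illustrative choices, not the values used on the flight hardware.

```python
# Software reference for SAD block matching (illustrative parameters only).
import numpy as np
from scipy.ndimage import uniform_filter

def sad_disparity(left, right, max_disp=64, win=7):
    """left/right: rectified grayscale images as float arrays of equal shape."""
    h, w = left.shape
    best_cost = np.full((h, w), np.inf)
    disparity = np.zeros((h, w), dtype=np.uint8)
    for d in range(max_disp):
        shifted = np.zeros_like(right)
        shifted[:, d:] = right[:, :w - d]              # shift the right image by d pixels
        # windowed mean of absolute differences (proportional to SAD over the window)
        cost = uniform_filter(np.abs(left - shifted), size=win)
        better = cost < best_cost                      # winner-take-all over disparities
        best_cost[better] = cost[better]
        disparity[better] = d
    return disparity
```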
Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H
1997-01-01
The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min., about 30 times faster than manual solvent extraction or single cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (Ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.
High Throughput and Mechano-Active Platforms to Promote Cartilage Regeneration and Repair
NASA Astrophysics Data System (ADS)
Mohanraj, Bhavana
Traumatic joint injuries initiate acute degenerative changes in articular cartilage that can lead to progressive loss of load-bearing function. As a result, patients often develop post-traumatic osteoarthritis (PTOA), a condition for which there currently exists no biologic interventions. To address this need, tissue engineering aims to mimic the structure and function of healthy, native counterparts. These constructs can be used to not only replace degenerated tissue, but also build in vitro, pre-clinical models of disease. Towards this latter goal, this thesis focuses on the design of a high throughput system to screen new therapeutics in a micro-engineered model of PTOA, and the development of a mechanically-responsive drug delivery system to augment tissue-engineered approaches for cartilage repair. High throughput screening is a powerful tool for drug discovery that can be adapted to include 3D tissue constructs. To facilitate this process for cartilage repair, we built a high throughput mechanical injury platform to create an engineered cartilage model of PTOA. Compressive injury of functionally mature constructs increased cell death and proteoglycan loss, two hallmarks of injury observed in vivo. Comparison of this response to that of native cartilage explants, and evaluation of putative therapeutics, validated this model for subsequent use in small molecule screens. A primary screen of 118 compounds identified a number of 'hits' and relevant pathways that may modulate pathologic signaling post-injury. To complement this process of therapeutic discovery, a stimuli-responsive delivery system was designed that used mechanical inputs as the 'trigger' mechanism for controlled release. The failure thresholds of these mechanically-activated microcapsules (MAMCs) were influenced by physical properties and composition, as well as matrix mechanical properties in 3D environments. TGF-beta released from the system upon mechano-activation stimulated stem cell chondrogenesis, demonstrating the potential of MAMCs to actively deliver therapeutics within demanding mechanical environments. Taken together, this work advances our capacity to identify and deliver new compounds of clinical relevance to modulate disease progression following traumatic injury using state-of-the-art micro-engineered screening tools and a novel mechanically-activated delivery system. These platforms advance strategies for cartilage repair and regeneration in PTOA and provide new options for the treatment of this debilitating condition.
Lessons from high-throughput protein crystallization screening: 10 years of practical experience
JR, Luft; EH, Snell; GT, DeTitta
2011-01-01
Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073
High-throughput screening based on label-free detection of small molecule microarrays
NASA Astrophysics Data System (ADS)
Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong
2017-02-01
Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput platform for preliminary drug screening that relies on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this platform, the OI-RD scanner serves as a label-free detection instrument that is able to monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening, with tens of thousands of different compounds immobilized on a single phenyl-isocyanate-functionalized glass slide. Using this platform, we sequentially screened five target proteins (purified target proteins or cell lysates containing the target protein) in a high-throughput and label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by follow-up functional assays. Compared with traditional high-throughput screening assays, this platform has many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening, giving it great potential as a complementary screening platform in the field of drug discovery.
High-throughput analysis of yeast replicative aging using a microfluidic system
Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong
2015-01-01
Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cell, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317
Data Partitioning and Load Balancing in Parallel Disk Systems
NASA Technical Reports Server (NTRS)
Scheuermann, Peter; Weikum, Gerhard; Zabback, Peter
1997-01-01
Parallel disk systems provide opportunities for exploiting I/O parallelism in two possible ways, namely via inter-request and intra-request parallelism. In this paper we discuss the main issues in performance tuning of such systems, namely striping and load balancing, and show their relationship to response time and throughput. We outline the main components of an intelligent, self-reliant file system that aims to optimize striping by taking into account the requirements of the applications and performs load balancing by judicious file allocation and dynamic redistributions of the data when access patterns change. Our system uses simple but effective heuristics that incur only little overhead. We present performance experiments based on synthetic workloads and real-life traces.
COLA: Optimizing Stream Processing Applications via Graph Partitioning
NASA Astrophysics Data System (ADS)
Khandekar, Rohit; Hildrum, Kirsten; Parekh, Sujay; Rajan, Deepak; Wolf, Joel; Wu, Kun-Lung; Andrade, Henrique; Gedik, Buğra
In this paper, we describe an optimization scheme for fusing compile-time operators into reasonably-sized run-time software units called processing elements (PEs). Such PEs are the basic deployable units in System S, a highly scalable distributed stream processing middleware system. Finding a high quality fusion significantly benefits the performance of streaming jobs. In order to maximize throughput, our solution approach attempts to minimize the processing cost associated with inter-PE stream traffic while simultaneously balancing load across the processing hosts. Our algorithm computes a hierarchical partitioning of the operator graph based on a minimum-ratio cut subroutine. We also incorporate several fusion constraints in order to support real-world System S jobs. We experimentally compare our algorithm with several other reasonable alternative schemes, highlighting the effectiveness of our approach.
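The flavor of the problem can be reproduced with off-the-shelf tools: the sketch below bisects a small, made-up operator graph with networkx's Kernighan-Lin heuristic to minimize cut stream traffic under a balance constraint. This is only an illustration; COLA's hierarchical minimum-ratio-cut algorithm and its fusion constraints are not implemented here.

```python
# Balanced min-cut bisection of a toy operator graph (edge weights = assumed
# stream traffic between operators); illustrative stand-in, not the COLA scheme.
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

G = nx.Graph()
G.add_weighted_edges_from([
    ("src", "parse", 10), ("parse", "filter", 8), ("filter", "join", 6),
    ("parse", "aggregate", 2), ("aggregate", "sink", 1), ("join", "sink", 5),
])

pe_a, pe_b = kernighan_lin_bisection(G, weight="weight", seed=0)
cut_traffic = sum(d["weight"] for u, v, d in G.edges(data=True)
                  if (u in pe_a) != (v in pe_a))
print(sorted(pe_a), sorted(pe_b), "cut traffic:", cut_traffic)
```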
Erickson, Heidi S
2012-09-28
The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.
Multiscale peak detection in wavelet space.
Zhang, Zhi-Min; Tong, Xia; Peng, Ying; Ma, Pan; Zhang, Ming-Jin; Lu, Hong-Mei; Chen, Xiao-Qing; Liang, Yi-Zeng
2015-12-07
Accurate peak detection is essential for analyzing high-throughput datasets generated by analytical instruments. Derivatives with noise reduction and matched filtration are frequently used, but they are sensitive to baseline variations, random noise and deviations in the peak shape. A continuous wavelet transform (CWT)-based method is more practical and popular in this situation, as it can increase accuracy and reliability by identifying peaks across scales in wavelet space and implicitly removing noise as well as the baseline. However, its computational load is relatively high and the estimated features of peaks may not be accurate in the case of peaks that are overlapping, dense or weak. In this study, we present multi-scale peak detection (MSPD), which takes full advantage of additional information in wavelet space including ridges, valleys, and zero-crossings. It can achieve high accuracy by thresholding each detected peak with the maximum of its ridge. It has been comprehensively evaluated with MALDI-TOF spectra in proteomics, the CAMDA 2006 SELDI dataset, and the Romanian database of Raman spectra, showing that it is particularly suitable for detecting peaks in high-throughput analytical signals. Receiver operating characteristic (ROC) curves show that MSPD can detect more true peaks while keeping the false discovery rate lower than the MassSpecWavelet and MALDIquant methods. Superior results on Raman spectra suggest that MSPD is a more universal method for peak detection. MSPD has been designed and implemented efficiently in Python and Cython and is available as an open-source package.
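A minimal CWT-based peak-picking example using SciPy is shown below for orientation; it is not the MSPD implementation, which additionally exploits ridges, valleys and zero-crossings and thresholds each peak by its ridge maximum.

```python
# Basic CWT peak picking with SciPy on a synthetic, noisy, baseline-shifted signal.
import numpy as np
from scipy.signal import find_peaks_cwt

x = np.linspace(0, 10, 2000)
signal = (np.exp(-(x - 3) ** 2 / 0.01)                    # narrow peak
          + 0.6 * np.exp(-(x - 7) ** 2 / 0.04)            # broader, weaker peak
          + 0.05 * np.random.default_rng(1).normal(size=x.size)   # noise
          + 0.2 * x / 10)                                  # sloping baseline

peak_idx = find_peaks_cwt(signal, widths=np.arange(5, 60))
print(x[peak_idx])   # approximate peak positions despite noise and baseline drift
```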
Establishment of an AAV Reverse Infection-Based Array
Wang, Gang; Dong, Zheyue; Shen, Wei; Zheng, Gang; Wu, Xiaobing; Xue, Jinglun; Wang, Yue; Chen, Jinzhong
2010-01-01
Background The development of a convenient high-throughput gene transduction approach is critical for biological screening. Adeno-associated virus (AAV) vectors are broadly used in gene therapy studies, yet their applications in in vitro high-throughput gene transduction are limited. Principal Findings We established an AAV reverse infection (RI)-based method in which cells were transduced by quantified recombinant AAVs (rAAVs) pre-coated onto 96-well plates. The number of pre-coated rAAV particles and number of cells loaded per well, as well as the temperature stability of the rAAVs on the plates, were evaluated. As the first application of this method, six serotypes or hybrid serotypes of rAAVs (AAV1, AAV2, AAV5/5, AAV8, AAV25 m, AAV28 m) were compared for their transduction efficiencies using various cell lines, including BHK21, HEK293, BEAS-2BS, HeLaS3, Huh7, Hepa1-6, and A549. AAV2 and AAV1 displayed high transduction efficiency; thus, they were deemed to be suitable candidate vectors for the RI-based array. We next evaluated the impact of sodium butyrate (NaB) treatment on rAAV vector-mediated reporter gene expression and found it was significantly enhanced, suggesting that our system reflected the biological response of target cells to specific treatments. Conclusions/Significance Our study provides a novel method for establishing a highly efficient gene transduction array that may be developed into a platform for cell biological assays. PMID:20976058
Code of Federal Regulations, 2010 CFR
2010-07-01
40 CFR Part 63, Subpart EEEE, Table 9—Continuous Compliance With Operating Limits—High Throughput Transfer Racks. As stated in §§ 63.2378(a) and (b...
Mascarenhas, Roshan; Pietrzak, Maciej; Smith, Ryan M; Webb, Amy; Wang, Danxin; Papp, Audrey C; Pinsonneault, Julia K; Seweryn, Michal; Rempala, Grzegorz; Sadee, Wolfgang
2015-01-01
mRNA translation into proteins is highly regulated, but the role of mRNA isoforms, noncoding RNAs (ncRNAs), and genetic variants remains poorly understood. mRNA levels on polysomes have been shown to correlate well with expressed protein levels, pointing to polysomal loading as a critical factor. To study regulation and genetic factors of protein translation we measured levels and allelic ratios of mRNAs and ncRNAs (including microRNAs) in lymphoblast cell lines (LCL) and in polysomal fractions. We first used targeted assays to measure polysomal loading of mRNA alleles, confirming reported genetic effects on translation of OPRM1 and NAT1, and detecting no effect of rs1045642 (3435C>T) in ABCB1 (MDR1) on polysomal loading while supporting previous results showing increased mRNA turnover of the 3435T allele. Use of high-throughput sequencing of complete transcript profiles (RNA-Seq) in three LCLs revealed significant differences in polysomal loading of individual RNA classes and isoforms. Correlated polysomal distribution between protein-coding and non-coding RNAs suggests interactions between them. Allele-selective polysome recruitment revealed strong genetic influence for multiple RNAs, attributable either to differential expression of RNA isoforms or to differential loading onto polysomes, the latter defining a direct genetic effect on translation. Genes identified by different allelic RNA ratios between cytosol and polysomes were enriched with published expression quantitative trait loci (eQTLs) affecting RNA functions, and associations with clinical phenotypes. Polysomal RNA-Seq combined with allelic ratio analysis provides a powerful approach to study polysomal RNA recruitment and regulatory variants affecting protein translation.
Accelerating the design of solar thermal fuel materials through high throughput simulations.
Liu, Yun; Grossman, Jeffrey C
2014-12-10
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
Usefulness of Compile-Time Restructuring of LGDF Programs in Throughput- Critical Applications
1993-09-01
efficiency of the buffers. Much overhead can be reduced effectively by using the node and arc attributes of the data flow graph at compile-time to ... intolerable delays and insufficient buffer space, especially under high loads. A. THESIS SCOPE AND CONTRIBUTION. The focus of this work is on compile-time
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (Rdark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of Rdark is essential for agronomic and ecological studies. However, the methods currently used to measure Rdark in plant tissues are typically low throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry to determine the accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of Rdark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of Rdark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant Rdark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute Rdark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided values of Rdark similar to those from the IRGA instrument most commonly employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of Rdark on multiple samples simultaneously, irrespective of plant or tissue type.
Non-Covalent Functionalization of Carbon Nanovectors with an Antibody Enables Targeted Drug Delivery
Berlin, Jacob M.; Pham, Tam T.; Sano, Daisuke; Mohamedali, Khalid A.; Marcano, Daniela C.; Myers, Jeffrey N.; Tour, James M.
2011-01-01
Current chemotherapeutics are characterized by efficient tumor cell-killing and severe side effects mostly derived from off-target toxicity. Hence, targeted delivery of these drugs to tumor cells is actively sought. We previously demonstrated that poly(ethylene glycol)-functionalized carbon nanovectors are able to sequester paclitaxel, a widely used hydrophobic cancer drug, by simple physisorption and deliver the drug for killing of cancer cells. Cell killing achieved with these drug-loaded carbon nanoparticles was equivalent to that achieved with a commercial formulation of paclitaxel. Here we show that by further mixing the drug-loaded nanoparticles with Cetuximab, a monoclonal antibody that recognizes the epidermal growth factor receptor (EGFR), paclitaxel is preferentially targeted to EGFR+ tumor cells in vitro. This supports progressing to in vivo studies. Moreover, the construct is unusual in that all three components are assembled through non-covalent interactions. Such non-covalent assembly could enable high-throughput screening of drug/antibody combinations. PMID:21736358
Xu, Rui; Yang, Zhao-Hui; Zheng, Yue; Liu, Jian-Bo; Xiong, Wei-Ping; Zhang, Yan-Ru; Lu, Yue; Xue, Wen-Jing; Fan, Chang-Zheng
2018-04-22
Understanding of how anaerobic digestion (AD)-related microbiomes are shaped by operational parameters, or by their interactions within the biochemical process, is limited. Using high-throughput sequencing and molecular ecological network analysis, this study shows the succession of an AD-related microbiome hosting diverse members of the phyla Actinobacteria, Bacteroidetes, Euryarchaeota, and Firmicutes, which were affected by organic loading rate (OLR) and hydraulic retention time (HRT). OLR formed finer microbial network modules than HRT (12 vs. 6), suggesting further subdivision of functional components. Biomarkers were also identified in the OLR and HRT groups (e.g. the families Actinomycetaceae, Methanosaetaceae and Aminiphilaceae). The largest number of pairwise links, between Firmicutes and biogas production, indicates that keystone members identified from network features can be considered markers for the regulation of AD. A set comprising 40% of species (the "core microbiome") was shared across the different digesters. Such a noteworthy overlap of microbiomes indicates that these members are generalists in maintaining the ecological stability of digesters. Copyright © 2018 Elsevier Ltd. All rights reserved.
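The network step described above can be sketched as follows: build a correlation network over taxon abundances and partition it into modules. This is a generic illustration using networkx modularity communities, not the molecular ecological network analysis pipeline used in the study; the abundance table and correlation cutoff are synthetic assumptions.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
# Hypothetical OTU abundance table: 20 taxa (rows) x 12 digester samples (columns).
abundance = rng.poisson(lam=50, size=(20, 12)).astype(float)

# Pearson correlation between taxa; keep strong associations as network edges.
corr = np.corrcoef(abundance)
cutoff = 0.5
graph = nx.Graph()
for i in range(corr.shape[0]):
    for j in range(i + 1, corr.shape[0]):
        if abs(corr[i, j]) >= cutoff:
            graph.add_edge(i, j, weight=abs(corr[i, j]))

# "Modules" (densely connected functional components) via modularity maximization.
modules = list(greedy_modularity_communities(graph)) if graph.number_of_edges() else []
print(f"{graph.number_of_edges()} edges grouped into {len(modules)} modules")
```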
Asati, Atul; Kachurina, Olga; Kachurin, Anatoly
2012-01-01
Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bioplex/Luminex platform. In this report, we demonstrate that the ganglioside high throughput multiplexing tool is robust, highly specific, and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. The influenza beads provided the added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for the assay of both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders. PMID:22952605
High-throughput sample adaptive offset hardware architecture for high-efficiency video coding
NASA Astrophysics Data System (ADS)
Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin
2018-03-01
A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding (HEVC) standard is presented. First, an implementation-friendly, simplified bitrate estimation method for rate-distortion cost calculation is proposed to reduce the computational complexity in the mode decision of SAO. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filters architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filters architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
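As background for the SAO filter discussed above, a software sketch of edge-offset classification on a 1-D line of reconstructed samples is shown below; the offsets are hypothetical, and the sketch illustrates the general SAO edge-offset idea rather than the paper's hardware architecture or its bitrate estimation method.

```python
import numpy as np

def sao_edge_offset_1d(samples, offsets):
    """Classify each interior sample against its two neighbours along one direction
    and add the offset for its edge category (category 0 = no edge is left untouched).
    A software sketch of the SAO edge-offset idea; offsets are encoder-chosen values."""
    def sign(x):
        return (x > 0) - (x < 0)

    out = samples.astype(np.int32).copy()
    for i in range(1, len(samples) - 1):
        a, c, b = int(samples[i - 1]), int(samples[i]), int(samples[i + 1])
        s = sign(c - a) + sign(c - b)
        category = {-2: 1, -1: 2, 0: 0, 1: 3, 2: 4}[s]   # 1 = local valley ... 4 = local peak
        if category:
            out[i] = np.clip(c + offsets[category], 0, 255)
    return out

reconstructed = np.array([100, 98, 104, 104, 99, 101, 103], dtype=np.uint8)
offsets = {1: 2, 2: 1, 3: -1, 4: -2}   # hypothetical offsets smoothing valleys and peaks
print(sao_edge_offset_1d(reconstructed, offsets))
```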
Magnetite-doped polydimethylsiloxane (PDMS) for phosphopeptide enrichment.
Sandison, Mairi E; Jensen, K Tveen; Gesellchen, F; Cooper, J M; Pitt, A R
2014-10-07
Reversible phosphorylation plays a key role in numerous biological processes. Mass spectrometry-based approaches are commonly used to analyze protein phosphorylation, but such analysis is challenging, largely due to the low phosphorylation stoichiometry. Hence, a number of phosphopeptide enrichment strategies have been developed, including metal oxide affinity chromatography (MOAC). Here, we describe a new material for performing MOAC that employs a magnetite-doped polydimethylsiloxane (PDMS) suitable for the creation of microwell-array and microfluidic systems enabling low-volume, high-throughput analysis. Incubation time and sample loading were explored and optimized, demonstrating that the embedded magnetite is able to enrich phosphopeptides. This substrate-based approach is rapid, straightforward and suitable for simultaneously performing multiple low-volume enrichments.
Mandalakis, Manolis; Stravinskaitė, Austėja; Lagaria, Anna; Psarra, Stella; Polymenakou, Paraskevi
2017-07-01
Chlorophyll a (Chl a) is the predominant pigment in every photosynthesizing organism, including phytoplankton, and one of the most commonly measured water quality parameters. Various methods are available for Chl a analysis, but the majority of them are of limited throughput and require considerable effort and time from the operator. The present study describes a high-throughput, microplate-based fluorometric assay for rapid quantification of Chl a in phytoplankton extracts. Microplate sealing combined with ice cooling proved an effective means of diminishing solvent evaporation during sample loading and minimized the analytical errors involved in Chl a measurements with a fluorescence microplate reader. A set of operating parameters (settling time, detector gain, sample volume) were also optimized to further improve the intensity and reproducibility of the Chl a fluorescence signal. A quadratic regression model provided the best fit (r² = 0.9998) across the entire calibration range (0.05–240 pg μL⁻¹). The method offered excellent intra- and interday precision (%RSD 2.2 to 11.2%) and accuracy (% relative error −3.8 to 13.8%), while it presented particularly low limits of detection (0.044 pg μL⁻¹) and quantification (0.132 pg μL⁻¹). The present assay was successfully applied on marine phytoplankton extracts, and the overall results were consistent (average % relative error −14.8%) with Chl a concentrations (including divinyl Chl a) measured by high-performance liquid chromatography (HPLC). More importantly, the microplate-based method allowed the analysis of 96 samples/standards within a few minutes, instead of the hours or days needed when using a traditional cuvette-based fluorometer or an HPLC system. Graphical abstract: TChl a concentrations (i.e. sum of Chl a and divinyl Chl a, in ng L⁻¹) measured in seawater samples by HPLC and a fluorescence microplate reader.
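The quadratic calibration described above can be illustrated with a short sketch that fits signal = a·c² + b·c + d and inverts it for an unknown well; the standard concentrations and counts below are made up for illustration.

```python
import numpy as np

# Hypothetical calibration: Chl a standards (pg/uL) vs fluorescence counts.
conc = np.array([0.05, 0.5, 5, 25, 60, 120, 240])
signal = np.array([12, 110, 1050, 4900, 10800, 19500, 33000])

# Quadratic fit, signal = a*c**2 + b*c + d, as used for the full calibration range.
a, b, d = np.polyfit(conc, signal, deg=2)

def signal_to_conc(y):
    """Invert the calibration by solving a*c^2 + b*c + (d - y) = 0 and keeping
    the smallest non-negative real root (the one inside the calibration range)."""
    roots = np.roots([a, b, d - y])
    real = roots[np.isreal(roots)].real
    return float(real[real >= 0].min())

print(f"fit: a={a:.3g}, b={b:.3g}, d={d:.3g}")
print(f"unknown well at 7500 counts -> {signal_to_conc(7500):.1f} pg/uL")
```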
Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N
2017-08-01
To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple bacterial isolates. This will be a powerful experimental tool facilitating the study of bacterial invasion, drug resistance, and the development of new therapeutics. Copyright © 2017 Elsevier B.V. All rights reserved.
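A minimal sketch of the spot-detection step described above, using scikit-image thresholding and labeling on a synthetic GFP image; it is not the authors' software, and the smoothing, threshold and minimum-area settings are assumptions.

```python
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.measure import label, regionprops

def count_colonies(gfp_image, min_area_px=20):
    """Segment GFP-positive intracellular colonies in one field of view and return
    their count and area distribution (a sketch, not the published pipeline)."""
    smoothed = gaussian(gfp_image, sigma=1.0)      # suppress pixel noise
    mask = smoothed > threshold_otsu(smoothed)     # global intensity threshold
    labels = label(mask)                           # connected-component labeling
    areas = [r.area for r in regionprops(labels) if r.area >= min_area_px]
    return len(areas), areas

# Synthetic field: dark background with two bright "colonies".
field = np.zeros((256, 256), dtype=float)
field[40:52, 60:75] = 1.0
field[150:180, 150:170] = 0.8
n, areas = count_colonies(field)
print(f"{n} colonies, areas (px): {areas}")
```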
High throughput light absorber discovery, Part 1: An algorithm for automated Tauc analysis
Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.
2016-09-23
High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
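A simplified sketch of automated Tauc analysis for a direct-allowed transition: form (αhν)², slide a window to find the most linear rising segment, and extrapolate its fit to the energy axis. The synthetic spectrum and the window/scoring choices are assumptions; the published algorithm is more elaborate.

```python
import numpy as np

def tauc_band_gap(energy_eV, absorbance, exponent=2, window=15):
    """Estimate a direct-allowed band gap: fit straight lines to sliding windows of
    the Tauc quantity (alpha*h*nu)**exponent, keep the most linear steeply rising
    window, and extrapolate it to zero. A simplified sketch, not the published algorithm."""
    tauc = (absorbance * energy_eV) ** exponent
    best = None
    for i in range(len(energy_eV) - window):
        x, y = energy_eV[i:i + window], tauc[i:i + window]
        slope, intercept = np.polyfit(x, y, 1)
        if slope <= 0:
            continue
        r2 = np.corrcoef(x, y)[0, 1] ** 2
        score = r2 * slope                       # reward windows that are linear and steep
        if best is None or score > best[0]:
            best = (score, -intercept / slope)   # x-intercept = band gap estimate
    return best[1] if best else float("nan")

# Synthetic spectrum constructed so that (alpha*h*nu)^2 rises linearly above ~2.1 eV.
e = np.linspace(1.5, 3.5, 200)
alpha = np.sqrt(np.clip(e - 2.1, 0.0, None)) / e + 0.01
print(f"estimated direct band gap ≈ {tauc_band_gap(e, alpha):.2f} eV")
```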
2015-01-01
High-throughput production of nanoparticles (NPs) with controlled quality is critical for their clinical translation into effective nanomedicines for diagnostics and therapeutics. Here we report a simple and versatile coaxial turbulent jet mixer that can synthesize a variety of NPs at high throughput up to 3 kg/d, while maintaining the advantages of homogeneity, reproducibility, and tunability that are normally accessible only in specialized microscale mixing devices. The device fabrication does not require specialized machining and is easy to operate. As one example, we show reproducible, high-throughput formulation of siRNA-polyelectrolyte polyplex NPs that exhibit effective gene knockdown but exhibit significant dependence on batch size when formulated using conventional methods. The coaxial turbulent jet mixer can accelerate the development of nanomedicines by providing a robust and versatile platform for preparation of NPs at throughputs suitable for in vivo studies, clinical trials, and industrial-scale production. PMID:24824296
Lin, Lihua; Liu, Shengquan; Nie, Zhou; Chen, Yingzhuang; Lei, Chunyang; Wang, Zhen; Yin, Chao; Hu, Huiping; Huang, Yan; Yao, Shouzhuo
2015-04-21
Nowadays, large-scale screening for enzyme discovery, engineering, and drug discovery requires simple, fast, and sensitive enzyme activity assay platforms with high integration and potential for high-throughput detection. Herein, a novel automatic and integrated micro-enzyme assay (AIμEA) platform is proposed based on a unique microreaction system fabricated from an engineered green fluorescent protein (GFP)-functionalized monolithic capillary column, with thrombin as an example. The recombinant GFP probe was rationally engineered to possess a His-tag and a substrate sequence of thrombin, which enable it to be immobilized on the monolith via metal affinity binding and to be released after thrombin digestion. Combined with capillary electrophoresis-laser-induced fluorescence (CE-LIF), all the procedures, including thrombin injection, online enzymatic digestion in the microreaction system, and label-free detection of the released GFP, were integrated in a single electrophoretic process. By taking advantage of the ultrahigh loading capacity of the AIμEA platform and the CE automatic programming setup, one microreaction column was sufficient for many digestions without replacement. The novel microreaction system showed significantly enhanced catalytic efficiency, about 30-fold higher than that of the equivalent bulk reaction. Accordingly, the AIμEA platform was highly sensitive, with a limit of detection down to 1 pM of thrombin. Moreover, the AIμEA platform was robust and reliable for detecting thrombin in human serum samples and its inhibition by hirudin. Hence, this AIμEA platform exhibits great potential for high-throughput analysis in future biological applications, disease diagnostics, and drug screening.
Li, Fumin; Wang, Jun; Jenkins, Rand
2016-05-01
There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.
A scalable silicon photonic chip-scale optical switch for high performance computing systems.
Yu, Runxiang; Cheung, Stanley; Li, Yuliang; Okamoto, Katsunari; Proietti, Roberto; Yin, Yawei; Yoo, S J B
2013-12-30
This paper discusses the architecture and provides performance studies of a silicon photonic chip-scale optical switch for scalable interconnect network in high performance computing systems. The proposed switch exploits optical wavelength parallelism and wavelength routing characteristics of an Arrayed Waveguide Grating Router (AWGR) to allow contention resolution in the wavelength domain. Simulation results from a cycle-accurate network simulator indicate that, even with only two transmitter/receiver pairs per node, the switch exhibits lower end-to-end latency and higher throughput at high (>90%) input loads compared with electronic switches. On the device integration level, we propose to integrate all the components (ring modulators, photodetectors and AWGR) on a CMOS-compatible silicon photonic platform to ensure a compact, energy efficient and cost-effective device. We successfully demonstrate proof-of-concept routing functions on an 8 × 8 prototype fabricated using foundry services provided by OpSIS-IME.
High throughput system for magnetic manipulation of cells, polymers, and biomaterials
Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.
2008-01-01
In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of SR2+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357
Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho
2017-01-01
Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.
Identification of functional modules using network topology and high-throughput data.
Ulitsky, Igor; Shamir, Ron
2007-01-26
With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in the analysis of high-throughput data.
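A greedy toy version of the module search described above ("connected sub-networks that manifest high similarity") is sketched below; the seed-and-extend rule and the similarity threshold are simplifying assumptions, not the authors' algorithm.

```python
import networkx as nx

def greedy_module(graph, similarity, seed, min_avg=0.5):
    """Grow a connected module from a seed node, repeatedly adding the neighbour that
    gives the highest average pairwise similarity, as long as that average stays
    above min_avg. A greedy sketch, not the published algorithm."""
    module = {seed}

    def avg_sim(nodes):
        pairs = [(u, v) for u in nodes for v in nodes if u < v]
        if not pairs:
            return 0.0
        return sum(similarity.get((u, v), similarity.get((v, u), 0.0))
                   for u, v in pairs) / len(pairs)

    while True:
        frontier = {n for m in module for n in graph.neighbors(m)} - module
        scored = [(avg_sim(module | {n}), n) for n in frontier]
        if not scored:
            break
        best_score, best_node = max(scored)
        if best_score < min_avg:
            break
        module.add(best_node)
    return module

g = nx.Graph([("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")])
sim = {("a", "b"): 0.9, ("b", "c"): 0.8, ("a", "c"): 0.7, ("c", "d"): 0.1, ("d", "e"): 0.9}
print(greedy_module(g, sim, seed="a"))   # expected: {'a', 'b', 'c'}
```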
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.
Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan
2016-04-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.
Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean
2016-10-01
To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measuring multiple samples and conditions; thus, current automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required compared with a single-sample automated cell counter. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry or ELISA, or simply prior to plating cells for cell culture.
Transfer-arm evaporator cell for rapid loading and deposition of organic thin films.
Greiner, M T; Helander, M G; Wang, Z B; Lu, Z H
2009-12-01
Described herein is a transfer-arm evaporator cell (TAE-cell), which allows for rapid loading of materials into vacuum for low-temperature sublimation deposition of thin films. This design can be incorporated with an existing analysis system for convenient in situ thin film characterization. This evaporator is especially well suited for photoemission characterization of organic semiconductor interfaces. Photoemission is one of the most important techniques for characterizing such interfaces; however, it generally requires in situ sample preparation. The ease with which materials can be loaded and evaporated with this design increases the throughput of in situ photoemission characterization, and broadens the research scope of the technique. Here, we describe the design, operation, and performance of the TAE-cell.
Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Y; Grossman, JC
2014-12-01
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
A high performance totally ordered multicast protocol
NASA Technical Reports Server (NTRS)
Montgomery, Todd; Whetten, Brian; Kaplan, Simon
1995-01-01
This paper presents the Reliable Multicast Protocol (RMP). RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service such as IP Multicasting. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communication load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These QoS guarantees are selectable on a per packet basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, an implicit naming service, mutually exclusive handlers for messages, and mutually exclusive locks. It has commonly been held that a large performance penalty must be paid in order to implement total ordering -- RMP discounts this. On SparcStation 10's on a 1250 KB/sec Ethernet, RMP provides totally ordered packet delivery to one destination at 842 KB/sec throughput and with 3.1 ms packet latency. The performance stays roughly constant independent of the number of destinations. For two or more destinations on a LAN, RMP provides higher throughput than any protocol that does not use multicast or broadcast.
40 CFR Table 3 to Subpart Eeee of... - Operating Limits-High Throughput Transfer Racks
Code of Federal Regulations, 2010 CFR
2010-07-01
Table 3 to Subpart EEEE of Part 63—Operating Limits—High Throughput Transfer Racks. As stated in § 63.2346(e), you must comply with the operating limits for existing...
Investigation of vacuum properties of CuCrZr alloy for high-heat-load absorber
NASA Astrophysics Data System (ADS)
Shueh, C.; Chan, C. K.; Chang, C. C.; Sheng, I. C.
2017-01-01
The Taiwan Photon Source (TPS) uses high-heat-load (HHL) absorbers to protect downstream ultrahigh-vacuum chambers from overheating. In this work, we propose to use the CuCrZr alloy (ASTM C18150) for the HHL absorber body and the ConFlat® flanges. We use the throughput method to measure the thermal outgassing rate and a helium leak detector to verify the vacuum seal between the CuCrZr alloy and stainless-steel flanges. The measured outgassing rate of the CuCrZr alloy was 5.8 × 10⁻¹⁰ Pa m/s after 72 h of pumping and decreased to 2.0 × 10⁻¹⁰ Pa m/s after 100 h of pumping. The leak rate through the vacuum seal between a CuCrZr flange and a stainless-steel flange was less than 1 × 10⁻¹⁰ Pa m³/s even after mounting and unmounting the flanges ten times and baking them at 250 °C. These results indicate that CuCrZr alloy is suitable for integrating HHL components with ConFlat® CuCrZr flanges for the absorption of the synchrotron radiation generated by the TPS.
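The throughput method referenced above infers the gas load from the pressure drop across an orifice of known conductance, Q = C·(P1 − P2), and normalizes by the sample area. The conductance, pressures and area in this worked sketch are hypothetical, chosen only to land near the reported 10⁻¹⁰ Pa m/s range.

```python
# Throughput-method sketch: gas load Q = C * (P1 - P2) flows through an orifice of
# known conductance C; dividing by the exposed sample area A gives the specific
# outgassing rate. All numbers below are hypothetical.
C = 1.2e-2        # orifice conductance for N2 at room temperature (m^3/s)
P1 = 2.4e-7       # pressure upstream of the orifice, sample side (Pa)
P2 = 1.9e-7       # pressure downstream of the orifice (Pa)
A = 1.0           # exposed CuCrZr surface area (m^2)

Q = C * (P1 - P2)          # total gas load (Pa m^3/s)
q = Q / A                  # specific outgassing rate (Pa m/s == Pa m^3 s^-1 m^-2)
print(f"gas load {Q:.2e} Pa m^3/s, outgassing rate {q:.2e} Pa m/s")
```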
Dynamic Transfers Of Tasks Among Computers
NASA Technical Reports Server (NTRS)
Liu, Howard T.; Silvester, John A.
1989-01-01
Allocation scheme gives jobs to idle computers. Ideal resource-sharing algorithm should have the following characteristics: dynamic, decentralized, and heterogeneous. Proposed enhanced receiver-initiated dynamic algorithm (ERIDA) for resource sharing fulfills all the above criteria. Provides method for balancing workload among hosts, resulting in improved response time and throughput performance of total system. Adjusts dynamically to traffic load of each station.
Liu, Yue; Hu, Jia; Li, Yan; Li, Xiao-Shuang; Wang, Zhong-Liang
2016-10-01
A novel method with high sensitivity for the rapid determination of chrysin, apigenin and luteolin in environmental water samples was developed by double-pump-controlled on-line solid-phase extraction (SPE) coupled with high-performance liquid chromatography (HPLC). In the developed technique, the metal-organic framework MIL-101 was synthesized and applied as a sorbent for SPE. The as-synthesized MIL-101 was characterized by scanning electron microscopy, X-ray diffraction spectrometry, thermal gravimetric analysis and micropore physisorption analysis. The MIL-101 exhibited fast adsorption kinetics for chrysin, apigenin and luteolin. On-line SPE of chrysin, apigenin and luteolin was performed by loading a sample solution at a flow rate of 1.0 mL/min for 10 min. The extracted analytes were subsequently eluted into a ZORBAX Bonus-RP analytical column (25 cm long × 4.6 mm i.d.) for HPLC separation under isocratic conditions with a mobile phase (MeOH:ACN:0.02 M H3PO4 = 35:35:30) at a flow rate of 1.0 mL/min. Experimental conditions, including ionic strength, sample pH, sample loading rate, sample loading time and analyte desorption time, were further optimized to obtain efficient preconcentration and high-precision determination of the analytes mentioned above. The method offers simplicity, rapidity, sensitivity, a wide linear range and high sample throughput. A possible mechanism for the adsorption of flavonoids on MIL-101 was proposed. The developed method has been applied to determine trace chrysin, apigenin and luteolin in a variety of environmental water samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.
2014-05-01
MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces the laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carryover, have been demonstrated for samples in a variety of different matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Omaha, Nebraska) as a powerful tool in radiogenic and non-traditional isotope research.
A Sensitive Assay for Virus Discovery in Respiratory Clinical Samples
de Vries, Michel; Deijs, Martin; Canuti, Marta; van Schaik, Barbera D. C.; Faria, Nuno R.; van de Garde, Martijn D. B.; Jachimowski, Loes C. M.; Jebbink, Maarten F.; Jakobs, Marja; Luyf, Angela C. M.; Coenjaerts, Frank E. J.; Claas, Eric C. J.; Molenkamp, Richard; Koekkoek, Sylvie M.; Lammens, Christine; Leus, Frank; Goossens, Herman; Ieven, Margareta; Baas, Frank; van der Hoek, Lia
2011-01-01
In 5–40% of respiratory infections in children, the diagnostics remain negative, suggesting that the patients might be infected with a yet unknown pathogen. Virus discovery cDNA-AFLP (VIDISCA) is a virus discovery method based on recognition of restriction enzyme cleavage sites, ligation of adaptors and subsequent amplification by PCR. However, direct discovery of unknown pathogens in nasopharyngeal swabs is difficult due to the high concentration of ribosomal RNA (rRNA) that acts as competitor. In the current study we optimized VIDISCA by adjusting the reverse transcription enzymes and decreasing rRNA amplification in the reverse transcription, using hexamer oligonucleotides that do not anneal to rRNA. Residual cDNA synthesis on rRNA templates was further reduced with oligonucleotides that anneal to rRNA but can not be extended due to 3′-dideoxy-C6-modification. With these modifications >90% reduction of rRNA amplification was established. Further improvement of the VIDISCA sensitivity was obtained by high throughput sequencing (VIDISCA-454). Eighteen nasopharyngeal swabs were analysed, all containing known respiratory viruses. We could identify the proper virus in the majority of samples tested (11/18). The median load in the VIDISCA-454 positive samples was 7.2 × 10⁵ viral genome copies/ml (range 1.4 × 10³ to 7.7 × 10⁶). Our results show that optimization of VIDISCA and subsequent high-throughput sequencing enhances sensitivity drastically and provides the opportunity to perform virus discovery directly in patient material. PMID:21283679
Tucker, Strahan; Li, Shaorong; Kaukinen, Karia H; Patterson, David A; Miller, Kristina M
2018-01-01
Disease-causing infectious agents are natural components of ecosystems and considered a major selective force driving the evolution of host species. However, knowledge of the presence and abundance of suites of infectious agents in wild populations has been constrained by our ability to easily screen for them. Using salmon as a model, we contrasted seasonal pathogenic infectious agents in life history variants of juvenile Chinook salmon from the Fraser River system (N = 655), British Columbia (BC), through the application of a novel high-throughput quantitative PCR monitoring platform. This included freshwater hatchery origin fish and samples taken at sea between ocean entry in spring and over-winter residence in coastal waters. These variants currently display opposite trends in productivity, with yearling stocks generally in decline and sub-yearling stocks doing comparatively well. We detected the presence of 32 agents, 21 of which were at >1% prevalence. Variants carried a different infectious agent profile in terms of (1) diversity, (2) origin or transmission environment of infectious agents, and (3) prevalence and abundance of individual agents. Differences in profiles tended to reflect differential timing and residence patterns through freshwater, estuarine and marine habitats. Over all seasons, individual salmon carried an average of 3.7 agents. Diversity changed significantly, increasing upon saltwater entrance, increasing through the fall and decreasing slightly in winter. Diversity varied between life history types with yearling individuals carrying 1.3-times more agents on average. Shifts in prevalence and load over time were examined to identify agents with the greatest potential for impact at the stock level; those displaying concurrent decrease in prevalence and load truncation with time. Of those six that had similar patterns in both variants, five reached higher prevalence in yearling fish while only one reached higher prevalence in sub-yearling fish; this pattern was present for an additional five agents in yearling fish only.
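Prevalence and load summaries of the kind reported above can be computed from a long-format qPCR table as sketched below; the agent codes, copy numbers and column names are hypothetical.

```python
import pandas as pd

# Hypothetical long-format qPCR results: one row per fish x agent, with the
# estimated copy number (0 means not detected).
df = pd.DataFrame({
    "fish_id":   [1, 1, 2, 2, 3, 3, 4, 4],
    "life_hist": ["yearling", "yearling", "yearling", "yearling",
                  "subyearling", "subyearling", "subyearling", "subyearling"],
    "agent":     ["agent_a", "agent_b", "agent_a", "agent_b",
                  "agent_a", "agent_b", "agent_a", "agent_b"],
    "copies":    [1.2e4, 0.0, 3.5e3, 8.1e2, 0.0, 5.0e2, 0.0, 0.0],
})

# Prevalence = fraction of fish positive; mean load computed over positives only.
summary = (
    df.assign(detected=df["copies"] > 0)
      .groupby(["life_hist", "agent"])
      .agg(prevalence=("detected", "mean"),
           mean_load=("copies", lambda s: s[s > 0].mean()))
)
print(summary)
```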
Li, Shaorong; Kaukinen, Karia H.; Patterson, David A.; Miller, Kristina M.
2018-01-01
Disease-causing infectious agents are natural components of ecosystems and considered a major selective force driving the evolution of host species. However, knowledge of the presence and abundance of suites of infectious agents in wild populations has been constrained by our ability to easily screen for them. Using salmon as a model, we contrasted seasonal pathogenic infectious agents in life history variants of juvenile Chinook salmon from the Fraser River system (N = 655), British Columbia (BC), through the application of a novel high-throughput quantitative PCR monitoring platform. This included freshwater hatchery origin fish and samples taken at sea between ocean entry in spring and over-winter residence in coastal waters. These variants currently display opposite trends in productivity, with yearling stocks generally in decline and sub-yearling stocks doing comparatively well. We detected the presence of 32 agents, 21 of which were at >1% prevalence. Variants carried a different infectious agent profile in terms of (1) diversity, (2) origin or transmission environment of infectious agents, and (3) prevalence and abundance of individual agents. Differences in profiles tended to reflect differential timing and residence patterns through freshwater, estuarine and marine habitats. Over all seasons, individual salmon carried an average of 3.7 agents. Diversity changed significantly, increasing upon saltwater entrance, increasing through the fall and decreasing slightly in winter. Diversity varied between life history types with yearling individuals carrying 1.3-times more agents on average. Shifts in prevalence and load over time were examined to identify agents with the greatest potential for impact at the stock level; those displaying concurrent decrease in prevalence and load truncation with time. Of those six that had similar patterns in both variants, five reached higher prevalence in yearling fish while only one reached higher prevalence in sub-yearling fish; this pattern was present for an additional five agents in yearling fish only. PMID:29672620
Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H
2016-02-01
Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.
Sibole, Scott C.; Erdemir, Ahmet
2012-01-01
Cells of the musculoskeletal system are known to respond to mechanical loading, and chondrocytes within the cartilage are no exception. However, understanding how joint-level loads relate to cell-level deformations, e.g. in the cartilage, is not a straightforward task. In this study, a multi-scale analysis pipeline was implemented to post-process the results of a macro-scale finite element (FE) tibiofemoral joint model to provide joint-mechanics-based displacement boundary conditions to micro-scale cellular FE models of the cartilage, for the purpose of characterizing chondrocyte deformations in relation to tibiofemoral joint loading. It was possible to identify the load distribution within the knee among its tissue structures and ultimately within the cartilage among its extracellular matrix, pericellular environment and resident chondrocytes. Various cellular deformation metrics (aspect ratio change, volumetric strain, cellular effective strain and maximum shear strain) were calculated. To illustrate further utility of this multi-scale modeling pipeline, two micro-scale cartilage constructs were considered: an idealized single cell at the centroid of a 100×100×100 μm block commonly used in past research studies, and an anatomically based representation of the middle zone of tibiofemoral cartilage (an 11-cell model of the same volume). In both cases, chondrocytes experienced amplified deformations compared to those at the macro-scale, predicted by simulating one body weight compressive loading on the tibiofemoral joint. In the 11-cell case, all cells experienced less deformation than in the single-cell case, and deformation also varied more among cells residing in the same block. The coupling method proved to be highly scalable due to micro-scale model independence, which allowed for exploitation of distributed-memory computing architectures. The method's generalized nature also allows for substitution of any macro-scale and/or micro-scale model, providing application to other multi-scale continuum mechanics problems. PMID:22649535
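Two of the cellular deformation metrics named above, volumetric strain and aspect-ratio change, can be computed from a deformation gradient as in this sketch; the homogeneous deformation gradient used here is a hypothetical stand-in for the finite-element output, not the authors' post-processing code.

```python
import numpy as np

def deformation_metrics(F):
    """Volumetric strain and aspect-ratio change for a cell deforming with a
    (hypothetical) homogeneous deformation gradient F."""
    J = np.linalg.det(F)                    # volume ratio V/V0
    volumetric_strain = J - 1.0
    # Principal stretches from the right Cauchy-Green tensor C = F^T F.
    C = F.T @ F
    stretches = np.sqrt(np.linalg.eigvalsh(C))
    aspect_ratio = stretches.max() / stretches.min()   # 1.0 for a purely volumetric change
    return volumetric_strain, aspect_ratio

# Example: 15% compression along one axis with slight lateral bulging.
F = np.diag([1.05, 1.04, 0.85])
vol, ar = deformation_metrics(F)
print(f"volumetric strain {vol:+.3f}, aspect ratio {ar:.2f}")
```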
Statistical Modeling of Single Target Cell Encapsulation
Moon, SangJun; Ceyhan, Elvan; Gurkan, Umut Atakan; Demirci, Utkan
2011-01-01
High-throughput drop-on-demand systems for separation and encapsulation of individual target cells from heterogeneous mixtures of multiple cell types are an emerging approach in biotechnology with broad applications in tissue engineering and regenerative medicine, genomics, and cryobiology. However, cell encapsulation in droplets is a random process that is hard to control. Statistical models can provide an understanding of the underlying processes and estimation of the relevant parameters, and enable reliable and repeatable control over the encapsulation of cells in droplets during the isolation process with a high confidence level. We have modeled and experimentally verified a microdroplet-based cell encapsulation process for various combinations of cell loading and target cell concentrations. Here, we explain theoretically and validate experimentally a model to isolate and pattern single target cells from heterogeneous mixtures without using complex peripheral systems. PMID:21814548
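Droplet encapsulation of cells is commonly described with Poisson statistics, which gives a feel for the modeling problem above: with mean loading λ and target fraction f, the chance of capturing exactly one target cell and nothing else is λf·e^(−λ). The operating point below is hypothetical, and this standard Poisson picture is offered as an illustration rather than the authors' exact statistical model.

```python
from math import exp, factorial

def p_k_cells(mean_cells_per_droplet, k):
    """Poisson probability of encapsulating exactly k cells in one droplet."""
    lam = mean_cells_per_droplet
    return lam ** k * exp(-lam) / factorial(k)

# Hypothetical operating point: 0.3 cells per droplet on average,
# 10% of cells in the heterogeneous mixture are the target type.
lam_total = 0.3
target_fraction = 0.10
lam_target = lam_total * target_fraction

# Poisson thinning: target and non-target arrivals are independent Poisson processes.
p_single_target_only = p_k_cells(lam_target, 1) * exp(-(lam_total - lam_target))
print(f"P(exactly one cell, any type) = {p_k_cells(lam_total, 1):.3f}")
print(f"P(exactly one target cell and nothing else) = {p_single_target_only:.4f}")
```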
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
Creasy, Arch; Barker, Gregory; Carta, Giorgio
2017-03-01
A methodology is presented to predict protein elution behavior from an ion exchange column using both individual or combined pH and salt gradients based on high-throughput batch isotherm data. The buffer compositions are first optimized to generate linear pH gradients from pH 5.5 to 7 with defined concentrations of sodium chloride. Next, high-throughput batch isotherm data are collected for a monoclonal antibody on the cation exchange resin POROS XS over a range of protein concentrations, salt concentrations, and solution pH. Finally, a previously developed empirical interpolation (EI) method is extended to describe protein binding as a function of the protein and salt concentration and solution pH without using an explicit isotherm model. The interpolated isotherm data are then used with a lumped kinetic model to predict the protein elution behavior. Experimental results obtained for laboratory scale columns show excellent agreement with the predicted elution curves for both individual or combined pH and salt gradients at protein loads up to 45 mg/mL of column. Numerical studies show that the model predictions are robust as long as the isotherm data cover the range of mobile phase compositions where the protein actually elutes from the column. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
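The interpolation idea behind the EI step described above can be sketched with a model-free interpolant over (protein concentration, salt, pH); the synthetic isotherm data and the use of scipy's LinearNDInterpolator are assumptions, not the published method.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(1)
# Synthetic "high-throughput batch isotherm" data: bound protein q (mg/mL resin)
# sampled at random liquid-phase protein concentrations, NaCl levels and pH values.
c = rng.uniform(0.1, 10.0, 200)      # mg/mL protein in solution
salt = rng.uniform(0.0, 200.0, 200)  # mM NaCl
pH = rng.uniform(5.5, 7.0, 200)
q = 80 * c / (0.5 + c) * np.exp(-salt / 80.0) * np.exp(-(pH - 5.5))  # made-up response

# Interpolate q as a function of (c, salt, pH) without assuming an isotherm model.
interp = LinearNDInterpolator(np.column_stack([c, salt, pH]), q)

query = np.array([[2.0, 50.0, 6.0]])   # c = 2 mg/mL, 50 mM NaCl, pH 6.0
print("interpolated q ≈", interp(query)[0], "mg/mL resin")
```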
High-throughput measurements of the optical redox ratio using a commercial microplate reader.
Cannon, Taylor M; Shah, Amy T; Walsh, Alex J; Skala, Melissa C
2015-01-01
There is a need for accurate, high-throughput, functional measures to gauge the efficacy of potential drugs in living cells. As an early marker of drug response in cells, cellular metabolism provides an attractive platform for high-throughput drug testing. Optical techniques can noninvasively monitor NADH and FAD, two autofluorescent metabolic coenzymes. The autofluorescent redox ratio, defined as the autofluorescence intensity of NADH divided by that of FAD, quantifies relative rates of cellular glycolysis and oxidative phosphorylation. However, current microscopy methods for redox ratio quantification are time-intensive and low-throughput, limiting their practicality in drug screening. Alternatively, high-throughput commercial microplate readers quickly measure fluorescence intensities for hundreds of wells. This study found that a commercial microplate reader can differentiate the receptor status of breast cancer cell lines (p < 0.05) based on redox ratio measurements without extrinsic contrast agents. Furthermore, microplate reader redox ratio measurements resolve response (p < 0.05) and lack of response (p > 0.05) in cell lines that are responsive and nonresponsive, respectively, to the breast cancer drug trastuzumab. These studies indicate that the microplate readers can be used to measure the redox ratio in a high-throughput manner and are sensitive enough to detect differences in cellular metabolism that are consistent with microscopy results.
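Computing the redox ratio from plate-reader data reduces to a per-well division of NADH by FAD intensities followed by a group comparison, as in this hypothetical sketch (intensities, well counts and channel wavelengths are assumed).

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
# Hypothetical background-subtracted plate-reader intensities for two cell lines,
# 24 wells each: an NADH-like channel (~350/460 nm) and a FAD-like channel (~450/535 nm).
nadh_a, fad_a = rng.normal(1200, 80, 24), rng.normal(900, 60, 24)
nadh_b, fad_b = rng.normal(1000, 80, 24), rng.normal(1100, 60, 24)

redox_a = nadh_a / fad_a      # optical redox ratio, NADH / FAD, per well
redox_b = nadh_b / fad_b

t, p = ttest_ind(redox_a, redox_b)
print(f"line A redox ratio: {redox_a.mean():.2f} +/- {redox_a.std():.2f}")
print(f"line B redox ratio: {redox_b.mean():.2f} +/- {redox_b.std():.2f}")
print(f"two-sample t-test p = {p:.2e}")
```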
A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting
Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.
2016-01-01
Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945
An image analysis toolbox for high-throughput C. elegans assays
Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.
2012-01-01
We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
High-throughput, image-based screening of pooled genetic variant libraries
Emanuel, George; Moffitt, Jeffrey R.; Zhuang, Xiaowei
2018-01-01
Image-based, high-throughput screening of genetic perturbations will advance both biology and biotechnology. We report a high-throughput screening method that allows diverse genotypes and corresponding phenotypes to be imaged in numerous individual cells. We achieve genotyping by introducing barcoded genetic variants into cells and using massively multiplexed FISH to measure the barcodes. We demonstrated this method by screening mutants of the fluorescent protein YFAST, yielding brighter and more photostable YFAST variants. PMID:29083401
Experimental Design for Combinatorial and High Throughput Materials Development
NASA Astrophysics Data System (ADS)
Cawse, James N.
2002-12-01
In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include: High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space; Combinatorial Mapping of Polymer Blends Phase Behavior; Split-Plot Designs; Artificial Neural Networks in Catalyst Development; and The Monte Carlo Approach to Library Design and Redesign. This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.
Load-adaptive practical multi-channel communications in wireless sensor networks.
Islam, Md Shariful; Alam, Muhammad Mahbub; Hong, Choong Seon; Lee, Sungwon
2010-01-01
In recent years, a significant number of sensor node prototypes have been designed that provide communications in multiple channels. This multi-channel feature can be effectively exploited to increase the overall capacity and performance of wireless sensor networks (WSNs). In this paper, we present a multi-channel communications system for WSNs that is referred to as load-adaptive practical multi-channel communications (LPMC). LPMC estimates the active load of a channel at the sink since it has a more comprehensive view of the network behavior, and dynamically adds or removes channels based on the estimated load. LPMC updates the routing path to balance the loads of the channels. The nodes in a path use the same channel; therefore, they do not need to switch channels to receive or forward packets. LPMC has been evaluated through extensive simulations, and the results demonstrate that it can effectively increase the delivery ratio, network throughput, and channel utilization, and that it can decrease the end-to-end delay and energy consumption.
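As a rough illustration of the load-adaptive idea described above, the sketch below shows a sink-side estimator that brings spare channels online under heavy load and releases them when load drops. The function names, thresholds, and capacity figures are hypothetical and are not taken from the LPMC paper.

```python
# Illustrative sketch of sink-side load-adaptive channel management
# (hypothetical thresholds and capacities; not the actual LPMC algorithm).

def estimate_channel_load(packets_received, interval_s, channel_capacity_pps):
    """Fraction of a channel's capacity consumed during the last interval."""
    return (packets_received / interval_s) / channel_capacity_pps

def adapt_channels(active_channels, spare_channels, load, add_thr=0.8, remove_thr=0.3):
    """Add a channel when load is high, release one when load is low."""
    channels = list(active_channels)
    if load > add_thr and spare_channels:
        channels.append(spare_channels.pop(0))      # bring a spare channel online
    elif load < remove_thr and len(channels) > 1:
        spare_channels.append(channels.pop())       # release a channel back to the pool
    return channels

# Example: one active channel becomes two under heavy load.
spare = [2, 3]
active = [1]
load = estimate_channel_load(packets_received=900, interval_s=1.0, channel_capacity_pps=1000)
active = adapt_channels(active, spare, load)
print(active)  # [1, 2]
```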
Deciphering the genomic targets of alkylating polyamide conjugates using high-throughput sequencing
Chandran, Anandhakumar; Syed, Junetha; Taylor, Rhys D.; Kashiwazaki, Gengo; Sato, Shinsuke; Hashiya, Kaori; Bando, Toshikazu; Sugiyama, Hiroshi
2016-01-01
Chemically engineered small molecules targeting specific genomic sequences play an important role in drug development research. Pyrrole-imidazole polyamides (PIPs) are a group of molecules that can bind to the DNA minor-groove and can be engineered to target specific sequences. Their biological effects rely primarily on their selective DNA binding. However, the binding mechanism of PIPs at the chromatinized genome level is poorly understood. Herein, we report a method using high-throughput sequencing to identify the DNA-alkylating sites of PIP-indole-seco-CBI conjugates. High-throughput sequencing analysis of conjugate 2 showed highly similar DNA-alkylating sites on synthetic oligos (histone-free DNA) and on human genomes (chromatinized DNA context). To our knowledge, this is the first report identifying alkylation sites across genomic DNA by alkylating PIP conjugates using high-throughput sequencing. PMID:27098039
Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.
Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S
1994-01-01
The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.
Fujimoto, Hiroyuki; Kato, Koichi; Iwata, Hiroo
2010-05-01
Electroporation microarrays have been developed for the high-throughput transfection of expression constructs and small interfering RNAs (siRNAs) into living mammalian cells. These techniques have the potential to provide a platform for the cell-based analysis of gene functions. One of the key issues associated with microarray technology is the efficiency of transfection. The capability of attaining reasonably high transfection efficiency is the basis for obtaining functional data without false negatives. In this study, we aimed to improve transfection efficiency in a system in which siRNA loaded on an electrode is electroporated into cells cultured directly on the electrode. The strategy adopted here is to increase the surface density of siRNA loaded onto the electrodes. For this purpose, layer-by-layer assembly of siRNA and cationic polymers, branched or linear poly(ethyleneimine), was performed. The multilayer thus obtained was characterized by infrared reflection-absorption spectroscopy and surface plasmon resonance analysis. Transfection efficiency was evaluated in a system in which siRNA specific for enhanced green fluorescent protein (EGFP) was electroporated from the electrode into human embryonic kidney cells stably transformed with the EGFP gene. The suppression of EGFP expression was assessed by fluorescence microscopy and flow cytometry. Our data showed that layer-by-layer assembly of siRNA with branched poly(ethyleneimine) helped increase the surface density of loaded siRNA. As a result, the expression of the EGFP gene in the electroporated cells was suppressed much more on electrodes carrying the siRNA multilayer than on those carrying a monolayer.
High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast
High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...
Zheng, Ji; Zhou, Zhenchao; Wei, Yuanyuan; Chen, Tao; Feng, Wanqiu; Chen, Hong
2018-05-01
The rapid expansion of human activity in a region can exacerbate human health risks induced by antibiotic resistance genes (ARGs). Peri-urban ecosystems serve as the symbiotic interface between urban and rural ecosystems, and investigations into the dissemination of ARGs in peri-urban areas provide a basic framework for tracking the spread of ARGs and potential mitigations. In this study, through the use of high-throughput quantitative PCR and 16S rRNA gene high-throughput sequencing, seasonal and geographical distributions of ARGs and their host bacterial communities were characterized in a peri-urban river. The abundance of ARGs downstream was 5.2-33.9 times higher than upstream, indicating distinct antibiotic resistance pollution in areas where humans live. When samples were compared by nearby land use, the abundance of ARGs in samples near farmland and villages was 3.47-5.58 times higher than in the background, pointing to the high load in the river caused by farming and other human activities in the peri-urban areas. Based on the co-occurrence pattern revealed by network analysis, blaVEB and tetM were proposed as indicator ARGs that clustered in the same module. Furthermore, seasonal variations in ARGs and the transport of bacterial communities were observed, and the effect of seasonal temperature on the dissemination of ARGs along the watershed was evaluated. The highest absolute abundance of ARGs occurred in summer (2.81 × 10⁹ copies/L on average), and the trends of ARG abundance across the four seasons were similar to those of local air temperature. Linear discriminant analysis effect size (LEfSe) suggested that nine bacterial genera were implicated as biomarkers for the corresponding season. Mobile genetic elements (MGEs) showed a significant positive correlation with ARGs (P < 0.01), and MGEs were also identified as the key contributing factor driving ARG alteration. This study provides an overview of seasonal and geographical variations in ARG distribution in a peri-urban river and draws attention to controlling pollutants in peri-urban ecosystems. Copyright © 2018 Elsevier Ltd. All rights reserved.
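The MGE-ARG association reported above is the kind of relationship that is typically tested with a rank correlation across samples. The sketch below illustrates such a test on invented abundance values; it does not reproduce the study's data or statistics.

```python
# Sketch: Spearman correlation between mobile genetic element (MGE) and ARG
# abundances across samples (hypothetical data, illustrative only).
from scipy.stats import spearmanr

# Relative abundances (e.g., gene copies per 16S rRNA gene copy) for eight samples.
mge_abundance = [0.12, 0.30, 0.08, 0.45, 0.22, 0.51, 0.05, 0.38]
arg_abundance = [0.20, 0.41, 0.11, 0.60, 0.35, 0.66, 0.09, 0.49]

rho, p_value = spearmanr(mge_abundance, arg_abundance)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
# A significant positive rho (p < 0.01) would mirror the MGE-ARG association
# reported in the abstract.
```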
Moore, Priscilla A; Kery, Vladimir
2009-01-01
High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are the high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, the ammonium sulfate precipitation is much less labor intensive and time consuming than the ultrafiltration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor-Pashow, Kathryn M. L.; McCabe, Daniel J.; Nash, Charles A.
Vitrification of Low Activity Waste in the Hanford Waste Treatment and Immobilization Plant generates a condensate stream from the off-gas processes. Components in this stream are partially volatile and accumulate to high concentrations through recycling, which impacts the waste glass loading and facility throughput. The primary radionuclide that vaporizes and accumulates in the stream is 99Tc. This program is investigating Tc removal via reductive precipitation with stannous chloride to examine the potential for diverting this stream to an alternate disposition path. As a result, research has shown stannous chloride to be effective, and this paper describes results of recent experiments performed to further mature the technology.
Assays for Determination of Protein Concentration.
Olson, Bradley J S C
2016-06-01
Biochemical analysis of proteins relies on accurate quantification of protein concentration. Detailed in this appendix are some commonly used methods for protein analysis, e.g., Lowry, Bradford, bicinchoninic acid (BCA), UV spectroscopic, and 3-(4-carboxybenzoyl)quinoline-2-carboxaldehyde (CBQCA) assays. The primary focus of this report is assay selection, emphasizing sample and buffer compatibility. The fundamentals of generating protein assay standard curves and of data processing are considered, as are high-throughput adaptations of the more commonly used protein assays. Also included is a rapid, inexpensive, and reliable BCA assay of total protein in SDS-PAGE sample buffer that is used for equal loading of SDS-PAGE gels. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.
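As a minimal illustration of the standard-curve workflow mentioned above, the sketch below fits a linear curve to hypothetical BCA standards and back-calculates an unknown; real assays may additionally need blank correction, replicate averaging, and a check that unknowns fall within the standard range.

```python
# Sketch: linear standard curve for a colorimetric protein assay (e.g., BCA)
# and interpolation of unknown samples. Absorbance values are invented.
import numpy as np

std_conc = np.array([0.0, 125, 250, 500, 1000, 2000])      # µg/mL BSA standards
std_abs = np.array([0.02, 0.10, 0.19, 0.36, 0.70, 1.35])   # A562 readings

slope, intercept = np.polyfit(std_conc, std_abs, 1)         # fit A = m*C + b

def protein_conc(absorbance, dilution_factor=1.0):
    """Back-calculate concentration (µg/mL) from a blank-corrected absorbance."""
    return (absorbance - intercept) / slope * dilution_factor

# Unknown sample read at a 1:2 dilution.
print(round(protein_conc(0.52, dilution_factor=2.0), 1))
```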
Taylor-Pashow, Kathryn M. L.; McCabe, Daniel J.; Nash, Charles A.
2017-03-16
Vitrification of Low Activity Waste in the Hanford Waste Treatment and Immobilization Plant generates a condensate stream from the off-gas processes. Components in this stream are partially volatile and accumulate to high concentrations through recycling, which impacts the waste glass loading and facility throughput. The primary radionuclide that vaporizes and accumulates in the stream is 99Tc. This program is investigating Tc removal via reductive precipitation with stannous chloride to examine the potential for diverting this stream to an alternate disposition path. As a result, research has shown stannous chloride to be effective, and this paper describes results of recent experiments performed to further mature the technology.
NASA Astrophysics Data System (ADS)
Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2018-02-01
Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). These phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy: >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
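A rough sketch of the analysis chain described above (multi-parameter biophysical features, a t-SNE embedding, and discriminant-analysis classification) is given below. Random numbers stand in for the real per-cell phenotypes, and scikit-learn's linear discriminant analysis is used as a simple stand-in for the MANOVA-based discriminant analysis.

```python
# Sketch: embed per-cell biophysical phenotypes with t-SNE and classify
# cell-cycle phase with a linear discriminant. Random data stands in for the
# ~24 measured features; accuracy on random data is near chance (~0.33).
import numpy as np
from sklearn.manifold import TSNE
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cells, n_features = 3000, 24
X = rng.normal(size=(n_cells, n_features))        # stand-in phenotype matrix
y = rng.choice(["G1", "S", "G2"], size=n_cells)   # stand-in phase labels

embedding = TSNE(n_components=2, perplexity=30).fit_transform(X)  # 2-D map for visualization
print(embedding.shape)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```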
Live Virtual Constructive Distributed Test Environment Characterization Report
NASA Technical Reports Server (NTRS)
Murphy, Jim; Kim, Sam K.
2013-01-01
This report documents message latencies observed over various Live, Virtual, Constructive, (LVC) simulation environment configurations designed to emulate possible system architectures for the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project integrated tests. For each configuration, four scenarios with progressively increasing air traffic loads were used to determine system throughput and bandwidth impacts on message latency.
Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.
Yang, Darren; Wong, Wesley P
2018-01-01
We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.
High throughput single cell counting in droplet-based microfluidics.
Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie
2017-05-02
Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at a high-throughput, used to characterize cell encapsulation and cell viability during incubation in droplets.
2016-12-01
AWARD NUMBER: W81XWH-13-1-0371. TITLE: High-Throughput Sequencing of Germline and Tumor From Men with Early-Onset Metastatic Prostate Cancer. DATES COVERED: 30 Sep 2013 - 29 Sep 2016. ...presenting with metastatic prostate cancer at a young age (before age 60 years). Whole exome sequencing identified a panel of germline variants that have...
BeeDoctor, a Versatile MLPA-Based Diagnostic Tool for Screening Bee Viruses
De Smet, Lina; Ravoet, Jorgen; de Miranda, Joachim R.; Wenseleers, Tom; Mueller, Matthias Y.; Moritz, Robin F. A.; de Graaf, Dirk C.
2012-01-01
The long-term decline of managed honeybee hives in the world has drawn significant attention from the scientific community and the bee-keeping industry. A high pathogen load is believed to play a crucial role in this phenomenon, with the bee viruses being key players. Most of the currently characterized honeybee viruses (around twenty) are positive-stranded RNA viruses. Techniques based on RNA signatures are widely used to determine the viral load in honeybee colonies. High-throughput screening for viral loads necessitates the development of a multiplex polymerase chain reaction approach in which different viruses can be targeted simultaneously. A new multiparameter assay, called “BeeDoctor”, was developed based on multiplex ligation-dependent probe amplification (MLPA) technology. This assay detects 10 honeybee viruses in one reaction. “BeeDoctor” is also able to screen selectively for either the positive strand of the targeted RNA bee viruses or the negative strand, which is indicative of active viral replication. Due to its sensitivity and specificity, the MLPA assay is a useful tool for rapid diagnosis, pathogen characterization, and epidemiology of viruses in honeybee populations. “BeeDoctor” was used for screening 363 samples from apiaries located throughout Flanders, the northern half of Belgium. Using the “BeeDoctor”, virus infections were detected in almost eighty percent of the colonies, with deformed wing virus by far the most frequently detected virus; multiple virus infections were found in 26 percent of the colonies. PMID:23144717
BeeDoctor, a versatile MLPA-based diagnostic tool for screening bee viruses.
De Smet, Lina; Ravoet, Jorgen; de Miranda, Joachim R; Wenseleers, Tom; Mueller, Matthias Y; Moritz, Robin F A; de Graaf, Dirk C
2012-01-01
The long-term decline of managed honeybee hives in the world has drawn significant attention from the scientific community and the bee-keeping industry. A high pathogen load is believed to play a crucial role in this phenomenon, with the bee viruses being key players. Most of the currently characterized honeybee viruses (around twenty) are positive-stranded RNA viruses. Techniques based on RNA signatures are widely used to determine the viral load in honeybee colonies. High-throughput screening for viral loads necessitates the development of a multiplex polymerase chain reaction approach in which different viruses can be targeted simultaneously. A new multiparameter assay, called "BeeDoctor", was developed based on multiplex ligation-dependent probe amplification (MLPA) technology. This assay detects 10 honeybee viruses in one reaction. "BeeDoctor" is also able to screen selectively for either the positive strand of the targeted RNA bee viruses or the negative strand, which is indicative of active viral replication. Due to its sensitivity and specificity, the MLPA assay is a useful tool for rapid diagnosis, pathogen characterization, and epidemiology of viruses in honeybee populations. "BeeDoctor" was used for screening 363 samples from apiaries located throughout Flanders, the northern half of Belgium. Using the "BeeDoctor", virus infections were detected in almost eighty percent of the colonies, with deformed wing virus by far the most frequently detected virus; multiple virus infections were found in 26 percent of the colonies.
NASA Astrophysics Data System (ADS)
Markelov, Oleg; Nguyen Duc, Viet; Bogachev, Mikhail
2017-11-01
Recently we have suggested a universal superstatistical model of user access patterns and aggregated network traffic. The model takes into account the irregular character of end user access patterns on the web via the non-exponential distributions of the local access rates, but neglects the long-term correlations between these rates. While the model is accurate for quasi-stationary traffic records, its performance under highly variable and especially non-stationary access dynamics remains questionable. In this paper, using an example of the traffic patterns from a highly loaded network cluster hosting the website of the 1998 FIFA World Cup, we suggest a generalization of the previously suggested superstatistical model by introducing long-term correlations between access rates. Using queueing system simulations, we show explicitly that this generalization is essential for modeling network nodes with highly non-stationary access patterns, where neglecting long-term correlations leads to the underestimation of the empirical average sojourn time by several decades under high throughput utilization.
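The effect described above can be illustrated with a toy single-server queue in which the arrival rate is modulated by a slowly varying correlated process; an AR(1) process is used below as a simple surrogate for long-term correlations, and all parameters are arbitrary rather than fitted to the World Cup traces.

```python
# Sketch: mean sojourn time in a single-server queue whose arrival rate is
# modulated by a correlated (AR(1)) process. Arbitrary parameters; illustrates
# how rate correlations inflate the sojourn time relative to a constant rate.
import numpy as np

rng = np.random.default_rng(1)
n, mu = 200_000, 1.0                 # jobs to simulate, service rate
phi, sigma = 0.999, 0.02             # AR(1) memory and innovation -> slowly varying rate

# Log arrival rate follows an AR(1) process; mean utilization targeted near 0.7.
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = phi * x[i - 1] + sigma * rng.normal()
lam = 0.7 * mu * np.exp(x - x.var() / 2.0)   # positive, correlated arrival rates

inter_arrivals = rng.exponential(1.0 / lam)  # per-job inter-arrival times
services = rng.exponential(1.0 / mu, size=n)

# Lindley recursion: wait_k = max(0, wait_{k-1} + service_{k-1} - inter_arrival_k)
wait, total_sojourn = 0.0, 0.0
for k in range(n):
    if k:
        wait = max(0.0, wait + services[k - 1] - inter_arrivals[k])
    total_sojourn += wait + services[k]

print(f"mean sojourn time: {total_sojourn / n:.1f}")
print("a constant-rate M/M/1 queue at the same mean load (0.7) would give ~3.3")
```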
NASA Astrophysics Data System (ADS)
Moralis-Pegios, M.; Terzenidis, N.; Mourgias-Alexandris, G.; Vyrsokinos, K.; Pleros, N.
2018-02-01
Disaggregated Data Centers (DCs) have emerged as a powerful architectural framework towards increasing resource utilization and system power efficiency, requiring, however, a networking infrastructure that can ensure low-latency and high-bandwidth connectivity between a high number of interconnected nodes. This reality has been the driving force towards high-port-count and low-latency optical switching platforms, with recent efforts concluding that the use of distributed control architectures as offered by Broadcast-and-Select (BS) layouts can lead to sub-μsec latencies. However, almost all high-port-count optical switch designs proposed so far rely either on electronic buffering and associated SerDes circuitry for resolving contention or on buffer-less designs with packet drop and re-transmit procedures, unavoidably increasing latency or limiting throughput. In this article, we demonstrate a 256x256 optical switch architecture for disaggregated DCs that employs small-size optical delay line buffering in a distributed control scheme, exploiting FPGA-based header processing over a hybrid BS/wavelength-routing topology that is implemented by a 16x16 BS design and a 16x16 AWGR. Simulation-based performance analysis reveals that even the use of a 2-packet optical buffer can yield <620 nsec latency with >85% throughput for loads of up to 100%. The switch has been experimentally validated with 10 Gb/s optical data packets using 1:16 optical splitting and an SOA-MZI wavelength converter (WC) along with fiber delay lines for the 2-packet buffer implementation at every BS outgoing port, followed by an additional SOA-MZI tunable WC and the 16x16 AWGR. Error-free performance in all different switch input/output combinations has been obtained with a power penalty of <2.5 dB.
High-throughput sequencing methods to study neuronal RNA-protein interactions.
Ule, Jernej
2009-12-01
UV-cross-linking and RNase protection, combined with high-throughput sequencing, have provided global maps of RNA sites bound by individual proteins or ribosomes. Using a stringent purification protocol, UV-CLIP (UV-cross-linking and immunoprecipitation) was able to identify intronic and exonic sites bound by splicing regulators in mouse brain tissue. Ribosome profiling has been used to quantify ribosome density on budding yeast mRNAs under different environmental conditions. Post-transcriptional regulation in neurons requires high spatial and temporal precision, as is evident from the role of localized translational control in synaptic plasticity. It remains to be seen if the high-throughput methods can be applied quantitatively to study the dynamics of RNP (ribonucleoprotein) remodelling in specific neuronal populations during the neurodegenerative process. It is certain, however, that applications of new biochemical techniques followed by high-throughput sequencing will continue to provide important insights into the mechanisms of neuronal post-transcriptional regulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reass, W.A.
1994-07-01
This paper describes the electrical design and operation of a high power modulator system implemented for the Los Alamos Plasma Source Ion Implantation (PSII) facility. To test the viability of the PSII process for various automotive components, the modulator must accept wide variations of load impedance. Components have varying area and composition, which must be processed with different plasmas. Additionally, the load impedance may change by large factors during the typical 20 µs pulse, due to plasma displacement currents and sheath growth. As a preliminary design to test the system viability for automotive component implantation suitable for a manufacturing environment, the circuit topology must be able to scale directly to higher power versions for increased component throughput. We have chosen an evolutionary design approach with component families of characterized performance, which should result in a reliable modulator system with long component lifetimes. The modulator utilizes a pair of Litton L-3408 hollow beam amplifier tubes as switching elements in a "hot-deck" configuration. Internal to the main hot deck, an additional pair of planar triode decks, configured in a totem pole circuit, provide input drive to the L-3408 mod-anodes. The modulator can output over 2 amps average current (at 100 kV) with 1 kW of mod-anode drive. Diagnostic electronics monitor the load and stop pulses for 100 ms when a load arc occurs. This paper, in addition to providing detailed engineering design information, will provide operational characteristics and reliability data that direct the design toward higher power modulators capable of mass production line operation.
Draveling, C; Ren, L; Haney, P; Zeisse, D; Qoronfleh, M W
2001-07-01
The revolution in genomics and proteomics is having a profound impact on drug discovery. Today's protein scientist demands a faster, easier, more reliable way to purify proteins. A new high-capacity, high-throughput technology has been developed at Perbio Sciences for affinity protein purification. This technology utilizes selected chromatography media that are dehydrated to form uniform aggregates. The SwellGel aggregates instantly rehydrate upon addition of the protein sample, allowing purification and direct performance of multiple assays in a variety of formats. SwellGel technology has greater stability and is easier to handle than standard wet chromatography resins. The microplate format of this technology provides high-capacity, high-throughput features, recovering milligram quantities of protein suitable for high-throughput screening or biophysical/structural studies. Data will be presented applying SwellGel technology to recombinant 6x His-tagged protein and glutathione-S-transferase (GST) fusion protein purification. Copyright 2001 Academic Press.
NASA Astrophysics Data System (ADS)
Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela
2016-10-01
Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip with a micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.
The ToxCast Dashboard helps users examine high-throughput assay data to inform chemical safety decisions. To date, it has data on over 9,000 chemicals and information from more than 1,000 high-throughput assay endpoint components.
Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong
2014-01-01
Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980
A high-quality annotated transcriptome of swine peripheral blood
USDA-ARS?s Scientific Manuscript database
Background: High throughput gene expression profiling assays of peripheral blood are widely used in biomedicine, as well as in animal genetics and physiology research. Accurate, comprehensive, and precise interpretation of such high throughput assays relies on well-characterized reference genomes an...
Zhou, Jun; Yang, Jun; Yu, Qing; Yong, Xiaoyu; Xie, Xinxin; Zhang, Lijuan; Wei, Ping; Jia, Honghua
2017-11-01
The aim of this work was to investigate the mesophilic methane fermentation of rice straw at different organic loading rates (OLRs) in a 300 m³ bioreactor. It was found that biogas production increased when the OLR was below 2.00 kg VS_substrate/(m³·d). The average volumetric biogas production reached 0.86 m³/(m³·d) at an OLR of 2.00 kg VS_substrate/(m³·d). The biogas production rate was 323 m³/t of dry rice straw over the whole process. The pH, chemical oxygen demand, volatile fatty acid, and NH₄⁺-N concentrations were all in the optimal range at the different OLRs. High-throughput sequencing analysis indicated that Firmicutes, Fibrobacteres, and Spirochaetes predominated in straw samples. Chloroflexi, Proteobacteria, and Planctomycetes were more abundant in the slurry. The hydrogenotrophic pathway was the main biochemical pathway of methanogenesis in the reactor. This study provides new information regarding the OLR and the differences in the spatial distribution of specific microbiota in a rice straw biogas plant. Copyright © 2017 Elsevier Ltd. All rights reserved.
Random access to mobile networks with advanced error correction
NASA Technical Reports Server (NTRS)
Dippold, Michael
1990-01-01
A random access scheme for unreliable data channels is investigated in conjunction with an adaptive Hybrid-II Automatic Repeat Request (ARQ) scheme using Rate Compatible Punctured Codes (RCPC) for Forward Error Correction (FEC). A simple scheme with fixed frame length and equal slot sizes is chosen, and reservation is implicit in the first packet transmitted randomly in a free slot, similar to Reservation Aloha. This allows the further transmission of redundancy if the last decoding attempt failed. Results show that high channel utilization and superior throughput can be achieved with this scheme, which has quite low implementation complexity. For the example of an interleaved Rayleigh channel with soft decision, utilization and mean delay are calculated. A utilization of 40 percent may be achieved for a frame with the number of slots equal to half the station number under high traffic load. The effects of feedback channel errors and some countermeasures are discussed.
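For context, the sketch below estimates slot utilization for plain slotted random access by Monte Carlo; it does not model the adaptive hybrid-ARQ/RCPC layer or the Rayleigh channel analyzed in the paper.

```python
# Sketch: Monte Carlo estimate of slotted random-access utilization
# (plain slotted ALOHA only; no ARQ, FEC, or reservation modeled).
import numpy as np

rng = np.random.default_rng(2)

def slotted_aloha_utilization(n_stations, p_transmit, n_slots=100_000):
    """Fraction of slots carrying exactly one (i.e., successful) transmission."""
    transmissions = rng.random((n_slots, n_stations)) < p_transmit
    per_slot = transmissions.sum(axis=1)
    return np.mean(per_slot == 1)

# Peak theoretical utilization of slotted ALOHA is 1/e ~ 0.37 at an offered load of 1.
for p in (0.01, 0.05, 0.1):
    print(p, round(slotted_aloha_utilization(n_stations=20, p_transmit=p), 3))
```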
UltraNet Target Parameters. Chapter 1
NASA Technical Reports Server (NTRS)
Kislitzin, Katherine T.; Blaylock, Bruce T. (Technical Monitor)
1992-01-01
The UltraNet is a high speed network capable of rates up to one gigabit per second. It is a hub based network with four optical fiber links connecting each hub. Each link can carry up to 256 megabits of data, and the hub backplane is capable of one gigabit aggregate throughput. Host connections to the hub may be fiber, coax, or channel based. Bus based machines have adapter boards that connect to transceivers in the hub, while channel based machines use a personality module in the hub. One way that the UltraNet achieves its high transfer rates is by off-loading the protocol processing from the hosts to special purpose protocol engines in the UltraNet hubs. In addition, every hub has a PC connected to it by StarLAN for network management purposes. Although there is hub resident and PC resident UltraNet software, this document treats only the host resident UltraNet software.
Functionality screen of streptavidin mutants by non-denaturing SDS-PAGE using biotin-4-fluorescein.
Humbert, Nicolas; Ward, Thomas R
2008-01-01
Site-directed mutagenesis or directed evolution of proteins often leads to the production of inactive mutants. For streptavidin and related proteins, mutations may lead to the loss of their biotin-binding properties. With high-throughput screening methodologies in mind, it is imperative to detect, prior to high-density protein production, the bacteria that produce non-functional streptavidin isoforms. Based on the incorporation of biotin-4-fluorescein into streptavidin mutants present in Escherichia coli bacterial extracts, we detail a functional screen that allows the identification of biotin-binding streptavidin variants. Bacteria are cultivated in a small volume, followed by a rapid treatment of the cells; biotin-4-fluorescein is added to the bacterial extract, which is then loaded on a sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) gel under non-denaturing conditions. Visualization is performed using a UV transilluminator. This screen is thus easy to implement, cheap, and requires only readily available equipment.
Starch Applications for Delivery Systems
NASA Astrophysics Data System (ADS)
Li, Jason
2013-03-01
Starch is one of the most abundant and economical renewable biopolymers in nature. Starch molecules are high molecular weight polymers of D-glucose linked by α-(1,4) and α-(1,6) glycosidic bonds, forming linear (amylose) and branched (amylopectin) structures. Octenyl succinic anhydride modified starches (OSA-starch) are designed by carefully choosing a proper starch source, path and degree of modification. This enables emulsion and micro-encapsulation delivery systems for oil based flavors, micronutrients, fragrance, and pharmaceutical actives. A large percentage of flavors are encapsulated by spray drying in today's industry due to its high throughput. However, spray drying encapsulation faces constant challenges with retention of volatile compounds, oxidation of sensitive compound, and manufacturing yield. Specialty OSA-starches were developed suitable for the complex dynamics in spray drying and to provide high encapsulation efficiency and high microcapsule quality. The OSA starch surface activity, low viscosity and film forming capability contribute to high volatile retention and low active oxidation. OSA starches exhibit superior performance, especially in high solids and high oil load encapsulations compared with other hydrocolloids. The submission is based on research and development of Ingredion
Polonchuk, Liudmila
2014-01-01
Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered the profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using planar patch-clamp technology demonstrated rather moderate throughput, a few second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch(®) 96 allows throughput to be substantially increased without compromising data quality. This chapter describes features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.
HIGH THROUGHPUT ASSESSMENTS OF CONVENTIONAL AND ALTERNATIVE COMPOUNDS
High throughput approaches for quantifying chemical hazard, exposure, and sustainability have the potential to dramatically impact the pace and nature of risk assessments. Integrated evaluation strategies developed at the US EPA incorporate inherency,bioactivity,bioavailability, ...
GiNA, an efficient and high-throughput software for horticultural phenotyping
USDA-ARS?s Scientific Manuscript database
Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed,...
High-throughput quantification of hydroxyproline for determination of collagen.
Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan
2011-10-15
An accurate and high-throughput assay for collagen is essential for collagen research and development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The time required for sample preparation using acid hydrolysis and neutralization prior to assay is what limits the current method for determining hydroxyproline. This work describes the conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline. Copyright © 2011 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.
2016-09-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with rapid, high-sample-throughput analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hakala, Jacqueline Alexandra
2016-11-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with rapid, high-sample-throughput analysis.
A Memory Efficient Network Encryption Scheme
NASA Astrophysics Data System (ADS)
El-Fotouh, Mohamed Abo; Diepold, Klaus
In this paper, we studied the two widely used encryption schemes in network applications. Shortcomings have been found in both schemes, as they either consume more memory to gain high throughput or use less memory at the cost of low throughput. As the number of internet users increases each day, the need has arisen for a scheme that has low memory requirements and at the same time possesses high speed. We used the SSM model [1] to construct an encryption scheme based on the AES. The proposed scheme possesses high throughput together with low memory requirements.
HTP-NLP: A New NLP System for High Throughput Phenotyping.
Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L
2017-01-01
Secondary use of clinical data for research requires a method to quickly process the data so that researchers can quickly extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially-processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.
Air Force Research Laboratory Resident Associateship Program Continuation
2014-12-04
Fragmentary report content (table residue; only fragments are recoverable): associate and advisor records including Veremyev, Alexander Fedorovich and Pasiliao, Eduardo Lewis; tenure dates 2011-7/17/2012 and 8/1/2012-7/31/2013; United States; Russia; Sensors Directorate. Research excerpts mention the effect of mass and damping on modal characteristics; aerodynamic loads estimated from wind-tunnel test data, where the angle of attack of the...; and wireless-network topics including Throughput Optimization for Cognitive Radio Network with Slowly Varying Channels and Capacity Optimization of MIMO Links with...
High-throughput screening of chromatographic separations: IV. Ion-exchange.
Kelley, Brian D; Switzer, Mary; Bastek, Patrick; Kramarczyk, Jack F; Molnar, Kathleen; Yu, Tianning; Coffman, Jon
2008-08-01
Ion-exchange (IEX) chromatography steps are widely applied in protein purification processes because of their high capacity, selectivity, robust operation, and well-understood principles. Optimization of IEX steps typically involves resin screening and selection of the pH and counterion concentrations of the load, wash, and elution steps. Time and material constraints associated with operating laboratory columns often preclude evaluating more than 20-50 conditions during early stages of process development. To overcome this limitation, a high-throughput screening (HTS) system employing a robotic liquid handling system and 96-well filterplates was used to evaluate various operating conditions for IEX steps for monoclonal antibody (mAb) purification. A screening study for an adsorptive cation-exchange step evaluated eight different resins. Sodium chloride concentrations defining the operating boundaries of product binding and elution were established at four different pH levels for each resin. Adsorption isotherms were measured for 24 different pH and salt combinations for a single resin. An anion-exchange flowthrough step was then examined, generating data on mAb adsorption for 48 different combinations of pH and counterion concentration for three different resins. The mAb partition coefficients were calculated and used to estimate the characteristic charge of the resin-protein interaction. Host cell protein and residual Protein A impurity levels were also measured, providing information on selectivity within this operating window. The HTS system shows promise for accelerating process development of IEX steps, enabling rapid acquisition of large datasets addressing the performance of the chromatography step under many different operating conditions. (c) 2008 Wiley Periodicals, Inc.
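A minimal sketch of two calculations mentioned above, partition coefficients from batch filter-plate binding data and a characteristic charge from the slope of log Kp versus log counterion concentration, is shown below with invented numbers; the paper's actual data and resins are not reproduced.

```python
# Sketch: partition coefficients from batch (filter-plate) binding data and a
# characteristic-charge estimate from the slope of log Kp vs log [counterion].
# All numbers are invented for illustration.
import numpy as np

def partition_coefficient(c0, c_sup, v_liquid_uL, v_resin_uL):
    """Kp = (bound protein per resin volume) / (free protein concentration)."""
    q = (c0 - c_sup) * v_liquid_uL / v_resin_uL   # mg protein per mL resin
    return q / c_sup

salt_mM = np.array([50, 100, 150, 200])           # counterion concentrations tested
c0, v_liq, v_resin = 1.0, 300.0, 10.0             # mg/mL load, µL liquid, µL resin
c_sup = np.array([0.05, 0.25, 0.55, 0.75])        # supernatant conc. after binding

kp = partition_coefficient(c0, c_sup, v_liq, v_resin)
slope, intercept = np.polyfit(np.log10(salt_mM), np.log10(kp), 1)
print(f"characteristic charge z ~ {-slope:.1f}")   # stoichiometric-displacement estimate
```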
NASA Astrophysics Data System (ADS)
Kudoh, Eisuke; Ito, Haruki; Wang, Zhisen; Adachi, Fumiyuki
In mobile communication systems, high-speed packet data services are in demand. In high-speed data transmission, throughput degrades severely due to inter-path interference (IPI). Recently, we proposed a random transmit power control (TPC) to increase the uplink throughput of DS-CDMA packet mobile communications. In this paper, we apply IPI cancellation in addition to the random TPC. We derive a numerical expression for the received signal-to-interference plus noise power ratio (SINR) and introduce an IPI cancellation factor. We also derive a numerical expression for the system throughput when IPI is cancelled ideally, to compare with the system throughput evaluated numerically by Monte Carlo computation. Then we evaluate, by the Monte Carlo numerical computation method, the combined effect of random TPC and IPI cancellation on the uplink throughput of DS-CDMA packet mobile communications.
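The paper's exact SINR and throughput expressions are not reproduced here; the sketch below assumes a simplified model in which a cancellation factor removes a fraction of the inter-path interference power, and it maps the resulting SINR to throughput with a Shannon bound as a stand-in.

```python
# Sketch: effect of an IPI cancellation factor on received SINR and throughput.
# Simplified model (not the paper's expressions): a fraction (1 - beta) of the
# inter-path interference remains after cancellation; the Shannon bound is used
# as a stand-in throughput mapping.
import numpy as np

def sinr(signal_power, ipi_power, noise_power, beta):
    """beta = 1 -> ideal IPI cancellation, beta = 0 -> no cancellation."""
    return signal_power / ((1.0 - beta) * ipi_power + noise_power)

def throughput_bps_per_hz(sinr_linear):
    return np.log2(1.0 + sinr_linear)

signal, ipi, noise = 1.0, 0.5, 0.1
for beta in (0.0, 0.5, 0.9, 1.0):
    s = sinr(signal, ipi, noise, beta)
    print(f"beta={beta:.1f}  SINR={10 * np.log10(s):5.1f} dB  "
          f"throughput={throughput_bps_per_hz(s):.2f} bit/s/Hz")
```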
Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)
High throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...
Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment
Yan, Qimin; Yu, Jie; Suram, Santosh K.; ...
2017-03-06
The limited number of known low-band-gap photoelectrocatalytic materials poses a significant challenge for the generation of chemical fuels from sunlight. Here, using high-throughput ab initio theory with experiments in an integrated workflow, we find eight ternary vanadate oxide photoanodes in the target band-gap range (1.2-2.8 eV). Detailed analysis of these vanadate compounds reveals the key role of VO4 structural motifs and electronic band-edge character in efficient photoanodes, initiating a genome for such materials and paving the way for a broadly applicable high-throughput-discovery and materials-by-design feedback loop. Considerably expanding the number of known photoelectrocatalysts for water oxidation, our study establishes ternary metal vanadates as a prolific class of photoanode materials for generation of chemical fuels from sunlight and demonstrates our high-throughput theory-experiment pipeline as a prolific approach to materials discovery.
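As a tiny illustration of the band-gap screening step in such a workflow, the sketch below filters a candidate list into the 1.2-2.8 eV target window; the compound names and values are placeholders, not the study's results.

```python
# Sketch: filter computed candidates into the 1.2-2.8 eV photoanode target window.
# Compound names and band gaps below are placeholders, not the study's data.
candidates = {"vanadate-A": 3.4, "vanadate-B": 2.1, "vanadate-C": 1.0, "vanadate-D": 2.6}

target = {name: gap for name, gap in candidates.items() if 1.2 <= gap <= 2.8}
print(target)  # {'vanadate-B': 2.1, 'vanadate-D': 2.6}
```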
Microfluidics for cell-based high throughput screening platforms - A review.
Du, Guansheng; Fang, Qun; den Toonder, Jaap M J
2016-01-15
In the last decades, the basic techniques of microfluidics for the study of cells such as cell culture, cell separation, and cell lysis, have been well developed. Based on cell handling techniques, microfluidics has been widely applied in the field of PCR (Polymerase Chain Reaction), immunoassays, organ-on-chip, stem cell research, and analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows rapid analysis of thousands of chemical, biochemical, genetic or pharmacological tests in parallel. In this review, we summarize the application of microfluidics in cell-based high throughput screening. The screening methods mentioned in this paper include approaches using the perfusion flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidic based high throughput screening platform for drug discovery. Copyright © 2015 Elsevier B.V. All rights reserved.
Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Qimin; Yu, Jie; Suram, Santosh K.
The limited number of known low-band-gap photoelectrocatalytic materials poses a significant challenge for the generation of chemical fuels from sunlight. Here, using high-throughput ab initio theory with experiments in an integrated workflow, we find eight ternary vanadate oxide photoanodes in the target band-gap range (1.2-2.8 eV). Detailed analysis of these vanadate compounds reveals the key role of VO4 structural motifs and electronic band-edge character in efficient photoanodes, initiating a genome for such materials and paving the way for a broadly applicable high-throughput-discovery and materials-by-design feedback loop. Considerably expanding the number of known photoelectrocatalysts for water oxidation, our study establishes ternary metal vanadates as a prolific class of photoanode materials for generation of chemical fuels from sunlight and demonstrates our high-throughput theory-experiment pipeline as a prolific approach to materials discovery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.
Development and Validation of an Automated High-Throughput System for Zebrafish In Vivo Screenings
Virto, Juan M.; Holgado, Olaia; Diez, Maria; Izpisua Belmonte, Juan Carlos; Callol-Massot, Carles
2012-01-01
The zebrafish is a vertebrate model compatible with the paradigms of drug discovery. The small size and transparency of zebrafish embryos make them amenable for the automation necessary in high-throughput screenings. We have developed an automated high-throughput platform for in vivo chemical screenings on zebrafish embryos that includes automated methods for embryo dispensation, compound delivery, incubation, imaging and analysis of the results. At present, two different assays to detect cardiotoxic compounds and angiogenesis inhibitors can be automatically run in the platform, showing the versatility of the system. A validation of these two assays with known positive and negative compounds, as well as a screening for the detection of unknown anti-angiogenic compounds, have been successfully carried out in the system developed. We present a totally automated platform that allows for high-throughput screenings in a vertebrate organism. PMID:22615792
BiQ Analyzer HT: locus-specific analysis of DNA methylation by high-throughput bisulfite sequencing
Lutsik, Pavlo; Feuerbach, Lars; Arand, Julia; Lengauer, Thomas; Walter, Jörn; Bock, Christoph
2011-01-01
Bisulfite sequencing is a widely used method for measuring DNA methylation in eukaryotic genomes. The assay provides single-base pair resolution and, given sufficient sequencing depth, its quantitative accuracy is excellent. High-throughput sequencing of bisulfite-converted DNA can be applied either genome wide or targeted to a defined set of genomic loci (e.g. using locus-specific PCR primers or DNA capture probes). Here, we describe BiQ Analyzer HT (http://biq-analyzer-ht.bioinf.mpi-inf.mpg.de/), a user-friendly software tool that supports locus-specific analysis and visualization of high-throughput bisulfite sequencing data. The software facilitates the shift from time-consuming clonal bisulfite sequencing to the more quantitative and cost-efficient use of high-throughput sequencing for studying locus-specific DNA methylation patterns. In addition, it is useful for locus-specific visualization of genome-wide bisulfite sequencing data. PMID:21565797
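The core locus-specific quantity computed by such tools is the methylation level at each CpG across reads. The sketch below shows that calculation on toy, pre-aligned reads; real analyses (as in BiQ Analyzer HT) also handle alignment, quality filtering, and bisulfite conversion checks.

```python
# Sketch: per-CpG methylation fraction from bisulfite reads already aligned to a
# reference amplicon. Toy sequences only; 'C' at a CpG = methylated (protected),
# 'T' = converted (unmethylated).
reference = "ACGTTCGAACGT"   # CpG sites start at 0-based positions 1, 5, 9
reads = [
    "ACGTTTGAACGT",           # CpG1 methylated, CpG2 converted, CpG3 methylated
    "ATGTTCGAATGT",           # CpG1 converted, CpG2 methylated, CpG3 converted
    "ACGTTCGAACGT",           # all three CpGs methylated
]

cpg_positions = [i for i in range(len(reference) - 1) if reference[i:i + 2] == "CG"]

for pos in cpg_positions:
    calls = [read[pos] for read in reads]
    methylated = calls.count("C") / len(calls)
    print(f"CpG at position {pos}: {methylated:.2f} methylated")
```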
Yeow, Jonathan; Joshi, Sanket; Chapman, Robert; Boyer, Cyrille Andre Jean Marie
2018-04-25
Translating controlled/living radical polymerization (CLRP) from batch to the high throughput production of polymer libraries presents several challenges in terms of both polymer synthesis and characterization. Although recently there have been significant advances in the field of low volume, high throughput CLRP, techniques able to simultaneously monitor multiple polymerizations in an "online" manner have not yet been developed. Here, we report our discovery that 5,10,15,20-tetraphenyl-21H,23H-porphine zinc (ZnTPP) is a self-reporting photocatalyst that can mediate PET-RAFT polymerization as well as report on monomer conversion via changes in its fluorescence properties. This enables the use of a microplate reader to conduct high throughput "online" monitoring of PET-RAFT polymerizations performed directly in 384-well, low volume microtiter plates. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...
2017-03-28
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.
High Throughput Genotoxicity Profiling of the US EPA ToxCast Chemical Library
A key aim of the ToxCast project is to investigate modern molecular and genetic high content and high throughput screening (HTS) assays, along with various computational tools to supplement and perhaps replace traditional assays for evaluating chemical toxicity. Genotoxicity is a...
A multiplex calibrated real-time PCR assay for quantitation of DNA of EBV-1 and 2.
Gatto, Francesca; Cassina, Giulia; Broccolo, Francesco; Morreale, Giuseppe; Lanino, Edoardo; Di Marco, Eddi; Vardas, Efthiya; Bernasconi, Daniela; Buttò, Stefano; Principi, Nicola; Esposito, Susanna; Scarlatti, Gabriella; Lusso, Paolo; Malnati, Mauro S
2011-12-01
Accurate and highly sensitive tests for the diagnosis of active Epstein-Barr virus (EBV) infection are essential for the clinical management of individuals infected with EBV. A calibrated quantitative real-time PCR assay for the measurement of EBV DNA of both EBV-1 and 2 subtypes was developed, combining the detection of EBV DNA and a synthetic DNA calibrator in a multiplex PCR format. The assay displays a wide dynamic range and a high degree of accuracy even in the presence of 1 μg of human genomic DNA. This assay measures EBV DNA from strains prevalent in different geographic areas with the same efficiency. The clinical sensitivity and specificity of the system were evaluated by testing 181 peripheral blood mononuclear cell (PBMC) and plasma specimens obtained from 21 patients subjected to bone marrow transplantation, 70 HIV-seropositive subjects and 23 healthy controls. Patients affected by EBV-associated post-transplant lymphoproliferative disorders had the highest frequency of EBV detection and the highest viral load. Persons infected with HIV had higher levels of EBV DNA load in PBMCs and a higher frequency of EBV plasma viremia compared to healthy controls. In conclusion, this new assay provides a reliable high-throughput method for the quantitation of EBV DNA in clinical samples. Copyright © 2011 Elsevier B.V. All rights reserved.
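As a toy illustration of calibrator-based quantitation in a multiplex reaction, the sketch below scales the target copy number by recovery of a co-amplified synthetic calibrator, assuming roughly 100% amplification efficiency and invented Ct values; it is not the paper's calculation.

```python
# Sketch: calibrator-normalized absolute quantitation from a multiplex qPCR.
# Assumes ~100% amplification efficiency (factor 2 per cycle) and invented Ct
# values; real assays calibrate efficiency and run standards.
def copies_from_ct(ct_target, ct_calibrator, calibrator_input_copies):
    """Target copies, scaled by recovery of the co-amplified synthetic calibrator."""
    # With equal efficiencies, copy ratio = 2 ** (Ct_calibrator - Ct_target)
    return calibrator_input_copies * 2.0 ** (ct_calibrator - ct_target)

ebv_copies = copies_from_ct(ct_target=31.0, ct_calibrator=28.0, calibrator_input_copies=1_000)
print(f"estimated EBV DNA: {ebv_copies:.0f} copies per reaction")  # 125 copies
```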
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering☆
Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan
2016-01-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875
Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella
2012-01-01
Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue® mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents. PMID:22735701
Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella
2012-08-01
Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents.
Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C
2016-01-01
Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative about biopolymer folding and ligand interactions, they require considerable manual intervention for cleaning and setup. As such, throughput is typically limited to a few runs per day. With a large number of experimental parameters to explore including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort of maintaining high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has increased data throughput, which has been leveraged into grant support, has helped attract new faculty hires and has led to several exciting publications. © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Megan; Nordmeyer, Robert A.; Cornell, Earl
2009-10-02
To facilitate a direct interface between protein separation by PAGE and protein identification by mass spectrometry, we developed a multichannel system that continuously collects fractions as protein bands migrate off the bottom of gel electrophoresis columns. The device was constructed using several short linear gel columns, each of a different percent acrylamide, to achieve a separation power similar to that of a long gradient gel. A Counter Free-Flow elution technique then allows continuous and simultaneous fraction collection from multiple channels at low cost. We demonstrate that rapid, high-resolution separation of a complex protein mixture can be achieved on this system using SDS-PAGE. In a 2.5 h electrophoresis run, for example, each sample was separated and eluted into 48-96 fractions over a mass range of 10-150 kDa; sample recovery rates were 50% or higher; each channel was loaded with up to 0.3 mg of protein in 0.4 mL; and a purified band was eluted in two to three fractions (200 μL/fraction). Similar results were obtained when running native gel electrophoresis, but protein aggregation limited the loading capacity to about 50 μg per channel and reduced resolution.
Tran, Benjamin; Grosskopf, Vanessa; Wang, Xiangdan; Yang, Jihong; Walker, Don; Yu, Christopher; McDonald, Paul
2016-03-18
Purification processes for therapeutic antibodies typically exploit multiple and orthogonal chromatography steps in order to remove impurities, such as host-cell proteins. While the majority of host-cell proteins are cleared through purification processes, individual host-cell proteins such as Phospholipase B-like 2 (PLBL2) are more challenging to remove and can persist into the final purification pool even after multiple chromatography steps. With packed-bed chromatography runs using host-cell protein ELISAs and mass spectrometry analysis, we demonstrated that different therapeutic antibodies interact to varying degrees with host-cell proteins in general, and PLBL2 specifically. We then used a high-throughput Protein A chromatography method to further examine the interaction between our antibodies and PLBL2. Our results showed that the co-elution of PLBL2 during Protein A chromatography is highly dependent on the individual antibody and PLBL2 concentration in the chromatographic load. Process parameters such as antibody resin load density and pre-elution wash conditions also influence the levels of PLBL2 in the Protein A eluate. Furthermore, using surface plasmon resonance, we demonstrated that there is a preference for PLBL2 to interact with IgG4 subclass antibodies compared to IgG1 antibodies. Copyright © 2016 Elsevier B.V. All rights reserved.
CERN data services for LHC computing
NASA Astrophysics Data System (ADS)
Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.
2017-10-01
Dependability, resilience, adaptability and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent complex production workloads. In parallel our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR for large-scale storage; CERNBox for end-user access and sharing; Ceph as data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy distributed-file-system services. In this paper we summarise our experience in supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment with pluggable protocols, tuneable QoS, sharing capabilities and fine-grained ACL management, while continuing to guarantee dependable and robust services.
Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis
NASA Technical Reports Server (NTRS)
Montgomery, Todd L.
1995-01-01
This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP disproves this belief. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain high fidelity between the design, implementation, and verification models. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives in the product development.
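The totally ordered delivery guarantee is easiest to picture with a toy sequencer model. The following sketch is not RMP itself (RMP rotates ordering responsibility among sites rather than using a fixed sequencer); it only illustrates the core idea that receivers deliver messages in global sequence order and buffer early arrivals. All class and variable names are invented.

```python
# Toy illustration of totally ordered delivery (not the actual RMP algorithm):
# an ordering site stamps each multicast with a global sequence number, and every
# receiver delivers in stamp order, buffering out-of-order arrivals.
class OrderingSite:
    def __init__(self):
        self.next_seq = 0

    def stamp(self, payload):
        seq, self.next_seq = self.next_seq, self.next_seq + 1
        return (seq, payload)

class Receiver:
    def __init__(self):
        self.expected = 0
        self.buffer = {}      # seq -> payload, holds early arrivals
        self.delivered = []

    def on_packet(self, packet):
        seq, payload = packet
        self.buffer[seq] = payload
        while self.expected in self.buffer:          # deliver strictly in total order
            self.delivered.append(self.buffer.pop(self.expected))
            self.expected += 1

site, rx = OrderingSite(), Receiver()
msgs = [site.stamp(m) for m in ("a", "b", "c")]
for pkt in (msgs[2], msgs[0], msgs[1]):              # packets arrive out of order
    rx.on_packet(pkt)
print(rx.delivered)                                  # ['a', 'b', 'c']
```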
On-chip ultraviolet holography for high-throughput nanoparticle and biomolecule detection
NASA Astrophysics Data System (ADS)
Daloglu, Mustafa Ugur; Ray, Aniruddha; Gorocs, Zoltán.; Xiong, Matthew; Malik, Ravinder; Bitan, Gal; McLeod, Euan; Ozcan, Aydogan
2018-02-01
Nanoparticle and biomolecule imaging has become an important need for various applications. In an effort to find a higher throughput alternative to existing devices, we have designed a lensfree on-chip holographic imaging platform operating at an ultraviolet (UV) wavelength of 266 nm. With a custom-designed free-space light delivery system to illuminate the sample that is placed very close (<0.5 mm) to an opto-electronic image sensor chip, without any imaging lenses in between, the full active area of the imager chip (>16 mm2 ) was utilized as the imaging field-of-view (FOV) capturing holographic signatures of target objects on a chip. These holograms were then digitally back propagated to extract both the amplitude and phase information of the sample. The increased forward scattering from nanoparticles due to this shorter illumination wavelength has enabled us to image individual particles that are smaller than 30 nm over an FOV of >16 mm2 . Our platform was further utilized in high-contrast imaging of nanoscopic biomolecule aggregates since 266 nm illumination light is strongly absorbed by biomolecules including proteins and nucleic acids. Aggregates of Cu/Zn-superoxide dismutase (SOD1), which has been linked to a fatal neurodegenerative disease, ALS (amyotrophic lateral sclerosis), have been imaged with significantly improved contrast compared to imaging at visible wavelengths. This unique UV imaging modality could be valuable for biomedical applications (e.g., viral load measurements) and environmental monitoring including air and water quality monitoring.
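The digital back-propagation step mentioned above is commonly implemented with the angular spectrum method. The snippet below is a generic, hedged sketch of that reconstruction (not the authors' code); the wavelength, pixel pitch and propagation distance are placeholder values, and a real pipeline would also include phase retrieval and background normalization.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, z):
    """Propagate a complex field by distance z (metres) with the angular spectrum method.
    A negative z back-propagates a recorded hologram toward the object plane."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0.0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Placeholder usage: treat the square root of the recorded intensity as the field amplitude
# and back-propagate 500 um at 266 nm with a 1.12 um pixel pitch (all values illustrative).
hologram = np.random.rand(512, 512)                   # stand-in for a recorded hologram
recon = angular_spectrum_propagate(np.sqrt(hologram), 266e-9, 1.12e-6, -500e-6)
amplitude, phase = np.abs(recon), np.angle(recon)
```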
A high-throughput method for generating uniform microislands for autaptic neuronal cultures
Sgro, Allyson E.; Nowak, Amy L.; Austin, Naola S.; Custer, Kenneth L.; Allen, Peter B.; Chiu, Daniel T.; Bajjalieh, Sandra M.
2013-01-01
Generating microislands of culture substrate on coverslips by spray application of poly-D lysine is a commonly used method for culturing isolated neurons that form self (autaptic) synapses. This preparation has multiple advantages for studying synaptic transmission in isolation; however, generating microislands by spraying produces islands of non-uniform size and thus cultures vary widely in the number of islands containing single neurons. To address these problems, we developed a high-throughput method for reliably generating uniformly-shaped microislands of culture substrate. Stamp molds formed of poly(dimethylsiloxane) (PDMS) were fabricated with arrays of circles and used to generate stamps made of 9.2% agarose. The agarose stamps were capable of loading sufficient poly D-lysine and collagen dissolved in acetic acid to rapidly generate coverslips containing at least 64 microislands per coverslip. When hippocampal neurons were cultured on these coverslips, there were significantly more single-neuron islands per coverslip. We noted that single neurons tended to form one of three distinct neurite-arbor morphologies, which varied with island size and the location of the cell body on the island. To our surprise, the number of synapses per autaptic neuron did not correlate with arbor shape or island size, suggesting that other factors regulate the number of synapses formed by isolated neurons. The stamping method we report can be used to increase the number of single-neuron islands per culture and aid in the rapid visualization of microislands. PMID:21515305
Ken Dror, Shifra; Pavlotzky, Elsa; Barak, Mira
2016-01-01
Infectious gastroenteritis is a global health problem associated with high morbidity and mortality rates. Rapid and accurate diagnosis is crucial to allow appropriate and timely treatment. Current laboratory stool testing has a long turnaround time (TAT) and demands highly qualified personnel and multiple techniques. The need for high throughput and the number of possible enteric pathogens compels the implementation of a molecular approach which uses multiplex technology, without compromising performance requirements. In this work we evaluated the feasibility of the NanoCHIP® Gastrointestinal Panel (GIP) (Savyon Diagnostics, Ashdod, IL), a molecular microarray-based screening test, to be used in the routine workflow of our laboratory, a large outpatient microbiology laboratory. The NanoCHIP® GIP test provides simultaneous detection of nine major enteric bacteria and parasites: Campylobacter spp., Salmonella spp., Shigella spp., Giardia sp., Cryptosporidium spp., Entamoeba histolytica, Entamoeba dispar, Dientamoeba fragilis, and Blastocystis spp. The required high throughput was achieved by combining the NanoCHIP® detection system with the MagNA Pure 96 DNA purification system (Roche Diagnostics Ltd., Switzerland). This combined system has demonstrated a higher sensitivity and detection yield compared to the conventional methods in both retrospective and prospective samples. The identification of multiple parasites and bacteria in a single test also enabled increased efficiency of detecting mixed infections, as well as reduced hands-on time and workload. In conclusion, the combination of these two automated systems addresses the laboratory's needs by improving workflow and turnaround time and minimizing human errors, and can be efficiently integrated into the routine work of the laboratory. PMID:27447173
I describe research on high throughput exposure and toxicokinetics. These tools provide context for data generated by high throughput toxicity screening to allow risk-based prioritization of thousands of chemicals.
MIPHENO: Data normalization for high throughput metabolic analysis.
High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...
High-Throughput Pharmacokinetics for Environmental Chemicals (SOT)
High throughput screening (HTS) promises to allow prioritization of thousands of environmental chemicals with little or no in vivo information. For bioactivity identified by HTS, toxicokinetic (TK) models are essential to predict exposure thresholds below which no significant bio...
Gore, Brooklin
2018-02-01
This presentation includes a brief background on High Throughput Computing, correlating gene transcription factors, optical mapping, genotype to phenotype mapping via QTL analysis, and current work on next gen sequencing.
Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas
2017-01-01
Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive imaging based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high throughput amenable protocols recording PSII operating efficiency (Φ PSII ). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of Φ PSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis leads to a substantial extension of the feature spectrum to be assessed in the presented high throughput automated plant phenotyping platforms, thus enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m height) and architectures. The deeper understanding of the relation of plant architecture, biomass formation and photosynthetic efficiency has a great potential with respect to crop and yield improvement strategies.
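For reference, the two PSII parameters screened here are simple ratios of fluorescence signals. The snippet below is a generic textbook calculation rather than the FluorCam/LemnaTec processing pipeline, and the signal values are placeholders.

```python
def fv_over_fm(f0: float, fm: float) -> float:
    """Maximum PSII efficiency of a dark-adapted plant: Fv/Fm = (Fm - F0) / Fm."""
    return (fm - f0) / fm

def phi_psii(fs: float, fm_prime: float) -> float:
    """PSII operating efficiency in the light: PhiPSII = (Fm' - Fs) / Fm'."""
    return (fm_prime - fs) / fm_prime

# Placeholder signals from one plant (arbitrary fluorescence units).
print(round(fv_over_fm(f0=400.0, fm=2000.0), 3))       # 0.8, typical for unstressed plants
print(round(phi_psii(fs=900.0, fm_prime=1500.0), 3))   # 0.4
```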
USDA-ARS's Scientific Manuscript database
Field-based high-throughput phenotyping is an emerging approach to characterize difficult, time-sensitive plant traits in relevant growing conditions. Proximal sensing carts have been developed as an alternative platform to more costly high-clearance tractors for phenotyping dynamic traits in the fi...
High-throughput profiling and analysis of plant responses over time to abiotic stress
USDA-ARS's Scientific Manuscript database
Energy sorghum (Sorghum bicolor (L.) Moench) is a rapidly growing, high-biomass, annual crop prized for abiotic stress tolerance. Measuring genotype-by-environment (G x E) interactions remains a progress bottleneck. High throughput phenotyping within controlled environments has been proposed as a po...
ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)
US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...
A high-throughput multiplex method adapted for GMO detection.
Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique
2008-12-24
A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon-specific endogenous reference genes, from GMO constructs, from screening targets, from construct-specific and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.
RIPiT-Seq: A high-throughput approach for footprinting RNA:protein complexes
Singh, Guramrit; Ricci, Emiliano P.; Moore, Melissa J.
2013-01-01
Development of high-throughput approaches to map the RNA interaction sites of individual RNA binding proteins (RBPs) transcriptome-wide is rapidly transforming our understanding of post-transcriptional gene regulatory mechanisms. Here we describe a ribonucleoprotein (RNP) footprinting approach we recently developed for identifying occupancy sites of both individual RBPs and multi-subunit RNP complexes. RNA:protein immunoprecipitation in tandem (RIPiT) yields highly specific RNA footprints of cellular RNPs isolated via two sequential purifications; the resulting RNA footprints can then be identified by high-throughput sequencing (Seq). RIPiT-Seq is broadly applicable to all RBPs regardless of their RNA binding mode and thus provides a means to map the RNA binding sites of RBPs with poor inherent ultraviolet (UV) crosslinkability. Further, among current high-throughput approaches, RIPiT has the unique capacity to differentiate binding sites of RNPs with overlapping protein composition. It is therefore particularly suited for studying dynamic RNP assemblages whose composition evolves as gene expression proceeds. PMID:24096052
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-01
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930
Micro injector sample delivery system for charged molecules
Davidson, James C.; Balch, Joseph W.
1999-11-09
A micro injector sample delivery system for charged molecules. The injector is used for collecting and delivering controlled amounts of charged molecule samples for subsequent analysis. The injector delivery system can be scaled to large numbers (>96) for sample delivery to massively parallel high throughput analysis systems. The essence of the injector system is an electric field controllable loading tip including a section of porous material. By applying the appropriate polarity bias potential to the injector tip, charged molecules will migrate into porous material, and by reversing the polarity bias potential the molecules are ejected or forced away from the tip. The invention has application for uptake of charged biological molecules (e.g. proteins, nucleic acids, polymers, etc.) for delivery to analytical systems, and can be used in automated sample delivery systems.
A high performance, ad-hoc, fuzzy query processing system for relational databases
NASA Technical Reports Server (NTRS)
Mansfield, William H., Jr.; Fleischman, Robert M.
1992-01-01
Database queries involving imprecise or fuzzy predicates are currently an evolving area of academic and industrial research. Such queries place severe stress on the indexing and I/O subsystems of conventional database environments since they involve the search of large numbers of records. The Datacycle architecture and research prototype is a database environment that uses filtering technology to perform an efficient, exhaustive search of an entire database. It has recently been modified to include fuzzy predicates in its query processing. The approach obviates the need for complex index structures, provides unlimited query throughput, permits the use of ad-hoc fuzzy membership functions, and provides a deterministic response time largely independent of query complexity and load. This paper describes the Datacycle prototype implementation of fuzzy queries and some recent performance results.
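To make the notion of an ad-hoc fuzzy predicate concrete, the toy example below scores records with a user-supplied membership function during an exhaustive scan, which is the filtering style the Datacycle approach exploits. The schema, membership shapes and cutoff are invented for illustration and are not taken from the prototype.

```python
# Toy fuzzy query over an exhaustive record scan (illustrative; not the Datacycle code).
def near(target, width):
    """Triangular membership function: 1.0 at target, falling to 0.0 at +/- width."""
    return lambda x: max(0.0, 1.0 - abs(x - target) / width)

records = [
    {"city": "A", "price": 95, "distance_km": 2.0},
    {"city": "B", "price": 140, "distance_km": 0.5},
    {"city": "C", "price": 110, "distance_km": 1.0},
]

cheap = near(target=90, width=60)       # "price is cheap-ish"
close = near(target=0, width=3)         # "distance is short"

# Combine predicates with min() (a common fuzzy AND) and keep matches above a cutoff.
scored = [(min(cheap(r["price"]), close(r["distance_km"])), r) for r in records]
for score, rec in sorted(scored, key=lambda t: t[0], reverse=True):
    if score >= 0.3:
        print(round(score, 2), rec["city"])
```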
A modified error correction protocol for CCITT signalling system no. 7 on satellite links
NASA Astrophysics Data System (ADS)
Kreuer, Dieter; Quernheim, Ulrich
1991-10-01
Comité Consultatif International Télégraphique et Téléphonique (CCITT) Signalling System No. 7 (SS7) provides a level 2 error correction protocol particularly suited for links with propagation delays higher than 15 ms. Not being originally designed for satellite links, however, the so-called Preventive Cyclic Retransmission (PCR) method only performs well on satellite channels when traffic is low. A modified level 2 error control protocol, termed the Fix Delay Retransmission (FDR) method, is suggested, which performs better at high loads, thus providing a more efficient use of the limited carrier capacity. Both the PCR and FDR methods are investigated by means of simulation, and results concerning throughput, queueing delay, and system delay are presented. The FDR method exhibits higher capacity and shorter delay than the PCR method.
Nanostructured delivery system for Suberoylanilide hydroxamic acid against lung cancer cells.
Sankar, Renu; Karthik, Selvaraju; Subramanian, Natesan; Krishnaswami, Venkateshwaran; Sonnemann, Jürgen; Ravikumar, Vilwanathan
2015-06-01
With the objective of providing a potential approach for the treatment of lung cancer, nanotechnology-based suberoylanilide hydroxamic acid (SAHA)-loaded poly(D,L-lactide-co-glycolide) (PLGA) nanoparticles have been formulated using the nanoprecipitation technique. The acquired nanoparticles were characterized by various analytical techniques, and the analyses showed the presence of smooth, spherical SAHA-loaded PLGA nanoparticles with an encapsulation efficiency of 44.8% and a particle size of 208 nm. The compatibility between polymer and drug in the formulation was tested using FT-IR, micro-Raman spectral and DSC thermogram analyses, revealing a major interaction between the drug and polymer. The in vitro drug release from the SAHA-loaded PLGA nanoparticles was found to be biphasic, with an initial burst followed by a sustained release for up to 50 h. In experiments using the lung cancer cell line A549, SAHA-loaded PLGA nanoparticles demonstrated superior antineoplastic activity over free SAHA. In conclusion, SAHA-loaded PLGA nanoparticles may be a useful novel approach for the treatment of lung cancer. Copyright © 2015. Published by Elsevier B.V.
A high-throughput label-free nanoparticle analyser.
Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N
2011-05-01
Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.
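A quick way to check the reported power-law behaviour of a measured size distribution is a straight-line fit on log-log axes. The sketch below uses synthetic data as a stand-in for measured particle sizes; it is not the authors' analysis, and the exponent and size range are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic particle diameters (nm) drawn from a Pareto-like distribution as a stand-in
# for a measured nanoparticle background; exponent and size range are arbitrary.
diameters = 30.0 * (1.0 + rng.pareto(a=2.0, size=50_000))

# Histogram the sizes on logarithmic bins and fit log10(density) vs log10(diameter).
counts, edges = np.histogram(diameters,
                             bins=np.logspace(np.log10(30), np.log10(300), 30),
                             density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = counts > 0
slope, intercept = np.polyfit(np.log10(centers[mask]), np.log10(counts[mask]), 1)
print(f"fitted power-law exponent ~ {slope:.2f}")   # expect roughly -(a + 1) = -3
```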
High Performance Computing Modernization Program Kerberos Throughput Test Report
2017-10-26
functionality as Kerberos plugins. The pre-release production kit was used in these tests to compare against the current release kit. YubiKey support... Throughput testing was done to determine the benefits of the pre-... both the current release kit and the pre-release production kit for a total of 378 individual tests in order to note any improvements. Based on work
Metabolomics Approach for Toxicity Screening of Volatile Substances
In 2007 the National Research Council envisioned the need for inexpensive, high throughput, cell based toxicity testing methods relevant to human health. High Throughput Screening (HTS) in vitro screening approaches have addressed these problems by using robotics. However, the ch...
AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.
As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...
New High Throughput Methods to Estimate Chemical Exposure
EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...
Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays
High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...
Leulliot, Nicolas; Trésaugues, Lionel; Bremang, Michael; Sorel, Isabelle; Ulryck, Nathalie; Graille, Marc; Aboulfath, Ilham; Poupon, Anne; Liger, Dominique; Quevillon-Cheruel, Sophie; Janin, Joël; van Tilbeurgh, Herman
2005-06-01
Crystallization has long been regarded as one of the major bottlenecks in high-throughput structural determination by X-ray crystallography. Structural genomics projects have addressed this issue by using robots to set up automated crystal screens using nanodrop technology. This has moved the bottleneck from obtaining the first crystal hit to obtaining diffraction-quality crystals, as crystal optimization is a notoriously slow process that is difficult to automate. This article describes the high-throughput optimization strategies used in the Yeast Structural Genomics project, with selected successful examples.
Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime
NASA Astrophysics Data System (ADS)
Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie
2017-09-01
Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
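The lifetime readout itself reduces to fitting an exponential decay to the TCSPC arrival-time histogram. The sketch below performs a simple single-exponential tail fit on synthetic data; a production analysis would also deconvolve the instrument response function, and the bin width and photon counts here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic TCSPC histogram: photon arrival times from a 3.2 ns mono-exponential decay,
# binned at 50 ps (values are placeholders; real data would need IRF deconvolution).
true_tau_ns, bin_ns = 3.2, 0.05
arrivals = rng.exponential(true_tau_ns, size=200_000)
counts, edges = np.histogram(arrivals, bins=np.arange(0.0, 25.0, bin_ns))
centers = edges[:-1] + bin_ns / 2

# Tail fit: the slope of ln(counts) vs time gives -1/tau for a single-exponential decay.
mask = counts > 20                      # avoid empty or very noisy bins in the far tail
slope, _ = np.polyfit(centers[mask], np.log(counts[mask]), 1)
print(f"fitted lifetime ~ {-1.0 / slope:.2f} ns")   # approximately 3.2 ns
```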
High Throughput Determination of Critical Human Dosing ...
High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data into predicted human equivalent doses that can be linked with biologically relevant exposure scenarios. Thus, HTTK provides essential data for risk prioritization for thousands of chemicals that lack TK data. One critical HTTK parameter that can be measured in vitro is the unbound fraction of a chemical in plasma (Fub). However, for chemicals that bind strongly to plasma, Fub is below the limits of detection (LOD) for high throughput analytical chemistry, and therefore cannot be quantified. A novel method for quantifying Fub was implemented for 85 strategically selected chemicals: measurement of Fub was attempted at 10%, 30%, and 100% of physiological plasma concentrations using rapid equilibrium dialysis assays. Varying plasma concentrations instead of chemical concentrations makes high throughput analytical methodology more likely to be successful. Assays at 100% plasma concentration were unsuccessful for 34 chemicals. For 12 of these 34 chemicals, Fub could be quantified at 10% and/or 30% plasma concentrations; these results imply that the assay failure at 100% plasma concentration was caused by plasma protein binding for these chemicals. Assay failure for the remaining 22 chemicals may
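The abstract does not state how measurements made in diluted plasma are scaled back to physiological (100%) plasma, but a commonly used single-binding-site dilution correction is sketched below; whether the authors applied exactly this form is an assumption, and the example numbers are invented.

```python
def fub_undiluted(fu_measured: float, dilution_factor: float) -> float:
    """Scale a fraction unbound measured in diluted plasma back to 100% plasma.
    Assumes the standard single-binding-site dilution correction:
        fu = (1/D) / ((1/fu_d - 1) + 1/D)
    where fu_d is the measured value and D the dilution factor (e.g. D = 10 for 10% plasma).
    """
    d = dilution_factor
    return (1.0 / d) / ((1.0 / fu_measured - 1.0) + 1.0 / d)

# Hypothetical example: a chemical quantifiable only in 10% plasma with fu_d = 0.04
# corresponds to roughly fu ~ 0.004 in undiluted plasma.
print(round(fub_undiluted(0.04, 10), 4))
```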
Genome-wide RNAi Screening to Identify Host Factors That Modulate Oncolytic Virus Therapy.
Allan, Kristina J; Mahoney, Douglas J; Baird, Stephen D; Lefebvre, Charles A; Stojdl, David F
2018-04-03
High-throughput genome-wide RNAi (RNA interference) screening technology has been widely used for discovering host factors that impact virus replication. Here we present the application of this technology to uncovering host targets that specifically modulate the replication of Maraba virus, an oncolytic rhabdovirus, and vaccinia virus with the goal of enhancing therapy. While the protocol has been tested for use with oncolytic Maraba virus and oncolytic vaccinia virus, this approach is applicable to other oncolytic viruses and can also be utilized for identifying host targets that modulate virus replication in mammalian cells in general. This protocol describes the development and validation of an assay for high-throughput RNAi screening in mammalian cells, the key considerations and preparation steps important for conducting a primary high-throughput RNAi screen, and a step-by-step guide for conducting a primary high-throughput RNAi screen; in addition, it broadly outlines the methods for conducting secondary screen validation and tertiary validation studies. The benefit of high-throughput RNAi screening is that it allows one to catalogue, in an extensive and unbiased fashion, host factors that modulate any aspect of virus replication for which one can develop an in vitro assay such as infectivity, burst size, and cytotoxicity. It has the power to uncover biotherapeutic targets unforeseen based on current knowledge.
Schieferstein, Jeremy M.; Pawate, Ashtamurthy S.; Wan, Frank; Sheraden, Paige N.; Broecker, Jana; Ernst, Oliver P.; Gennis, Robert B.
2017-01-01
Elucidating and clarifying the function of membrane proteins ultimately requires atomic resolution structures as determined most commonly by X-ray crystallography. Many high impact membrane protein structures have resulted from advanced techniques such as in meso crystallization that present technical difficulties for the set-up and scale-out of high-throughput crystallization experiments. In prior work, we designed a novel, low-throughput X-ray transparent microfluidic device that automated the mixing of protein and lipid by diffusion for in meso crystallization trials. Here, we report X-ray transparent microfluidic devices for high-throughput crystallization screening and optimization that overcome the limitations of scale and demonstrate their application to the crystallization of several membrane proteins. Two complementary chips are presented: (1) a high-throughput screening chip to test 192 crystallization conditions in parallel using as little as 8 nl of membrane protein per well and (2) a crystallization optimization chip to rapidly optimize preliminary crystallization hits through fine-gradient re-screening. We screened three membrane proteins for new in meso crystallization conditions, identifying several preliminary hits that we tested for X-ray diffraction quality. Further, we identified and optimized the crystallization condition for a photosynthetic reaction center mutant and solved its structure to a resolution of 3.5 Å. PMID:28469762
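Fine-gradient re-screening around a preliminary hit amounts to laying out a small factorial grid centred on the hit condition. The sketch below generates such a grid for two hypothetical variables (precipitant concentration and pH) purely to illustrate the optimization-chip layout; it is not the authors' chemical space.

```python
import itertools

def fine_gradient(center, span, steps):
    """Evenly spaced values centred on a hit condition."""
    lo = center - span / 2
    return [round(lo + i * span / (steps - 1), 3) for i in range(steps)]

# Hypothetical hit: 24% (w/v) precipitant at pH 6.5; re-screen a 6 x 4 grid around it.
precipitant = fine_gradient(center=24.0, span=8.0, steps=6)    # 20.0 .. 28.0 %
ph          = fine_gradient(center=6.5,  span=1.5, steps=4)    # 5.75 .. 7.25

conditions = list(itertools.product(precipitant, ph))
print(len(conditions), "optimization wells, e.g.", conditions[0])
```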
High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.
Liu, Guangbo; Lanham, Clayton; Buchan, J Ross; Kaplan, Matthew E
2017-01-01
Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium Acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long established method. However, a reliable and optimized high throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high throughput molecular and/or genetic analysis of yeast.
High-Throughput Toxicity Testing: New Strategies for ...
In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it
Energy-efficient boarder node medium access control protocol for wireless sensor networks.
Razaque, Abdul; Elleithy, Khaled M
2014-03-12
This paper introduces the design, implementation, and performance analysis of the scalable and mobility-aware hybrid protocol named boarder node medium access control (BN-MAC) for wireless sensor networks (WSNs), which leverages the characteristics of scheduled and contention-based MAC protocols. Like contention-based MAC protocols, BN-MAC achieves high channel utilization, network adaptability under heavy traffic and mobility, and low latency and overhead. Like schedule-based MAC protocols, BN-MAC reduces idle listening time, emissions, and collision handling at low cost at one-hop neighbor nodes and achieves high channel utilization under heavy network loads. BN-MAC is particularly designed for region-wise WSNs. Each region is controlled by a boarder node (BN), which is of paramount importance. The BN coordinates with the remaining nodes within and beyond the region. Unlike other hybrid MAC protocols, BN-MAC incorporates three promising models that further reduce the energy consumption, idle listening time, overhearing, and congestion to improve the throughput and reduce the latency. One of the models used with BN-MAC is automatic active and sleep (AAS), which reduces the idle listening time. When nodes finish their monitoring process, AAS lets them automatically go into the sleep state to avoid the idle listening state. Another model used in BN-MAC is the intelligent decision-making (IDM) model, which helps the nodes sense the nature of the environment. Based on the nature of the environment, the nodes decide whether to use the active or passive mode. This decision power of the nodes further reduces energy consumption because the nodes turn off the radio of the transceiver in the passive mode. The third model is the least-distance smart neighboring search (LDSNS), which determines the shortest efficient path to the one-hop neighbor and also provides cross-layering support to handle the mobility of the nodes. The BN-MAC also incorporates a semi-synchronous feature with a low duty cycle, which is advantageous for reducing the latency and energy consumption for several WSN application areas to improve the throughput. BN-MAC uses a unique window slot size to enhance the contention resolution issue for improved throughput. BN-MAC also prefers to communicate within a one-hop destination using Anycast, which maintains load balancing to maintain network reliability. BN-MAC is introduced with the goal of supporting four major application areas: monitoring and behavioral areas, controlling natural disasters, human-centric applications, and tracking mobility and static home automation devices from remote places. These application areas require a congestion-free mobility-supported MAC protocol to guarantee reliable data delivery. BN-MAC was evaluated using network simulator-2 (ns2) and compared with other hybrid MAC protocols, such as Zebra medium access control (Z-MAC), advertisement-based MAC (A-MAC), Speck-MAC, adaptive duty cycle SMAC (ADC-SMAC), and low-power real-time medium access control (LPR-MAC). The simulation results indicate that BN-MAC is a robust and energy-efficient protocol that outperforms other hybrid MAC protocols in the context of quality of service (QoS) parameters, such as energy consumption, latency, throughput, channel access time, successful delivery rate, coverage efficiency, and average duty cycle.
Energy-Efficient Boarder Node Medium Access Control Protocol for Wireless Sensor Networks
Razaque, Abdul; Elleithy, Khaled M.
2014-01-01
This paper introduces the design, implementation, and performance analysis of the scalable and mobility-aware hybrid protocol named boarder node medium access control (BN-MAC) for wireless sensor networks (WSNs), which leverages the characteristics of scheduled and contention-based MAC protocols. Like contention-based MAC protocols, BN-MAC achieves high channel utilization, network adaptability under heavy traffic and mobility, and low latency and overhead. Like schedule-based MAC protocols, BN-MAC reduces idle listening time, emissions, and collision handling at low cost at one-hop neighbor nodes and achieves high channel utilization under heavy network loads. BN-MAC is particularly designed for region-wise WSNs. Each region is controlled by a boarder node (BN), which is of paramount importance. The BN coordinates with the remaining nodes within and beyond the region. Unlike other hybrid MAC protocols, BN-MAC incorporates three promising models that further reduce the energy consumption, idle listening time, overhearing, and congestion to improve the throughput and reduce the latency. One of the models used with BN-MAC is automatic active and sleep (AAS), which reduces the idle listening time. When nodes finish their monitoring process, AAS lets them automatically go into the sleep state to avoid the idle listening state. Another model used in BN-MAC is the intelligent decision-making (IDM) model, which helps the nodes sense the nature of the environment. Based on the nature of the environment, the nodes decide whether to use the active or passive mode. This decision power of the nodes further reduces energy consumption because the nodes turn off the radio of the transceiver in the passive mode. The third model is the least-distance smart neighboring search (LDSNS), which determines the shortest efficient path to the one-hop neighbor and also provides cross-layering support to handle the mobility of the nodes. The BN-MAC also incorporates a semi-synchronous feature with a low duty cycle, which is advantageous for reducing the latency and energy consumption for several WSN application areas to improve the throughput. BN-MAC uses a unique window slot size to enhance the contention resolution issue for improved throughput. BN-MAC also prefers to communicate within a one-hop destination using Anycast, which maintains load balancing to maintain network reliability. BN-MAC is introduced with the goal of supporting four major application areas: monitoring and behavioral areas, controlling natural disasters, human-centric applications, and tracking mobility and static home automation devices from remote places. These application areas require a congestion-free mobility-supported MAC protocol to guarantee reliable data delivery. BN-MAC was evaluated using network simulator-2 (ns2) and compared with other hybrid MAC protocols, such as Zebra medium access control (Z-MAC), advertisement-based MAC (A-MAC), Speck-MAC, adaptive duty cycle SMAC (ADC-SMAC), and low-power real-time medium access control (LPR-MAC). The simulation results indicate that BN-MAC is a robust and energy-efficient protocol that outperforms other hybrid MAC protocols in the context of quality of service (QoS) parameters, such as energy consumption, latency, throughput, channel access time, successful delivery rate, coverage efficiency, and average duty cycle. PMID:24625737
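Two of the BN-MAC ingredients summarized above, the least-distance smart neighboring search (LDSNS) and the automatic active and sleep (AAS) model, boil down to very small decision rules. The toy sketch below illustrates both under invented parameters; it is in no way the protocol's actual implementation.

```python
# Toy illustration of two BN-MAC ideas (invented parameters, not the real protocol):
# LDSNS -> forward to the nearest one-hop neighbour; AAS -> sleep once monitoring is done.
def ldsns_next_hop(neighbors):
    """neighbors: dict of node_id -> distance (m); return the least-distance one-hop node."""
    return min(neighbors, key=neighbors.get)

class SensorNode:
    def __init__(self, idle_limit_s=2.0):
        self.idle_limit_s = idle_limit_s
        self.state = "active"

    def automatic_active_sleep(self, seconds_since_last_task: float) -> str:
        """AAS rule: switch to sleep instead of idle-listening once monitoring has finished."""
        self.state = "sleep" if seconds_since_last_task >= self.idle_limit_s else "active"
        return self.state

print(ldsns_next_hop({"n1": 14.2, "n2": 7.9, "n3": 21.0}))        # n2
node = SensorNode()
print(node.automatic_active_sleep(seconds_since_last_task=3.5))   # sleep
```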
Hughey, Justin R; Keen, Justin M; Miller, Dave A; Brough, Chris; McGinity, James W
2012-11-15
The primary aim of the present study was to investigate the ability of hydroxypropyl and methoxyl substituted cellulose ethers to stabilize supersaturated concentrations of itraconazole (ITZ), a poorly water-soluble weak base, after an acid-to-neutral pH transition. A secondary aim of the study was to evaluate the effect of fusion processes on polymer stability and molecular weight. Polymer screening studies showed that stabilization of ITZ supersaturation was related to the molecular weight of the polymer and levels of hydroxypropyl and methoxyl substitution. METHOCEL E50LV (E50LV), which is characterized as having a high melt viscosity, was selected for solid dispersion formulation studies. Hot-melt extrusion processing of E50LV based compositions resulted in high torque loads, low material throughput and polymer degradation. KinetiSol Dispersing, a novel fusion based processing technique, was evaluated as a method to prepare the solid dispersions with reduced levels of polymer degradation. An experimental design revealed that polymer molecular weight was sensitive to shearing forces and high temperatures. However, optimal processing conditions resulted in significantly reduced E50LV degradation relative to HME processing. The technique was effectively utilized to prepare homogenous solid solutions of E50LV and ITZ, characterized as having a single glass transition temperature over a wide range of drug loadings. All prepared compositions provided for a high degree of ITZ supersaturation stabilization. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Pang, Jackson; Pingree, Paula J.; Torgerson, J. Leigh
2006-01-01
We present the Telecommunications protocol processing subsystem using Reconfigurable Interoperable Gate Arrays (TRIGA), a novel approach that unifies fault tolerance, error correction coding and interplanetary communication protocol off-loading to implement CCSDS File Delivery Protocol and Datalink layers. The new reconfigurable architecture offers more than one order of magnitude throughput increase while reducing footprint requirements in memory, command and data handling processor utilization, communication system interconnects and power consumption.
Throughput analysis of the IEEE 802.4 token bus standard under heavy load
NASA Technical Reports Server (NTRS)
Pang, Joseph; Tobagi, Fouad
1987-01-01
It has become clear in the last few years that there is a trend towards integrated digital services. Parallel to the development of public Integrated Services Digital Network (ISDN) is service integration in the local area (e.g., a campus, a building, an aircraft). The types of services to be integrated depend very much on the specific local environment. However, applications tend to generate data traffic belonging to one of two classes. According to IEEE 802.4 terminology, the first major class of traffic is termed synchronous, such as packetized voice and data generated from other applications with real-time constraints, and the second class is called asynchronous which includes most computer data traffic such as file transfer or facsimile. The IEEE 802.4 token bus protocol which was designed to support both synchronous and asynchronous traffic is examined. The protocol is basically a timer-controlled token bus access scheme. By a suitable choice of the design parameters, it can be shown that access delay is bounded for synchronous traffic. As well, the bandwidth allocated to asynchronous traffic can be controlled. A throughput analysis of the protocol under heavy load with constant channel occupation of synchronous traffic and constant token-passing times is presented.
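Under the stated heavy-load assumptions (every station always has data to send, constant token-passing times), channel throughput reduces to the fraction of each token cycle spent transmitting data. The sketch below computes that fraction for a hypothetical ring of stations as a simplified model, not the paper's full analysis.

```python
def token_bus_utilization(n_stations: int, hold_time_s: float, token_pass_s: float) -> float:
    """Heavy-load channel utilization of a token bus: data time over total cycle time,
    assuming every station transmits for hold_time_s each cycle and each token pass
    costs token_pass_s (a simplified model, not the full IEEE 802.4 analysis)."""
    cycle = n_stations * (hold_time_s + token_pass_s)
    return (n_stations * hold_time_s) / cycle

# Hypothetical example: 20 stations, 5 ms token-hold time, 0.5 ms token-pass time.
print(round(token_bus_utilization(20, 5e-3, 0.5e-3), 3))   # ~0.909
```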
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takamiya, Mari; Discovery Technology Laboratories, Sohyaku, Innovative Research Division, Mitsubishi Tanabe Pharma Corporation, Kawagishi, Toda-shi, Saitama; Sakurai, Masaaki
A high-throughput RapidFire mass spectrometry assay is described for elongation of very long-chain fatty acids family 6 (Elovl6). Elovl6 is a microsomal enzyme that regulates the elongation of C12-16 saturated and monounsaturated fatty acids. Elovl6 may be a new therapeutic target for fat metabolism disorders such as obesity, type 2 diabetes, and nonalcoholic steatohepatitis. To identify new Elovl6 inhibitors, we developed a high-throughput fluorescence screening assay in 1536-well format. However, a number of false positives caused by fluorescent interference have been identified. To pick up the real active compounds among the primary hits from the fluorescence assay, we developed a RapidFire mass spectrometry assay and a conventional radioisotope assay. These assays have the advantage of detecting the main products directly without using fluorescent-labeled substrates. As a result, 276 compounds (30%) of the primary hits (921 compounds) in a fluorescence ultra-high-throughput screening method were identified as common active compounds in these two assays. It is concluded that both methods are very effective to eliminate false positives. Compared with the radioisotope method using an expensive 14C-labeled substrate, the RapidFire mass spectrometry method using unlabeled substrates is a high-accuracy, high-throughput method. In addition, some of the hit compounds selected from the screening inhibited cellular fatty acid elongation in HEK293 cells expressing Elovl6 transiently. This result suggests that these compounds may be promising lead candidates for therapeutic drugs. Ultra-high-throughput fluorescence screening followed by a RapidFire mass spectrometry assay was a suitable strategy for lead discovery against Elovl6. - Highlights: • A novel assay for elongation of very-long-chain fatty acids 6 (Elovl6) is proposed. • RapidFire mass spectrometry (RF-MS) assay is useful to select real screening hits. • RF-MS assay is proved to be beneficial because of its high-throughput and accuracy. • A combination of fluorescent and RF-MS assays is effective for Elovl6 inhibitors.
Little is known about the developmental toxicity of the expansive chemical landscape in existence today. Significant efforts are being made to apply novel methods to predict developmental activity of chemicals utilizing high-throughput screening (HTS) and high-content screening (...
High-throughput assays that can quantify chemical-induced changes at the cellular and molecular level have been recommended for use in chemical safety assessment. High-throughput, high content imaging assays for the key cellular events of neurodevelopment have been proposed to ra...
Evaluation of sequencing approaches for high-throughput toxicogenomics (SOT)
Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. We present the evaluation of three toxicogenomics platfo...
High Throughput Assays and Exposure Science (ISES annual meeting)
High throughput screening (HTS) data characterizing chemical-induced biological activity has been generated for thousands of environmentally-relevant chemicals by the US inter-agency Tox21 and the US EPA ToxCast programs. For a limited set of chemicals, bioactive concentrations r...
High Throughput Exposure Estimation Using NHANES Data (SOT)
In the ExpoCast project, high throughput (HT) exposure models enable rapid screening of large numbers of chemicals for exposure potential. Evaluation of these models requires empirical exposure data and due to the paucity of human metabolism/exposure data such evaluations includ...
Atlanta I-85 HOV-to-HOT conversion : analysis of vehicle and person throughput.
DOT National Transportation Integrated Search
2013-10-01
This report summarizes the vehicle and person throughput analysis for the High Occupancy Vehicle to High Occupancy Toll Lane conversion in Atlanta, GA, undertaken by the Georgia Institute of Technology research team. The team tracked changes in o...
Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...
Accounting For Uncertainty in The Application Of High Throughput Datasets
The use of high throughput screening (HTS) datasets will need to adequately account for uncertainties in the data generation process and propagate these uncertainties through to ultimate use. Uncertainty arises at multiple levels in the construction of predictors using in vitro ...
Jordan, Scott
2018-01-24
Scott Jordan on "Advances in high-throughput speed, low-latency communication for embedded instrumentation" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.
Inter-Individual Variability in High-Throughput Risk Prioritization of Environmental Chemicals (Sot)
We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most which have...
We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most which hav...
HIGH-THROUGHPUT IDENTIFICATION OF CATALYTIC REDOX-ACTIVE CYSTEINE RESIDUES
Cysteine (Cys) residues often play critical roles in proteins; however, identification of their specific functions has been limited to case-by-case experimental approaches. We developed a procedure for high-throughput identification of catalytic redox-active Cys in proteins by se...
Development of a thyroperoxidase inhibition assay for high-throughput screening
High-throughput screening (HTPS) assays to detect inhibitors of thyroperoxidase (TPO), the enzymatic catalyst for thyroid hormone (TH) synthesis, are not currently available. Herein we describe the development of a HTPS TPO inhibition assay. Rat thyroid microsomes and a fluores...
High-throughput screening, predictive modeling and computational embryology - Abstract
High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...
Evaluating and Refining High Throughput Tools for Toxicokinetics
This poster summarizes efforts of the Chemical Safety for Sustainability's Rapid Exposure and Dosimetry (RED) team to facilitate the development and refinement of toxicokinetics (TK) tools to be used in conjunction with the high throughput toxicity testing data generated as a par...
Picking Cell Lines for High-Throughput Transcriptomic Toxicity Screening (SOT)
High throughput, whole genome transcriptomic profiling is a promising approach to comprehensively evaluate chemicals for potential biological effects. To be useful for in vitro toxicity screening, gene expression must be quantified in a set of representative cell types that captu...
Streamlined approaches that use in vitro experimental data to predict chemical toxicokinetics (TK) are increasingly being used to perform risk-based prioritization based upon dosimetric adjustment of high-throughput screening (HTS) data across thousands of chemicals. However, ass...
A rapid enzymatic assay for high-throughput screening of adenosine-producing strains
Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei
2015-01-01
Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutritional components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike-and-recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
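As a purely hypothetical illustration of the microplate read-out step, the sketch below converts blank-corrected indophenol absorbance values into adenosine concentrations via a linear standard curve; the standard concentrations, absorbances and well identifiers are invented and are not data from the study.

```python
# Illustrative sketch only: converting indophenol absorbance readings from a
# 96-well plate into adenosine concentrations via a linear standard curve.
# Standards, absorbances and the well layout are assumptions for the example.

def fit_line(x, y):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# hypothetical adenosine standards (mM) and blank-corrected absorbances
std_conc = [0.0, 0.1, 0.2, 0.4, 0.8]
std_abs  = [0.02, 0.11, 0.20, 0.39, 0.77]
slope, intercept = fit_line(std_conc, std_abs)

# hypothetical sample wells: blank-corrected absorbance -> adenosine (mM)
samples = {"A1": 0.25, "A2": 0.55, "A3": 0.08}
for well, a in samples.items():
    conc = (a - intercept) / slope
    print(f"{well}: {conc:.2f} mM adenosine")
```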
Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz
2018-01-01
High-throughput technologies generate a considerable amount of data that often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next-generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in terms of input file handling provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp).
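HTDP itself is a Java GUI program; purely as an illustration of the kind of filtering and merging task it automates, the following Python sketch joins two hypothetical character-delimited files on a key column and filters by a numeric threshold. The file names, column positions and the filter criterion are assumptions, not part of HTDP.

```python
# Not HTDP itself; a minimal Python sketch of the kind of task it automates:
# filtering and merging character-delimited column files such as BED/TSV.
import csv

def read_delimited(path, delimiter="\t"):
    with open(path, newline="") as fh:
        return [row for row in csv.reader(fh, delimiter=delimiter)]

def filter_rows(rows, column, predicate):
    """Keep rows whose value in `column` (0-based) satisfies `predicate`."""
    return [r for r in rows if len(r) > column and predicate(r[column])]

def merge_on_key(rows_a, rows_b, key=0):
    """Naive inner join of two row lists on the given key column."""
    index_b = {r[key]: r for r in rows_b}
    return [r + index_b[r[key]][1:] for r in rows_a if r[key] in index_b]

if __name__ == "__main__":
    variants = read_delimited("variants.bed")    # hypothetical input file
    coverage = read_delimited("coverage.tsv")    # hypothetical input file
    deep = filter_rows(coverage, 1, lambda v: float(v) >= 30)  # depth >= 30x
    merged = merge_on_key(variants, deep)
    with open("merged_filtered.tsv", "w", newline="") as out:
        csv.writer(out, delimiter="\t").writerows(merged)
```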
A comparison of high-throughput techniques for assaying circadian rhythms in plants.
Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony
2015-01-01
Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.
Turning tumor-promoting copper into an anti-cancer weapon via high-throughput chemistry.
Wang, F; Jiao, P; Qi, M; Frezza, M; Dou, Q P; Yan, B
2010-01-01
Copper is an essential element for multiple biological processes. Its concentration is elevated to a very high level in cancer tissues for promoting cancer development through processes such as angiogenesis. Organic chelators of copper can passively reduce cellular copper and serve the role as inhibitors of angiogenesis. However, they can also actively attack cellular targets such as proteasome, which plays a critical role in cancer development and survival. The discovery of such molecules initially relied on a step by step synthesis followed by biological assays. Today high-throughput chemistry and high-throughput screening have significantly expedited the copper-binding molecules discovery to turn "cancer-promoting" copper into anti-cancer agents.
Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor
2016-01-01
Following burgeoning genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain a large number of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-generating step, particularly when working with hundreds of targets, the automation of primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis and a Tm calculator for quick queries.
A high performance hardware implementation image encryption with AES algorithm
NASA Astrophysics Data System (ADS)
Farmani, Ali; Jafari, Mohamad; Miremadi, Seyed Sohrab
2011-06-01
This paper describes the implementation of a high-speed, high-throughput algorithm for image encryption. We select the highly secure symmetric-key encryption algorithm AES (Advanced Encryption Standard) and increase its speed and throughput using a four-stage pipeline, a control unit based on logic gates, an optimized design of the multiplier blocks in the MixColumns phase, and simultaneous generation of keys and rounds. This procedure makes AES suitable for fast image encryption. A 128-bit AES was implemented on an Altera FPGA, with the following results: a throughput of 6 Gbps at 471 MHz. The encryption time for a test image of size 32×32 is 1.15 ms.
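As a quick sanity check of the reported figures, the relation throughput = block_bits × f_clock / cycles_per_block implies an effective ~10 clock cycles per 128-bit block at 6 Gbps and 471 MHz. The decomposition into cycles per block is an inference used only to illustrate the relation, not a figure stated in the abstract.

```python
# Back-of-envelope check of the reported figures (6 Gbps at 471 MHz for
# 128-bit AES), illustrating throughput = block_bits * f_clock / cycles_per_block.
BLOCK_BITS = 128
F_CLOCK = 471e6             # Hz, as reported
REPORTED_THROUGHPUT = 6e9   # bits per second, as reported

# implied average clock cycles spent per 128-bit block (an inference)
cycles_per_block = BLOCK_BITS * F_CLOCK / REPORTED_THROUGHPUT
print(f"implied cycles per 128-bit block: {cycles_per_block:.1f}")   # ~10
```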
Microfluidic squeezing for intracellular antigen loading in polyclonal B-cells as cellular vaccines
NASA Astrophysics Data System (ADS)
Lee Szeto, Gregory; van Egeren, Debra; Worku, Hermoon; Sharei, Armon; Alejandro, Brian; Park, Clara; Frew, Kirubel; Brefo, Mavis; Mao, Shirley; Heimann, Megan; Langer, Robert; Jensen, Klavs; Irvine, Darrell J.
2015-05-01
B-cells are promising candidate autologous antigen-presenting cells (APCs) to prime antigen-specific T-cells both in vitro and in vivo. However, to date, a significant barrier to utilizing B-cells as APCs is their low capacity for non-specific antigen uptake compared to "professional" APCs such as dendritic cells. Here we utilize a microfluidic device that employs many parallel channels to pass single cells through narrow constrictions at high throughput. This microscale "cell squeezing" process creates transient pores in the plasma membrane, enabling intracellular delivery of whole proteins from the surrounding medium into B-cells via mechano-poration. We demonstrate that both resting and activated B-cells process and present antigens delivered via mechano-poration exclusively to antigen-specific CD8+ T-cells, and not CD4+ T-cells. Squeezed B-cells primed and expanded large numbers of effector CD8+ T-cells in vitro that produced effector cytokines critical to cytolytic function, including granzyme B and interferon-γ. Finally, antigen-loaded B-cells were also able to prime antigen-specific CD8+ T-cells in vivo when adoptively transferred into mice. Altogether, these data demonstrate crucial proof-of-concept for mechano-poration as an enabling technology for B-cell antigen loading, priming of antigen-specific CD8+ T-cells, and decoupling of antigen uptake from B-cell activation.
Quesada-Cabrera, Raul; Weng, Xiaole; Hyett, Geoff; Clark, Robin J H; Wang, Xue Z; Darr, Jawwad A
2013-09-09
High-throughput continuous hydrothermal flow synthesis was used to manufacture 66 unique nanostructured oxide samples in the Ce-Zr-Y-O system. This synthesis approach resulted in a significant increase in throughput compared to that of conventional batch or continuous hydrothermal synthesis methods. The as-prepared library samples were placed into a wellplate for both automated high-throughput powder X-ray diffraction and Raman spectroscopy data collection, which allowed comprehensive structural characterization and phase mapping. The data suggested that a continuous cubic-like phase field connects all three Ce-Zr-O, Ce-Y-O, and Y-Zr-O binary systems together with a smooth and steady transition between the structures of neighboring compositions. The continuous hydrothermal process led to as-prepared crystallite sizes in the range of 2-7 nm (as determined by using the Scherrer equation).
State of the Art High-Throughput Approaches to Genotoxicity: Flow Micronucleus, Ames II, GreenScreen and Comet (Presented by Dr. Marilyn J. Aardema, Chief Scientific Advisor, Toxicology; Dr. Leon Stankowski; et al.) (6/28/2012)
Fun with High Throughput Toxicokinetics (CalEPA webinar)
Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...
Incorporating Human Dosimetry and Exposure into High-Throughput In Vitro Toxicity Screening
Many chemicals in commerce today have undergone limited or no safety testing. To reduce the number of untested chemicals and prioritize limited testing resources, several governmental programs are using high-throughput in vitro screens for assessing chemical effects across multip...
Environmental Impact on Vascular Development Predicted by High Throughput Screening
Understanding health risks to embryonic development from exposure to environmental chemicals is a significant challenge given the diverse chemical landscape and paucity of data for most of these compounds. High throughput screening (HTS) in EPA’s ToxCastTM project provides vast d...
High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials
United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High -Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...
AOPs and Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making
As high throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will b...
We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most which hav...
HTTK: R Package for High-Throughput Toxicokinetics
Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...
tcpl: The ToxCast Pipeline for High-Throughput Screening Data
Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...
High-throughput screening, predictive modeling and computational embryology
High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...
In Vitro Toxicity Screening Technique for Volatile Substances Using Flow-Through System
In 2007 the National Research Council envisioned the need for inexpensive, high throughput, cell based toxicity testing methods relevant to human health. High Throughput Screening (HTS) in vitro screening approaches have addressed these problems by using robotics. However the cha...
High-throughput in vitro toxicity screening can provide an efficient way to identify potential biological targets for chemicals. However, relying on nominal assay concentrations may misrepresent potential in vivo effects of these chemicals due to differences in bioavailability, c...
High-Throughput Toxicokinetics (HTTK) R package (CompTox CoP presentation)
Toxicokinetics (TK) provides a bridge between HTS and HTE by predicting tissue concentrations due to exposure, but traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to determine range of effic...
NASA Astrophysics Data System (ADS)
Lagus, Todd P.; Edd, Jon F.
2013-03-01
Most cell biology experiments are performed in bulk cell suspensions where cell secretions become diluted and mixed in a contiguous sample. Confinement of single cells to small, picoliter-sized droplets within a continuous phase of oil provides chemical isolation of each cell, creating individual microreactors where rare cell qualities are highlighted and otherwise undetectable signals can be concentrated to measurable levels. Recent work in microfluidics has yielded methods for the encapsulation of cells in aqueous droplets and hydrogels at kilohertz rates, creating the potential for millions of parallel single-cell experiments. However, commercial applications of high-throughput microdroplet generation and downstream sensing and actuation methods are still emerging for cells. Using fluorescence-activated cell sorting (FACS) as a benchmark for commercially available high-throughput screening, this focused review discusses the fluid physics of droplet formation, methods for cell encapsulation in liquids and hydrogels, sensors and actuators and notable biological applications of high-throughput single-cell droplet microfluidics.
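Random encapsulation of cells in droplets is commonly described by Poisson statistics; the sketch below computes the expected fractions of empty, single-cell and multi-cell droplets for an assumed mean loading and droplet generation rate. The numbers are illustrative and are not taken from the review.

```python
# Illustrative sketch: Poisson statistics commonly used to describe random cell
# encapsulation in droplets (lam = mean cells per droplet). Values are assumed.
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

lam = 0.1  # dilute loading: on average 0.1 cells per droplet (assumption)
p_empty = poisson_pmf(0, lam)
p_single = poisson_pmf(1, lam)
p_multi = 1 - p_empty - p_single

print(f"empty droplets:       {p_empty:.3f}")
print(f"single-cell droplets: {p_single:.3f}")
print(f"multi-cell droplets:  {p_multi:.3f}")

# at an assumed 10 kHz droplet generation rate, single-cell droplets per second:
print(f"single-cell droplets per second at 10 kHz: {10_000 * p_single:.0f}")
```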
High-Throughput Cloning and Expression Library Creation for Functional Proteomics
Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua
2013-01-01
The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047
High-Throughput Lectin Microarray-Based Analysis of Live Cell Surface Glycosylation
Li, Yu; Tao, Sheng-ce; Zhu, Heng; Schneck, Jonathan P.
2011-01-01
Lectins, plant-derived glycan-binding proteins, have long been used to detect glycans on cell surfaces. However, the techniques used to characterize serum or cells have largely been limited to mass spectrometry, blots, flow cytometry, and immunohistochemistry. While these lectin-based approaches are well established and they can discriminate a limited number of sugar isomers by concurrently using a limited number of lectins, they are not amenable for adaptation to a high-throughput platform. Fortunately, given the commercial availability of lectins with a variety of glycan specificities, lectins can be printed on a glass substrate in a microarray format to profile accessible cell-surface glycans. This method is an inviting alternative for analysis of a broad range of glycans in a high-throughput fashion and has been demonstrated to be a feasible method of identifying binding-accessible cell surface glycosylation on living cells. The current unit presents a lectin-based microarray approach for analyzing cell surface glycosylation in a high-throughput fashion. PMID:21400689
Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.
2016-01-01
Quantification of the specificity of RNA binding proteins and RNA processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high throughput sequencing and internal competition kinetics to simultaneously monitor the processing rate constants of thousands of substrates by RNA processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and these are the main sources of imprecision in the quantified results when otherwise-optimized guidelines are followed. PMID:27296633
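One common way to cast internal competition kinetics is that, for substrates competing for the same enzyme, the ratio of the logarithms of their fractions remaining equals the ratio of their rate constants. The sketch below applies that relation to hypothetical read counts; the counts, the choice of reference substrate and its independently known remaining fraction are all invented, and the published HTS-Kin fitting procedure may differ in detail.

```python
# Minimal sketch of internal-competition kinetics of the kind HTS-Kin builds on:
#   ln(F_i) / ln(F_ref) = k_i / k_ref
# where F is the fraction of a substrate remaining. All inputs are invented.
from math import log

reads_t0 = {"ref": 10000, "sub_A": 8000, "sub_B": 12000}  # hypothetical counts, starting pool
reads_t  = {"ref": 10000, "sub_A": 11000, "sub_B": 6000}  # hypothetical counts, residual pool
F_ref = 0.50  # fraction of the reference substrate remaining (assumed known)

def relative_rate_constants(r0, rt, f_ref, ref="ref"):
    out = {}
    for name in r0:
        if name == ref:
            continue
        # fraction of `name` remaining, from its abundance relative to the
        # reference in the residual pool compared with the starting pool
        f_i = (rt[name] / rt[ref]) / (r0[name] / r0[ref]) * f_ref
        out[name] = log(f_i) / log(f_ref)   # k_i / k_ref
    return out

print(relative_rate_constants(reads_t0, reads_t, F_ref))
```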
Fujimori, Shigeo; Hirai, Naoya; Ohashi, Hiroyuki; Masuoka, Kazuyo; Nishikimi, Akihiko; Fukui, Yoshinori; Washio, Takanori; Oshikubo, Tomohiro; Yamashita, Tatsuhiro; Miyamoto-Sato, Etsuko
2012-01-01
Next-generation sequencing (NGS) has been applied to various kinds of omics studies, resulting in many biological and medical discoveries. However, high-throughput protein-protein interactome datasets derived from detection by sequencing are scarce, because protein-protein interaction analysis requires many cell manipulations to examine the interactions. The low reliability of the high-throughput data is also a problem. Here, we describe a cell-free display technology combined with NGS that can improve both the coverage and reliability of interactome datasets. The completely cell-free method gives a high-throughput and a large detection space, testing the interactions without using clones. The quantitative information provided by NGS reduces the number of false positives. The method is suitable for the in vitro detection of proteins that interact not only with the bait protein, but also with DNA, RNA and chemical compounds. Thus, it could become a universal approach for exploring the large space of protein sequences and interactome networks. PMID:23056904
NCBI GEO: archive for high-throughput functional genomic data.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron
2009-01-01
The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.
Khan, Arifa S; Vacante, Dominick A; Cassart, Jean-Pol; Ng, Siemon H S; Lambert, Christophe; Charlebois, Robert L; King, Kathryn E
Several nucleic-acid based technologies have recently emerged with capabilities for broad virus detection. One of these, high throughput sequencing, has the potential for novel virus detection because this method does not depend upon prior viral sequence knowledge. However, the use of high throughput sequencing for testing biologicals poses greater challenges as compared to other newly introduced tests due to its technical complexities and big data bioinformatics. Thus, the Advanced Virus Detection Technologies Users Group was formed as a joint effort by regulatory and industry scientists to facilitate discussions and provide a forum for sharing data and experiences using advanced new virus detection technologies, with a focus on high throughput sequencing technologies. The group was initiated as a task force that was coordinated by the Parenteral Drug Association and subsequently became the Advanced Virus Detection Technologies Interest Group to continue efforts for using new technologies for detection of adventitious viruses with broader participation, including international government agencies, academia, and technology service providers. © PDA, Inc. 2016.
The application of the high throughput sequencing technology in the transposable elements.
Liu, Zhen; Xu, Jian-hong
2015-09-01
High throughput sequencing technology has dramatically improved the efficiency of DNA sequencing, and decreased the costs to a great extent. Meanwhile, this technology usually has advantages of better specificity, higher sensitivity and accuracy. Therefore, it has been applied to the research on genetic variations, transcriptomics and epigenomics. Recently, this technology has been widely employed in the studies of transposable elements and has achieved fruitful results. In this review, we summarize the application of high throughput sequencing technology in the fields of transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfers as well as transposon tagging. We also briefly introduce the major common sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we envision the developing trends of high throughput sequencing technology, especially the third generation sequencing technology, and its application in transposon studies in the future, hopefully providing a comprehensive understanding and reference for related scientific researchers.
Near-common-path interferometer for imaging Fourier-transform spectroscopy in wide-field microscopy
Wadduwage, Dushan N.; Singh, Vijay Raj; Choi, Heejin; Yaqoob, Zahid; Heemskerk, Hans; Matsudaira, Paul; So, Peter T. C.
2017-01-01
Imaging Fourier-transform spectroscopy (IFTS) is a powerful method for biological hyperspectral analysis based on various imaging modalities, such as fluorescence or Raman. Since the measurements are taken in the Fourier space of the spectrum, it can also take advantage of compressed sensing strategies. IFTS has been readily implemented in high-throughput, high-content microscope systems based on wide-field imaging modalities. However, there are limitations in existing wide-field IFTS designs. Non-common-path approaches are less phase-stable. Alternatively, designs based on the common-path Sagnac interferometer are stable, but incompatible with high-throughput imaging. They require exhaustive sequential scanning over large interferometric path delays, making compressive strategic data acquisition impossible. In this paper, we present a novel phase-stable, near-common-path interferometer enabling high-throughput hyperspectral imaging based on strategic data acquisition. Our results suggest that this approach can improve throughput over those of many other wide-field spectral techniques by more than an order of magnitude without compromising phase stability. PMID:29392168
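The core of IFTS is that the per-pixel signal recorded as a function of interferometric path delay is Fourier-transformed to recover the spectrum. The sketch below simulates an interferogram for one pixel from two assumed emission lines and recovers their positions with an FFT; it illustrates the principle only and is not the authors' instrument code.

```python
# Illustrative sketch: recover a spectrum from a simulated interferogram by FFT.
# The delay range and line positions are assumptions for the example.
import numpy as np

n_steps = 256
delay = np.linspace(0.0, 10.0, n_steps)   # optical path delay (um), assumed
wavenumbers = [1.8, 2.4]                  # two emission lines (1/um), assumed
weights = [1.0, 0.6]

# interferogram for one pixel: one cosine per spectral line
interferogram = sum(w * np.cos(2 * np.pi * k * delay)
                    for w, k in zip(weights, wavenumbers))

spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
freqs = np.fft.rfftfreq(n_steps, d=delay[1] - delay[0])  # recovered wavenumber axis

peaks = freqs[np.argsort(spectrum)[-2:]]
print("recovered line positions (1/um):", np.round(np.sort(peaks), 2))
```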
An improved high-throughput lipid extraction method for the analysis of human brain lipids.
Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett
2013-03-01
We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.
Graph-based signal integration for high-throughput phenotyping
2012-01-01
Background: Electronic Health Records aggregated in Clinical Data Warehouses (CDWs) promise to revolutionize Comparative Effectiveness Research and suggest new avenues of research. However, the effectiveness of CDWs is diminished by the lack of properly labeled data. We present a novel approach that integrates knowledge from the CDW, the biomedical literature, and the Unified Medical Language System (UMLS) to perform high-throughput phenotyping. In this paper, we automatically construct a graphical knowledge model and then use it to phenotype breast cancer patients. We compare the performance of this approach to using MetaMap when labeling records. Results: MetaMap's overall accuracy at identifying breast cancer patients was 51.1% (n=428); recall=85.4%, precision=26.2%, and F1=40.1%. Our unsupervised graph-based high-throughput phenotyping had an accuracy of 84.1%; recall=46.3%, precision=61.2%, and F1=52.8%. Conclusions: We conclude that our approach is a promising alternative for unsupervised high-throughput phenotyping. PMID:23320851
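The reported F1 scores follow directly from the reported precision and recall via F1 = 2PR/(P+R), as this small check confirms.

```python
# Quick check that the reported F1 scores follow from the reported precision
# and recall via F1 = 2*P*R/(P+R); values are copied from the abstract above.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

print(f"MetaMap F1:     {f1(0.262, 0.854):.3f}")   # ~0.401
print(f"Graph-based F1: {f1(0.612, 0.463):.3f}")   # ~0.528
```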
High-throughput cloning and expression library creation for functional proteomics.
Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua
2013-05-01
The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible, and reliable cloning systems. These collections of ORF clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial, we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator(TM) DNA Cloning System) and compare them side-by-side. We also report an example of high-throughput cloning study and its application in functional proteomics. This tutorial is part of the International Proteomics Tutorial Programme (IPTP12). © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Zhu, Shiyou; Li, Wei; Liu, Jingze; Chen, Chen-Hao; Liao, Qi; Xu, Ping; Xu, Han; Xiao, Tengfei; Cao, Zhongzheng; Peng, Jingyu; Yuan, Pengfei; Brown, Myles; Liu, Xiaole Shirley; Wei, Wensheng
2017-01-01
CRISPR/Cas9 screens have been widely adopted to analyse coding gene functions, but high throughput screening of non-coding elements using this method is more challenging, because indels caused by a single cut in non-coding regions are unlikely to produce a functional knockout. A high-throughput method to produce deletions of non-coding DNA is needed. Herein, we report a high throughput genomic deletion strategy to screen for functional long non-coding RNAs (lncRNAs) that is based on a lentiviral paired-guide RNA (pgRNA) library. Applying our screening method, we identified 51 lncRNAs that can positively or negatively regulate human cancer cell growth. We individually validated 9 lncRNAs using CRISPR/Cas9-mediated genomic deletion and functional rescue, CRISPR activation or inhibition, and gene expression profiling. Our high-throughput pgRNA genome deletion method should enable rapid identification of functional mammalian non-coding elements. PMID:27798563
Choudhry, Priya
2016-01-01
Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods, in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays. PMID:26848849
High-throughput determination of structural phase diagram and constituent phases using GRENDEL
NASA Astrophysics Data System (ADS)
Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.
2015-11-01
Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. To convert this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the Graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically, to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. Also the Sunburst radial tree map is demonstrated as a tool to visualize material structure-property relationships found through graph based analysis.
High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics
NASA Astrophysics Data System (ADS)
Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine
2016-06-01
Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10⁴-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with a low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.
AD HOC Networks for the Autonomous Car
NASA Astrophysics Data System (ADS)
Ron, Davidescu; Negrus, Eugen
2017-10-01
The future of the vehicle consists of cars, roads and infrastructure connected by two-way automated communication in a holistic system. Encryption is mandatory to maintain confidentiality, integrity and availability in an ad hoc vehicle network. Vehicle-to-vehicle communication requires multichannel interaction between mobile, moving and changing parties to ensure the full benefit of data sharing and real-time decision making; a network of such users is referred to as a mobile ad hoc network (MANET). However, because ad hoc networks have not been implemented at such a scale, it is not clear which method and protocol are best to apply. Furthermore, the viability of secure, preferably asymmetrically encrypted, ad hoc networks in a real-time environment of densely moving autonomous vehicles has to be demonstrated. To evaluate the performance of ad hoc networks under changing conditions, a simulation of multiple protocols was performed on a large number of mobile nodes. The following common routing protocols were tested: DSDV, a proactive protocol in which every mobile station maintains a routing table with all available destinations; DSR, a reactive routing protocol that allows nodes in the MANET to dynamically discover a source route across multiple network hops; AODV, a reactive routing protocol that, instead of being proactive, minimizes the number of broadcasts by creating routes on demand; SAODV, a secure version of AODV that requires heavyweight asymmetric cryptography; and ARIADNE, a routing protocol that relies on highly efficient symmetric cryptography and is primarily based on DSR. A methodical evaluation was performed at various traffic densities, based on known communication benchmark parameters including throughput versus time and routing load per packet and per byte. Among the non-encrypted protocols, it is clear that in terms of throughput and routing load the DSR protocol has a clear advantage in the high-node-count regime. The encrypted protocols show lower performance, with ARIADNE being superior to SAODV and SRP. Nevertheless, all protocol simulations proved able to match the required real-time performance.
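The benchmark parameters mentioned above are typically computed from simulation counters; the sketch below shows one common definition of throughput and of normalized routing load per packet and per byte. The counter values are invented, and real studies obtain them from a network simulator.

```python
# Illustrative sketch of common MANET benchmark metrics computed from
# simulation counters; the counter values below are invented for the example.
def throughput_bps(delivered_bytes, sim_time_s):
    return 8 * delivered_bytes / sim_time_s

def routing_load_packets(routing_packets, delivered_data_packets):
    # routing overhead per successfully delivered data packet
    return routing_packets / delivered_data_packets

def routing_load_bytes(routing_bytes, delivered_data_bytes):
    return routing_bytes / delivered_data_bytes

# hypothetical counters for one protocol at one node-density point
counters = dict(delivered_bytes=4_200_000, sim_time_s=300.0,
                routing_packets=52_000, delivered_data_packets=8_000,
                routing_bytes=3_300_000)

print(f"throughput: {throughput_bps(counters['delivered_bytes'], counters['sim_time_s'])/1e3:.1f} kbit/s")
print(f"routing load (packets): {routing_load_packets(counters['routing_packets'], counters['delivered_data_packets']):.2f}")
print(f"routing load (bytes):   {routing_load_bytes(counters['routing_bytes'], counters['delivered_bytes']):.2f}")
```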
Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won
2015-08-21
We report a new and simple design of a fully automated dual-online ultra-high pressure liquid chromatography system. The system employs only two nano-volume switching valves (a two-position four port valve and a two-position ten port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features that include high duty cycle, back flushing sample injection for fast and narrow zone sample injection, online desalting, high separation resolution and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high throughput deep proteome profiling experiments but also in high throughput MRM experiments.
Optimization and high-throughput screening of antimicrobial peptides.
Blondelle, Sylvie E; Lohner, Karl
2010-01-01
While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries combined with easy and less expensive access to new technologies have greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple-target assays or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope in discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving the peptides' selectivity.
Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.
Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen
2018-07-20
Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals which attracts a lot of attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) and the cell-killing capacity of highly cytotoxic small molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods are scarcely used so far in ADC development. In this work an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented by use of a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by utilization of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established including feedback to the process. For conjugate characterization, a high-throughput compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate the efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development efforts and material demands and shorten development timelines for antibody-drug conjugates. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong
2018-01-01
Conventional methods for designing and preparing thin films based on wet processes remain challenging due to disadvantages such as being time-consuming and ineffective, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin film preparation based on chemical bath deposition (CBD). The method is ideally suited to preparing high-throughput combinatorial material libraries with low decomposition temperatures and high water or oxygen sensitivity at relatively high temperature. To validate this system, a Cu(In,Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% antimony (Sb) was taken as an example to systematically evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system could be used to systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties, through accurately monitoring the doping content and material composition. According to the characterization results, an Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model has been put forward. In addition to the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin-film material systems.
Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang
2017-04-01
Among high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or chip-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by multiplex primer interactions. This detection throughput cannot meet the demands of high-throughput applications such as SNP or gene expression analysis. Therefore, in our study, we developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which showed that the sensitivity of ME-qPCR is higher than that of the original qPCR. The absolute limit of detection of ME-qPCR reached levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract: In the first-step amplification, four primers (A, B, C, and D) are added to the reaction volume, generating four kinds of amplicons. All four amplicons can be regarded as targets of the second-step PCR. In the second-step amplification, three parallel reactions are run for the final evaluation, after which the final amplification curves and melting curves are obtained.
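Absolute quantification in qPCR-style workflows is commonly done with a standard curve relating Ct to log10 copy number. The sketch below is a generic illustration of that conversion, not the published ME-qPCR protocol; the dilution series and sample Ct values are invented.

```python
# Generic qPCR absolute-quantification sketch (not the ME-qPCR workflow):
# copy number is estimated from a Ct value via a standard curve
#   Ct = slope * log10(copies) + intercept.
# Standard-curve values and sample Cts are invented for illustration.
from math import log10

def fit_standard_curve(copies, cts):
    """Least-squares fit of Ct against log10(copy number)."""
    xs = [log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cts)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

# hypothetical 10-fold dilution series of a reference template
std_copies = [1e1, 1e2, 1e3, 1e4, 1e5]
std_cts    = [34.1, 30.7, 27.4, 24.0, 20.6]
slope, intercept = fit_standard_curve(std_copies, std_cts)

for name, ct in {"sample_1": 29.2, "sample_2": 33.5}.items():
    print(f"{name}: ~{copies_from_ct(ct, slope, intercept):.0f} copies")
```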
High Throughput Transcriptomics @ USEPA (Toxicology ...
The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode of action (with dose dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.
Mobile element biology – new possibilities with high-throughput sequencing
Xing, Jinchuan; Witherspoon, David J.; Jorde, Lynn B.
2014-01-01
Mobile elements compose more than half of the human genome, but until recently their large-scale detection was time-consuming and challenging. With the development of new high-throughput sequencing technologies, the complete spectrum of mobile element variation in humans can now be identified and analyzed. Thousands of new mobile element insertions have been discovered, yielding new insights into mobile element biology, evolution, and genomic variation. We review several high-throughput methods, with an emphasis on techniques that specifically target mobile element insertions in humans, and we highlight recent applications of these methods in evolutionary studies and in the analysis of somatic alterations in human cancers. PMID:23312846
Advances in high throughput DNA sequence data compression.
Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz
2016-06-01
Advances in high throughput sequencing technologies and reduction in cost of sequencing have led to exponential growth in high throughput DNA sequence data. This growth has posed challenges such as storage, retrieval, and transmission of sequencing data. Data compression is used to cope with these challenges. Various methods have been developed to compress genomic and sequencing data. In this article, we present a comprehensive review of compression methods for genome and reads compression. Algorithms are categorized as referential or reference free. Experimental results and comparative analysis of various methods for data compression are presented. Finally, key challenges and research directions in DNA sequence data compression are highlighted.
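A minimal example of the reference-free idea surveyed here is to pack each unambiguous base into two bits instead of one byte. The toy sketch below ignores read names, quality scores and ambiguous bases (N), which real compression tools must handle.

```python
# Toy reference-free DNA compression: pack each base (A, C, G, T) into 2 bits.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASE = {v: k for k, v in CODE.items()}

def pack(seq: str) -> bytes:
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        chunk = seq[i:i + 4]
        for base in chunk:
            byte = (byte << 2) | CODE[base]
        byte <<= 2 * (4 - len(chunk))   # pad the final partial byte
        out.append(byte)
    return bytes(out)

def unpack(data: bytes, length: int) -> str:
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BASE[(byte >> shift) & 0b11])
    return "".join(bases[:length])

read = "ACGTACGTTGCA"
packed = pack(read)
assert unpack(packed, len(read)) == read
print(f"{len(read)} bases -> {len(packed)} bytes")   # 4x smaller than ASCII
```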
Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome
2014-04-25
In this paper we give the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one-million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the predocking file preparation and postdocking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
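A task-parallel screen of this kind is often organized as a master/worker task farm over MPI. The sketch below uses mpi4py to illustrate that pattern; it is not the authors' MPI Autodock4 code (which is available at the URL above), and dock() is a placeholder standing in for launching a docking run on one ligand file.

```python
# Generic mpi4py master/worker task farm for distributing docking jobs.
# Illustrative only; dock() is a stub and the ligand file names are invented.
from mpi4py import MPI

TAG_WORK, TAG_STOP = 1, 2

def dock(ligand: str) -> str:
    # placeholder for invoking a docking engine on one ligand file
    return f"{ligand}: docked"

def master(comm, ligands):
    size, status = comm.Get_size(), MPI.Status()
    results, next_task, in_flight = [], 0, 0
    # seed every worker with one task (or tell it to stop if none remain)
    for w in range(1, size):
        if next_task < len(ligands):
            comm.send(ligands[next_task], dest=w, tag=TAG_WORK)
            next_task += 1
            in_flight += 1
        else:
            comm.send(None, dest=w, tag=TAG_STOP)
    # collect results and keep the workers busy until the library is exhausted
    while in_flight:
        results.append(comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status))
        in_flight -= 1
        w = status.Get_source()
        if next_task < len(ligands):
            comm.send(ligands[next_task], dest=w, tag=TAG_WORK)
            next_task += 1
            in_flight += 1
        else:
            comm.send(None, dest=w, tag=TAG_STOP)
    return results

def worker(comm):
    status = MPI.Status()
    while True:
        task = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == TAG_STOP:
            break
        comm.send(dock(task), dest=0, tag=TAG_WORK)

if __name__ == "__main__":
    comm = MPI.COMM_WORLD
    if comm.Get_rank() == 0:
        ligands = [f"ligand_{i:06d}.pdbqt" for i in range(1000)]  # hypothetical library
        print(len(master(comm, ligands)), "ligands docked")
    else:
        worker(comm)
```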
LOCATE: a mouse protein subcellular localization database
Fink, J. Lynn; Aturaliya, Rajith N.; Davis, Melissa J.; Zhang, Fasheng; Hanson, Kelly; Teasdale, Melvena S.; Kai, Chikatoshi; Kawai, Jun; Carninci, Piero; Hayashizaki, Yoshihide; Teasdale, Rohan D.
2006-01-01
We present here LOCATE, a curated, web-accessible database that houses data describing the membrane organization and subcellular localization of proteins from the FANTOM3 Isoform Protein Sequence set. Membrane organization is predicted by the high-throughput, computational pipeline MemO. The subcellular locations of selected proteins from this set were determined by a high-throughput, immunofluorescence-based assay and by manually reviewing >1700 peer-reviewed publications. LOCATE represents the first effort to catalogue the experimentally verified subcellular location and membrane organization of mammalian proteins using a high-throughput approach and provides localization data for ∼40% of the mouse proteome. It is available at . PMID:16381849
USDA-ARS?s Scientific Manuscript database
Extraction of DNA from tissue samples can be expensive both in time and monetary resources and can often require handling and disposal of hazardous chemicals. We have developed a high throughput protocol for extracting DNA from honey bees that is of a high enough quality and quantity to enable hundr...
NASA Astrophysics Data System (ADS)
Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.
2013-06-01
High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.
Toxicokinetics (TK) provides a bridge between toxicity and exposure assessment by predicting tissue concentrations due to exposure; however, traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to...
Under the ExpoCast program, United States Environmental Protection Agency (EPA) researchers have developed a high-throughput (HT) framework for estimating aggregate exposures to chemicals from multiple pathways to support rapid prioritization of chemicals. Here, we present method...
Environmental surveillance and monitoring. The next frontiers for high-throughput toxicology
High throughput toxicity testing (HTT) technologies along with the world-wide web are revolutionizing both generation and access to data regarding the bioactivities that chemicals can elicit when they interact with specific proteins, genes, or other targets in the body of an orga...
High-Throughput Models for Exposure-Based Chemical Prioritization in the ExpoCast Project
The United States Environmental Protection Agency (U.S. EPA) must characterize potential risks to human health and the environment associated with manufacture and use of thousands of chemicals. High-throughput screening (HTS) for biological activity allows the ToxCast research pr...
High-Throughput Exposure Potential Prioritization for ToxCast Chemicals
The U.S. EPA must consider lists of hundreds to thousands of chemicals when prioritizing research resources in order to identify risk to human populations and the environment. High-throughput assays to identify biological activity in vitro have allowed the ToxCast™ program to i...
Use of High-Throughput Testing and Approaches for Evaluating Chemical Risk-Relevance to Humans
ToxCast is profiling the bioactivity of thousands of chemicals based on high-throughput screening (HTS) and computational models that integrate knowledge of biological systems and in vivo toxicities. Many of these assays probe signaling pathways and cellular processes critical to...
High-Throughput Simulation of Environmental Chemical Fate for Exposure Prioritization
The U.S. EPA must consider lists of hundreds to thousands of chemicals when allocating resources to identify risk in human populations and the environment. High-throughput screening assays to characterize biological activity in vitro have allowed the ToxCast™ program to identify...
Notredame, Cedric
2018-05-02
Cedric Notredame from the Centre for Genomic Regulation gives a presentation on New Challenges of the Computation of Multiple Sequence Alignments in the High-Throughput Era at the JGI/Argonne HPC Workshop on January 26, 2010.
USDA-ARS?s Scientific Manuscript database
Contigs with sequence similarities to several nucleorhabdoviruses were identified by high-throughput sequencing analysis from a black currant (Ribes nigrum L.) cultivar. The complete genomic sequence of this new nucleorhabdovirus is 14,432 nucleotides. Its genomic organization is typical of nucleorh...
Estimating Toxicity Pathway Activating Doses for High Throughput Chemical Risk Assessments
Estimating a Toxicity Pathway Activating Dose (TPAD) from in vitro assays as an analog to a reference dose (RfD) derived from in vivo toxicity tests would facilitate high throughput risk assessments of thousands of data-poor environmental chemicals. Estimating a TPAD requires def...
Momentum is growing worldwide to use in vitro high-throughput screening (HTS) to evaluate human health effects of chemicals. However, the integration of dosimetry into HTS assays and incorporation of population variability will be essential before its application in a risk assess...
The US EPA’s ToxCast program is designed to assess chemical perturbations of molecular and cellular endpoints using a variety of high-throughput screening (HTS) assays. However, existing HTS assays have limited or no xenobiotic metabolism which could lead to a mischaracterization...
Defining the taxonomic domain of applicability for mammalian-based high-throughput screening assays
Cell-based high throughput screening (HTS) technologies are becoming mainstream in chemical safety evaluations. The US Environmental Protection Agency (EPA) Toxicity Forecaster (ToxCast™) and the multi-agency Tox21 Programs have been at the forefront in advancing this science, m...
Athavale, Ajay
2018-01-04
Ajay Athavale (Monsanto) presents "High Throughput Plasmid Sequencing with Illumina and CLC Bio" at the 7th Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting held in June, 2012 in Santa Fe, NM.
The Impact of Data Fragmentation on High-Throughput Clinical Phenotyping
ERIC Educational Resources Information Center
Wei, Weiqi
2012-01-01
Subject selection is essential and has become the rate-limiting step for harvesting knowledge to advance healthcare through clinical research. Present manual approaches inhibit researchers from conducting deep and broad studies and drawing confident conclusions. High-throughput clinical phenotyping (HTCP), a recently proposed approach, leverages…
High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.
Chen, Yu-Chih; Yoon, Euisik
2017-01-01
Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. In contrast, microfluidic 3D culture allows precise control of cell microenvironments and provides orders-of-magnitude higher throughput. In this chapter, we will look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.
Droplet microfluidic technology for single-cell high-throughput screening.
Brouzes, Eric; Medkova, Martina; Savenelli, Neal; Marran, Dave; Twardowski, Mariusz; Hutchison, J Brian; Rothberg, Jonathan M; Link, Darren R; Perrimon, Norbert; Samuels, Michael L
2009-08-25
We present a droplet-based microfluidic technology that enables high-throughput screening of single mammalian cells. This integrated platform allows for the encapsulation of single cells and reagents in independent aqueous microdroplets (1 pL to 10 nL volumes) dispersed in an immiscible carrier oil and enables the digital manipulation of these reactors at very high throughput. Here, we validate a full droplet screening workflow by conducting a droplet-based cytotoxicity screen. To perform this screen, we first developed a droplet viability assay that permits the quantitative scoring of cell viability and growth within intact droplets. Next, we demonstrated the high viability of encapsulated human monocytic U937 cells over a period of 4 days. Finally, we developed an optically-coded droplet library enabling the identification of each droplet's composition during the assay read-out. Using the integrated droplet technology, we screened a drug library for its cytotoxic effect against U937 cells. Taken together, our droplet microfluidic platform is modular, robust, uses no moving parts, and has a wide range of potential applications including high-throughput single-cell analyses, combinatorial screening, and facilitating small sample analyses.
Microfluidic guillotine for single-cell wound repair studies
NASA Astrophysics Data System (ADS)
Blauch, Lucas R.; Gai, Ya; Khor, Jian Wei; Sood, Pranidhi; Marshall, Wallace F.; Tang, Sindy K. Y.
2017-07-01
Wound repair is a key feature distinguishing living from nonliving matter. Single cells are increasingly recognized to be capable of healing wounds. The lack of reproducible, high-throughput wounding methods has hindered single-cell wound repair studies. This work describes a microfluidic guillotine for bisecting single Stentor coeruleus cells in a continuous-flow manner. Stentor is used as a model due to its robust repair capacity and the ability to perform gene knockdown in a high-throughput manner. Local cutting dynamics reveals two regimes under which cells are bisected, one at low viscous stress where cells are cut with small membrane ruptures and high viability and one at high viscous stress where cells are cut with extended membrane ruptures and decreased viability. A cutting throughput up to 64 cells per minute—more than 200 times faster than current methods—is achieved. The method allows the generation of more than 100 cells in a synchronized stage of their repair process. This capacity, combined with high-throughput gene knockdown in Stentor, enables time-course mechanistic studies impossible with current wounding methods.
CrossCheck: an open-source web tool for high-throughput screen data analysis.
Najafov, Jamil; Najafov, Ayaz
2017-07-19
Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
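The core operation the tool automates, comparing a user's hit list against many published gene sets, reduces to set intersection; the sketch below is a generic illustration with invented dataset names and contents, not CrossCheck's code or database schema.

```python
# Toy cross-referencing of a user hit list against published gene sets.
# Dataset names and contents are invented for illustration.
published = {
    "rnai_screen_A": {"TP53", "ATM", "CHEK2", "MDM2"},
    "phosphoproteome_B": {"AKT1", "MTOR", "RPS6KB1", "ATM"},
    "crispr_screen_C": {"BRCA1", "BRCA2", "ATM", "TP53"},
}

def cross_check(user_hits, datasets):
    """Return the overlap of the user's hit list with each dataset, largest overlaps first."""
    user = {g.upper() for g in user_hits}
    overlaps = {name: sorted(user & genes) for name, genes in datasets.items()}
    return dict(sorted(overlaps.items(), key=lambda kv: len(kv[1]), reverse=True))

print(cross_check(["atm", "tp53", "egfr"], published))
# {'rnai_screen_A': ['ATM', 'TP53'], 'crispr_screen_C': ['ATM', 'TP53'], 'phosphoproteome_B': ['ATM']}
```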
Ion channel drug discovery and research: the automated Nano-Patch-Clamp technology.
Brueggemann, A; George, M; Klau, M; Beckler, M; Steindl, J; Behrends, J C; Fertig, N
2004-01-01
Unlike the genomics revolution, which was largely enabled by a single technological advance (high throughput sequencing), rapid advancement in proteomics will require a broader effort to increase the throughput of a number of key tools for functional analysis of different types of proteins. In the case of ion channels, a class of (membrane) proteins of great physiological importance and potential as drug targets, the lack of adequate assay technologies is felt particularly strongly. The available, indirect, high throughput screening methods for ion channels clearly generate insufficient information. The best technology to study ion channel function and screen for compound interaction is the patch clamp technique, but patch clamping suffers from low throughput, which is not acceptable for drug screening. A first step towards a solution is presented here. The nano patch clamp technology, which is based on a planar, microstructured glass chip, enables automatic whole cell patch clamp measurements. The Port-a-Patch is an automated electrophysiology workstation, which uses planar patch clamp chips. This approach enables high quality and high content ion channel and compound evaluation on a one-cell-at-a-time basis. The presented automation of the patch process and its scalability to an array format are the prerequisites for any higher throughput electrophysiology instruments.
2012-01-01
Background As Next-Generation Sequencing data becomes available, existing hardware environments do not provide sufficient storage space and computational power to store and process the data due to their enormous size. This is and will be a frequent problem encountered every day by researchers who are working on genetic data. There are some options available for compressing and storing such data, such as general-purpose compression software, PBAT/PLINK binary format, etc. However, these currently available methods either do not offer sufficient compression rates, or require a great amount of CPU time for decompression and loading every time the data is accessed. Results Here, we propose a novel and simple algorithm for storing such sequencing data. We show that the compression factor of the algorithm ranges from 16 to several hundred, which potentially allows SNP data of hundreds of gigabytes to be stored in hundreds of megabytes. We provide a C++ implementation of the algorithm, which supports direct loading and parallel loading of the compressed format without requiring extra time for decompression. By applying the algorithm to simulated and real datasets, we show that the algorithm gives a greater compression rate than the commonly used compression methods, and the data-loading process takes less time. Also, the C++ library provides direct data-retrieval functions, which allow the compressed information to be easily accessed by other C++ programs. Conclusions The SpeedGene algorithm enables the storage and analysis of next-generation sequencing data in current hardware environments, making system upgrades unnecessary. PMID:22591016
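The abstract reports compression factors of 16 to several hundred without spelling out the algorithm. As a baseline point of reference (not the SpeedGene method itself), the sketch below packs biallelic SNP genotypes (0, 1, 2, missing) into 2 bits each, already a fourfold saving over one byte per genotype before any entropy coding.

```python
# Minimal 2-bit packing of biallelic SNP genotypes (0, 1, 2; 3 = missing).
# A baseline illustration of why SNP matrices compress well, not the SpeedGene algorithm.
def pack(genotypes):
    out = bytearray()
    for i in range(0, len(genotypes), 4):
        byte = 0
        for j, g in enumerate(genotypes[i:i + 4]):
            byte |= (g & 0b11) << (2 * j)   # 4 genotypes per byte
        out.append(byte)
    return bytes(out)

def unpack(packed, n):
    return [(packed[i // 4] >> (2 * (i % 4))) & 0b11 for i in range(n)]

g = [0, 1, 2, 3, 2, 2, 0, 1, 1]
assert unpack(pack(g), len(g)) == g
print(len(g), "genotypes ->", len(pack(g)), "bytes")   # 9 genotypes -> 3 bytes
```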
High-Reflectivity Coatings for a Vacuum Ultraviolet Spectropolarimeter
NASA Astrophysics Data System (ADS)
Narukage, Noriyuki; Kubo, Masahito; Ishikawa, Ryohko; Ishikawa, Shin-nosuke; Katsukawa, Yukio; Kobiki, Toshihiko; Giono, Gabriel; Kano, Ryouhei; Bando, Takamasa; Tsuneta, Saku; Auchère, Frédéric; Kobayashi, Ken; Winebarger, Amy; McCandless, Jim; Chen, Jianrong; Choi, Joanne
2017-03-01
Precise polarization measurements in the vacuum ultraviolet (VUV) region are expected to be a new tool for inferring the magnetic fields in the upper atmosphere of the Sun. High-reflectivity coatings are key elements to achieving high-throughput optics for precise polarization measurements. We fabricated three types of high-reflectivity coatings for a solar spectropolarimeter in the hydrogen Lyman-α (Lyα; 121.567 nm) region and evaluated their performance. The first high-reflectivity mirror coating offers a reflectivity of more than 80 % in Lyα optics. The second is a reflective narrow-band filter coating that has a peak reflectivity of 57 % in Lyα, whereas its reflectivity in the visible light range is lower than 1/10 of the peak reflectivity (˜ 5 % on average). This coating can be used to easily realize a visible light rejection system, which is indispensable for a solar telescope, while maintaining high throughput in the Lyα line. The third is a high-efficiency reflective polarizing coating that almost exclusively reflects an s-polarized beam at its Brewster angle of 68° with a reflectivity of 55 %. This coating achieves both high polarizing power and high throughput. These coatings contributed to the high-throughput solar VUV spectropolarimeter called the Chromospheric Lyman-Alpha SpectroPolarimeter (CLASP), which was launched on 3 September, 2015.
A simple and sensitive high-throughput GFP screening in woody and herbaceous plants.
Hily, Jean-Michel; Liu, Zongrang
2009-03-01
Green fluorescent protein (GFP) has been used widely as a powerful bioluminescent reporter, but its visualization by existing methods in tissues or whole plants and its utilization for high-throughput screening remains challenging in many species. Here, we report a fluorescence image analyzer-based method for GFP detection and its utility for high-throughput screening of transformed plants. Of three detection methods tested, the Typhoon fluorescence scanner was able to detect GFP fluorescence in all Arabidopsis thaliana tissues and apple leaves, while regular fluorescence microscopy detected it only in Arabidopsis flowers and siliques but barely in the leaves of either Arabidopsis or apple. The hand-held UV illumination method failed in all tissues of both species. Additionally, the Typhoon imager was able to detect GFP fluorescence in both green and non-green tissues of Arabidopsis seedlings as well as in imbibed seeds, qualifying it as a high-throughput screening tool, which was further demonstrated by screening the seedlings of primary transformed T(0) seeds. Of the 30,000 germinating Arabidopsis seedlings screened, at least 69 GFP-positive lines were identified, accounting for an approximately 0.23% transformation efficiency. About 14,000 seedlings grown in 16 Petri plates could be screened within an hour, making the screening process significantly more efficient and robust than any other existing high-throughput screening method for transgenic plants.
Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K
2018-01-01
The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with highest product purity requiring suboptimal values for other criteria, but also allowed for AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.
Alterman, Julia F; Coles, Andrew H; Hall, Lauren M; Aronin, Neil; Khvorova, Anastasia; Didiot, Marie-Cécile
2017-08-20
Primary neurons represent an ideal cellular system for the identification of therapeutic oligonucleotides for the treatment of neurodegenerative diseases. However, due to the sensitive nature of primary cells, the transfection of small interfering RNAs (siRNA) using classical methods is laborious and often shows low efficiency. Recent progress in oligonucleotide chemistry has enabled the development of stabilized and hydrophobically modified small interfering RNAs (hsiRNAs). This new class of oligonucleotide therapeutics shows extremely efficient self-delivery properties and supports potent and durable effects in vitro and in vivo. We have developed a high-throughput in vitro assay to identify and test hsiRNAs in primary neuronal cultures. To simply, rapidly, and accurately quantify the mRNA silencing of hundreds of hsiRNAs, we use the QuantiGene 2.0 quantitative gene expression assay. This high-throughput, 96-well plate-based assay can quantify mRNA levels directly from sample lysate. Here, we describe a method to prepare short-term cultures of mouse primary cortical neurons in a 96-well plate format for high-throughput testing of oligonucleotide therapeutics. This method supports the testing of hsiRNA libraries and the identification of potential therapeutics within just two weeks. We detail the methodologies of our high-throughput assay workflow from primary neuron preparation to data analysis. This method can help identify oligonucleotide therapeutics for the treatment of various neurological diseases.
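Scoring a plate of such measurements usually comes down to normalizing the target signal to a housekeeping gene and then to untreated control wells. The sketch below shows that arithmetic with invented numbers; the actual QuantiGene analysis settings are described in the paper, not here.

```python
# Hypothetical example of scoring mRNA silencing from plate readouts:
# normalize the target signal to a housekeeping gene, then to untreated controls.
def percent_silencing(target, housekeeping, ctrl_target, ctrl_housekeeping):
    treated_ratio = target / housekeeping
    control_ratio = ctrl_target / ctrl_housekeeping
    return 100.0 * (1.0 - treated_ratio / control_ratio)

# Invented raw signals for one hsiRNA-treated well and the averaged untreated controls
print(round(percent_silencing(target=1200, housekeeping=9800,
                              ctrl_target=5100, ctrl_housekeeping=10000), 1))  # 76.0 (% knockdown)
```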
Yoshii, Yukie; Furukawa, Takako; Waki, Atsuo; Okuyama, Hiroaki; Inoue, Masahiro; Itoh, Manabu; Zhang, Ming-Rong; Wakizaka, Hidekatsu; Sogawa, Chizuru; Kiyono, Yasushi; Yoshii, Hiroshi; Fujibayashi, Yasuhisa; Saga, Tsuneo
2015-05-01
Anti-cancer drug development typically utilizes high-throughput screening with two-dimensional (2D) cell culture. However, 2D culture induces cellular characteristics different from tumors in vivo, resulting in inefficient drug development. Here, we report an innovative high-throughput screening system using nanoimprinting 3D culture to simulate in vivo conditions, thereby facilitating efficient drug development. We demonstrated that cell line-based nanoimprinting 3D screening can more efficiently select drugs that effectively inhibit cancer growth in vivo as compared to 2D culture. Metabolic responses after treatment were assessed using positron emission tomography (PET) probes, and revealed similar characteristics between the 3D spheroids and in vivo tumors. Further, we developed an advanced method to adapt cancer cells from patient tumor tissues for high-throughput drug screening with nanoimprinting 3D culture, which we termed the Cancer tissue-Originated Uniformed Spheroid Assay (COUSA). This system identified drugs that were effective in xenografts of the original patient tumors. Nanoimprinting 3D spheroids showed low permeability and formation of hypoxic regions inside, similar to in vivo tumors. Collectively, nanoimprinting 3D culture provides an easy-to-handle, high-throughput drug screening system, which allows for efficient drug development by mimicking the tumor environment. The COUSA system could be a useful platform for drug development with patient cancer cells. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy
Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian
2016-01-01
Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS) technology. In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high-sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with a comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for the high-resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925
Dunning, F Mark; Piazza, Timothy M; Zeytin, Füsûn N; Tucker, Ward C
2014-03-03
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: The first part involves preparation of the samples for testing, the second part is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix, and the third part quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes and includes discussion of critical parameters for assay success.
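Converting the fluorogenic read-out into a toxin amount generally means interpolating sample signals against a standard curve, often with a four-parameter logistic fit. The sketch below runs that generic procedure on invented calibration data; it does not reproduce the BoTest Matrix kit's actual analysis.

```python
# Hedged sketch: fit a four-parameter logistic (4PL) standard curve to invented
# calibration data, then interpolate an unknown sample's fluorescence back to a concentration.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])      # ng/mL, hypothetical standards
signal = np.array([510, 620, 980, 2100, 4500, 6200, 6900])   # fluorescence, hypothetical

params, _ = curve_fit(four_pl, conc, signal, p0=[500, 7000, 1.0, 1.0], maxfev=10000)

def invert_4pl(y, bottom, top, ec50, hill):
    """Solve the fitted 4PL for concentration at a given signal (valid for bottom < y < top)."""
    return ec50 / (((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill))

print(f"{invert_4pl(3000, *params):.3f} ng/mL (interpolated from a 3000-unit signal)")
```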
Direct screening of herbal blends for new synthetic cannabinoids by MALDI-TOF MS.
Gottardo, Rossella; Chiarini, Anna; Dal Prà, Ilaria; Seri, Catia; Rimondo, Claudia; Serpelloni, Giovanni; Armato, Ubaldo; Tagliaro, Franco
2012-01-01
Since 2004, a number of herbal blends containing different synthetic compounds mimicking the pharmacological activity of cannabinoids and displaying a high toxicological potential have appeared on the market. Their availability is mainly based on so-called "e-commerce", being sold as legal alternatives to cannabis and cannabis derivatives. Although highly selective, sensitive, accurate, and quantitative methods based on GC-MS and LC-MS are available, they lack the simplicity, rapidity, versatility and throughput required for product monitoring. In this context, matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) offers simple and rapid operation with high throughput. Thus, the aim of the present work was to develop a MALDI-TOF MS method for the rapid qualitative direct analysis of herbal blend preparations for synthetic cannabinoids, to be used as a front-line screen of confiscated clandestine preparations. Sample preparation was limited to finely grinding the herbal blend leaves in a mortar and loading them onto the MALDI plate, followed by addition of 2 µl of the matrix/surfactant mixture [α-cyano-4-hydroxy-cinnamic acid/cetyltrimethylammonium bromide (CTAB)]. After drying, the sample plate was introduced into the ion source for analysis. MALDI-TOF conditions were as follows: mass spectra were analyzed in the range m/z 150-550 by averaging the data from 50 laser shots and using an accelerating voltage of 20 kV. The described method was successfully applied to the screening of 31 commercial herbal blends, previously analyzed by GC-MS. Among the samples analyzed, 21 contained synthetic cannabinoids (namely JWH-018, JWH-073, JWH-081, JWH-250, JWH-210, JWH-019, and AM-694). All the results were in agreement with GC-MS, which was used as the reference technique. Copyright © 2012 John Wiley & Sons, Ltd.
Break-up of droplets in a concentrated emulsion flowing through a narrow constriction
NASA Astrophysics Data System (ADS)
Kim, Minkyu; Rosenfeld, Liat; Tang, Sindy; Tang Lab Team
2014-11-01
Droplet microfluidics has enabled a wide range of high-throughput screening applications. Compared with other technologies such as robotic screening, droplet microfluidics has 1000 times higher throughput, which makes it one of the most promising platforms for ultrahigh-throughput screening applications. Few studies have considered the throughput of the droplet interrogation process, however. In this research, we show that the probability of break-up increases with increasing flow rate, entrance angle to the constriction, and size of the drops. Since single drops do not break at the highest flow rate used in the system, break-ups occur primarily from the interactions between highly packed droplets close to each other. Moreover, the probabilistic nature of the break-up process arises from the stochastic variations in the packing configuration. Our results can be used to calculate the maximum throughput of the serial interrogation process. For 40 pL drops, the highest throughput with less than 1% droplet break-up was measured to be approximately 7,000 drops per second. In addition, the results are useful for understanding the behavior of concentrated emulsions in applications such as mobility control in enhanced oil recovery.
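For context, the reported interrogation limit converts directly to a volumetric throughput; the arithmetic below only restates the paper's 40 pL and 7,000 drops-per-second figures with unit conversion, nothing more.

```python
# Volumetric throughput implied by the reported serial-interrogation limit
# (40 pL drops at ~7,000 drops/s with <1% break-up).
drop_volume_pl = 40.0
drops_per_second = 7000.0

ul_per_min = drop_volume_pl * 1e-6 * drops_per_second * 60   # pL -> µL, s -> min
print(f"{ul_per_min:.1f} µL/min (~{ul_per_min * 60 / 1000:.2f} mL/h)")  # 16.8 µL/min (~1.01 mL/h)
```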
In response to a proposed vision and strategy for toxicity testing in the 21st century nascent high throughput toxicology (HTT) programs have tested thousands of chemicals in hundreds of pathway-based biological assays. Although, to date, use of HTT data for safety assessment of ...
Molecular characterization of a novel Luteovirus from peach identified by high-throughput sequencing
USDA-ARS?s Scientific Manuscript database
Contigs with sequence homologies to Cherry-associated luteovirus were identified by high-throughput sequencing analysis of two peach accessions undergoing quarantine testing. The complete genomic sequences of the two isolates of this virus are 5,819 and 5,814 nucleotides. Their genome organization i...
Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...
Toxicokinetics (TK) provides a bridge between toxicity and exposure assessment by predicting tissue concentrations due to exposure. However, traditional TK methods are resource intensive. Relatively high throughput TK (HTTK) methods have been used by the pharmaceutical industry to...
Results from rodent and non-rodent prenatal developmental toxicity tests for over 300 chemicals have been curated into the relational database ToxRefDB. These same chemicals have been run in concentration-response format through over 500 high-throughput screening assays assessin...
Discovery of viruses and virus-like pathogens in pistachio using high-throughput sequencing
USDA-ARS?s Scientific Manuscript database
Pistachio (Pistacia vera L.) trees from the National Clonal Germplasm Repository (NCGR) and orchards in California were surveyed for viruses and virus-like agents by high-throughput sequencing (HTS). Analyses of 60 trees including clonal UCB-1 hybrid rootstock (P. atlantica × P. integerrima) identif...
SeqAPASS to evaluate conservation of high-throughput screening targets across non-mammalian species
Cell-based high-throughput screening (HTS) and computational technologies are being applied as tools for toxicity testing in the 21st century. The U.S. Environmental Protection Agency (EPA) embraced these technologies and created the ToxCast Program in 2007, which has served as a...
20180312 - Applying a High-Throughput PBTK Model for IVIVE (SOT)
The ability to link in vitro and in vivo toxicity enables the use of high-throughput in vitro assays as an alternative to resource intensive animal studies. Toxicokinetics (TK) should help describe this link, but prior work found weak correlation when using a TK model for in vitr...
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples...
The U.S. EPA must consider thousands of chemicals when allocating resources to assess risk in human populations and the environment. High-throughput screening assays to characterize biological activity in vitro are being implemented in the ToxCast™ program to rapidly characteri...
Application of the Adverse Outcome Pathway (AOP) framework and high throughput toxicity testing in chemical-specific risk assessment requires reconciliation of chemical concentrations sufficient to trigger a molecular initiating event measured in vitro and at the relevant target ...
Evaluation of food-relevant chemicals in the ToxCast high-throughput screening program
There are thousands of chemicals that are directly added to or come in contact with food, many of which have undergone little to no toxicological evaluation. The ToxCast high-throughput screening (HTS) program has evaluated over 1,800 chemicals in concentration-response across ~8...
In vitro based assays are used to identify potential endocrine disrupting chemicals. Thyroperoxidase (TPO), an enzyme essential for thyroid hormone (TH) synthesis, is a target site for disruption of the thyroid axis for which a high-throughput screening (HTPS) assay has recently ...
We previously integrated dosimetry and exposure with high-throughput screening (HTS) to enhance the utility of ToxCast™ HTS data by translating in vitro bioactivity concentrations to oral equivalent doses (OEDs) required to achieve these levels internally. These OEDs were compare...
An Evaluation of 25 Selected ToxCast Chemicals in Medium-Throughput Assays to Detect Genotoxicity
ToxCast is a multi-year effort to develop a cost-effective approach for the US EPA to prioritize chemicals for toxicity testing. Initial evaluation of more than 500 high-throughput (HT) microwell-based assays without metabolic activation showed that most lacked high speci...
High Throughput Assays for Exposure Science (NIEHS OHAT Staff Meeting presentation)
High throughput screening (HTS) data that characterize chemically induced biological activity have been generated for thousands of chemicals by the US interagency Tox21 and the US EPA ToxCast programs. In many cases there are no data available for comparing bioactivity from HTS w...
Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...
The U.S. Environmental Protection Agency’s ToxCast program has screened thousands of chemicals for biological activity, primarily using high-throughput in vitro bioassays. Adverse outcome pathways (AOPs) offer a means to link pathway-specific biological activities with potential ...
“httk”: EPA’s Tool for High Throughput Toxicokinetics (CompTox CoP)
Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concentr...
High-throughput screening (HTS) for potential thyroid–disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limi...
Perspectives on Validation of High-Throughput Assays Supporting 21st Century Toxicity Testing
In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. ...
The toxicity-testing paradigm has evolved to include high-throughput (HT) methods for addressing the increasing need to screen hundreds to thousands of chemicals rapidly. Approaches that involve in vitro screening assays, in silico predictions of exposure concentrations, and phar...
Predictive Model of Rat Reproductive Toxicity from ToxCast High Throughput Screening
The EPA ToxCast research program uses high throughput screening for bioactivity profiling and predicting the toxicity of large numbers of chemicals. ToxCast Phase‐I tested 309 well‐characterized chemicals in over 500 assays for a wide range of molecular targets and cellular respo...
Applying a High-Throughput PBTK Model for IVIVE
The ability to link in vitro and in vivo toxicity enables the use of high-throughput in vitro assays as an alternative to resource intensive animal studies. Toxicokinetics (TK) should help describe this link, but prior work found weak correlation when using a TK model for in vitr...
High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT)
USDA-ARS?s Scientific Manuscript database
Implementation of molecular methods in hop breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. Diversity Arrays Technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of...
In vitro, high-throughput approaches have been widely recommended as an approach to screen chemicals for the potential to cause developmental neurotoxicity and prioritize them for additional testing. The choice of cellular models for such an approach will have important ramificat...
High-throughput exposure modeling to support prioritization of chemicals in personal care products
We demonstrate the application of a high-throughput modeling framework to estimate exposure to chemicals used in personal care products (PCPs). As a basis for estimating exposure, we use the product intake fraction (PiF), defined as the mass of chemical taken by an individual or ...
High-throughput methods are useful for rapidly screening large numbers of chemicals for biological activity, including the perturbation of pathways that may lead to adverse cellular effects. In vitro assays for the key events of neurodevelopment, including apoptosis, may ...
One use of alternative methods is to target animal use at only those chemicals and tests that are absolutely necessary. We discuss prioritization of testing based on high-throughput screening assays (HTS), QSAR modeling, high-throughput toxicokinetics (HTTK), and exposure modelin...
We demonstrate a computational network model that integrates 18 in vitro, high-throughput screening assays measuring estrogen receptor (ER) binding, dimerization, chromatin binding, transcriptional activation and ER-dependent cell proliferation. The network model uses activity pa...
The CTD2 Center at Emory University used high-throughput protein-protein interaction (PPI) mapping for Hippo signaling pathway profiling to rapidly unveil promising PPIs as potential therapeutic targets and advance functional understanding of signaling circuitry in cells.
Over the past ten years, the US government has invested in high-throughput (HT) methods to screen chemicals for biological activity. Under the interagency Tox21 consortium and the US Environmental Protection Agency’s (EPA) ToxCast™ program, thousands of chemicals have...
USDA-ARS?s Scientific Manuscript database
In the last few years, high-throughput genomics promised to bridge the gap between plant physiology and plant sciences. In addition, high-throughput genotyping technologies facilitate marker-based selection for better performing genotypes. In strawberry, Fragaria vesca was the first reference sequen...
Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of high-resolution MS (HRMS) for bioanalytical support of high-throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high-throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of the large data file sizes typical of HRMS analyses.
Automated crystallographic system for high-throughput protein structure determination.
Brunzelle, Joseph S; Shafaee, Padram; Yang, Xiaojing; Weigand, Steve; Ren, Zhong; Anderson, Wayne F
2003-07-01
High-throughput structural genomic efforts require software that is highly automated, distributive and requires minimal user intervention to determine protein structures. Preliminary experiments were set up to test whether automated scripts could utilize a minimum set of input parameters and produce a set of initial protein coordinates. From this starting point, a highly distributive system was developed that could determine macromolecular structures at a high throughput rate, warehouse and harvest the associated data. The system uses a web interface to obtain input data and display results. It utilizes a relational database to store the initial data needed to start the structure-determination process as well as generated data. A distributive program interface administers the crystallographic programs which determine protein structures. Using a test set of 19 protein targets, 79% were determined automatically.
Advanced High-Level Waste Glass Research and Development Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peeler, David K.; Vienna, John D.; Schweiger, Michael J.
2015-07-01
The U.S. Department of Energy Office of River Protection (ORP) has implemented an integrated program to increase the loading of Hanford tank wastes in glass while meeting melter lifetime expectancies and process, regulatory, and product quality requirements. The integrated ORP program is focused on providing a technical, science-based foundation from which key decisions can be made regarding the successful operation of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) facilities. The fundamental data stemming from this program will support development of advanced glass formulations, key process control models, and tactical processing strategies to ensure safe and successful operations for both the low-activity waste (LAW) and high-level waste (HLW) vitrification facilities with an appreciation toward reducing overall mission life. The purpose of this advanced HLW glass research and development plan is to identify the near-, mid-, and longer-term research and development activities required to develop and validate advanced HLW glasses and their associated models to support facility operations at WTP, including both direct feed and full pretreatment flowsheets. This plan also integrates technical support of facility operations and waste qualification activities to show the interdependence of these activities with the advanced waste glass (AWG) program to support the full WTP mission. Figure ES-1 shows these key ORP programmatic activities and their interfaces with both WTP facility operations and qualification needs. The plan is a living document that will be updated to reflect key advancements and mission strategy changes. The research outlined here is motivated by the potential for substantial economic benefits (e.g., significant increases in waste throughput and reductions in glass volumes) that will be realized when advancements in glass formulation continue and models supporting facility operations are implemented. Developing and applying advanced glass formulations will reduce the cost of Hanford tank waste management by reducing the schedule for tank waste treatment and reducing the amount of HLW glass for storage, transportation, and disposal. Additional benefits will be realized if advanced glasses are developed that demonstrate more tolerance for key components in the waste (such as Al2O3, Cr2O3, SO3, and Na2O) above the currently defined WTP constraints. Tolerating these higher concentrations of key waste loading limiters may reduce the burden on (or even eliminate the need for) leaching to remove Cr and Al and washing to remove excess S and Na from the HLW fraction. Advanced glass formulations may also make direct vitrification of the HLW fraction without significant pretreatment more cost effective. Finally, the advanced glass formulation efforts seek not only to increase waste loading in glass, but also to increase glass production rate. When coupled with higher waste loading, ensuring that all of the advanced glass formulations are processable at or above the current contract processing rate leads to significant improvements in waste throughput (the amount of waste being processed per unit time), which could significantly reduce the overall WTP mission life. The integration of increased waste loading, reduced leaching/washing requirements, and improved melting rates provides a system-wide approach to improve the effectiveness of the WTP process.
A high-throughput, multi-channel photon-counting detector with picosecond timing
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.
2009-06-01
High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small-pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration, which uses the multi-channel HPTDC timing ASIC.
Direct assembling methodologies for high-throughput bioscreening
Rodríguez-Dévora, Jorge I.; Shi, Zhi-dong; Xu, Tao
2012-01-01
Over the last few decades, high-throughput (HT) bioscreening, a technique that allows rapid screening of biochemical compound libraries against biological targets, has been widely used in drug discovery, stem cell research, development of new biomaterials, and genomics research. To achieve these ambitions, scaffold-free (or direct) assembly of biological entities of interest has become critical. Appropriate assembling methodologies are required to build an efficient HT bioscreening platform. The development of contact and non-contact assembling systems as a practical solution has been driven by a variety of essential attributes of the bioscreening system, such as miniaturization, high throughput, and high precision. The present article reviews recent progress on these assembling technologies utilized for the construction of HT bioscreening platforms. PMID:22021162
Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan; ...
2016-09-23
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
Lambert, Nathaniel D.; Pankratz, V. Shane; Larrabee, Beth R.; Ogee-Nwankwo, Adaeze; Chen, Min-hsin; Icenogle, Joseph P.
2014-01-01
Rubella remains a social and economic burden due to the high incidence of congenital rubella syndrome (CRS) in some countries. For this reason, an accurate and efficient high-throughput measure of antibody response to vaccination is an important tool. In order to measure rubella-specific neutralizing antibodies in a large cohort of vaccinated individuals, a high-throughput immunocolorimetric system was developed. Statistical interpolation models were applied to the resulting titers to refine quantitative estimates of neutralizing antibody titers relative to the assayed neutralizing antibody dilutions. This assay, including the statistical methods developed, can be used to assess the neutralizing humoral immune response to rubella virus and may be adaptable for assessing the response to other viral vaccines and infectious agents. PMID:24391140
NASA Astrophysics Data System (ADS)
Jian, Wei; Estevez, Claudio; Chowdhury, Arshad; Jia, Zhensheng; Wang, Jianxin; Yu, Jianguo; Chang, Gee-Kung
2010-12-01
This paper presents an energy-efficient Medium Access Control (MAC) protocol for very-high-throughput millimeter-wave (mm-wave) wireless sensor communication networks (VHT-MSCNs) based on hybrid multiple access techniques of frequency division multiple access (FDMA) and time division multiple access (TDMA). An energy-efficient Superframe for wireless sensor communication networks employing directional mm-wave wireless access technologies is proposed for systems that require very high throughput, such as high definition video signals, for sensing, processing, transmitting, and actuating functions. Energy consumption modeling for each network element and comparisons among various multi-access technologies in terms of power and MAC layer operations are investigated for evaluating the energy-efficiency improvement of the proposed MAC protocol.
Suram, Santosh K; Newhouse, Paul F; Zhou, Lan; Van Campen, Douglas G; Mehta, Apurva; Gregoire, John M
2016-11-14
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. The strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
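The automated Tauc analysis mentioned in these records reduces, for a direct-gap material, to plotting (αhν)² against photon energy and extrapolating the linear rise to zero. The sketch below applies that recipe to synthetic data with a built-in 2.7 eV gap; it illustrates the technique and is not the authors' pipeline.

```python
# Toy Tauc analysis for a direct band gap: fit the steep linear region of (α·hν)² vs hν
# and extrapolate to the energy axis. Synthetic data with a gap placed at 2.7 eV.
import numpy as np

E_g = 2.7                                              # built-in gap of the synthetic data, eV
hv = np.linspace(2.0, 3.5, 151)                        # photon energy, eV
tauc = np.where(hv > E_g, 5.0 * (hv - E_g), 0.0)       # ideal direct-gap behaviour: (α·hν)² ∝ (hν - E_g)
# In a real measurement, tauc would instead be computed as (alpha * hv)**2 from the spectra.

# Fit a line over the steep rise (here: 20-80% of the curve's maximum) and extrapolate to zero.
mask = (tauc > 0.2 * tauc.max()) & (tauc < 0.8 * tauc.max())
slope, intercept = np.polyfit(hv[mask], tauc[mask], 1)
print(f"estimated direct band gap: {-intercept / slope:.2f} eV")   # 2.70 eV
```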
Optimization of hydrogen dispersion in thermophilic up-flow reactors for ex situ biogas upgrading.
Bassani, Ilaria; Kougias, Panagiotis G; Treu, Laura; Porté, Hugo; Campanaro, Stefano; Angelidaki, Irini
2017-06-01
This study evaluates the efficiency of four novel up-flow reactors for ex situ biogas upgrading converting externally provided CO2 and H2 to CH4, via hydrogenotrophic methanogenesis. The gases were injected through stainless steel diffusers combined with alumina ceramic sponge or through alumina ceramic membranes. Pore size, input gas loading and gas recirculation flow rate were modulated to optimize gas-liquid mass transfer, and thus methanation efficiency. Results showed that larger pore size diffusion devices achieved the best kinetics and output-gas quality, converting all the injected H2 and CO2 up to an H2 loading rate of 3.6 L/(Lreactor·d). Specifically, the reactors' CH4 content increased from 23 to 96% and the CH4 yield reached 0.25 LCH4/LH2. High-throughput 16S rRNA gene sequencing revealed predominance of bacteria belonging to the Anaerobaculum genus and to the uncultured order MBA08. Additionally, the massive increase of hydrogenotrophic methanogens, such as Methanothermobacter thermautotrophicus, and syntrophic bacteria demonstrates the selection effect of H2 on community composition. Copyright © 2017 Elsevier Ltd. All rights reserved.
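The reported CH4 yield in the entry above matches the stoichiometric ceiling of hydrogenotrophic methanogenesis (4 H2 + CO2 → CH4 + 2 H2O), i.e. one volume of CH4 per four volumes of H2 at the same temperature and pressure. A quick check; the volumetric production rate on the last line is derived from the paper's own numbers for illustration, not a figure reported in the abstract.

```python
# 4 H2 + CO2 -> CH4 + 2 H2O: at identical T and P (ideal-gas volumes),
# the maximum CH4 yield is one litre of CH4 per four litres of H2.
max_yield = 1 / 4                      # L CH4 per L H2
print(max_yield)                       # 0.25, matching the reported yield

# At the highest fully converted loading rate of 3.6 L H2 per litre of
# reactor per day, complete conversion corresponds to:
h2_loading = 3.6                       # L H2 / (L reactor * d)
print(h2_loading * max_yield)          # 0.9 L CH4 / (L reactor * d)
```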
improved and higher throughput methods for analysis of biomass feedstocks Agronomics-using NIR spectroscopy in-house and external client training. She has also developed improved and high-throughput methods
High-throughput and automated SAXS/USAXS experiment for industrial use at BL19B2 in SPring-8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osaka, Keiichi, E-mail: k-osaka@spring8.or.jp; Inoue, Daisuke; Sato, Masugu
A highly automated system combining a sample transfer robot with a focused SR beam has been established for small-angle and ultra small-angle X-ray scattering (SAXS/USAXS) measurement at BL19B2 for industrial use of SPring-8. High-throughput data collection is realized by means of an X-ray beam of high photon flux density concentrated by a cylindrical mirror, and a two-dimensional pixel detector, PILATUS-2M. For SAXS measurement, we can obtain high-quality data within 1 minute for one exposure using this system. The sample transfer robot has a capacity of 90 samples with a large variety of shapes. The fusion of high-throughput and robotic systems has enhanced the usability of the SAXS/USAXS capability for industrial application.
Multistrip western blotting to increase quantitative data output.
Kiyatkin, Anatoly; Aksamitiene, Edita
2009-01-01
The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per single blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same loading of the sample, and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and therefore is beneficial to apply in biomedical diagnostics, systems biology, and cell signaling research.
Development of critical dimension measurement scanning electron microscope for ULSI (S-8000 series)
NASA Astrophysics Data System (ADS)
Ezumi, Makoto; Otaka, Tadashi; Mori, Hiroyoshi; Todokoro, Hideo; Ose, Yoichi
1996-05-01
The semiconductor industry is moving from half-micron to quarter-micron design rules. To support this evolution, Hitachi has developed a new critical dimension measurement scanning electron microscope (CD-SEM), the model S-8800 series, for quality control of quarter-micron process lines. The new CD-SEM provides detailed examination of process conditions with 5 nm resolution and 5 nm repeatability (3 sigma) at an accelerating voltage of 800 V using secondary electron imaging. In addition, a newly developed load-lock system is capable of achieving a high sample throughput of 20 wafers/hour (5 point measurements per wafer) under continuous operation. For user friendliness, the system incorporates a graphical user interface (GUI), an automated pattern recognition system that helps locate measurement points, both manual and semi-automated operation, and user-programmable operating parameters.
IEEE 802.11e EDCF performance evaluation
NASA Astrophysics Data System (ADS)
Huang, Benxiong; Zhang, Fan; Wang, Yan; Wang, Xiaoling
2004-04-01
This paper evaluates the performance of the contention-based channel access mechanism of IEEE 802.11e, called the enhanced distributed coordination function (EDCF), compared with the legacy 802.11 MAC in supporting voice, video and data applications, through network simulation of an 802.11e scenario. We then discuss the effects of the Contention Window (CW) and Arbitration Inter-Frame Space (AIFS) on service differentiation and total throughput. We also consider an optional feature of the EDCF, called contention-free burst (CFB). Our simulation study shows that EDCF with TXOP provides better-differentiated channel access for different traffic types than EDCF without TXOP, especially at high traffic loads; however, the behaviour induced by the CFB parameters appears to fluctuate and become unstable across different applications and configurations.
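For readers unfamiliar with how CW and AIFS yield service differentiation, a minimal sketch of the per-access-category deferral computation follows. The parameter values approximate common 802.11a/g-style EDCA defaults and should be read as illustrative assumptions, not the configuration used in the simulations above.

```python
import random

SLOT_US, SIFS_US = 9, 16   # OFDM-PHY slot and SIFS durations (microseconds)

# Access-category parameters: (AIFSN, CWmin, CWmax); assumed, roughly default-like values.
ACCESS_CATEGORIES = {
    "voice":       (2, 3, 7),
    "video":       (2, 7, 15),
    "best_effort": (3, 15, 1023),
    "background":  (7, 15, 1023),
}

def channel_access_delay(ac, retries=0):
    """One contention attempt's deferral in microseconds: AIFS[AC] plus a random
    backoff drawn from the category's contention window. Smaller AIFS and CW give
    high-priority traffic statistically earlier access to the channel."""
    aifsn, cw_min, cw_max = ACCESS_CATEGORIES[ac]
    aifs = SIFS_US + aifsn * SLOT_US
    cw = min((cw_min + 1) * 2 ** retries - 1, cw_max)   # binary exponential backoff
    return aifs + random.randint(0, cw) * SLOT_US

random.seed(0)
for ac in ACCESS_CATEGORIES:
    print(ac, channel_access_delay(ac), "us")
```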
Power management of remote microgrids considering battery lifetime
NASA Astrophysics Data System (ADS)
Chalise, Santosh
Currently, 20% (1.3 billion) of the world's population still lacks access to electricity, and many live in remote areas where connection to the grid is not economical or practical. Remote microgrids could be the solution to the problem because they are designed to provide power for small communities within clearly defined electrical boundaries. Reducing the cost of electricity for remote microgrids can help to increase access to electricity for populations in remote areas and developing countries. The integration of renewable energy and batteries in diesel-based microgrids has been shown to be effective in reducing fuel consumption. However, the operational cost remains high due to the low lifetime of batteries, which are heavily used to improve the system's efficiency. In microgrid operation, a battery can act as a source to augment the generator or as a load to ensure full-load operation. In addition, a battery increases the utilization of PV by storing extra energy. However, the battery has a limited energy throughput. Therefore, it is necessary to balance fuel consumption against battery lifetime throughput in order to lower the cost of operation. This work presents a two-layer power management system for remote microgrids. The first layer is day-ahead scheduling, where the power set points of dispatchable resources are calculated. The second layer is real-time dispatch, where the scheduled set points from the first layer are accepted and resources are dispatched accordingly. A novel scheduling algorithm is proposed, which considers battery lifetime in the optimization and is expected to reduce the operational cost of the microgrid. This method is based on a goal programming approach with fuel cost and battery wear cost as the two objectives. The effectiveness of this method was evaluated through a simulation study of a PV-diesel hybrid microgrid using deterministic and stochastic optimization approaches.
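The thesis's actual formulation is not spelled out in the abstract above; as a hedged single-period sketch of the trade-off a goal-programming dispatch weighs, the toy example below balances diesel fuel cost against a battery wear (throughput) cost. All coefficients and limits are assumptions for illustration.

```python
import numpy as np

FUEL_COST = 0.30    # $/kWh of diesel energy (assumed)
WEAR_COST = 0.15    # $/kWh of battery throughput, a crude lifetime-wear proxy (assumed)
GEN_MIN, GEN_MAX = 10.0, 80.0   # diesel generator limits, kW (assumed)
BATT_MAX = 30.0                 # battery charge/discharge limit, kW (assumed)

def dispatch(net_load_kw):
    """Grid-search the battery setpoint (+discharge / -charge); the generator
    supplies the remainder. The objective adds the two competing goal terms."""
    best = None
    for p_batt in np.linspace(-BATT_MAX, BATT_MAX, 121):
        p_gen = net_load_kw - p_batt
        if not (GEN_MIN <= p_gen <= GEN_MAX):
            continue
        cost = FUEL_COST * p_gen + WEAR_COST * abs(p_batt)
        if best is None or cost < best[0]:
            best = (round(cost, 2), round(p_gen, 1), round(p_batt, 1))
    return best   # (cost $, generator kW, battery kW)

print(dispatch(60.0))
```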
USDA-ARS?s Scientific Manuscript database
The effect of refrigeration on bacterial communities within raw and pasteurized buffalo milk was studied using high-throughput sequencing. High quality samples of raw buffalo milk were obtained from five dairy farms in the Guangxi province of China. A sample of each milk was pasteurized, and both r...
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and are limited by low throughput, since only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, together with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By acquiring time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase change systems (melting temperature and latent heat of fusion) and non-phase change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In a proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
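The paper's model is not reproduced here; as a loosely hedged sketch of the kind of time-domain analysis a single infrared camera enables for many samples at once, the code below fits a lumped-capacitance (Newton's-law) cooling model to per-sample temperature traces to extract a thermal time constant per sample. Under that assumed model, an effective thermal conductance follows as G = C/τ if the heat capacity C is known separately.

```python
import numpy as np
from scipy.optimize import curve_fit

def cooling_model(t, T_env, dT0, tau):
    """Lumped-capacitance cooling: T(t) = T_env + dT0 * exp(-t / tau)."""
    return T_env + dT0 * np.exp(-t / tau)

def fit_time_constants(t, traces):
    """traces: (n_samples, n_frames) temperatures extracted from IR frames.
    Returns one fitted thermal time constant per sample."""
    taus = []
    for trace in traces:
        p0 = (trace[-1], trace[0] - trace[-1], (t[-1] - t[0]) / 3.0)
        (_, _, tau), _ = curve_fit(cooling_model, t, trace, p0=p0)
        taus.append(tau)
    return np.array(taus)

# Synthetic 3-sample panel with known time constants
t = np.linspace(0, 120, 240)
traces = np.array([cooling_model(t, 25.0, 40.0, tau) for tau in (10.0, 25.0, 60.0)])
print(np.round(fit_time_constants(t, traces), 1))   # ~[10. 25. 60.]
```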
NASA Astrophysics Data System (ADS)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi
2010-06-01
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than that at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are carried out automatically based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila.
Chiaraviglio, Lucius; Kirby, James E
2015-12-01
Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila
Chiaraviglio, Lucius
2015-01-01
Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. PMID:26392509
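The high-resolution dose-response analysis mentioned above is not specified further in the abstract; a common way to summarize such data is a four-parameter logistic (Hill) fit, sketched here on purely illustrative numbers.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

# Illustrative data: normalized intracellular growth vs. drug concentration (ug/ml)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
growth = np.array([1.00, 0.98, 0.90, 0.62, 0.25, 0.08, 0.03])

(bottom, top, ic50, slope), _ = curve_fit(hill, conc, growth, p0=(0.0, 1.0, 0.3, 1.0))
print(f"IC50 ~ {ic50:.2f} ug/ml, Hill slope ~ {slope:.2f}")
```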
High-throughput sequence alignment using Graphics Processing Units
Schatz, Michael C; Trapnell, Cole; Delcher, Arthur L; Varshney, Amitabh
2007-01-01
Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU. PMID:18070356
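MUMmerGPU itself builds a suffix tree of the reference and traverses it on the GPU; as a language-neutral illustration of the underlying exact-match step only, the sketch below deliberately swaps in a simpler index, a suffix array queried by binary search on the CPU (Python 3.10+ for bisect's key argument). Alignment extension, chaining, and the GPU parallelism are all omitted.

```python
import bisect

def build_suffix_array(ref):
    """Naive O(n^2 log n) suffix array; adequate for a demonstration-sized reference."""
    return sorted(range(len(ref)), key=lambda i: ref[i:])

def find_exact_matches(ref, sa, query):
    """Return reference positions where the query occurs exactly (seed matches only)."""
    prefix = lambda i: ref[i:i + len(query)]
    lo = bisect.bisect_left(sa, query, key=prefix)    # suffixes sorted => prefixes sorted
    hi = bisect.bisect_right(sa, query, key=prefix)
    return sorted(sa[lo:hi])

ref = "ACGTACGTGACCTGACGT"
sa = build_suffix_array(ref)
for read in ("ACGT", "GACC", "TTTT"):
    print(read, find_exact_matches(ref, sa, read))    # [0, 4, 14], [8], []
```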
Rice-Map: a new-generation rice genome browser.
Wang, Jun; Kong, Lei; Zhao, Shuqi; Zhang, He; Tang, Liang; Li, Zhe; Gu, Xiaocheng; Luo, Jingchu; Gao, Ge
2011-03-30
The concurrent release of rice genome sequences for two subspecies (Oryza sativa L. ssp. japonica and Oryza sativa L. ssp. indica) facilitates rice studies at the whole genome level. Since the advent of high-throughput analysis, huge amounts of functional genomics data have been delivered rapidly, making an integrated online genome browser indispensable for scientists to visualize and analyze these data. Based on next-generation web technologies and high-throughput experimental data, we have developed Rice-Map, a novel genome browser for researchers to navigate, analyze and annotate rice genome interactively. More than one hundred annotation tracks (81 for japonica and 82 for indica) have been compiled and loaded into Rice-Map. These pre-computed annotations cover gene models, transcript evidences, expression profiling, epigenetic modifications, inter-species and intra-species homologies, genetic markers and other genomic features. In addition to these pre-computed tracks, registered users can interactively add comments and research notes to Rice-Map as User-Defined Annotation entries. By smoothly scrolling, dragging and zooming, users can browse various genomic features simultaneously at multiple scales. On-the-fly analysis for selected entries could be performed through dedicated bioinformatic analysis platforms such as WebLab and Galaxy. Furthermore, a BioMart-powered data warehouse "Rice Mart" is offered for advanced users to fetch bulk datasets based on complex criteria. Rice-Map delivers abundant up-to-date japonica and indica annotations, providing a valuable resource for both computational and bench biologists. Rice-Map is publicly accessible at http://www.ricemap.org/, with all data available for free downloading.
High throughput screening of CO2 solubility in aqueous monoamine solutions.
Porcheron, Fabien; Gibert, Alexandre; Mougin, Pascal; Wender, Aurélie
2011-03-15
Post-combustion Carbon Capture and Storage (CCS) technology is viewed as an efficient solution to reduce CO2 emissions from coal-fired power stations. In CCS, an aqueous amine solution is commonly used as a solvent to selectively capture CO2 from the flue gas. However, this process generates additional costs, mostly from the reboiler heat duty required to release the carbon dioxide from the loaded solvent solution. In this work, we present thermodynamic results of CO2 solubility in aqueous amine solutions from a 6-reactor High Throughput Screening (HTS) experimental device. This device is fully automated and designed to perform sequential injections of CO2 into stirred-cell reactors containing the solvent solutions. The gas pressure within each reactor is monitored as a function of time, and the resulting transient pressure curves are transformed into CO2 absorption isotherms. Solubility measurements are first performed on monoethanolamine, diethanolamine, and methyldiethanolamine aqueous solutions at T = 313.15 K. Experimental results are compared with existing data in the literature to validate the HTS device. In addition, a comprehensive thermodynamic model is used to represent CO2 solubility variations in different classes of amine structures over a wide range of thermodynamic conditions. This model is used to fit the experimental data and to calculate the cyclic capacity, which is a key parameter for CO2 process design. Solubility measurements are then performed on a set of 50 monoamines and cyclic capacities are extracted using the thermodynamic model, to assess the potential of these molecules for CO2 capture.
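A hedged sketch of the data reduction the entry above describes, i.e. turning each injection's transient pressure into absorbed moles and accumulating a loading curve, under an ideal-gas, constant-headspace assumption; the device's actual processing may differ and all numbers are illustrative.

```python
R = 8.314  # J / (mol K)

def absorption_isotherm(injections, v_gas_m3, temp_k, mol_amine):
    """injections: (pressure right after a CO2 shot, equilibrium pressure) in Pa.
    Returns (equilibrium pressure, cumulative loading in mol CO2 per mol amine)."""
    isotherm, n_absorbed = [], 0.0
    for p_start, p_eq in injections:
        n_absorbed += (p_start - p_eq) * v_gas_m3 / (R * temp_k)   # ideal gas law
        isotherm.append((p_eq, n_absorbed / mol_amine))
    return isotherm

# Illustrative run: 50 mL headspace, 313.15 K, 0.05 mol amine in the cell
shots = [(150e3, 20e3), (150e3, 45e3), (150e3, 90e3), (150e3, 130e3)]
for p_eq, loading in absorption_isotherm(shots, 50e-6, 313.15, 0.05):
    print(f"{p_eq / 1e3:6.1f} kPa  loading = {loading:.3f} mol CO2 / mol amine")
```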
Preliminary Assessment of Microwave Readout Multiplexing Factor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croce, Mark Philip; Koehler, Katrina Elizabeth; Rabin, Michael W.
2017-01-23
Ultra-high resolution microcalorimeter gamma spectroscopy is a new non-destructive assay technology for measurement of plutonium isotopic composition, with the potential to reduce total measurement uncertainty to a level competitive with destructive analysis methods [1-4]. Achieving this level of performance in practical applications requires not only the energy resolution now routinely achieved with transition-edge sensor microcalorimeter arrays (an order of magnitude better than for germanium detectors) but also high throughput. Microcalorimeter gamma spectrometers have not yet achieved detection efficiency and count rate capability that is comparable to germanium detectors, largely because of limits from existing readout technology. Microcalorimeter detectors must be operated at low temperature to achieve their exceptional energy resolution. Although the typical 100 mK operating temperatures can be achieved with reliable, cryogen-free systems, the cryogenic complexity and heat load from individual readout channels for large sensor arrays is prohibitive. Multiplexing is required for practical systems. The most mature multiplexing technology at present is time-division multiplexing (TDM) [3, 5-6]. In TDM, the sensor outputs are switched by applying bias current to one SQUID amplifier at a time. Transition-edge sensor (TES) microcalorimeter arrays as large as 256 pixels have been developed for X-ray and gamma-ray spectroscopy using TDM technology. Due to bandwidth limits and noise scaling, TDM is limited to a maximum multiplexing factor of approximately 32-40 sensors on one readout line [8]. Increasing the size of microcalorimeter arrays above the kilopixel scale, required to match the throughput of germanium detectors, requires the development of a new readout technology with a much higher multiplexing factor.
An adaptive density-based routing protocol for flying Ad Hoc networks
NASA Astrophysics Data System (ADS)
Zheng, Xueli; Qi, Qian; Wang, Qingwen; Li, Yongqiang
2017-10-01
An Adaptive Density-based Routing Protocol (ADRP) for Flying Ad Hoc Networks (FANETs) is proposed in this paper. The main objective is to calculate the forwarding probability adaptively in order to increase forwarding efficiency in FANETs. ADRP dynamically fine-tunes the rebroadcasting probability of a node for route request packets according to the number of neighbour nodes. Indeed, it is preferable to favour retransmission by nodes with few neighbours. We describe the protocol, implement it and evaluate its performance using the NS-2 network simulator. Simulation results reveal that ADRP achieves better performance in terms of packet delivery fraction, average end-to-end delay, normalized routing load, normalized MAC load and throughput, compared with AODV.
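ADRP's exact probability function is not given in the abstract above; the sketch below assumes a simple inverse-with-neighbour-count rule purely to illustrate the idea of favouring retransmission by sparsely connected nodes while damping redundant rebroadcasts in dense regions.

```python
def rebroadcast_probability(n_neighbors, p_min=0.1, p_max=1.0, k=4.0):
    """Assumed density-adaptive rule (not ADRP's published constants):
    fewer neighbours -> higher forwarding probability."""
    if n_neighbors <= 0:
        return p_max
    return max(p_min, min(p_max, k / n_neighbors))

for n in (1, 2, 4, 8, 16, 40):
    print(f"{n:2d} neighbours -> p = {rebroadcast_probability(n):.2f}")
```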
Sasagawa, Yohei; Danno, Hiroki; Takada, Hitomi; Ebisawa, Masashi; Tanaka, Kaori; Hayashi, Tetsutaro; Kurisaki, Akira; Nikaido, Itoshi
2018-03-09
High-throughput single-cell RNA-seq methods assign limited unique molecular identifier (UMI) counts as gene expression values to single cells from shallow sequence reads and detect limited gene counts. We thus developed a high-throughput single-cell RNA-seq method, Quartz-Seq2, to overcome these issues. Our improvements in the reaction steps make it possible to effectively convert initial reads to UMI counts, at a rate of 30-50%, and detect more genes. To demonstrate the power of Quartz-Seq2, we analyzed approximately 10,000 transcriptomes from in vitro embryonic stem cells and an in vivo stromal vascular fraction with a limited number of reads.
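Quartz-Seq2's pipeline internals are not reproduced here; as a generic sketch of the UMI-collapsing step that turns mapped reads into the UMI counts the entry above refers to, identical UMIs mapped to the same gene in the same cell are counted once.

```python
from collections import defaultdict

def umi_counts(read_records):
    """read_records: iterable of (cell_barcode, gene, umi) tuples from mapped reads.
    Returns {(cell, gene): number of distinct UMIs}, i.e. deduplicated molecule counts."""
    molecules = defaultdict(set)
    for cell, gene, umi in read_records:
        molecules[(cell, gene)].add(umi)
    return {key: len(umis) for key, umis in molecules.items()}

reads = [
    ("CELL01", "Nanog", "AAGT"), ("CELL01", "Nanog", "AAGT"),   # PCR duplicates
    ("CELL01", "Nanog", "CGTA"),
    ("CELL01", "Pou5f1", "TTAC"),
    ("CELL02", "Nanog", "AAGT"),
]
print(umi_counts(reads))
# {('CELL01', 'Nanog'): 2, ('CELL01', 'Pou5f1'): 1, ('CELL02', 'Nanog'): 1}
```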
Hattrick-Simpers, Jason R.; Gregoire, John M.; Kusne, A. Gilad
2016-05-26
With their ability to rapidly elucidate composition-structure-property relationships, high-throughput experimental studies have revolutionized how materials are discovered, optimized, and commercialized. It is now possible to synthesize and characterize high-throughput libraries that systematically address thousands of individual cuts of fabrication parameter space. An unresolved issue remains transforming structural characterization data into phase mappings. This difficulty is related to the complex information present in diffraction and spectroscopic data and its variation with composition and processing. Here, we review the field of automated phase diagram attribution and discuss the impact that emerging computational approaches will have in the generation of phase diagrams and beyond.
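A minimal sketch of one family of approaches reviews like the one above cover: factorizing a library of diffraction patterns into a small set of non-negative basis patterns (candidate phases) and per-sample weights, here with scikit-learn's NMF on synthetic two-phase data. The specific algorithms surveyed in the paper are not reproduced.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
two_theta = np.linspace(10, 80, 700)

def peak(center, width=0.3):
    """Gaussian stand-in for a diffraction peak."""
    return np.exp(-0.5 * ((two_theta - center) / width) ** 2)

# Two synthetic "phases" with distinct peak sets, mixed in varying fractions
phase_a = peak(28) + 0.6 * peak(47)
phase_b = peak(31) + 0.8 * peak(55)
fractions = rng.uniform(0, 1, size=(40, 1))
library = fractions * phase_a + (1 - fractions) * phase_b
library += 0.01 * rng.random(library.shape)            # non-negative noise

model = NMF(n_components=2, init="nndsvda", max_iter=500)
weights = model.fit_transform(library)   # (40, 2) per-sample phase abundances
basis = model.components_                # (2, 700) recovered basis patterns
print(weights.shape, basis.shape)
```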
Optimizing multi-dimensional high throughput screening using zebrafish
Truong, Lisa; Bugel, Sean M.; Chlebowski, Anna; Usenko, Crystal Y.; Simonich, Michael T.; Massey Simonich, Staci L.; Tanguay, Robert L.
2016-01-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. PMID:27453428
Combinatorial and high-throughput approaches in polymer science
NASA Astrophysics Data System (ADS)
Zhang, Huiqi; Hoogenboom, Richard; Meier, Michael A. R.; Schubert, Ulrich S.
2005-01-01
Combinatorial and high-throughput approaches have become topics of great interest in the last decade due to their potential ability to significantly increase research productivity. Recent years have witnessed a rapid extension of these approaches in many areas of the discovery of new materials including pharmaceuticals, inorganic materials, catalysts and polymers. This paper mainly highlights our progress in polymer research by using an automated parallel synthesizer, microwave synthesizer and ink-jet printer. The equipment and methodologies in our experiments, the high-throughput experimentation of different polymerizations (such as atom transfer radical polymerization, cationic ring-opening polymerization and emulsion polymerization) and the automated matrix-assisted laser desorption/ionization time-of-flight mass spectroscopy (MALDI-TOF MS) sample preparation are described.
Zador, Anthony M.; Dubnau, Joshua; Oyibo, Hassana K.; Zhan, Huiqing; Cao, Gang; Peikon, Ian D.
2012-01-01
Connectivity determines the function of neural circuits. Historically, circuit mapping has usually been viewed as a problem of microscopy, but no current method can achieve high-throughput mapping of entire circuits with single neuron precision. Here we describe a novel approach to determining connectivity. We propose BOINC (“barcoding of individual neuronal connections”), a method for converting the problem of connectivity into a form that can be read out by high-throughput DNA sequencing. The appeal of using sequencing is that its scale—sequencing billions of nucleotides per day is now routine—is a natural match to the complexity of neural circuits. An inexpensive high-throughput technique for establishing circuit connectivity at single neuron resolution could transform neuroscience research. PMID:23109909
High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME
NASA Astrophysics Data System (ADS)
Otis, Richard A.; Liu, Zi-Kui
2017-05-01
One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.
Guan, Yue Hugh; Hewitson, Peter; van den Heuvel, Remco N A M; Zhao, Yan; Siebers, Rick P G; Zhuang, Ying-Ping; Sutherland, Ian
2015-12-11
Manufacturing high-value-added biotech biopharmaceutical products (e.g. therapeutic proteins) requires quick-to-develop, GMP-compliant, easy-to-scale and cost-effective preparative chromatography technologies. In this work, we describe the construction and testing of a set of 5-mm inner diameter stainless steel toroidal columns for use on commercially available preparative-scale synchronous J-type counter-current chromatography (CCC) machinery. We used a 20.2 m long column with an aqueous two-phase system containing 14% (w/w) PEG1000 and 14% (w/w) potassium phosphate at pH 7, and tested a sample loading of 5% column volume and a mobile phase flow rate of 20 ml/min. We then satisfactorily demonstrated the potential for a weekly protein separation and preparation throughput of ca. 11 g, based on a normal weekly routine for separating a pair of model proteins, by making five stacked injections on a single portion of stationary phase with no stripping. Compared to our previous 1.6 mm bore PTFE toroidal column, the present columns enlarged the nominal column processing throughput by nearly 10-fold. For an ideal model protein injection modality, we observed a scaling-up factor of at least 21. The two scales of protein separation and purification were realized on the same commercial CCC device. Copyright © 2015 Elsevier B.V. All rights reserved.
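The "nearly 10" nominal throughput scale-up quoted above is consistent with the ratio of column cross-sectional areas between the 5 mm and 1.6 mm bores; a quick check:

```python
# Cross-sectional area scales with the square of the bore diameter.
d_new, d_old = 5.0, 1.6                 # inner diameters, mm
print(round((d_new / d_old) ** 2, 1))   # ~9.8, i.e. "nearly 10"
```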
Factor analysis and predictive validity of microcomputer-based tests
NASA Technical Reports Server (NTRS)
Kennedy, R. S.; Baltzley, D. R.; Turnage, J. J.; Jones, M. B.
1989-01-01
Eleven tests were selected from two microcomputer-based performance test batteries because these tests had previously exhibited rapid stability (less than 10 min of practice) and high retest reliability efficiencies (r greater than 0.707 for each 3 min of testing). The battery was administered three times to each of 108 college students (48 men and 60 women) and a factor analysis was performed. Two of the three identified factors appear to be related to information processing ("encoding" and "throughput/decoding"), and the third was named an "output/speed" factor. The spatial, memory, and verbal tests loaded on the "encoding" factor and included Grammatical Reasoning, Pattern Comparison, Continuous Recall, and Matrix Rotation. The "throughput/decoding" tests included perceptual/numerical tests such as Math Processing, Code Substitution, and Pattern Comparison. The output/speed factor was identified by the Tapping and Reaction Time tests. The Wonderlic Personnel Test was group administered before the first and after the last administration of the performance tests. In the total sample, the multiple Rs between the combined Wonderlic as a criterion and less than 5 min of microcomputer testing on Grammatical Reasoning and Math Processing as predictors ranged between 0.41 and 0.52 across the three test administrations. Based on these results, the authors recommend a core battery which, if time permits, would consist of two tests from each factor. Such a battery is now known to permit stable, reliable, and efficient assessment.
Real time network traffic monitoring for wireless local area networks based on compressed sensing
NASA Astrophysics Data System (ADS)
Balouchestani, Mohammadreza
2017-05-01
A wireless local area network (WLAN) is an important type of wireless network that connects different wireless nodes within a local area. WLANs suffer from important problems such as network load imbalance, high energy consumption, and heavy sampling loads. This paper presents a new network traffic approach based on Compressed Sensing (CS) for improving the quality of WLANs. The proposed architecture reduces the Data Delay Probability (DDP) to 15%, which is a good result for WLANs. The proposed architecture increases Data Throughput (DT) by 22% and the Signal-to-Noise (S/N) ratio by 17%, providing a good foundation for establishing high-quality local area networks. This architecture enables continuous data acquisition and compression of WLAN signals and is suitable for a variety of other wireless networking applications. At the transmitter side of each wireless node, an analog-CS framework is applied at the sensing step, before the analog-to-digital converter, in order to generate a compressed version of the input signal. At the receiver side of each wireless node, a reconstruction algorithm is applied in order to reconstruct the original signals from the compressed signals with high probability and sufficient accuracy. The proposed algorithm outperforms existing algorithms by achieving a good level of Quality of Service (QoS). This ability reduces the Bit Error Rate (BER) at each wireless node by 15%.
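As a generic illustration of the compressed-sensing principle the architecture above relies on (not its specific analog-CS front end or the reported WLAN metrics), the sketch below compresses a synthetic sparse signal with a random Gaussian measurement matrix and reconstructs it at the receiver with orthogonal matching pursuit.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
n, m, k = 256, 64, 8          # signal length, compressed measurements, sparsity

# Sparse "traffic" signal and a random Gaussian sensing matrix (4x compression)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
phi = rng.normal(0, 1 / np.sqrt(m), size=(m, n))
y = phi @ x                   # compressed samples transmitted over the WLAN

# Receiver-side sparse reconstruction from the compressed measurements
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(phi, y)
x_hat = omp.coef_
print("relative reconstruction error:",
      round(np.linalg.norm(x - x_hat) / np.linalg.norm(x), 4))
```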
Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S
2015-11-01
High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately but throughput is still a major issue and automation is very essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome-aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.
AIRNET: A real-time communications network for aircraft
NASA Technical Reports Server (NTRS)
Weaver, Alfred C.; Cain, Brendan G.; Colvin, M. Alexander; Simoncic, Robert
1990-01-01
A real-time local area network was developed for use on aircraft and space vehicles. It uses token ring technology to provide high throughput, low latency, and high reliability. The system was implemented on PCs and PC/ATs operating on PCbus, and on Intel 8086/186/286/386s operating on Multibus. A standard IEEE 802.2 logical link control interface was provided to (optional) upper layer software; this permits the controls designer to utilize standard communications protocols (e.g., ISO, TCP/IP) if time permits, or to utilize a very fast link level protocol directly if speed is critical. Both unacknowledged datagram and reliable virtual circuit services are supported. A station operating an 8 MHz Intel 286 as a host can generate a sustained load of 1.8 megabits per second per station, and a 100-byte message can be delivered from the transmitter's user memory to the receiver's user memory, including all operating system and network overhead, in under 4 milliseconds.
A fast, programmable hardware architecture for the processing of spaceborne SAR data
NASA Technical Reports Server (NTRS)
Bennett, J. R.; Cumming, I. G.; Lim, J.; Wedding, R. M.
1984-01-01
The development of high-throughput SAR processors (HTSPs) for the spaceborne SARs being planned by NASA, ESA, DFVLR, NASDA, and the Canadian Radarsat Project is discussed. The basic parameters and data-processing requirements of the SARs are listed in tables, and the principal problems are identified as real-operations rates in excess of 2 x 10^9/sec, I/O rates in excess of 8 x 10^6 samples/sec, and control computation loads (as for range cell migration correction) as high as 1.4 x 10^6 instructions/sec. A number of possible HTSP architectures are reviewed; host/array-processor (H/AP) and distributed-control/data-path (DCDP) architectures are examined in detail and illustrated with block diagrams; and a cost/speed comparison of these two architectures is presented. The H/AP approach is found to be adequate and economical for speeds below 1/200 of real time, while DCDP is more cost-effective above 1/50 of real time.
Validating the Airspace Concept Evaluation System for Different Weather Days
NASA Technical Reports Server (NTRS)
Zelinski, Shannon; Meyn, Larry
2006-01-01
This paper extends the process for validating the Airspace Concept Evaluation System (ACES) using real-world historical flight operational data. System inputs, such as flight plans and airport and en-route capacities, are generated and processed to create a realistic reproduction of a single day's operations within the National Airspace System. System outputs, such as airport throughput, delays, and en-route sector loads, are then compared to real-world operational metrics and delay statistics for the reproduced day. The process is repeated for four historical days with high and low traffic volume and delay attributed to weather. These four days are simulated using both default en-route capacities and variable en-route capacities used to emulate weather. The validation results show that default en-route capacity simulations are closer to real-world data for low-weather days than for high-weather days. The use of reduced variable en-route capacities adds a large delay bias to ACES, but delay trends between weather days are better represented.
Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis.
Wen, Na; Zhao, Zhan; Fan, Beiyuan; Chen, Deyong; Men, Dong; Wang, Junbo; Chen, Jian
2016-07-05
This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.
The Stochastic Human Exposure and Dose Simulation Model – High-Throughput (SHEDS-HT) is a U.S. Environmental Protection Agency research tool for predicting screening-level (low-tier) exposures to chemicals in consumer products. This course will present an overview of this m...
Chemical perturbation of vascular development is a putative toxicity pathway which may result in developmental toxicity. EPA’s high-throughput screening (HTS) ToxCast program contains assays which measure cellular signals and biological processes critical for blood vessel develop...
The past five years have witnessed a rapid shift in the exposure science and toxicology communities towards high-throughput (HT) analyses of chemicals as potential stressors of human and ecological health. Modeling efforts have largely led the charge in the exposure science field...
Forecasting Exposure in Order to Use High Throughput Hazard Data in a Risk-based Context (WC9)
The ToxCast program and Tox21 consortium have evaluated over 8000 chemicals using in vitro high-throughput screening (HTS) to identify potential hazards. Complementary exposure science needed to assess risk, and the U.S. Environmental Protection Agency (EPA)’s ExpoCast initiative...
An industrial engineering approach to laboratory automation for high throughput screening
Menke, Karl C.
2000-01-01
Across the pharmaceutical industry, there are a variety of approaches to laboratory automation for high throughput screening. At Sphinx Pharmaceuticals, the principles of industrial engineering have been applied to systematically identify and develop those automated solutions that provide the greatest value to the scientists engaged in lead generation. PMID:18924701
The U.S. Environmental Protection Agency’s ToxCast program has screened thousands of chemicals for biological activity, primarily using high-throughput in vitro bioassays. Adverse outcome pathways (AOPs) offer a means to link pathway-specific biological activities with pote...
Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...
The Environmental Protection Agency has implemented a high throughput screening program, ToxCast, to quickly evaluate large numbers of chemicals for their effects on hundreds of different biological targets. To understand how these measurements relate to adverse effects in an or...
USDA-ARS?s Scientific Manuscript database
The rapid advancement in high-throughput SNP genotyping technologies along with next generation sequencing (NGS) platforms has decreased the cost, improved the quality of large-scale genome surveys, and allowed specialty crops with limited genomic resources such as carrot (Daucus carota) to access t...
The U.S. EPA’s Endocrine Disruptor Screening Program (EDSP) and Office of Research and Development (ORD) are currently developing high throughput assays to screen chemicals that may alter the thyroid hormone pathway. One potential target in this pathway is the sodium iodide...
Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics (TK). While HTS generates in vitro bioactivity d...
Inhibition of Retinoblastoma Protein Inactivation
2017-11-01
Subject terms/keywords: cell cycle, Retinoblastoma (Rb) protein and pathway, E2F transcription factor, cancer, cell-cycle inhibition, high-throughput screening, fragment-based screening, drug discovery, X-ray crystallography.
High Throughput Sequence Analysis for Disease Resistance in Maize
USDA-ARS?s Scientific Manuscript database
Preliminary results of a computational analysis of high throughput sequencing data from Zea mays and the fungus Aspergillus are reported. The Illumina Genome Analyzer was used to sequence RNA samples from two strains of Z. mays (Va35 and Mp313) collected over a time course as well as several specie...
High Throughput Prioritization for Integrated Toxicity Testing Based on ToxCast Chemical Profiling
The rational prioritization of chemicals for integrated toxicity testing is a central goal of the U.S. EPA’s ToxCast™ program (http://epa.gov/ncct/toxcast/). ToxCast includes a wide-ranging battery of over 500 in vitro high-throughput screening assays which in Phase I was used to...
The focus of this meeting is the SAP's review and comment on the Agency's proposed high-throughput computational model of androgen receptor pathway activity as an alternative to the current Tier 1 androgen receptor assay (OCSPP 890.1150: Androgen Receptor Binding Rat Prostate Cyt...
Evaluation of High-throughput Genotoxicity Assays Used in Profiling the US EPA ToxCast Chemicals
Three high-throughput screening (HTS) genotoxicity assays-GreenScreen HC GADD45a-GFP (Gentronix Ltd.), CellCiphr p53 (Cellumen Inc.) and CellSensor p53RE-bla (Invitrogen Corp.)-were used to analyze the collection of 320 predominantly pesticide active compounds being tested in Pha...
The US EPA’s ToxCastTM program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
Collaborative Core Research Program for Chemical-Biological Warfare Defense
2015-01-04
Discovery through High Throughput Screening (HTS) and Fragment-Based Drug Design (FBDD). Current pharmaceutical approaches involving drug discovery ... structural analysis and docking programs generally known as fragment-based drug design (FBDD). The main advantage of using these approaches is that