Software Voting in Asynchronous NMR (N-Modular Redundancy) Computer Structures.
1983-05-06
added reliability is exchanged for increased system cost and decreased throughput. Some applications require extremely reliable systems, so the only...not the other way around. Although no systems provide abstract voting yet, as more applications are written for NMR systems, the programmers are going...throughput goes down, the overhead goes up. Mathematically: Overhead = Non-redundant Throughput − Actual Throughput (1). In this section, the actual throughput
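Equation (1) is a simple difference; a minimal numeric sketch follows, where the throughput figures are invented for illustration and are not taken from the paper:

```python
def overhead(non_redundant_throughput, actual_throughput):
    # Eq. (1): overhead = non-redundant throughput - actual (voted) throughput
    return non_redundant_throughput - actual_throughput

# Hypothetical redundant system whose software voter costs 18% of the
# non-redundant throughput (both values assumed, e.g. in operations/s):
print(overhead(1000.0, 820.0))  # -> 180.0
```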
Uplink Downlink Rate Balancing and Throughput Scaling in FDD Massive MIMO Systems
NASA Astrophysics Data System (ADS)
Bergel, Itsik; Perets, Yona; Shamai, Shlomo
2016-05-01
In this work we extend the concept of uplink-downlink rate balancing to frequency division duplex (FDD) massive MIMO systems. We consider a base station with a large number of antennas serving many single-antenna users. We first show that any unused capacity in the uplink can be traded off for higher throughput in the downlink in a system that uses either dirty paper (DP) coding or linear zero-forcing (ZF) precoding. We then also study the scaling of the system throughput with the number of antennas in cases of linear beamforming (BF) precoding, ZF precoding, and DP coding. We show that the downlink throughput is proportional to the logarithm of the number of antennas. While this logarithmic scaling is lower than the linear scaling of the rate in the uplink, it can still bring significant throughput gains. For example, we demonstrate through analysis and simulation that increasing the number of antennas from 4 to 128 will increase the throughput by more than a factor of 5. We also show that a logarithmic scaling of downlink throughput as a function of the number of receive antennas can be achieved even when the number of transmit antennas increases only logarithmically with the number of receive antennas.
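The logarithmic scaling described above can be sketched with an illustrative ZF-style sum-rate model; the rate expression, the constant c, and the user count K below are assumptions for illustration only, not the paper's actual formulas, and no attempt is made to reproduce its numerical results:

```python
import math

def downlink_sum_rate(num_antennas, num_users=4, c=1.0):
    # Illustrative ZF-style model: each of K users sees an effective SNR that
    # grows with the excess antennas (M - K + 1), so the sum rate grows
    # roughly as K * log2(M). Requires M >= K.
    m, k = num_antennas, num_users
    return k * math.log2(1 + c * (m - k + 1))

# Sum rate grows with the logarithm of the antenna count:
for m in (4, 16, 64, 128):
    print(m, round(downlink_sum_rate(m), 1))
```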
The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio).
Usui, Takuji; Noble, Daniel W A; O'Dea, Rose E; Fangmeier, Melissa L; Lagisz, Malgorzata; Hesselson, Daniel; Nakagawa, Shinichi
2018-01-01
Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34-0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes. PMID: 29372124
NASA Astrophysics Data System (ADS)
Kudoh, Eisuke; Ito, Haruki; Wang, Zhisen; Adachi, Fumiyuki
In mobile communication systems, high-speed packet data services are in demand. In high-speed data transmission, throughput degrades severely due to inter-path interference (IPI). Recently, we proposed random transmit power control (TPC) to increase the uplink throughput of DS-CDMA packet mobile communications. In this paper, we apply IPI cancellation in addition to the random TPC. We derive the numerical expression for the received signal-to-interference plus noise power ratio (SINR) and introduce an IPI cancellation factor. We also derive the numerical expression for system throughput when IPI is cancelled ideally, to compare with the Monte Carlo numerically evaluated system throughput. We then evaluate, by the Monte Carlo numerical computation method, the combined effect of random TPC and IPI cancellation on the uplink throughput of DS-CDMA packet mobile communications.
Polonchuk, Liudmila
2014-01-01
Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using the planar patch-clamp technology demonstrated a rather moderate throughput, a few second-generation automated platforms recently launched by various companies have a significantly increased ability to form a high number of high-resistance seals. Among them is SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch(®) 96 substantially increases throughput without compromising data quality. This chapter describes features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.
Adaptive data rate capacity of meteor-burst communications
NASA Astrophysics Data System (ADS)
Larsen, J. D.; Melville, S. W.; Mawrey, R. S.
The use of adaptive data rates in the meteor-burst communications environment is investigated. Measured results obtained from a number of meteor links are presented and compared with previous theoretical predictions. The contribution of various meteor trail families to throughput capacity is also investigated. The results show that the use of adaptive data rates can significantly increase the throughput capacity of meteor-burst communication systems. The greatest rate of increase in throughput with increase in operating rate is found at low operating rates. This finding has been confirmed for a variety of links and days. Reasonable correspondence is obtained between the predicted modified overdense model and the observed results. Overdense trails, in particular two trail types within the overdense family, are shown to dominate adaptive data throughput.
NASA Technical Reports Server (NTRS)
Giulianetti, Demo J.
2001-01-01
Ground and airborne technologies were developed in the Terminal Area Productivity (TAP) project for increasing throughput at major airports by safely maintaining good-weather operating capacity during bad weather. Methods were demonstrated for accurately predicting vortices to prevent wake-turbulence encounters and to reduce in-trail separation requirements for aircraft approaching the same runway for landing. Technology was demonstrated that safely enabled independent simultaneous approaches in poor weather conditions to parallel runways spaced less than 3,400 ft apart. Guidance, control, and situation-awareness systems were developed to reduce congestion in airport surface operations resulting from the increased throughput, particularly during night and instrument meteorological conditions (IMC). These systems decreased runway occupancy time by safely and smoothly decelerating the aircraft, increasing taxi speed, and safely steering the aircraft off the runway. Simulations were performed in which optimal trajectories were determined by air traffic control (ATC) and communicated to flight crews by means of Center TRACON Automation System/Flight Management System (CTAS/FMS) automation to reduce flight delays, increase throughput, and ensure flight safety.
Noyes, Aaron; Huffman, Ben; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Sunasara, Khurram; Mukhopadhyay, Tarit
2015-08-01
The biotech industry is under increasing pressure to decrease both time to market and development costs. Simultaneously, regulators are expecting increased process understanding. High throughput process development (HTPD) employs small volumes, parallel processing, and high throughput analytics to reduce development costs and speed the development of novel therapeutics. As such, HTPD is increasingly viewed as integral to improving developmental productivity and deepening process understanding. Particle conditioning steps such as precipitation and flocculation may be used to aid the recovery and purification of biological products. In this first of two articles, we describe an ultra scale-down (USD) system for high throughput particle conditioning (HTPC) composed of off-the-shelf components. The apparatus comprises a temperature-controlled microplate with magnetically driven stirrers, integrated with a Tecan liquid handling robot. With this system, 96 individual reaction conditions can be evaluated in parallel, including downstream centrifugal clarification. A comprehensive suite of high throughput analytics enables measurement of product titer, product quality, impurity clearance, clarification efficiency, and particle characterization. HTPC at the 1 mL scale was evaluated with fermentation broth containing a vaccine polysaccharide. The response profile was compared with the pilot-scale performance of a non-geometrically similar, 3 L reactor. An engineering characterization of the reactors and scale-up context examines theoretical considerations for comparing this USD system with larger scale stirred reactors. In the second paper, we will explore application of this system to industrially relevant vaccines and test different scale-up heuristics. © 2015 Wiley Periodicals, Inc.
Increasing throughput of multiplexed electrical bus in pipe-lined architecture
Asaad, Sameh; Brezzo, Bernard V; Kapur, Mohit
2014-05-27
Techniques are disclosed for increasing the throughput of a multiplexed electrical bus by exploiting available pipeline stages of a computer or other system. For example, a method for increasing a throughput of an electrical bus that connects at least two devices in a system comprises introducing at least one signal hold stage in a signal-receiving one of the two devices, such that a maximum frequency at which the two devices are operated is not limited by a number of cycles of an operating frequency of the electrical bus needed for a signal to propagate from a signal-transmitting one of the two devices to the signal-receiving one of the two devices. Preferably, the signal hold stage introduced in the signal-receiving one of the two devices is a pipeline stage re-allocated from the signal-transmitting one of the two devices.
A high-throughput method for GMO multi-detection using a microfluidic dynamic array.
Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J
2014-02-01
The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organism (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly as the number of GMOs potentially present in an individual sample grows. The present work presents the results of an innovative approach to the analysis of genetically modified crops by DNA-based methods: the use of a microfluidic dynamic array as a high-throughput multi-detection system. In order to evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed in the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested produced the expected results. The power of low-level detection was assessed, and elements present at concentrations as low as 0.06% were successfully detected. The approach proposed in this work presents the Fluidigm system as a suitable and promising platform for GMO multi-detection.
Monolithic amorphous silicon modules on continuous polymer substrate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimmer, D.P.
This report examines manufacturing monolithic amorphous silicon modules on a continuous polymer substrate. Module production costs can be reduced by increasing module performance, expanding production, and improving and modifying production processes. Material costs can be reduced by developing processes that use a 1-mil polyimide substrate and multilayers of low-cost material for the front encapsulant. Research to speed up a-Si and ZnO deposition rates is needed to improve throughputs. To keep throughput rates compatible with depositions, multibeam fiber-optic delivery systems for laser scribing can be used. However, mechanical scribing systems promise even higher throughputs. Tandem cells and production experience can increase device efficiency and stability. Two alternative manufacturing processes are described: (1) wet etching and sheet handling and (2) wet etching and roll-to-roll fabrication.
Microscale screening systems for 3D cellular microenvironments: platforms, advances, and challenges
Montanez-Sauri, Sara I.; Beebe, David J.; Sung, Kyung Eun
2015-01-01
The increasing interest in studying cells using more in vivo-like three-dimensional (3D) microenvironments has created a need for advanced 3D screening platforms with enhanced functionalities and increased throughput. 3D screening platforms that better mimic in vivo microenvironments with enhanced throughput would provide more in-depth understanding of the complexity and heterogeneity of microenvironments. The platforms would also better predict the toxicity and efficacy of potential drugs in physiologically relevant conditions. Traditional 3D culture models (e.g. spinner flasks, gyratory rotation devices, non-adhesive surfaces, polymers) were developed to create 3D multicellular structures. However, these traditional systems require large volumes of reagents and cells, and are not compatible with high throughput screening (HTS) systems. Microscale technology offers the miniaturization of 3D cultures and allows efficient screening of various conditions. This review will discuss the development, most influential works, and current advantages and challenges of microscale culture systems for screening cells in 3D microenvironments. PMID:25274061
Study of data I/O performance on distributed disk system in mask data preparation
NASA Astrophysics Data System (ADS)
Ohara, Shuichiro; Odaira, Hiroyuki; Chikanaga, Tomoyuki; Hamaji, Masakazu; Yoshioka, Yasuharu
2010-09-01
Data volume is getting larger every day in Mask Data Preparation (MDP). In the meantime, faster data handling is always required. An MDP flow typically introduces a Distributed Processing (DP) system to meet this demand, because using hundreds of CPUs is a reasonable solution. However, even if the number of CPUs were increased, the throughput might saturate because hard disk I/O and network speeds can become bottlenecks. MDP therefore requires substantial investment not only in hundreds of CPUs but also in the storage and network devices that sustain throughput. NCS would like to introduce a new distributed processing system called "NDE". NDE is a distributed disk system that improves throughput without a large investment, because it is designed to use multiple conventional hard drives appropriately over the network. In this paper, NCS studies I/O performance with the OASIS® data format on NDE, which contributes to realizing high throughput.
Wang, Xixian; Ren, Lihui; Su, Yetian; Ji, Yuetong; Liu, Yaoping; Li, Chunyu; Li, Xunrong; Zhang, Yi; Wang, Wei; Hu, Qiang; Han, Danxiang; Xu, Jian; Ma, Bo
2017-11-21
Raman-activated cell sorting (RACS) has attracted increasing interest, yet throughput remains one major factor limiting its broader application. Here we present an integrated Raman-activated droplet sorting (RADS) microfluidic system for functional screening of live cells in a label-free and high-throughput manner, employing the astaxanthin (AXT)-synthesizing industrial microalga Haematococcus pluvialis (H. pluvialis) as a model. Raman microspectroscopy analysis of individual cells is carried out prior to their microdroplet encapsulation, which is then directly coupled to DEP-based droplet sorting. To validate the system, H. pluvialis cells containing different levels of AXT were mixed and underwent RADS. The AXT-hyperproducing cells were sorted with an accuracy of 98.3%, an eightfold enrichment ratio, and a throughput of ∼260 cells/min. Of the RADS-sorted cells, 92.7% remained alive and able to proliferate, equivalent to the unsorted cells. Thus, RADS achieves a much higher throughput than existing RACS systems, preserves the vitality of cells, and facilitates seamless coupling with downstream manipulations such as single-cell sequencing and cultivation.
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
NASA Astrophysics Data System (ADS)
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data, such as spectral analysis and histograms, to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM systems-on-chip, but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given, and the results of performance and throughput testing of four different ARM Cortex systems-on-chip are presented.
Extended length microchannels for high density high throughput electrophoresis systems
Davidson, James C.; Balch, Joseph W.
2000-01-01
High throughput electrophoresis systems which provide extended well-to-read distances on smaller substrates, thus compacting the overall systems. The electrophoresis systems utilize a high density array of microchannels for electrophoresis analysis with extended read lengths. The microchannel geometry can be used individually or in conjunction to increase the effective length of a separation channel while minimally impacting the packing density of channels. One embodiment uses sinusoidal microchannels, while another embodiment uses plural microchannels interconnected by a via. The extended channel systems can be applied to virtually any type of channel confined chromatography.
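The gain in effective separation length from a sinusoidal channel can be estimated from its arc length; the amplitude and period below are arbitrary example values, not dimensions from the patent:

```python
import math

def sinusoid_path_length(amplitude, period, span, n=100_000):
    # Numerically integrate the arc length of y = A*sin(2*pi*x/P) over
    # [0, span] with the midpoint rule: ds = sqrt(1 + (dy/dx)^2) dx.
    k = 2 * math.pi / period
    dx = span / n
    return sum(
        math.sqrt(1 + (amplitude * k * math.cos(k * (i + 0.5) * dx)) ** 2) * dx
        for i in range(n)
    )

straight = 10.0  # straight-line footprint occupied on the substrate (arbitrary units)
wavy = sinusoid_path_length(amplitude=0.5, period=1.0, span=straight)
print(round(wavy / straight, 2))  # effective length gained per unit of footprint
```

With these example dimensions the channel packs more than twice its footprint's length of separation path, which is the packing-density advantage the patent describes.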
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (R dark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of R dark is essential for agronomic and ecological studies. However, the methods currently used to measure R dark in plant tissues are typically low-throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry, to determine accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of R dark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of R dark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant R dark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute R dark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided similar values of R dark to the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of R dark on multiple samples simultaneously, irrespective of plant or tissue type.
Tiersch, Terrence R.; Yang, Huiping; Hu, E.
2011-01-01
With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs of high-throughput technology for model fishes, proposes approaches for technical development, and overviews future directions of this approach. PMID: 21440666
A High-Throughput Processor for Flight Control Research Using Small UAVs
NASA Technical Reports Server (NTRS)
Klenke, Robert H.; Sleeman, W. C., IV; Motter, Mark A.
2006-01-01
There are numerous autopilot systems that are commercially available for small (<100 lbs) UAVs. However, they all share several key disadvantages for conducting aerodynamic research, chief amongst which is the fact that most utilize older, slower, 8- or 16-bit microcontroller technologies. This paper describes the development and testing of a flight control system (FCS) for small UAVs based on a modern, high-throughput, embedded processor. In addition, this FCS platform contains user-configurable hardware resources in the form of a Field Programmable Gate Array (FPGA) that can be used to implement custom, application-specific hardware. This hardware can be used to off-load routine tasks, such as sensor data collection, from the FCS processor, thereby further increasing the computational throughput of the system.
Mac Kinnon, Michael; Heydarzadeh, Zahra; Doan, Quy; Ngo, Cuong; Reed, Jeff; Brouwer, Jacob
2018-05-17
Accurate quantification of methane emissions from the natural gas system is important for establishing greenhouse gas inventories and understanding cause and effect for reducing emissions. Current carbon intensity methods generally assume methane emissions are proportional to gas throughput, so that increases in gas consumption yield linear increases in emitted methane. However, emissions sources are diverse and many are not proportional to throughput. Insights into the causal drivers of system methane emissions, and how system-wide changes affect such drivers, are required. The development of a novel cause-based methodology to assess marginal methane emissions per unit of fuel consumed is introduced. The carbon intensities of technologies consuming natural gas are critical metrics currently used in policy decisions for reaching environmental goals. For example, the low-carbon fuel standard in California uses carbon intensity to determine the incentives provided. Current methods generally assume methane emissions from the natural gas system are completely proportional to throughput. The proposed cause-based marginal emissions method will provide a better understanding of the actual drivers of emissions to support development of more effective mitigation measures. Additionally, increasing the accuracy of carbon intensity calculations supports the development of policies that can maximize the environmental benefits of alternative fuels, including reducing greenhouse gas emissions.
Performance analysis of Aloha networks with power capture and near/far effect
NASA Astrophysics Data System (ADS)
McCartin, Joseph T.
1989-06-01
An analysis is presented for the throughput characteristics for several classes of Aloha packet networks. Specifically, the throughput for variable packet length Aloha utilizing multiple power levels to induce receiver capture is derived. The results are extended to an analysis of a selective-repeat ARQ Aloha network. Analytical results are presented which indicate a significant increase in throughput for a variable packet network implementing a random two power level capture scheme. Further research into the area of the near/far effect on Aloha networks is included. Improvements in throughput for mobile radio Aloha networks which are subject to the near/far effect are presented. Tactical Command, Control and Communications (C3) systems of the future will rely on Aloha ground mobile data networks. The incorporation of power capture and the near/far effect into future tactical networks will result in improved system analysis, design, and performance.
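For context, the capture-free baselines against which such gains are measured are the classical Aloha throughput curves S = G·e^(−2G) (pure) and S = G·e^(−G) (slotted); a quick sketch of these standard formulas follows (the capture and variable-packet-length models analyzed in the paper are not reproduced here):

```python
import math

def pure_aloha_throughput(G):
    # S = G * exp(-2G): peaks at 1/(2e) ~ 0.184 when offered load G = 0.5
    return G * math.exp(-2 * G)

def slotted_aloha_throughput(G):
    # S = G * exp(-G): peaks at 1/e ~ 0.368 when offered load G = 1.0
    return G * math.exp(-G)

print(round(pure_aloha_throughput(0.5), 3),
      round(slotted_aloha_throughput(1.0), 3))  # -> 0.184 0.368
```

Capture schemes such as the two-power-level scheme in the paper raise throughput above these ceilings by letting the receiver decode the stronger of two colliding packets.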
Quesada-Cabrera, Raul; Weng, Xiaole; Hyett, Geoff; Clark, Robin J H; Wang, Xue Z; Darr, Jawwad A
2013-09-09
High-throughput continuous hydrothermal flow synthesis was used to manufacture 66 unique nanostructured oxide samples in the Ce-Zr-Y-O system. This synthesis approach resulted in a significant increase in throughput compared to that of conventional batch or continuous hydrothermal synthesis methods. The as-prepared library samples were placed into a wellplate for both automated high-throughput powder X-ray diffraction and Raman spectroscopy data collection, which allowed comprehensive structural characterization and phase mapping. The data suggested that a continuous cubic-like phase field connects all three Ce-Zr-O, Ce-Y-O, and Y-Zr-O binary systems together with a smooth and steady transition between the structures of neighboring compositions. The continuous hydrothermal process led to as-prepared crystallite sizes in the range of 2-7 nm (as determined by using the Scherrer equation).
Break-up of droplets in a concentrated emulsion flowing through a narrow constriction
NASA Astrophysics Data System (ADS)
Kim, Minkyu; Rosenfeld, Liat; Tang, Sindy; Tang Lab Team
2014-11-01
Droplet microfluidics has enabled a wide range of high throughput screening applications. Compared with other technologies such as robotic screening technology, droplet microfluidics has 1000 times higher throughput, which makes the technology one of the most promising platforms for the ultrahigh throughput screening applications. Few studies have considered the throughput of the droplet interrogation process, however. In this research, we show that the probability of break-up increases with increasing flow rate, entrance angle to the constriction, and size of the drops. Since single drops do not break at the highest flow rate used in the system, break-ups occur primarily from the interactions between highly packed droplets close to each other. Moreover, the probabilistic nature of the break-up process arises from the stochastic variations in the packing configuration. Our results can be used to calculate the maximum throughput of the serial interrogation process. For 40 pL-drops, the highest throughput with less than 1% droplet break-up was measured to be approximately 7,000 drops per second. In addition, the results are useful for understanding the behavior of concentrated emulsions in applications such as mobility control in enhanced oil recovery.
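The quoted interrogation rate implies a volumetric throughput; this back-of-the-envelope conversion uses only the figures in the abstract (40 pL drops at roughly 7,000 drops per second):

```python
drops_per_second = 7_000   # max rate with <1% break-up (from the abstract)
drop_volume_pl = 40        # drop volume in picolitres (from the abstract)

ul_per_second = drops_per_second * drop_volume_pl * 1e-6  # pL/s -> uL/s
ml_per_hour = ul_per_second * 3600 / 1000                 # uL/s -> mL/h

print(f"{ul_per_second:.2f} uL/s (~{ml_per_hour:.2f} mL/h)")  # -> 0.28 uL/s (~1.01 mL/h)
```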
Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S
2015-11-01
High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited, both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOPs) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards critically needed increases in throughput. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heusinkveld, Harm J.; Westerink, Remco H.S., E-mail: R.Westerink@uu.nl
Calcium plays a crucial role in virtually all cellular processes, including neurotransmission. The intracellular Ca²⁺ concentration ([Ca²⁺]i) is therefore an important readout in neurotoxicological and neuropharmacological studies. Consequently, there is an increasing demand for high-throughput measurements of [Ca²⁺]i, e.g. using multi-well microplate readers, in hazard characterization, human risk assessment and drug development. However, changes in [Ca²⁺]i are highly dynamic, thereby creating challenges for high-throughput measurements. Nonetheless, several protocols are now available for real-time kinetic measurement of [Ca²⁺]i in plate reader systems, though the results of such plate reader-based measurements have been questioned. In view of the increasing use of plate reader systems for measurements of [Ca²⁺]i, a careful evaluation of current technologies is warranted. We therefore performed an extensive set of experiments, using two cell lines (PC12 and B35) and two fluorescent calcium-sensitive dyes (Fluo-4 and Fura-2), for comparison of a linear plate reader system with single-cell fluorescence microscopy. Our data demonstrate that the use of plate reader systems for high-throughput real-time kinetic measurements of [Ca²⁺]i is associated with many pitfalls and limitations, including erroneous sustained increases in fluorescence, limited sensitivity and lack of single-cell resolution. Additionally, our data demonstrate that probenecid, which is often used to prevent dye leakage, effectively inhibits the depolarization-evoked increase in [Ca²⁺]i. Overall, the data indicate that the use of current plate reader-based strategies for high-throughput real-time kinetic measurements of [Ca²⁺]i is associated with caveats and limitations that require further investigation.
Research Highlights: The use of plate readers for high-throughput screening of intracellular Ca²⁺ is associated with many pitfalls and limitations. Single-cell fluorescence microscopy is recommended for measurements of intracellular Ca²⁺. Dual-wavelength dyes (Fura-2) are preferred over single-wavelength dyes (Fluo-4) for measurements of intracellular Ca²⁺. Probenecid prevents dye leakage but abolishes depolarization-evoked Ca²⁺ influx, severely hampering measurements of Ca²⁺. In general, care should be taken when interpreting data from high-throughput kinetic measurements.
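The kinetic readout discussed above is usually reported as a baseline-normalized fluorescence change. A minimal sketch of that normalization, with entirely hypothetical well data (not the paper's measurements):

```python
# Minimal sketch: baseline-normalized kinetic trace (dF/F0) of the kind a
# plate reader or microscope produces; the values below are illustrative.

def delta_f_over_f0(trace, n_baseline=3):
    """Normalize a fluorescence time series to its pre-stimulus baseline."""
    f0 = sum(trace[:n_baseline]) / n_baseline
    return [(f - f0) / f0 for f in trace]

# Hypothetical well: 3 baseline reads, then a depolarization-evoked rise.
well = [100.0, 102.0, 98.0, 150.0, 180.0, 160.0]
norm = delta_f_over_f0(well)
print(round(max(norm), 2))  # peak response relative to baseline -> 0.8
```

For a dual-wavelength dye such as Fura-2, the same idea is applied to the 340/380 nm excitation ratio rather than to raw intensity, which is one reason ratiometric dyes are more robust to dye leakage and uneven loading.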
Improved integrating-sphere throughput with a lens and nonimaging concentrator.
Chenault, D B; Snail, K A; Hanssen, L M
1995-12-01
A reflectometer design utilizing an integrating sphere with a lens and nonimaging concentrator is described. Compared with previous designs where a collimator was used to restrict the detector field of view, the concentrator-lens combination significantly increases the throughput of the reflectometer. A procedure for designing lens-concentrators is given along with the results of parametric studies. The measured angular response of a lens-concentrator system is compared with ray-trace predictions and with the response of an ideal system.
Firmware Development Improves System Efficiency
NASA Technical Reports Server (NTRS)
Chern, E. James; Butler, David W.
1993-01-01
Most manufacturing processes require physical pointwise positioning of the components or tools from one location to another. Typical mechanical systems utilize either stop-and-go or fixed feed-rate procession to accomplish the task. The first approach achieves positional accuracy but prolongs overall time and increases wear on the mechanical system. The second approach sustains the throughput but compromises positional accuracy. A computer firmware approach has been developed to optimize this pointwise mechanism by utilizing programmable interrupt controls to synchronize engineering processes 'on the fly'. This principle has been implemented in an eddy current imaging system to demonstrate the improvement. Software programs were developed that enable a mechanical controller card to transmit interrupts to a system controller as a trigger signal to initiate an eddy current data acquisition routine. The advantages are: (1) optimized manufacturing processes, (2) increased throughput of the system, (3) improved positional accuracy, and (4) reduced wear and tear on the mechanical system.
NASA Astrophysics Data System (ADS)
Yu, Hao Yun; Liu, Chun-Hung; Shen, Yu Tian; Lee, Hsuan-Ping; Tsai, Kuen Yu
2014-03-01
Line edge roughness (LER) influencing the electrical performance of circuit components is a key challenge for electron-beam lithography (EBL) due to the continuous scaling of technology feature sizes. Controlling LER within an acceptable tolerance that satisfies International Technology Roadmap for Semiconductors requirements while achieving high throughput has become a challenging issue. Although lower dosage and more-sensitive resist can be used to improve throughput, they would result in serious LER-related problems because of increased relative fluctuation in the incident positions of electrons. Directed self-assembly (DSA) is a promising technique to relax LER-related pattern fidelity (PF) requirements because of its self-healing ability, which may benefit throughput. To quantify the potential throughput improvement in EBL from introducing DSA for post-healing, rigorous numerical methods are proposed to maximize throughput by adjusting the writing parameters of EBL systems subject to relaxed LER-related PF requirements. A fast, continuous model for parameter sweeping and a hybrid model for more accurate patterning prediction are employed for the patterning simulation. The tradeoff between throughput and DSA self-healing ability is investigated. Preliminary results indicate that significant throughput improvements are achievable at certain process conditions.
Information-based management mode based on value network analysis for livestock enterprises
NASA Astrophysics Data System (ADS)
Liu, Haoqi; Lee, Changhoon; Han, Mingming; Su, Zhongbin; Padigala, Varshinee Anu; Shen, Weizheng
2018-01-01
With the development of computer and IT technologies, enterprise management has gradually become information-based. Moreover, due to limited technical competence and non-uniform practices, most breeding enterprises lack organised data collection and management, and low efficiency drives up production costs. This paper adopts 'struts2' to construct an information-based management system for standardised and normalised management of the production process in beef cattle breeding enterprises. We present a radio-frequency identification system based on a dynamic-grouping ALOHA algorithm for multiple-tag anti-collision. The algorithm builds on the existing ALOHA algorithm with improved dynamic grouping of tags and is characterised by a high throughput rate: it can reach a throughput 42% higher than that of the general ALOHA algorithm, and the system throughput remains relatively stable as the number of tags changes.
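The throughput that dynamic-grouping anti-collision schemes try to maximize has a standard closed form in the framed slotted ALOHA model. The sketch below shows only that textbook model, not the paper's improved algorithm; the 42% figure in the abstract is the paper's own result:

```python
# Framed slotted ALOHA throughput: the expected fraction of slots in which
# exactly one tag replies (only singleton slots yield a successful read).

def slotted_aloha_throughput(n_tags, n_slots):
    """P(exactly one of n_tags picks a given slot) = n*p*(1-p)^(n-1), p=1/n_slots."""
    p = 1.0 / n_slots
    return n_tags * p * (1 - p) ** (n_tags - 1)

# Throughput peaks near 1/e (about 0.368) when the frame size matches the
# tag count -- the motivation for grouping tags dynamically.
print(slotted_aloha_throughput(100, 100))
```

Dynamic grouping splits a large tag population into groups sized so each frame operates near this optimum, which is how such schemes beat the fixed-frame baseline.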
NASA Astrophysics Data System (ADS)
Mbanjwa, Mesuli B.; Chen, Hao; Fourie, Louis; Ngwenya, Sibusiso; Land, Kevin
2014-06-01
Multiplexed or parallelised droplet microfluidic systems allow for increased throughput in the production of emulsions and microparticles, while maintaining a small footprint and utilising minimal ancillary equipment. The current paper demonstrates the design and fabrication of a multiplexed microfluidic system for producing biocatalytic microspheres. The microfluidic system consists of an array of 10 parallel microfluidic circuits, for simultaneous operation to demonstrate increased production throughput. The flow distribution was achieved using a principle of reservoirs supplying individual microfluidic circuits. The microfluidic devices were fabricated in poly (dimethylsiloxane) (PDMS) using soft lithography techniques. The consistency of the flow distribution was determined by measuring the size variations of the microspheres produced. The coefficient of variation of the particles was determined to be 9%, an indication of consistent particle formation and good flow distribution between the 10 microfluidic circuits.
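The flow-distribution metric used above is the coefficient of variation of particle size. A minimal sketch of that computation, with hypothetical diameters (the paper's measured value was 9%):

```python
# Coefficient of variation (CV = std / mean), used as the flow-distribution
# metric across the 10 parallel circuits; diameters below are illustrative.
import statistics

def coefficient_of_variation(samples):
    return statistics.pstdev(samples) / statistics.mean(samples)

diameters_um = [50, 52, 48, 51, 49, 50]  # hypothetical microsphere diameters
print(f"CV = {coefficient_of_variation(diameters_um):.1%}")
```

A low CV across particles pooled from all circuits indicates both stable droplet formation within each circuit and even flow distribution between circuits.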
Burdick, David B; Cavnor, Chris C; Handcock, Jeremy; Killcoyne, Sarah; Lin, Jake; Marzolf, Bruz; Ramsey, Stephen A; Rovira, Hector; Bressler, Ryan; Shmulevich, Ilya; Boyle, John
2010-07-14
High throughput sequencing has become an increasingly important tool for biological research. However, the existing software systems for managing and processing these data have not provided the flexible infrastructure that research requires. Existing software solutions provide static and well-established algorithms in a restrictive package. However, as high throughput sequencing is a rapidly evolving field, such static approaches lack the ability to readily adopt the latest advances and techniques which are often required by researchers. We have used a loosely coupled, service-oriented infrastructure to develop SeqAdapt. This system streamlines data management and allows for rapid integration of novel algorithms. Our approach also allows computational biologists to focus on developing and applying new methods instead of writing boilerplate infrastructure code. The system is based around the Addama service architecture and is available at our website as a demonstration web application, an installable single download and as a collection of individual customizable services. PMID:20630057
Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister
2014-05-01
The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.
Microprocessor-Based Systems Control for the Rigidized Inflatable Get-Away-Special Experiment
2004-03-01
communications and faster data throughput increase, satellites are becoming larger. Larger satellite antennas help to provide the needed gain to...increase communications in space. Compounding the performance and size trade-offs are the payload weight and size limit imposed by the launch vehicles...increased communications capacity, and reduce launch costs. This thesis develops and implements the computer control system and power system to
Computer Simulation and Field Experiment for Downlink Multiuser MIMO in Mobile WiMAX System.
Yamaguchi, Kazuhiro; Nagahashi, Takaharu; Akiyama, Takuya; Matsue, Hideaki; Uekado, Kunio; Namera, Takakazu; Fukui, Hiroshi; Nanamatsu, Satoshi
2015-01-01
The transmission performance of a downlink mobile WiMAX system with multiuser multiple-input multiple-output (MU-MIMO) is described through computer simulation and a field experiment. In the computer simulation, a MU-MIMO transmission system is realized using the block diagonalization (BD) algorithm, so that each user receives signals without interference from other users. The bit error rate (BER) performance and channel capacity were simulated for various modulation schemes and numbers of streams in a spatially correlated multipath fading environment. Furthermore, we propose a simulation-based method for evaluating the transmission performance of this downlink mobile WiMAX system in such an environment. In the field experiment, the received power and downlink UDP-layer throughput were measured on an experimental mobile WiMAX system deployed in Azumino City, Japan. Comparing the simulated and measured results, the maximum downlink throughput observed in the field almost matched the simulated throughput. This confirms that the experimental mobile WiMAX system with MU-MIMO transmission successfully increased the total channel capacity of the system. PMID:26421311
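The interference-free property of block diagonalization can be shown in a few lines: each user's precoder is drawn from the null space of the other users' stacked channels. This is a hedged sketch of the BD idea only, with illustrative dimensions, not the paper's simulation setup:

```python
# Block diagonalization (BD) sketch: user k's precoder lies in the null
# space of the other users' stacked channel matrix, so inter-user
# interference is (ideally) zero. 4 BS antennas, two 2-antenna users.
import numpy as np

rng = np.random.default_rng(0)
n_tx = 4
H = [rng.standard_normal((2, n_tx)) for _ in range(2)]  # per-user channels

def bd_precoder(H_all, k):
    """Orthonormal basis of the null space of the other users' stacked
    channel, taken from the trailing right singular vectors of its SVD."""
    H_other = np.vstack([h for i, h in enumerate(H_all) if i != k])
    _, _, vh = np.linalg.svd(H_other)
    return vh[np.linalg.matrix_rank(H_other):].T  # columns span null space

W0 = bd_precoder(H, 0)
# User 0's transmission causes numerically zero interference at user 1:
print(np.allclose(H[1] @ W0, 0))
```

BD requires the number of transmit antennas to be at least the total number of receive antennas across users, which is why the remaining null-space dimensions (here 4 − 2 = 2) set each user's stream count.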
Huang, Dejian; Ou, Boxin; Hampsch-Woodill, Maureen; Flanagan, Judith A; Prior, Ronald L
2002-07-31
The oxygen radical absorbance capacity (ORAC) assay has been widely accepted as a standard tool to measure the antioxidant activity in the nutraceutical, pharmaceutical, and food industries. However, the ORAC assay has been criticized for a lack of accessibility due to the unavailability of the COBAS FARA II analyzer, an instrument discontinued by the manufacturer. In addition, the manual sample preparation is time-consuming and labor-intensive. The objective of this study was to develop a high-throughput instrument platform that can fully automate the ORAC assay procedure. The new instrument platform consists of a robotic eight-channel liquid handling system and a microplate fluorescence reader. By using the high-throughput platform, the efficiency of the assay is improved with at least a 10-fold increase in sample throughput over the current procedure. The mean of intra- and interday CVs was
Asif, Muhammad; Guo, Xiangzhou; Zhang, Jing; Miao, Jungang
2018-04-17
Digital cross-correlation is central to many applications, including but not limited to digital image processing, satellite navigation and remote sensing. With recent advancements in digital technology, the computational demands of such applications have increased enormously. In this paper we present a high-throughput digital cross-correlator capable of processing a 1-bit digitized stream at a rate of up to 2 GHz simultaneously on 64 channels, i.e., approximately 4 trillion correlation and accumulation operations per second. In order to achieve higher throughput, we have focused on frequency-based partitioning of our design and tried to minimize and localize high-frequency operations. This correlator is designed for a Passive Millimeter Wave Imager intended for the detection of contraband items concealed on the human body. The goals are to increase the system bandwidth, achieve video-rate imaging, improve sensitivity and reduce the size. The design methodology is detailed in subsequent sections, elaborating the techniques enabling high throughput. The design is verified for a Xilinx Kintex UltraScale device in simulation, and the implementation results are given in terms of device utilization and power consumption estimates. Our results show considerable improvements in throughput as compared to our baseline design, while the correlator successfully meets the functional requirements.
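What makes 1-bit correlation cheap enough for multi-GHz hardware is that, with samples quantized to ±1 and packed as bits, a correlation lag reduces to an XNOR followed by a population count. A pure-Python illustration of that reduction (not the FPGA design itself):

```python
# 1-bit correlation sketch: with +/-1 samples packed as bits, each matching
# bit contributes +1 and each mismatch -1, so the correlation is
# 2 * popcount(XNOR(a, b)) - n. Hardware does this with LUTs and adders.

def one_bit_correlate(a_bits, b_bits, n):
    """Lag-0 correlation of two n-sample 1-bit streams packed into ints."""
    matches = bin(~(a_bits ^ b_bits) & ((1 << n) - 1)).count("1")
    return 2 * matches - n

a = 0b10110010
print(one_bit_correlate(a, a, 8))           # identical streams -> +8
print(one_bit_correlate(a, ~a & 0xFF, 8))   # inverted streams  -> -8
```

In an imaging correlator this operation is replicated per channel pair and per lag, with long accumulators absorbing the counts, which is why the headline figure is "correlation and accumulation operations per second".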
NASA Astrophysics Data System (ADS)
Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.
2017-11-01
The current trend in processor manufacturing focuses on multi-core architectures rather than increasing clock speed for performance improvement. Graphics processors have become commodity hardware providing fast co-processing in computer systems. Developments in IoT, social-networking web applications and big data have created huge demand for data processing, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi-/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput-computing applications using shared-memory programming in OpenMP and CUDA API-based programming.
Genome sequencing in microfabricated high-density picolitre reactors.
Margulies, Marcel; Egholm, Michael; Altman, William E; Attiya, Said; Bader, Joel S; Bemben, Lisa A; Berka, Jan; Braverman, Michael S; Chen, Yi-Ju; Chen, Zhoutao; Dewell, Scott B; Du, Lei; Fierro, Joseph M; Gomes, Xavier V; Godwin, Brian C; He, Wen; Helgesen, Scott; Ho, Chun Heen; Ho, Chun He; Irzyk, Gerard P; Jando, Szilveszter C; Alenquer, Maria L I; Jarvie, Thomas P; Jirage, Kshama B; Kim, Jong-Bum; Knight, James R; Lanza, Janna R; Leamon, John H; Lefkowitz, Steven M; Lei, Ming; Li, Jing; Lohman, Kenton L; Lu, Hong; Makhijani, Vinod B; McDade, Keith E; McKenna, Michael P; Myers, Eugene W; Nickerson, Elizabeth; Nobile, John R; Plant, Ramona; Puc, Bernard P; Ronan, Michael T; Roth, George T; Sarkis, Gary J; Simons, Jan Fredrik; Simpson, John W; Srinivasan, Maithreyan; Tartaro, Karrie R; Tomasz, Alexander; Vogt, Kari A; Volkmer, Greg A; Wang, Shally H; Wang, Yong; Weiner, Michael P; Yu, Pengguang; Begley, Richard F; Rothberg, Jonathan M
2005-09-15
The proliferation of large-scale DNA-sequencing projects in recent years has driven a search for alternative methods to reduce time and cost. Here we describe a scalable, highly parallel sequencing system with raw throughput significantly greater than that of state-of-the-art capillary electrophoresis instruments. The apparatus uses a novel fibre-optic slide of individual wells and is able to sequence 25 million bases, at 99% or better accuracy, in one four-hour run. To achieve an approximately 100-fold increase in throughput over current Sanger sequencing technology, we have developed an emulsion method for DNA amplification and an instrument for sequencing by synthesis using a pyrosequencing protocol optimized for solid support and picolitre-scale volumes. Here we show the utility, throughput, accuracy and robustness of this system by shotgun sequencing and de novo assembly of the Mycoplasma genitalium genome with 96% coverage at 99.96% accuracy in one run of the machine.
Clark, Randy T; Famoso, Adam N; Zhao, Keyan; Shaff, Jon E; Craft, Eric J; Bustamante, Carlos D; McCouch, Susan R; Aneshansley, Daniel J; Kochian, Leon V
2013-02-01
High-throughput phenotyping of root systems requires a combination of specialized techniques and adaptable plant growth, root imaging and software tools. A custom phenotyping platform was designed to capture images of whole root systems, and novel software tools were developed to process and analyse these images. The platform and its components are adaptable to a wide range of root phenotyping studies using diverse growth systems (hydroponics, paper pouches, gel and soil) involving several plant species, including, but not limited to, rice, maize, sorghum, tomato and Arabidopsis. The RootReader2D software tool is free and publicly available and was designed with both user-guided and automated features that increase flexibility and enhance efficiency when measuring root growth traits from specific roots or entire root systems during large-scale phenotyping studies. To demonstrate the unique capabilities and high-throughput capacity of this phenotyping platform for studying root systems, genome-wide association studies on rice (Oryza sativa) and maize (Zea mays) root growth were performed and root traits related to aluminium (Al) tolerance were analysed on the parents of the maize nested association mapping (NAM) population. © 2012 Blackwell Publishing Ltd.
Semi-commercial scale production of carrageenan plant growth promoter by E-beam technology
NASA Astrophysics Data System (ADS)
Abad, Lucille V.; Dean, Giuseppe Filam O.; Magsino, Gil L.; Dela Cruz, Rafael Miguel M.; Tecson, Mariel G.; Abella, Matt Ezekiel S.; Hizon, Mark Gil S.
2018-02-01
The plant growth promoter (PGP) effect of different formulations of gamma-irradiated carrageenan solutions was tested on rice by foliar spraying. The best formulation was produced in large quantity for field application. Multilocation trials covering around 1600 ha of rice fields in different regions of the Philippines indicated an average yield increase of around 20%. Increased resistance to tungro virus was also noted. Likewise, there was extensive root growth, an increase in the number of tillers, and development of sturdy stems that prevented lodging of the rice plants. E-beam irradiation of carrageenan PGP was also studied to increase production throughput. Degradation of carrageenan by e-beam irradiation is inhibited by the formation of crosslinks; optimisation by addition of hydrogen peroxide to improve degradation is discussed. A continuous-flow liquid handling system has been fabricated to increase the throughput of the carrageenan PGP. Using the optimized parameters, the system can produce approximately 1700 L/h.
Microfluidic strategies for understanding the mechanics of cells and cell-mimetic systems
Dahl, Joanna B.; Lin, Jung-Ming G.; Muller, Susan J.; Kumar, Sanjay
2016-01-01
Microfluidic systems are attracting increasing interest for the high-throughput measurement of cellular biophysical properties and for the creation of engineered cellular microenvironments. Here we review recent applications of microfluidic technologies to the mechanics of living cells and synthetic cell-mimetic systems. We begin by discussing the use of microfluidic devices to dissect the mechanics of cellular mimics such as capsules and vesicles. We then explore applications to circulating cells, including erythrocytes and other normal blood cells, and rare populations with potential disease diagnostic value, such as circulating tumor cells. We conclude by discussing how microfluidic devices have been used to investigate the mechanics, chemotaxis, and invasive migration of adherent cells. In these ways, microfluidic technologies represent an increasingly important toolbox for investigating cellular mechanics and motility at high throughput and in a format that lends itself to clinical translation. PMID:26134738
Zeming, Kerwin Kwek; Salafi, Thoriq; Chen, Chia-Hung; Zhang, Yong
2016-01-01
The deterministic lateral displacement (DLD) method has been extensively used for particle separation in microfluidic devices in recent years due to its high resolution and robustness. DLD has shown versatility across a wide spectrum of sorting applications, from parasites and blood cells to bacteria and DNA. However, the DLD model is designed for spherical particles, so efficient separation of blood cells is challenging due to their non-uniform shape and size. Moreover, separation in the sub-micron regime requires the gap size of DLD systems to be reduced, which exponentially increases the device resistance and greatly reduces throughput. This paper shows how a simple asymmetric DLD gap, obtained by changing the ratio of the lateral gap (GL) to the downstream gap (GD), enables efficient separation of RBCs without greatly restricting throughput. This method reduces the need for challenging fabrication of DLD pillars and provides new insight into the current DLD model. The separation shows an increased DLD critical-diameter resolution (separating smaller particles) and increased selectivity for non-spherical RBCs, which separate better than in the standard DLD model with symmetrical gap sizes. This method can be applied to separate non-spherical bacteria or sub-micron particles to enhance throughput and DLD resolution. PMID:26961061
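The gap-size/throughput trade-off above can be made concrete with the empirical fit commonly used to estimate the DLD critical diameter (Davis's formula, Dc = 1.4·g·ε^0.48, with g the gap and ε the row-shift fraction). The parameter values below are illustrative, not from this paper:

```python
# Davis's empirical fit for the DLD critical diameter: particles larger
# than Dc are "bumped" laterally, smaller ones zigzag through.

def dld_critical_diameter(gap_um, row_shift_fraction):
    """Dc = 1.4 * g * eps**0.48 (g in um, eps = row shift / pillar pitch)."""
    return 1.4 * gap_um * row_shift_fraction ** 0.48

# Shrinking the gap lowers Dc (finer separation) but raises fluidic
# resistance -- the throughput penalty the asymmetric-gap design avoids.
for gap in (10.0, 5.0):
    print(f"gap {gap} um -> Dc ~ {dld_critical_diameter(gap, 0.1):.2f} um")
```

The paper's contribution is, in effect, to decouple the two gaps that this symmetric formula lumps into one g, so Dc can be tuned without paying the full resistance cost.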
NASA Astrophysics Data System (ADS)
Fuchs, Christian; Poulenard, Sylvain; Perlot, Nicolas; Riedi, Jerome; Perdigues, Josep
2017-02-01
Optical satellite communications play an increasingly important role in a number of space applications. However, if the system concept includes optical links to the surface of the Earth, the limited availability due to clouds and other atmospheric impacts needs to be considered to give a reliable estimate of system performance. An optical ground station (OGS) network is required to increase availability to acceptable figures. In order to realistically estimate the performance and achievable throughput in various scenarios, a simulation tool has been developed under ESA contract. The tool is based on a database of 5 years of cloud data with global coverage and can thus easily simulate different optical ground station network topologies for LEO- and GEO-to-ground links. Further parameters, such as limited availability due to sun blinding and atmospheric turbulence, are considered as well. This paper gives an overview of the simulation tool, the cloud database, and the modelling behind the simulation scheme. Several scenarios have been investigated: LEO-to-ground links, GEO feeder links, and GEO relay links. The key results of the optical ground station network optimization and throughput estimations are presented. The implications of key technical parameters, such as the memory size aboard the satellite, are discussed. Finally, potential system designs for LEO and GEO systems are presented.
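The basic reason an OGS network raises availability is that the link only fails when every station is blocked at once. The sketch below shows the idealized independent-blockage calculation; real cloud fields are spatially correlated, which is precisely why the tool described above uses a multi-year cloud database instead of this formula:

```python
# Idealized network availability under independent cloud blockage:
# the link is down only if all stations are blocked simultaneously.
from math import prod

def network_availability(p_blocked):
    """1 - P(all stations cloud-blocked), assuming independence."""
    return 1 - prod(p_blocked)

# Three hypothetical sites, each cloud-blocked 40% of the time:
print(round(network_availability([0.4, 0.4, 0.4]), 3))  # -> 0.936
```

Correlated weather (e.g. stations sharing a storm system) makes the true availability lower than this bound, so site diversity across climate zones matters more than raw station count.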
High-throughput SRCD using multi-well plates and its applications
NASA Astrophysics Data System (ADS)
Hussain, Rohanah; Jávorfi, Tamás; Rudd, Timothy R.; Siligardi, Giuliano
2016-12-01
The sample compartment for high-throughput synchrotron radiation circular dichroism (HT-SRCD) has been developed to meet an increased demand for protein characterisation, in terms of folding and binding-interaction properties, not only in the traditional field of structural biology but also in the growing research area of materials science, with the potential to save up to 80% of measurement time. As the understanding of protein behaviour in different solvent environments has increased dramatically, and as novel functions are engineered, such as recombinant proteins modified for tasks ranging from harvesting solar energy to metabolomics for cleaning up heavy-metal and organic pollutants, there is a need to characterise these systems rapidly.
Buckner, Diana; Wilson, Suzanne; Kurk, Sandra; Hardy, Michele; Miessner, Nicole; Jutila, Mark A
2006-09-01
Innate immune system stimulants (innate adjuvants) offer complementary approaches to vaccines and antimicrobial compounds for increasing host resistance to infection. The authors established fetal bovine intestinal epithelial cell (BIEC) cultures to screen natural-product and synthetic compound libraries for novel mucosal adjuvants. They showed that BIECs from fetal intestine maintained an in vivo phenotype, as reflected in cytokeratin expression, expression of antigens restricted to intestinal enterocytes, and induced interleukin-8 (IL-8) production. BIECs could be infected by and support replication of bovine rotavirus. A semi-high-throughput enzyme-linked immunosorbent assay (ELISA) measuring IL-8 production by BIECs was established and used to screen commercially available natural compounds for novel adjuvant activity. Five novel hits were identified, demonstrating the utility of the assay for selecting and screening new epithelial cell adjuvants. Although the identified compounds had not previously been shown to induce IL-8 production in epithelial cells, other known functions for 3 of the 5 were consistent with this activity. Statistical analysis of the throughput data demonstrated that the assay is adaptable to a high-throughput format for screening both synthetic and natural-product-derived compound libraries.
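A standard statistic for judging whether an assay is ready for HTS is the Z'-factor (Zhang et al.); the "statistical analysis of the throughput data" mentioned above is plausibly of this kind, though the abstract does not name the metric. The control values below are invented for illustration:

```python
# Z'-factor sketch: Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# Z' > 0.5 is conventionally taken as an excellent HTS-ready assay.
import statistics

def z_prime(pos, neg):
    return 1 - 3 * (statistics.pstdev(pos) + statistics.pstdev(neg)) / abs(
        statistics.mean(pos) - statistics.mean(neg))

pos_ctrl = [95, 100, 105, 98, 102]   # hypothetical IL-8 signal, stimulated wells
neg_ctrl = [10, 12, 9, 11, 8]        # hypothetical unstimulated wells
print(round(z_prime(pos_ctrl, neg_ctrl), 2))
```

The metric rewards both a wide separation between positive and negative controls and tight replicate spread, which is exactly what "adaptable to a high-throughput format" requires.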
Thermoelectric properties of the LaCoO3-LaCrO3 system using a high-throughput combinatorial approach
NASA Astrophysics Data System (ADS)
Talley, K. R.; Barron, S. C.; Nguyen, N.; Wong-Ng, W.; Martin, J.; Zhang, Y. L.; Song, X.
2017-02-01
A combinatorial film of the LaCo1-xCrxO3 system was fabricated using the LaCoO3 and LaCrO3 targets at the NIST Pulsed Laser Deposition (PLD) facility. As the ionic size of Cr3+ is greater than that of Co3+, the unit cell volume of the series increases with increasing x. Using a custom screening tool, the Seebeck coefficient of LaCo1-xCrxO3 approaches a measured maximum of 286 μV/K, near to the cobalt-rich end of the film library (with x ≈ 0.49). The resistivity value increases continuously with increasing x. The measured power factor, PF, of this series, which is related to the efficiency of energy conversion, also exhibits a maximum at the composition of x ≈ 0.49, which corresponds to the maximum value of the Seebeck coefficient. Our results illustrate the efficiency of applying the high-throughput combinatorial technique to study thermoelectric materials.
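The power factor screened above is PF = S²/ρ (Seebeck coefficient squared over resistivity). Only S = 286 μV/K is taken from the text; the resistivity below is a hypothetical placeholder, since the abstract reports only its trend with x:

```python
# Thermoelectric power factor: PF = S^2 / rho, the quantity mapped across
# the LaCo(1-x)Cr(x)O3 composition spread.

def power_factor(seebeck_V_per_K, resistivity_ohm_m):
    return seebeck_V_per_K ** 2 / resistivity_ohm_m

S = 286e-6     # V/K: measured maximum near x ~ 0.49 (from the text)
rho = 1e-4     # ohm*m: illustrative placeholder, not a measured value
print(f"PF = {power_factor(S, rho):.2e} W/(m*K^2)")
```

Because S peaks at x ≈ 0.49 while ρ rises monotonically with x, the PF maximum landing at the same composition as the Seebeck maximum is the expected outcome of this trade-off.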
Effect of adaptive cruise control systems on mixed traffic flow near an on-ramp
NASA Astrophysics Data System (ADS)
Davis, L. C.
2007-06-01
Mixed traffic flow consisting of vehicles equipped with adaptive cruise control (ACC) and manually driven vehicles is analyzed using car-following simulations. Simulations of merging from an on-ramp onto a freeway reported in the literature have not thus far demonstrated a substantial positive impact of ACC. In this paper, cooperative merging for ACC vehicles is proposed to improve throughput and increase distance traveled in a fixed time. In such a system an ACC vehicle senses not only the preceding vehicle in the same lane but also the vehicle immediately in front in the other lane. Prior to reaching the merge region, the ACC vehicle adjusts its velocity to ensure that a safe gap for merging is obtained. If on-ramp demand is moderate, cooperative merging produces a significant improvement in throughput (20%) and increases of up to 3.6 km in distance traveled in 600 s for 50% ACC mixed flow relative to the flow of all-manual vehicles. For large demand, it is shown that autonomous merging with cooperation in the flow of all ACC vehicles leads to throughput limited only by the downstream capacity, which is determined by speed limit and headway time.
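The closing claim, that all-ACC throughput is capped by speed limit and headway time, follows from the standard headway-capacity relation. This is a back-of-envelope sketch with illustrative parameters, not the paper's car-following model:

```python
# Lane capacity from the speed/headway relation: vehicles are spaced by
# (v * T + standstill gap + vehicle length), so capacity = 3600 * v / spacing.

def lane_capacity_veh_per_hr(speed_mps, headway_s, standstill_gap_m=2.0,
                             vehicle_len_m=5.0):
    spacing_m = speed_mps * headway_s + standstill_gap_m + vehicle_len_m
    return 3600.0 * speed_mps / spacing_m

# Tighter ACC headways raise the downstream capacity ceiling:
for T in (1.5, 1.0):
    print(f"headway {T} s -> {lane_capacity_veh_per_hr(30.0, T):.0f} veh/h")
```

At 30 m/s, shortening the headway from 1.5 s to 1.0 s raises the ceiling by roughly 40%, which is why headway time, not merge behavior, binds once cooperative merging removes the on-ramp bottleneck.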
High throughput system for magnetic manipulation of cells, polymers, and biomaterials
Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.
2008-01-01
In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and a power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of S2R+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357
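A force-calibration exponent like F ∼ r^(−2.7) is typically obtained by a linear least-squares fit in log-log space. The sketch below uses synthetic data, not the paper's calibration measurements.

```python
import math

def fit_power_law(r, F):
    """Least-squares fit of F = A * r**b in log-log coordinates.
    Returns (A, b); for bead-force calibration data, b is the power-law
    exponent (about -2.7 in the abstract's measurement)."""
    x = [math.log(v) for v in r]
    y = [math.log(v) for v in F]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    A = math.exp(my - b * mx)
    return A, b
```

Fitting in log space turns the power law into a straight line, so the exponent and prefactor fall out of ordinary linear regression.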
A Robotic Platform for Quantitative High-Throughput Screening
Michael, Sam; Auld, Douglas; Klumpp, Carleen; Jadhav, Ajit; Zheng, Wei; Thorne, Natasha; Austin, Christopher P.; Inglese, James
2008-01-01
Abstract High-throughput screening (HTS) is increasingly being adopted in academic institutions, where the decoupling of screening and drug development has led to unique challenges, as well as novel uses of instrumentation, assay formulations, and software tools. Advances in technology have made automated unattended screening in the 1,536-well plate format broadly accessible and have further facilitated the exploration of new technologies and approaches to screening. A case in point is our recently developed quantitative HTS (qHTS) paradigm, which tests each library compound at multiple concentrations to construct concentration-response curves (CRCs) generating a comprehensive data set for each assay. The practical implementation of qHTS for cell-based and biochemical assays across libraries of > 100,000 compounds (e.g., between 700,000 and 2,000,000 sample wells tested) requires maximal efficiency and miniaturization and the ability to easily accommodate many different assay formats and screening protocols. Here, we describe the design and utilization of a fully integrated and automated screening system for qHTS at the National Institutes of Health's Chemical Genomics Center. We report system productivity, reliability, and flexibility, as well as modifications made to increase throughput, add additional capabilities, and address limitations. The combination of this system and qHTS has led to the generation of over 6 million CRCs from > 120 assays in the last 3 years and is a technology that can be widely implemented to increase efficiency of screening and lead generation. PMID:19035846
NASA Astrophysics Data System (ADS)
Huntzinger, D. N.; McCray, J. E.; Siegrist, R.; Lowe, K.; VanCuyk, S.
2001-05-01
Sixteen one-dimensional column lysimeters have been developed to evaluate the influence of loading regime and infiltrative surface character on hydraulic performance in wastewater soil absorption systems. A duplicate design was utilized to evaluate two infiltrative surface conditions (gravel-free vs. gravel-laden) under four hydraulic loading regimes representative of possible field conditions. By loading the columns at rates of 25 to 200 cm/day, the 17 weeks of column operation reflect up to approximately 13 yrs of field operation (at 5 cm/day). Therefore, the cumulative mass throughput and infiltrative rate loss for each loading regime can be examined to determine the viability of accelerated loading as a means to compress the time scale of observation while still producing meaningful results for the field scale. During operation, the columns were loaded with septic tank effluent at a prescribed rate and routinely monitored for applied effluent composition, infiltration rate, time-dependent soil water content, water volume throughput, and percolate composition. Bromide tracer tests were completed prior to system startup and at weeks 2, 6, and 17 of system operation. Hydraulic characterization of the columns is based on measurements of the hydraulic loading rate, volumetric throughput, soil water content, and bromide breakthrough curves. Incipient ponding of wastewater developed during the 1st week of operation for columns loaded at the highest hydraulic rate (loading regimes 1 and 2), and during the 3rd and 6th weeks of operation for loading regimes 3 and 4, respectively. The bromide breakthrough curves exhibit later breakthrough and tailing as system life increases, indicating the development of spatial variability in hydraulic conductivity within the column and of a clogging zone at the infiltrative surface. Throughput is assessed for each loading regime to determine the infiltration rate loss versus days of operation.
Loading regimes 1 and 2 approach a comparable long-term throughput rate of less than 20 cm/day, while loading regimes 3 and 4 reach a long-term throughput rate of less than 10 cm/day. These one-dimensional columns allow for the analysis of infiltrative rate loss and hydraulic behavior as a result of infiltrative surface character and loading regime.
DOT National Transportation Integrated Search
2006-08-01
A Freeway Management System (FMS) employs various tools to manage a freeway to increase throughput and reduce delay without additional lanes. The FMS acquires data from the roadway and processes these data to identify and respond to problems. If so...
Spectrum Access In Cognitive Radio Using a Two-Stage Reinforcement Learning Approach
NASA Astrophysics Data System (ADS)
Raj, Vishnu; Dias, Irene; Tholeti, Thulasi; Kalyani, Sheetal
2018-02-01
With the advent of the 5th generation of wireless standards and an increasing demand for higher throughput, methods to improve the spectral efficiency of wireless systems have become very important. In the context of cognitive radio, a substantial increase in throughput is possible if the secondary user can make smart decisions regarding which channel to sense and when or how often to sense. Here, we propose an algorithm not only to select a channel for data transmission but also to predict how long the channel will remain unoccupied, so that the time spent on channel sensing can be minimized. Our algorithm learns in two stages: a reinforcement learning approach for channel selection, and a Bayesian approach to determine the optimal duration for which sensing can be skipped. Comparisons with other learning methods are provided through extensive simulations. We show that the number of sensing operations is minimized with a negligible increase in primary interference; this implies that less energy is spent by the secondary user on sensing and higher throughput is achieved by saving on sensing.
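The two-stage idea can be sketched with an epsilon-greedy bandit for channel selection and a Beta-posterior estimate of how long sensing can be skipped. This is a simplified stand-in for the paper's algorithm; the function names, the geometric idle-run model, and all constants are assumptions.

```python
import random

def select_channel(idle_est, eps=0.1, rng=random):
    """Stage 1 (sketch): epsilon-greedy pick over per-channel idle-probability
    estimates; explores with probability eps, else exploits the best channel."""
    if rng.random() < eps:
        return rng.randrange(len(idle_est))
    return max(range(len(idle_est)), key=lambda i: idle_est[i])

def sensing_skip(idle_obs, busy_obs, alpha=1.0, beta=1.0):
    """Stage 2 (sketch): Beta(alpha, beta) posterior mean of the idle
    probability p; a geometric idle-run model gives an expected idle run of
    1/(1-p) slots, so sensing is skipped for that many slots."""
    p = (idle_obs + alpha) / (idle_obs + busy_obs + alpha + beta)
    return max(1, int(1.0 / (1.0 - p)))
```

A channel observed idle 9 times out of 10 would be sensed only once every few slots, which is the mechanism by which sensing energy is saved.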
NASA Astrophysics Data System (ADS)
Wang, Liping; Ji, Yusheng; Liu, Fuqiang
The integration of multihop relays with orthogonal frequency-division multiple access (OFDMA) cellular infrastructures can meet the growing demands for better coverage and higher throughput. Resource allocation in the OFDMA two-hop relay system is more complex than that in the conventional single-hop OFDMA system. With time division between transmissions from the base station (BS) and those from relay stations (RSs), fixed partitioning of the BS subframe and RS subframes cannot adapt to various traffic demands. Moreover, single-hop scheduling algorithms cannot be used directly in the two-hop system. Therefore, we propose a semi-distributed algorithm called ASP to adjust the length of every subframe adaptively, and suggest two ways to extend single-hop scheduling algorithms into multihop scenarios: link-based and end-to-end approaches. Simulation results indicate that the ASP algorithm increases system utilization and fairness. The max carrier-to-interference ratio (Max C/I) and proportional fairness (PF) scheduling algorithms extended using the end-to-end approach obtain higher throughput than those using the link-based approach, but at the expense of more overhead for information exchange between the BS and RSs. The resource allocation scheme using ASP and end-to-end PF scheduling achieves a tradeoff between system throughput maximization and fairness.
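The end-to-end extension can be illustrated with two textbook simplifications: under time division, a two-hop flow's effective rate with an optimally split subframe pair is the harmonic combination of the hop rates, and a PF scheduler ranks flows by the ratio of instantaneous to average rate. Both formulas are standard simplifications, not the paper's exact ASP algorithm.

```python
def end_to_end_rate(r_bs_rs, r_rs_user):
    """Effective two-hop rate under time division with an optimal split of
    BS and RS subframes: r1*r2/(r1+r2) (textbook simplification)."""
    return r_bs_rs * r_rs_user / (r_bs_rs + r_rs_user)

def pf_pick(inst_rate, avg_rate):
    """Proportional-fair choice: serve the flow with the largest ratio of
    instantaneous achievable rate to long-term average rate."""
    return max(range(len(inst_rate)), key=lambda i: inst_rate[i] / avg_rate[i])
```

Ranking flows by their end-to-end rate rather than per-link rates is what gives the end-to-end approach its throughput edge, at the cost of the BS needing rate information from every RS.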
Progress Report: Transportable Gasifier for On-Farm Disposal ...
Report A prototype transportable gasifier intended to process a minimum of 25 tons per day of animal mortalities (scalable to 200 tons per day) was built as part of an interagency effort involving the U.S. Environmental Protection Agency, the Department of Homeland Security, the U.S. Department of Agriculture, and the Department of Defense as well as the State of North Carolina. This effort is intended to demonstrate the feasibility of gasification for disposal of contaminated carcasses and to identify technical challenges and improvements that will simplify, improve, and enhance the gasifier system as a mobile response tool. Initial testing of the prototype in 2008 and 2010 demonstrated partial success by meeting the transportability and rapid deployment requirements. However, the throughput of animal carcasses was approximately 1/3 of the intended design capacity. Modifications have been made to the fuel system, burner system, feed system, control system, power distribution, and ash handling system to increase its operating capacity to the rated design throughput. Further testing will be performed to demonstrate the throughput as well as to demonstrate the ability of the unit to operate around the clock for an extended period of time. This report gives a status update on the progress of the project. Its purpose is to give an update on the Transportable Animal Carcass Gasifier.
High-throughput screening (HTS) and modeling of the retinoid ...
Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.
Multiplexed mass cytometry profiling of cellular states perturbed by small-molecule regulators
Bodenmiller, Bernd; Zunder, Eli R.; Finck, Rachel; Chen, Tiffany J.; Savig, Erica S.; Bruggner, Robert V.; Simonds, Erin F.; Bendall, Sean C.; Sachs, Karen; Krutzik, Peter O.; Nolan, Garry P.
2013-01-01
The ability to comprehensively explore the impact of bio-active molecules on human samples at the single-cell level can provide great insight for biomedical research. Mass cytometry enables quantitative single-cell analysis with deep dimensionality, but currently lacks high-throughput capability. Here we report a method termed mass-tag cellular barcoding (MCB) that increases mass cytometry throughput by sample multiplexing. 96-well format MCB was used to characterize human peripheral blood mononuclear cell (PBMC) signaling dynamics, cell-to-cell communication, the signaling variability between 8 donors, and to define the impact of 27 inhibitors on this system. For each compound, 14 phosphorylation sites were measured in 14 PBMC types, resulting in 18,816 quantified phosphorylation levels from each multiplexed sample. This high-dimensional systems-level inquiry allowed analysis across cell-type and signaling space, reclassified inhibitors, and revealed off-target effects. MCB enables high-content, high-throughput screening, with potential applications for drug discovery, pre-clinical testing, and mechanistic investigation of human disease. PMID:22902532
Enhancement and outreach for the active management screening tool.
DOT National Transportation Integrated Search
2012-07-01
Active traffic management, widely deployed for decades in Europe but in its infancy in the United States, maximizes the effectiveness and efficiency of the facility, and increases throughput and safety through integrated systems with new technology...
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fludigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
NASA Technical Reports Server (NTRS)
Egen, N. B.; Twitty, G. E.; Bier, M.
1979-01-01
Isoelectric focusing is a high-resolution technique for separating and purifying large peptides, proteins, and other biomolecules. The apparatus described in the present paper constitutes a new approach to fluid stabilization and increased throughput. Stabilization is achieved by flowing the process fluid uniformly through an array of closely spaced filter elements oriented parallel both to the electrodes and the direction of the flow. This seems to overcome the major difficulties of parabolic flow and electroosmosis at the walls, while limiting the convection to chamber compartments defined by adjacent spacers. Increased throughput is achieved by recirculating the process fluid through external heat exchange reservoirs, where the Joule heat is dissipated.
Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli
2010-01-01
The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process and to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as the scale-down model for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale-down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(and-6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO2, HCO3-, and the buffer system (i.e., HEPES), we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the %CO2 required from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work sheds light on mini-bioreactor scale-down model construction and paves the way for cell culture process development to improve productivity or product quality using high throughput systems. Copyright 2009 American Institute of Chemical Engineers
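The CO2/bicarbonate chemistry underlying such a model is the Henderson-Hasselbalch equation. The sketch below uses typical literature constants (apparent pKa 6.1, CO2 solubility 0.0307 mmol/L per mmHg) rather than the values of the Lilly model, and the water-vapor correction is a common assumption for a humidified incubator.

```python
import math

PKA = 6.1      # apparent pKa of the CO2/HCO3- system (typical literature value)
SOL = 0.0307   # CO2 solubility in mmol/L per mmHg (typical literature value)

def culture_ph(bicarb_mM, pct_co2, p_atm=760.0, p_h2o=47.0):
    """Henderson-Hasselbalch sketch: medium pH from bicarbonate concentration
    (mM) and incubator %CO2, correcting for water vapor pressure."""
    p_co2 = (p_atm - p_h2o) * pct_co2 / 100.0
    return PKA + math.log10(bicarb_mM / (SOL * p_co2))

def required_pct_co2(bicarb_mM, target_ph, p_atm=760.0, p_h2o=47.0):
    """Invert the relation: the incubator %CO2 needed to hold target_ph."""
    p_co2 = bicarb_mM / (SOL * 10.0 ** (target_ph - PKA))
    return 100.0 * p_co2 / (p_atm - p_h2o)
```

With 24 mM bicarbonate at 5% CO2 the sketch predicts a pH near 7.4, and the inverse function recovers the %CO2 setpoint, which is the kind of calculation the abstract's model performs for each well.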
Live Virtual Constructive Distributed Test Environment Characterization Report
NASA Technical Reports Server (NTRS)
Murphy, Jim; Kim, Sam K.
2013-01-01
This report documents message latencies observed over various Live, Virtual, Constructive (LVC) simulation environment configurations designed to emulate possible system architectures for the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project integrated tests. For each configuration, four scenarios with progressively increasing air traffic loads were used to determine system throughput and bandwidth impacts on message latency.
CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak
NASA Astrophysics Data System (ADS)
Vanderlaan, J. F.; Cummings, J. W.
1993-10-01
The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPUs. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is the MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software are chronicled, including observations of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and computer I/O speeds are expected to increase data rates as well.
Throughput assurance of wireless body area networks coexistence based on stochastic geometry
Wang, Yinglong; Shu, Minglei; Wu, Shangbin
2017-01-01
Wireless body area networks (WBANs) are expected to influence the traditional medical model by assisting caretakers with health telemonitoring. Within WBANs, the transmit power of the nodes should be as small as possible owing to their limited energy capacity but should be sufficiently large to guarantee the quality of the signal at the receiving nodes. When multiple WBANs coexist in a small area, the communication reliability and overall throughput can be seriously affected due to resource competition and interference. We show that the total network throughput largely depends on the WBANs distribution density (λp), transmit power of their nodes (Pt), and their carrier-sensing threshold (γ). Using stochastic geometry, a joint carrier-sensing threshold and power control strategy is proposed to meet the demand of coexisting WBANs based on the IEEE 802.15.4 standard. Given different network distributions and carrier-sensing thresholds, the proposed strategy derives a minimum transmit power according to the varying surrounding environment. We obtain expressions for transmission success probability and throughput adopting this strategy. Using numerical examples, we show that the joint carrier-sensing threshold and transmit power strategy can effectively improve the overall system throughput and reduce interference. Additionally, this paper studies the effects of a guard zone on the throughput using a Matérn hard-core point process (HCPP) type II model. Theoretical analysis and simulation results show that the HCPP model can increase the success probability and throughput of networks. PMID:28141841
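A representative closed form from this kind of stochastic-geometry analysis is the Rayleigh-fading, interference-limited success probability under a Poisson field of interferers. The sketch below implements that standard textbook result, not the paper's specific WBAN derivation.

```python
import math

def sir_success_prob(density, link_dist, sir_threshold, alpha=4.0):
    """Standard stochastic-geometry result (Rayleigh fading, interference-
    limited, Poisson interferers of the given density):
    P[SIR > theta] = exp(-pi * lambda * r^2 * theta^(2/alpha)
                         * Gamma(1 + 2/alpha) * Gamma(1 - 2/alpha))."""
    d = 2.0 / alpha
    c = math.gamma(1.0 + d) * math.gamma(1.0 - d)
    return math.exp(-math.pi * density * link_dist ** 2
                    * sir_threshold ** d * c)
```

Thinning the interferer density, which is what a carrier-sensing guard zone does, raises this probability, mirroring the paper's HCPP finding that a guard zone increases success probability and throughput.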
QoS support for end users of I/O-intensive applications using shared storage systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Marion Kei; Zhang, Xuechen; Jiang, Song
2011-01-19
I/O-intensive applications are becoming increasingly common on today's high-performance computing systems. While performance of compute-bound applications can be effectively guaranteed with techniques such as space sharing or QoS-aware process scheduling, it remains a challenge to meet QoS requirements for end users of I/O-intensive applications using shared storage systems, because it is difficult to differentiate I/O services for different applications with individual quality requirements. Furthermore, it is difficult for end users to accurately specify performance goals to the storage system using I/O-related metrics such as request latency or throughput. As access patterns, request rates, and the system workload change in time, a fixed I/O performance goal, such as bounds on throughput or latency, can be expensive to achieve and may not lead to a meaningful performance guarantee such as bounded program execution time. We propose a scheme supporting end-user QoS goals, specified in terms of program execution time, in shared storage environments. We automatically translate the user's performance goals into instantaneous I/O throughput bounds using a machine learning technique, and use dynamically determined service time windows to efficiently meet the throughput bounds. We have implemented this scheme in the PVFS2 parallel file system and have conducted an extensive evaluation. Our results show that this scheme can satisfy realistic end-user QoS requirements by making highly efficient use of the I/O resources. The scheme seeks to balance programs' attainment of QoS requirements, and saves as much of the remaining I/O capacity as possible for best-effort programs.
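The translation step can be illustrated with a deliberately simple model: if an I/O-bound program's execution time behaves roughly as t ≈ a + b / throughput, a least-squares fit over profiled runs yields (a, b), and the user's time goal inverts to a throughput bound. The linear form in 1/throughput is an illustrative assumption standing in for the paper's machine-learning technique.

```python
def fit_time_model(samples):
    """Fit exec_time = a + b / throughput by least squares on x = 1/throughput.
    samples: list of (throughput, exec_time) pairs from profiling runs."""
    xs = [1.0 / tp for tp, _ in samples]
    ys = [t for _, t in samples]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def throughput_bound(a, b, target_time):
    """Invert the model: minimum I/O throughput to finish within target_time."""
    return b / (target_time - a)
```

The storage scheduler can then enforce the derived instantaneous throughput bound and donate any spare capacity to best-effort programs, which is the behavior the abstract describes.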
A cognitive gateway-based spectrum sharing method in downlink round robin scheduling of LTE system
NASA Astrophysics Data System (ADS)
Deng, Hongyu; Wu, Cheng; Wang, Yiming
2017-07-01
A key challenge in LTE is how to allocate the radio spectrum resource efficiently. The traditional Round Robin (RR) scheduling scheme may leave many residual resources unallocated: when the number of users in the current transmission time interval (TTI) does not evenly divide the number of resource block groups (RBGs), and this situation persists for a long time, spectrum utilization is greatly decreased. In this paper, a novel spectrum allocation scheme for a cognitive gateway (CG) is proposed, in which LTE spectrum utilization and the CG's throughput are greatly increased by allocating idle resource blocks in the shared TTI of the LTE system to the CG. Our simulation results show that the spectrum resource sharing method can improve LTE spectral utilization and increase the CG's throughput as well as network use time.
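The residue the scheme exploits is easy to see in a sketch of equal-share RBG allocation: when the user count does not divide the number of RBGs, the leftover groups sit idle in that TTI and can be handed to the cognitive gateway. Function and variable names here are illustrative, not from any 3GPP specification.

```python
def rr_rbg_allocation(n_rbg, n_users):
    """Equal-share round-robin sketch: each user receives n_rbg // n_users
    resource block groups; the remainder is the idle residue that a
    cognitive gateway could claim in the shared TTI."""
    share = n_rbg // n_users
    allocation = {user: share for user in range(n_users)}
    residue = n_rbg - share * n_users
    return allocation, residue
```

With 17 RBGs and 5 users, each user gets 3 RBGs and 2 RBGs are left over every TTI; those residual blocks are what the proposed scheme assigns to the CG instead of leaving them idle.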
Optimizing the Energy and Throughput of a Water-Quality Monitoring System.
Olatinwo, Segun O; Joubert, Trudi-H
2018-04-13
This work presents a new approach to the maximization of energy and throughput in a wireless sensor network (WSN), with the intention of applying the approach to water-quality monitoring. Water-quality monitoring using WSN technology has become an interesting research area. Energy scarcity is a critical issue that plagues the widespread deployment of WSN systems. Different power supplies, harvesting energy from sustainable sources, have been explored. However, when energy-efficient models are not put in place, energy harvesting based WSN systems may experience an unstable energy supply, resulting in an interruption in communication, and low system throughput. To alleviate these problems, this paper presents the joint maximization of the energy harvested by sensor nodes and their information-transmission rate using a sum-throughput technique. A wireless information and power transfer (WIPT) method is considered by harvesting energy from dedicated radio frequency sources. Due to the doubly near-far condition that confronts WIPT systems, a new WIPT system is proposed to improve the fairness of resource utilization in the network. Numerical simulation results are presented to validate the mathematical formulations for the optimization problem, which maximize the energy harvested and the overall throughput rate. Defining the performance metrics of achievable throughput and fairness in resource sharing, the proposed WIPT system outperforms an existing state-of-the-art WIPT system, with the comparison based on numerical simulations of both systems. The improved energy efficiency of the proposed WIPT system contributes to addressing the problem of energy scarcity.
Robotic Patterning a Superhydrophobic Surface for Collective Cell Migration Screening.
Pang, Yonggang; Yang, Jing; Hui, Zhixin; Grottkau, Brian E
2018-04-01
Collective cell migration, in which cells migrate as a group, is fundamental in many biological and pathological processes. There is increasing interest in studying collective cell migration in high throughput. Cell scratching, insertion blockers, and gel-dissolving techniques are some methodologies used previously. However, these methods have the drawbacks of cell damage, substrate surface alteration, limitations in medium exchange, and solvent interference. The superhydrophobic surface, on which the water contact angle is greater than 150 degrees, has been recently utilized to generate patterned arrays. Independent cell culture areas can be generated on a substrate that functions the same as a conventional multiple-well plate. However, so far there has been no report on superhydrophobic patterning for the study of cell migration. In this study, we report on the successful development of a robotically patterned superhydrophobic array for studying collective cell migration in high throughput. The array was developed on a rectangular single-well cell culture plate consisting of hydrophilic flat microwells separated by the superhydrophobic surface. The manufacturing process is robotic and includes patterning discrete protective masks onto the substrate using 3D printing, robotic spray coating of silica nanoparticles, robotic mask removal, robotic mini silicone blocker patterning, automatic cell seeding, and liquid handling. Compared with a standard 96-well plate, our system increases the throughput by 2.25-fold and generates a cell-free area in each well non-destructively. Our system also demonstrates higher efficiency than the conventional way of liquid handling using microwell plates, and shorter processing time than manual operation in migration assays. The superhydrophobic surface had no negative impact on cell viability.
Using our system, we studied the collective migration of human umbilical vein endothelial cells and cancer cells using assays of endpoint quantification, dynamic cell tracking, and migration quantification following varied drug treatments. This system provides a versatile platform to study collective cell migration in high throughput for a broad range of applications.
Verdirame, Maria; Veneziano, Maria; Alfieri, Anna; Di Marco, Annalise; Monteagudo, Edith; Bonelli, Fabio
2010-03-11
Turbulent Flow Chromatography (TFC) is a powerful approach for on-line extraction in bioanalytical studies. It improves sensitivity and reduces sample preparation time, two factors that are of primary importance in drug discovery. In this paper the application of the ARIA system to the analytical support of in vivo pharmacokinetics (PK) and in vitro drug metabolism studies is described, with an emphasis on high-throughput optimization. For PK studies, a comparison between acetonitrile plasma protein precipitation (APPP) and TFC was carried out. Our optimized TFC methodology gave better S/N ratios and a lower limit of quantification (LOQ) than conventional procedures. A robust and high throughput analytical method to support hepatocyte metabolic stability screening of new chemical entities was developed by hyphenation of TFC with mass spectrometry. An in-loop dilution injection procedure was implemented to overcome one of the main issues when using TFC, namely the early elution of hydrophilic compounds, which results in low recoveries. A comparison between off-line solid phase extraction (SPE) and TFC was also carried out, and recovery, sensitivity (LOQ), matrix effect, and robustness were evaluated. The use of two parallel columns in the configuration of the system provided a further increase in throughput. Copyright 2009 Elsevier B.V. All rights reserved.
Embedded Multiprocessor Technology for VHSIC Insertion
NASA Technical Reports Server (NTRS)
Hayes, Paul J.
1990-01-01
Viewgraphs on embedded multiprocessor technology for VHSIC insertion are presented. The objective was to develop multiprocessor system technology providing user-selectable fault tolerance, increased throughput, and ease of application representation for concurrent operation. The approach was to develop graph management mapping theory for proper performance, model multiprocessor performance, and demonstrate performance in selected hardware systems.
78 FR 42527 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-16
... Diabetes. Competitive Advantages: Beneficial metabolic effects of this mouse model include high basal insulin secretion, improved glucose tolerance, increased serum insulin, and resistance to high-fat diet... currently available systems. Potential Commercial Applications: High-throughput protein production...
A high-throughput microRNA expression profiling system.
Guo, Yanwen; Mastriano, Stephen; Lu, Jun
2014-01-01
As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low-cost, high-throughput, high detection specificity, and flexibility to profile both small and large numbers of samples, this protocol can be adapted in a wide range of laboratory settings.
A bioinformatics roadmap for the human vaccines project.
Scheuermann, Richard H; Sinkovits, Robert S; Schenkelberg, Theodore; Koff, Wayne C
2017-06-01
Biomedical research has become a data intensive science in which high throughput experimentation is producing comprehensive data about biological systems at an ever-increasing pace. The Human Vaccines Project is a new public-private partnership, with the goal of accelerating development of improved vaccines and immunotherapies for global infectious diseases and cancers by decoding the human immune system. To achieve its mission, the Project is developing a Bioinformatics Hub as an open-source, multidisciplinary effort with the overarching goal of providing an enabling infrastructure to support the data processing, analysis and knowledge extraction procedures required to translate high throughput, high complexity human immunology research data into biomedical knowledge, to determine the core principles driving specific and durable protective immune responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orton, Daniel J.; Tfaily, Malak M.; Moore, Ronald J.
To better understand disease conditions and environmental perturbations, multi-omic studies (i.e., proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover, as long as washing buffers specific to each molecular analysis were utilized.
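The relationship between flow rate, injection volume, and analytical throughput can be sketched with simple arithmetic. The loop volume and per-sample overhead below are illustrative assumptions, not the system's actual parameters:

```python
# Illustrative sketch: how flow rate and per-sample overhead bound the daily
# throughput of a flow injection analysis (FIA) system. All numbers here are
# assumed for illustration; the abstract reports 24-1200 samples per day.

def fia_throughput_per_day(loop_volume_ul, flow_rate_ul_per_min, overhead_s):
    """Samples/day given injection loop volume, carrier flow rate, and a
    fixed per-sample overhead (autosampler movement, valve switching)."""
    injection_s = 60.0 * loop_volume_ul / flow_rate_ul_per_min
    cycle_s = injection_s + overhead_s
    return 86400.0 / cycle_s

# Assumed high-flow source: 5 uL loop at 500 uL/min with 30 s overhead.
high = fia_throughput_per_day(5.0, 500.0, 30.0)
# Assumed low-flow (nano-ESI-like) source: same loop at 1 uL/min.
low = fia_throughput_per_day(5.0, 1.0, 30.0)
```

At low flow rates the injection itself dominates the cycle, which is why a system spanning nL/min to µL/min regimes covers such a wide throughput range.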
A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.
Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham
2017-08-01
Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system built from network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
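The abstract does not give the image analysis details. A standard way to compute the orientation of a segmented elongated blob, such as a fish silhouette, is via second-order central image moments; a minimal sketch (not the authors' pipeline), assuming a binary mask as input:

```python
import numpy as np

# Sketch (not the authors' code): estimate a zebrafish body-axis angle from a
# binary silhouette using second-order central image moments, the standard
# way to recover the orientation of an elongated blob.

def orientation_deg(mask):
    """Angle of the major axis of a binary mask, in degrees, in (-90, 90]."""
    ys, xs = np.nonzero(mask)
    x, y = xs - xs.mean(), ys - ys.mean()
    mu20, mu02, mu11 = (x * x).mean(), (y * y).mean(), (x * y).mean()
    return np.degrees(0.5 * np.arctan2(2 * mu11, mu20 - mu02))

# Synthetic elongated blob aligned with the x-axis (expected angle ~ 0 deg).
mask = np.zeros((50, 100), dtype=bool)
mask[23:27, 10:90] = True
angle = orientation_deg(mask)

# A population-level rheotaxis score could then count fish whose body axis is
# within some tolerance of the flow direction (tolerance here is assumed).
aligned = abs(angle) < 15.0
```

Running this per detected fish per video frame yields the fraction of the test population exhibiting rheotaxis at each time point.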
The effects of a physician slowdown on emergency department volume and treatment.
Walsh, Brian; Eskin, Barnet; Allegra, John; Rothman, Jonathan; Junker, Elizabeth
2006-11-01
In February 2003, many physicians in New Jersey participated in a work slowdown to publicize large increases in malpractice premiums and generate support for legislative reform. It was anticipated that the community physician slowdown (hereafter referred to as "slowdown") would increase emergency department (ED) visits. The authors' goal was to help others prepare for anticipated increases in ED volumes by describing the preparatory staffing changes made and quantifying increases in ED volume. This was a retrospective cohort study performed at a New Jersey suburban teaching hospital with 70,000 annual visits. Consecutive patients seen by emergency physicians were enrolled. The authors extracted patient visit data from the computerized tracking system and analyzed hours worked by personnel, patient volumes, admission rates, and patient throughput times. Variables from each day of the slowdown with baseline values for the same day of the week for the four weeks before and after the slowdown were compared. A Bonferroni correction was used, with p < 0.01 considered statistically significant. Total patient volume increased 79% from baseline (95% confidence interval [CI] = 20% to 137%). Pediatric volume increased 223% (95% CI = 171% to 274%). Overall admission rate decreased 29% compared with baseline (95% CI = 8% to 51%). Patient throughput times did not change significantly. Similar results for these variables were found for the second through fourth days of the slowdown. Emergency department visits, especially pediatric visits, increased markedly during the community physician slowdown. Anticipatory increases in staffing effectively prevented increased throughput times.
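The percent-change figures above follow from simple before/after arithmetic. As a hedged sketch (the daily visit counts below are hypothetical, not the study's data, and daily censuses are treated as approximately normal across baseline days), the comparison of one slowdown day against same-weekday baselines might look like:

```python
import math

# Sketch of the percent-increase-with-95%-CI arithmetic behind figures like
# "volume increased 79% (95% CI = 20% to 137%)". Counts are hypothetical.

baseline_days = [180, 195, 170, 188, 176, 182, 191, 174]  # same weekday, +/- 4 weeks
slowdown_day = 320

mean_b = sum(baseline_days) / len(baseline_days)
sd_b = math.sqrt(sum((d - mean_b) ** 2 for d in baseline_days) / (len(baseline_days) - 1))

pct_increase = 100.0 * (slowdown_day - mean_b) / mean_b
# Rough 95% interval for a single day's percent deviation from baseline.
half_width = 100.0 * 1.96 * sd_b / mean_b
lo, hi = pct_increase - half_width, pct_increase + half_width

# With 5 slowdown days compared, a Bonferroni correction tests each day at
# alpha = 0.05 / 5 = 0.01, matching the paper's p < 0.01 threshold.
alpha_per_test = 0.05 / 5
```

The Bonferroni division is what turns the conventional 0.05 significance level into the p < 0.01 criterion quoted in the abstract.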
Next generation platforms for high-throughput biodosimetry
Repin, Mikhail; Turner, Helen C.; Garty, Guy; Brenner, David J.
2014-01-01
Here we describe the general concept of the combined use of plates and tubes in racks compatible with the American National Standards Institute/Society for Laboratory Automation and Screening (ANSI/SLAS) microplate formats as next-generation platforms for increasing the throughput of biodosimetry assays. These platforms can be used at different stages of biodosimetry assays, starting from blood collection into microtubes organised in standardised racks and ending with the cytogenetic analysis of samples in standardised multiwell and multichannel plates. Robotically friendly platforms can be used for different biodosimetry assays in minimally equipped laboratories and on cost-effective automated universal biotech systems. PMID:24837249
Microengineering methods for cell-based microarrays and high-throughput drug-screening applications.
Xu, Feng; Wu, JinHui; Wang, ShuQi; Durmus, Naside Gozde; Gurkan, Umut Atakan; Demirci, Utkan
2011-09-01
Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often raises ethical concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. In addition, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal and spatial control and versatility.
Shih, Tsung-Ting; Hsieh, Cheng-Chuan; Luo, Yu-Ting; Su, Yi-An; Chen, Ping-Hung; Chuang, Yu-Chen; Sun, Yuh-Chang
2016-04-15
Herein, a hyphenated system combining a high-throughput solid-phase extraction (htSPE) microchip with inductively coupled plasma-mass spectrometry (ICP-MS) for rapid determination of trace heavy metals was developed. Rather than performing multiple analyses in parallel for the enhancement of analytical throughput, we improved the processing speed for individual samples by increasing the operation flow rate during SPE procedures. To this end, an innovative device combining a micromixer and a multi-channeled extraction unit was designed. Furthermore, a programmable valve manifold was used to interface the developed microchip and ICP-MS instrumentation in order to fully automate the system, leading to a dramatic reduction in operation time and human error. Under the optimized operation conditions for the established system, detection limits of 1.64-42.54 ng L⁻¹ for the analyte ions were achieved. Validation procedures demonstrated that the developed method could be satisfactorily applied to the determination of trace heavy metals in natural water. Each analysis could be readily accomplished within just 186 s using the established system. This represents, to the best of our knowledge, an unprecedented speed for the analysis of trace heavy metal ions. Copyright © 2016 Elsevier B.V. All rights reserved.
Bae, Seunghee; An, In-Sook; An, Sungkwan
2015-09-01
Ultraviolet (UV) radiation is a major inducer of skin aging and accumulated exposure to UV radiation increases DNA damage in skin cells, including dermal fibroblasts. In the present study, we developed a novel DNA repair regulating material discovery (DREAM) system for the high-throughput screening and identification of putative materials regulating DNA repair in skin cells. First, we established a modified lentivirus expressing the luciferase and hypoxanthine phosphoribosyl transferase (HPRT) genes. Then, human dermal fibroblast WS-1 cells were infected with the modified lentivirus and selected with puromycin to establish cells that stably expressed luciferase and HPRT (DREAM-F cells). The first step in the DREAM protocol was a 96-well-based screening procedure, involving the analysis of cell viability and luciferase activity after pretreatment of DREAM-F cells with reagents of interest and post-treatment with UVB radiation, and vice versa. In the second step, we validated certain effective reagents identified in the first step by analyzing the cell cycle, evaluating cell death, and performing HPRT-DNA sequencing in DREAM-F cells treated with these reagents and UVB. This DREAM system is scalable and forms a time-saving high-throughput screening system for identifying novel anti-photoaging reagents regulating DNA damage in dermal fibroblasts.
Repeated Transmissions In Mobile/Satellite Communications
NASA Technical Reports Server (NTRS)
Yan, Tsun-Yee; Clare, Loren P.
1988-01-01
Repetition increases throughput and decreases delay. Paper discusses theoretical performance of communication system for land-mobile stations with satellite relay using ALOHA random-access protocol modified for repeated transmissions. Methods and conclusions contribute to general understanding of packet communications in fading channels.
Suzuki, Kazumichi; Palmer, Matthew B; Sahoo, Narayan; Zhang, Xiaodong; Poenisch, Falk; Mackin, Dennis S; Liu, Amy Y; Wu, Richard; Zhu, X Ronald; Frank, Steven J; Gillin, Michael T; Lee, Andrew K
2016-07-01
To determine the patient throughput and the overall efficiency of the spot scanning system by analyzing treatment time, equipment availability, and maximum daily capacity for the current spot scanning port at Proton Therapy Center Houston and to assess the daily throughput capacity for a hypothetical spot scanning proton therapy center. At their proton therapy center, the authors have been recording in an electronic medical record system all treatment data, including disease site, number of fields, number of fractions, delivered dose, energy, range, number of spots, and number of layers for every treatment field. The authors analyzed delivery system downtimes that had been recorded for every equipment failure and associated incidents. These data were used to evaluate the patient census, patient distribution as a function of the number of fields and total target volume, and equipment clinical availability. The duration of each treatment session from patient walk-in to patient walk-out of the spot scanning treatment room was measured for 64 patients with head and neck, central nervous system, thoracic, and genitourinary cancers. The authors retrieved data for total target volume and the numbers of layers and spots for all fields from treatment plans for a total of 271 patients (including the above 64 patients). A sensitivity analysis of daily throughput capacity was performed by varying seven parameters in a throughput capacity model. The mean monthly equipment clinical availability for the spot scanning port in April 2012-March 2015 was 98.5%. Approximately 1500 patients had received spot scanning proton therapy as of March 2015. The major disease sites treated in September 2012-August 2014 were the genitourinary system (34%), head and neck (30%), central nervous system (21%), and thorax (14%), with other sites accounting for the remaining 1%. 
Spot scanning beam delivery time increased with total target volume and accounted for approximately 30%-40% of total treatment time for the total target volumes exceeding 200 cm(3), which was the case for more than 80% of the patients in this study. When total treatment time was modeled as a function of the number of fields and total target volume, the model overestimated total treatment time by 12% on average, with a standard deviation of 32%. A sensitivity analysis of throughput capacity for a hypothetical four-room spot scanning proton therapy center identified several priority items for improvements in throughput capacity, including operation time, beam delivery time, and patient immobilization and setup time. The spot scanning port at our proton therapy center has operated at a high performance level and has been used to treat a large number of complex cases. Further improvements in efficiency may be feasible in the areas of facility operation, beam delivery, patient immobilization and setup, and optimization of treatment scheduling.
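The abstract does not specify the seven-parameter throughput capacity model. As a hedged sketch under assumed parameters (rooms, operating hours, and per-patient setup, beam, and other times are all illustrative), a one-at-a-time sensitivity analysis could look like:

```python
# Sketch of a daily throughput-capacity model for a multi-room spot scanning
# center. All parameter values are illustrative assumptions, not the paper's
# seven-parameter model.

def daily_capacity(rooms, hours_per_day, setup_min, beam_min, other_min):
    """Patients per day when each session needs setup + beam + other minutes."""
    slot_min = setup_min + beam_min + other_min
    return rooms * (hours_per_day * 60.0) // slot_min

base = daily_capacity(rooms=4, hours_per_day=16, setup_min=10, beam_min=8, other_min=6)

# One-at-a-time sensitivity: change a single parameter and compare to base.
faster_setup = daily_capacity(4, 16, 8, 8, 6)   # setup reduced by 2 min/patient
longer_day = daily_capacity(4, 17, 10, 8, 6)    # one extra operating hour
```

Comparing such one-parameter variants is what identifies operation time, beam delivery time, and immobilization/setup time as the highest-leverage improvements.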
High-Throughput Incubation and Quantification of Agglutination Assays in a Microfluidic System.
Castro, David; Conchouso, David; Kodzius, Rimantas; Arevalo, Arpys; Foulds, Ian G
2018-06-04
In this paper, we present a two-phase microfluidic system capable of incubating and quantifying microbead-based agglutination assays. The microfluidic system is based on a simple fabrication solution, which requires only laboratory tubing filled with carrier oil, driven by negative pressure using a syringe pump. We provide a user-friendly interface, in which a pipette is used to insert single droplets of a 1.25-µL volume into a system that is continuously running and therefore works entirely on demand, without the need for stopping, resetting or washing the system. These assays are incubated by highly efficient passive mixing with a sample-to-answer time of 2.5 min, a 5-10-fold improvement over traditional agglutination assays. We study system parameters such as channel length, incubation time and flow speed to select optimal assay conditions, using the streptavidin-biotin interaction as a model analyte quantified using optical image processing. We then investigate the effect of varying both the analyte and microbead concentrations, with a minimum detection limit of 100 ng/mL. The system can be both low- and high-throughput, depending on the rate at which assays are inserted. In our experiments, we were able to easily produce throughputs of 360 assays per hour by simple manual pipetting, which could be increased even further by automation and parallelization. Agglutination assays are a versatile tool, capable of detecting an ever-growing catalog of infectious diseases, proteins and metabolites. A system such as this one is a step towards high-throughput microfluidic diagnostic solutions with widespread adoption. The development of analytical techniques in the microfluidic format, such as the one presented in this work, is an important step towards being able to continuously monitor the performance and microfluidic outputs of organ-on-chip devices.
Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard
2015-02-09
Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays, a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library consisting of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology, we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram-per-liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which, in combination with random forests analysis, was able to predict individual analyte concentrations with overall root-mean-square errors ranging between 8 and 17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications for future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening to improve the performance of gold nanoparticle based chemiresistors for both new and existing applications.
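The paper's regression step uses random forests; as a simplified stand-in on synthetic data (all numbers and the linear sensor model are assumptions), a least-squares linear calibration from array responses to concentration illustrates the same train/predict/RMSE workflow:

```python
import numpy as np

# Simplified stand-in for the paper's analysis: the authors use random
# forests; here a plain least-squares linear calibration maps chemiresistor
# array responses to analyte concentration and reports RMSE on held-out
# samples. All data below are synthetic.

rng = np.random.default_rng(0)

n_samples, n_sensors = 60, 8
true_conc = rng.uniform(0, 100, n_samples)        # ug/L, e.g. toluene
sensitivity = rng.uniform(0.5, 2.0, n_sensors)    # assumed per-sensor slope
responses = np.outer(true_conc, sensitivity)
responses += rng.normal(0, 5.0, responses.shape)  # sensor noise

# Fit conc ~ responses (plus intercept) by least squares on a training split.
X = np.hstack([responses, np.ones((n_samples, 1))])
train, test = slice(0, 40), slice(40, None)
coef, *_ = np.linalg.lstsq(X[train], true_conc[train], rcond=None)

pred = X[test] @ coef
rmse = float(np.sqrt(np.mean((pred - true_conc[test]) ** 2)))
```

Random forests replace the linear map with an ensemble of regression trees, which handles the nonlinear, cross-sensitive responses of real chemiresistors; the evaluation metric (held-out RMSE in µg/L) is the same.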
Brito Palma, Bernardo; Fisher, Charles W; Rueff, José; Kranendonk, Michel
2016-05-16
The formation of reactive metabolites through biotransformation is the suspected cause of many adverse drug reactions. Testing for the propensity of a drug to form reactive metabolites has increasingly become an integral part of lead-optimization strategy in drug discovery. DNA reactivity is one undesirable facet of a drug or its metabolites and can lead to increased risk of cancer and reproductive toxicity. Many drugs are metabolized by cytochromes P450 in the liver and other tissues, and these reactions can generate hard electrophiles. These hard electrophilic reactive metabolites may react with DNA and may be detected in standard in vitro genotoxicity assays; however, the majority of these assays fall short due to the use of animal-derived organ extracts that inadequately represent human metabolism. The current study describes the development of bacterial systems that efficiently detect DNA-damaging electrophilic reactive metabolites generated by human P450 biotransformation. These assays use a GFP reporter system that detects DNA damage through induction of the SOS response and a GFP reporter to control for cytotoxicity. Two human CYP1A2-competent prototypes presented here have appropriate characteristics for the detection of DNA-damaging reactive metabolites in a high-throughput manner. The advantages of this approach include a short assay time (120-180 min) with real-time measurement, sensitivity to small amounts of compound, and adaptability to a microplate format. These systems are suitable for high-throughput assays and can serve as prototypes for the development of future enhanced versions.
An extended smart utilization medium access control (ESU-MAC) protocol for ad hoc wireless systems
NASA Astrophysics Data System (ADS)
Vashishtha, Jyoti; Sinha, Aakash
2006-05-01
The demand for spontaneous setup of a wireless communication system has increased in recent years for areas like battlefields, disaster relief operations etc., where a pre-deployed network infrastructure is difficult to establish or unavailable. A mobile ad-hoc network (MANET) is a promising solution, but poses many challenges for all the design layers, specifically the medium access control (MAC) layer. Recent works have used the concepts of multi-channel operation and power control in designing MAC layer protocols. SU-MAC, developed by the same authors, efficiently uses the 'available' data and control bandwidth to send control information and results in increased throughput by decreasing contention on the control channel. However, the SU-MAC protocol was limited to static ad-hoc networks and also faced the busy-receiver node problem. We present the Extended SU-MAC (ESU-MAC) protocol, which works with mobile nodes. Also, we significantly improve the scheme of control information exchange in ESU-MAC to overcome the busy-receiver node problem and thus further avoid blockage of the control channel for longer periods of time. A power control scheme is used as before to reduce interference and to effectively re-use the available bandwidth. Simulation results show that the ESU-MAC protocol is promising for mobile ad-hoc networks in terms of reduced contention at the control channel and improved throughput because of channel re-use. Results show a considerable increase in throughput compared to SU-MAC, which can be attributed to increased accessibility of the control channel and improved utilization of data channels due to the superior control information exchange scheme.
A rapid enzymatic assay for high-throughput screening of adenosine-producing strains
Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei
2015-01-01
Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutrition components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
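The quantitation arithmetic for such an endpoint assay, interpolating concentration from a standard curve and checking spike recovery, can be sketched as follows (all absorbance values are invented for illustration, not the paper's data):

```python
# Sketch of the quantitation arithmetic for an endpoint enzymatic assay:
# ADA converts adenosine to inosine + NH3, the NH3 is read out
# colorimetrically (indophenol), and concentration is interpolated from a
# standard curve. Absorbance values below are made up for illustration.

def fit_line(xs, ys):
    """Least-squares slope and intercept for a standard curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

standards_mm = [0.0, 0.5, 1.0, 2.0, 4.0]       # adenosine standards, mM
absorbance = [0.02, 0.27, 0.52, 1.01, 2.03]    # indophenol readout

slope, intercept = fit_line(standards_mm, absorbance)

def conc_from_abs(a):
    return (a - intercept) / slope

sample = conc_from_abs(0.77)  # unknown fermentation-broth well

# Spike-recovery check: the same well spiked with 1.0 mM extra adenosine
# should read back close to 100% of the added amount.
recovery_pct = 100.0 * (conc_from_abs(1.27) - sample) / 1.0
```

Run per well across a 96-well plate, this is the whole data-reduction step; recovery near 100% is what justifies skipping deproteinization of the broth.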
Lu, Zhi-Yan; Guo, Xiao-Jue; Li, Hui; Huang, Zhong-Zi; Lin, Kuang-Fei; Liu, Yong-Di
2015-01-01
A high-throughput screening system for moderately halophilic phenol-degrading bacteria from various habitats was developed to replace conventional strain screening owing to its high efficiency. Bacterial enrichments were cultivated in 48 deep well microplates instead of shake flasks or tubes. Measurement of phenol concentrations was performed in 96-well microplates instead of using the conventional spectrophotometric method or high-performance liquid chromatography (HPLC). The high-throughput screening system was used to cultivate forty-three bacterial enrichments and yielded a halophilic bacterial community, E3, with the best phenol-degrading capability. Halomonas sp. strain 4-5 was isolated from the E3 community. Strain 4-5 was able to degrade more than 94% of the phenol (500 mg·L⁻¹ starting concentration) over a range of 3%–10% NaCl. Additionally, the strain accumulated the compatible solute, ectoine, with increasing salt concentrations. PCR detection of the functional genes suggested that the largest subunit of multicomponent phenol hydroxylase (LmPH) and catechol 1,2-dioxygenase (C12O) were active in the phenol degradation process. PMID:26020478
High-throughput diagnosis of potato cyst nematodes in soil samples.
Reid, Alex; Evans, Fiona; Mulholland, Vincent; Cole, Yvonne; Pickup, Jon
2015-01-01
Potato cyst nematode (PCN) is a damaging soilborne pest of potatoes which can cause major crop losses. In 2010, a new European Union directive (2007/33/EC) on the control of PCN came into force. Under the new directive, seed potatoes can only be planted on land which has been found to be free from PCN infestation following an official soil test. A major consequence of the new directive was the introduction of a new harmonized soil sampling rate resulting in a threefold increase in the number of samples requiring testing. To manage this increase with the same staffing resources, we have replaced the traditional diagnostic methods. A system has been developed for the processing of soil samples, extraction of DNA from float material, and detection of PCN by high-throughput real-time PCR. Approximately 17,000 samples are analyzed each year using this method. This chapter describes the high-throughput processes for the production of float material from soil samples, DNA extraction from the entire float, and subsequent detection and identification of PCN within these samples.
Modified Pressure System for Imaging Egg Cracks
USDA-ARS?s Scientific Manuscript database
One aspect of grading table eggs is shell checks or cracks. Currently, USDA voluntary regulations require that humans grade a representative sample of all eggs processed. However, as processing plants and packing facilities continue to increase their volume and throughput, human graders are having ...
White, David T; Eroglu, Arife Unal; Wang, Guohua; Zhang, Liyun; Sengupta, Sumitra; Ding, Ding; Rajpurohit, Surendra K; Walker, Steven L; Ji, Hongkai; Qian, Jiang; Mumm, Jeff S
2017-01-01
The zebrafish has emerged as an important model for whole-organism small-molecule screening. However, most zebrafish-based chemical screens have achieved only mid-throughput rates. Here we describe a versatile whole-organism drug discovery platform that can achieve true high-throughput screening (HTS) capacities. This system combines our automated reporter quantification in vivo (ARQiv) system with customized robotics, and is termed ‘ARQiv-HTS’. We detail the process of establishing and implementing ARQiv-HTS: (i) assay design and optimization, (ii) calculation of sample size and hit criteria, (iii) large-scale egg production, (iv) automated compound titration, (v) dispensing of embryos into microtiter plates, and (vi) reporter quantification. We also outline what we see as best practice strategies for leveraging the power of ARQiv-HTS for zebrafish-based drug discovery, and address technical challenges of applying zebrafish to large-scale chemical screens. Finally, we provide a detailed protocol for a recently completed inaugural ARQiv-HTS effort, which involved the identification of compounds that elevate insulin reporter activity. Compounds that increased the number of insulin-producing pancreatic beta cells represent potential new therapeutics for diabetic patients. For this effort, individual screening sessions took 1 week to conclude, and sessions were performed iteratively approximately every other day to increase throughput. At the conclusion of the screen, more than a half million drug-treated larvae had been evaluated. Beyond this initial example, however, the ARQiv-HTS platform is adaptable to almost any reporter-based assay designed to evaluate the effects of chemical compounds in living small-animal models. ARQiv-HTS thus enables large-scale whole-organism drug discovery for a variety of model species and from numerous disease-oriented perspectives. PMID:27831568
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
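The transparent splitting of tasks into smaller parts can be illustrated with a minimal sketch (not Condor-COPASI's actual code): a large set of independent simulation runs is divided into contiguous chunks, one per Condor job, and the chunks together cover every run exactly once.

```python
def split_runs(total_runs, chunk_size):
    """Split independent simulation runs into contiguous (start, end) chunks,
    one chunk per high-throughput-computing job."""
    return [(start, min(start + chunk_size, total_runs))
            for start in range(0, total_runs, chunk_size)]

# e.g. 1000 stochastic simulation repeats, 96 runs per Condor job
jobs = split_runs(1000, 96)
```

Each `(start, end)` pair would be passed to one job; results are gathered and merged once all jobs finish.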
Measurements of file transfer rates over dedicated long-haul connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S; Settlemyer, Bradley W; Imam, Neena
2016-01-01
Wide-area file transfers are an integral part of several High-Performance Computing (HPC) scenarios. Dedicated network connections with high capacity, low loss rate and low competing traffic are increasingly being provisioned over current HPC infrastructures to support such transfers. To gain insights into these file transfers, we collected transfer rate measurements for Lustre and xfs file systems between dedicated multi-core servers over emulated 10 Gbps connections with round trip times (rtt) in the 0-366 ms range. Memory transfer throughput over these connections is measured using iperf, and file IO throughput on host systems is measured using xddprof. We consider two file system configurations: Lustre over IB network and xfs over SSD connected to PCI bus. Files are transferred using xdd across these connections, and the transfer rates are measured, which indicate the need to jointly optimize the connection and host file IO parameters to achieve peak transfer rates. In particular, these measurements indicate that (i) peak file transfer rate is lower than peak connection and host IO throughput, in some cases by as much as 50%, (ii) xdd request sizes that achieve peak throughput for host file IO do not necessarily lead to peak file transfer rates, and (iii) parallelism in host IO and TCP transport does not always improve the file transfer rates.
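Finding (i), that peak file transfer rate falls below peak connection throughput, partly reflects the classic window/RTT bound on a single TCP stream. A minimal sketch of that bound (illustrative numbers, not the paper's measurements):

```python
def tcp_throughput_limit_gbps(window_bytes, rtt_ms):
    """Single-stream TCP throughput is bounded by window / RTT: at most one
    window of unacknowledged data can be in flight per round trip."""
    if rtt_ms == 0:
        return float("inf")
    return window_bytes * 8 / (rtt_ms / 1000.0) / 1e9

# a 64 MiB window over an emulated 366 ms round trip (the range's far end)
limit = tcp_throughput_limit_gbps(64 * 2**20, 366)
```

At 366 ms a 64 MiB window caps a single stream well below a 10 Gbps link, which is one reason connection and host IO parameters must be tuned jointly on long-haul paths.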
[Weighted gene co-expression network analysis in biomedicine research].
Liu, Wei; Li, Li; Ye, Hua; Tu, Wei
2017-11-25
High-throughput biological technologies are now widely applied in biology and medicine, allowing scientists to monitor thousands of parameters simultaneously in a given sample. However, it is still an enormous challenge to mine useful information from high-throughput data. The emergence of network biology provides deeper insights into complex bio-systems and reveals the modularity in tissue/cellular networks. Correlation networks are increasingly used in bioinformatics applications. The weighted gene co-expression network analysis (WGCNA) tool can detect clusters of highly correlated genes. Therefore, we systematically reviewed the application of WGCNA in the study of disease diagnosis, pathogenesis and other related fields. First, we introduce the principle, workflow, advantages and disadvantages of WGCNA. Second, we present the application of WGCNA in disease, physiology, drug, evolution and genome annotation studies. Then, we describe the application of WGCNA in newly developed high-throughput methods. We hope this review will help to promote the application of WGCNA in biomedicine research.
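The core construction behind WGCNA, a soft-thresholded correlation adjacency, can be sketched in a few lines (a toy illustration with invented expression profiles, not the WGCNA R package):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def wgcna_adjacency(expr, beta=6):
    """Weighted (soft-threshold) adjacency a_ij = |cor(i, j)|**beta; raising
    to the power beta suppresses weak correlations while keeping the network
    weighted rather than hard-thresholded."""
    genes = list(expr)
    return {(gi, gj): abs(pearson(expr[gi], expr[gj])) ** beta
            for gi in genes for gj in genes if gi != gj}

expr = {  # toy expression profiles over 4 samples
    "geneA": [1.0, 2.0, 3.0, 4.0],
    "geneB": [2.1, 3.9, 6.2, 7.8],  # tracks geneA closely
    "geneC": [5.0, 1.0, 4.0, 2.0],  # unrelated
}
adj = wgcna_adjacency(expr)
```

Modules (clusters of co-expressed genes) are then found by hierarchical clustering over a dissimilarity derived from this adjacency.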
Russo, Marina; Dugo, Paola; Marzocco, Stefania; Inferrera, Veronica; Mondello, Luigi
2015-12-01
Important objectives of a high-performance liquid chromatography preparative process are purity of the isolated products, yield, and throughput. The multidimensional preparative liquid chromatography method used in this work was developed mainly to increase throughput; moreover, purity and yield are increased thanks to the automated collection of the molecules based on the intensity of a signal generated from the mass spectrometer detector, so that only a specific product is targeted. In a few analyses in both the first and second dimensions, this preparative system allowed the isolation of eight pure compounds present at very different concentrations in the original sample, with high purity (>95%) and yield, showing how efficient and versatile the system is. The pure molecules were used to validate the analytical method and to test the anti-inflammatory and antiproliferative potential of flavonoids. The simultaneous presence in bergamot juice of all the flavonoids together increases the anti-inflammatory effect with respect to each single compound alone. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Controlling high-throughput manufacturing at the nano-scale
NASA Astrophysics Data System (ADS)
Cooper, Khershed P.
2013-09-01
Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications: materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high-quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes, guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper discusses these ideas in the context of controlling high-throughput manufacturing at the nano-scale.
An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films
NASA Astrophysics Data System (ADS)
Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander
Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility and made available to users via a web portal that facilitates highly parallelized analysis.
Multiplexed high resolution soft x-ray RIXS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chuang, Y.-D.; Voronov, D.; Warwick, T.
2016-07-27
High-resolution Resonant Inelastic X-ray Scattering (RIXS) is a technique that allows us to probe the electronic excitations of complex materials with unprecedented precision. However, the RIXS process has a low cross section, compounded by the fact that the optical spectrometers used to analyze the scattered photons can only collect a small solid angle and overall have a small efficiency. Here we present a method to significantly increase the throughput of RIXS systems by energy multiplexing, so that a complete RIXS map of scattered intensity versus photon energy in and photon energy out can be recorded simultaneously [1]. This parallel acquisition scheme should provide a gain in throughput of over 100. A system based on this principle, QERLIN, is under construction at the Advanced Light Source (ALS).
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2004-09-01
To increase the security and throughput of ISO container traffic through international terminals, more technology must be applied to the problem. A transnational central archive of inspection records is discussed that can be accessed by national agencies as ISO containers approach their borders. The intent is to improve the throughput and security of the cargo inspection process. A review of currently available digital media archiving technologies is presented, along with their possible application to the tracking of international ISO container shipments. Specific image formats employed by current x-ray inspection systems are discussed. Sample x-ray data from systems in use today are shown that could be entered into such a system. Data from other inspection technologies are shown to be easily integrated, as is the creation of database records suitable for interfacing with other computer systems. Overall system performance requirements are discussed in terms of security, response time and capacity. Suggestions for pilot projects based on existing border inspection processes are also made.
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
Wake Vortex Systems Cost/Benefits Analysis
NASA Technical Reports Server (NTRS)
Crisp, Vicki K.
1997-01-01
The goals of cost/benefit assessments are to provide quantitative and qualitative data to aid in the decision-making process. Benefits derived from increased throughput (or decreased delays) are used to balance life-cycle costs. Packaging technologies together may provide greater gains and demonstrate a higher return on investment.
NASA Astrophysics Data System (ADS)
Ohene-Kwofie, Daniel; Otoo, Ekow
2015-10-01
The ATLAS detector, operated at the Large Hadron Collider (LHC) at CERN, records proton-proton collisions every 50 ns, resulting in a sustained data flow of up to PB/s. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPU/GPGPU systems assembled for high-performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem then is whether one can implement an I/O subsystem infrastructure capable of meeting the computational speeds of the advanced computing systems at the petascale and exascale level. We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data store for the Processing Unit (PU) of the upgraded electronics of the Tile Calorimeter, which is proposed to be used as a high-throughput general-purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memory of the PUs is aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput.
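The PGAS idea, that every processing unit can compute where in the aggregated memory a datum lives without consulting a central directory, can be sketched as a deterministic mapping from a global index to a (node, local offset) pair (a simplified block distribution; the class name and sizes are hypothetical):

```python
class PGASStore:
    """Sketch of a partitioned global address space: each node contributes
    a fixed-size partition, and a global index maps deterministically to
    (node, local offset), so remote data can be fetched directly via RDMA."""

    def __init__(self, num_nodes, node_capacity):
        self.num_nodes = num_nodes
        self.node_capacity = node_capacity

    def locate(self, global_index):
        node = global_index // self.node_capacity  # block distribution
        if node >= self.num_nodes:
            raise IndexError("address beyond aggregated memory")
        return node, global_index % self.node_capacity

store = PGASStore(num_nodes=4, node_capacity=1024)
```

Because the mapping is pure arithmetic, any PU can issue a one-sided RDMA read or write to the owning node without coordination, which is what makes the aggregated store viable at high data rates.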
Bernstock, Joshua D; Lee, Yang-ja; Peruzzotti-Jametti, Luca; Southall, Noel; Johnson, Kory R; Maric, Dragan; Volpe, Giulio; Kouznetsova, Jennifer; Zheng, Wei; Pluchino, Stefano
2015-01-01
The conjugation/de-conjugation of Small Ubiquitin-like Modifier (SUMO) has been shown to be associated with a diverse set of physiologic/pathologic conditions. The clinical significance and ostensible therapeutic utility offered via the selective control of the global SUMOylation process has become readily apparent in ischemic pathophysiology. Herein, we describe the development of a novel quantitative high-throughput screening (qHTS) system designed to identify small molecules capable of increasing SUMOylation via the regulation/inhibition of members of the microRNA (miRNA)-182 family. This assay employs a SHSY5Y human neuroblastoma cell line stably transfected with a dual firefly-Renilla luciferase reporter system for identification of specific inhibitors of either miR-182 or miR-183. In this study, we have identified small molecules capable of inducing increased global conjugation of SUMO in both SHSY5Y cells and rat E18-derived primary cortical neurons. The protective effects of a number of the identified compounds were confirmed via an in vitro ischemic model (oxygen/glucose deprivation). Of note, this assay can be easily repurposed to allow high-throughput analyses of the potential druggability of other relevant miRNAs in ischemic pathobiology. PMID:26661196
High throughput screening of CO2-tolerating microalgae using GasPak bags
2013-01-01
Background Microalgae are diverse in terms of their speciation and function. More than 35,000 algal strains have been described, and thousands of algal cultures are maintained in different culture collection centers. The ability of CO2 uptake by microalgae varies dramatically among algal species. It becomes challenging to select suitable algal candidates that can proliferate under high CO2 concentration from a large collection of algal cultures. Results Here, we describe a high throughput screening method to rapidly identify microalgae with high CO2 affinity. The system integrates a CO2 mixer, GasPak bags and microplates. Microalgae on the microplates are cultivated in GasPak bags charged with different CO2 concentrations. Using this method, we identified 17 algal strains whose growth rates were not influenced when the concentration of CO2 was increased from 2 to 20% (v/v). Most CO2-tolerant strains identified in this study were closely related to the genera Scenedesmus and Chlorococcum. One of the Scenedesmus strains (E7A) has been successfully tested in scale-up photobioreactors (500 L) bubbled with flue gas containing 10-12% CO2. Conclusion Our high throughput CO2 testing system provides a rapid and reliable way to identify microalgal candidate strains that can grow under high CO2 conditions from a large pool of culture collection species. This high throughput system can also be modified for selecting algal strains that can tolerate other gases, such as NOx, SOx, or flue gas. PMID:24341988
HTP-NLP: A New NLP System for High Throughput Phenotyping.
Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L
2017-01-01
Secondary use of clinical data for research requires a method to process the data quickly so that researchers can rapidly extract cohorts. We present two advances in the High Throughput Phenotyping NLP system that support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.
Advanced continuous cultivation methods for systems microbiology.
Adamberg, Kaarel; Valgepea, Kaspar; Vilu, Raivo
2015-09-01
Increasing the throughput of systems biology-based experimental characterization of in silico-designed strains has great potential for accelerating the development of cell factories. For this, analysis of metabolism in the steady state is essential as only this enables the unequivocal definition of the physiological state of cells, which is needed for the complete description and in silico reconstruction of their phenotypes. In this review, we show that for a systems microbiology approach, high-resolution characterization of metabolism in the steady state--growth space analysis (GSA)--can be achieved by using advanced continuous cultivation methods termed changestats. In changestats, an environmental parameter is continuously changed at a constant rate within one experiment whilst maintaining cells in the physiological steady state similar to chemostats. This increases the resolution and throughput of GSA compared with chemostats, and, moreover, enables following of the dynamics of metabolism and detection of metabolic switch-points and optimal growth conditions. We also describe the concept, challenge and necessary criteria of the systematic analysis of steady-state metabolism. Finally, we propose that such systematic characterization of the steady-state growth space of cells using changestats has value not only for fundamental studies of metabolism, but also for systems biology-based metabolic engineering of cell factories.
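The changestat concept, ramping one environmental parameter at a constant rate within a single experiment so that one run sweeps a line through the growth space instead of one chemostat point per run, can be sketched as follows (parameter values are hypothetical):

```python
def changestat_profile(start, rate, duration_h, step_h=1.0):
    """Sketch of a changestat set-point profile: one environmental parameter
    (e.g. dilution rate or pH) is changed at a constant rate, slowly enough
    that cells remain in a quasi-steady state at every sampling point."""
    t = 0.0
    profile = []
    while t <= duration_h:
        profile.append((t, start + rate * t))
        t += step_h
    return profile

# ramp a dilution rate from 0.1 1/h upwards at 0.01 1/h per hour for 10 h
ramp = changestat_profile(start=0.1, rate=0.01, duration_h=10)
```

Each sampled point on the ramp contributes one quasi-steady-state measurement, which is how a changestat raises the resolution of growth space analysis relative to running a separate chemostat per condition.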
Reflective optical imaging system
Shafer, David R.
2000-01-01
An optical system compatible with short wavelength (extreme ultraviolet) radiation comprising four reflective elements for projecting a mask image onto a substrate. The four optical elements are characterized in order from object to image as convex, concave, convex and concave mirrors. The optical system is particularly suited for step and scan lithography methods. The invention increases the slit dimensions associated with ringfield scanning optics, improves wafer throughput and allows higher semiconductor device density.
Performance Analysis of the NAS Y-MP Workload
NASA Technical Reports Server (NTRS)
Bergeron, Robert J.; Kutler, Paul (Technical Monitor)
1997-01-01
This paper describes the performance characteristics of the computational workloads on the NAS Cray Y-MP machines, a Y-MP 832 and later a Y-MP 8128. Hardware measurements indicated that the Y-MP workload performance matured over time, ultimately sustaining an average throughput of 0.8 GFLOPS and a vector operation fraction of 87%. The measurements also revealed an operation rate exceeding 1 per clock period, a well-balanced architecture featuring strong utilization of the vector functional units, and an efficient memory organization. Introduction of the larger-memory 8128 increased throughput by allowing a more efficient utilization of CPUs. Throughput also depended on the metering of the batch queues; low-idle Saturday workloads required a buffer of small jobs to prevent memory starvation of the CPU. UNICOS required about 7% of total CPU time to service the 832 workloads; this overhead decreased to 5% for the 8128 workloads. While most of the system time went to service I/O requests, efficient scheduling prevented excessive idle time due to I/O wait. System measurements disclosed no obvious bottlenecks in the response of the machine and UNICOS to the workloads. In most cases, Cray-provided software tools were quite sufficient for measuring the performance of both the machine and the operating system.
Incorporating Active Runway Crossings in Airport Departure Scheduling
NASA Technical Reports Server (NTRS)
Gupta, Gautam; Malik, Waqar; Jung, Yoon C.
2010-01-01
A mixed integer linear program is presented for deterministically scheduling departure and arrival aircraft at airport runways. This method addresses different schemes of managing the departure queuing area by treating it as a set of first-in-first-out queues or as a simple parking area where any available aircraft can take off irrespective of its relative sequence with others. In addition, this method explicitly considers separation criteria between successive aircraft and also incorporates an optional prioritization scheme using time windows. Multiple objectives pertaining to throughput and system delay are used independently. Results indicate improvement over a basic first-come-first-serve rule in both system delay and throughput. Minimizing system delay results in small deviations from optimal throughput, whereas maximizing throughput results in large deviations in system delay. Enhancements for computational efficiency are also presented in the form of reformulating certain constraints and defining additional inequalities for better bounds.
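The first-come-first-serve baseline against which the MILP is compared can be sketched with a pairwise separation matrix (the separation values and weight classes below are hypothetical, not the paper's):

```python
def fcfs_schedule(ready_times, classes, sep):
    """First-come-first-serve runway schedule: each aircraft takes off at the
    later of its ready time and the previous takeoff plus the required
    pairwise separation between the leading and trailing weight classes."""
    order = sorted(range(len(ready_times)), key=lambda i: ready_times[i])
    takeoff = {}
    prev = None
    for i in order:
        t = ready_times[i]
        if prev is not None:
            t = max(t, takeoff[prev] + sep[classes[prev]][classes[i]])
        takeoff[i] = t
        prev = i
    return takeoff

# separation (seconds) required behind a Heavy (H) or Large (L) leader
sep = {"H": {"H": 90, "L": 120}, "L": {"H": 60, "L": 60}}
ready = [0, 10, 15]          # earliest takeoff times (s)
cls = ["H", "L", "L"]        # aircraft weight classes
sched = fcfs_schedule(ready, cls, sep)
```

The MILP improves on this baseline by re-sequencing aircraft (within queue and time-window constraints) so that expensive leader/follower separation pairs, such as a Large departing behind a Heavy here, occur less often.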
Analysis and Testing of Mobile Wireless Networks
NASA Technical Reports Server (NTRS)
Alena, Richard; Evenson, Darin; Rundquist, Victor; Clancy, Daniel (Technical Monitor)
2002-01-01
Wireless networks are being used to connect mobile computing elements in more applications as the technology matures. There are now many products (such as 802.11 and 802.11b) which run in the ISM frequency band and comply with wireless network standards. They are being used increasingly to link mobile intranets into wired networks. Standard methods of analyzing and testing their performance and compatibility are needed to determine the limits of the technology. This paper presents analytical and experimental methods of determining network throughput, range and coverage, and interference sources. Both radio frequency (RF) domain and network domain analysis have been applied to determine wireless network throughput and range in the outdoor environment. Comparison of field test data taken under optimal conditions with performance predicted from RF analysis yielded quantitative results applicable to future designs. Layering multiple wireless network subnets can increase performance. Wireless network components can be set to different radio frequency-hopping sequences or spreading functions, allowing more than one subnet to coexist. Therefore, we ran multiple 802.11-compliant systems concurrently in the same geographical area to determine interference effects and scalability. The results can be used to design more robust networks which have multiple layers of wireless data communication paths and provide increased throughput overall.
A system performance throughput model applicable to advanced manned telescience systems
NASA Technical Reports Server (NTRS)
Haines, Richard F.
1990-01-01
As automated space systems become more complex, autonomous, and opaque to the flight crew, it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human performance measurement issues related to total system validation are addressed. An evaluative throughput model is presented which can be used to generate a human operator-related benchmark or figure of merit for a given system which involves humans at the input and output ends as well as other automated intelligent agents. The concept of sustained and accurate command/control data information transfer is introduced. The first two input parameters of the model involve nominal and off-nominal predicted events: the first calls for a detailed task analysis, while the second is a contingency event assessment. The last two required input parameters involve actual (measured) events, namely human performance and continuous semi-automated system performance. An expression combining these four parameters was found, using digital simulations and identical, representative, random data, to yield the smallest variance.
Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H
2016-02-01
Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.
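Dose-response evaluations of the kind described above typically fit a four-parameter logistic (4PL) curve to per-well activity to recover an IC50. A minimal sketch (noiseless synthetic data and a crude grid search rather than the screening system's actual nonlinear least-squares fitting):

```python
def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response model: response falls from
    `top` to `bottom` as concentration rises, with midpoint at `ic50`."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

def fit_ic50(concs, responses, bottom, top, hill, candidates):
    """Crude grid search: return the candidate IC50 with least squared error."""
    def sse(ic50):
        return sum((four_pl(c, bottom, top, ic50, hill) - r) ** 2
                   for c, r in zip(concs, responses))
    return min(candidates, key=sse)

concs = [0.01, 0.1, 1.0, 10.0, 100.0]  # hypothetical concentrations (uM)
resp = [four_pl(c, 0.0, 100.0, 1.0, 1.0) for c in concs]  # noiseless demo data
est = fit_ic50(concs, resp, 0.0, 100.0, 1.0, [0.3, 1.0, 3.0])
```

In practice all four parameters are fitted simultaneously and the acoustic titration supplies the concentration series directly in the assay plate.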
Tier-2 Optimisation for Computational Density/Diversity and Big Data
NASA Astrophysics Data System (ADS)
Fay, R. B.; Bland, J.
2014-06-01
As the number of cores on a chip continues to trend upwards and new CPU architectures emerge, increasing CPU density and diversity presents multiple challenges to site administrators. These include scheduling for massively multi-core systems (potentially including Graphics Processing Units (GPUs), integrated and dedicated, and Many Integrated Core (MIC) coprocessors) to ensure a balanced throughput of jobs while preserving overall cluster throughput, the increasing complexity of developing for these heterogeneous platforms, and the challenge of managing this more complex mix of resources. In addition, meeting data demands as both dataset sizes increase and the rate of demand scales with increased computational power requires additional performance from the associated storage elements. In this report, we evaluate one emerging technology, Solid State Drive (SSD) caching for RAID controllers, with consideration of its potential to assist in meeting evolving demand. We also briefly consider the broader developing trends outlined above in order to identify issues that may develop and assess what actions should be taken in the immediate term to address them.
NASA Astrophysics Data System (ADS)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi
2010-06-01
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
Yajuan, Xiao; Xin, Liang; Zhiyuan, Li
2012-01-01
The patch clamp technique is commonly used in electrophysiological experiments and offers direct insight into ion channel properties through the characterization of ion channel activity. This technique can be used to elucidate the interaction between a drug and a specific ion channel at different conformational states to understand the mechanisms of ion channel modulators. The patch clamp technique is regarded as a gold standard for ion channel research; however, it suffers from low throughput and high personnel costs. In the last decade, the development of several automated electrophysiology platforms has greatly increased the screening throughput of whole-cell electrophysiological recordings. New advancements in automated patch clamp systems have aimed to provide high data quality, high content, and high throughput. However, due to the limitations noted above, automated patch clamp systems are not capable of replacing manual patch clamp systems in ion channel research. While automated patch clamp systems are useful for screening large numbers of compounds in cell lines that stably express high levels of ion channels, the manual patch clamp technique is still necessary for studying ion channel properties in some research areas and for specific cell types, including primary cells that have mixed cell types and differentiated cells derived from induced pluripotent stem cells (iPSCs) or embryonic stem cells (ESCs). Therefore, further improvements in flexibility with regard to cell types and data quality will broaden the applications of automated patch clamp systems in both academia and industry. PMID:23346269
Fuzzy Logic-based expert system for evaluating cake quality of freeze-dried formulations.
Trnka, Hjalte; Wu, Jian X; Van De Weert, Marco; Grohganz, Holger; Rantanen, Jukka
2013-12-01
Freeze-drying of peptide and protein-based pharmaceuticals is an increasingly important field of research. The diverse nature of these compounds, limited understanding of excipient functionality, and difficult-to-analyze quality attributes together with the increasing importance of the biosimilarity concept complicate the development phase of safe and cost-effective drug products. To streamline the development phase and to make high-throughput formulation screening possible, efficient solutions for analyzing critical quality attributes such as cake quality with minimal material consumption are needed. The aim of this study was to develop a fuzzy logic system based on image analysis (IA) for analyzing cake quality. Freeze-dried samples with different visual quality attributes were prepared in well plates. Imaging solutions together with image analytical routines were developed for extracting critical visual features such as the degree of cake collapse, glassiness, and color uniformity. On the basis of the IA outputs, a fuzzy logic system for analysis of these freeze-dried cakes was constructed. After this development phase, the system was tested with a new screening well plate. The developed fuzzy logic-based system was found to give comparable quality scores with visual evaluation, making high-throughput classification of cake quality possible. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
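The flavor of such a fuzzy logic system can be sketched with triangular membership functions over the three image-analysis features named above (the 0-1 feature scaling and the membership shapes here are invented for illustration, not the paper's tuned system):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership: 0 outside (a, c), peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cake_quality(collapse, glassiness, uniformity):
    """Degree of membership in an 'acceptable cake' class, combining three
    image-analysis features (all scaled 0-1; higher collapse and glassiness
    are worse, higher color uniformity is better) with a fuzzy AND (min)."""
    return min(tri(collapse, -0.01, 0.0, 0.4),
               tri(glassiness, -0.01, 0.0, 0.5),
               tri(uniformity, 0.5, 1.0, 1.01))

score_ok = cake_quality(collapse=0.1, glassiness=0.1, uniformity=0.9)
score_bad = cake_quality(collapse=0.8, glassiness=0.1, uniformity=0.9)
```

Using `min` as the fuzzy AND means one badly collapsed cake cannot be rescued by good glassiness or color scores, which matches how a visual inspector would reject it.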
Networked Airborne Communications Using Adaptive Multi Beam Directional Links
2016-03-05
Networked Airborne Communications Using Adaptive Multi-Beam Directional Links. R. Bruce MacLeod, Member, IEEE, and Adam Margetts, Member, IEEE, MIT...provide new techniques for increasing throughput in airborne adaptive directional networks. By adaptive directional linking, we mean systems that can...techniques can dramatically increase the capacity of airborne networks. Advances in digital array technology are beginning to put these gains within reach
Beeman, Katrin; Baumgärtner, Jens; Laubenheimer, Manuel; Hergesell, Karlheinz; Hoffmann, Martin; Pehl, Ulrich; Fischer, Frank; Pieck, Jan-Carsten
2017-12-01
Mass spectrometry (MS) is known for its label-free detection of substrates and products from a variety of enzyme reactions. Recent hardware improvements have increased interest in the use of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS for high-throughput drug discovery. Despite interest in this technology, several challenges remain and must be overcome before MALDI-MS can be integrated as an automated "in-line reader" for high-throughput drug discovery. Two such hurdles include in situ sample processing and deposition, as well as integration of MALDI-MS for enzymatic screening assays that usually contain high levels of MS-incompatible components. Here we adapt our c-MET kinase assay to optimize for MALDI-MS compatibility and test its feasibility for compound screening. The pros and cons of the Echo (Labcyte) as a transfer system for in situ MALDI-MS sample preparation are discussed. We demonstrate that this method generates robust data in a 1536-grid format. We use the MALDI-MS to directly measure the ratio of c-MET substrate and phosphorylated product to acquire IC50 curves and demonstrate that the pharmacology is unaffected. The resulting IC50 values correlate well between the common label-based capillary electrophoresis and the label-free MALDI-MS detection method. We predict that label-free MALDI-MS-based high-throughput screening will become increasingly important and more widely used for drug discovery.
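The abstract's central readout, the ratio of phosphorylated product to substrate measured directly by MALDI-MS, can be turned into an IC50 estimate. The sketch below is illustrative only: the concentrations and ratios are invented, and the paper fits full IC50 curves rather than the simple log-linear interpolation used here.

```python
import math

# Hedged sketch: estimating an IC50 from label-free product/(substrate+product)
# ratios at several inhibitor concentrations. All numbers are invented.

def ic50_loglinear(concs, activities, uninhibited):
    """Interpolate, on a log-concentration axis, where activity falls to
    50% of the uninhibited control."""
    half = 0.5 * uninhibited
    for (c1, a1), (c2, a2) in zip(zip(concs, activities),
                                  zip(concs[1:], activities[1:])):
        if a1 >= half >= a2:  # activity crosses 50% in this interval
            f = (a1 - half) / (a1 - a2)
            return 10 ** (math.log10(c1) + f * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% activity not bracketed by the data")

concs = [0.01, 0.1, 1.0, 10.0, 100.0]    # inhibitor concentration, uM
ratios = [0.48, 0.45, 0.25, 0.06, 0.02]  # product/(substrate+product) per well
print(ic50_loglinear(concs, ratios, uninhibited=0.50))  # ~1 uM
```

In practice one would fit a four-parameter logistic curve to all points rather than interpolate between two, but the interpolation shows why the label-free ratio is sufficient: no calibration against a labeled standard is needed.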
Robust reflective pupil slicing technology
NASA Astrophysics Data System (ADS)
Meade, Jeffrey T.; Behr, Bradford B.; Cenko, Andrew T.; Hajian, Arsen R.
2014-07-01
Tornado Spectral Systems (TSS) has developed the High Throughput Virtual Slit (HTVS™), a robust all-reflective pupil slicing technology capable of replacing the slit in research-, commercial- and MIL-SPEC-grade spectrometer systems. In the simplest configuration, the HTVS allows optical designers to remove the lossy slit from point-source spectrometers and widen the input slit of long-slit spectrometers, greatly increasing throughput without loss of spectral resolution or cross-dispersion information. The HTVS works by transferring etendue between image plane axes, but operating in the pupil domain rather than at a focal plane. While useful for other technologies, this is especially relevant for spectroscopic applications: it performs the same spectral narrowing as a slit without throwing away light at the slit aperture. The HTVS can be implemented in all-reflective designs and requires only a small number of reflections for significant spectral resolution enhancement, so HTVS systems can be efficiently implemented in most wavelength regions. The etendue-shifting operation also provides smooth scaling with input spot/image size without requiring reconfiguration for different targets (such as different seeing disk diameters or different fiber core sizes). Like most slicing technologies, the HTVS provides throughput increases of several times without resolution loss over equivalent slit-based designs. HTVS technology enables robust slit replacement in point-source spectrometer systems. By virtue of pupil-space operation, this technology has several advantages over comparable image-space slicer technology, including the ability to adapt gracefully and linearly to changing source size and better vertical packing of the flux distribution. Additionally, this technology can be implemented with large slicing factors in both fast and slow beams and can easily scale from large, room-sized spectrometers through to small, telescope-mounted devices.
Finally, this same technology is directly applicable to multi-fiber spectrometers to achieve similar enhancement. The HTVS also provides the ability to anamorphically "stretch" the slit image in long-slit spectrometers, allowing the instrument designer to optimize the plate scale in the dispersion and cross-dispersion axes independently without sacrificing spatial information. This allows users to widen the input slit, with the associated gain in throughput and loss of spatial selectivity, while maintaining the spectral resolution of the spectrometer system. This "stretching" places increased requirements on detector focal plane height, as with image slicing techniques, but provides additional degrees of freedom for instrument designers to build the best possible spectrometer systems. We discuss the details of this technology in an astronomical context, covering its applicability from small telescope-mounted spectrometers through long-slit imagers and radial-velocity engines. This powerful tool provides additional degrees of freedom when designing a spectrometer, enabling instrument designers to further optimize systems for the required scientific goals.
High throughput imaging cytometer with acoustic focussing.
Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter
2015-10-31
We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high-resolution, low-noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells, permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however, a longer device would remove this constraint.
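The throughput and blur figures in this abstract invite a back-of-envelope check. The bead rate and frame rate come from the abstract; the flow speed, pixel size, and galvo tracking-residual fraction below are assumptions chosen only to show how mirror tracking makes long exposures compatible with a fast-moving sheet of cells.

```python
# Back-of-envelope model. frames_per_second and beads_per_second are from the
# abstract; flow speed, pixel pitch, and tracking residual are assumptions.

frames_per_second = 80
beads_per_second = 208_000
beads_per_frame = beads_per_second / frames_per_second
print(beads_per_frame)  # 2600 beads captured in each frame

def blur_pixels(flow_speed_mm_s, exposure_s, pixel_um, tracking_residual):
    """Image-plane blur when a galvo mirror cancels most of the cell motion.

    tracking_residual is the fraction of motion the mirror fails to cancel
    (0 = perfect tracking, 1 = no tracking at all)."""
    travel_um = flow_speed_mm_s * 1000 * exposure_s
    return travel_um * tracking_residual / pixel_um

# With assumed values (50 mm/s flow, 10 ms exposure, 5 um pixels), untracked
# blur would be 100 px; a 97%-effective galvo leaves ~3 px, i.e. "a few pixels".
print(blur_pixels(50, 0.010, 5, 1.0))
print(blur_pixels(50, 0.010, 5, 0.03))
```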
NASA Technical Reports Server (NTRS)
Lee, Shihyan; Meister, Gerhard
2017-01-01
Since the launch of the Moderate Resolution Imaging Spectroradiometer (MODIS) on Aqua in 2002, the radiometric system gains of the reflective solar bands have been degrading, indicating changes in the system's optical throughput. To estimate the optical throughput degradation, the electronic gain changes were estimated and removed from the measured system gain. The derived optical throughput degradation shows a rate that is much faster at shorter wavelengths than at longer wavelengths. The wavelength-dependent optical throughput degradation modulated the relative spectral response (RSR) of the bands. In addition, the optical degradation is scan-angle-dependent due to large changes in response versus scan angle over time. We estimated the modulated RSR as a function of time and scan angle and its impacts on sensor radiometric calibration for ocean science. Our results show that the calibration bias could be up to 1.8% for band 8 (412 nm) due to its larger out-of-band response. For the other ocean bands, the calibration biases are at least an order of magnitude smaller.
High-throughput bioinformatics with the Cyrille2 pipeline system
Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ
2008-01-01
Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web-based, graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines. PMID:18269742
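The Scheduler/Executor split described in this abstract can be sketched as a toy dependency-driven pipeline. This is an illustrative sketch, not Cyrille2's implementation: the class, job names, and data items are invented, and the real system dispatches jobs to a compute cluster rather than running them in-process.

```python
# Toy sketch of a data-driven Scheduler/Executor pipeline. Names and structure
# are hypothetical; Cyrille2 itself runs scheduled jobs on a compute cluster.

class Pipeline:
    def __init__(self):
        self.jobs = {}          # name -> (input data items, output items, fn)
        self.available = set()  # data items currently present in the system

    def add_job(self, name, inputs, outputs, fn):
        self.jobs[name] = (set(inputs), set(outputs), fn)

    def schedule(self):
        """Scheduler: pick jobs whose inputs exist and whose outputs don't yet."""
        return [n for n, (ins, outs, _) in self.jobs.items()
                if ins <= self.available and not outs <= self.available]

    def run(self, seed_data):
        """Executor: repeatedly execute scheduled jobs until none remain."""
        self.available |= set(seed_data)
        order = []
        while True:
            ready = self.schedule()
            if not ready:
                break
            for name in ready:
                _, outs, fn = self.jobs[name]
                fn()                      # cluster submission in the real system
                self.available |= outs    # new data may unlock downstream jobs
                order.append(name)
        return order

p = Pipeline()
p.add_job("preprocess", ["reads"], ["clean_reads"], lambda: None)
p.add_job("assemble", ["clean_reads"], ["contigs"], lambda: None)
p.add_job("annotate", ["contigs"], ["annotation"], lambda: None)
print(p.run(["reads"]))  # ['preprocess', 'assemble', 'annotate']
```

The key idea mirrored from the abstract is that scheduling is driven by what data has entered the system, not by a hard-coded execution order.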
Pelkowski, Sean D.; Kapoor, Mrinal; Richendrfer, Holly A.; Wang, Xingyue; Colwill, Ruth M.; Creton, Robbert
2011-01-01
Early brain development can be influenced by numerous genetic and environmental factors, with long-lasting effects on brain function and behavior. The identification of these factors is facilitated by recent innovations in high-throughput screening. However, large-scale screening in whole organisms remains challenging, in particular when studying changes in brain function or behavior in vertebrate model systems. In this study, we present a novel imaging system for high-throughput analyses of behavior in zebrafish larvae. The three-camera system can image twelve multiwell plates simultaneously and is unique in its ability to provide local visual stimuli in the wells of a multiwell plate. The acquired images are converted into a series of coordinates, which characterize the location and orientation of the larvae. The developed imaging techniques were tested by measuring avoidance behaviors in seven-day-old zebrafish larvae. The system effectively quantified larval avoidance and revealed an increased edge preference in response to a blue or red ‘bouncing ball’ stimulus. Larvae also avoid a bouncing ball stimulus when it is counter-balanced with a stationary ball, but do not avoid blinking balls counter-balanced with a stationary ball. These results indicate that the seven-day-old larvae respond specifically to movement, rather than color, size, or local changes in light intensity. The imaging system and assays for measuring avoidance behavior may be used to screen for genetic and environmental factors that cause developmental brain disorders and for novel drugs that could prevent or treat these disorders. PMID:21549762
Pelkowski, Sean D; Kapoor, Mrinal; Richendrfer, Holly A; Wang, Xingyue; Colwill, Ruth M; Creton, Robbert
2011-09-30
Early brain development can be influenced by numerous genetic and environmental factors, with long-lasting effects on brain function and behavior. The identification of these factors is facilitated by recent innovations in high-throughput screening. However, large-scale screening in whole organisms remains challenging, in particular when studying changes in brain function or behavior in vertebrate model systems. In this study, we present a novel imaging system for high-throughput analyses of behavior in zebrafish larvae. The three-camera system can image 12 multiwell plates simultaneously and is unique in its ability to provide local visual stimuli in the wells of a multiwell plate. The acquired images are converted into a series of coordinates, which characterize the location and orientation of the larvae. The developed imaging techniques were tested by measuring avoidance behaviors in seven-day-old zebrafish larvae. The system effectively quantified larval avoidance and revealed an increased edge preference in response to a blue or red 'bouncing ball' stimulus. Larvae also avoid a bouncing ball stimulus when it is counter-balanced with a stationary ball, but do not avoid blinking balls counter-balanced with a stationary ball. These results indicate that the seven-day-old larvae respond specifically to movement, rather than color, size, or local changes in light intensity. The imaging system and assays for measuring avoidance behavior may be used to screen for genetic and environmental factors that cause developmental brain disorders and for novel drugs that could prevent or treat these disorders. Copyright © 2011 Elsevier B.V. All rights reserved.
Systems metabolic engineering: genome-scale models and beyond.
Blazeck, John; Alper, Hal
2010-07-01
The advent of high-throughput genome-scale bioinformatics has led to an exponential increase in available cellular system data. Systems metabolic engineering attempts to use data-driven approaches, based on data collected with high-throughput technologies, to identify gene targets and optimize phenotypic properties at the systems level. Current systems metabolic engineering tools are limited for predicting and defining complex phenotypes such as chemical tolerances and other global, multigenic traits. The most pragmatic systems-based tool for metabolic engineering to arise is the in silico genome-scale metabolic reconstruction. This tool has seen wide adoption for modeling cell growth and predicting beneficial gene knockouts, and we examine here how this approach can be expanded for novel organisms. This review will highlight advances of the systems metabolic engineering approach with a focus on de novo development and use of genome-scale metabolic reconstructions for metabolic engineering applications. We will then discuss the challenges and prospects for this emerging field to enable model-based metabolic engineering. Specifically, we argue that current state-of-the-art systems metabolic engineering techniques represent a viable first step for improving product yield that still must be followed by combinatorial techniques or random strain mutagenesis to achieve optimal cellular systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suzuki, Kazumichi, E-mail: kazumichisuzuki@gmail.c
Purpose: To determine the patient throughput and the overall efficiency of the spot scanning system by analyzing treatment time, equipment availability, and maximum daily capacity for the current spot scanning port at Proton Therapy Center Houston and to assess the daily throughput capacity for a hypothetical spot scanning proton therapy center. Methods: At their proton therapy center, the authors have been recording in an electronic medical record system all treatment data, including disease site, number of fields, number of fractions, delivered dose, energy, range, number of spots, and number of layers for every treatment field. The authors analyzed delivery system downtimes that had been recorded for every equipment failure and associated incidents. These data were used to evaluate the patient census, patient distribution as a function of the number of fields and total target volume, and equipment clinical availability. The duration of each treatment session from patient walk-in to patient walk-out of the spot scanning treatment room was measured for 64 patients with head and neck, central nervous system, thoracic, and genitourinary cancers. The authors retrieved data for total target volume and the numbers of layers and spots for all fields from treatment plans for a total of 271 patients (including the above 64 patients). A sensitivity analysis of daily throughput capacity was performed by varying seven parameters in a throughput capacity model. Results: The mean monthly equipment clinical availability for the spot scanning port in April 2012–March 2015 was 98.5%. Approximately 1500 patients had received spot scanning proton therapy as of March 2015. The major disease sites treated in September 2012–August 2014 were the genitourinary system (34%), head and neck (30%), central nervous system (21%), and thorax (14%), with other sites accounting for the remaining 1%.
Spot scanning beam delivery time increased with total target volume and accounted for approximately 30%–40% of total treatment time for total target volumes exceeding 200 cm³, which was the case for more than 80% of the patients in this study. When total treatment time was modeled as a function of the number of fields and total target volume, the model overestimated total treatment time by 12% on average, with a standard deviation of 32%. A sensitivity analysis of throughput capacity for a hypothetical four-room spot scanning proton therapy center identified several priority items for improvements in throughput capacity, including operation time, beam delivery time, and patient immobilization and setup time. Conclusions: The spot scanning port at our proton therapy center has operated at a high performance level and has been used to treat a large number of complex cases. Further improvements in efficiency may be feasible in the areas of facility operation, beam delivery, patient immobilization and setup, and optimization of treatment scheduling.
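A throughput capacity model of the kind varied in the sensitivity analysis above can be sketched in a few lines. All parameter values below are assumptions for illustration, not the paper's fitted numbers; the point is only that daily capacity is the product of available room-minutes and the reciprocal of per-patient time, so the items named in the abstract (operation time, beam delivery time, setup time) enter the model directly.

```python
# Hedged sketch of a daily throughput-capacity model. Every parameter value
# here is an illustrative assumption, not data from the paper.

def daily_capacity(rooms, hours_per_day, setup_min, beam_min_per_field,
                   fields_per_patient, other_min, availability):
    """Patients treatable per day across all rooms."""
    minutes = hours_per_day * 60 * availability     # usable room-minutes
    per_patient = setup_min + fields_per_patient * beam_min_per_field + other_min
    return rooms * minutes / per_patient

base = daily_capacity(rooms=4, hours_per_day=16, setup_min=10,
                      beam_min_per_field=4, fields_per_patient=2,
                      other_min=7, availability=0.985)
# Sensitivity: shave three minutes off immobilization/setup per patient.
faster_setup = daily_capacity(rooms=4, hours_per_day=16, setup_min=7,
                              beam_min_per_field=4, fields_per_patient=2,
                              other_min=7, availability=0.985)
print(base, faster_setup)  # shorter setup raises daily capacity
```

Varying one parameter at a time, as above, reproduces the shape of a one-at-a-time sensitivity analysis; the paper's seven-parameter version works the same way.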
High-Throughput Cloning and Expression Library Creation for Functional Proteomics
Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua
2013-01-01
The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047
High-throughput cloning and expression library creation for functional proteomics.
Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua
2013-05-01
The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible, and reliable cloning systems. These collections of ORF clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial, we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This tutorial is part of the International Proteomics Tutorial Programme (IPTP12). © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Formal Messaging Notation for Alaskan Aviation Data
NASA Technical Reports Server (NTRS)
Rios, Joseph L.
2015-01-01
Data exchange is an increasingly important aspect of the National Airspace System. While many data communication channels have become more capable of sending and receiving data at higher throughput rates, there is still a need to use communication channels efficiently when throughput is limited. The limitation can be based on technological issues, financial considerations, or both. This paper provides a complete description of several important aviation weather data products in Abstract Syntax Notation format. By doing so, data providers can take advantage of Abstract Syntax Notation's ability to encode data in a highly compressed format. When data such as pilot weather reports, surface weather observations, and various weather predictions are compressed in this manner, throughput-limited communication channels can be used efficiently. This paper provides details on the Abstract Syntax Notation One (ASN.1) implementation for Alaskan aviation data and demonstrates its use on real-world aviation weather data samples; this matters because Alaska has sparse terrestrial data infrastructure, so data are often sent via relatively costly satellite channels.
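The size advantage of schema-driven binary encodings over text formats, which is what motivates the ASN.1 approach above, can be demonstrated with a rough stdlib stand-in. This is not ASN.1 itself (real packed encoding rules come from a compiled ASN.1 schema, and ASN.1 unaligned PER would be tighter still), and the pilot-report fields below are hypothetical, but the fixed-layout packing already shows the gap a narrow satellite channel cares about.

```python
import json
import struct

# Hypothetical pilot-report fields: a text (JSON) encoding vs a fixed-layout
# binary packing. Not ASN.1, just an illustration of schema-driven compactness.
pirep = {"lat": 61.2181, "lon": -149.9003, "alt_ft": 8500,
         "temp_c": -12, "wind_dir": 270, "wind_kt": 35}

text = json.dumps(pirep).encode("utf-8")

# "<ffihhB": little-endian, no padding -> two 4-byte floats, a 4-byte altitude,
# two 2-byte signed ints, one unsigned byte = 17 bytes total.
binary = struct.pack("<ffihhB", pirep["lat"], pirep["lon"], pirep["alt_ft"],
                     pirep["temp_c"], pirep["wind_dir"], pirep["wind_kt"])

print(len(text), len(binary))  # JSON is several times larger than the packing
```

Because both ends share the layout (the "schema"), no field names or delimiters travel over the link, which is the same principle ASN.1 PER exploits.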
Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian
2005-01-01
This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid-handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has on the capacity to conduct multi-parallel protein expression studies.
NASA Astrophysics Data System (ADS)
Kalsom Yusof, Umi; Nor Akmal Khalid, Mohd
2015-05-01
Semiconductor industries need to constantly adjust to the rapid pace of change in the market. Most manufactured products have a very short life cycle. These scenarios imply the need to improve the efficiency of capacity planning, an important aspect of the machine allocation plan known for its complexity. Various studies have been performed to balance productivity and flexibility in the flexible manufacturing system (FMS). Many approaches have been developed by researchers to determine a suitable balance between exploration (global improvement) and exploitation (local improvement). However, not much work has focused on the machine allocation problem while considering the effects of machine breakdowns. This paper develops a model to minimize the effect of machine breakdowns, thus increasing productivity. The objectives are to minimize system unbalance and makespan as well as to increase throughput while satisfying technological constraints such as machine time availability. To examine the effectiveness of the proposed model, results for throughput, system unbalance and makespan on real industrial datasets were obtained with an intelligence technique, namely a hybrid of a genetic algorithm and harmony search, yielding a feasible solution to the domain problem.
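The objective structure named in this abstract (minimise makespan and system unbalance subject to machine time availability) can be sketched concretely. The sketch below is a deliberate simplification: the hybrid genetic algorithm/harmony search of the paper is replaced by plain random search, and the job times, machine count, and capacities are invented.

```python
import random

# Illustrative machine-allocation objective. Workloads and capacities are made
# up; the paper's GA/harmony-search hybrid is replaced by random search here.

random.seed(1)
proc_time = [5, 9, 4, 7, 6, 3, 8, 2]   # processing hours per job (assumed)
machines = 3
capacity = 20                          # available hours per machine (assumed)

def evaluate(assign):
    """Cost of an assignment (job index -> machine index), or None if the
    machine time availability constraint is violated."""
    loads = [0] * machines
    for job, m in enumerate(assign):
        loads[m] += proc_time[job]
    if max(loads) > capacity:
        return None
    makespan = max(loads)
    unbalance = max(loads) - min(loads)  # "system unbalance" proxy
    return makespan + unbalance          # simple equal-weight sum

best, best_cost = None, float("inf")
for _ in range(5000):
    cand = [random.randrange(machines) for _ in proc_time]
    cost = evaluate(cand)
    if cost is not None and cost < best_cost:
        best, best_cost = cand, cost
print(best_cost)
```

A metaheuristic such as the paper's hybrid would replace the random candidate generation with crossover/mutation (GA) and memory-based pitch adjustment (harmony search), but the fitness function it optimises has exactly this shape.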
Development and Validation of an Automated High-Throughput System for Zebrafish In Vivo Screenings
Virto, Juan M.; Holgado, Olaia; Diez, Maria; Izpisua Belmonte, Juan Carlos; Callol-Massot, Carles
2012-01-01
The zebrafish is a vertebrate model compatible with the paradigms of drug discovery. The small size and transparency of zebrafish embryos make them amenable for the automation necessary in high-throughput screenings. We have developed an automated high-throughput platform for in vivo chemical screenings on zebrafish embryos that includes automated methods for embryo dispensation, compound delivery, incubation, imaging and analysis of the results. At present, two different assays to detect cardiotoxic compounds and angiogenesis inhibitors can be automatically run in the platform, showing the versatility of the system. A validation of these two assays with known positive and negative compounds, as well as a screening for the detection of unknown anti-angiogenic compounds, have been successfully carried out in the system developed. We present a totally automated platform that allows for high-throughput screenings in a vertebrate organism. PMID:22615792
Improvement of an automated protein crystal exchange system PAM for high-throughput data collection
Hiraki, Masahiko; Yamada, Yusuke; Chavas, Leonard M. G.; Wakatsuki, Soichi; Matsugaki, Naohiro
2013-01-01
Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes. Therefore, users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. An investigation found that four or more cassettes were used at AR-NE3A alone. For continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was increased, doubling the capacity. To check the calibration with the new Dewar and the cassette stand, calibration experiments were repeatedly performed. Compared with the current system, the parameters of the new system are shown to be stable. PMID:24121334
The Integrated Air Transportation System Evaluation Tool
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Hees, Jing; Villani, James A.; Yackovetsky, Robert E. (Technical Monitor)
2002-01-01
Throughout U.S. history, our nation has generally enjoyed exceptional economic growth, driven in part by transportation advancements. Looking forward 25 years, when the national highway and skyway systems are saturated, the nation faces new challenges in creating transportation-driven economic growth and wealth. To meet the national requirement for an improved air traffic management system, NASA developed the goal of tripling throughput over the next 20 years, in all weather conditions, while maintaining safety. Analysis of the throughput goal has primarily focused on major airline operations, primarily through the hub-and-spoke system. However, many suggested concepts to increase throughput may operate outside the hub-and-spoke system. Examples of such concepts include the Small Aircraft Transportation System, civil tiltrotor, and improved rotorcraft. Proper assessment of the potential contribution of these technologies to the domestic air transportation system requires a modeling capability that includes the country's numerous smaller airports, acting as a fundamental component of the National Airspace System, and the demand for such concepts and technologies. Under this task for NASA, the Logistics Management Institute developed higher fidelity demand models that capture the interdependence of short-haul air travel with other transportation modes and explicitly consider the costs of commercial air and other transport modes. To accomplish this work, we generated forecasts of the distribution of general aviation based aircraft and GA itinerant operations at each of nearly 3,000 airports, based on changes in economic conditions and demographic trends. We also built modules that estimate the demand for travel by different modes, particularly auto, commercial air, and GA. We examined GA demand from two perspectives, top-down and bottom-up, described in detail.
USDA-ARS?s Scientific Manuscript database
A high-throughput transformation system previously developed in our laboratory was used for the regeneration of transgenic plum plants without the use of antibiotic selection. The system was first tested with two experimental constructs, pGA482GGi and pCAMBIAgfp94(35S), that contain selective marke...
2010-01-01
Service quality on computer and network systems has become increasingly important as many conventional service transactions are moved online. Service quality of computer and network services can be measured by the performance of the service process in throughput, delay, and so on. On a computer and network system, competing service requests of users and associated service activities change the state of limited system resources which in turn affects the achieved service ...relations of service activities, system state and service
Prediction-based association control scheme in dense femtocell networks.
Sung, Nak Woon; Pham, Ngoc-Thai; Huynh, Thong; Hwang, Won-Joo; You, Ilsun; Choo, Kim-Kwang Raymond
2017-01-01
The deployment of a large number of femtocell base stations allows us to extend coverage and efficiently utilize resources in a low-cost manner. However, the small cell size of femtocell networks can result in frequent handovers for the mobile user and, consequently, throughput degradation. Thus, in this paper, we propose predictive association control schemes to improve the system's effective throughput. Our design focuses on reducing handover frequency without impacting throughput. The proposed schemes determine the handover decisions that contribute most to the network throughput and are suitable for distributed implementation. The simulation results show significant gains compared with existing methods in terms of handover frequency and network throughput.
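The trade-off this abstract describes, handover frequency versus throughput, can be captured in a small decision rule: hand over only when the target cell still wins after the handover interruption is amortised over the user's predicted dwell time in that cell. The prediction model, dwell times, rates, and interruption cost below are all illustrative assumptions, not the paper's scheme.

```python
# Hedged sketch of prediction-based association control. All numbers and the
# dwell-time prediction itself are assumptions made for illustration.

def effective_throughput(rate_mbps, dwell_s, handover_loss_s):
    """Average rate in the target cell after discounting the service gap
    caused by the handover itself."""
    return rate_mbps * max(dwell_s - handover_loss_s, 0) / dwell_s

def should_handover(current_rate, target_rate, predicted_dwell_s,
                    handover_loss_s=0.5):
    """Trigger a handover only if the target cell's rate, amortised over the
    predicted dwell time, beats staying on the current cell."""
    gain = effective_throughput(target_rate, predicted_dwell_s, handover_loss_s)
    return gain > current_rate

# A user predicted to dwell long in a femtocell should switch; a fast-moving
# user with a sub-second predicted dwell should stay on the current cell even
# though the femtocell's instantaneous rate is three times higher.
print(should_handover(current_rate=10, target_rate=30, predicted_dwell_s=5))
print(should_handover(current_rate=10, target_rate=30, predicted_dwell_s=0.6))
```

This is the mechanism by which handover frequency drops without sacrificing throughput: marginal handovers, whose gain the interruption would erase, are simply never triggered.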
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-08-01
In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only +/- 7%. The third method, 'biomass-specific replication', made it possible to generate equal initial biomass concentrations in main cultures from precultures with different growth kinetics.
This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescent protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization.
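The 'biomass-specific replication' step above reduces to a simple mass balance: transfer enough preculture so every main-culture well starts at the same biomass. The formula is standard dilution arithmetic; the function and variable names are illustrative, not taken from the Robo-Lector software.

```python
# Sketch of the inoculum-volume calculation behind biomass-specific
# replication: target_od * main_volume = preculture_od * inoculum_volume
# (assuming the inoculum volume is small relative to the main volume).

def inoculum_volume_ul(target_od, main_volume_ul, preculture_od):
    """Volume of preculture to transfer so the diluted culture hits target_od."""
    return target_od * main_volume_ul / preculture_od

# A fast-growing preculture (OD 4.0) needs half the volume of a slow one (OD 2.0)
# to seed a 1000 uL main culture at OD 0.1:
print(inoculum_volume_ul(0.1, 1000, 4.0))  # -> 25.0
print(inoculum_volume_ul(0.1, 1000, 2.0))  # -> 50.0
```

This is exactly why wells with different growth kinetics can still be started at equal initial biomass: the transferred volume compensates for the preculture density.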
A comparison of high-throughput techniques for assaying circadian rhythms in plants.
Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony
2015-01-01
Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.
Reflective optical imaging method and circuit
Shafer, David R.
2001-01-01
An optical system compatible with short wavelength (extreme ultraviolet) radiation comprising four reflective elements for projecting a mask image onto a substrate. The four optical elements are characterized in order from object to image as convex, concave, convex and concave mirrors. The optical system is particularly suited for step and scan lithography methods. The invention increases the slit dimensions associated with ringfield scanning optics, improves wafer throughput and allows higher semiconductor device density.
Light-emitting device test systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCord, Mark; Brodie, Alan; George, James
Light-emitting devices, such as LEDs, are tested using a photometric unit. The photometric unit, which may be an integrating sphere, can measure flux, color, or other properties of the devices. The photometric unit may have a single port or both an inlet and outlet. Light loss through the port, inlet, or outlet can be reduced or calibrated for. These testing systems can provide increased reliability, improved throughput, and/or improved measurement accuracy.
Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.
O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
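The hierarchical, barcode-tracked item set described above can be sketched as a tiny LIMS-style store in which every derived item points back to its parent. The class and field names here are assumptions for illustration, not the pipeline's actual schema.

```python
# Minimal sketch of barcode-tracked sample lineage in a LIMS-style store.
# The hierarchy (strain -> DNA extract -> PCR product) mirrors the item
# set in the abstract; all identifiers below are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Item:
    barcode: str
    kind: str                     # e.g. "strain", "dna", "pcr", "trace"
    parent: Optional[str] = None  # barcode of the item this was derived from
    status: str = "registered"

class Lims:
    def __init__(self):
        self.items = {}

    def register(self, item):
        self.items[item.barcode] = item

    def lineage(self, barcode):
        """Walk parent links back to the original single-colony isolate."""
        chain = []
        while barcode is not None:
            item = self.items[barcode]
            chain.append(item.kind)
            barcode = item.parent
        return chain

lims = Lims()
lims.register(Item("S001", "strain"))
lims.register(Item("D001", "dna", parent="S001"))
lims.register(Item("P001", "pcr", parent="D001"))
print(lims.lineage("P001"))  # -> ['pcr', 'dna', 'strain']
```

Keeping the parent link on every item is what makes the pipeline "traceable": any sequence trace can be resolved back to the colony it came from.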
Ortiz, Diego; Litvin, Alexander G; Salas Fernandez, Maria G
2018-01-01
The development of high-yielding crops with drought tolerance is necessary to increase food, feed, fiber and fuel production. Methods that create similar environmental conditions for a large number of genotypes are essential to investigate plant responses to drought in gene discovery studies. Modern facilities that control water availability for each plant remain cost-prohibited to some sections of the research community. We present an alternative cost-effective automated irrigation system scalable for a high-throughput and controlled dry-down treatment of plants. This system was tested in sorghum using two experiments. First, four genotypes were subjected to ten days of dry-down to achieve three final Volumetric Water Content (VWC) levels: drought (0.10 and 0.20 m3 m-3) and control (0.30 m3 m-3). The final average VWC was 0.11, 0.22, and 0.31 m3 m-3, respectively, and significant differences in biomass accumulation were observed between control and drought treatments. Second, 42 diverse sorghum genotypes were subjected to a seven-day dry-down treatment for a final drought stress of 0.15 m3 m-3 VWC. The final average VWC was 0.17 m3 m-3, and plants presented significant differences in photosynthetic rate during the drought period. These results demonstrate that cost-effective automation systems can successfully control substrate water content for each plant, to accurately compare their phenotypic responses to drought, and be scaled up for high-throughput phenotyping studies.
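The per-pot irrigation decision implied above reduces to comparing measured volumetric water content (VWC) against the treatment target and adding only the water needed to close the deficit. This sketch is our illustration of that control rule; the function name and pot size are assumptions.

```python
# Illustrative per-pot watering rule for a controlled dry-down: add water
# only when the pot is below its treatment target VWC. 1 m3 of water per
# m3 of substrate corresponds to 1000 mL per litre of substrate.

def water_to_add_ml(vwc_measured, vwc_target, substrate_volume_l):
    """VWC values in m3 m-3; returns millilitres of water to dispense."""
    deficit = max(0.0, vwc_target - vwc_measured)
    return deficit * substrate_volume_l * 1000.0

# Control pot (target 0.30) that dried to 0.22 in a 2 L pot:
print(round(water_to_add_ml(0.22, 0.30, 2.0), 1))  # -> 160.0
# Drought pot still above its 0.15 target receives nothing and keeps drying:
print(water_to_add_ml(0.20, 0.15, 2.0))  # -> 0.0
```

Because each pot gets its own deficit calculation, genotypes with very different transpiration rates can still converge on the same final VWC, which is the comparison the abstract's experiments rely on.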
From big data analysis to personalized medicine for all: challenges and opportunities.
Alyass, Akram; Turcotte, Michelle; Meyre, David
2015-06-27
Recent advances in high-throughput technologies have led to the emergence of systems biology as a holistic science to achieve more precise modeling of complex diseases. Many predict the emergence of personalized medicine in the near future. We are, however, moving from two-tiered health systems to a two-tiered personalized medicine. Omics facilities are restricted to affluent regions, and personalized medicine is likely to widen the growing gap in health systems between high- and low-income countries. This is mirrored by an increasing lag between our ability to generate and to analyze big data. Several bottlenecks slow down the transition from conventional to personalized medicine: generation of cost-effective high-throughput data; hybrid education and multidisciplinary teams; data storage and processing; data integration and interpretation; and individual and global economic relevance. This review provides an update of important developments in the analysis of big data and forward strategies to accelerate the global transition to personalized medicine.
Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki
2014-12-01
As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.
High-speed cell recognition algorithm for ultrafast flow cytometer imaging system.
Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang
2018-04-01
An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is therefore in high demand to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the detected cells are classified by GMM. We compared the performance of our algorithm with a support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
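The final GMM classification stage can be illustrated with a hand-rolled one-dimensional example: assign each detected cell's feature to the mixture component with the highest weighted likelihood. The component parameters below are invented for illustration; a real pipeline would fit them (e.g. by EM) on the paper's image features.

```python
# Sketch of GMM-based classification only (the paper's final stage): score a
# scalar feature x against each Gaussian component and pick the best one.
# Components are (weight, mean, std); the values here are assumptions.

import math

def gmm_classify(x, comps):
    """Return the index of the most likely mixture component for x."""
    def weighted_pdf(w, mu, sigma):
        return w * math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    scores = [weighted_pdf(w, mu, s) for (w, mu, s) in comps]
    return max(range(len(scores)), key=scores.__getitem__)

# Two hypothetical populations: small debris (mean area 50) vs cells (mean 200).
components = [(0.3, 50.0, 15.0), (0.7, 200.0, 40.0)]
print(gmm_classify(60.0, components))   # -> 0 (debris-sized)
print(gmm_classify(180.0, components))  # -> 1 (cell-sized)
```

In the actual system the feature vector would be multi-dimensional (shape, texture, intensity) rather than a single area value, but the argmax-over-components decision rule is the same.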
Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav
2015-07-01
Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present the calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations. Copyright © 2015 Elsevier Inc. All rights reserved.
Repurposing High-Throughput Image Assays Enables Biological Activity Prediction for Drug Discovery.
Simm, Jaak; Klambauer, Günter; Arany, Adam; Steijaert, Marvin; Wegner, Jörg Kurt; Gustin, Emmanuel; Chupakhin, Vladimir; Chong, Yolanda T; Vialard, Jorge; Buijnsters, Peter; Velter, Ingrid; Vapirev, Alexander; Singh, Shantanu; Carpenter, Anne E; Wuyts, Roel; Hochreiter, Sepp; Moreau, Yves; Ceulemans, Hugo
2018-05-17
In both academia and the pharmaceutical industry, large-scale assays for drug discovery are expensive and often impractical, particularly for the increasingly important physiologically relevant model systems that require primary cells, organoids, whole organisms, or expensive or rare reagents. We hypothesized that data from a single high-throughput imaging assay can be repurposed to predict the biological activity of compounds in other assays, even those targeting alternate pathways or biological processes. Indeed, quantitative information extracted from a three-channel microscopy-based screen for glucocorticoid receptor translocation was able to predict assay-specific biological activity in two ongoing drug discovery projects. In these projects, repurposing increased hit rates by 50- to 250-fold over that of the initial project assays while increasing the chemical structure diversity of the hits. Our results suggest that data from high-content screens are a rich source of information that can be used to predict and replace customized biological assays. Copyright © 2018 Elsevier Ltd. All rights reserved.
Comprehensive Analysis of Immunological Synapse Phenotypes Using Supported Lipid Bilayers.
Valvo, Salvatore; Mayya, Viveka; Seraia, Elena; Afrose, Jehan; Novak-Kotzer, Hila; Ebner, Daniel; Dustin, Michael L
2017-01-01
Supported lipid bilayers (SLB) formed on glass substrates have been a useful tool for study of immune cell signaling since the early 1980s. The mobility of lipid-anchored proteins in the system, first described for antibodies binding to synthetic phospholipid head groups, allows for the measurement of two-dimensional binding reactions and signaling processes in a single imaging plane over time or for fixed samples. The fragility of SLB and the challenges of building and validating individual substrates limit most experimenters to ~10 samples per day, perhaps increasing this few-fold when examining fixed samples. Successful experiments might then require further days to fully analyze. We present methods for automation of many steps in SLB formation, imaging in 96-well glass bottom plates, and analysis that enables >100-fold increase in throughput for fixed samples and wide-field fluorescence. This increased throughput will allow better coverage of relevant parameters and more comprehensive analysis of aspects of the immunological synapse that are well reconstituted by SLB.
Caenorhabditis elegans: An Emerging Model in Biomedical and Environmental Toxicology
Leung, Maxwell C. K.; Williams, Phillip L.; Benedetto, Alexandre; Au, Catherine; Helmcke, Kirsten J.; Aschner, Michael; Meyer, Joel N.
2008-01-01
The nematode Caenorhabditis elegans has emerged as an important animal model in various fields including neurobiology, developmental biology, and genetics. Characteristics of this animal model that have contributed to its success include its genetic manipulability, invariant and fully described developmental program, well-characterized genome, ease of maintenance, short and prolific life cycle, and small body size. These same features have led to an increasing use of C. elegans in toxicology, both for mechanistic studies and high-throughput screening approaches. We describe some of the research that has been carried out in the areas of neurotoxicology, genetic toxicology, and environmental toxicology, as well as high-throughput experiments with C. elegans including genome-wide screening for molecular targets of toxicity and rapid toxicity assessment for new chemicals. We argue for an increased role for C. elegans in complementing other model systems in toxicological research. PMID:18566021
Generalized type II hybrid ARQ scheme using punctured convolutional coding
NASA Astrophysics Data System (ADS)
Kallel, Samir; Haccoun, David
1990-11-01
A method is presented to construct rate-compatible convolutional (RCC) codes from known high-rate punctured convolutional codes obtained from the best rate-1/2 codes. The construction method is simple and straightforward, yet still yields good codes. Moreover, low-rate codes can be obtained without any limit on the lowest achievable code rate. Based on the RCC codes, a generalized type-II hybrid ARQ scheme, which combines the benefits of the modified type-II hybrid ARQ strategy of Hagenauer (1988) with the code-combining ARQ strategy of Chase (1985), is proposed and analyzed. With the proposed generalized type-II hybrid ARQ strategy, the throughput increases as the starting coding rate increases; as the channel degrades, it tends to merge with the throughput of rate-1/2 type-II hybrid ARQ schemes with code combining, thus allowing the system to be flexible and adaptive to channel conditions, even under wide noise variations and severe degradations.
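Puncturing, the building block of the rate-compatible family above, deletes coded bits according to a periodic pattern so that one rate-1/2 mother code yields a range of higher rates. The pattern below is a generic example, not one of the paper's codes.

```python
# Illustrative puncturing of a rate-1/2 convolutional encoder's output:
# keep coded_bits[i] where pattern[i % len(pattern)] == 1. Deleting 2 of
# every 6 coded bits turns rate 1/2 (3 info bits -> 6 coded bits) into
# rate 3/4 (3 info bits -> 4 transmitted bits).

def puncture(coded_bits, pattern):
    return [b for i, b in enumerate(coded_bits) if pattern[i % len(pattern)] == 1]

bits = [1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0]  # 12 coded bits (6 info bits)
pattern = [1, 1, 0, 1, 1, 0]                 # transmit 4 of every 6 bits
sent = puncture(bits, pattern)
print(len(bits), len(sent))  # -> 12 8
```

Rate compatibility means the patterns are nested: the bits kept at a high rate are a subset of those kept at every lower rate, so an ARQ retransmission can simply send previously punctured bits, which is what the generalized type-II scheme exploits.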
Throughput of Coded Optical CDMA Systems with AND Detectors
NASA Astrophysics Data System (ADS)
Memon, Kehkashan A.; Umrani, Fahim A.; Umrani, A. W.; Umrani, Naveed A.
2012-09-01
Conventional detection techniques used in optical code-division multiple access (OCDMA) systems are not optimal and result in poor bit error rate performance. This paper analyzes the coded performance of optical CDMA systems with AND detectors for enhanced throughput efficiency and improved error rate performance. The results show that the use of AND detectors significantly improves the performance of an optical channel.
Hospital economics of the hospitalist.
Gregory, Douglas; Baigelman, Walter; Wilson, Ira B
2003-06-01
To determine the economic impact on the hospital of a hospitalist program and to develop insights into the relative economic importance of variables such as reductions in mean length of stay and cost, improvements in throughput (patients discharged per unit time), payer methods of reimbursement, and the cost of the hospitalist program. The primary data source was Tufts-New England Medical Center in Boston. Patient demographics, utilization, cost, and revenue data were obtained from the hospital's cost accounting system and medical records. The hospitalist admitted and managed all patients during a six-week period on the general medical unit of Tufts-New England Medical Center. Reimbursement, cost, length of stay, and throughput outcomes during this period were contrasted with patients admitted to the unit in the same period in the prior year, in the preceding period, and in the following period. The hospitalist group compared with the control group demonstrated: length of stay reduced to 2.19 days from 3.45 days (p<.001); total hospital costs per admission reduced to 1,775 dollars from 2,332 dollars (p<.001); costs per day increased to 811 dollars from 679 dollars (p<.001); and no differences in readmission within 30 days or in discharge to extended care facilities. The hospital's expected incremental profitability with the hospitalist was -1.44 dollars per admission excluding incremental throughput effects, and it was most sensitive to changes in the ratio of per diem to case rate reimbursement. Incremental throughput with the hospitalist was estimated at 266 patients annually, with an associated incremental profitability of 1.3 million dollars. Hospital interventions designed to reduce length of stay, such as the hospitalist, should be evaluated in terms of cost, throughput, and reimbursement effects. Excluding throughput effects, the hospitalist program was not economically viable due to the influence of per diem reimbursement.
Throughput improvements occasioned by the hospitalist program with high baseline occupancy levels are substantial and tend to favor a hospitalist program.
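The sensitivity to the per-diem/case-rate mix can be shown with a toy calculation: per-diem revenue shrinks when stays shorten, while case-rate revenue does not. Only the length-of-stay (3.45 to 2.19 days) and cost (2,332 to 1,775 dollars) figures below come from the abstract; the revenue level and payer mix are our illustrative assumptions.

```python
# Toy reconstruction of the accounting logic: incremental profit per
# admission under a mixed payer model. case_rate_share is the fraction of
# revenue paid per case; the remainder is per diem and scales with LOS.

def incremental_profit(case_rate_share, revenue, los_old, los_new, cost_old, cost_new):
    per_diem_share = 1.0 - case_rate_share
    # Per-diem revenue scales with length of stay; case-rate revenue is fixed.
    new_revenue = revenue * (case_rate_share + per_diem_share * los_new / los_old)
    return (new_revenue - cost_new) - (revenue - cost_old)

# A heavily per-diem payer mix (20% case rate, assumed 3000-dollar revenue)
# turns the 557-dollar cost saving into a loss per admission:
print(round(incremental_profit(0.2, 3000, 3.45, 2.19, 2332, 1775), 2))  # -> -319.52
# A pure case-rate payer keeps the full cost saving:
print(incremental_profit(1.0, 3000, 3.45, 2.19, 2332, 1775))  # -> 557.0
```

This is the abstract's core point: without extra throughput, shorter stays can reduce per-diem revenue faster than they reduce cost.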
Integrated Multi-process Microfluidic Systems for Automating Analysis
Yang, Weichun; Woolley, Adam T.
2010-01-01
Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343
Increased collection efficiency of LIFI high intensity electrodeless light source
NASA Astrophysics Data System (ADS)
Hafidi, Abdeslam; DeVincentis, Marc; Duelli, Markus; Gilliard, Richard
2008-02-01
Recently, RF driven electrodeless high intensity light sources have been implemented successfully in the projection display systems for HDTV and videowall applications. This paper presents advances made in the RF waveguide and electric field concentrator structures with the purpose of reducing effective arc size and increasing light collection. In addition, new optical designs are described that further improve system efficiency. The results of this work demonstrate that projection system light throughput is increased relative to previous implementations and performance is optimized for home theater and other front projector applications that maintain multi-year lifetime without re-lamping, complete spectral range, fast start times and high levels of dynamic contrast due to dimming flexibility in the light source system.
NASA Astrophysics Data System (ADS)
Mughal, A.; Newman, H.
2017-10-01
We review and demonstrate the design of efficient data transfer nodes (DTNs), from the perspective of the highest throughput over both local and wide area networks, as well as the highest performance per unit cost. A careful system-level design is required for the hardware, firmware, OS and software components. Furthermore, additional tuning of these components, and the identification and elimination of any remaining bottlenecks is needed once the system is assembled and commissioned, in order to obtain optimal performance. For high throughput data transfers, specialized software is used to overcome the traditional limits in performance caused by the OS, file system, file structures used, etc. Concretely, we will discuss and present the latest results using Fast Data Transfer (FDT), developed by Caltech. We present and discuss the design choices for three generations of Caltech DTNs. Their transfer capabilities range from 40 Gbps to 400 Gbps. Disk throughput is still the biggest challenge in the current generation of available hardware. However, new NVME drives combined with RDMA and a new NVME network fabric are expected to improve the overall data-transfer throughput and simultaneously reduce the CPU load on the end nodes.
Experiments and Analyses of Data Transfers Over Wide-Area Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata
Dedicated wide-area network connections are increasingly employed in high-performance computing and big data scenarios. One might expect the performance and dynamics of data transfers over such connections to be easy to analyze due to the lack of competing traffic. However, non-linear transport dynamics and end-system complexities (e.g., multi-core hosts and distributed filesystems) can in fact make analysis surprisingly challenging. We present extensive measurements of memory-to-memory and disk-to-disk file transfers over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory-to-memory transfers, profiles of both TCP and UDT throughput as a function of RTT show concave and convex regions; large buffer sizes and more parallel flows lead to wider concave regions, which are highly desirable. TCP and UDT both also display complex throughput dynamics, as indicated by their Poincare maps and Lyapunov exponents. For disk-to-disk transfers, we determine that high throughput can be achieved via a combination of parallel I/O threads, parallel network threads, and direct I/O mode. Our measurements also show that Lustre filesystems can be mounted over long-haul connections using LNet routers, although challenges remain in jointly optimizing file I/O and transport method parameters to achieve peak throughput.
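For intuition about the RTT and parallel-flow effects measured above, the classic Mathis et al. approximation for TCP throughput is a useful back-of-the-envelope model. It is our addition, not part of the paper's analysis, which rests on measurements, Poincare maps, and Lyapunov exponents.

```python
# Hedged sketch: Mathis-style TCP throughput approximation,
#   throughput ~ flows * (MSS / RTT) * C / sqrt(loss_rate), C ~= 1.22.
# All parameter values below are illustrative.

import math

def mathis_throughput_mbps(mss_bytes, rtt_s, loss_rate, flows=1):
    per_flow = (mss_bytes * 8 / 1e6) / rtt_s * 1.22 / math.sqrt(loss_rate)
    return flows * per_flow

# Longer RTT hurts in inverse proportion; parallel flows compensate linearly.
slow = mathis_throughput_mbps(1460, 0.366, 1e-6)   # 366 ms path
fast = mathis_throughput_mbps(1460, 0.010, 1e-6)   # 10 ms path
print(round(fast / slow, 1))  # -> 36.6
```

The model explains why large buffers and parallel flows widen the favorable (concave) throughput region at long RTTs, even though it cannot capture the non-linear dynamics the paper actually measures.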
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko
2010-06-23
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at existing high-throughput beams at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, fast-readout, high-gain CCD detector, and sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
A Barcoding Strategy Enabling Higher-Throughput Library Screening by Microscopy.
Chen, Robert; Rishi, Harneet S; Potapov, Vladimir; Yamada, Masaki R; Yeh, Vincent J; Chow, Thomas; Cheung, Celia L; Jones, Austin T; Johnson, Terry D; Keating, Amy E; DeLoache, William C; Dueber, John E
2015-11-20
Dramatic progress has been made in the design and build phases of the design-build-test cycle for engineering cells. However, the test phase usually limits throughput, as many outputs of interest are not amenable to rapid analytical measurements. For example, phenotypes such as motility, morphology, and subcellular localization can be readily measured by microscopy, but analysis of these phenotypes is notoriously slow. To increase throughput, we developed microscopy-readable barcodes (MiCodes) composed of fluorescent proteins targeted to discernible organelles. In this system, a unique barcode can be genetically linked to each library member, making possible the parallel analysis of phenotypes of interest via microscopy. As a first demonstration, we MiCoded a set of synthetic coiled-coil leucine zipper proteins to allow an 8 × 8 matrix to be tested for specific interactions in micrographs consisting of mixed populations of cells. A novel microscopy-readable two-hybrid fluorescence localization assay for probing candidate interactions in the cytosol was also developed using a bait protein targeted to the peroxisome and a prey protein tagged with a fluorescent protein. This work introduces a generalizable, scalable platform for making microscopy amenable to higher-throughput library screening experiments, thereby coupling the power of imaging with the utility of combinatorial search paradigms.
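The scalability of the MiCode idea is combinatorial: with a handful of fluorescent-protein colors assignable (or absent) at each of several visually distinguishable organelles, the number of distinct microscopy-readable barcodes grows exponentially. The formula below is our illustration of that scaling, not a figure from the paper.

```python
# Combinatorial sketch of MiCode capacity: f colors (or no marker) at each
# of k distinguishable organelles gives (f + 1)**k distinct barcodes.

def n_barcodes(colors, organelles):
    return (colors + 1) ** organelles  # +1 for "no marker" at an organelle

print(n_barcodes(3, 4))  # -> 256 barcodes from 3 colors and 4 organelles
print(n_barcodes(1, 3))  # -> 8 barcodes from a single color and 3 organelles
```

Even modest palettes therefore suffice to tag library sizes well beyond the 8 × 8 interaction matrix demonstrated in the abstract, provided each organelle/color combination remains distinguishable in a micrograph.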
A high throughput mechanical screening device for cartilage tissue engineering.
Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L
2014-06-27
Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, and scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.
A High Throughput Model of Post-Traumatic Osteoarthritis using Engineered Cartilage Tissue Analogs
Mohanraj, Bhavana; Meloni, Gregory R.; Mauck, Robert L.; Dodge, George R.
2014-01-01
(1) Objective A number of in vitro models of post-traumatic osteoarthritis (PTOA) have been developed to study the effect of mechanical overload on the processes that regulate cartilage degeneration. While such frameworks are critical for the identification of therapeutic targets, existing technologies are limited in their throughput capacity. Here, we validate a test platform for high-throughput mechanical injury incorporating engineered cartilage. (2) Method We utilized a high throughput mechanical testing platform to apply injurious compression to engineered cartilage and determined its strain- and strain-rate-dependent responses to injury. Next, we validated this response by applying the same injury conditions to cartilage explants. Finally, we conducted a pilot screen of putative PTOA therapeutic compounds. (3) Results The engineered cartilage response to injury was strain dependent, with a 2-fold increase in GAG loss at 75% compared to 50% strain. Extensive cell death was observed adjacent to fissures, with membrane rupture corroborated by marked increases in LDH release. Testing of established PTOA therapeutics showed that the pan-caspase inhibitor (ZVF) was effective at reducing cell death, while the amphiphilic polymer (P188) and the free-radical scavenger (NAC) reduced GAG loss as compared to injury alone. (4) Conclusions The injury response in this engineered cartilage model replicated key features of the response of cartilage explants, validating this system for the application of physiologically relevant injurious compression. This study establishes a novel tool for the discovery of mechanisms governing cartilage injury, as well as a screening platform for the identification of new molecules for the treatment of PTOA. PMID:24999113
Neuroprotective Small Molecules for the Treatment of Amyotrophic Lateral Sclerosis
2012-09-30
family history are absolute risk factors (2, 3). Noted recently is the fact that US military serving in the Persian Gulf War show an increased...Lox system prolongs survival in Tg SOD1G37R mice compared to their germline littermates (24, 25). Using a neuronal/glial co-culture system, we and...throughput screen system to screen compounds that just might be useful for the treatment of ALS. Several in vitro models of ALS do exist, however
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2007-10-01
A simple model is presented of a possible inspection regimen applied to each leg of a cargo container's journey between its point of origin and destination. Several candidate modalities are proposed to be used at multiple remote locations to act as a pre-screen inspection as the target approaches a perimeter and as the primary inspection modality at the portal. Information from multiple data sets is fused to optimize the costs and performance of a network of such inspection systems. A series of image processing algorithms are presented that automatically process X-ray images of containerized cargo. The goal of this processing is to locate the container in a real-time stream of traffic traversing a portal without impeding the flow of commerce. Such processing may facilitate the inclusion of unmanned/unattended inspection systems in such a network. Several samples of the processing applied to data collected from deployed systems are included. Simulated data from a notional cargo inspection system with multiple sensor modalities and advanced data fusion algorithms are also included to show the potential increased detection and throughput performance of such a configuration.
Comparative Microbial Modules Resource: Generation and Visualization of Multi-species Biclusters
Bate, Ashley; Eichenberger, Patrick; Bonneau, Richard
2011-01-01
The increasing abundance of large-scale, high-throughput datasets for many closely related organisms provides opportunities for comparative analysis via the simultaneous biclustering of datasets from multiple species. These analyses require a reformulation of how to organize multi-species datasets and visualize comparative genomics data analyses results. Recently, we developed a method, multi-species cMonkey, which integrates heterogeneous high-throughput datatypes from multiple species to identify conserved regulatory modules. Here we present an integrated data visualization system, built upon the Gaggle, enabling exploration of our method's results (available at http://meatwad.bio.nyu.edu/cmmr.html). The system can also be used to explore other comparative genomics datasets and outputs from other data analysis procedures – results from other multiple-species clustering programs or from independent clustering of different single-species datasets. We provide an example use of our system for two bacteria, Escherichia coli and Salmonella Typhimurium. We illustrate the use of our system by exploring conserved biclusters involved in nitrogen metabolism, uncovering a putative function for yjjI, a currently uncharacterized gene that we predict to be involved in nitrogen assimilation. PMID:22144874
Comparative microbial modules resource: generation and visualization of multi-species biclusters.
Kacmarczyk, Thadeous; Waltman, Peter; Bate, Ashley; Eichenberger, Patrick; Bonneau, Richard
2011-12-01
The increasing abundance of large-scale, high-throughput datasets for many closely related organisms provides opportunities for comparative analysis via the simultaneous biclustering of datasets from multiple species. These analyses require a reformulation of how to organize multi-species datasets and visualize comparative genomics data analyses results. Recently, we developed a method, multi-species cMonkey, which integrates heterogeneous high-throughput datatypes from multiple species to identify conserved regulatory modules. Here we present an integrated data visualization system, built upon the Gaggle, enabling exploration of our method's results (available at http://meatwad.bio.nyu.edu/cmmr.html). The system can also be used to explore other comparative genomics datasets and outputs from other data analysis procedures - results from other multiple-species clustering programs or from independent clustering of different single-species datasets. We provide an example use of our system for two bacteria, Escherichia coli and Salmonella Typhimurium. We illustrate the use of our system by exploring conserved biclusters involved in nitrogen metabolism, uncovering a putative function for yjjI, a currently uncharacterized gene that we predict to be involved in nitrogen assimilation. © 2011 Kacmarczyk et al.
Qiu, Guanglei; Zhang, Sui; Srinivasa Raghavan, Divya Shankari; Das, Subhabrata; Ting, Yen-Peng
2016-11-01
This work uncovers an important feature of the forward osmosis membrane bioreactor (FOMBR) process: the decoupling of contaminants retention time (CRT) and hydraulic retention time (HRT). Based on this concept, the capability of the hybrid microfiltration-forward osmosis membrane bioreactor (MF-FOMBR) in achieving high-throughput treatment of municipal wastewater with enhanced phosphorus recovery was explored. High removal of TOC and NH4(+)-N (90% and 99%, respectively) was achieved with HRTs down to 47 min, with the treatment capacity increased by an order of magnitude. Reduced HRT did not affect phosphorus removal and recovery. As a result, the phosphorus recovery capacity was also increased by the same order. Reduced HRT resulted in increased system loading rates and thus elevated concentrations of mixed liquor suspended solids and increased membrane fouling. 454-pyrosequencing suggested the thriving of Bacteroidetes and Proteobacteria (especially Sphingobacteriales, Flavobacteriales and Thiothrix members), as well as the community succession and dynamics of ammonium-oxidizing and nitrite-oxidizing bacteria. Copyright © 2016 Elsevier Ltd. All rights reserved.
Next generation platforms for high-throughput biodosimetry.
Repin, Mikhail; Turner, Helen C; Garty, Guy; Brenner, David J
2014-06-01
Here the general concept of the combined use of plates and tubes in racks compatible with the American National Standards Institute/the Society for Laboratory Automation and Screening microplate formats as the next generation platforms for increasing the throughput of biodosimetry assays was described. These platforms can be used at different stages of biodosimetry assays starting from blood collection into microtubes organised in standardised racks and ending with the cytogenetic analysis of samples in standardised multiwell and multichannel plates. Robotically friendly platforms can be used for different biodosimetry assays in minimally equipped laboratories and on cost-effective automated universal biotech systems. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
High-throughput NGL electron-beam direct-write lithography system
NASA Astrophysics Data System (ADS)
Parker, N. William; Brodie, Alan D.; McCoy, John H.
2000-07-01
Electron beam lithography systems have historically had low throughput. The only practical solution to this limitation is an approach using many beams writing simultaneously. For single-column multi-beam systems, including projection optics (SCALPEL and PREVAIL) and blanked aperture arrays, throughput and resolution are limited by space-charge effects. Multi-beam microcolumn (one beam per column) systems are limited by the need for low-voltage operation, electrical connection density and fabrication complexities. In this paper, we discuss a new multi-beam concept employing multiple columns, each with multiple beams, to generate a very large total number of parallel writing beams. This overcomes the limitations of space-charge interactions and low-voltage operation. We also discuss a rationale leading to the optimum number of columns and beams per column. Using this approach we show how production throughputs >= 60 wafers per hour can be achieved at CDs
The development of multi-well microelectrode array (mwMEA) systems has increased in vitro screening throughput making them an effective method to screen and prioritize large sets of compounds for potential neurotoxicity. In the present experiments, a multiplexed approach was used...
Hyperspectral imaging using the single-pixel Fourier transform technique
NASA Astrophysics Data System (ADS)
Jin, Senlin; Hui, Wangwei; Wang, Yunlong; Huang, Kaicheng; Shi, Qiushuai; Ying, Cuifeng; Liu, Dongqi; Ye, Qing; Zhou, Wenyuan; Tian, Jianguo
2017-03-01
Hyperspectral imaging technology is playing an increasingly important role in the fields of food analysis, medicine and biotechnology. To improve the speed of operation and increase the light throughput in a compact equipment structure, a Fourier transform hyperspectral imaging system based on a single-pixel technique is proposed in this study. Compared with current imaging spectrometry approaches, the proposed system has a wider spectral range (400-1100 nm), a better spectral resolution (1 nm) and requires fewer measurement data (a sample rate of 6.25%). The performance of this system was verified by its application to the non-destructive testing of potatoes.
Yoshii, Yukie; Furukawa, Takako; Waki, Atsuo; Okuyama, Hiroaki; Inoue, Masahiro; Itoh, Manabu; Zhang, Ming-Rong; Wakizaka, Hidekatsu; Sogawa, Chizuru; Kiyono, Yasushi; Yoshii, Hiroshi; Fujibayashi, Yasuhisa; Saga, Tsuneo
2015-05-01
Anti-cancer drug development typically utilizes high-throughput screening with two-dimensional (2D) cell culture. However, 2D culture induces cellular characteristics different from tumors in vivo, resulting in inefficient drug development. Here, we report an innovative high-throughput screening system using nanoimprinting 3D culture to simulate in vivo conditions, thereby facilitating efficient drug development. We demonstrated that cell line-based nanoimprinting 3D screening can more efficiently select drugs that effectively inhibit cancer growth in vivo as compared to 2D culture. Metabolic responses after treatment were assessed using positron emission tomography (PET) probes, and revealed similar characteristics between the 3D spheroids and in vivo tumors. Further, we developed an advanced method to adopt cancer cells from patient tumor tissues for high-throughput drug screening with nanoimprinting 3D culture, which we termed Cancer tissue-Originated Uniformed Spheroid Assay (COUSA). This system identified drugs that were effective in xenografts of the original patient tumors. Nanoimprinting 3D spheroids showed low permeability and formation of hypoxic regions inside, similar to in vivo tumors. Collectively, nanoimprinting 3D culture provides an easy-to-handle, high-throughput drug screening system, which allows for efficient drug development by mimicking the tumor environment. The COUSA system could be a useful platform for drug development with patient cancer cells. Copyright © 2015 Elsevier Ltd. All rights reserved.
SDN based millimetre wave radio over fiber (RoF) network
NASA Astrophysics Data System (ADS)
Amate, Ahmed; Milosavljevic, Milos; Kourtessis, Pandelis; Robinson, Matthew; Senior, John M.
2015-01-01
This paper introduces software-defined, millimeter-wave (mm-Wave) networks with Radio over Fiber (RoF) for the delivery of the gigabit connectivity required for fifth-generation (5G) mobile networks. This network will enable an effective open-access system allowing providers to manage and lease the infrastructure to service providers through new unbundled business models. Exploiting the inherent benefits of RoF, complete base station functionalities are centralized at the edges of the metro and aggregation network, leaving remote radio heads (RRHs) with only tunable filtering and amplification. A Software Defined Network (SDN) Central Controller (SCC) is responsible for managing the resources across several mm-Wave Radio Access Networks (RANs), providing a global view of the several network segments. This ensures flexible resource allocation for reduced overall latency and increased throughput. The SDN-based mm-Wave RAN also allows for inter-edge-node communication. Therefore, certain packets can be routed between different RANs supported by the same edge node, reducing latency. System-level simulations of the complete network have shown significant improvement of the overall throughput and SINR for wireless users by providing effective resource allocation and coordination among interfering cells. A new Coordinated Multipoint (CoMP) algorithm exploiting the benefits of the SCC global network view for reduced delay in control message exchange is presented, accounting for a minimum packet delay and limited Channel State Information (CSI) in a Long Term Evolution-Advanced (LTE-A), Cloud RAN (CRAN) configuration. The algorithm does not require detailed CSI feedback from UEs but rather considers UE location (determined by the eNB) as the required parameter. UE throughput in the target sector is represented using a Cumulative Distribution Function (CDF).
The resulting CDF curves show a significant 60% improvement in UE cell-edge throughput following the application, in the coordinating cells, of the new CoMP algorithm. Results also show a further improvement of 36% in cell-edge UE throughput when eNBs are centralized in a CRAN backhaul architecture. The SINR distribution of UEs in the cooperating cells has also been evaluated using a box plot. As expected, UEs with CoMP perform better, demonstrating an increase of over 2 dB at the median between the transmission scenarios.
AIRSAR Automated Web-based Data Processing and Distribution System
NASA Technical Reports Server (NTRS)
Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen
2005-01-01
In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
Fridén, Markus; Ducrozet, Frederic; Middleton, Brian; Antonsson, Madeleine; Bredberg, Ulf; Hammarlund-Udenaes, Margareta
2009-06-01
New, more efficient methods of estimating unbound drug concentrations in the central nervous system (CNS) combine the amount of drug in whole brain tissue samples measured by conventional methods with in vitro estimates of the unbound brain volume of distribution (V(u,brain)). Although the brain slice method is the most reliable in vitro method for measuring V(u,brain), it has not previously been adapted for the needs of drug discovery research. The aim of this study was to increase the throughput and optimize the experimental conditions of this method. Equilibrium of drug between the buffer and the brain slice within the 4 to 5 h of incubation is a fundamental requirement. However, it is difficult to meet this requirement for many of the extensively binding, lipophilic compounds in drug discovery programs. In this study, the dimensions of the incubation vessel and mode of stirring influenced the equilibration time, as did the amount of brain tissue per unit of buffer volume. The use of cassette experiments for investigating V(u,brain) in a linear drug concentration range increased the throughput of the method. The V(u,brain) for the model compounds ranged from 4 to 3000 ml . g brain(-1), and the sources of variability are discussed. The optimized setup of the brain slice method allows precise, robust estimation of V(u,brain) for drugs with diverse properties, including highly lipophilic compounds. This is a critical step forward for the implementation of relevant measurements of CNS exposure in the drug discovery setting.
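The conversion at the heart of this method, from total brain amount to an unbound CNS concentration via V(u,brain), can be sketched in a few lines; the function name and the numbers below are illustrative only, not values from the study.

```python
# Unbound brain concentration from the total brain amount and the unbound
# volume of distribution V(u,brain), per the slice-method workflow:
#   C_unbound = A_brain / V(u,brain)
# Illustrative values only; not data from the study.

def unbound_brain_conc(a_brain_nmol_per_g, v_u_brain_ml_per_g):
    """Return the unbound concentration (nmol/ml) in brain interstitial fluid."""
    return a_brain_nmol_per_g / v_u_brain_ml_per_g

# A hypothetical compound with 12 nmol/g in whole brain and V(u,brain) = 4 ml/g:
print(unbound_brain_conc(12.0, 4.0))  # 3.0 nmol/ml
```

Note that for the same total brain amount, a compound with extensive tissue binding (large V(u,brain), as for the lipophilic compounds discussed above) yields a proportionally lower unbound concentration.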
Zmijan, Robert; Jonnalagadda, Umesh S.; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn
2015-01-01
We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint. PMID:29456838
Methodology for Collision Risk Assessment of an Airspace Flow Corridor Concept
NASA Astrophysics Data System (ADS)
Zhang, Yimin
This dissertation presents a methodology to estimate the collision risk associated with a future air-transportation concept called the flow corridor. The flow corridor is a Next Generation Air Transportation System (NextGen) concept to reduce congestion and increase throughput in en-route airspace. The flow corridor has the potential to increase throughput by reducing the controller workload required to manage aircraft outside the corridor and by reducing separation of aircraft within the corridor. The analysis in this dissertation is a starting point for the safety analysis required by the Federal Aviation Administration (FAA) to eventually approve and implement the corridor concept. This dissertation develops a hybrid risk analysis methodology that combines Monte Carlo simulation with dynamic event tree analysis. The analysis captures the unique characteristics of the flow corridor concept, including self-separation within the corridor, lane change maneuvers, speed adjustments, and the automated separation assurance system. Monte Carlo simulation is used to model the movement of aircraft in the flow corridor and to identify precursor events that might lead to a collision. Since these precursor events are not rare, standard Monte Carlo simulation can be used to estimate these occurrence rates. Dynamic event trees are then used to model the subsequent series of events that may lead to collision. When two aircraft are on course for a near-mid-air collision (NMAC), the on-board automated separation assurance system provides a series of safety layers to prevent the impending NMAC or collision. Dynamic event trees are used to evaluate the potential failures of these layers in order to estimate the rare-event collision probabilities. The results show that the throughput can be increased by reducing separation to 2 nautical miles while maintaining the current level of safety.
A sensitivity analysis shows that the most critical parameters in the model with respect to the overall collision probability are the minimum separation, the probability that both flights fail to respond to the traffic collision avoidance system, the probability that an NMAC results in a collision, the failure probability of the automatic dependent surveillance-broadcast (ADS-B) receiver, and the conflict detection probability.
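The two-stage structure described above, standard Monte Carlo for the non-rare precursor rate and an event-tree product of conditional layer failures for the rare tail, can be sketched as follows. All rates and branch probabilities here are hypothetical placeholders, not values from the dissertation.

```python
import random

random.seed(1)  # reproducible sketch

# Stage 1 (Monte Carlo): estimate the non-rare precursor probability, e.g.
# that a pair of corridor aircraft ends up on an NMAC course. p_conflict is
# a hypothetical placeholder.
def precursor_rate(trials=100_000, p_conflict=0.002):
    hits = sum(random.random() < p_conflict for _ in range(trials))
    return hits / trials

# Stage 2 (dynamic event tree): chain the conditional failure probabilities
# of the safety layers; the collision probability is their product with the
# precursor rate. All branch probabilities are hypothetical placeholders.
P_TCAS_FAIL = 1e-3          # both flights fail to respond to TCAS
P_ADSB_FAIL = 1e-4          # ADS-B receiver failure
P_NMAC_TO_COLLISION = 0.1   # an unresolved NMAC becomes a collision

p_collision = precursor_rate() * P_TCAS_FAIL * P_ADSB_FAIL * P_NMAC_TO_COLLISION
print(f"estimated collision probability: {p_collision:.3e}")
```

The point of the split is that the precursor event is common enough for direct simulation, while the conditional layer failures are multiplied analytically, avoiding the impossible cost of simulating the rare event end to end.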
Throughput increase by adjustment of the BARC drying time with coat track process
NASA Astrophysics Data System (ADS)
Brakensiek, Nickolas L.; Long, Ryan
2005-05-01
Throughput of a coater module within the coater track is related to the solvent evaporation rate from the material that is being coated. Evaporation rate is controlled by the spin dynamics of the wafer and airflow dynamics over the wafer. Balancing these effects is the key to achieving very uniform coatings across a flat unpatterned wafer. As today's coat tracks are being pushed to higher throughputs to match the scanner, the coat module throughput must be increased as well. For chemical manufacturers the evaporation rate of the material depends on the solvent used. One measure of relative evaporation rates is to compare the flash points of solvents: the lower the flash point, the quicker the solvent will evaporate. It is possible to formulate products with these volatile solvents, although at a price. Shipping and manufacturing a more flammable product increases the chances of fire, thereby increasing insurance premiums. Also, the end user of these chemicals will have to take extra precautions in the fab and in storage of these more flammable chemicals. An alternative coat process is possible which would allow higher throughput in a distinct coat module without sacrificing safety. A tradeoff is required for this process, that being a more complicated coat process and a higher-viscosity chemical. The coat process exploits the fact that the evaporation rate depends on the spin dynamics of the wafer by utilizing a series of spin speeds that first set the thickness of the material, followed by a high spin speed to remove the residual solvent. This new process can yield a throughput of over 150 wafers per hour (wph) given two coat modules. The thickness uniformity of less than 2 nm (3 sigma) is still excellent, while drying times are shorter than 10 seconds to achieve the 150 wph throughput targets.
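As a quick check of the throughput arithmetic, the 150 wph target with two coat modules implies a per-module cycle time; the sketch below assumes the two modules are perfectly interleaved.

```python
# Back-of-envelope check of the cycle time implied by the abstract's target:
# 150 wafers per hour (wph) shared across two interleaved coat modules.
target_wph = 150
modules = 2

wph_per_module = target_wph / modules   # 75 wafers/hour for each module
cycle_time_s = 3600 / wph_per_module    # seconds available per wafer
print(cycle_time_s)  # 48.0 -> a <10 s drying step fits comfortably
```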
Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich
2006-01-01
The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be quite comparable with other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have speeded up our drug discovery process greatly.
Analysis of Optical CDMA Signal Transmission: Capacity Limits and Simulation Results
NASA Astrophysics Data System (ADS)
Garba, Aminata A.; Yim, Raymond M. H.; Bajcsy, Jan; Chen, Lawrence R.
2005-12-01
We present performance limits of optical code-division multiple-access (OCDMA) networks. In particular, we evaluate the information-theoretical capacity of OCDMA transmission when single-user detection (SUD) is used by the receiver. First, we model the OCDMA transmission as a discrete memoryless channel, evaluate its capacity when binary modulation is used in the interference-limited (noiseless) case, and extend this analysis to the case when additive white Gaussian noise (AWGN) is corrupting the received signals. Next, we analyze the benefits of using nonbinary signaling for increasing the throughput of optical CDMA transmission. It turns out that up to a fourfold increase in the network throughput can be achieved with practical numbers of modulation levels in comparison to the traditionally considered binary case. Finally, we present BER simulation results for channel-coded binary and M-ary OCDMA transmission systems. In particular, we apply turbo codes concatenated with Reed-Solomon codes so that up to several hundred concurrent optical CDMA users can be supported at low target bit error rates. We observe that unlike conventional OCDMA systems, turbo-empowered OCDMA can allow overloading (more active users than the length of the spreading sequences) with good bit error rate system performance.
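The capacity evaluated above is the mutual information of a discrete memoryless channel, maximized over input distributions. A generic sketch of the I(X;Y) computation follows; the binary-symmetric transition matrix used in the example is a hypothetical placeholder, not the OCDMA interference model derived in the paper.

```python
from math import log2

# Mutual information I(X;Y) of a discrete memoryless channel; capacity is
# this quantity maximized over input distributions p_x.
# p_y_given_x[i][j] = P(Y = j | X = i).
def mutual_information(p_x, p_y_given_x):
    n_out = len(p_y_given_x[0])
    p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(len(p_x)))
           for j in range(n_out)]
    info = 0.0
    for i, px in enumerate(p_x):
        for j, pyx in enumerate(p_y_given_x[i]):
            if px > 0 and pyx > 0:
                info += px * pyx * log2(pyx / p_y[j])
    return info

# Hypothetical binary channel with 10% crossover probability; with a uniform
# input this gives 1 - H2(0.1), about 0.531 bits per channel use.
print(round(mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]), 3))
```

Extending the alphabet (nonbinary signaling, as in the abstract) simply means larger `p_x` and `p_y_given_x`; the same routine applies.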
Digital Microwave System Design Guide.
1984-02-01
traffic analysis is a continuous effort, setting parameters for subsequent stages of expansion after the system design is finished. 2.1.3 Quality of...operational structure of the user for whom he is providing service. 2.2.3 Quality of Service. In digital communications, the basic performance parameter ...the basic interpretation of system performance is measured in terms of a single parameter, throughput. Throughput can be defined as the number of
Protocols and programs for high-throughput growth and aging phenotyping in yeast.
Jung, Paul P; Christian, Nils; Kay, Daniel P; Skupin, Alexander; Linster, Carole L
2015-01-01
In microorganisms, and more particularly in yeasts, a standard phenotyping approach consists in the analysis of fitness by growth rate determination in different conditions. One growth assay that combines high throughput with high resolution involves the generation of growth curves from 96-well plate microcultivations in thermostated and shaking plate readers. To push the throughput of this method to the next level, we have adapted it in this study to the use of 384-well plates. The values of the extracted growth parameters (lag time, doubling time and yield of biomass) correlated well between experiments carried out in 384-well plates as compared to 96-well plates or batch cultures, validating the higher-throughput approach for phenotypic screens. The method is not restricted to the use of the budding yeast Saccharomyces cerevisiae, as shown by consistent results for other species selected from the Hemiascomycete class. Furthermore, we used the 384-well plate microcultivations to develop and validate a higher-throughput assay for yeast Chronological Life Span (CLS), a parameter that is still commonly determined by a cumbersome method based on counting "Colony Forming Units". To accelerate analysis of the large datasets generated by the described growth and aging assays, we developed the freely available software tools GATHODE and CATHODE. These tools allow for semi-automatic determination of growth parameters and CLS behavior from typical plate reader output files. The described protocols and programs will increase the time- and cost-efficiency of a number of yeast-based systems genetics experiments as well as various types of screens.
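One of the growth parameters extracted above, doubling time, follows directly from two exponential-phase optical-density readings. The sketch below illustrates that relationship only; it is not the algorithm implemented in GATHODE.

```python
from math import log

# Doubling time from two optical-density (OD) readings in exponential phase:
#   OD(t) = OD0 * exp(mu * t)  =>  mu = ln(OD2 / OD1) / (t2 - t1)
#   doubling time = ln(2) / mu
def doubling_time(t1_h, od1, t2_h, od2):
    mu = log(od2 / od1) / (t2_h - t1_h)  # specific growth rate (1/h)
    return log(2) / mu                   # hours per doubling

# OD rising from 0.1 to 0.4 (two doublings) across 2 h -> 1 h doubling time
print(doubling_time(1.0, 0.1, 3.0, 0.4))  # 1.0
```

In practice, tools like GATHODE fit the whole curve rather than two points, which is what makes the lag time and biomass yield recoverable as well.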
Patients’ Heart Monitoring System Based on Wireless Sensor Network
NASA Astrophysics Data System (ADS)
Sollu, T. S.; Alamsyah; Bachtiar, M.; Sooai, A. G.
2018-04-01
Wireless sensor networks (WSNs) have been utilized to support the health field, for example in monitoring a patient's heartbeat. Heart health monitoring is essential in maintaining health, especially in the elderly, and such monitoring is needed to understand a patient's heart characteristics. An increasing number of patients will certainly increase the burden on doctors and nurses in dealing with patients' conditions. Therefore, a solution is required that helps doctors and nurses monitor patients' health in real time. This research proposes the design and implementation of a WSN-based patient heart monitoring system, in which an electrocardiograph (ECG) mounted on the patient's body sends data to a server through ZigBee. The results indicated that 15 seconds of data from a male patient aged 25 years contained 17 beats, equal to 68 bpm. For 884 data packets sent over 15 minutes using ZigBee, the system produced 4488 bytes of data, a throughput of 2.39 Kbps, and an average delay of 0.24486 seconds. Measurements of the communication range under open-space conditions over 15 seconds through ZigBee yielded a throughput of 4.19 Kbps, packet loss of 0%, and an average delay of 6.667 seconds, while under closed-space conditions ZigBee yielded a throughput of 4.27 Kbps, packet loss of 0%, and an average delay of 6.55 seconds.
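Throughput here is payload bits divided by transfer time. As a consistency check (an assumption about how the figure was computed, since the abstract mixes 15-second and 15-minute intervals), the reported 2.39 Kbps matches 4488 bytes delivered over a 15-second window:

```python
# Throughput as payload bits over transfer time; the abstract's 2.39 Kbps
# figure is consistent with 4488 bytes delivered in a 15-second window.
def throughput_kbps(payload_bytes, seconds):
    return payload_bytes * 8 / seconds / 1000

print(round(throughput_kbps(4488, 15), 2))  # 2.39
```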
Research progress of plant population genomics based on high-throughput sequencing.
Wang, Yun-sheng
2016-08-01
Population genomics, a new paradigm for population genetics, combines the concepts and techniques of genomics with the theoretical framework of population genetics, improving our understanding of microevolution through the identification of site-specific and genome-wide effects using genome-wide polymorphic-site genotyping. With the appearance and improvement of next-generation high-throughput sequencing technology, the number of plant species with complete genome sequences has increased rapidly, and large-scale resequencing has also been carried out in recent years. Parallel sequencing has also been done in some plant species without complete genome sequences. These studies have greatly promoted the development of population genomics and deepened our understanding of the genetic diversity, level of linkage disequilibrium, selection effects, demographic history and molecular mechanisms of complex traits of the relevant plant populations at a genomic level. In this review, I briefly introduce the concept and research methods of population genomics and summarize the progress of plant population genomics based on high-throughput sequencing. I also discuss the prospects as well as existing problems of plant population genomics in order to provide references for related studies.
Ehrenworth, Amy M; Claiborne, Tauris; Peralta-Yahya, Pamela
2017-10-17
Chemical biosensors, for which chemical detection triggers a fluorescent signal, have the potential to accelerate the screening of noncolorimetric chemicals produced by microbes, enabling the high-throughput engineering of enzymes and metabolic pathways. Here, we engineer a G-protein-coupled receptor (GPCR)-based sensor to detect serotonin produced by a producer microbe in the producer microbe's supernatant. Detecting a chemical in the producer microbe's supernatant is nontrivial because of the number of other metabolites and proteins present that could interfere with sensor performance. We validate the two-cell screening system for medium-throughput applications, opening the door to the rapid engineering of microbes for the increased production of serotonin. We focus on serotonin detection as serotonin levels limit the microbial production of hydroxystrictosidine, a modified alkaloid that could accelerate the semisynthesis of camptothecin-derived anticancer pharmaceuticals. This work shows the ease of generating GPCR-based chemical sensors and their ability to detect specific chemicals in complex aqueous solutions, such as microbial spent medium. In addition, this work sets the stage for the rapid engineering of serotonin-producing microbes.
Wu, Szu-Huei; Yao, Chun-Hsu; Hsieh, Chieh-Jui; Liu, Yu-Wei; Chao, Yu-Sheng; Song, Jen-Shin; Lee, Jinq-Chyi
2015-07-10
Sodium-dependent glucose co-transporter 2 (SGLT2) inhibitors are of current interest as a treatment for type 2 diabetes. Efforts have been made to discover phlorizin-related glycosides with good SGLT2 inhibitory activity. To increase structural diversity and better understand the role of non-glycoside SGLT2 inhibitors in glycemic control, we initiated a research program to identify non-glycoside hits from high-throughput screening. Here, we report the development of a novel, fluorogenic probe-based glucose uptake system based on a Cu(I)-catalyzed [3+2] cycloaddition. Its safer procedures and cheaper reagents made the developed assay our first choice for large-scale primary screening, compared with the well-known [(14)C]-labeled α-methyl-D-glucopyranoside ([(14)C]-AMG) radioactive assay. This effort culminated in the identification of a benzimidazole, non-glycoside SGLT2 hit with an EC50 value of 0.62 μM by high-throughput screening of 41,000 compounds. Copyright © 2015 Elsevier B.V. All rights reserved.
Selecting the most appropriate time points to profile in high-throughput studies
Kleyman, Michael; Sefer, Emre; Nicola, Teodora; Espinoza, Celia; Chhabra, Divya; Hagood, James S; Kaminski, Naftali; Ambalavanan, Namasivayam; Bar-Joseph, Ziv
2017-01-01
Biological systems are increasingly being studied by high-throughput profiling of molecular data over time. Determining the set of time points to sample in studies that profile several different types of molecular data is still challenging. Here we present the Time Point Selection (TPS) method that solves this combinatorial problem in a principled and practical way. TPS utilizes expression data from a small set of genes sampled at a high rate. As we show by applying TPS to study mouse lung development, the points selected by TPS can be used to reconstruct an accurate representation of the expression values of the non-selected points. Further, even though the selection is only based on gene expression, these points are also appropriate for representing a much larger set of protein, miRNA and DNA methylation changes over time. TPS can thus serve as a key design strategy for high-throughput time series experiments. Supporting Website: www.sb.cs.cmu.edu/TPS DOI: http://dx.doi.org/10.7554/eLife.18541.001 PMID:28124972
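The combinatorial problem TPS solves can be approximated by greedy forward selection of time points against reconstruction error from the densely sampled pilot profiles. A simplified stand-in for illustration; TPS itself uses a more principled optimization than this sketch:

```python
import numpy as np

def select_time_points(t, profiles, k):
    """Greedily pick k time points whose piecewise-linear interpolation
    best reconstructs densely sampled expression profiles."""
    chosen = [0, len(t) - 1]              # endpoints are always retained
    while len(chosen) < k:
        best, best_err = None, np.inf
        for c in range(len(t)):
            if c in chosen:
                continue
            idx = sorted(chosen + [c])
            # squared error of reconstructing every profile from the subset
            err = sum(np.sum((np.interp(t, t[idx], p[idx]) - p) ** 2)
                      for p in profiles)
            if err < best_err:
                best, best_err = c, err
        chosen.append(best)
    return sorted(chosen)

# two synthetic dense profiles sampled at 21 points
t = np.linspace(0, 1, 21)
profiles = [np.sin(2 * np.pi * t), t ** 2]
picks = select_time_points(t, profiles, 5)
```

The selected indices would then define the sampling schedule for the expensive proteomic and epigenomic assays.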
Experimental Evaluation of Adaptive Modulation and Coding in MIMO WiMAX with Limited Feedback
NASA Astrophysics Data System (ADS)
Mehlführer, Christian; Caban, Sebastian; Rupp, Markus
2007-12-01
We evaluate the throughput performance of an OFDM WiMAX (IEEE 802.16-2004, Section 8.3) transmission system with adaptive modulation and coding (AMC) by outdoor measurements. The standard-compliant AMC utilizes a 3-bit feedback for SISO and Alamouti-coded MIMO transmissions. By applying a 6-bit feedback and spatial multiplexing with individual AMC on the two transmit antennas, the data throughput can be increased significantly for large SNR values. Our measurements show that at small SNR values, a single-antenna transmission often outperforms an Alamouti transmission. We found that this effect is caused by the asymmetric behavior of the wireless channel and by poor channel knowledge in the two-transmit-antenna case. Our performance evaluation is based on a measurement campaign employing the Vienna MIMO testbed. The measurement scenarios include typical outdoor-to-indoor NLOS, outdoor-to-outdoor NLOS, as well as outdoor-to-indoor LOS connections. We found that in all these scenarios, the measured throughput is far from its achievable maximum; the loss is mainly caused by the overly simple convolutional code.
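A 3-bit AMC feedback loop amounts to indexing an 8-entry modulation-and-coding table by measured SNR and reporting the index. A sketch of that mechanism; the thresholds and table entries below are illustrative assumptions, not the 802.16-2004 profile values:

```python
import math

# hypothetical 3-bit AMC table: (modulation order, code rate) per index
MCS = [(2, 0.5), (4, 0.5), (4, 0.75), (16, 0.5),
       (16, 0.75), (64, 0.5), (64, 0.67), (64, 0.75)]
# SNR switching thresholds in dB (index 0 is the fallback, always valid)
THRESH_DB = [-1e9, 3, 6, 9, 12, 15, 18, 21]

def select_mcs(snr_db):
    """Pick the highest-rate MCS whose SNR threshold is met."""
    idx = 0
    for k, th in enumerate(THRESH_DB):
        if snr_db >= th:
            idx = k
    return idx

def spectral_efficiency(idx):
    """Information bits per symbol after coding for a table index."""
    mod, rate = MCS[idx]
    return math.log2(mod) * rate

i = select_mcs(13.0)   # falls between the 12 and 15 dB thresholds
```

With per-antenna AMC, as in the 6-bit scheme above, the same lookup simply runs once per spatial stream.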
Process in manufacturing high efficiency AlGaAs/GaAs solar cells by MO-CVD
NASA Technical Reports Server (NTRS)
Yeh, Y. C. M.; Chang, K. I.; Tandon, J.
1984-01-01
Manufacturing technology for mass-producing high-efficiency GaAs solar cells is discussed. Progress in using a high-throughput MO-CVD reactor to produce high-efficiency GaAs solar cells is described. Thickness and doping-concentration uniformity of metalorganic chemical vapor deposition (MO-CVD) GaAs and AlGaAs layer growth are discussed. In addition, new tooling designs are given which increase the throughput of solar cell processing. To date, 2 cm x 2 cm AlGaAs/GaAs solar cells with efficiencies up to 16.5% have been produced. In order to meet throughput goals for mass-producing GaAs solar cells, a large MO-CVD system (Cambridge Instrument Model MR-200) with a susceptor initially capable of processing 20 wafers (up to 75 mm diameter) during a single growth run was installed. In the MR-200, the sequencing of the gases and the heating power are controlled by a microprocessor-based programmable control console. Hence, operator errors can be reduced, leading to a more reproducible production sequence.
Boosalis, Michael S; Sangerman, Jose I; White, Gary L; Wolf, Roman F; Shen, Ling; Dai, Yan; White, Emily; Makala, Levi H; Li, Biaoru; Pace, Betty S; Nouraie, Mehdi; Faller, Douglas V; Perrine, Susan P
2015-01-01
High-level fetal (γ) globin expression ameliorates clinical severity of the beta (β) hemoglobinopathies, and safe, orally-bioavailable γ-globin inducing agents would benefit many patients. We adapted a LCR-γ-globin promoter-GFP reporter assay to a high-throughput robotic system to evaluate five diverse chemical libraries for this activity. Multiple structurally- and functionally-diverse compounds were identified which activate the γ-globin gene promoter at nanomolar concentrations, including some therapeutics approved for other conditions. Three candidates with established safety profiles were further evaluated in erythroid progenitors, anemic baboons and transgenic mice, with significant induction of γ-globin expression observed in vivo. A lead candidate, Benserazide, emerged which demonstrated > 20-fold induction of γ-globin mRNA expression in anemic baboons and increased F-cell proportions by 3.5-fold in transgenic mice. Benserazide has been used chronically to inhibit amino acid decarboxylase to enhance plasma levels of L-dopa. These studies confirm the utility of high-throughput screening and identify previously unrecognized fetal globin inducing candidates which can be developed expediently for treatment of hemoglobinopathies.
Joint optimization of maintenance, buffers and machines in manufacturing lines
NASA Astrophysics Data System (ADS)
Nahas, Nabil; Nourelfath, Mustapha
2018-01-01
This article considers a series manufacturing line composed of several machines separated by intermediate buffers of finite capacity. The goal is to find the optimal number of preventive maintenance actions performed on each machine, the optimal selection of machines and the optimal buffer allocation plan that minimize the total system cost, while providing the desired system throughput level. The mean times between failures of all machines are assumed to increase when applying periodic preventive maintenance. To estimate the production line throughput, a decomposition method is used. The decision variables in the formulated optimal design problem are buffer levels, types of machines and times between preventive maintenance actions. Three heuristic approaches are developed to solve the formulated combinatorial optimization problem. The first heuristic consists of a genetic algorithm, the second is based on the nonlinear threshold accepting metaheuristic and the third is an ant colony system. The proposed heuristics are compared and their efficiency is shown through several numerical examples. It is found that the nonlinear threshold accepting algorithm outperforms the genetic algorithm and ant colony system, while the genetic algorithm provides better results than the ant colony system for longer manufacturing lines.
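The second heuristic mentioned above, nonlinear threshold accepting, is a simulated-annealing relative that accepts any neighboring solution whose cost worsens by less than a shrinking threshold. A generic sketch of the metaheuristic on a toy buffer-allocation problem; the paper's decomposition-based throughput evaluation is not reproduced here, so the cost function below is a placeholder:

```python
import random

def threshold_accepting(init, neighbor, cost, steps=2000, t0=5.0):
    """Threshold accepting: accept moves that worsen the cost by less
    than a threshold that shrinks linearly to zero over the run."""
    x, cx = init, cost(init)
    best, cbest = x, cx
    for step in range(steps):
        th = t0 * (1 - step / steps)       # shrinking acceptance threshold
        y = neighbor(x)
        cy = cost(y)
        if cy - cx < th:                   # also accepts mild worsenings
            x, cx = y, cy
            if cx < cbest:
                best, cbest = x, cx
    return best, cbest

random.seed(1)

# toy stand-in objective: spread 20 buffer slots evenly over 4 gaps
def cost(b):
    return sum((bi - 5) ** 2 for bi in b) + abs(sum(b) - 20)

def neighbor(b):
    # move one buffer slot between two randomly chosen gaps
    b = list(b)
    i, j = random.sample(range(len(b)), 2)
    if b[i] > 0:
        b[i] -= 1
        b[j] += 1
    return b

best, c = threshold_accepting([20, 0, 0, 0], neighbor, cost)
```

In the paper's setting, `cost` would evaluate total system cost via the decomposition method, with a penalty when the throughput constraint is violated.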
Dual Telecentric Lens System For Projection Onto Tilted Toroidal Screen
NASA Technical Reports Server (NTRS)
Gold, Ronald S.; Hudyma, Russell M.
1995-01-01
System of two optical assemblies for projecting image onto tilted toroidal screen. One projection lens optimized for red and green spectral region; other for blue. Dual-channel approach offers several advantages which include: simplified color filtering, simplified chromatic aberration corrections, less complex polarizing prism arrangement, and increased throughput of blue light energy. Used in conjunction with any source of imagery, designed especially to project images formed by reflection of light from liquid-crystal light valve (LCLV).
Simulation and Analysis of the AFLC Bulk Data Network Using Abstract Data Types.
1981-12-01
performs. Simulation is more expensive than queueing, but it is often the only way to study complex functional relationships in a large system. Unlike... relationship between throughput, response and cost is shown in Figure 2. At a given cost level, additional throughput can be obtained at the expense...improved by adding resources, but this increases the total cost of the system. Network models are used to study the relationship between cost
Kavlock, Robert; Dix, David
2010-02-01
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models.
The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.
NASA Astrophysics Data System (ADS)
Foronda, Augusto; Ohta, Chikara; Tamaki, Hisashi
Dirty paper coding (DPC) is a strategy to achieve the capacity region of multiple input multiple output (MIMO) downlink channels, and a DPC scheduler is throughput optimal if users are selected according to their queue states and current rates. However, DPC is difficult to implement in practical systems. One solution, the zero-forcing beamforming (ZFBF) strategy, has been proposed to achieve the same asymptotic sum rate capacity as DPC with an exhaustive search over the entire user set. Some suboptimal user group selection schedulers with reduced complexity based on the ZFBF strategy (ZFBF-SUS) and the proportional fair (PF) scheduling algorithm (PF-ZFBF) have also been proposed to enhance throughput and fairness among the users, respectively. However, they are not throughput optimal: fairness and throughput decrease when user queue lengths differ due to differing channel quality among users. Therefore, we propose two different scheduling algorithms: a throughput optimal scheduling algorithm (ZFBF-TO) and a reduced complexity scheduling algorithm (ZFBF-RC). Both are based on the ZFBF strategy and, at every time slot, the scheduling algorithms have to select some users based on user channel quality, user queue length and orthogonality among users. Moreover, the proposed algorithms have to produce the rate allocation and power allocation for the selected users based on a modified water filling method. We analyze the schedulers' complexity, and numerical results show that ZFBF-RC provides throughput and fairness improvements compared to the ZFBF-SUS and PF-ZFBF scheduling algorithms.
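The power allocation step rests on water filling. For orientation, here is a textbook sketch of the classic (unmodified) water-filling allocation over parallel channels; the queue-aware modification used by the proposed schedulers is not shown:

```python
def water_filling(gains, total_power):
    """Allocate total_power over parallel channels with the given gains:
    p_i = max(0, mu - 1/g_i), with the water level mu chosen so that the
    active allocations sum to total_power."""
    active = sorted(gains, reverse=True)
    while active:
        mu = (total_power + sum(1 / g for g in active)) / len(active)
        if mu - 1 / active[-1] >= 0:   # weakest active channel still fed
            break
        active.pop()                   # drop it and recompute the level
    return [max(0.0, mu - 1 / g) if g in active else 0.0 for g in gains]

# three channels; the weakest (gain 0.1) gets no power at this budget
p = water_filling([2.0, 1.0, 0.1], 1.0)
```

In the schedulers above, the channel gains would be the effective gains of the ZFBF-selected users, and the water level would additionally reflect queue lengths.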
Application of an industrial robot to nuclear pharmacy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viola, J.
1994-12-31
Increased patient throughput and lengthened P.E.T. scan protocols have increased the radiation dose received by P.E.T. technologists. Automated methods of tracer infusion and blood sampling have been introduced to reduce direct contact with the radioisotopes, but significant radiation exposure still exists during the receipt and dispensing of the patient dose. To address this situation the authors have developed an automated robotic system which performs these tasks, thus limiting the physical contact between operator and radioisotope.
USDA-ARS?s Scientific Manuscript database
Recent developments in high-throughput sequencing technology have made low-cost sequencing an attractive approach for many genome analysis tasks. Increasing read lengths, improving quality and the production of increasingly larger numbers of usable sequences per instrument-run continue to make whole...
Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won
2015-08-21
We report a new and simple design of a fully automated dual-online ultra-high pressure liquid chromatography system (sDO-UHPLC). The system employs only two nano-volume switching valves (a two-position four-port valve and a two-position ten-port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features that include high duty cycle, back-flushing sample injection for fast and narrow-zone sample injection, online desalting, high separation resolution and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high-throughput deep proteome profiling experiments but also in high-throughput MRM experiments.
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-01-01
Background In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study in detail high-throughput cultivation processes and especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated here. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only ± 7%. The third method, 'biomass-specific replication', made it possible to generate equal initial biomass concentrations in main cultures from precultures with different growth kinetics.
This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. Conclusion The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization. PMID:19646274
Mathematical and Computational Modeling in Complex Biological Systems
Ji, Zhiwei; Yan, Ke; Li, Wenyang; Hu, Haigen; Zhu, Xiaoliang
2017-01-01
The biological processes and molecular functions involved in cancer progression remain difficult for biologists and clinical doctors to understand. Recent developments in high-throughput technologies urge systems biology to achieve more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the development of high-throughput technologies and the systemic modeling of biological processes in cancer research. In this review, we first examine several typical mathematical modeling approaches for biological systems at different scales and analyze in depth their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling are summarized. To conclude, this review provides an update on important solutions using computational modeling approaches in systems biology. PMID:28386558
FPGA cluster for high-performance AO real-time control system
NASA Astrophysics Data System (ADS)
Geng, Deli; Goodsell, Stephen J.; Basden, Alastair G.; Dipper, Nigel A.; Myers, Richard M.; Saunter, Chris D.
2006-06-01
Whilst the high-throughput and low-latency requirements of next-generation AO real-time control systems have posed a significant challenge to von Neumann architecture processor systems, the Field Programmable Gate Array (FPGA) has emerged as a long-term solution with high throughput performance and excellent latency predictability. Moreover, FPGA devices have highly capable programmable interfacing, which leads to more highly integrated systems. Nevertheless, a single FPGA is still not enough: multiple FPGA devices need to be clustered to perform the required subaperture processing and reconstruction computation. In an AO real-time control system, memory bandwidth is often the bottleneck, simply because a vast amount of supporting data, e.g. pixel calibration maps and the reconstruction matrix, needs to be accessed within a short period. The cluster, as a general computing architecture, has excellent scalability in processing throughput, memory bandwidth, memory capacity, and communication bandwidth. Problems such as task distribution, node communication, and system verification are discussed.
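The memory-bandwidth bottleneck can be sized with back-of-envelope arithmetic: the dominant traffic is re-reading the reconstruction matrix once per frame. A sketch of that estimate; the system dimensions below are illustrative assumptions, not taken from the paper:

```python
def mvm_bandwidth_gbs(n_slopes, n_actuators, frame_rate_hz, bytes_per_word=4):
    """Sustained memory traffic (GB/s) for streaming an
    n_actuators x n_slopes reconstruction matrix once per frame,
    as required by the matrix-vector reconstruction step."""
    words_per_frame = n_actuators * n_slopes
    return words_per_frame * bytes_per_word * frame_rate_hz / 1e9

# hypothetical ELT-class system: 5000 slopes, 3000 actuators, 1 kHz loop
bw = mvm_bandwidth_gbs(5000, 3000, 1000)   # tens of GB/s
```

Numbers on this order exceed what a single device's external memory interface sustains, which is why the matrix is partitioned across the FPGA cluster nodes.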
Shahini, Mehdi; Yeow, John T W
2011-08-12
We report on the enhancement of electrical cell lysis using carbon nanotubes (CNTs). Electrical cell lysis systems are widely utilized in microchips as they are well suited to integration into lab-on-a-chip devices. However, cell lysis based on electrical mechanisms has high voltage requirements. Here, we demonstrate that by incorporating CNTs into microfluidic electrolysis systems, the required voltage for lysis is reduced by half and the lysis throughput at low voltages is improved by ten times, compared to non-CNT microchips. In our experiment, E. coli cells are lysed while passing through an electric field in a microchannel. Based on the lightning rod effect, the electric field strengthened at the tip of the CNTs enhances cell lysis at lower voltage and higher throughput. This approach enables easy integration of cell lysis with other on-chip high-throughput sample-preparation processes.
QPatch: the missing link between HTS and ion channel drug discovery.
Mathes, Chris; Friis, Søren; Finley, Michael; Liu, Yi
2009-01-01
The conventional patch clamp has long been considered the best approach for studying ion channel function and pharmacology. However, its low throughput has been a major hurdle to overcome for ion channel drug discovery. The recent emergence of higher throughput, automated patch clamp technology begins to break this bottleneck by providing medicinal chemists with high-quality, information-rich data in a more timely fashion. As such, these technologies have the potential to bridge a critical missing link between high-throughput primary screening and meaningful ion channel drug discovery programs. One of these technologies, the QPatch automated patch clamp system developed by Sophion Bioscience, records whole-cell ion channel currents from 16 or 48 individual cells in a parallel fashion. Here, we review the general applicability of the QPatch to studying a wide variety of ion channel types (voltage-/ligand-gated cationic/anionic channels) in various expression systems. The success rate of gigaseals, formation of the whole-cell configuration and usable cells ranged from 40-80%, depending on a number of factors including the cell line used, ion channel expressed, assay development or optimization time and expression level in these studies. We present detailed analyses of the QPatch features and results in case studies in which secondary screening assays were successfully developed for a voltage-gated calcium channel and a ligand-gated TRP channel. The increase in throughput compared to conventional patch clamp with the same cells was approximately 10-fold. We conclude that the QPatch, combining high data quality and speed with user friendliness and suitability for a wide array of ion channels, resides on the cutting edge of automated patch clamp technology and plays a pivotal role in expediting ion channel drug discovery.
Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander
2015-01-01
Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons" enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC).
DIRT is a high-volume central repository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and to compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks, enabling plant scientists to spend more time on science rather than on technology. All stored and computed data is easily accessible to the public and the broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.
Huang, Xiaojing; Lauer, Kenneth; Clark, Jesse N.; ...
2015-03-13
We report an experimental ptychography measurement performed in fly-scan mode. With a visible-light laser source, we demonstrate a 5-fold reduction of data acquisition time. By including multiple mutually incoherent modes into the incident illumination, high quality images were successfully reconstructed from blurry diffraction patterns. Thus, this approach significantly increases the throughput of ptychography, especially for three-dimensional applications and the visualization of dynamic systems.
A high throughput screen for biomining cellulase activity from metagenomic libraries.
Mewis, Keith; Taupp, Marcus; Hallam, Steven J
2011-02-01
Cellulose, the most abundant source of organic carbon on the planet, has wide-ranging industrial applications with increasing emphasis on biofuel production (1). Chemical methods to modify or degrade cellulose typically require strong acids and high temperatures. As such, enzymatic methods have become prominent in the bioconversion process. While the identification of active cellulases from bacterial and fungal isolates has been somewhat effective, the vast majority of microbes in nature resist laboratory cultivation. Environmental genomic, also known as metagenomic, screening approaches have great promise in bridging the cultivation gap in the search for novel bioconversion enzymes. Metagenomic screening approaches have successfully recovered novel cellulases from environments as varied as soils (2), buffalo rumen (3) and the termite hind-gut (4) using carboxymethylcellulose (CMC) agar plates stained with congo red dye (based on the method of Teather and Wood (5)). However, the CMC method is limited in throughput, is not quantitative and manifests a low signal-to-noise ratio (6). Other methods have been reported (7,8), but each uses an agar plate-based assay, which is undesirable for high-throughput screening of large-insert genomic libraries. Here we present a solution-based screen for cellulase activity using a chromogenic dinitrophenol (DNP)-cellobioside substrate (9). Our library was cloned into the pCC1 copy control fosmid to increase assay sensitivity through copy number induction (10). The method uses one-pot chemistry in 384-well microplates with the final readout provided as an absorbance measurement. This readout is quantitative, sensitive and automated, with a throughput of up to 100 384-well plates per day using a liquid handler and plate reader with an attached stacking system.
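The readout described above is a per-well absorbance value, so hit-calling reduces to comparing wells against plate controls. One widely used screening-window statistic (not named in the abstract; shown purely as an illustration, with invented control values) is the Z'-factor:

```python
import statistics

def z_prime(pos: list[float], neg: list[float]) -> float:
    """Z'-factor: screening-window quality from positive/negative control wells.
    Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)

# Hypothetical absorbance values for control wells on one 384-well plate.
positives = [1.02, 0.98, 1.05, 0.99, 1.01, 0.97]
negatives = [0.11, 0.09, 0.10, 0.12, 0.08, 0.10]
print(f"Z' = {z_prime(positives, negatives):.2f}")
```

By convention, an assay with Z' above roughly 0.5 is considered robust enough for single-replicate screening.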
Cai, Yingying; Xia, Miaomiao; Dong, Huina; Qian, Yuan; Zhang, Tongcun; Zhu, Beiwei; Wu, Jinchuan; Zhang, Dawei
2018-05-11
As a very important coenzyme in cell metabolism, vitamin B12 (cobalamin, VB12) has been widely used in the food and medicine fields. The complete biosynthesis of VB12 requires approximately 30 genes, but overexpression of these genes did not result in the expected increase of VB12 production. High-yield VB12-producing strains are usually obtained by mutagenesis treatments, so developing an efficient screening approach is urgently needed. With the help of engineered strains with varied capacities of VB12 production, a riboswitch library was constructed and screened, and the btuB element from Salmonella typhimurium was identified as the best regulatory device. A flow cytometry high-throughput screening system was developed based on the btuB riboswitch to identify positive mutants with high efficiency. Mutation of Sinorhizobium meliloti (S. meliloti) was optimized using the novel mutation technique of atmospheric and room temperature plasma (ARTP). Finally, the mutant S. meliloti MC5-2 was obtained and considered as a candidate for industrial applications. After 7 days' cultivation on a rotary shaker at 30 °C, the VB12 titer of S. meliloti MC5-2 reached 156 ± 4.2 mg/L, which was 21.9% higher than that of the wild type strain S. meliloti 320 (128 ± 3.2 mg/L). The genome of S. meliloti MC5-2 was sequenced, and gene mutations were identified and analyzed. To our knowledge, this is the first time that a riboswitch element has been used in S. meliloti. The flow cytometry high-throughput screening system was successfully developed and a high-yield VB12-producing strain was obtained. The identified and analyzed gene mutations give useful information for developing high-yield strains by metabolic engineering. Overall, this work provides a useful high-throughput screening method for developing high-VB12-yield strains.
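The reported 21.9% gain can be checked directly from the two titers quoted above; a one-line sanity check:

```python
def percent_increase(new: float, old: float) -> float:
    """Relative titer gain of a mutant over its parent strain, in percent."""
    return (new - old) / old * 100.0

# Titers from the abstract: mutant S. meliloti MC5-2 vs. wild type 320.
gain = percent_increase(156.0, 128.0)
print(f"VB12 titer gain: {gain:.1f}%")  # 21.9%
```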
Baculovirus expression system and method for high throughput expression of genetic material
Clark, Robin; Davies, Anthony
2001-01-01
The present invention provides novel recombinant baculovirus expression systems for expressing foreign genetic material in a host cell. Such expression systems are readily adapted to an automated method for expressing foreign genetic material in a high-throughput manner. In other aspects, the present invention features a novel automated method for determining the function of foreign genetic material by transfecting the same into a host by way of the recombinant baculovirus expression systems according to the present invention.
Throughput Benefit Assessment for Tactical Runway Configuration Management (TRCM)
NASA Technical Reports Server (NTRS)
Phojanamongkolkij, Nipa; Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Fenbert, James W.
2014-01-01
The System-Oriented Runway Management (SORM) concept is a collection of needed capabilities focused on more efficient use of runways while considering all of the factors that affect runway use. Tactical Runway Configuration Management (TRCM), one of the SORM capabilities, provides runway configuration and runway usage recommendations and monitors the active runway configuration for suitability given existing factors, based on a 90-minute planning horizon. This study evaluates the throughput benefits using a representative sample of today's traffic volumes at three airports: Memphis International Airport (MEM), Dallas-Fort Worth International Airport (DFW), and John F. Kennedy International Airport (JFK). Based on this initial assessment, there are statistically significant throughput benefits for both arrivals and departures at MEM, with an average of 4% for arrivals and 6% for departures. For DFW, there is a statistically significant benefit for arrivals, with an average of 3%. Although an average 1% benefit is observed for departures, it is not statistically significant. For JFK, there is a 12% benefit for arrivals, but a 2% penalty for departures. The results obtained are for current traffic volumes and should show greater benefit for increased future demand. This paper also proposes some potential TRCM algorithm improvements for future research. A continued research plan is being developed to implement these improvements and to re-assess the throughput benefit for current and projected future traffic volumes.
NASA Astrophysics Data System (ADS)
Fang, Sheng-Po; Jao, PitFee; Senior, David E.; Kim, Kyoung-Tae; Yoon, Yong-Kyu
2017-12-01
High-throughput nanomanufacturing of photopatternable nanofibers and subsequent photopatterning is reported. For the production of high-density nanofibers, the tube nozzle electrospinning (TNE) process has been used, in which an array of micronozzles on the sidewall of a plastic tube serves as spinnerets. By increasing the density of nozzles, the electric fields of adjacent nozzles confine the electrospinning cone and give a higher density of nanofibers. With TNE, higher nozzle densities are easily achievable compared to metallic nozzles; e.g. an inter-nozzle distance as small as 0.5 cm and an average semi-vertical repulsion angle of 12.28° were achieved for 8 nozzles. Nanofiber diameter distribution, mass throughput rate, and growth rate of nanofiber stacks in different operating conditions and with different numbers of nozzles (2, 4 and 8), as well as scalability with single and double tube configurations, are discussed. Nanofibers made of SU-8, a photopatternable epoxy, have been collected to a thickness of over 80 μm in 240 s of electrospinning, and a production rate of 0.75 g/h is achieved using the 2-tube, 8-nozzle system, followed by photolithographic micropatterning. TNE is scalable to a large number of nozzles, and offers high-throughput production, plug-and-play capability with standard electrospinning equipment, and little waste of polymer.
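The quoted figures imply simple growth-rate and per-nozzle numbers. The sketch below assumes "2 tube 8 nozzle" means 8 nozzles per tube (16 nozzles total), which the abstract leaves ambiguous:

```python
# Back-of-the-envelope rates from the numbers quoted in the abstract.
thickness_um, time_s = 80.0, 240.0
growth_rate = thickness_um / time_s          # fiber-stack growth, um per second
print(f"stack growth rate: {growth_rate:.2f} um/s")

# Assumption: 2 tubes x 8 nozzles each = 16 nozzles sharing the 0.75 g/h total.
tubes, nozzles_per_tube = 2, 8
total_rate_g_per_h = 0.75
per_nozzle = total_rate_g_per_h / (tubes * nozzles_per_tube)
print(f"per-nozzle mass throughput: {per_nozzle:.3f} g/h")
```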
Simulation and Optimization of an Astrophotonic Reformatter
NASA Astrophysics Data System (ADS)
Anagnos, Th; Harris, R. J.; Corrigan, M. K.; Reeves, A. P.; Townson, M. J.; MacLachlan, D. G.; Thomson, R. R.; Morris, T. J.; Schwab, C.; Quirrenbach, A.
2018-05-01
Image slicing is a powerful technique in astronomy. It allows the instrument designer to reduce the slit width of the spectrograph, increasing spectral resolving power whilst retaining throughput. Conventionally this is done using bulk optics, such as mirrors and prisms; more recently, however, astrophotonic components known as photonic lanterns (PLs) and photonic reformatters have also been used. These devices reformat the multimode (MM) input light from a telescope into single-mode (SM) outputs, which can then be re-arranged to suit the spectrograph. The photonic dicer (PD) is one such device, designed to reduce the dependence of spectrograph size on telescope aperture and eliminate modal noise. We simulate the PD, optimising its throughput and geometrical design using Soapy and BeamProp. The simulated device shows a transmission between 8% and 20%, depending upon the type of adaptive optics (AO) correction applied, matching the experimental results well. We also investigate our idealised model of the PD and show that the barycentre of the slit varies only slightly with time, meaning that the modal noise contribution is very low when compared to conventional fibre systems. We further optimise our model device for both higher throughput and reduced modal noise. This device improves throughput by 6.4% and reduces the movement of the slit output by 50%, further improving stability. This shows the importance of properly simulating such devices, including atmospheric effects. Our work complements recent work in the field and is essential for optimising future photonic reformatters.
Detecting adulterants in milk powder using high-throughput Raman chemical imaging
USDA-ARS's Scientific Manuscript database
This study used a line-scan high-throughput Raman imaging system to authenticate milk powder. A 5 W 785 nm line laser (240 mm long and 1 mm wide) was used as a Raman excitation source. The system was used to acquire hyperspectral Raman images in a wavenumber range of 103–2881 cm-1 from the skim milk...
USDA-ARS's Scientific Manuscript database
Milk is a vulnerable target for economically motivated adulteration. In this study, a line-scan high-throughput Raman imaging system was used to authenticate milk powder. A 5 W 785 nm line laser (240 mm long and 1 mm wide) was used as a Raman excitation source. The system was used to acquire hypersp...
Suzuki, Yasuhiro; Kagawa, Naoko; Fujino, Toru; Sumiya, Tsuyoshi; Andoh, Taichi; Ishikawa, Kumiko; Kimura, Rie; Kemmochi, Kiyokazu; Ohta, Tsutomu; Tanaka, Shigeo
2005-01-01
There is an increasing demand for easy, high-throughput (HTP) methods for protein engineering to support advances in structural biology, bioinformatics and drug design. Here, we describe an N- and C-terminal cloning method utilizing Gateway cloning technology that we have adopted for chimeric and mutant gene production as well as domain shuffling. This method involves only three steps: PCR, in vitro recombination and transformation. All three processes consist of simple handling, mixing and incubation steps. We have characterized this novel HTP method on 96 targets with >90% success. Here, we also discuss an N- and C-terminal cloning method for domain shuffling and a combination of mutation and chimeragenesis with two types of plasmid vectors. PMID:16009811
Integrated crystal mounting and alignment system for high-throughput biological crystallography
Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William F.; Yegian, Derek T.; Earnest, Thomas N.; Jaklevich, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.
2007-09-25
A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.
NASA Astrophysics Data System (ADS)
Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong
2018-01-01
Conventional methods for designing and preparing thin films based on wet processes remain challenging because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin film preparation based on chemical bath deposition (CBD). The method is ideally suited to preparing high-throughput combinatorial material libraries at relatively high temperature from materials with low decomposition temperatures and high water- or oxygen-sensitivity. To validate this system, a Cu(In,Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% of antimony (Sb) was taken as an example to evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films systematically. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can identify the composition with the optimal grain orientation growth, microstructure and electrical properties systematically, by accurately monitoring the doping content and material composition. Based on the characterization results, a Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. Beyond the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin film material systems.
TeraSCREEN: multi-frequency multi-mode Terahertz screening for border checks
NASA Astrophysics Data System (ADS)
Alexander, Naomi E.; Alderman, Byron; Allona, Fernando; Frijlink, Peter; Gonzalo, Ramón; Hägelen, Manfred; Ibáñez, Asier; Krozer, Viktor; Langford, Marian L.; Limiti, Ernesto; Platt, Duncan; Schikora, Marek; Wang, Hui; Weber, Marc Andree
2014-06-01
The challenge for any security screening system is to identify potentially harmful objects such as weapons and explosives concealed under clothing. Classical border and security checkpoints are no longer capable of fulfilling the demands of today's ever growing security requirements, especially with respect to the high throughput generally required which entails a high detection rate of threat material and a low false alarm rate. TeraSCREEN proposes to develop an innovative concept of multi-frequency multi-mode Terahertz and millimeter-wave detection with new automatic detection and classification functionalities. The system developed will demonstrate, at a live control point, the safe automatic detection and classification of objects concealed under clothing, whilst respecting privacy and increasing current throughput rates. This innovative screening system will combine multi-frequency, multi-mode images taken by passive and active subsystems which will scan the subjects and obtain complementary spatial and spectral information, thus allowing for automatic threat recognition. The TeraSCREEN project, which will run from 2013 to 2016, has received funding from the European Union's Seventh Framework Programme under the Security Call. This paper will describe the project objectives and approach.
Poland, Gregory A.; Kennedy, Richard B.; McKinney, Brett A.; Ovsyannikova, Inna G.; Lambert, Nathaniel D.; Jacobson, Robert M.; Oberg, Ann L.
2013-01-01
Vaccines, like drugs and medical procedures, are increasingly amenable to individualization or personalization, often based on novel data resulting from high throughput “omics” technologies. As a result of these technologies, 21st century vaccinology will increasingly see the abandonment of a “one size fits all” approach to vaccine dosing and delivery, as well as the abandonment of the empiric “isolate–inactivate–inject” paradigm for vaccine development. In this review, we discuss the immune response network theory and its application to the new field of vaccinomics and adversomics, and illustrate how vaccinomics can lead to new vaccine candidates, new understandings of how vaccines stimulate immune responses, new biomarkers for vaccine response, and facilitate the understanding of what genetic and other factors might be responsible for rare side effects due to vaccines. Perhaps most exciting will be the ability, at a systems biology level, to integrate increasingly complex high throughput data into descriptive and predictive equations for immune responses to vaccines. Herein, we discuss the above with a view toward the future of vaccinology. PMID:23755893
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.
2002-02-01
Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8×6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coatings performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.
High-Throughput Density Measurement Using Magnetic Levitation.
Ge, Shencheng; Wang, Yunzhe; Deshler, Nicolas J; Preston, Daniel J; Whitesides, George M
2018-06-20
This work describes the development of an integrated analytical system that enables high-throughput density measurements of diamagnetic particles (including cells) using magnetic levitation (MagLev), 96-well plates, and a flatbed scanner. MagLev is a simple and useful technique with which to carry out density-based analysis and separation of a broad range of diamagnetic materials with different physical forms (e.g., liquids, solids, gels, pastes, gums, etc.); one major limitation, however, has been its limited capacity for high-throughput density measurements. This work addresses this limitation by (i) re-engineering the shape of the magnetic fields so that the MagLev system is compatible with 96-well plates, and (ii) integrating a flatbed scanner (and simple optical components) to image the samples that levitate in the system. The resulting system is compatible with both biological samples (human erythrocytes) and nonbiological samples (simple liquids and solids, such as 3-chlorotoluene, cholesterol crystals, glass beads, copper powder, and polymer beads). The high-throughput capacity of this integrated MagLev system will enable new applications in chemistry (e.g., analysis and separation of materials) and biochemistry (e.g., cellular responses under environmental stresses) in a simple and label-free format on the basis of a universal property of all matter: density.
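The abstract does not give the density calculation itself, but MagLev readouts are commonly calibrated by levitating beads of known density and fitting a straight line of density against levitation height; an unknown sample's density is then read off the line. A minimal sketch with invented calibration values:

```python
# Levitation height in a MagLev device varies, to first order, linearly with
# sample density, so beads of known density can calibrate the readout.
# All bead densities and heights below are made up for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

heights_mm = [2.0, 5.0, 8.0, 11.0]      # calibration-bead levitation heights
densities = [1.09, 1.06, 1.03, 1.00]    # g/mL, from the bead supplier
a, b = fit_line(heights_mm, densities)

sample_height = 6.5                      # measured from the scanner image
print(f"sample density ~ {a * sample_height + b:.3f} g/mL")
```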
NASA Technical Reports Server (NTRS)
Hansen, R. G.
1983-01-01
Various cryogenic techniques were used to evaluate state-of-the-art electro-optic devices. As research, development, and production demands require more sensitive testing techniques, faster test results, and higher production throughput, the emphasis on supporting cryogenic systems increases. The three traditional methods currently utilized in electro-optic device testing are discussed: (1) liquid containment dewars; (2) liquid transfer systems; and (3) closed-cycle refrigeration systems. Advantages, disadvantages, and the current state of the art of each of these cryogenic techniques are discussed.
Development of the beam extraction synchronization system at the Fermilab Booster
Seiya, K.; Chaurize, S.; Drennan, C. C.; ...
2015-07-28
The new beam extraction synchronization control system called "Magnetic Cogging" was developed at the Fermilab Booster; it replaces a system called "RF Cogging" as part of the Proton Improvement Plan (PIP) [1]. The flux throughput goal for the PIP is 2.2×10¹⁷ protons per hour, which is double the present flux. The flux increase will be accomplished by doubling the number of beam cycles, which, in turn, will double the beam loss in the Booster accelerator if nothing else is done.
Automated crystallographic system for high-throughput protein structure determination.
Brunzelle, Joseph S; Shafaee, Padram; Yang, Xiaojing; Weigand, Steve; Ren, Zhong; Anderson, Wayne F
2003-07-01
High-throughput structural genomic efforts require software that is highly automated, distributive and requires minimal user intervention to determine protein structures. Preliminary experiments were set up to test whether automated scripts could utilize a minimum set of input parameters and produce a set of initial protein coordinates. From this starting point, a highly distributive system was developed that could determine macromolecular structures at a high throughput rate, warehouse and harvest the associated data. The system uses a web interface to obtain input data and display results. It utilizes a relational database to store the initial data needed to start the structure-determination process as well as generated data. A distributive program interface administers the crystallographic programs which determine protein structures. Using a test set of 19 protein targets, 79% were determined automatically.
Handheld Fluorescence Microscopy based Flow Analyzer.
Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva
2016-03-01
Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive and time-consuming procedure. This article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence-microscopy-based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating the high-throughput character of our flow analyzer in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
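The deblurring step can be illustrated with a toy model: if the blur kernel is a known k-tap moving average (a crude stand-in for horizontal motion blur) and the signal is noise-free, the original samples can be recovered exactly. Real image pipelines use regularized 2-D deconvolution instead; this sketch only shows the principle:

```python
def motion_blur(signal, k):
    """Horizontal motion blur: each output sample averages k input samples."""
    return [sum(signal[i:i + k]) / k for i in range(len(signal) - k + 1)]

def deblur(blurred, k, head):
    """Invert the k-tap moving average, given the first k-1 true samples (head).
    Each blurred value b satisfies k*b = x[i] + ... + x[i+k-1], so the next
    unknown sample follows from the previous k-1 recovered ones."""
    x = list(head)
    for b in blurred:
        x.append(k * b - sum(x[-(k - 1):]))
    return x

truth = [0, 0, 3, 7, 4, 1, 0, 2, 6, 5, 1, 0]
k = 3
blurred = motion_blur(truth, k)
recovered = deblur(blurred, k, truth[:k - 1])
print(recovered)  # matches `truth` up to floating-point rounding
```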
Wang, Jiguang; Sun, Yidan; Zheng, Si; Zhang, Xiang-Sun; Zhou, Huarong; Chen, Luonan
2013-01-01
Synergistic interactions among transcription factors (TFs) and their cofactors collectively determine gene expression in complex biological systems. In this work, we develop a novel graphical model, called the Active Protein-Gene (APG) network model, to quantify regulatory signals of transcription in complex biomolecular networks by integrating both TF upstream-regulation and downstream-regulation high-throughput data. First, we theoretically and computationally demonstrate the effectiveness of APG by comparing it with the traditional strategy based only on TF downstream-regulation information. We then apply this model to study spontaneous type 2 diabetic Goto-Kakizaki (GK) and Wistar control rats. Our biological experiments validate the theoretical results. In particular, SP1 is found to be a hidden TF with changed regulatory activity, and the loss of SP1 activity contributes to the increased glucose production during diabetes development. The APG model provides a theoretical basis for quantitatively elucidating transcriptional regulation by modelling TF combinatorial interactions and exploiting multilevel high-throughput information.
Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri
2017-01-15
As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast amounts of heavy chain variable region sequences and exploring the resulting data. It can take raw FASTA/FASTQ data as input, identify genes, determine clones, and construct lineages, as well as provide information such as selection pressure and mutation analysis. It uses an industry-leading database, MySQL, to provide fast analysis and avoid the complexities of using error-prone flat files. ImmuneDB is freely available at http://immunedb.com. A demo of the ImmuneDB web interface is available at http://immunedb.com/demo. Contact: Uh25@drexel.edu. Supplementary data are available at Bioinformatics online.
CRISPR-enabled tools for engineering microbial genomes and phenotypes.
Tarasava, Katia; Oh, Eun Joong; Eckert, Carrie A; Gill, Ryan T
2018-06-19
In recent years CRISPR-Cas technologies have revolutionized microbial engineering approaches. Genome editing and non-editing applications of various CRISPR-Cas systems have expanded the throughput and scale of engineering efforts, as well as opened up new avenues for manipulating genomes of non-model organisms. As we expand the range of organisms used for biotechnological applications, we need to develop better, more versatile tools for manipulation of these systems. Here we summarize the current advances in microbial gene editing using CRISPR-Cas based tools, and highlight state-of-the-art methods for high-throughput, efficient genome-scale engineering in model organisms Escherichia coli and Saccharomyces cerevisiae. We also review non-editing CRISPR-Cas applications available for gene expression manipulation, epigenetic remodeling, RNA editing, labeling and synthetic gene circuit design. Finally, we point out the areas of research that need further development in order to expand the range of applications and increase the utility of these new methods.
Ghose, Sanchayita; Nagrath, Deepak; Hubbard, Brian; Brooks, Clayton; Cramer, Steven M
2004-01-01
The effect of an alternate strategy employing two different flowrates during loading was explored as a means of increasing system productivity in Protein-A chromatography. The effect of such a loading strategy was evaluated using a chromatographic model that was able to accurately predict experimental breakthrough curves for this Protein-A system. A gradient-based optimization routine was carried out to establish the optimal loading conditions (initial and final flowrates and switching time). The two-step loading strategy (using a higher flowrate during the initial stages followed by a lower flowrate) was evaluated for an Fc-fusion protein and was found to result in significant improvements in process throughput. In an extension of this optimization routine, dynamic loading capacity and productivity were simultaneously optimized using a weighted objective function, and this result was compared to that obtained with a single flowrate. Again, the dual-flowrate strategy was found to be superior.
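The weighted capacity/productivity trade-off can be sketched with a toy model in which dynamic binding capacity (DBC) falls with flowrate. The DBC form, flowrate ranges, and 50/50 weighting below are invented for illustration, not the paper's fitted chromatographic model:

```python
def dbc(u):
    """Illustrative DBC (mg per mL resin) that falls with flowrate u (CV/h)."""
    return 40.0 / (1.0 + 0.02 * u)

def metrics(u1, u2, f):
    """Capacity and productivity proxies for loading fraction f of the bound
    mass at the fast flowrate u1, then the rest at the slow flowrate u2."""
    mass = f * dbc(u1) + (1 - f) * dbc(u2)
    load_time = f * dbc(u1) / u1 + (1 - f) * dbc(u2) / u2
    return mass, mass / load_time

def objective(u1, u2, f, w=0.5):
    """Weighted sum of capacity and productivity, echoing the abstract's
    simultaneous optimization (the weights here are arbitrary)."""
    cap, prod = metrics(u1, u2, f)
    return w * cap + (1 - w) * prod

grid = [(u1, u2, f)
        for u1 in range(20, 101, 10)       # candidate fast initial flowrates
        for u2 in range(5, 51, 5)          # candidate slow final flowrates
        for f in (0.4, 0.5, 0.6, 0.7, 0.8)]
best = max(grid, key=lambda p: objective(*p))
print("best (u1, u2, fast-phase fraction):", best)
```

The study used a gradient-based optimizer on a validated breakthrough model; the exhaustive grid search here is only a stand-in that keeps the example self-contained.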
Impact of automation on mass spectrometry.
Zhang, Yan Victoria; Rockwood, Alan
2015-10-23
Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity and multiplex capability, and it has the ability to measure analytes in various matrices. Despite these advantages, LC-MS/MS remains high-cost and labor-intensive and has limited throughput. This specialized technology requires highly trained personnel and therefore has largely been limited to large institutions, academic organizations and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on essential features for LC-MS/MS total automation and proposes a step-wise and incremental approach to achieve total automation through reducing human intervention, increasing throughput and eventually integrating the LC-MS/MS system into automated clinical laboratory operations.
Schwanke, Christoph; Stein, Helge Sören; Xi, Lifei; Sliozberg, Kirill; Schuhmann, Wolfgang; Ludwig, Alfred; Lange, Kathrin M.
2017-01-01
High-throughput characterization by soft X-ray absorption spectroscopy (XAS) and electrochemical characterization is used to establish a correlation between electronic structure and catalytic activity of oxygen evolution reaction (OER) catalysts. As a model system, a quasi-ternary materials library of Ni(1-y-z)Fe(y)Cr(z)Ox was synthesized by combinatorial reactive magnetron sputtering and characterized by XAS and by an automated scanning droplet cell. The presence of Cr was found to increase the OER activity in the investigated compositional range. The electronic structure of Ni(II) and Cr(III) remains unchanged over the investigated composition spread. At the Fe L-edge, a linear combination of two spectra was observed. These spectra were assigned to Fe(III) in octahedral (Oh) symmetry and Fe(III) in tetrahedral (Td) symmetry. The ratio of Fe(III) Oh to Fe(III) Td increases with the amount of Cr, and a correlation between the presence of Fe(III) Oh and a high OER activity is found. PMID:28287134
MIPHENO: Data normalization for high throughput metabolic analysis.
High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...
NASA Astrophysics Data System (ADS)
Valasek, John; Henrickson, James V.; Bowden, Ezekiel; Shi, Yeyin; Morgan, Cristine L. S.; Neely, Haly L.
2016-05-01
As small unmanned aircraft systems become increasingly affordable, reliable, and formally recognized under federal regulation, they become increasingly attractive as novel platforms for civil applications. This paper details the development and demonstration of fixed-wing unmanned aircraft systems for precision agriculture tasks such as soil moisture content measurement and high-throughput phenotyping. Rationale for sensor, vehicle, and ground equipment selections is provided, along with flight operation procedures developed for minimal crew numbers. Preliminary imagery results are presented and analyzed; these results demonstrate that fixed-wing unmanned aircraft systems modified to carry non-traditional sensors over extended endurance durations can provide high-quality data usable for serious scientific analysis.
Huang, Kailong; Zhang, Xu-Xiang; Shi, Peng; Wu, Bing; Ren, Hongqiang
2014-11-01
In order to comprehensively investigate bacterial virulence in drinking water, 454 pyrosequencing and Illumina high-throughput sequencing were used to detect potential pathogenic bacteria and virulence factors (VFs) in a full-scale drinking water treatment and distribution system. 16S rRNA gene pyrosequencing revealed high bacterial diversity in the drinking water (441-586 operational taxonomic units). Bacterial diversity decreased after chlorine disinfection, but increased after pipeline distribution. α-Proteobacteria was the most dominant taxonomic class. Alignment against the established pathogen database showed that several types of putative pathogens were present in the drinking water, and that Pseudomonas aeruginosa had the highest abundance (over 11‰ of total sequencing reads). Many pathogens disappeared after chlorine disinfection, but P. aeruginosa and Leptospira interrogans were still detected in the tap water. High-throughput sequencing revealed the prevalence of various pathogenicity islands and virulence proteins in the drinking water; translocases, transposons, Clp proteases and flagellar motor switch proteins were the predominant VFs. Both diversity and abundance of the detectable VFs increased after chlorination and decreased after pipeline distribution. This study indicates that joint use of 454 pyrosequencing and Illumina sequencing can comprehensively characterize environmental pathogenesis, and that several types of putative pathogens and various VFs are prevalent in drinking water.
Neto, A I; Correia, C R; Oliveira, M B; Rial-Hermida, M I; Alvarez-Lorenzo, C; Reis, R L; Mano, J F
2015-04-01
We propose a novel hanging spherical drop system for anchoring arrays of droplets of cell suspension based on the use of biomimetic superhydrophobic flat substrates, with controlled positional adhesion and minimum contact with a solid substrate. By facing down the platform, it was possible to generate independent spheroid bodies in a high throughput manner, in order to mimic in vivo tumour models on the lab-on-chip scale. To validate this system for drug screening purposes, the toxicity of the anti-cancer drug doxorubicin in cell spheroids was tested and compared to cells in 2D culture. The advantages presented by this platform, such as feasibility of the system and the ability to control the size uniformity of the spheroid, emphasize its potential to be used as a new low cost toolbox for high-throughput drug screening and in cell or tissue engineering.
Durable silver thin film coating for diffraction gratings
Wolfe, Jesse D [Discovery Bay, CA]; Britten, Jerald A [Oakley, CA]; Komashko, Aleksey M [San Diego, CA]
2006-05-30
A durable silver thin-film-coated non-planar optical element has been developed to replace gold as a material for fabricating such devices. The coating and resultant optical element have increased efficiency, are resistant to tarnishing, can be easily stripped and re-deposited without modifying the underlying grating structure, improve the throughput and power loading of short-pulse compressor designs for ultra-fast laser systems, and can be utilized in a variety of optical and spectrophotometric systems, particularly high-end spectrometers that require maximized efficiency.
Test and Evaluation of WiMAX Performance Using Open-Source Modeling and Simulation Software Tools
2010-12-01
specific needs. For instance, one may seek to maximize the system throughput while maximizing the number of transmitted data packets with hard...seeking to maximize the throughput of the system (Yu 2008; Pishdad and Rabiee 2008; Piro et al. 2010; Wongthavarawat and Ganz 2003; Mohammadi, Akl, and...testing environment provides tools to allow for setting up and running test environments over multiple systems (buildbot) and provides classes to
Zebrafish Development: High-throughput Test Systems to Assess Developmental Toxicity
Abstract Because of its developmental concordance, ease of handling and rapid development, the small teleost, zebrafish (Danio rerio), is frequently promoted as a vertebrate model for medium-throughput developmental screens. This present chapter discusses zebrafish as an altern...
A transmission imaging spectrograph and microfabricated channel system for DNA analysis.
Simpson, J W; Ruiz-Martinez, M C; Mulhern, G T; Berka, J; Latimer, D R; Ball, J A; Rothberg, J M; Went, G T
2000-01-01
In this paper we present the development of a DNA analysis system using a microfabricated channel device and a novel transmission imaging spectrograph which can be efficiently incorporated into a high throughput genomics facility for both sizing and sequencing of DNA fragments. The device contains 48 channels etched on a glass substrate. The channels are sealed with a flat glass plate which also provides a series of apertures for sample loading and contact with buffer reservoirs. Samples can be easily loaded in volumes up to 640 nL without band broadening because of an efficient electrokinetic stacking at the electrophoresis channel entrance. The system uses a dual laser excitation source and a highly sensitive charge-coupled device (CCD) detector allowing for simultaneous detection of many fluorescent dyes. The sieving matrices for the separation of single-stranded DNA fragments are polymerized in situ in denaturing buffer systems. Examples of separation of single-stranded DNA fragments up to 500 bases in length are shown, including accurate sizing of GeneCalling fragments, and sequencing samples prepared with a reduced amount of dye terminators. An increase in sample throughput has been achieved by color multiplexing.
An image analysis toolbox for high-throughput C. elegans assays
Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.
2012-01-01
We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
Advanced Integrated Display System V/STOL Program Performance Specification. Volume I.
1980-06-01
sensor inputs required before the sensor can be designated acceptable. The reactivation count of each sensor parameter which satisfies its veri...129; 3.5.2 AIDS Configuration Parameters, 133; 3.5.3 AIDS Throughput Requirements, 133; 4 QUALITY ASSURANCE...lists the adaptation parameters of the AIDS software; these parameters include the throughput and memory requirements of the software. 3.2 SYSTEM
Lossless compression algorithm for REBL direct-write e-beam lithography system
NASA Astrophysics Data System (ADS)
Cramer, George; Liu, Hsin-I.; Zakhor, Avideh
2010-03-01
Future lithography systems must produce microchips with smaller feature sizes, while maintaining throughputs comparable to those of today's optical lithography systems. This places stringent constraints on the effective data throughput of any maskless lithography system. In recent years, we have developed a datapath architecture for direct-write lithography systems, and have shown that compression plays a key role in reducing throughput requirements of such systems. Our approach integrates a low complexity hardware-based decoder with the writers, in order to decompress a compressed data layer in real time on the fly. In doing so, we have developed a spectrum of lossless compression algorithms for integrated circuit layout data to provide a tradeoff between compression efficiency and hardware complexity, the latest of which is Block Golomb Context Copy Coding (Block GC3). In this paper, we present a modified version of Block GC3 called Block RGC3, specifically tailored to the REBL direct-write E-beam lithography system. Two characteristic features of the REBL system are a rotary stage resulting in arbitrarily-rotated layout imagery, and E-beam corrections prior to writing the data, both of which present significant challenges to lossless compression algorithms. Together, these effects reduce the effectiveness of both the copy and predict compression methods within Block GC3. Similar to Block GC3, our newly proposed technique, Block RGC3, divides the image into a grid of two-dimensional "blocks" of pixels, each of which copies from a specified location in a history buffer of recently-decoded pixels. However, in Block RGC3 the number of possible copy locations is significantly increased, so as to allow repetition to be discovered along any angle of orientation, rather than only horizontally or vertically. Also, by copying smaller groups of pixels at a time, repetition in layout patterns is easier to find and take advantage of.
As a side effect, this increases the total number of copy locations to transmit; this is combated with an extra region-growing step, which enforces spatial coherence among neighboring copy locations, thereby improving compression efficiency. We characterize the performance of Block RGC3 in terms of compression efficiency and encoding complexity on a number of rotated Metal 1, Poly, and Via layouts at various angles, and show that Block RGC3 provides higher compression efficiency than existing lossless compression algorithms, including JPEG-LS, ZIP, BZIP2, and Block GC3.
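The copy mechanism described above can be sketched with a toy block-copy coder. This is illustrative only: it is not Block GC3 or Block RGC3, and it omits Golomb coding, context modeling, and the region-growing step. Each block either copies verbatim from an already-decoded offset (dy, dx), where a nonzero dx lets diagonal repetition be exploited, or is transmitted literally.

```python
# Toy block-copy coder in the spirit of the technique above (not the authors'
# algorithm): the image is tiled into fixed-size blocks; each block either
# copies from an already-decoded offset or is sent as a literal.

BH = BW = 4  # block size in pixels

def encode(img):
    h, w = len(img), len(img[0])
    tokens = []
    for y in range(0, h, BH):
        for x in range(0, w, BW):
            block = [row[x:x+BW] for row in img[y:y+BH]]
            found = None
            for dy in (-BH, -2*BH):              # source rows fully decoded
                for dx in range(-2*BW, 2*BW + 1):
                    sy, sx = y + dy, x + dx
                    if sy < 0 or sx < 0 or sx + BW > w:
                        continue
                    src = [row[sx:sx+BW] for row in img[sy:sy+BH]]
                    if src == block:
                        found = (dy, dx)
                        break
                if found:
                    break
            tokens.append(("copy",) + found if found else ("lit", block))
    return tokens

def decode(tokens, h, w):
    out = [[0] * w for _ in range(h)]
    it = iter(tokens)
    for y in range(0, h, BH):
        for x in range(0, w, BW):
            tok = next(it)
            for r in range(BH):
                if tok[0] == "copy":
                    _, dy, dx = tok
                    out[y+r][x:x+BW] = out[y+dy+r][x+dx:x+dx+BW]
                else:
                    out[y+r][x:x+BW] = list(tok[1][r])
    return out

img = [[(x + y) % 7 for x in range(32)] for y in range(32)]  # diagonal texture
tokens = encode(img)
recon = decode(tokens, 32, 32)
lit_blocks = sum(1 for t in tokens if t[0] == "lit")
print(lit_blocks, "literal blocks of", len(tokens))
```

On the diagonally periodic test image, every block below the first strip finds an exact copy along the diagonal, so the roundtrip is lossless while most blocks are encoded as compact copy vectors.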
NASA Astrophysics Data System (ADS)
Zhang, Xuanni; Zhang, Chunmin
2013-01-01
A polarization interference imaging spectrometer based on a Savart polariscope is presented. Its optical throughput was analyzed by Jones calculus. The throughput expression was derived and shows clearly that the optical throughput depends mainly on the intensity of the incident light, the transmissivity, the refractive index, and the layout of the optical system. Simulation and analysis identified the optimum layout with respect to both optical throughput and interference fringe visibility, and verified that the layout of our former design was optimal. The simulation also showed that a small deviation from the optimum layout has little influence on fringe visibility, whereas the same deviation from other layouts degrades visibility severely; small deviations are therefore admissible at the optimum, which eases manufacturing tolerances. These results pave the way for further research and engineering design.
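As a generic illustration of analyzing optical throughput with Jones calculus (a textbook two-polarizer example, not the paper's Savart-polariscope layout), one can propagate a Jones vector through ideal linear polarizers and recover Malus's law, I = I0 cos²θ:

```python
# Minimal Jones-calculus throughput check (generic textbook example): the
# transmitted intensity through two linear polarizers at relative angle
# theta follows Malus's law, I = I0 * cos(theta)**2.
import math

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer at angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c * c, c * s], [c * s, s * s]]

def apply(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def intensity(v):
    return abs(v[0]) ** 2 + abs(v[1]) ** 2

theta = math.radians(30)
e_in = [1.0, 0.0]  # unit-intensity, x-polarized input field
e_out = apply(polarizer(theta), apply(polarizer(0.0), e_in))
throughput = intensity(e_out) / intensity(e_in)
print(throughput)  # cos^2(30 deg) = 0.75
```

In a full analysis of a layout like the paper's, the Jones matrices of the remaining elements (retarders, Savart plates, analyzer) would be multiplied in the same way before taking the output intensity.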
TCP Throughput Profiles Using Measurements over Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata
Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincaré map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
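The qualitative claim in (i) can be reproduced with a hedged sketch of the ramp-up/sustainment abstraction (our own toy parameterization, not the paper's fitted model): if slow-start takes one RTT per window doubling and the flow then sustains line rate, the average throughput over a fixed transfer window is a slowly decreasing, concave function of RTT.

```python
# Toy ramp-up + sustainment throughput profile (our parameterization, not the
# paper's fitted model): slow-start doubles the window once per RTT until the
# bandwidth-delay product is reached, after which the flow sustains rate S.
import math

S = 10e9         # sustained line rate, bits/s (10 Gbps connection)
MSS = 1500 * 8   # initial window of one 1500-byte segment, in bits
T = 60.0         # fixed transfer window, seconds (assumed)

def avg_throughput(rtt):
    bdp = S * rtt                        # bits in flight at line rate
    t_ramp = rtt * math.log2(bdp / MSS)  # one window doubling per RTT
    return S * (T - t_ramp) / T          # ramp time is lost out of T

rtts = [i / 1000 for i in range(10, 370, 10)]    # 10 .. 360 ms
prof = [avg_throughput(r) for r in rtts]
drops = [a - b for a, b in zip(prof, prof[1:])]  # per-step decreases
print(round(prof[0] / 1e9, 2), round(prof[-1] / 1e9, 2))  # Gbps endpoints
```

The ramp time rtt * log2(BDP/MSS) is convex in RTT, so the averaged throughput is concave and decreases slowly at first; a larger sustainment window T flattens and widens the concave region, loosely consistent with trend (ii).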
AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.
As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...
Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes
2012-01-01
Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution, and high-throughput image processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
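The boundary-determination step of such a pipeline can be illustrated with a minimal sketch (synthetic profile values; simple linear interpolation stands in for the paper's spline approximation): the boundary is read off where the graded profile crosses half of its maximum.

```python
# Minimal boundary-extraction sketch (synthetic data; linear interpolation
# stands in for the spline approximation): locate where a graded
# antero-posterior expression profile first falls below half-maximum.

def boundary_position(positions, values):
    """Interpolated position of the first crossing below half-maximum,
    or None if the profile never crosses."""
    half = max(values) / 2.0
    pairs = list(zip(positions, values))
    for (x0, v0), (x1, v1) in zip(pairs, pairs[1:]):
        if v0 >= half > v1:
            return x0 + (x1 - x0) * (v0 - half) / (v0 - v1)
    return None

# synthetic anterior-high gradient sampled every 10% egg length
xs = list(range(0, 101, 10))
vals = [100, 98, 95, 90, 80, 60, 35, 15, 6, 2, 1]
print(boundary_position(xs, vals))  # boundary in % egg length
```

For the synthetic gradient above this returns a boundary at 54.0% egg length; repeating the measurement over many embryos and time points yields the sorted boundary sets the text describes.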
Leung, Ka-Ngo; Gough, Richard A.; Ji, Qing; Lee, Yung-Hee Yvette
1999-01-01
A focused ion beam (FIB) system produces a final beam spot size down to 0.1 .mu.m or less and an ion beam output current on the order of microamps. The FIB system increases ion source brightness by properly configuring the first (plasma) and second (extraction) electrodes. The first electrode is configured to have a high aperture diameter to electrode thickness aspect ratio. Additional accelerator and focusing electrodes are used to produce the final beam. As few as five electrodes can be used, providing a very compact FIB system with a length down to only 20 mm. Multibeamlet arrangements with a single ion source can be produced to increase throughput. The FIB system can be used for nanolithography and doping applications for fabrication of semiconductor devices with minimum feature sizes of 0.1 .mu.m or less.
Leung, K.; Gough, R.A.; Ji, Q.; Lee, Y.Y.
1999-08-31
A focused ion beam (FIB) system produces a final beam spot size down to 0.1 µm or less and an ion beam output current on the order of microamps. The FIB system increases ion source brightness by properly configuring the first (plasma) and second (extraction) electrodes. The first electrode is configured to have a high aperture diameter to electrode thickness aspect ratio. Additional accelerator and focusing electrodes are used to produce the final beam. As few as five electrodes can be used, providing a very compact FIB system with a length down to only 20 mm. Multibeamlet arrangements with a single ion source can be produced to increase throughput. The FIB system can be used for nanolithography and doping applications for fabrication of semiconductor devices with minimum feature sizes of 0.1 µm or less. 13 figs.
A direct-to-drive neural data acquisition system.
Kinney, Justin P; Bernstein, Jacob G; Meyer, Andrew J; Barber, Jessica B; Bolivar, Marti; Newbold, Bryan; Scholvin, Jorg; Moore-Kochlacs, Caroline; Wentz, Christian T; Kopell, Nancy J; Boyden, Edward S
2015-01-01
Driven by the increasing channel count of neural probes, there is much effort being directed to creating increasingly scalable electrophysiology data acquisition (DAQ) systems. However, all such systems still rely on personal computers for data storage, and thus are limited by the bandwidth and cost of the computers, especially as the scale of recording increases. Here we present a novel architecture in which a digital processor receives data from an analog-to-digital converter, and writes that data directly to hard drives, without the need for a personal computer to serve as an intermediary in the DAQ process. This minimalist architecture may support exceptionally high data throughput, without incurring costs to support unnecessary hardware and overhead associated with personal computers, thus facilitating scaling of electrophysiological recording in the future.
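The bandwidth argument can be made concrete with back-of-envelope arithmetic (the figures below are illustrative assumptions, not the paper's specification): the sustained acquisition rate grows linearly with channel count and must be matched by aggregate disk write bandwidth.

```python
# Back-of-envelope bandwidth arithmetic (illustrative figures, not the
# paper's specification): acquisition rate = channels x sample rate x
# sample width, to be matched by aggregate disk write bandwidth.

channels = 4096              # recording channels (assumed)
sample_rate = 30_000         # samples/s per channel (typical extracellular)
bytes_per_sample = 2         # 16-bit ADC (assumed)
drive_write = 100_000_000    # bytes/s sustained per hard drive (assumed)

data_rate = channels * sample_rate * bytes_per_sample  # bytes/s
drives_needed = -(-data_rate // drive_write)           # ceiling division

print(data_rate / 1e6, "MB/s ->", drives_needed, "drive(s)")
```

Under these assumptions a 4096-channel probe streams roughly 246 MB/s, which a few commodity drives can absorb when written to directly, whereas routing the same stream through a personal computer adds bandwidth and cost overhead that scales with channel count.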
Xu, Like; Ouyang, Weiying; Qian, Yanyun; Su, Chao; Su, Jianqiang; Chen, Hong
2016-06-01
Antibiotic resistance genes (ARGs) are present in surface water and often cannot be completely eliminated by drinking water treatment plants (DWTPs). Improper elimination of the ARG-harboring microorganisms contaminates the water supply and would lead to animal and human disease. Therefore, it is of utmost importance to determine the most effective ways by which DWTPs can eliminate ARGs. Here, we tested water samples from two DWTPs and distribution systems and detected the presence of 285 ARGs, 8 transposases, and intI-1 by utilizing high-throughput qPCR. The prevalence of ARGs differed in the two DWTPs, one of which employed conventional water treatments while the other had advanced treatment processes. The relative abundance of ARGs increased significantly after the treatment with biological activated carbon (BAC), raising the number of detected ARGs from 76 to 150. Furthermore, the final chlorination step enhanced the relative abundance of ARGs in the finished water generated from both DWTPs. The total enrichment of ARGs varied from 6.4- to 109.2-fold in tap water compared to finished water, among which beta-lactam resistance genes displayed the highest enrichment. Six transposase genes were detected in tap water samples, with the transposase gene TnpA-04 showing the greatest enrichment (up to 124.9-fold). We observed significant positive correlations between ARGs and mobile genetic elements (MGEs) across the distribution systems, indicating that transposases and intI-1 may contribute to antibiotic resistance in drinking water. To our knowledge, this is the first study to investigate the diversity and abundance of ARGs in drinking water treatment systems utilizing high-throughput qPCR techniques in China.
A high-throughput, multi-channel photon-counting detector with picosecond timing
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.
2009-06-01
High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.
Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...
Integrated Device for Circulating Tumor Cell Capture, Characterization and Lens-Free Microscopy
2012-08-01
peripheral blood of breast cancer patients indicates high metastatic potential and increased morbidity. Development of a cost-effective CTC detection and...microfilter platform captures CTC from the cancer patients’ blood cost-effectively, where the larger CTC are preferentially retained on the membrane...development of a cost-effective and high-throughput CTC analysis system would revolutionize the field of CTC detection, prognosis, and therapeutic
Programming adaptive control to evolve increased metabolite production.
Chou, Howard H; Keasling, Jay D
2013-01-01
The complexity inherent in biological systems challenges efforts to rationally engineer novel phenotypes, especially those not amenable to high-throughput screens and selections. In nature, increased mutation rates generate diversity in a population that can lead to the evolution of new phenotypes. Here we construct an adaptive control system that increases the mutation rate in order to generate diversity in the population, and decreases the mutation rate as the concentration of a target metabolite increases. This system is called feedback-regulated evolution of phenotype (FREP), and is implemented with a sensor to gauge the concentration of a metabolite and an actuator to alter the mutation rate. To evolve certain novel traits that have no known natural sensors, we develop a framework to assemble synthetic transcription factors using metabolic enzymes and construct four different sensors that recognize isopentenyl diphosphate in bacteria and yeast. We verify FREP by evolving increased tyrosine and isoprenoid production.
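The FREP feedback loop can be caricatured in a few lines (illustrative dynamics only, not the authors' genetic circuit; the sensor response and the selection step are invented): the mutation rate is high when the target metabolite is scarce and is throttled down as its concentration rises.

```python
# Caricature of feedback-regulated evolution of phenotype (illustrative
# dynamics, not the authors' genetic circuit): the sensor reads the target
# metabolite concentration and the actuator lowers the mutation rate as the
# concentration rises.
import random

MU_MAX = 0.5   # maximal mutation rate (arbitrary units, assumed)
K = 10.0       # sensor half-saturation concentration (assumed)

def mutation_rate(conc):
    return MU_MAX / (1.0 + conc / K)

random.seed(1)
conc = 0.5     # starting metabolite level (assumed)
history = []
for _ in range(200):
    mu = mutation_rate(conc)
    history.append(mu)
    # diversity is generated at rate mu and beneficial variants are
    # enriched, so production drifts upward in proportion to mu
    conc += mu * abs(random.gauss(0.0, 1.0))

print(round(history[0], 3), round(history[-1], 3), round(conc, 1))
```

With the fixed seed, production ratchets upward while the recorded mutation rate falls, reproducing the qualitative sensor-actuator behavior: high mutagenesis while far from the target phenotype, low mutagenesis once it is approached.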
High-throughput and automated SAXS/USAXS experiment for industrial use at BL19B2 in SPring-8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osaka, Keiichi, E-mail: k-osaka@spring8.or.jp; Inoue, Daisuke; Sato, Masugu
A highly automated system combining a sample transfer robot with a focused SR beam has been established for small-angle and ultra-small-angle X-ray scattering (SAXS/USAXS) measurements at BL19B2 for industrial use of SPring-8. A high-throughput data collection system is realized by means of an X-ray beam of high photon flux density, concentrated by a cylindrical mirror, and a two-dimensional pixel detector (PILATUS-2M). For SAXS measurements, this system can obtain high-quality data within 1 minute per exposure. The sample transfer robot has a capacity of 90 samples with a large variety of shapes. The fusion of the high-throughput and robotic systems has enhanced the usability of the SAXS/USAXS capability for industrial applications.
A high throughput spectral image microscopy system
NASA Astrophysics Data System (ADS)
Gesley, M.; Puri, R.
2018-01-01
A high-throughput spectral image microscopy system is configured for rapid detection of rare cells in large populations. To overcome the rate limits of flow cytometry and the need for fluorophore tags, the system architecture integrates sample mechanical handling, signal processors, and optics in a non-confocal version of light absorption and scattering spectroscopic microscopy. Spectral images with native contrast do not require the use of exogenous stains to render cells with submicron resolution. Structure may be characterized without restriction to cell clusters of differentiation.
Screening for endocrine-disrupting chemicals (EDCs) requires sensitive, scalable assays. Current high-throughput screening (HTPS) approaches for estrogenic and androgenic activity yield rapid results, but many are not sensitive to physiological hormone concentrations, suggesting ...
Streamlined approaches that use in vitro experimental data to predict chemical toxicokinetics (TK) are increasingly being used to perform risk-based prioritization based upon dosimetric adjustment of high-throughput screening (HTS) data across thousands of chemicals. However, ass...
Current patents and future development underlying marker-assisted breeding in major grain crops.
Utomo, Herry S; Linscombe, Steve D
2009-01-01
Genomics and molecular markers provide new tools to assemble and mobilize important traits from different genetic backgrounds, including breeding lines and cultivars from different parts of the world and their related wild ancestors, to improve the quality and yield of existing commercial cultivars to meet the increasing challenges of global food demand. The basic techniques of marker-assisted breeding, such as isolating DNA, amplifying DNA of interest using publicly available primers, and visualizing DNA fragments on standard polyacrylamide gels, have been described in the literature and are therefore available to scientists and breeders without restriction. More sophisticated high-throughput systems that include proprietary chemicals and reagents, parts and equipment, software, and methods or processes have been the subject of intensive patenting and trade secrets. The high-throughput systems offer a more efficient way to discover QTLs associated with traits of economic importance. Therefore, an increasing number of patents on highly valued genes and QTLs is expected. This paper discusses and reviews current patents associated with genes and QTLs utilized in marker-assisted breeding in major grain crops. The availability of molecular markers for important agronomic traits, combined with more efficient marker detection systems, will help realize the full benefit of MAS in the breeding effort to reassemble potential genes and recapture critical genes among the breeding lines that were lost during domestication, helping to boost crop production worldwide.
Hard-tip, soft-spring lithography.
Shim, Wooyoung; Braunschweig, Adam B; Liao, Xing; Chai, Jinan; Lim, Jong Kuk; Zheng, Gengfeng; Mirkin, Chad A
2011-01-27
Nanofabrication strategies are becoming increasingly expensive and equipment-intensive, and consequently less accessible to researchers. As an alternative, scanning probe lithography has become a popular means of preparing nanoscale structures, in part owing to its relatively low cost and high resolution, and a registration accuracy that exceeds most existing technologies. However, increasing the throughput of cantilever-based scanning probe systems while maintaining their resolution and registration advantages has from the outset been a significant challenge. Even with impressive recent advances in cantilever array design, such arrays tend to be highly specialized for a given application, expensive, and often difficult to implement. It is therefore difficult to imagine commercially viable production methods based on scanning probe systems that rely on conventional cantilevers. Here we describe a low-cost and scalable cantilever-free tip-based nanopatterning method that uses an array of hard silicon tips mounted onto an elastomeric backing. This method-which we term hard-tip, soft-spring lithography-overcomes the throughput problems of cantilever-based scanning probe systems and the resolution limits imposed by the use of elastomeric stamps and tips: it is capable of delivering materials or energy to a surface to create arbitrary patterns of features with sub-50-nm resolution over centimetre-scale areas. We argue that hard-tip, soft-spring lithography is a versatile nanolithography strategy that should be widely adopted by academic and industrial researchers for rapid prototyping applications.
Improving bed turnover time with a bed management system.
Tortorella, Frank; Ukanowicz, Donna; Douglas-Ntagha, Pamela; Ray, Robert; Triller, Maureen
2013-01-01
Efficient patient throughput requires a high degree of coordination and communication. Opportunities abound to improve the patient experience by eliminating waste from the process and improving communication among the multiple disciplines involved in facilitating patient flow. In this article, we demonstrate how an interdisciplinary team at a large tertiary cancer center implemented an electronic bed management system to improve the bed turnover component of the patient throughput process.
In Vitro Toxicity Screening Technique for Volatile Substances Using Flow-Through System
In 2007 the National Research Council envisioned the need for inexpensive, high-throughput, cell-based toxicity testing methods relevant to human health. High-throughput screening (HTS) in vitro approaches have addressed these problems by using robotics. However the cha...
High-throughput countercurrent microextraction in passive mode.
Xie, Tingliang; Xu, Cong
2018-05-15
Although microextraction is much more efficient than conventional macroextraction, its practical application has been limited by low throughputs and difficulties in constructing robust countercurrent microextraction (CCME) systems. In this work, a robust CCME process was established based on a novel passive microextractor with four units without any moving parts. The passive microextractor has internal recirculation and can efficiently mix two immiscible liquids. The hydraulic characteristics as well as the extraction and back-extraction performance of the passive CCME were investigated experimentally. The recovery efficiencies of the passive CCME were 1.43-1.68 times larger than the best values achieved using cocurrent extraction. Furthermore, the total throughput of the passive CCME developed in this work was about one to three orders of magnitude higher than that of other passive CCME systems reported in the literature. Therefore, a robust CCME process with high throughputs has been successfully constructed, which may promote the application of passive CCME in a wide variety of fields.
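The reported 1.43-1.68x recovery gain over cocurrent extraction is consistent with classical stage-wise extraction theory. As a hedged illustration (idealized equilibrium stages, not the authors' model of the passive microextractor), a cocurrent contactor can reach at most one equilibrium stage, while N countercurrent stages follow the Kremser equation:

```python
def cocurrent_recovery(extraction_factor: float) -> float:
    # A cocurrent contactor reaches a single equilibrium stage, so the
    # recovered fraction is E / (1 + E) regardless of contact length.
    return extraction_factor / (1.0 + extraction_factor)

def countercurrent_recovery(extraction_factor: float, n_stages: int) -> float:
    # Kremser equation for N ideal countercurrent stages.
    e = extraction_factor
    if abs(e - 1.0) < 1e-12:
        return n_stages / (n_stages + 1.0)  # limiting case E -> 1
    return (e ** (n_stages + 1) - e) / (e ** (n_stages + 1) - 1.0)
```

For an illustrative extraction factor E = 1.5 and four stages (matching the four-unit extractor), this idealized sketch predicts roughly a 1.5x recovery gain over cocurrent contacting, within the 1.43-1.68 range reported above.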
NASA Technical Reports Server (NTRS)
Jandebeur, T. S.
1980-01-01
The effect of sample concentration on throughput and resolution in a modified continuous particle electrophoresis (CPE) system with flow in an upward direction is investigated. Maximum resolution is achieved at concentrations ranging from 2 × 10^8 cells/ml to 8 × 10^8 cells/ml. The widest peak separation is at 2 × 10^8 cells/ml; however, the sharpest peaks and least overlap between cell populations occur at 8 × 10^8 cells/ml. Apparently as a result of improved electrophoresis cell performance due to coating the chamber with bovine serum albumin, changing the electrode membranes and rinse, and lowering buffer temperatures, the sedimentation effects associated with higher concentrations are diminished. Throughput, as measured by recovery of fixed cells, is diminished at the concentrations judged most likely to yield satisfactory resolution. The tradeoff appears to be improved recovery/throughput at the expense of resolution.
Turetschek, Reinhard; Lyon, David; Desalegn, Getinet; Kaul, Hans-Peter; Wienkoop, Stefanie
2016-01-01
The proteomic study of non-model organisms, such as many crop plants, is challenging due to the lack of comprehensive genome information. Changing environmental conditions require the study and selection of adapted cultivars. Mutations inherent to cultivars hamper protein identification and thus considerably complicate qualitative and quantitative comparison in large-scale systems biology approaches. With this workflow, cultivar-specific mutations are detected from high-throughput comparative MS analyses by extracting sequence polymorphisms with de novo sequencing. Stringent criteria are suggested to filter for confident mutations. Subsequently, these polymorphisms complement the initially used database, which is then ready to use with any preferred database search algorithm. In our example, we thereby identified 26 specific mutations in two cultivars of Pisum sativum and achieved an increased number (17%) of peptide spectrum matches.
Process Control for Precipitation Prevention in Space Water Recovery Systems
NASA Technical Reports Server (NTRS)
Sargusingh, Miriam; Callahan, Michael R.; Muirhead, Dean
2015-01-01
The ability to recover and purify water through physiochemical processes is crucial for realizing long-term human space missions, including both planetary habitation and space travel. Because of their robust nature, rotary distillation systems have been actively pursued by NASA as one of the technologies for water recovery from wastewater primarily comprised of human urine. A specific area of interest is the prevention of the formation of solids that could clog fluid lines and damage rotating equipment. To mitigate the formation of solids, operational constraints are in place that keep the concentration of key precipitating ions in the wastewater brine below the theoretical precipitation threshold. This control is effected by limiting the amount of water recovered such that the risk of reaching the precipitation threshold is within acceptable limits. The water recovery limit is based on an empirically derived worst-case wastewater composition. During the batch process, water recovery is estimated by monitoring the throughput of the system. NASA Johnson Space Center is working on means of enhancing the process controls to increase water recovery. Options include more precise prediction of the precipitation threshold; to this end, JSC is developing a means of more accurately measuring the constituents of the brine and/or wastewater. Another option is to monitor the throughput of the system more accurately. In spring 2015, testing will be performed to evaluate strategies for optimizing water recovery without increasing the risk of solids formation in the brine.
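The batch mass balance behind such a recovery limit can be sketched as follows (an illustrative simplification of my own, not NASA's actual controller): removing a fraction R of the water concentrates a fully dissolved ion by 1/(1 - R), so the maximum safe recovery follows directly from the initial ion concentration and the precipitation threshold.

```python
def concentration_factor(recovery: float) -> float:
    # Removing a fraction `recovery` of the water from a batch of brine
    # concentrates a non-precipitating dissolved ion by 1 / (1 - recovery).
    if not 0.0 <= recovery < 1.0:
        raise ValueError("recovery must be in [0, 1)")
    return 1.0 / (1.0 - recovery)

def max_safe_recovery(c0: float, c_threshold: float, margin: float = 0.9) -> float:
    # Highest water recovery that keeps the ion below `margin` times the
    # precipitation threshold, given initial concentration c0 (same units).
    # The 0.9 safety margin is illustrative, not a NASA requirement.
    return max(0.0, 1.0 - c0 / (margin * c_threshold))
```

For example, an ion starting at one tenth of its threshold would permit roughly 89% water recovery under this sketch before the 90%-of-threshold margin is reached.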
High throughput single cell counting in droplet-based microfluidics.
Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie
2017-05-02
Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count large numbers of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at high throughput, and was used to characterize cell encapsulation and cell viability during incubation in droplets.
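Counting accuracy matters here because droplet occupancy is conventionally modeled as Poisson-distributed (a standard assumption in the droplet-microfluidics field, not a detail stated in this abstract): the measured cell concentration sets the mean occupancy per droplet, which in turn fixes the fraction of usable single-cell droplets.

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    # Probability that a droplet contains exactly k cells when the mean
    # cell occupancy per droplet is lam (Poisson loading model).
    return lam ** k * math.exp(-lam) / math.factorial(k)

def single_cell_fraction(lam: float) -> float:
    # Fraction of *occupied* droplets that hold exactly one cell.
    return poisson_pmf(1, lam) / (1.0 - poisson_pmf(0, lam))
```

At a typical dilute loading of lam = 0.1, over 95% of occupied droplets contain a single cell under this model, which is why an accurate upstream cell count directly controls encapsulation quality.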
Li, Fumin; Wang, Jun; Jenkins, Rand
2016-05-01
There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.
Evaluation of the National Throughput Benefits of the Civil Tilt Rotor
NASA Technical Reports Server (NTRS)
Johnson, Jesse; Stouffer, Virginia; Long, Dou; Gribko, Joana; Yackovetsky, Robert (Technical Monitor)
2001-01-01
The air transportation system is a key part of the U.S. and global economic infrastructure. In recent years, this system, by any measure of usage - operations, enplanements, or revenue passenger miles (RPMs) - has grown rapidly. The rapid growth in demand has not been matched, however, by commensurate increases in the ability of airports and the airspace system to handle the additional traffic. As a result, the air transportation system is approaching capacity and airlines will face excessive delays or significant constraints on service unless capacity is expanded. To expand capacity, the air traffic management system must be improved. To improve the air traffic management system, the National Aeronautics and Space Administration (NASA) Aerospace Technology Enterprise developed the strategic goal of tripling air traffic throughput over the next 10 years, in all weather conditions, while at least maintaining current safety standards. As the first step in meeting that goal, the NASA Intercenter Systems Analysis Team (ISAT) is evaluating the contribution of existing programs to meet that goal. A major part of the study is an examination of the ability of the National Airspace System (NAS) to meet the predicted growth in travel demand and the potential benefits of technology infusion to expand NAS capacity. We previously analyzed the effects of the addition of two technology elements - Terminal Area Productivity (TAP) and Advanced Air Transportation Technologies (AATT). The next program we must analyze is not specific to airspace or aircraft technology. The program incorporates a fundamentally different vehicle to improve throughput: the civil tilt rotor (CTR). The CTR has the unique operating characteristic of being able to take off and land like a rotorcraft (vertical take off and landing, or VTOL, capability) but cruises like a traditional fixed-wing aircraft.
The CTR also can operate in a short take off and landing (STOL) mode; generally, with a greater payload capacity (i.e., more passengers) than when operating in the VTOL mode. CTR could expand access to major airports without interfering with fixed-wing aircraft operating on congested runways and it could add service to new markets without the infrastructure support needed for fixed-wing aircraft. During FY 1999, we preliminarily assessed the feasibility of operating CTRs at two major U.S. airports as part of the annual review of NASA aerospace goals by the ISAT. This current study expands the analysis and concepts of that study to the complete NAS to quantify the national throughput effects of the CTR.
High-Throughput Light Sheet Microscopy for the Automated Live Imaging of Larval Zebrafish
NASA Astrophysics Data System (ADS)
Baker, Ryan; Logan, Savannah; Dudley, Christopher; Parthasarathy, Raghuveer
The zebrafish is a model organism with a variety of useful properties; it is small and optically transparent, it reproduces quickly, it is a vertebrate, and there are a large variety of transgenic animals available. Because of these properties, the zebrafish is well suited to study using a variety of optical technologies including light sheet fluorescence microscopy (LSFM), which provides high-resolution three-dimensional imaging over large fields of view. Research progress, however, is often not limited by optical techniques but instead by the number of samples one can examine over the course of an experiment, which in the case of light sheet imaging has so far been severely limited. Here we present an integrated fluidic circuit and microscope which provides rapid, automated imaging of zebrafish using several imaging modes, including LSFM, Hyperspectral Imaging, and Differential Interference Contrast Microscopy. Using this system, we show that we can increase our imaging throughput by a factor of 10 compared to previous techniques. We also show preliminary results visualizing zebrafish immune response, which is sensitive to gut microbiota composition, and which shows a strong variability between individuals that highlights the utility of high throughput imaging. National Science Foundation, Award No. DBI-1427957.
Henchoz, Yveline; Guillarme, Davy; Martel, Sophie; Rudaz, Serge; Veuthey, Jean-Luc; Carrupt, Pierre-Alain
2009-08-01
Ultra-high-pressure liquid chromatography (UHPLC) systems able to work with columns packed with sub-2-μm particles offer very fast methods to determine the lipophilicity of new chemical entities. The careful development of the most suitable experimental conditions presented here will help medicinal chemists perform high-throughput screening (HTS) log P(oct) measurements. The approach was optimized using a well-balanced set of 38 model compounds and a series of 28 basic compounds such as beta-blockers, local anesthetics, piperazines, clonidine, and derivatives. Different organic modifiers and hybrid stationary phases packed with 1.7-μm particles were evaluated in isocratic as well as gradient modes, and the advantages and limitations of the tested conditions are pointed out. The UHPLC approach offered a significant enhancement over classical HPLC methods, improving lipophilicity determination throughput by a factor of 50. The hyphenation of UHPLC with MS detection allowed a further increase in throughput. The data and results reported herein show that the UHPLC-MS method represents progress in the HTS measurement of lipophilicity, owing to its speed (at least a factor of 500 with respect to HPLC approaches) and its extended field of application.
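Chromatographic lipophilicity measurements of this kind are commonly reduced to a descriptor by fitting the linear solvent-strength model log k = log k_w - S·φ over several organic-modifier fractions φ and extrapolating to pure water (φ = 0). This is a generic sketch of that standard extrapolation, not necessarily the authors' exact protocol:

```python
import numpy as np

def log_kw(phi: np.ndarray, log_k: np.ndarray) -> float:
    # Fit the linear solvent-strength model log k = log_kw - S * phi
    # to isocratic retention factors measured at several organic
    # fractions phi, and return the intercept log_kw (retention
    # extrapolated to pure water), the usual lipophilicity descriptor.
    slope, intercept = np.polyfit(phi, log_k, 1)
    return float(intercept)
```

The resulting log k_w values are then typically calibrated against known log P(oct) standards to yield lipophilicity estimates for new compounds.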
Boosalis, Michael S.; Sangerman, Jose I.; White, Gary L.; Wolf, Roman F.; Shen, Ling; Dai, Yan; White, Emily; Makala, Levi H.; Li, Biaoru; Pace, Betty S.; Nouraie, Mehdi; Faller, Douglas V.; Perrine, Susan P.
2015-01-01
High-level fetal (γ) globin expression ameliorates clinical severity of the beta (β) hemoglobinopathies, and safe, orally-bioavailable γ-globin inducing agents would benefit many patients. We adapted a LCR-γ-globin promoter-GFP reporter assay to a high-throughput robotic system to evaluate five diverse chemical libraries for this activity. Multiple structurally- and functionally-diverse compounds were identified which activate the γ-globin gene promoter at nanomolar concentrations, including some therapeutics approved for other conditions. Three candidates with established safety profiles were further evaluated in erythroid progenitors, anemic baboons and transgenic mice, with significant induction of γ-globin expression observed in vivo. A lead candidate, Benserazide, emerged which demonstrated > 20-fold induction of γ-globin mRNA expression in anemic baboons and increased F-cell proportions by 3.5-fold in transgenic mice. Benserazide has been used chronically to inhibit amino acid decarboxylase to enhance plasma levels of L-dopa. These studies confirm the utility of high-throughput screening and identify previously unrecognized fetal globin inducing candidates which can be developed expediently for treatment of hemoglobinopathies. PMID:26713848
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tojo, H.; Hatae, T.; Hamano, T.
2013-09-15
Collection optics for core measurements in a JT-60SA Thomson scattering system were designed. The collection optics will be installed in a limited space and have a wide field of view and wide wavelength range. Two types of the optics are therefore suggested: refraction and reflection types. The reflection system, with a large primary mirror, avoids large chromatic aberrations. Because the size limit of the primary mirror and vignetting due to the secondary mirror affect the total collection throughput, conditions that provide the high throughput are found through an optimization. A refraction system with four lenses forming an Ernostar system is also employed. The use of high-refractive-index glass materials enhances the freedom of the lens curvatures, resulting in suppression of the spherical and coma aberration. Moreover, sufficient throughput can be achieved, even with smaller lenses than that of a previous design given in [H. Tojo, T. Hatae, T. Sakuma, T. Hamano, K. Itami, Y. Aida, S. Suitoh, and D. Fujie, Rev. Sci. Instrum. 81, 10D539 (2010)]. The optical resolutions of the reflection and refraction systems are both sufficient for understanding the spatial structures in plasma. In particular, the spot sizes at the image of the optics are evaluated as ∼0.3 mm and ∼0.4 mm, respectively. The throughput for the two systems, including the pupil size and transmissivity, are also compared. The results show that good measurement accuracy (<10%) even at high electron temperatures (<30 keV) can be expected in the refraction system.
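The throughput comparison above hinges on collected solid angle (via pupil size) and transmissivity. As a first-order, hedged illustration (small-angle geometry for a distant circular pupil, not the actual JT-60SA optical model), collected throughput scales as source area times collection solid angle times transmission:

```python
import math

def collection_throughput(pupil_diameter_m: float, distance_m: float,
                          source_area_m2: float, transmission: float) -> float:
    # First-order light-collection figure of merit (etendue * transmission)
    # for a circular pupil of diameter D at distance L from a small source.
    # Small-angle approximation: solid angle ~ pi * (D/2)^2 / L^2.
    solid_angle = math.pi * (pupil_diameter_m / 2.0) ** 2 / distance_m ** 2
    return source_area_m2 * solid_angle * transmission
```

Under this approximation, doubling the pupil diameter quadruples the collected throughput, which is why primary-mirror size limits and vignetting dominate the optimization described above.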
Performance evaluation of hybrid VLC using device cost and power over data throughput criteria
NASA Astrophysics Data System (ADS)
Lee, C. C.; Tan, C. S.; Wong, H. Y.; Yahya, M. B.
2013-09-01
Visible light communication (VLC) technology has attracted attention in both academia and industry lately, driven by the development of light-emitting diode (LED) technology for solid-state lighting (SSL). It has great potential to gradually replace radio frequency (RF) wireless technology because it offers unregulated and unlicensed bandwidth to meet future demand for indoor wireless access by real-time, bandwidth-demanding applications. However, the VLC uplink channel is intrusive: unpleasant irradiance from the user device can interfere with the VLC downlink channel, and the small coverage (field of view) of VLC limits user mobility. To address this problem, a hybrid VLC system which integrates VLC (for the downlink) and RF (for the uplink) is proposed. It offers a non-intrusive RF back channel that provides high-throughput VLC and maintains compatibility with conventional RF devices. To be deployed in the market, a hybrid VLC system must save energy and cost relative to existing architectures that employ fluorescent or LED lights with RF technology. In this paper, a performance evaluation of the proposed hybrid system was carried out in terms of device cost and power consumption against data throughput. Based on our simulation, the hybrid VLC system was found to reduce device cost by 3% and power consumption by 68% when compared to fluorescent lights with RF technology. When compared to LED lights with RF technology, the proposed hybrid system achieves device cost savings as high as 47% and reduces power consumption by 49%. These promising results demonstrate that the hybrid VLC system is a feasible solution and paves the way for greater cost savings and energy efficiency compared with the current RF architecture, even with the increasing requirement of indoor area coverage.
A high throughput array microscope for the mechanical characterization of biomaterials
NASA Astrophysics Data System (ADS)
Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard
2015-02-01
In the last decade, the emergence of high throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties and the biochemical mechanisms controlling those properties contribute significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated array high-throughput microscope system that utilizes passive microbead diffusion to characterize mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue and whose expression has been implicated in cancer progression.
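Passive-microbead microrheology of this kind typically infers apparent viscosity from the beads' mean-squared displacement (MSD) via the Stokes-Einstein relation. The following sketch assumes a purely viscous (Newtonian) material and 2-D particle tracking; the instrument's actual analysis pipeline may differ:

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def apparent_viscosity(msd_slope_m2_per_s: float, bead_diameter_m: float,
                       temperature_k: float = 298.15) -> float:
    # For a bead diffusing in a purely viscous fluid, 2-D tracking gives
    # MSD(t) = 4 * D * t, so D = slope / 4. The Stokes-Einstein relation
    # then yields eta = kB * T / (3 * pi * d * D).
    diffusion = msd_slope_m2_per_s / 4.0
    return KB * temperature_k / (3.0 * math.pi * bead_diameter_m * diffusion)
```

For a 1-μm bead in water-like fluid (eta ~ 1 mPa·s), this round-trips consistently: the MSD slope implied by Stokes-Einstein recovers the input viscosity.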
NASA Astrophysics Data System (ADS)
Pfeiffer, Hans
1999-12-01
Projection reduction exposure with variable axis immersion lenses (PREVAIL) represents the high throughput e-beam projection approach to next generation lithography (NGL), which IBM is pursuing in cooperation with Nikon Corporation as an alliance partner. This paper discusses the challenges and accomplishments of the PREVAIL project. The supreme challenge facing all e-beam lithography approaches has been and still is throughput. Since the throughput of e-beam projection systems is severely limited by the available optical field size, the key to success is the ability to overcome this limitation. The PREVAIL technique overcomes field-limiting off-axis aberrations through the use of variable axis lenses, which electronically shift the optical axis simultaneously with the deflected beam, so that the beam effectively remains on axis. The resist images obtained with the proof-of-concept (POC) system demonstrate that PREVAIL effectively eliminates off-axis aberrations affecting both the resolution and placement accuracy of pixels. As part of the POC system a high emittance gun has been developed to provide uniform illumination of the patterned subfield, and to fill the large numerical aperture projection optics designed to significantly reduce beam blur caused by Coulomb interaction.
THE RABIT: A RAPID AUTOMATED BIODOSIMETRY TOOL FOR RADIOLOGICAL TRIAGE
Garty, Guy; Chen, Youhua; Salerno, Alessio; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Amundson, Sally A.; Brenner, David J.
2010-01-01
In response to the recognized need for high throughput biodosimetry methods for use after large scale radiological events, a logical approach is complete automation of standard biodosimetric assays that are currently performed manually. We describe progress to date on the RABIT (Rapid Automated BIodosimetry Tool), designed to score micronuclei or γ-H2AX fluorescence in lymphocytes derived from a single drop of blood from a fingerstick. The RABIT system is designed to be completely automated, from the input of the capillary blood sample into the machine, to the output of a dose estimate. Improvements in throughput are achieved through use of a single drop of blood, optimization of the biological protocols for in-situ analysis in multi-well plates, implementation of robotic plate and liquid handling, and new developments in high-speed imaging. Automating well-established bioassays represents a promising approach to high-throughput radiation biodosimetry, both because high throughputs can be achieved and because the time to deployment is potentially much shorter than for a new biological assay. Here we describe the development of each of the individual modules of the RABIT system, and show preliminary data from key modules. System integration is ongoing, to be followed by calibration and validation. PMID:20065685
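The final dose-estimate step in such biodosimetry pipelines is conventionally an inversion of a linear-quadratic calibration curve, Y = c + αD + βD², relating measured yield (e.g. micronuclei per cell) to absorbed dose. A hedged sketch of that generic cytogenetic inversion follows; the coefficients used in the test are purely illustrative, not RABIT calibration values:

```python
import math

def dose_from_yield(y: float, c: float, alpha: float, beta: float) -> float:
    # Invert the linear-quadratic calibration Y = c + alpha*D + beta*D^2
    # to estimate absorbed dose D (in Gy) from a measured yield Y.
    # Coefficients are assay-specific and must come from a measured
    # calibration curve.
    if beta == 0.0:
        return (y - c) / alpha  # purely linear calibration
    disc = alpha ** 2 + 4.0 * beta * (y - c)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)
```

Taking the positive root of the quadratic guarantees a physically meaningful (non-negative) dose whenever the measured yield exceeds the background term c.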
Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott
2018-05-01
The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.
Direct assembling methodologies for high-throughput bioscreening
Rodríguez-Dévora, Jorge I.; Shi, Zhi-dong; Xu, Tao
2012-01-01
Over the last few decades, high-throughput (HT) bioscreening, a technique that allows rapid screening of biochemical compound libraries against biological targets, has been widely used in drug discovery, stem cell research, development of new biomaterials, and genomics research. To achieve these ambitions, scaffold-free (or direct) assembly of biological entities of interest has become critical. Appropriate assembling methodologies are required to build an efficient HT bioscreening platform. The development of contact and non-contact assembling systems as a practical solution has been driven by a variety of essential attributes of the bioscreening system, such as miniaturization, high throughput, and high precision. The present article reviews recent progress on these assembling technologies utilized for the construction of HT bioscreening platforms. PMID:22021162
A Simple Model of Nitrogen Concentration, Throughput, and Denitrification in Estuaries
The Estuary Nitrogen Model (ENM) is a mass balance model that includes calculation of nitrogen losses within bays and estuaries using system flushing time. The model has been used to demonstrate the dependence of throughput and denitrification of nitrogen in bays and estuaries on...
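One common way to express that dependence on flushing time (my sketch of a generic well-mixed, steady-state box model; the ENM's actual formulation may differ) treats flushing and denitrification as competing first-order losses:

```python
def denitrified_fraction(flushing_time_d: float, denit_rate_per_d: float) -> float:
    # Competing first-order losses in a well-mixed estuary: nitrogen
    # leaves either by flushing (rate 1/tau_f) or by denitrification
    # (rate k_d). At steady state the denitrified fraction is
    # k_d / (k_d + 1/tau_f).
    flush_rate = 1.0 / flushing_time_d
    return denit_rate_per_d / (denit_rate_per_d + flush_rate)

def throughput_fraction(flushing_time_d: float, denit_rate_per_d: float) -> float:
    # Complementary fraction exported downstream ("throughput").
    return 1.0 - denitrified_fraction(flushing_time_d, denit_rate_per_d)
```

This simple form captures the qualitative behavior described above: longer flushing times give denitrification more opportunity to act, so the denitrified fraction rises and throughput falls.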
Use of High-Throughput Testing and Approaches for Evaluating Chemical Risk-Relevance to Humans
ToxCast is profiling the bioactivity of thousands of chemicals based on high-throughput screening (HTS) and computational models that integrate knowledge of biological systems and in vivo toxicities. Many of these assays probe signaling pathways and cellular processes critical to...
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering☆
Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan
2016-01-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.
Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan
2016-04-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. In this opinion paper, we postulate that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. 
The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.
High Throughput and Mechano-Active Platforms to Promote Cartilage Regeneration and Repair
NASA Astrophysics Data System (ADS)
Mohanraj, Bhavana
Traumatic joint injuries initiate acute degenerative changes in articular cartilage that can lead to progressive loss of load-bearing function. As a result, patients often develop post-traumatic osteoarthritis (PTOA), a condition for which no biologic interventions currently exist. To address this need, tissue engineering aims to mimic the structure and function of healthy, native counterparts. These constructs can be used not only to replace degenerated tissue, but also to build in vitro, pre-clinical models of disease. Towards this latter goal, this thesis focuses on the design of a high throughput system to screen new therapeutics in a micro-engineered model of PTOA, and the development of a mechanically-responsive drug delivery system to augment tissue-engineered approaches for cartilage repair. High throughput screening is a powerful tool for drug discovery that can be adapted to include 3D tissue constructs. To facilitate this process for cartilage repair, we built a high throughput mechanical injury platform to create an engineered cartilage model of PTOA. Compressive injury of functionally mature constructs increased cell death and proteoglycan loss, two hallmarks of injury observed in vivo. Comparison of this response to that of native cartilage explants, and evaluation of putative therapeutics, validated this model for subsequent use in small molecule screens. A primary screen of 118 compounds identified a number of 'hits' and relevant pathways that may modulate pathologic signaling post-injury. To complement this process of therapeutic discovery, a stimuli-responsive delivery system was designed that used mechanical inputs as the 'trigger' mechanism for controlled release. The failure thresholds of these mechanically-activated microcapsules (MAMCs) were influenced by physical properties and composition, as well as matrix mechanical properties in 3D environments.
TGF-beta released from the system upon mechano-activation stimulated stem cell chondrogenesis, demonstrating the potential of MAMCs to actively deliver therapeutics within demanding mechanical environments. Taken together, this work advances our capacity to identify and deliver new compounds of clinical relevance to modulate disease progression following traumatic injury using state-of-the-art micro-engineered screening tools and a novel mechanically-activated delivery system. These platforms advance strategies for cartilage repair and regeneration in PTOA and provide new options for the treatment of this debilitating condition.
Economic feeder for recharging and "topping off"
NASA Astrophysics Data System (ADS)
Fickett, Bryan; Mihalik, G.
2000-04-01
Increasing the size of the melt charge significantly increases yield and reduces costs. Siemens Solar Industries is optimizing a method to charge additional material after meltdown (top-off) using an external feeder system. A prototype feeder system was fabricated consisting of a hopper and feed delivery system. The low-cost feeder is designed for simple operation and maintenance. The system is capable of introducing up to 60 kg of granular silicon while under vacuum. An isolation valve permits refilling of the hopper while maintaining vacuum in the growth furnace. Using the feeder system in conjunction with Siemens Solar Industries' energy efficient hot zone dramatically reduces power and argon consumption. Throughput is also improved as faster pull speeds can be attained. The increased pull speeds have an even greater impact when the charge size is increased. Further cost reduction can be achieved by refilling the crucible after crystal growth and pulling a second ingot run. Siemens Solar Industries is presently testing the feeder in production.
High-throughput discovery of rare human nucleotide polymorphisms by Ecotilling
Till, Bradley J.; Zerr, Troy; Bowers, Elisabeth; Greene, Elizabeth A.; Comai, Luca; Henikoff, Steven
2006-01-01
Human individuals differ from one another at only ∼0.1% of nucleotide positions, but these single nucleotide differences account for most heritable phenotypic variation. Large-scale efforts to discover and genotype human variation have been limited to common polymorphisms. However, these efforts overlook rare nucleotide changes that may contribute to phenotypic diversity and genetic disorders, including cancer. Thus, there is an increasing need for high-throughput methods to robustly detect rare nucleotide differences. Toward this end, we have adapted the mismatch discovery method known as Ecotilling for the discovery of human single nucleotide polymorphisms. To increase throughput and reduce costs, we developed a universal primer strategy and implemented algorithms for automated band detection. Ecotilling was validated by screening 90 human DNA samples for nucleotide changes in 5 gene targets and by comparing results to public resequencing data. To increase throughput for discovery of rare alleles, we pooled samples 8-fold and found Ecotilling to be efficient relative to resequencing, with a false negative rate of 5% and a false discovery rate of 4%. We identified 28 new rare alleles, including some that are predicted to damage protein function. The detection of rare damaging mutations has implications for models of human disease. PMID:16893952
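The throughput claim behind the 8-fold pooling in this abstract can be checked with simple arithmetic. The sketch below uses the figures given (90 DNA samples, 8-fold pools); the assumption that each pool occupies one analysis lane is illustrative, not taken from the paper.

```python
import math

# Back-of-envelope throughput gain from 8-fold sample pooling, using the
# abstract's figures (90 samples, pools of 8). One lane per pool is an
# illustrative assumption.
samples = 90
pool_size = 8

lanes_unpooled = samples                       # one lane per individual sample
lanes_pooled = math.ceil(samples / pool_size)  # 12 lanes for the pooled screen

speedup = lanes_unpooled / lanes_pooled
print(f"{lanes_pooled} lanes instead of {lanes_unpooled} ({speedup:.1f}x fewer)")
# -> 12 lanes instead of 90 (7.5x fewer)
```

The gain comes at the cost of sensitivity, which is why the abstract reports the 5% false negative and 4% false discovery rates alongside the pooling factor.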
Advancements in zebrafish applications for 21st century toxicology.
Garcia, Gloria R.; Noyes, Pamela D.; Tanguay, Robert L.
2016-05-01
The zebrafish model is the only available high-throughput vertebrate assessment system, and it is uniquely suited for studies of in vivo cell biology. A sequenced and annotated genome has revealed a large degree of evolutionary conservation in comparison to the human genome. Due to our shared evolutionary history, the anatomical and physiological features of fish are highly homologous to humans, which facilitates studies relevant to human health. In addition, zebrafish provide a unique vertebrate data stream that allows researchers to anchor hypotheses at the biochemical, genetic, and cellular levels to observations at the structural, functional, and behavioral levels in a high-throughput format. In this review, we draw heavily from toxicological studies to highlight advances in zebrafish high-throughput systems. Breakthroughs in transgenic/reporter lines and in methods for genetic manipulation, such as the CRISPR-Cas9 system, are covered through reports from diverse disciplines. Copyright © 2016 Elsevier Inc. All rights reserved. PMID:27016469
Optimizing the MAC Protocol in Localization Systems Based on IEEE 802.15.4 Networks.
Pérez-Solano, Juan J.; Claver, Jose M.; Ezpeleta, Santiago
2017-07-06
Radio frequency signals are commonly used in the development of indoor localization systems. The infrastructure of these systems includes some beacons placed at known positions that exchange radio packets with users to be located. When the system is implemented using wireless sensor networks, the wireless transceivers integrated in the network motes are usually based on the IEEE 802.15.4 standard. However, CSMA-CA, which is the basis for the medium access protocols in this category of communication systems, is not suitable when several users want to exchange bursts of radio packets with the same beacon to acquire the radio signal strength indicator (RSSI) values needed in the location process. Therefore, new protocols are necessary to avoid the packet collisions that appear when multiple users try to communicate with the same beacons. On the other hand, the RSSI sampling process should be carried out very quickly because some systems cannot tolerate a large delay in the location process. This is even more important when the RSSI sampling process includes measurements at different signal power levels or frequency channels. The principal objective of this work is to speed up the RSSI sampling process in indoor localization systems. To achieve this objective, the main contribution is a new MAC protocol that eliminates the medium access contention periods and decreases the number of packet collisions, accelerating the RSSI collection process. Moreover, the protocol increases the overall network throughput by taking advantage of frequency channel diversity. The presented results show the suitability of this protocol for reducing the RSSI gathering delay and increasing the network throughput in simulated and real environments. PMID:28684666
Tsiliyannis, Christos Aristeides
2013-09-01
Hazardous waste incinerators (HWIs) differ substantially from thermal power facilities, since instead of maximizing energy production with the minimum amount of fuel, they aim at maximizing throughput. Variations in quantity or composition of received waste loads may significantly diminish HWI throughput (the decisive profit factor) from its nominal design value. A novel formulation of combustion balance is presented, based on linear operators, which isolates the wastefeed vector from the invariant combustion stoichiometry kernel. Explicit expressions for the throughput are obtained, in terms of incinerator temperature, fluegas heat recuperation ratio and design parameters, for an arbitrary number of wastes, based on fundamental principles (mass and enthalpy balances). The impact of waste variations, of recuperation ratio and of furnace temperature is explicitly determined. It is shown that in the presence of waste uncertainty, the throughput may be a decreasing or increasing function of incinerator temperature and recuperation ratio, depending on the sign of a dimensionless parameter related only to the uncertain wastes. The dimensionless parameter is proposed as a sharp a priori waste 'fingerprint', determining the necessary increase or decrease of manipulated variables (recuperation ratio, excess air, auxiliary fuel feed rate, auxiliary air flow) in order to balance the HWI and maximize throughput under uncertainty in received wastes. A 10-step procedure is proposed for direct application subject to process capacity constraints. The results may be useful for efficient HWI operation and for preparing hazardous waste blends. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Pang, Jackson; Pingree, Paula J.; Torgerson, J. Leigh
2006-01-01
We present the Telecommunications protocol processing subsystem using Reconfigurable Interoperable Gate Arrays (TRIGA), a novel approach that unifies fault tolerance, error correction coding and interplanetary communication protocol off-loading to implement CCSDS File Delivery Protocol and Datalink layers. The new reconfigurable architecture offers more than one order of magnitude throughput increase while reducing footprint requirements in memory, command and data handling processor utilization, communication system interconnects and power consumption.
High-throughput screening (HTS) for potential thyroid–disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limi...
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
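The abstract says data are distributed across nodes by partitioning a spatial index, but does not name the scheme. A Z-order (Morton) key is one common choice for such partitioning; the sketch below is a hypothetical illustration of the idea, not the system's actual implementation: interleaving the bits of 3-d coordinates gives nearby image cuboids nearby keys, so a partition of the key space tends to keep spatially adjacent data together.

```python
def morton3d(x: int, y: int, z: int) -> int:
    """Interleave the bits of 3-d coordinates into a single Z-order key,
    so spatially nearby cuboids receive numerically nearby keys."""
    def spread(v: int) -> int:
        out = 0
        for i in range(21):          # supports coordinates up to 2**21
            out |= ((v >> i) & 1) << (3 * i)
        return out
    return spread(x) | (spread(y) << 1) | (spread(z) << 2)

def node_for(x: int, y: int, z: int, n_nodes: int) -> int:
    # Simplest placement: key modulo node count. Assigning contiguous key
    # ranges instead would co-locate larger spatial regions on one node.
    return morton3d(x, y, z) % n_nodes
```

A cutout request for a small region then touches few nodes, which is the locality property a connectome-building workload benefits from.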
Wang, Heng; Qian, Xiangjie; Zhang, Lan; Xu, Sailong; Li, Haifeng; Xia, Xiaojian; Dai, Liankui; Xu, Liang; Yu, Jingquan; Liu, Xu
2018-01-01
We present a high throughput crop physiology condition monitoring system and a corresponding monitoring method. The monitoring system can perform large-area chlorophyll fluorescence imaging and multispectral imaging. The monitoring method can determine the crop's current condition continuously and non-destructively. We choose chlorophyll fluorescence parameters and relative multispectral reflectance as the indicators of crop physiological status. Using tomato as the experimental subject, typical crop physiological stresses, such as drought, nutrient deficiency and plant disease, can be distinguished by the monitoring method. Furthermore, we have studied the correlation between the physiological indicators and the degree of stress. Besides realizing continuous monitoring of crop physiology, the monitoring system and method open the possibility of automated machine diagnosis of plant physiology. Highlights: A newly designed high throughput crop physiology monitoring system and the corresponding monitoring method are described in this study. Different types of stress can induce distinct fluorescence and spectral characteristics, which can be used to evaluate the physiological status of plants.
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
Graphics Processing Units for HEP trigger systems
NASA Astrophysics Data System (ADS)
Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.
2016-07-01
General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPU for synchronous low level triggers, focusing on the CERN NA62 experiment trigger system. The use of GPU in higher level trigger systems is also briefly considered.
Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E
2011-12-02
The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments. PMID:22136293
Jiang, Yutao; Hascall, Daniel; Li, Delia; Pease, Joseph H
2015-09-11
In this paper, we introduce a high throughput LCMS/UV/CAD/CLND system that improves upon previously reported systems by increasing both the quantitation accuracy and the range of compounds amenable to testing, in particular, low molecular weight "fragment" compounds. This system consists of a charged aerosol detector (CAD) and chemiluminescent nitrogen detector (CLND) added to a LCMS/UV system. Our results show that the addition of CAD and CLND to LCMS/UV is more reliable for concentration determination for a wider range of compounds than either detector alone. Our setup also allows for the parallel analysis of each sample by all four detectors and so does not significantly increase run time per sample. Copyright © 2015 Elsevier B.V. All rights reserved.
Inventory management and reagent supply for automated chemistry.
Kuzniar, E
1999-08-01
Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.
Approaches to automated protein crystal harvesting
Deller, Marc C.; Rupp, Bernhard
2014-01-01
The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746
Niu, Ye; Zhang, Xu; Si, Ting; Zhang, Yuntian; Qi, Lin; Zhao, Gang; Xu, Ronald X; He, Xiaoming; Zhao, Yi
2017-12-01
Geometric and mechanical characterizations of hydrogel materials at the microscale are attracting increasing attention due to their importance in tissue engineering, regenerative medicine, and drug delivery applications. Contemporary approaches for measuring these properties of hydrogel microbeads suffer from low throughput, complex system configurations, and measurement inaccuracy. In this work, a continuous-flow device is developed to measure geometric and viscoelastic properties of hydrogel microbeads by flowing the microbeads through a tapered microchannel with an array of interdigitated microelectrodes patterned underneath the channel. The viscoelastic properties are derived from the trajectories of microbeads using a quasi-linear viscoelastic model. The measurement is independent of the applied volumetric flow rate. The results show that the geometric and viscoelastic properties of Ca-alginate hydrogel microbeads can be determined independently and simultaneously. The bulky high-speed optical systems are eliminated, simplifying the system configuration and making it a truly miniaturized device. A throughput of up to 394 microbeads per minute is achieved. This study may provide a powerful tool for mechanical profiling of hydrogel microbeads to support their wide applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Systems Proteomics for Translational Network Medicine
Arrell, D. Kent; Terzic, Andre
2012-01-01
Universal principles underlying network science, and their ever-increasing applications in biomedicine, underscore the unprecedented capacity of systems biology based strategies to synthesize and resolve massive high throughput generated datasets. Enabling previously unattainable comprehension of biological complexity, systems approaches have accelerated progress in elucidating disease prediction, progression, and outcome. Applied to the spectrum of states spanning health and disease, network proteomics establishes a collation, integration, and prioritization algorithm to guide mapping and decoding of proteome landscapes from large-scale raw data. Providing unparalleled deconvolution of protein lists into global interactomes, integrative systems proteomics enables objective, multi-modal interpretation at molecular, pathway, and network scales, merging individual molecular components, their plurality of interactions, and functional contributions for systems comprehension. As such, network systems approaches are increasingly exploited for objective interpretation of cardiovascular proteomics studies. Here, we highlight network systems proteomic analysis pipelines for integration and biological interpretation through protein cartography, ontological categorization, pathway and functional enrichment and complex network analysis. PMID:22896016
Aircraft Wake Vortex Spacing System (AVOSS) Performance Update and Validation Study
NASA Technical Reports Server (NTRS)
Rutishauser, David K.; O'Connor, Cornelius J.
2001-01-01
An analysis has been performed on data generated from the two most recent field deployments of the Aircraft Wake VOrtex Spacing System (AVOSS). The AVOSS provides reduced aircraft spacing criteria for wake vortex avoidance as compared to the FAA spacing applied under Instrument Flight Rules (IFR). Several field deployments culminating in a system demonstration at Dallas Fort Worth (DFW) International Airport in the summer of 2000 were successful in showing a sound operational concept and the system's potential to provide a significant benefit to airport operations. For DFW, a predicted average throughput increase of 6% was observed. This increase implies 6 or 7 more aircraft on the ground in a one-hour period for DFW operations. Several studies of performance correlations to system configuration options, design options, and system inputs are also reported. The studies focus on the validation performance of the system.
High-throughput sequencing: a failure mode analysis.
Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A
2005-01-04
Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.
High throughput chemical munitions treatment system
Haroldsen, Brent L [Manteca, CA]; Stofleth, Jerome H [Albuquerque, NM]; Didlake, Jr., John E.; Wu, Benjamin C-P [San Ramon, CA]
2011-11-01
A new High-Throughput Explosive Destruction System is disclosed. The new system is comprised of two side-by-side detonation containment vessels, each comprising first and second halves, that feed into a single agent treatment vessel. Both detonation containment vessels further comprise a surrounding ventilation facility. Moreover, the detonation containment vessels are designed to separate into two half-shells, wherein one shell can be moved axially away from the fixed second half for ease of access and loading. The vessels are closed by means of a surrounding, clam-shell-type locking seal mechanism.
Optoelectronic image processing for cervical cancer screening
NASA Astrophysics Data System (ADS)
Narayanswamy, Ramkumar; Sharpe, John P.; Johnson, Kristina M.
1994-05-01
Automation of the Pap-smear cervical screening method is highly desirable: it relieves tedium for the human operators, reduces cost, and should increase accuracy and provide repeatability. We present here the design of a high-throughput optoelectronic system which forms the first stage of a two-stage system to automate Pap-smear screening. We use a mathematical morphological technique called the hit-or-miss transform to identify the suspicious areas on a Pap-smear slide. This algorithm is implemented using a VanderLugt architecture and a time-sequential ANDing smart pixel array.
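The hit-or-miss transform mentioned above is a standard binary morphological operation. A minimal digital sketch of its template-matching semantics (not the paper's optical VanderLugt implementation) can be written with SciPy, here detecting isolated foreground pixels:

```python
import numpy as np
from scipy.ndimage import binary_hit_or_miss

# Binary test image with a single isolated pixel at (2, 2).
img = np.zeros((5, 5), dtype=bool)
img[2, 2] = True

hit = np.array([[0, 0, 0],   # foreground pattern: a lone "on" pixel
                [0, 1, 0],
                [0, 0, 0]])
miss = np.array([[1, 1, 1],  # background pattern: all 8 neighbours "off"
                 [1, 0, 1],
                 [1, 1, 1]])

# True exactly where `hit` fits the foreground AND `miss` fits the background.
detected = binary_hit_or_miss(img, structure1=hit, structure2=miss)
```

In the paper this matching is performed optically in parallel over the whole slide; the digital version above only illustrates what the transform computes.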
Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; ...
2016-02-24
The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.
HPC AND GRID COMPUTING FOR INTEGRATIVE BIOMEDICAL RESEARCH
Kurc, Tahsin; Hastings, Shannon; Kumar, Vijay; Langella, Stephen; Sharma, Ashish; Pan, Tony; Oster, Scott; Ervin, David; Permar, Justin; Narayanan, Sivaramakrishnan; Gil, Yolanda; Deelman, Ewa; Hall, Mary; Saltz, Joel
2010-01-01
Integrative biomedical research projects query, analyze, and integrate many different data types and make use of datasets obtained from measurements or simulations of structure and function at multiple biological scales. With the increasing availability of high-throughput and high-resolution instruments, integrative biomedical research imposes many challenging requirements on software middleware systems. In this paper, we look at some of these requirements using example research pattern templates. We then discuss how middleware systems, which incorporate Grid and high-performance computing, could be employed to address the requirements. PMID:20107625
Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...
Perspectives on Validation of High-Throughput Assays Supporting 21st Century Toxicity Testing
In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. ...
The toxicity-testing paradigm has evolved to include high-throughput (HT) methods for addressing the increasing need to screen hundreds to thousands of chemicals rapidly. Approaches that involve in vitro screening assays, in silico predictions of exposure concentrations, and phar...
Human ATAD5 is an excellent biomarker for identifying genotoxic compounds because ATAD5 protein levels increase post-transcriptionally following exposure to a variety of DNA damaging agents. Here we report a novel quantitative high-throughput ATAD5-luciferase assay that can moni...
Multi-level scanning method for defect inspection
Bokor, Jeffrey; Jeong, Seongtae
2002-01-01
A method for performing scanned defect inspection of a collection of contiguous areas using a specified false-alarm rate and capture rate within an inspection system that has characteristic seek times between inspection locations. The multi-stage method involves setting an increased false-alarm rate for a first stage of scanning, wherein subsequent stages of scanning inspect only the detected areas of probable defects at lowered values of the false-alarm rate. For scanning inspection operations wherein the seek time and area uncertainty are favorable, the method can substantially increase inspection throughput.
Problems in processing Rheinische Braunkohle (soft coal) (in German)
DOE Office of Scientific and Technical Information (OSTI.GOV)
von Hartmann, G.B.
At Wesseling, difficulties were encountered with the hydrogenation of Rhine brown coal. The hydrogenation reaction was proceeding too rapidly at 600 atm pressure under relatively low temperature and throughput conditions. This caused a build-up of "caviar" deposits containing ash and asphalts. This flocculation of asphalt seemed to arise because the rapid reaction produced a liquid medium unable to hold the heavy asphalt particles in suspension. A stronger paraffinic character of the oil was also a result. To obtain practical, problem-free yields, throughput had to be increased (from 0.4 kg/liter/hr to more than 0.5), and temperature had to be increased (from 24.0 MV to 24.8 MV). Further, a considerable increase in sludge recycling was recommended. The Wesseling plant was unable to increase the temperature and throughput. However, more sludge was recycled, producing a paste better able to hold higher-molecular-weight particles in suspension. If this did not solve the "caviar" deposit problems, further recommendations were suggested, including the addition of more heavy oil.
A Memory Efficient Network Encryption Scheme
NASA Astrophysics Data System (ADS)
El-Fotouh, Mohamed Abo; Diepold, Klaus
In this paper, we studied the two most widely used encryption schemes in network applications. Shortcomings were found in both schemes, as they either consume more memory to gain high throughput or use little memory at the cost of low throughput. As the number of Internet users increases each day, the need has arisen for a scheme that has low memory requirements and at the same time possesses high speed. We used the SSM model [1] to construct an encryption scheme based on the AES. The proposed scheme possesses high throughput together with low memory requirements.
An analysis of the development of port operation in Da Nang Port, Vietnam
NASA Astrophysics Data System (ADS)
Nguyen, T. D. H.; Cools, M.
2018-04-01
This paper presents the current operating status of Da Nang Port, Vietnam in the period 2012-2016. The port's operation showed positive changes, reflected by a significant increase in total throughput, especially containerized cargo volumes. Classical decomposition techniques are used to find the trend-cycle and seasonal components of monthly throughput flows. Appropriate predictive models for the different kinds of throughput are proposed. Finally, a development strategy towards containerization, and investment policies for facilities, equipment, and infrastructure, are suggested based on the predictive results.
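Classical decomposition of a monthly series into trend-cycle and seasonal components, as used above, can be sketched as follows. This is a generic additive-model illustration on synthetic data, not the authors' dataset or exact procedure:

```python
import numpy as np

def classical_decompose(x, period=12):
    """Additive classical decomposition: trend-cycle via a centered moving
    average, seasonal component via position-wise averages of the detrended
    series (illustrative sketch)."""
    kernel = np.ones(period) / period
    ma = np.convolve(x, kernel, mode="valid")   # length n - period + 1
    trend = (ma[:-1] + ma[1:]) / 2              # center an even-period MA
    offset = period // 2
    detrended = x[offset:offset + len(trend)] - trend
    # Average the detrended values by position within the period.
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal -= seasonal.mean()                 # seasonal indices sum to ~0
    return trend, seasonal
```

For a purely linear trend plus a zero-mean seasonal pattern, the centered moving average recovers the trend exactly and the seasonal indices are recovered up to a rotation by `offset` months.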
NASA Technical Reports Server (NTRS)
Watson, James F., III; Desrochers, Alan A.
1991-01-01
Generalized stochastic Petri nets (GSPNs) are applied to flexible manufacturing systems (FMSs). Throughput subnets and s-transitions are presented. Two FMS examples containing nonexponential distributions which were analyzed in previous papers by queuing theory and probability theory, respectively, are treated using GSPNs developed using throughput subnets and s-transitions. The GSPN results agree with the previous results, and developing and analyzing the GSPN models are straightforward and relatively easy compared to other methodologies.
Spectral efficiency in crosstalk-impaired multi-core fiber links
NASA Astrophysics Data System (ADS)
Luís, Ruben S.; Puttnam, Benjamin J.; Rademacher, Georg; Klaus, Werner; Agrell, Erik; Awaji, Yoshinari; Wada, Naoya
2018-02-01
We review the latest advances on ultra-high throughput transmission using crosstalk-limited single-mode multicore fibers and compare these with the theoretical spectral efficiency of such systems. We relate the crosstalk-imposed spectral efficiency limits with fiber parameters, such as core diameter, core pitch, and trench design. Furthermore, we investigate the potential of techniques such as direction interleaving and high-order MIMO to improve the throughput or reach of these systems when using various modulation formats.
High throughput and miniaturised systems for biodegradability assessments.
Cregut, Mickael; Jouanneau, Sulivan; Brillet, François; Durand, Marie-José; Sweetlove, Cyril; Chenèble, Jean-Charles; L'Haridon, Jacques; Thouand, Gérald
2014-01-01
Society demands safer products with a better ecological profile. Regulatory criteria have been developed to prevent risks to human health and the environment, for example, within the framework of the European regulation REACH (Regulation (EC) No 1907, 2006). This has driven industry to consider the development of high-throughput screening methodologies for assessing chemical biodegradability. These new screening methodologies must be scalable for miniaturisation, reproducible and as reliable as existing procedures for enhanced biodegradability assessment. Here, we evaluate two alternative systems that can be scaled for high-throughput screening and conveniently miniaturised to limit costs in comparison with traditional testing. These systems are based on two dyes: an invasive fluorescent dye that serves as a cellular activity marker (a resazurin-like dye reagent) and a noninvasive fluorescent oxygen optosensor dye (an optical sensor). The advantages and limitations of these platforms for biodegradability assessment are presented. Our results confirm the feasibility of these systems for evaluating and screening chemicals for ready biodegradability. The optosensor is a miniaturised version of a component already used in traditional ready biodegradability testing, whereas the resazurin dye offers an interesting new screening mechanism for chemical concentrations greater than 10 mg/l that are not amenable to traditional closed bottle tests. The use of these approaches allows generalisation of high-throughput screening methodologies to meet the need of developing new compounds with a favourable ecological profile and also assessment for regulatory purposes.
Jiang, Guangli; Liu, Leibo; Zhu, Wenping; Yin, Shouyi; Wei, Shaojun
2015-09-04
This paper proposes a real-time feature extraction VLSI architecture for high-resolution images based on the accelerated KAZE algorithm. Firstly, a new system architecture is proposed. It increases the system throughput, provides flexibility in image resolution, and offers trade-offs between speed and scaling robustness. The architecture consists of a two-dimensional pipeline array that fully utilizes computational similarities in octaves. Secondly, a substructure (block-serial discrete-time cellular neural network) that can realize a nonlinear filter is proposed. This structure decreases the memory demand through the removal of data dependency. Thirdly, a hardware-friendly descriptor is introduced in order to overcome the hardware design bottleneck through the polar sample pattern; a simplified method to realize rotation invariance is also presented. Finally, the proposed architecture is designed in TSMC 65 nm CMOS technology. The experimental results show a performance of 127 fps in full HD resolution at 200 MHz frequency. The peak performance reaches 181 GOPS and the throughput is double the speed of other state-of-the-art architectures.
Recent advances in inkjet dispensing technologies: applications in drug discovery.
Zhu, Xiangcheng; Zheng, Qiang; Yang, Hu; Cai, Jin; Huang, Lei; Duan, Yanwen; Xu, Zhinan; Cen, Peilin
2012-09-01
Inkjet dispensing technology is a promising fabrication methodology widely applied in drug discovery. Its automated, programmable characteristics and high-throughput efficiency make this approach potentially very useful for miniaturizing the design patterns for assays and drug screening. Various custom-made inkjet dispensing systems as well as specialized bio-inks and substrates have been developed and applied to fulfill the increasing demands of basic drug discovery studies. The incorporation of other modern technologies has further exploited the potential of inkjet dispensing technology in drug discovery and development. This paper reviews and discusses the recent developments and practical applications of inkjet dispensing technology in several areas of drug discovery and development, including fundamental assays of cells and proteins, microarrays, biosensors, tissue engineering, and basic biological and pharmaceutical studies. Progress in a number of research areas, including biomaterials, inkjet mechanical systems and modern analytical techniques, as well as the exploration and accumulation of profound biological knowledge, has enabled different inkjet dispensing technologies to be developed and adapted for high-throughput pattern fabrication and miniaturization. This in turn presents a great opportunity to propel inkjet dispensing technology into drug discovery.
Wyatt, S K; Barck, K H; Kates, L; Zavala-Solorio, J; Ross, J; Kolumam, G; Sonoda, J; Carano, R A D
2015-11-01
The ability to non-invasively measure body composition in mouse models of obesity and obesity-related disorders is essential for elucidating mechanisms of metabolic regulation and monitoring the effects of novel treatments. These studies aimed to develop a fully automated, high-throughput micro-computed tomography (micro-CT)-based image analysis technique for longitudinal quantitation of adipose, non-adipose and lean tissue as well as bone and demonstrate utility for assessing the effects of two distinct treatments. An initial validation study was performed in diet-induced obesity (DIO) and control mice on a vivaCT 75 micro-CT system. Subsequently, four groups of DIO mice were imaged pre- and post-treatment with an experimental agonistic antibody specific for anti-fibroblast growth factor receptor 1 (anti-FGFR1, R1MAb1), control immunoglobulin G antibody, a known anorectic antiobesity drug (rimonabant, SR141716), or solvent control. The body composition analysis technique was then ported to a faster micro-CT system (CT120) to markedly increase throughput as well as to evaluate the use of micro-CT image intensity for hepatic lipid content in DIO and control mice. Ex vivo chemical analysis and colorimetric analysis of the liver triglycerides were performed as the standard metrics for correlation with body composition and hepatic lipid status, respectively. Micro-CT-based body composition measures correlate with ex vivo chemical analysis metrics and enable distinction between DIO and control mice. R1MAb1 and rimonabant have differing effects on body composition as assessed by micro-CT. High-throughput body composition imaging is possible using a modified CT120 system. Micro-CT also provides a non-invasive assessment of hepatic lipid content. This work describes, validates and demonstrates utility of a fully automated image analysis technique to quantify in vivo micro-CT-derived measures of adipose, non-adipose and lean tissue, as well as bone. 
These body composition metrics highly correlate with standard ex vivo chemical analysis and enable longitudinal evaluation of body composition and therapeutic efficacy monitoring.
Arranging computer architectures to create higher-performance controllers
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
1988-01-01
Techniques for integrating microprocessors, array processors, and other intelligent devices in control systems are reviewed, with an emphasis on the (re)arrangement of components to form distributed or parallel processing systems. Consideration is given to the selection of the host microprocessor, increasing the power and/or memory capacity of the host, multitasking software for the host, array processors to reduce computation time, the allocation of real-time and non-real-time events to different computer subsystems, intelligent devices to share the computational burden for real-time events, and intelligent interfaces to increase communication speeds. The case of a helicopter vibration-suppression and stabilization controller is analyzed as an example, and significant improvements in computation and throughput rates are demonstrated.
Under-sampling in a Multiple-Channel Laser Vibrometry System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corey, Jordan
2007-03-01
Laser vibrometry is a technique used to detect vibrations on objects using the interference of coherent light with itself. Most vibrometry systems process only one target location at a time, but processing multiple locations simultaneously provides improved detection capabilities. Traditional laser vibrometry systems employ oversampling to sample the incoming modulated-light signal; however, as the number of channels increases in these systems, certain issues arise, such as higher computational cost, excessive heat, increased power requirements, and increased component cost. This thesis describes a novel approach to laser vibrometry that utilizes undersampling to control the undesirable issues associated with oversampled systems. Undersampling allows significantly fewer samples to represent the modulated-light signals, which offers several advantages in the overall system design. These advantages include an improvement in thermal efficiency, lower processing requirements, and a higher immunity to the relative intensity noise inherent in laser vibrometry applications. A unique feature of this implementation is the use of a parallel architecture to increase the overall system throughput. This parallelism is realized using a hierarchical multi-channel architecture based on off-the-shelf programmable logic devices (PLDs).
NASA Astrophysics Data System (ADS)
Baggett, R.
2004-11-01
Next Generation Electric Propulsion (NGEP) technology development tasks are working towards advancing solar-powered electric propulsion systems and components to levels ready for transition to flight systems. Current tasks within NGEP include NASA's Evolutionary Xenon Thruster (NEXT), Carbon Based Ion Optics (CBIO), the NSTAR Extended Life Test (ELT) and low-power Hall Effect thrusters. The growing number of solar electric propulsion options provides reduced cost and flexibility to capture a wide range of Solar System exploration missions. Benefits of electric propulsion systems over state-of-the-art chemical systems include increased launch windows, which reduce mission risk; increased deliverable payload mass for more science; and a reduction in launch vehicle size, all of which increase the opportunities for New Frontiers and Discovery class missions. The Dawn Discovery mission makes use of electric propulsion for sequential rendezvous with two large asteroids (Vesta then Ceres), something not possible using chemical propulsion. NEXT components and the thruster system under development have NSTAR heritage with significant increases in maximum power and Isp along with deep throttling capability to accommodate changes in input power over the mission trajectory. NEXT will produce engineering model system components that will be validated (through qualification-level and integrated system testing) and ready for transition to flight system development. NEXT offers Discovery, New Frontiers, Mars Exploration and outer-planet missions a larger deliverable payload mass and a smaller launch vehicle size. CBIO addresses the need to further extend ion thruster lifetime by using low-erosion carbon-based materials. Testing of 30-cm Carbon-Carbon and Pyrolytic graphite grids using a lab model NSTAR thruster is complete. In addition, JPL completed a 1000 hr. life test on 30-cm Carbon-Carbon grids.
The NSTAR ELT was a lifetime qualification test started in 1999 with a goal of 88 kg throughput of Xenon propellant. The test was intentionally terminated in 2003 after accumulating 233 kg throughput. The thruster has been completely disassembled and the conditions of all components documented. Because most of the NSTAR design features have been used in the NEXT thruster, the success of the ELT goes a long way toward qualifying NEXT by similarity. Recent mission analyses for Discovery and New Frontiers class missions have also identified potential benefits of low-power, high-thrust Hall Effect thrusters. Estimated to be ready for mission implementation by 2008, low-power Hall systems could increase mission capture for electric propulsion by greatly reducing propulsion cost, mass and complexity.
Lane, Darius J. R.; Lawen, Alfons
2014-01-01
Vitamin C (ascorbate) plays numerous important roles in cellular metabolism, many of which have only come to light in recent years. For instance, within the brain, ascorbate acts in a neuroprotective and neuromodulatory manner that involves ascorbate cycling between neurons and vicinal astrocytes - a relationship that appears to be crucial for brain ascorbate homeostasis. Additionally, emerging evidence strongly suggests that ascorbate plays a far greater role in regulating cellular and systemic iron metabolism than is classically recognized. The increasing recognition of the integral role of ascorbate in normal and deregulated cellular and organismal physiology demands a range of medium-throughput and high-sensitivity analytic techniques that can be executed without the need for highly expensive specialist equipment. Here we provide explicit instructions for a medium-throughput, specific and relatively inexpensive microplate assay for the determination of both intra- and extracellular ascorbate in cell culture. PMID:24747535
NASA Technical Reports Server (NTRS)
Bremer, J. C.
1982-01-01
Physical models are developed for establishing criteria to decide on the acceptable contamination level of optical devices in space-borne conditions. Optical systems can be degraded in terms of decreased throughput, i.e., transmissivity or reflectivity, or increases in the total integrated scatter (TIS). Performance losses can be caused by particulate accretion, molecular film accretion, and impact cratering. A quantitative relationship is defined between film thickness and loss of throughput. Formulas are also developed for cases where induced surface defects are larger than, smaller than, or of the same order as the observed wavelengths. The techniques are used to quantify the degradation of a VUV solar coronagraph, a VUV stellar telescope, and a solar cell due to TIS. Applications are projected for estimating the contamination sensitivity of specific instruments, assessing the contamination hazard from known particulates, or defining clean room standards.
Matrix-vector multiplication using digital partitioning for more accurate optical computing
NASA Technical Reports Server (NTRS)
Gary, C. K.
1992-01-01
Digital partitioning offers a flexible means of increasing the accuracy of an optical matrix-vector processor. This algorithm can be implemented with the same architecture required for a purely analog processor, which gives optical matrix-vector processors the ability to perform high-accuracy calculations at speeds comparable to or greater than those of electronic computers, as well as the ability to perform analog operations at a much greater speed. Digital partitioning is compared with digital multiplication by analog convolution, residue number systems, and redundant number representation in terms of the size and the speed required for an equivalent throughput as well as in terms of the hardware requirements. Digital partitioning and digital multiplication by analog convolution are found to be the most efficient algorithms if coding time and hardware are considered, and the architecture for digital partitioning permits the use of analog computations to provide the greatest throughput for a single processor.
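The core idea of digital partitioning, splitting each operand into low-precision digits so that the analog processor only ever multiplies small numbers and the exact result is recovered by shifted recombination, can be sketched numerically. This simulates only the partitioning arithmetic, not the optical hardware, and the base/digit-count choices are illustrative:

```python
import numpy as np

def partitioned_matvec(A, x, base=16, digits=4):
    """Digital-partitioning sketch: decompose vector entries into base-`base`
    digits, perform one low-precision matrix-vector product per digit
    (each standing in for an analog pass), then recombine with powers of
    the base."""
    x = np.asarray(x, dtype=np.int64)
    # Digits of x, least significant first; exact for 0 <= x < base**digits.
    parts = [(x // base**k) % base for k in range(digits)]
    partials = [A @ p for p in parts]            # low-precision products
    return sum(p * base**k for k, p in enumerate(partials))
```

Because matrix-vector multiplication is linear, the recombined result equals the full-precision product exactly, while each individual pass only needs `log2(base)`-bit operand accuracy.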
Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent
De Sa, Christopher; Feldman, Matthew; Ré, Christopher; Olukotun, Kunle
2018-01-01
Stochastic gradient descent (SGD) is one of the most popular numerical algorithms used in machine learning and other domains. Since this is likely to continue for the foreseeable future, it is important to study techniques that can make it run fast on parallel hardware. In this paper, we provide the first analysis of a technique called Buckwild! that uses both asynchronous execution and low-precision computation. We introduce the DMGC model, the first conceptualization of the parameter space that exists when implementing low-precision SGD, and show that it provides a way to both classify these algorithms and model their performance. We leverage this insight to propose and analyze techniques to improve the speed of low-precision SGD. First, we propose software optimizations that can increase throughput on existing CPUs by up to 11×. Second, we propose architectural changes, including a new cache technique we call an obstinate cache, that increase throughput beyond the limits of current-generation hardware. We also implement and analyze low-precision SGD on the FPGA, which is a promising alternative to the CPU for future SGD systems. PMID:29391770
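A common component of low-precision SGD schemes like the one analyzed above is an unbiased quantizer based on stochastic rounding. The following is an illustrative sketch of that primitive, not the Buckwild! code itself:

```python
import random

def stochastic_round(v, scale):
    """Round v to the fixed-point grid {k * scale} stochastically: round up
    with probability equal to the fractional position, so the quantized
    value is unbiased (E[q] = v). Standard trick in low-precision SGD."""
    scaled = v / scale
    floor_v = int(scaled // 1)          # floor (works for negative v too)
    frac = scaled - floor_v             # fractional part in [0, 1)
    return (floor_v + (1 if random.random() < frac else 0)) * scale
```

Unbiasedness is what lets low-precision updates accumulate without systematic drift, which is central to convergence arguments for such algorithms.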
Anti-jamming communication for body area network using chaotic frequency hopping.
Gopalakrishnan, Balamurugan; Bhagyaveni, Marcharla Anjaneyulu
2017-12-01
Reliable patient communication is a central research focus in the healthcare industry, and security is a paramount requirement of healthcare applications. Jamming has become a major research issue due to the ease of blocking communication in wireless networks and the resulting throughput degradation. The most commonly used technique to overcome jamming is frequency hopping (FH). However, traditional FH requires pre-sharing of a key for channel selection and incurs high overhead. To eliminate this key pre-sharing and to increase security, chaotic frequency hopping (CFH) has been proposed. The design of chaos-based hop selection is a new development that offers improved performance in the transmission of information without a pre-shared key and also increases security. The authors analysed the performance of the proposed CFH system under different reactive jamming durations. The percentage of error reduction under reactive jamming with jamming durations of 0.01 and 0.05 s for FH and CFH is 55.03 and 84.24%, respectively. The obtained results show that CFH is more secure and more difficult to jam by a reactive jammer.
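A chaos-based hop sequence can be illustrated with the logistic map: both transmitter and receiver iterate the same map from a shared initial condition, so no hop pattern needs to be exchanged in advance, while sensitivity to the initial condition makes the sequence hard for a jammer to predict. The map, parameter r, and channel count below are assumptions for illustration; the paper's exact chaotic generator is not specified here:

```python
def cfh_channels(seed, n_hops, n_channels=64, r=3.99):
    """Chaotic frequency-hopping sketch: iterate the logistic map
    x <- r*x*(1-x) in its chaotic regime and map each state in (0, 1)
    to a channel index. Illustrative, not the paper's scheme."""
    x = seed
    hops = []
    for _ in range(n_hops):
        x = r * x * (1.0 - x)          # logistic map, chaotic for r near 4
        hops.append(int(x * n_channels))
    return hops
```

Two ends seeded identically produce identical hop sequences, while even a tiny seed mismatch makes the sequences diverge after a few iterations, which is the basis of the security claim.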
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebert, Jon Llyod
This Small Business Innovative Research (SBIR) Phase I project will demonstrate the feasibility of an innovative temperature control technology for the Metal-Organic Chemical Vapor Deposition (MOCVD) process used in the fabrication of Multi-Quantum Well (MQW) LEDs. The proposed control technology has the strong potential to improve both throughput and performance quality of the manufactured LED. The color of the light emitted by an LED is a strong function of the substrate temperature during the deposition process. Hence, accurate temperature control of the MOCVD process is essential for ensuring that the LED performance matches the design specification. The Gallium Nitride (GaN) epitaxy process involves depositing multiple layers at different temperatures. Much of the recipe time is spent ramping from one process temperature to another, adding significant overhead to the production time. To increase throughput, the process temperature must transition over a range of several hundred degrees Centigrade many times with as little overshoot and undershoot as possible, in the face of several sources of process disturbance such as changing emissivities. Any throughput increase achieved by faster ramping must also satisfy the constraint of strict temperature uniformity across the carrier so that yield is not affected. SC Solutions is a leading supplier of embedded real-time temperature control technology for MOCVD systems used in LED manufacturing. SC’s Multiple Input Multiple Output (MIMO) temperature controllers use physics-based models to achieve the performance demanded by our customers. However, to meet DOE’s ambitious goals of cost reduction of LED products, a new generation of temperature controllers has to be developed.
SC believes that the proposed control technology will be made feasible by the confluence of mathematical formulation as a convex optimization problem, new efficient and scalable algorithms, and the increase in computational power available for real-time control.
High-throughput sample adaptive offset hardware architecture for high-efficiency video coding
NASA Astrophysics Data System (ADS)
Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin
2018-03-01
A high-throughput hardware architecture for the sample adaptive offset (SAO) filter in the High Efficiency Video Coding (HEVC) standard is presented. First, an implementation-friendly, simplified bitrate estimation method for rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, integrating both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meet the real-time requirement of in-loop filtering for the 8K × 4K video format at 132 fps.
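The band-offset side of SAO can be illustrated with a minimal sketch: samples are classified into 32 equal-width intensity bands, and each band's offset is the rounded mean reconstruction error. Real HEVC SAO additionally restricts signalling to four consecutive bands, clips offsets, and weighs the signalling bit cost, none of which is modelled here:

```python
def sao_band_offsets(orig, recon, bit_depth=8, num_bands=32):
    """Derive per-band offsets as the rounded mean (orig - recon) error."""
    shift = bit_depth - 5          # 8-bit samples -> 32 bands of width 8
    sums = [0] * num_bands
    counts = [0] * num_bands
    for o, r in zip(orig, recon):
        band = r >> shift          # classify by reconstructed intensity
        sums[band] += o - r
        counts[band] += 1
    return [round(s / c) if c else 0 for s, c in zip(sums, counts)]

def apply_sao(recon, offsets, bit_depth=8):
    """Add each sample's band offset to the reconstructed value."""
    shift = bit_depth - 5
    return [r + offsets[r >> shift] for r in recon]
```

Applying the derived offsets moves the reconstructed samples toward the originals, which is precisely the distortion term the mode decision trades against the estimated bitrate.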
Restructuring of the Aquatic Bacterial Community by Hydric Dynamics Associated with Superstorm Sandy
Ulrich, Nikea; Rosenberger, Abigail; Brislawn, Colin; Wright, Justin; Kessler, Collin; Toole, David; Solomon, Caroline; Strutt, Steven; McClure, Erin
2016-01-01
Bacterial community composition and longitudinal fluctuations were monitored in a riverine system during and after Superstorm Sandy to better characterize inter- and intracommunity responses to the disturbance associated with a 100-year storm event. High-throughput sequencing of the 16S rRNA gene was used to assess microbial community structure in water samples from Muddy Creek Run, a second-order stream in Huntingdon, PA, at 12 time points during the storm event (29 October to 3 November 2012) and under seasonally matched baseline conditions, and to track changes in bacterial community structure and divergence during and after the storm. Bacterial community dynamics were correlated to measured physicochemical parameters and fecal indicator bacteria (FIB) concentrations. Bioinformatics analyses of 2.1 million 16S rRNA gene sequences revealed a significant increase in bacterial diversity in samples taken during peak discharge of the storm. Beta-diversity analyses revealed longitudinal shifts in bacterial community structure. Successional changes were observed, in which Betaproteobacteria and Gammaproteobacteria decreased in 16S rRNA gene relative abundance, while the relative abundance of members of the Firmicutes increased. Furthermore, 16S rRNA gene sequences matching pathogenic bacteria, including strains of Legionella, Campylobacter, Arcobacter, and Helicobacter, as well as bacteria of fecal origin (e.g., Bacteroides), increased in abundance after peak discharge of the storm. This study revealed a significant restructuring of the in-stream bacterial community associated with the hydric dynamics of a storm event. IMPORTANCE In order to better understand the microbial risks associated with freshwater environments during a storm event, a more comprehensive understanding of the variations in aquatic bacterial diversity is warranted.
This study investigated the bacterial communities during and after Superstorm Sandy to provide fine time point resolution of dynamic changes in bacterial composition. This study adds to the current literature by revealing the variation in bacterial community structure during the course of a storm. This study employed high-throughput DNA sequencing, which generated a deep analysis of inter- and intracommunity responses during a significant storm event. This study has highlighted the utility of applying high-throughput sequencing for water quality monitoring purposes, as this approach enabled a more comprehensive investigation of the bacterial community structure. Altogether, these data suggest a drastic restructuring of the stream bacterial community during a storm event and highlight the potential of high-throughput sequencing approaches for assessing the microbiological quality of our environment. PMID:27060115
Ohyama, Tomoko; Jovanic, Tihana; Denisov, Gennady; Dang, Tam C.; Hoffmann, Dominik; Kerr, Rex A.; Zlatic, Marta
2013-01-01
All organisms react to noxious and mechanical stimuli but we still lack a complete understanding of cellular and molecular mechanisms by which somatosensory information is transformed into appropriate motor outputs. The small number of neurons and excellent genetic tools make Drosophila larva an especially tractable model system in which to address this problem. We developed high throughput assays with which we can simultaneously expose more than 1,000 larvae per man-hour to precisely timed noxious heat, vibration, air current, or optogenetic stimuli. Using this hardware in combination with custom software we characterized larval reactions to somatosensory stimuli in far greater detail than possible previously. Each stimulus evoked a distinctive escape strategy that consisted of multiple actions. The escape strategy was context-dependent. Using our system we confirmed that the nociceptive class IV multidendritic neurons were involved in the reactions to noxious heat. Chordotonal (ch) neurons were necessary for normal modulation of head casting, crawling and hunching, in response to mechanical stimuli. Consistent with this we observed increases in calcium transients in response to vibration in ch neurons. Optogenetic activation of ch neurons was sufficient to evoke head casting and crawling. These studies significantly increase our understanding of the functional roles of larval ch neurons. More generally, our system and the detailed description of wild type reactions to somatosensory stimuli provide a basis for systematic identification of neurons and genes underlying these behaviors. PMID:23977118
Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F
2010-06-01
The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits is sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.
Fredlake, Christopher P; Hert, Daniel G; Kan, Cheuk-Wai; Chiesl, Thomas N; Root, Brian E; Forster, Ryan E; Barron, Annelise E
2008-01-15
To realize the immense potential of large-scale genomic sequencing after the completion of the second human genome (Venter's), the costs for the complete sequencing of additional genomes must be dramatically reduced. Among the technologies being developed to reduce sequencing costs, microchip electrophoresis is the only new technology ready to produce the long reads most suitable for the de novo sequencing and assembly of large and complex genomes. Compared with the current paradigm of capillary electrophoresis, microchip systems promise to reduce sequencing costs dramatically by increasing throughput, reducing reagent consumption, and integrating the many steps of the sequencing pipeline onto a single platform. Although capillary-based systems require approximately 70 min to deliver approximately 650 bases of contiguous sequence, we report sequencing up to 600 bases in just 6.5 min by microchip electrophoresis with a unique polymer matrix/adsorbed polymer wall coating combination. This represents a two-thirds reduction in sequencing time over any previously published chip sequencing result, with comparable read length and sequence quality. We hypothesize that these ultrafast long reads on chips can be achieved because the combined polymer system engenders a recently discovered "hybrid" mechanism of DNA electromigration, in which DNA molecules alternate rapidly between reptating through the intact polymer network and disrupting network entanglements to drag polymers through the solution, similar to dsDNA dynamics we observe in single-molecule DNA imaging studies. Most importantly, these results reveal the surprisingly powerful ability of microchip electrophoresis to provide ultrafast Sanger sequencing, which will translate to increased system throughput and reduced costs.
Telemetry Options for LDB Payloads
NASA Technical Reports Server (NTRS)
Stilwell, Bryan D.; Field, Christopher J.
2016-01-01
The Columbia Scientific Balloon Facility provides Telemetry and Command systems necessary for balloon operations and science support. There are various Line-Of-Sight (LOS) and Over-The-Horizon (OTH) systems and interfaces that provide communications to and from a science payload. This presentation will discuss the current data throughput options available and future capabilities that may be incorporated in the LDB Support Instrumentation Package (SIP) such as doubling the TDRSS data rate. We will also explore some new technologies that could potentially expand the data throughput of OTH communications.
Modulation and coding for throughput-efficient optical free-space links
NASA Technical Reports Server (NTRS)
Georghiades, Costas N.
1993-01-01
Optical direct-detection systems are currently being considered for some high-speed inter-satellite links, where data rates of a few hundred megabits per second are envisioned under power and pulsewidth constraints. In this paper we investigate the capacity, cutoff-rate, and error-probability performance of uncoded and trellis-coded systems for various modulation schemes and under various throughput and power constraints. Modulation schemes considered are on-off keying (OOK), pulse-position modulation (PPM), overlapping PPM (OPPM), and multi-pulse (combinatorial) PPM (MPPM).
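The throughput constraint inherent in PPM can be made concrete with a minimal M-ary PPM mapper; the 0/1 slot-vector representation below is an assumption for clarity, not the paper's signalling model:

```python
import math

def ppm_encode(bits, M=16):
    """Map each group of log2(M) bits to a single pulse position.

    Each symbol occupies M slots with exactly one pulse, so throughput
    per slot is log2(M)/M bits: larger M saves power but costs rate.
    """
    k = int(math.log2(M))
    assert len(bits) % k == 0
    symbols = []
    for i in range(0, len(bits), k):
        pos = int("".join(map(str, bits[i:i + k])), 2)
        slot = [0] * M
        slot[pos] = 1               # one pulse per symbol frame
        symbols.append(slot)
    return symbols

def ppm_decode(symbols, M=16):
    """Recover the bit stream from the pulse positions."""
    k = int(math.log2(M))
    bits = []
    for slot in symbols:
        pos = slot.index(1)
        bits.extend(int(b) for b in format(pos, f"0{k}b"))
    return bits
```

For M = 16 a symbol carries 4 bits across 16 slots (0.25 bits per slot), illustrating why the paper trades modulation choice against throughput and power constraints.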
Schulthess, Pascal; van Wijk, Rob C; Krekels, Elke H J; Yates, James W T; Spaink, Herman P; van der Graaf, Piet H
2018-04-25
To advance the systems approach in pharmacology, experimental models and computational methods need to be integrated from early drug discovery onward. Here, we propose outside-in model development, a model identification technique to understand and predict the dynamics of a system without requiring prior biological and/or pharmacological knowledge. The advanced data required could be obtained by whole vertebrate, high-throughput, low-resource dose-exposure-effect experimentation with the zebrafish larva. Combinations of these innovative techniques could improve early drug discovery. © 2018 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
Recent progress in liquid crystal projection displays
NASA Astrophysics Data System (ADS)
Hamada, Hiroshi
1997-05-01
An LC projector usually contains three monochrome TFT-LCDs with a 3-channel dichroic system, or a single TFT-LCD with a micro color filter. The liquid crystal operation mode adopted in a TFT-LCD is TN. The optical throughput of an LC projector is reduced by the pair of polarizers, the aperture ratio of the TFT-LCD, and, in a single-LCD projector, the color filter. To eliminate the absorption loss of a color filter, a single-LCD projection system has been developed that consists of a monochrome LCD with a microlens array and a color-splitting system using tilted dichroic mirrors or another optical element such as a holographic optical element or a blazed grating. LC rear-projection TVs have also started to challenge CRT-based rear-projection TVs. In addition, new technologies to improve optical throughput, such as an active-matrix-addressed PDLC and a reflective-type LCD on a Si-LSI chip, have been developed to the practical stage. Merits and technical issues of the newly developed systems and of conventional systems, including a-Si TFT-LCDs and p-Si TFT-LCDs, are discussed mainly in terms of optical throughput.
An investigation into the impact of magnesium stearate on powder feeding during roller compaction.
Dawes, Jason; Gamble, John F; Greenwood, Richard; Robbins, Phil; Tobyn, Mike
2012-01-01
A systematic evaluation of the effect of magnesium stearate on the transmission of a placebo formulation from the hopper to the rolls during screw-fed roller compaction has been carried out. It is demonstrated that, for a system with two 'knurled' rollers, addition of 0.5% w/w magnesium stearate can lead to a significant increase in ribbon mass throughput, with a consequential increase in roll gap, compared to an unlubricated formulation (manufactured at equivalent process conditions). However, this effect is reduced if one of the rollers is smooth. Roller compaction of a lubricated formulation using two smooth rollers was found to be ineffective due to a reduction in friction at the powder/roll interface, i.e. powder was not drawn through the rollers, leading to a blockage in the feeding system. An increase in ribbon mass throughput could also be achieved if the equipment surfaces were pre-lubricated. However, this increase was found to be temporary, suggesting that the residual magnesium stearate layer was removed from the equipment surfaces. Powder sticking to the equipment surfaces, which is common during pharmaceutical manufacturing, was prevented if magnesium stearate was present either in the blend or at the roll surface. It is further demonstrated that the hopper stirrer, which is primarily used to prevent bridge formation in the hopper and help draw powder more evenly into the auger chamber, can lead to further mixing of the formulation and could therefore effect a change in the lubricity of the carefully blended input material.
High-throughput analysis of yeast replicative aging using a microfluidic system
Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong
2015-01-01
Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cells, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317
NASA Astrophysics Data System (ADS)
Rohde, Christopher B.; Zeng, Fei; Gilleland, Cody; Samara, Chrysanthi; Yanik, Mehmet F.
2009-02-01
In recent years, the advantages of using small invertebrate animals as model systems for human disease have become increasingly apparent and have resulted in three Nobel Prizes in medicine or chemistry during the last six years for studies conducted on the nematode Caenorhabditis elegans (C. elegans). The availability of a wide array of species-specific genetic techniques, along with the transparency of the worm and its ability to grow in minute volumes make C. elegans an extremely powerful model organism. We present a suite of technologies for complex high-throughput whole-animal genetic and drug screens. We demonstrate a high-speed microfluidic sorter that can isolate and immobilize C. elegans in a well-defined geometry, an integrated chip containing individually addressable screening chambers for incubation and exposure of individual animals to biochemical compounds, and a device for delivery of compound libraries in standard multiwell plates to microfluidic devices. The immobilization stability obtained by these devices is comparable to that of chemical anesthesia and the immobilization process does not affect lifespan, progeny production, or other aspects of animal health. The high-stability enables the use of a variety of key optical techniques. We use this to demonstrate femtosecond-laser nanosurgery and three-dimensional multiphoton microscopy. Used alone or in various combinations these devices facilitate a variety of high-throughput assays using whole animals, including mutagenesis and RNAi and drug screens at subcellular resolution, as well as high-throughput high-precision manipulations such as femtosecond-laser nanosurgery for large-scale in vivo neural degeneration and regeneration studies.
PCR cycles above routine numbers do not compromise high-throughput DNA barcoding results.
Vierna, J; Doña, J; Vizcaíno, A; Serrano, D; Jovani, R
2017-10-01
High-throughput DNA barcoding has become essential in ecology and evolution, but some technical questions still remain. Increasing the number of PCR cycles above the routine 20-30 cycles is a common practice when working with old-type specimens, which provide little amounts of DNA, or when facing annealing issues with the primers. However, increasing the number of cycles can raise the number of artificial mutations due to polymerase errors. In this work, we sequenced 20 COI libraries in the Illumina MiSeq platform. Libraries were prepared with 40, 45, 50, 55, and 60 PCR cycles from four individuals belonging to four species of four genera of cephalopods. We found no relationship between the number of PCR cycles and the number of mutations despite using a nonproofreading polymerase. Moreover, even when using a high number of PCR cycles, the resulting number of mutations was low enough not to be an issue in the context of high-throughput DNA barcoding (but may still remain an issue in DNA metabarcoding due to chimera formation). We conclude that the common practice of increasing the number of PCR cycles should not negatively impact the outcome of a high-throughput DNA barcoding study in terms of the occurrence of point mutations.
Development and Performance of the ACTS High Speed VSAT
NASA Technical Reports Server (NTRS)
Quintana, J.; Tran, Q.; Dendy, R.
1999-01-01
The Advanced Communication Technology Satellite (ACTS), developed by the U.S. National Aeronautics and Space Administration (NASA), has demonstrated the breakthrough technologies of Ka-band, spot beam antennas, and on-board processing. These technologies have enabled the development of very small aperture terminals (VSAT) and ultra-small aperture terminals (USAT) with capabilities greater than were previously possible with conventional satellite technologies. However, the ACTS baseband processor (BBP) is designed around a time division multiple access (TDMA) scheme, which requires each earth station using the BBP to transmit data at a burst rate much higher than the user throughput data rate. This tends to offset the advantage of the new technologies by requiring a larger earth station antenna and/or a higher-powered uplink amplifier than would be necessary for continuous transmission at the user data rate. Conversely, the user data rate is much less than the rate that can be supported by the antenna size and amplifier. For example, the ACTS T1 VSAT operates at a burst rate of 27.5 Mbps, but the maximum user data rate is 1.792 Mbps; the throughput efficiency is slightly more than 6.5%. For an operational network, this level of overhead will greatly increase the cost of the user earth stations, and that increased cost must be repeated thousands of times, which may ultimately reduce the market for such a system. The ACTS High Speed VSAT (HS VSAT) is an effort to experimentally demonstrate the maximum user throughput data rate that can be achieved using the technologies developed and implemented on ACTS. Specifically, this was done by operating the system uplinks as frequency division multiple access (FDMA), essentially assigning all available TDMA time slots to a single user on each of two uplink frequencies.
Preliminary results show that, using a 1.2-m antenna in this mode, the HS VSAT can achieve between 22 and 24 Mbps out of the 27.5 Mbps burst rate, for a throughput efficiency of 80-88%. This paper describes the modifications made to the T1 VSAT to enable it to operate at high speed, including hardware considerations, interface modifications, and software modifications. In addition, it describes the results of NASA HS VSAT experiments, continuing work on an improved user interface, and plans for future experiments.
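The efficiency figures quoted above follow from a one-line ratio of user rate to burst rate (rates in Mbps as reported in the paper):

```python
def throughput_efficiency(user_rate_mbps, burst_rate_mbps):
    """Fraction of the transmitted burst rate delivered as user throughput."""
    return user_rate_mbps / burst_rate_mbps

# TDMA mode: 1.792 Mbps user data out of a 27.5 Mbps burst
tdma_eff = throughput_efficiency(1.792, 27.5)   # just over 6.5%
# FDMA HS VSAT mode: 22-24 Mbps out of the same 27.5 Mbps burst
hs_low = throughput_efficiency(22.0, 27.5)      # 80%
hs_high = throughput_efficiency(24.0, 27.5)     # ~87%
```

The roughly thirteenfold efficiency gain comes purely from filling the burst channel continuously rather than bursting in assigned time slots.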
Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis.
Wen, Na; Zhao, Zhan; Fan, Beiyuan; Chen, Deyong; Men, Dong; Wang, Junbo; Chen, Jian
2016-07-05
This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.
NASA Astrophysics Data System (ADS)
Maher, Robert; Alvarado, Alex; Lavery, Domaniç; Bayvel, Polina
2016-02-01
Optical fibre underpins the global communications infrastructure and has experienced an astonishing evolution over the past four decades, with current commercial systems transmitting data rates in excess of 10 Tb/s over a single fibre core. The continuation of this dramatic growth in throughput has become constrained due to a power dependent nonlinear distortion arising from a phenomenon known as the Kerr effect. The mitigation of fibre nonlinearities is an area of intense research. However, even in the absence of nonlinear distortion, the practical limit on the transmission throughput of a single fibre core is dominated by the finite signal-to-noise ratio (SNR) afforded by current state-of-the-art coherent optical transceivers. Therefore, the key to maximising the number of information bits that can be reliably transmitted over a fibre channel hinges on the simultaneous optimisation of the modulation format and code rate, based on the SNR achieved at the receiver. In this work, we use an information theoretic approach based on the mutual information and the generalised mutual information to characterise a state-of-the-art dual polarisation m-ary quadrature amplitude modulation transceiver and subsequently apply this methodology to a 15-carrier super-channel to achieve the highest throughput (1.125 Tb/s) ever recorded using a single coherent receiver.
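Matching modulation and code rate to the achieved SNR rests on the mutual information of the constellation over the channel. A Monte Carlo estimate for a toy 4-QAM AWGN channel (a simplification, not the dual-polarisation m-QAM transceiver or generalised mutual information computation of the paper) can be sketched as:

```python
import math
import random

def qam4_mutual_info(snr_db, n=20000, seed=1):
    """Monte Carlo estimate of mutual information (bits/symbol)
    for equiprobable 4-QAM over a complex AWGN channel."""
    rng = random.Random(seed)
    const = [complex(i, q) for i in (-1, 1) for q in (-1, 1)]
    es = 2.0                           # average symbol energy of this grid
    n0 = es / 10 ** (snr_db / 10)      # total complex noise variance
    sigma = math.sqrt(n0 / 2)          # per-dimension standard deviation
    total = 0.0
    for _ in range(n):
        x = rng.choice(const)
        y = x + complex(rng.gauss(0, sigma), rng.gauss(0, sigma))
        lik = [math.exp(-abs(y - c) ** 2 / n0) for c in const]
        p_tx = math.exp(-abs(y - x) ** 2 / n0)
        # per-sample information: log2 of likelihood ratio vs. the mixture
        total += math.log2(p_tx / (sum(lik) / len(const)))
    return total / n
```

At high SNR the estimate saturates at log2(4) = 2 bits/symbol, and it falls as SNR drops, which is the curve a rate-adaptive transceiver operates along.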
Adaptation to high throughput batch chromatography enhances multivariate screening.
Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried
2015-09-01
High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
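The multivariate planning step behind the IBC approach can be sketched by enumerating factor combinations onto 96-well coordinates, so every well is an individual experiment. The factor names and levels below are hypothetical, not the authors' screen:

```python
from itertools import product

def plate_layout(loads, washes, elutions):
    """Assign one (load, wash, elution) combination per well of a 96-well plate."""
    combos = list(product(loads, washes, elutions))
    assert len(combos) <= 96, "too many combinations for one plate"
    rows, cols = "ABCDEFGH", range(1, 13)
    wells = [f"{r}{c}" for r in rows for c in cols]
    return dict(zip(wells, combos))

# hypothetical screen: 4 protein loads (g/L) x 4 wash pH x 6 elution salt (M)
layout = plate_layout([5, 10, 20, 40],
                      [5.0, 6.0, 7.0, 8.0],
                      [0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
```

Because load varies alongside wash and elution conditions, factor interactions are covered in a single plate rather than in follow-up experiments.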
Rapid high-throughput cloning and stable expression of antibodies in HEK293 cells.
Spidel, Jared L; Vaessen, Benjamin; Chan, Yin Yin; Grasso, Luigi; Kline, J Bradford
2016-12-01
Single-cell based amplification of immunoglobulin variable regions is a rapid and powerful technique for cloning antigen-specific monoclonal antibodies (mAbs) for purposes ranging from general laboratory reagents to therapeutic drugs. From the initial screening process involving small quantities of hundreds or thousands of mAbs through in vitro characterization and subsequent in vivo experiments requiring large quantities of only a few, having a robust system for generating mAbs from cloning through stable cell line generation is essential. A protocol was developed to decrease the time, cost, and effort required by traditional cloning and expression methods by eliminating bottlenecks in these processes. Removing the clonal selection steps from the cloning process using a highly efficient ligation-independent protocol and from the stable cell line process by utilizing bicistronic plasmids to generate stable semi-clonal cell pools facilitated an increased throughput of the entire process from plasmid assembly through transient transfections and selection of stable semi-clonal cell pools. Furthermore, the time required by a single individual to clone, express, and select stable cell pools in a high-throughput format was reduced from 4 to 6 months to only 4 to 6 weeks. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olama, Mohammed M; Matalgah, Mustafa M; Bobrek, Miljko
Traditional encryption techniques require packet overhead, produce processing time delay, and suffer from severe quality-of-service deterioration due to fades and interference in wireless channels. These issues considerably reduce the effective transmission data rate (throughput) in wireless communications, where data rate with limited bandwidth is the main constraint. In this paper, performance evaluation analyses are conducted for an integrated signaling-encryption mechanism that is secure and enables improved throughput and probability of bit error in wireless channels. This mechanism eliminates the drawbacks stated herein by encrypting only a small portion of an entire transmitted frame, while the rest is not subject to traditional encryption but goes through a signaling process (designed transformation) with the plaintext of the portion selected for encryption. We also propose to incorporate error correction coding solely on the small encrypted portion of the data to drastically improve the overall bit-error-rate performance while not noticeably increasing the required bit rate. We focus on validating the signaling-encryption mechanism utilizing Hamming and convolutional error correction coding by conducting an end-to-end system-level simulation-based study. The average probability of bit error and throughput of the encryption mechanism are evaluated over standard Gaussian and Rayleigh fading-type channels and compared to those of the conventional Advanced Encryption Standard (AES).
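A minimal sketch of the partial-protection idea: only a short header of each frame is encrypted (a toy XOR keystream stands in for the real cipher, and the signaling transformation is not modeled), and Hamming(7,4) coding is applied to that encrypted portion alone. All names and parameters here are illustrative assumptions, not the paper's scheme.

```python
def hamming74_encode(nibble):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(code):
    """Correct up to one bit error and return the 4 data bits."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3      # 1-based position of the bad bit
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

def protect_frame(bits, key, enc_len=8):
    """Encrypt (XOR keystream, a stand-in cipher) and Hamming-protect only
    the first enc_len bits; the remainder of the frame is sent unchanged."""
    head = [b ^ k for b, k in zip(bits[:enc_len], key)]
    coded = []
    for i in range(0, enc_len, 4):
        coded += hamming74_encode(head[i:i + 4])
    return coded + bits[enc_len:]
```

Because coding overhead is paid only on the small encrypted portion, the frame grows far less than if the whole payload were coded.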
Factors that determine the optimum dose for sub-20nm resist systems: DUV, EUV, and e-beam options
NASA Astrophysics Data System (ADS)
Preil, Moshe
2012-03-01
As EUV and e-beam direct write (EBDW) technologies move closer to insertion into pilot production, questions regarding cost effectiveness take on increasing importance. One of the most critical questions is determining the optimum dose which balances the requirements for cost-effective throughput vs. imaging performance. To date most of the dose requirements have been dictated by the hardware side of the industry. The exposure tool manufacturers have a vested interest in specifying the fastest resists possible in order to maximize the throughput even if it comes at the expense of optimum resist performance. This is especially true for both EUV and EBDW where source power is severely limited. We will explore the cost-benefit tradeoffs which drive the equipment side of the industry, and show how these considerations lead to the current throughput and dose requirements for volume production tools. We will then show how the resulting low doses may lead to shot noise problems and a resulting penalty in resist performance. By comparison to the history of 248 nm DUV resist development we will illustrate how setting unrealistic initial targets for resist dose may lead to unacceptable tradeoffs in resist performance and subsequently long delays in the development of production worthy resists.
High-Throughput Sequencing: A Roadmap Toward Community Ecology
Poisot, Timothée; Péquin, Bérangère; Gravel, Dominique
2013-01-01
High-throughput sequencing is becoming increasingly important in microbial ecology, yet it is surprisingly under-used to generate or test biogeographic hypotheses. In this contribution, we highlight how adding these methods to the ecologist toolbox will allow the detection of new patterns, and will help our understanding of the structure and dynamics of diversity. Starting with a review of ecological questions that can be addressed, we move on to the technical and analytical issues that will benefit from an increased collaboration between different disciplines. PMID:23610649
Shibata, Kazuhiro; Itoh, Masayoshi; Aizawa, Katsunori; Nagaoka, Sumiharu; Sasaki, Nobuya; Carninci, Piero; Konno, Hideaki; Akiyama, Junichi; Nishi, Katsuo; Kitsunai, Tokuji; Tashiro, Hideo; Itoh, Mari; Sumi, Noriko; Ishii, Yoshiyuki; Nakamura, Shin; Hazama, Makoto; Nishine, Tsutomu; Harada, Akira; Yamamoto, Rintaro; Matsumoto, Hiroyuki; Sakaguchi, Sumito; Ikegami, Takashi; Kashiwagi, Katsuya; Fujiwake, Syuji; Inoue, Kouji; Togawa, Yoshiyuki; Izawa, Masaki; Ohara, Eiji; Watahiki, Masanori; Yoneda, Yuko; Ishikawa, Tomokazu; Ozawa, Kaori; Tanaka, Takumi; Matsuura, Shuji; Kawai, Jun; Okazaki, Yasushi; Muramatsu, Masami; Inoue, Yorinao; Kira, Akira; Hayashizaki, Yoshihide
2000-01-01
The RIKEN high-throughput 384-format sequencing pipeline (RISA system) including a 384-multicapillary sequencer (the so-called RISA sequencer) was developed for the RIKEN mouse encyclopedia project. The RISA system consists of colony picking, template preparation, sequencing reaction, and the sequencing process. A novel high-throughput 384-format capillary sequencer system (RISA sequencer system) was developed for the sequencing process. This system consists of a 384-multicapillary auto sequencer (RISA sequencer), a 384-multicapillary array assembler (CAS), and a 384-multicapillary casting device. The RISA sequencer can simultaneously analyze 384 independent sequencing products. The optical system is a scanning system chosen after careful comparison with an image detection system for the simultaneous detection of the 384-capillary array. This scanning system can be used with any fluorescent-labeled sequencing reaction (chain termination reaction), including transcriptional sequencing based on RNA polymerase, which was originally developed by us, and cycle sequencing based on thermostable DNA polymerase. For long-read sequencing, 380 out of 384 sequences (99.2%) were successfully analyzed and the average read length, with more than 99% accuracy, was 654.4 bp. A single RISA sequencer can analyze 216 kb with >99% accuracy in 2.7 h (90 kb/h). For short-read sequencing to cluster the 3′ end and 5′ end sequencing by reading 350 bp, 384 samples can be analyzed in 1.5 h. We have also developed a RISA inoculator, RISA filtrator and densitometer, RISA plasmid preparator which can handle throughput of 40,000 samples in 17.5 h, and a high-throughput RISA thermal cycler which has four 384-well sites. The combination of these technologies allowed us to construct the RISA system consisting of 16 RISA sequencers, which can process 50,000 DNA samples per day. 
One haploid genome shotgun sequence of a higher organism, such as human, mouse, rat, domestic animals, and plants, can be revealed by seven RISA systems within one month. PMID:11076861
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...
Gene expression with ontologic enrichment and connectivity mapping tools is widely used to infer modes of action (MOA) for therapeutic drugs. Despite progress in high-throughput (HT) genomic systems, strategies suitable to identify industrial chemical MOA are needed. The L1000 is...
Efficient and accurate adverse outcome pathway (AOP) based high-throughput screening (HTS) methods use a systems biology based approach to computationally model in vitro cellular and molecular data for rapid chemical prioritization; however, not all HTS assays are grounded by rel...
Architectural design and simulation of a virtual memory
NASA Technical Reports Server (NTRS)
Kwok, G.; Chu, Y.
1971-01-01
Virtual memory is an imaginary main memory with a very large capacity which the programmer has at his disposal. It greatly contributes to the solution of the dynamic storage allocation problem. The architectural design of a virtual memory is presented which implements in hardware the idea of queuing and scheduling page requests to a paging drum in such a way that the access rate of the paging drum is increased many times. With the design, an increase of up to 16 times in page transfer rate is achievable when the virtual memory is heavily loaded. This in turn makes feasible a great increase in system throughput.
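The drum-queuing idea can be sketched with a toy rotational model (sector count, timings, and service policies are illustrative assumptions, not from the paper): with per-sector queues the head serves a request on nearly every sector pass, whereas FIFO service wastes, on average, about half a revolution seeking to each request's sector.

```python
import random

def drum_time(requests, sectors=16, sector_queues=True, seed=1):
    """Toy rotational-position model of a paging drum. Each request targets
    one sector and transfer takes one sector time. Returns total sector times."""
    random.seed(seed)
    targets = [random.randrange(sectors) for _ in range(requests)]
    if not sector_queues:
        # FIFO: rotate to each request's sector in arrival order, then transfer
        t, pos = 0, 0
        for s in targets:
            t += (s - pos) % sectors + 1
            pos = (s + 1) % sectors
        return t
    # Per-sector queues: every revolution serves one request from each
    # non-empty queue, so total time ~ sectors * deepest queue
    queues = [0] * sectors
    for s in targets:
        queues[s] += 1
    t = 0
    while any(queues):
        for s in range(sectors):
            t += 1
            if queues[s]:
                queues[s] -= 1
    return t
```

Under load, the sector-queued variant finishes the same request set in a small fraction of the FIFO time, which is the effect the hardware queuing design exploits.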
A continuous high-throughput bioparticle sorter based on 3D traveling-wave dielectrophoresis.
Cheng, I-Fang; Froude, Victoria E; Zhu, Yingxi; Chang, Hsueh-Chia; Chang, Hsien-Chang
2009-11-21
We present a high throughput (maximum flow rate approximately 10 microl/min or linear velocity approximately 3 mm/s) continuous bio-particle sorter based on 3D traveling-wave dielectrophoresis (twDEP) at an optimum AC frequency of 500 kHz. The high throughput sorting is achieved with a sustained twDEP particle force normal to the continuous through-flow, which is applied over the entire chip by a single 3D electrode array. The design allows continuous fractionation of micron-sized particles into different downstream sub-channels based on differences in their twDEP mobility on both sides of the cross-over. Conventional DEP is integrated upstream to focus the particles into a single levitated queue to allow twDEP sorting by mobility difference and to minimize sedimentation and field-induced lysis. The 3D electrode array design minimizes the offsetting effect of nDEP (negative DEP with particle force towards regions with weak fields) on twDEP such that both forces increase monotonically with voltage to further increase the throughput. Effective focusing and separation of red blood cells from debris-filled heterogeneous samples are demonstrated, as well as size-based separation of poly-dispersed liposome suspensions into two distinct bands at 2.3 to 4.6 microm and 1.5 to 2.7 microm, at the highest throughput recorded in hand-held chips of 6 microl/min.
Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.
Bühler, Jonas; von Lieres, Eric; Huber, Gregor J
2018-01-01
Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase in sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule, depending on the required statistical reliability of the data acquired in future experiments.
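The rotating scheme can be sketched as a simple round-robin schedule generator (function name, dwell time, and slot count are hypothetical, not the paper's optimized schedules): one detector cycles through the plants, leaving regular temporal gaps in every plant's time series.

```python
def rotating_schedule(n_plants, slots, dwell=5):
    """Round-robin schedule: at each dwell-minute slot the detector measures
    the next plant in rotation. Returns {plant: [measurement start times, min]}.
    Each plant's series has gaps of (n_plants - 1) * dwell minutes."""
    sched = {p: [] for p in range(n_plants)}
    for k in range(slots):
        sched[k % n_plants].append(k * dwell)
    return sched
```

With 4 plants sharing one detector the sample throughput quadruples, at the cost of interrupted series that only a mechanistic transport model (rather than generic time-series methods) can still fit.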
A 0.13-µm implementation of 5 Gb/s and 3-mW folded parallel architecture for AES algorithm
NASA Astrophysics Data System (ADS)
Rahimunnisa, K.; Karthigaikumar, P.; Kirubavathy, J.; Jayakumar, J.; Kumar, S. Suresh
2014-02-01
A new architecture for encrypting and decrypting confidential data using the Advanced Encryption Standard (AES) algorithm is presented in this article. The structure combines a folded structure with a parallel architecture to increase throughput. The whole architecture achieves high throughput with low power consumption. The proposed architecture is implemented in 0.13-µm complementary metal-oxide-semiconductor (CMOS) technology. The proposed structure is compared with existing structures, and the results show that it delivers higher throughput and lower power than existing designs.
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
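The abstract does not detail the partitioning function, but a common choice for partitioning a 3-d spatial index is a Z-order (Morton) curve; the sketch below, offered as an assumption for illustration only, interleaves coordinate bits and splits the resulting key range across cluster nodes so that spatially adjacent cubes tend to land together.

```python
def morton3(x, y, z, bits=10):
    """Interleave the bits of (x, y, z) into a Z-order (Morton) index."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

def assign_node(x, y, z, n_nodes, total_bits=30):
    """Partition the Morton index range evenly across nodes. Nearby cubes
    share long Morton-code prefixes, so they map to the same node."""
    return morton3(x, y, z) * n_nodes >> total_bits
```

Locality-preserving partitioning is what lets a cutout query for one brain region touch few nodes while writes and reads are still spread across the cluster.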
A compact imaging spectroscopic system for biomolecular detections on plasmonic chips.
Lo, Shu-Cheng; Lin, En-Hung; Wei, Pei-Kuen; Tsai, Wan-Shao
2016-10-17
In this study, we demonstrate a compact imaging spectroscopic system for high-throughput detection of biomolecular interactions on plasmonic chips, based on a curved grating as the key element of light diffraction and light focusing. Both the curved grating and the plasmonic chips are fabricated on flexible plastic substrates using a gas-assisted thermal-embossing method. A fiber-coupled broadband light source and a camera are included in the system. Spectral resolution within 1 nm is achieved in sensing environmental index solutions and protein bindings. The detected sensitivities of the plasmonic chip are comparable with a commercial spectrometer. An extra one-dimensional scanning stage enables high-throughput detection of protein binding on a designed plasmonic chip consisting of several nanoslit arrays with different periods. The detected resonance wavelengths match well with the grating equation under an air environment. Wavelength shifts between 1 and 9 nm are detected for antigens of various concentrations binding with antibodies. A simple, mass-producible and cost-effective method has been demonstrated with the imaging spectroscopic system for real-time, label-free, highly sensitive and high-throughput screening of biomolecular interactions.
A scientific operations plan for the NASA space telescope. [ground support systems, project planning]
NASA Technical Reports Server (NTRS)
West, D. K.; Costa, S. R.
1975-01-01
A ground system is described which is compatible with the operational requirements of the space telescope. The goal of the ground system is to minimize the cost of post launch operations without seriously compromising the quality and total throughput of space telescope science, or jeopardizing the safety of the space telescope in orbit. The resulting system is able to accomplish this goal through optimum use of existing and planned resources and institutional facilities. Cost is also reduced and efficiency in operation increased by drawing on existing experience in interfacing guest astronomers with spacecraft as well as mission control experience obtained in the operation of present astronomical spacecraft.
Network coding multiuser scheme for indoor visible light communications
NASA Astrophysics Data System (ADS)
Zhang, Jiankun; Dang, Anhong
2017-12-01
Visible light communication (VLC) is a unique alternative for indoor data transfer and is developing beyond point-to-point links. However, for realizing high-capacity networks, VLC faces challenges including the constrained bandwidth of the optical access point and random occlusion. A network coding scheme for VLC (NC-VLC) is proposed, with increased throughput and system robustness. Based on the Lambertian illumination model, the theoretical decoding failure probability of the multiuser NC-VLC system is derived, and the impact of the system parameters on the performance is analyzed. Experiments demonstrate the proposed scheme successfully in an indoor multiuser scenario. These results indicate that the NC-VLC system performs well under link loss and random occlusion.
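The throughput gain from network coding can be illustrated with the classic two-user XOR exchange (a deliberate simplification; the paper's NC-VLC scheme is more general): the access point broadcasts one coded packet instead of two separate transmissions, and each user cancels the packet it already holds.

```python
def xor_bytes(a, b):
    """Bytewise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

def broadcast_coded(p1, p2):
    """Access point sends a single coded packet p1 XOR p2 in place of two."""
    return xor_bytes(p1, p2)

def decode(coded, own_known):
    """A user holding the other user's packet recovers its own with one XOR."""
    return xor_bytes(coded, own_known)
```

One downlink slot now delivers useful data to both users, which is exactly where the throughput increase over uncoded transmission comes from.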
Industrial-scale spray layer-by-layer assembly for production of biomimetic photonic systems.
Krogman, K C; Cohen, R E; Hammond, P T; Rubner, M F; Wang, B N
2013-12-01
Layer-by-layer assembly is a powerful and flexible thin film process that has successfully reproduced biomimetic photonic systems such as structural colour. While most of the seminal work has been carried out using slow and ultimately unscalable immersion assembly, recent developments using spray layer-by-layer assembly provide a platform for addressing challenges to scale-up and manufacturability. A series of manufacturing systems has been developed to increase production throughput by orders of magnitude, making commercialized structural colour possible. Inspired by biomimetic photonic structures we developed and demonstrated a heat management system that relies on constructive reflection of near infrared radiation to bring about dramatic reductions in heat content.
Hultman, Charles Scott; Gilland, Wendell G; Weir, Samuel
2015-06-01
Inefficient patient throughput in a surgery practice can result in extended new patient backlogs, excessively long cycle times in the outpatient clinics, poor patient satisfaction, decreased physician productivity, and loss of potential revenue. This project assesses the efficacy of multiple throughput interventions in an academic, plastic surgery practice at a public university. We implemented a Patient Access and Efficiency (PAcE) initiative, funded and sponsored by our health care system, to improve patient throughput in the outpatient surgery clinic. Interventions included: (1) creation of a multidisciplinary team, led by a project redesign manager, that met weekly; (2) definition of goals, metrics, and target outcomes; (3) revision of clinic templates to reflect actual demand; (4) working down patient backlog through group visits; (5) booking new patients across entire practice; (6) assigning a physician's assistant to the preoperative clinic; and (7) designating a central scheduler to coordinate flow of information. Main outcome measures included: patient satisfaction using Press-Ganey surveys; complaints reported to patient relations; time to third available appointment; size of patient backlog; monthly clinic volumes with utilization rates and supply/demand curves; "chaos" rate (cancellations plus reschedules, divided by supply, within 48 hours of booked clinic date); patient cycle times with bottleneck analysis; physician productivity measured by work Relative Value Units (wRVUs); and downstream financial effects on billing, collection, accounts receivable (A/R), and payer mix. We collected, managed, and analyzed the data prospectively, comparing the pre-PAcE period (6 months) with the PAcE period (6 months). The PAcE initiative resulted in multiple improvements across the entire plastic surgery practice. Patient satisfaction increased only slightly from 88.5% to 90.0%, but the quarterly number of complaints notably declined from 17 to 9.
Time to third available new patient appointment dropped from 52 to 38 days, whereas the same metric for a preoperative appointment plunged from 46 to 16 days. The size of the new patient backlog fell from 169 to 110 patients, and total monthly clinic volume climbed from 574 to 766 patients. Our "chaos" rate dropped from 12.3% to 1.8%. Mean patient cycle time in the clinic decreased dramatically from 127 to 44 minutes. Mean monthly productivity for the practice increased from 2479 to 2702 RVUs. Although our collection rate did not change, days in A/R dropped from 66 to 57 days. Mean monthly charges increased from U.S. $535,213 to U.S. $583,193, and mean monthly collections improved from U.S. $181,967 to U.S. $210,987. Payer mix remained unchanged. Implementation of a PAcE initiative, focusing on outpatient clinic throughput, yields significant improvements in access to care, patient satisfaction as measured by complaints, physician productivity, and financial performance. An academic, university-based, plastic surgery practice can use throughput interventions to deliver timely care and to enhance financial viability.
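The "chaos" rate reported above follows directly from its stated definition (cancellations plus reschedules within 48 hours of the booked date, divided by supply); a small helper with hypothetical names:

```python
def chaos_rate(cancellations, reschedules, supply):
    """'Chaos' rate, in percent: appointments cancelled or rescheduled within
    48 hours of the booked clinic date, divided by appointment supply."""
    if supply <= 0:
        raise ValueError("supply must be positive")
    return 100.0 * (cancellations + reschedules) / supply
```

Tracking this ratio month over month is what allowed the practice to quantify the drop from 12.3% to 1.8% alongside the cycle-time and backlog metrics.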
Lin, Frank Yeong-Sung; Hsiao, Chiu-Han; Yen, Hong-Hsu; Hsieh, Yu-Jen
2013-01-01
One of the important applications in Wireless Sensor Networks (WSNs) is video surveillance, which includes the tasks of video data processing and transmission. Processing and transmission of image and video data in WSNs has attracted a lot of attention in recent years. This is known as Wireless Visual Sensor Networks (WVSNs). WVSNs are distributed intelligent systems for collecting image or video data with unique performance, complexity, and quality of service challenges. WVSNs consist of a large number of battery-powered and resource-constrained camera nodes. End-to-end delay is a very important Quality of Service (QoS) metric for video surveillance applications in WVSNs. How to meet stringent delay QoS in resource-constrained WVSNs is a challenging issue that requires novel distributed and collaborative routing strategies. This paper proposes a Near-Optimal Distributed QoS Constrained (NODQC) routing algorithm to achieve an end-to-end route with lower delay and higher throughput. A Lagrangian Relaxation (LR)-based routing metric that considers the “system perspective” and “user perspective” is proposed to determine near-optimal routing paths that satisfy end-to-end delay constraints with high system throughput. The empirical results show that the NODQC routing algorithm outperforms others in terms of higher system throughput with lower average end-to-end delay and delay jitter. This paper shows, for the first time, how to meet the delay QoS while achieving higher system throughput in stringently resource-constrained WVSNs.
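The heart of a Lagrangian Relaxation routing metric is folding the delay constraint into the link weight with a multiplier λ and running a shortest-path search on the relaxed cost c(e) + λ·d(e). The sketch below is a bare illustration of that trade-off (the topology, weights, and names are invented, and the full NODQC algorithm also iterates on λ and runs distributed):

```python
import heapq

def lagrangian_route(graph, src, dst, lam):
    """Dijkstra on the relaxed edge cost c + lam * d, where graph maps
    node -> list of (neighbor, cost, delay). Raising lam shifts the route
    toward lower end-to-end delay at the expense of raw cost."""
    best = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        w, u = heapq.heappop(pq)
        if u == dst:
            break
        if w > best.get(u, float("inf")):
            continue
        for v, c, d in graph.get(u, ()):
            nw = w + c + lam * d
            if nw < best.get(v, float("inf")):
                best[v] = nw
                prev[v] = u
                heapq.heappush(pq, (nw, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

With λ = 0 the cheapest (but slow) route wins; a positive λ penalizes delay until the delay-constrained route becomes optimal, which is how the relaxation steers path selection toward feasible routes.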
Towards High-Throughput, Simultaneous Characterization of Thermal and Thermoelectric Properties
NASA Astrophysics Data System (ADS)
Miers, Collier Stephen
The extension of thermoelectric generators to more general markets requires that the devices be affordable and practical (low $/Watt) to implement. A key challenge in this pursuit is the quick and accurate characterization of thermoelectric materials, which will allow researchers to tune and modify the material properties quickly. The goal of this thesis is to design and fabricate a high-throughput characterization system for the simultaneous characterization of thermal, electrical, and thermoelectric properties for device-scale material samples. The measurement methodology presented in this thesis combines a custom-designed measurement system created specifically for high-throughput testing with a novel device structure that permits simultaneous characterization of the material properties. The measurement system is based upon the 3ω method for thermal conductivity measurements, with the addition of electrodes and voltage probes to measure the electrical conductivity and Seebeck coefficient. A device designed and optimized to permit the rapid characterization of thermoelectric materials is also presented. This structure is optimized to ensure 1D heat transfer within the sample, thus permitting rapid data analysis and fitting using a MATLAB script. Verification of the thermal portion of the system is presented using fused silica and sapphire materials for benchmarking. The fused silica samples yielded a thermal conductivity of 1.21 W/(m K), while a thermal conductivity of 31.2 W/(m K) was measured for the sapphire samples. The device and measurement system designed and developed in this thesis provide insight and serve as a foundation for the development of high-throughput, simultaneous measurement platforms.
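For context, the widely used slope-method relation for the 3ω technique (a textbook expression, stated here as background rather than taken from the thesis) recovers thermal conductivity from the in-phase temperature oscillation of a line heater measured at two frequencies:

```python
import math

def kappa_3omega(power, length, f1, dT1, f2, dT2):
    """Slope-method estimate for the 3-omega technique:
        kappa = P * ln(f2 / f1) / (4 * pi * l * (dT1 - dT2))
    where P is heater power, l the heater length, and dT1, dT2 the in-phase
    temperature oscillation amplitudes at frequencies f1 < f2."""
    return power * math.log(f2 / f1) / (4 * math.pi * length * (dT1 - dT2))
```

Because the amplitude falls logarithmically with frequency, two well-separated frequency points suffice for a quick conductivity estimate, which is what makes the method attractive for high-throughput screening.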
Ramanathan, Ragu; Ghosal, Anima; Ramanathan, Lakshmi; Comstock, Kate; Shen, Helen; Ramanathan, Dil
2018-05-01
Evaluation of HPLC-high-resolution mass spectrometry (HPLC-HRMS) full scan with polarity switching for increasing the throughput of a human in vitro cocktail drug-drug interaction assay. Microsomal incubates were analyzed using a high-resolution, high-mass-accuracy Q-Exactive mass spectrometer to collect integrated qualitative and quantitative (qual/quant) data. Within the assay, the positive-to-negative polarity-switching HPLC-HRMS method allowed quantification of eight and two probe compounds in the positive and negative ionization modes, respectively, while monitoring for LOR and its metabolites. LOR inhibited CYP2C19 and showed higher activity for CYP2D6, CYP2E1 and CYP3A4. Overall, LC-HRMS-based nontargeted full-scan quantitation improved the throughput of the in vitro cocktail drug-drug interaction assay.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peeler, D.; Edwards, T.
High-level waste (HLW) throughput (i.e., the amount of waste processed per unit of time) is primarily a function of two critical parameters: waste loading (WL) and melt rate. For the Defense Waste Processing Facility (DWPF), increasing HLW throughput would significantly reduce the overall mission life cycle costs for the Department of Energy (DOE). Significant increases in waste throughput have been achieved at DWPF since initial radioactive operations began in 1996. Key technical and operational initiatives that supported increased waste throughput included improvements in facility attainment, the Chemical Processing Cell (CPC) flowsheet, process control models, and frit formulations. As a result of these key initiatives, DWPF increased WLs from a nominal 28% for Sludge Batch 2 (SB2) to approximately 34 to 38% for SB3 through SB6 while maintaining or slightly improving canister fill times. Although considerable improvements in waste throughput have been obtained, future contractual waste loading targets are nominally 40%, while canister production rates are also expected to increase (to a rate of 325 to 400 canisters per year). Although implementation of bubblers has made a positive impact on increasing melt rate for recent sludge batches targeting WLs in the mid-30s, higher WLs will ultimately make the feeds to DWPF more challenging to process. Savannah River Remediation (SRR) recently requested the Savannah River National Laboratory (SRNL) to perform a paper-study assessment using future sludge projections to evaluate whether the current Process Composition Control System (PCCS) algorithms would provide projected operating windows to allow future contractual WL targets to be met. More specifically, the objective of this study was to evaluate future sludge batch projections (based on Revision 16 of the HLW Systems Plan) with respect to projected operating windows using current PCCS models and associated constraints.
Based on the assessments, the waste loading interval over which a glass system (i.e., a projected sludge composition with a candidate frit) is predicted to be acceptable can be defined (i.e., the projected operating window), which will provide insight into the ability to meet future contractual WL obligations. In this study, future contractual WL obligations are assumed to be 40%, which is the goal after all flowsheet enhancements have been implemented to support DWPF operations. For a system to be considered acceptable, candidate frits must be identified that provide access to at least 40% WL while accounting for potential variation in the sludge resulting from differences in batch-to-batch transfers into the Sludge Receipt and Adjustment Tank (SRAT) and/or analytical uncertainties. In more general terms, this study will assess whether or not the current glass formulation strategy (based on the use of the Nominal and Variation Stage assessments) and current PCCS models will allow access to the compositional regions required to target higher WLs for future operations. Some of the key questions to be considered in this study include: (1) If higher WLs are attainable with current process control models, are the models valid in these compositional regions? If the higher-WL glass regions are outside current model development or validation ranges, is there existing data that could be used to demonstrate model applicability (or lack thereof)? If not, experimental data may be required to revise current models or serve as validation data with the existing models. (2) Are there compositional trends in frit space that are required by the PCCS models to obtain access to these higher WLs? If so, are there potential issues with the compositions of the associated frits (e.g., limitations on the B2O3 and/or Li2O concentrations) as they are compared to model development/validation ranges or to the term 'borosilicate' glass?
If limitations on the frit compositional range are realized, what is the impact of these restrictions on other glass properties, such as the ability to suppress nepheline formation or influence melt rate? The model-based assessments being performed make the assumption that the process control models are applicable over the glass compositional regions being evaluated. Although the glass compositional region of interest is ultimately defined by the specific frit, sludge, and WL interval used, there is no prescreening of these compositional regions with respect to the model development or validation ranges, which is consistent with current DWPF operations.
Microarray profiling of chemical-induced effects is being increasingly used in medium and high-throughput formats. In this study, we describe computational methods to identify molecular targets from whole-genome microarray data using as an example the estrogen receptor α (ERα), ...
Zhu, Bo; Mizoguchi, Takuro; Kojima, Takaaki; Nakano, Hideo
2015-01-01
The C1a isoenzyme of horseradish peroxidase (HRP) is an industrially important heme-containing enzyme that utilizes hydrogen peroxide to oxidize a wide variety of inorganic and organic compounds for practical applications, including synthesis of fine chemicals, medical diagnostics, and bioremediation. To develop an ultra-high-throughput screening system for HRP, we successfully produced active HRP in an Escherichia coli cell-free protein synthesis system by adding the disulfide bond isomerase DsbC and optimizing the concentrations of hemin and calcium ions and the temperature. The biosynthesized HRP was fused with a single-chain Cro (scCro) DNA-binding tag at its N-terminal and C-terminal sites. The addition of the scCro tag at both ends increased the solubility of the protein. Next, HRP and its fusion proteins were successfully synthesized in a water droplet emulsion by using hexadecane as the oil phase and SunSoft No. 818SK as the surfactant. HRP fusion proteins were displayed on microbeads attached with double-stranded DNA (containing the scCro binding sequence) via scCro-DNA interactions. The activities of the immobilized HRP fusion proteins were detected with a tyramide-based fluorogenic assay using flow cytometry. Moreover, a model microbead library containing wild-type hrp (WT) and inactive mutant (MUT) genes was screened using fluorescence-activated cell sorting, thus efficiently enriching the WT gene from the 1:100 (WT:MUT) library. The technique described here could serve as a novel platform for the ultra-high-throughput discovery of more useful HRP mutants and other heme-containing peroxidases. PMID:25993095
Adapting the γ-H2AX assay for automated processing in human lymphocytes. 1. Technological aspects.
Turner, Helen C; Brenner, David J; Chen, Youhua; Bertucci, Antonella; Zhang, Jian; Wang, Hongliang; Lyulko, Oleksandra V; Xu, Yanping; Shuryak, Igor; Schaefer, Julia; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y Lawrence; Amundson, Sally A; Garty, Guy
2011-03-01
The immunofluorescence-based detection of γ-H2AX is a reliable and sensitive method for quantitatively measuring DNA double-strand breaks (DSBs) in irradiated samples. Since H2AX phosphorylation is highly linear with radiation dose, this well-established biomarker is in current use in radiation biodosimetry. At the Center for High-Throughput Minimally Invasive Radiation Biodosimetry, we have developed a fully automated high-throughput system, the RABIT (Rapid Automated Biodosimetry Tool), that can be used to measure γ-H2AX yields from fingerstick-derived samples of blood. The RABIT workstation has been designed to fully automate the γ-H2AX immunocytochemical protocol, from the isolation of human blood lymphocytes in heparin-coated PVC capillaries to the immunolabeling of γ-H2AX protein and image acquisition to determine fluorescence yield. High throughput is achieved through the use of purpose-built robotics, lymphocyte handling in 96-well filter-bottomed plates, and high-speed imaging. The goal of the present study was to optimize and validate the performance of the RABIT system for the reproducible and quantitative detection of γ-H2AX total fluorescence in lymphocytes in a multiwell format. Validation of our biodosimetry platform was achieved by the linear detection of a dose-dependent increase in γ-H2AX fluorescence in peripheral blood samples irradiated ex vivo with γ rays over the range 0 to 8 Gy. This study demonstrates for the first time the optimization and use of our robotically based biodosimetry workstation to successfully quantify γ-H2AX total fluorescence in irradiated peripheral lymphocytes.
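The linear dose dependence that underlies this kind of biodosimetry can be sketched as a simple calibration fit: fluorescence yield is regressed against dose, and the fitted line is inverted to estimate dose from a measured yield. The code below is an illustrative sketch only; the fluorescence values are made up, not RABIT data.

```python
# Least-squares calibration of a linear dose-response curve, as assumed for
# gamma-H2AX fluorescence yield vs. dose over 0-8 Gy.
# The fluorescence values below are illustrative, not measured data.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

doses = [0, 1, 2, 4, 8]           # Gy
fluor = [10, 32, 51, 95, 180]     # arbitrary fluorescence units (illustrative)
slope, intercept = fit_linear(doses, fluor)

def dose_from_fluorescence(f):
    """Invert the calibration line to estimate absorbed dose (Gy)."""
    return (f - intercept) / slope
```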
Wetmore, Kelly M.; Price, Morgan N.; Waters, Robert J.; Lamson, Jacob S.; He, Jennifer; Hoover, Cindi A.; Blow, Matthew J.; Bristow, James; Butland, Gareth
2015-01-01
ABSTRACT Transposon mutagenesis with next-generation sequencing (TnSeq) is a powerful approach to annotate gene function in bacteria, but existing protocols for TnSeq require laborious preparation of every sample before sequencing. Thus, the existing protocols are not amenable to the throughput necessary to identify phenotypes and functions for the majority of genes in diverse bacteria. Here, we present a method, random bar code transposon-site sequencing (RB-TnSeq), which increases the throughput of mutant fitness profiling by incorporating random DNA bar codes into Tn5 and mariner transposons and by using bar code sequencing (BarSeq) to assay mutant fitness. RB-TnSeq can be used with any transposon, and TnSeq is performed once per organism instead of once per sample. Each BarSeq assay requires only a simple PCR, and 48 to 96 samples can be sequenced on one lane of an Illumina HiSeq system. We demonstrate the reproducibility and biological significance of RB-TnSeq with Escherichia coli, Phaeobacter inhibens, Pseudomonas stutzeri, Shewanella amazonensis, and Shewanella oneidensis. To demonstrate the increased throughput of RB-TnSeq, we performed 387 successful genome-wide mutant fitness assays representing 130 different bacterium-carbon source combinations and identified 5,196 genes with significant phenotypes across the five bacteria. In P. inhibens, we used our mutant fitness data to identify genes important for the utilization of diverse carbon substrates, including a putative d-mannose isomerase that is required for mannitol catabolism. RB-TnSeq will enable the cost-effective functional annotation of diverse bacteria using mutant fitness profiling. PMID:25968644
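The bar-code-based fitness readout described above boils down to comparing read counts per strain before and after growth. A minimal sketch, assuming a simple log-ratio with a pseudocount and median centring (the published RB-TnSeq pipeline uses a more elaborate normalization), with illustrative counts:

```python
import math

# Per-strain fitness from bar-code read counts: log2(after/before) with a
# pseudocount, then median-centred so a typical strain scores 0.
# Counts are illustrative; the published pipeline normalizes differently.

def strain_fitness(before, after, pseudo=1.0):
    raw = [math.log2((a + pseudo) / (b + pseudo))
           for b, a in zip(before, after)]
    med = sorted(raw)[len(raw) // 2]
    return [r - med for r in raw]

before_counts = [100, 100, 100, 100]
after_counts  = [100, 400, 25, 100]   # strain 1 enriched, strain 2 depleted
fit = strain_fitness(before_counts, after_counts)
```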
Design of an electron projection system with slider lenses and multiple beams
NASA Astrophysics Data System (ADS)
Moonen, Daniel; Leunissen, Peter L. H. A.; de Jager, Patrick W.; Kruit, Pieter; Bleeker, Arno J.; Van der Mast, Karel D.
2002-07-01
The commercial applicability of electron beam projection lithography systems may be limited at high resolution because of low throughput. The main limitations to throughput are: (i) Beam current. The Coulomb interaction between electrons results in image blur, so less beam current can be allowed at higher resolution, which increases the exposure time of the wafer. (ii) Exposure field size. Early attempts to improve throughput with 'full chip' electron beam projection systems failed because the systems suffered from large off-axis aberrations of the electron optics, which severely restricted the useful field size and increased the overhead time. A new type of projection optics is proposed in this paper to overcome both limits. A slider lens is proposed that allows an effective field much larger than the schemes proposed by SCALPEL and PREVAIL. The full width of the die can be exposed without mechanical scanning by sliding the beam through the slit-like bore of the lens. Locally, at the beam position, a 'round'-lens field is created with a combination of a rectangular magnetic field and quadrupoles positioned inside the lens. A die can then be exposed during a single mechanical scan, as in state-of-the-art light-optical tools. The total beam current can be increased without impact on the Coulomb-interaction blur by combining several beams in a single lithography system, provided the beams do not interfere with each other. Several optical layouts have been proposed that combine up to 5 beams in a projection system consisting of a doublet of slider lenses. This type of projection optics has a potential throughput of 50 WPH at 45 nm with a resist sensitivity of 6 μC/cm².
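The quoted resist sensitivity lets one do a back-of-envelope throughput estimate: the ideal exposure time is dose times exposed area divided by total beam current. Only the 6 μC/cm² sensitivity is taken from the abstract; the beam current and wafer area below are illustrative assumptions, and all stage and overhead time is ignored.

```python
# Ideal exposure time per wafer: t = dose * area / current.
# Only the 6 uC/cm^2 resist sensitivity comes from the abstract;
# area and current are illustrative assumptions; overheads are ignored.

def wafer_exposure_time_s(dose_uC_per_cm2, area_cm2, current_uA):
    """Exposure-limited time per wafer, in seconds."""
    return dose_uC_per_cm2 * area_cm2 / current_uA

dose = 6.0        # uC/cm^2 (resist sensitivity quoted above)
area = 300.0      # cm^2, roughly a 200 mm wafer (assumed)
current = 30.0    # uA total across the combined beams (assumed)

t_expose = wafer_exposure_time_s(dose, area, current)   # -> 60 s
wph_ideal = 3600.0 / t_expose                           # exposure-limited wafers/hour
```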
PChopper: high throughput peptide prediction for MRM/SRM transition design.
Afzal, Vackar; Huang, Jeffrey T-J; Atrih, Abdel; Crowther, Daniel J
2011-08-15
The use of selective reaction monitoring (SRM) based LC-MS/MS analysis for the quantification of phosphorylation stoichiometry has been rapidly increasing. At the same time, the number of sites that can be monitored in a single LC-MS/MS experiment is also increasing. The manual processes associated with running these experiments have highlighted the need for computational assistance to quickly design MRM/SRM candidates. PChopper has been developed to predict peptides that can be produced via enzymatic protein digest; this includes single enzyme digests, and combinations of enzymes. It also allows digests to be simulated in 'batch' mode and can combine information from these simulated digests to suggest the most appropriate enzyme(s) to use. PChopper also allows users to define the characteristic of their target peptides, and can automatically identify phosphorylation sites that may be of interest. Two application end points are available for interacting with the system; the first is a web based graphical tool, and the second is an API endpoint based on HTTP REST. Service oriented architecture was used to rapidly develop a system that can consume and expose several services. A graphical tool was built to provide an easy to follow workflow that allows scientists to quickly and easily identify the enzymes required to produce multiple peptides in parallel via enzymatic digests in a high throughput manner.
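The simulated digests PChopper performs can be illustrated with the classic trypsin rule (cleave after K or R, but not before P). This is a simplified sketch, not PChopper's actual algorithm, and the test sequence is arbitrary:

```python
import re

# Simplified in-silico trypsin digest: cleave after K or R unless the next
# residue is P. Not PChopper's actual algorithm; the sequence is arbitrary.

def trypsin_digest(sequence, missed_cleavages=0):
    """Return tryptic peptides, optionally with missed-cleavage products."""
    fragments = [f for f in re.split(r'(?<=[KR])(?!P)', sequence) if f]
    peptides = list(fragments)
    for m in range(1, missed_cleavages + 1):
        for i in range(len(fragments) - m):
            peptides.append(''.join(fragments[i:i + m + 1]))
    return peptides

peps = trypsin_digest("MKWVTFISLLLLFSSAYSRGVFRRDTHK")
```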
Beamforming transmission in IEEE 802.11ac under time-varying channels.
Yu, Heejung; Kim, Taejoon
2014-01-01
The IEEE 802.11ac wireless local area network (WLAN) standard has adopted beamforming (BF) schemes to improve spectral efficiency and throughput with multiple antennas. To design the transmit beam, a channel sounding process to feedback channel state information (CSI) is required. Due to sounding overhead, throughput increases with the amount of transmit data under static channels. Under practical channel conditions with mobility, however, the mismatch between the transmit beam and the channel at transmission time causes performance loss when transmission duration after channel sounding is too long. When the fading rate, payload size, and operating signal-to-noise ratio are given, the optimal transmission duration (i.e., packet length) can be determined to maximize throughput. The relationship between packet length and throughput is also investigated for single-user and multiuser BF modes.
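The packet-length trade-off can be sketched with a toy model: channel sounding costs a fixed overhead per packet, while beam-channel mismatch makes the usable rate decay with time since sounding. The exponential-decay rate model and every parameter below are illustrative assumptions, not the paper's channel model.

```python
import math

# Toy model: sounding costs fixed overhead T_s per packet, and the usable
# rate decays as R0 * exp(-t / T_coh) after sounding. R0, T_s, and T_coh
# are illustrative assumptions, not the paper's channel model.

def avg_throughput(T, T_s=0.5e-3, R0=100e6, T_coh=20e-3):
    """Mean goodput (bit/s) for a transmission of duration T after sounding."""
    if T <= 0:
        return 0.0
    bits = R0 * T_coh * (1.0 - math.exp(-T / T_coh))  # integral of R(t) over [0, T]
    return bits / (T + T_s)

# Sweep candidate packet durations and keep the best one.
candidates = [i * 0.2e-3 for i in range(1, 200)]
T_opt = max(candidates, key=avg_throughput)
```

Too short a packet wastes the sounding overhead; too long a packet transmits on a stale beam, so the sweep finds an interior optimum of a few milliseconds under these assumed numbers.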
Analysis of data throughput in communication between PLCs and HMI/SCADA systems
NASA Astrophysics Data System (ADS)
Mikolajek, Martin; Koziorek, Jiri
2016-09-01
This paper focuses on the analysis of data throughput in communication between PLCs and HMI/SCADA systems. The first part of the paper discusses basic problems in communication between PLC and HMI systems. The next part covers specific types of PLC-HMI communication requests, and for those cases discusses response time and data throughput. A subsequent section contains a practical part with various data exchanges between a Siemens PLC and an HMI. The communication options described in this article focus on using an OPC server for visualization software, a custom HMI system, and an application created using .NET technology. The last part of the article contains some communication solutions.
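Measurements of this kind reduce to timing a batch of read requests over whichever transport is in use. A minimal, transport-agnostic sketch (the read function is injected as a callable, so no specific PLC, OPC, or HMI API is assumed):

```python
import time

# Transport-agnostic throughput probe: time N read requests issued through
# an injected read_once() callable and report requests/s and bytes/s.
# No specific PLC, OPC, or HMI API is assumed.

def measure_throughput(read_once, n_requests=100):
    """read_once() -> bytes. Returns (requests_per_second, bytes_per_second)."""
    start = time.perf_counter()
    total_bytes = 0
    for _ in range(n_requests):
        total_bytes += len(read_once())
    elapsed = time.perf_counter() - start
    return n_requests / elapsed, total_bytes / elapsed

# Stand-in transport for demonstration: a fixed 64-byte payload per request.
rps, bps = measure_throughput(lambda: bytes(64), n_requests=50)
```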
From drug to protein: using yeast genetics for high-throughput target discovery.
Armour, Christopher D; Lum, Pek Yee
2005-02-01
The budding yeast Saccharomyces cerevisiae has long been an effective eukaryotic model system for understanding basic cellular processes. The genetic tractability and ease of manipulation in the laboratory make yeast well suited for large-scale chemical and genetic screens. Several recent studies describing the use of yeast genetics for high-throughput drug target identification are discussed in this review.
Automatic cassette to cassette radiant impulse processor
NASA Astrophysics Data System (ADS)
Sheets, Ronald E.
1985-01-01
Single-wafer rapid annealing using high-temperature isothermal processing has become increasingly popular in recent years. In addition to annealing, this process is also being investigated for silicide formation, passivation, glass reflow, and alloying. Regardless of the application, there is a strong need for automation in order to maintain process control, repeatability, cleanliness, and throughput. These requirements have been carefully addressed during the design and development of the Model 180 Radiant Impulse Processor, a fully automatic cassette-to-cassette wafer processing system. Process control and repeatability are maintained by a closed-loop optical pyrometer system that holds the wafer at the programmed temperature-time conditions. Programmed recipes containing up to 10 steps may be easily entered on the computer keyboard or loaded from a recipe library stored on a standard 5¼-inch floppy disk. Cold-wall heating chamber construction, a controlled environment (N₂, Ar, forming gas), and quartz wafer carriers prevent contamination of the wafer during high-temperature processing. Throughputs of 150-240 wafers per hour are achieved by quickly heating the wafer to temperature (450-1400°C) in 3-6 s with a high-intensity, uniform (±1%) radiant flux of 100 W/cm², a parallel wafer handling system, and a wafer cool-down stage.
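The quoted flux and heat-up times are consistent with a simple energy balance: the ideal (loss-free) time to raise a wafer's temperature is t = ρ·c_p·thickness·ΔT / flux. The sketch below uses textbook silicon constants and an assumed 675 μm wafer thickness; real heat-up takes longer because of radiative and convective losses.

```python
# Loss-free heat-up time for a silicon wafer under a fixed radiant flux:
# t = rho * c_p * thickness * dT / flux. Constants are textbook values for
# silicon; the 675 um thickness is an assumption. Real heat-up takes longer
# because of radiative and convective losses.

def heatup_time_s(dT_K, flux_W_cm2=100.0,
                  rho_g_cm3=2.33, cp_J_gK=0.7, thickness_cm=0.0675):
    """Ideal adiabatic heat-up time, in seconds."""
    energy_per_cm2 = rho_g_cm3 * cp_J_gK * thickness_cm * dT_K   # J/cm^2
    return energy_per_cm2 / flux_W_cm2

t_1000K = heatup_time_s(1000.0)   # ~1.1 s lower bound for a 1000 K rise
```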
A novel LTE scheduling algorithm for green technology in smart grid.
Hindia, Mohammad Nour; Reza, Ahmed Wasif; Noordin, Kamarul Ariffin; Chayon, Muhammad Hasibur Rashid
2015-01-01
Smart grid (SG) applications are being used nowadays to meet the demand of increasing power consumption. SG applications are considered a perfect solution for combining renewable energy resources and the electrical grid by creating a bidirectional communication channel between the two systems. In this paper, three SG applications applicable to renewable energy systems, namely distribution automation (DA), distributed energy system-storage (DER), and electric vehicles (EV), are investigated in order to study their suitability in a Long Term Evolution (LTE) network. To compensate for the weaknesses of existing scheduling algorithms, a novel bandwidth estimation and allocation technique and a new scheduling algorithm are proposed. The technique allocates available network resources based on application priority, whereas the algorithm makes scheduling decisions based on dynamic weighting factors over multiple criteria to satisfy quality-of-service demands (delay, past average throughput, and instantaneous transmission rate). Finally, the simulation results demonstrate that the proposed mechanism achieves higher throughput, lower delay, and lower packet loss rate for DA and DER, as well as providing a degree of service for EV. In terms of fairness, the proposed algorithm performs 3%, 7%, and 9% better than the exponential rule (EXP-Rule), modified largest weighted delay first (M-LWDF), and exponential/PF (EXP/PF), respectively.
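For context, the baseline schedulers mentioned above rank users with a per-transmission metric; the classic M-LWDF form combines head-of-line delay, instantaneous rate, and past average throughput. The sketch below follows that textbook form with an added per-application priority weight; it does not reproduce the paper's own dynamic weighting, and all numbers are illustrative.

```python
# Classic M-LWDF-style metric with a per-application priority weight:
# larger metric => schedule first. Does not reproduce the paper's own
# dynamic weighting; all numbers are illustrative.

def mlwdf_metric(hol_delay_s, inst_rate_bps, avg_thpt_bps, priority=1.0):
    """Priority * head-of-line delay * instantaneous rate / past throughput."""
    return priority * hol_delay_s * inst_rate_bps / max(avg_thpt_bps, 1.0)

apps = {
    "DA":  dict(hol_delay_s=0.020, inst_rate_bps=2e6, avg_thpt_bps=1e6, priority=3.0),
    "DER": dict(hol_delay_s=0.010, inst_rate_bps=4e6, avg_thpt_bps=2e6, priority=2.0),
    "EV":  dict(hol_delay_s=0.050, inst_rate_bps=1e6, avg_thpt_bps=3e6, priority=1.0),
}
scheduled = max(apps, key=lambda a: mlwdf_metric(**apps[a]))
```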
Applying Evolutionary Genetics to Developmental Toxicology and Risk Assessment
Leung, Maxwell C. K.; Procter, Andrew C.; Goldstone, Jared V.; Foox, Jonathan; DeSalle, Robert; Mattingly, Carolyn J.; Siddall, Mark E.; Timme-Laragy, Alicia R.
2018-01-01
Evolutionary thinking continues to challenge our views on health and disease. Yet, there is a communication gap between evolutionary biologists and toxicologists in recognizing the connections among developmental pathways, high-throughput screening, and birth defects in humans. To increase our capability in identifying potential developmental toxicants in humans, we propose to apply evolutionary genetics to improve the experimental design and data interpretation with various in vitro and whole-organism models. We review five molecular systems of stress response and update 18 consensual cell-cell signaling pathways that are the hallmark for early development, organogenesis, and differentiation; and revisit the principles of teratology in light of recent advances in high-throughput screening, big data techniques, and systems toxicology. Multiscale systems modeling plays an integral role in the evolutionary approach to cross-species extrapolation. Phylogenetic analysis and comparative bioinformatics are both valuable tools in identifying and validating the molecular initiating events that account for adverse developmental outcomes in humans. The discordance of susceptibility between test species and humans (ontogeny) reflects their differences in evolutionary history (phylogeny). This synthesis not only can lead to novel applications in developmental toxicity and risk assessment, but also can pave the way for applying an evo-devo perspective to the study of developmental origins of health and disease. PMID:28267574
Jiang, Hui; Jiang, Donglei; Shao, Jingdong; Sun, Xiulan; Wang, Jiasheng
2016-11-14
Due to the high toxicity of bacterial lipopolysaccharide (LPS), resulting in sepsis and septic shock, two major causes of death worldwide, significant effort is directed toward the development of specific trace-level LPS detection systems. Here, we report sensitive, user-friendly, high-throughput LPS detection in a 96-well microplate using a transcriptional biosensor system, based on 293/hTLR4A-MD2-CD14 cells that are transformed by a red fluorescent protein (mCherry) gene under the transcriptional control of an NF-κB response element. The recognition of LPS activates the biosensor cell, TLR4, and the co-receptor-induced NF-κB signaling pathway, which results in the expression of mCherry fluorescent protein. The novel cell-based biosensor detects LPS with specificity at low concentration. The cell-based biosensor was evaluated by testing LPS isolated from 14 bacteria. Of the tested bacteria, 13 isolated Enterobacteraceous LPSs with hexa-acylated structures were found to increase red fluorescence and one penta-acylated LPS from Pseudomonadaceae appeared less potent. The proposed biosensor has potential for use in the LPS detection in foodstuff and biological products, as well as bacteria identification, assisting the control of foodborne diseases.
Belval, Richard; Alamir, Ab; Corte, Christopher; DiValentino, Justin; Fernandes, James; Frerking, Stuart; Jenkins, Derek; Rogers, George; Sanville-Ross, Mary; Sledziona, Cindy; Taylor, Paul
2012-12-01
Boehringer Ingelheim's Automated Liquids Processing System (ALPS) in Ridgefield, Connecticut, was built to accommodate all compound solution-based operations following dissolution in neat DMSO. Process analysis resulted in the design of two nearly identical conveyor-based subsystems, each capable of executing 1400 × 384-well plate or punch tube replicates per batch. The two parallel subsystems can execute independently or be run as a unified system for more complex or higher-throughput processes. Primary ALPS functions include the creation of high-throughput screening plates, concentration-response plates, and reformatted master stock plates (e.g., 384-well plates from 96-well plates). Integrated operations include centrifugation, unsealing/piercing, broadcast diluent addition, barcode print/application, compound transfer/mix via disposable pipette tips, and plate sealing. Key ALPS features include instrument pooling for increased capacity or fail-over situations, programming constructs to associate one source plate with an array of replicate plates, and stacked collation of completed plates. Due to the hygroscopic nature of DMSO, ALPS was designed to operate within a 10% relative humidity environment. The activities described are the collaborative efforts that contributed to the specification, build, delivery, and acceptance testing between Boehringer Ingelheim Pharmaceuticals, Inc. and the automation integration vendor, Thermo Scientific Laboratory Automation (Burlington, ON, Canada).
Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.
Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin
2016-02-01
High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and are limited to low throughput, where only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, together with its theoretical foundation, that provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). It further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In a proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
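The phase-change half of such a measurement can be illustrated with a simple idea: during melting, the heating curve T(t) plateaus, so the melting temperature sits where the slope dT/dt is smallest. A minimal sketch on a synthetic (illustrative) trace:

```python
# Read a melting temperature off a time-domain heating trace: the melting
# plateau is where the slope dT/dt is smallest. The trace is synthetic and
# illustrative, not instrument data.

def melting_temperature(times, temps):
    """Temperature at the flattest point of the heating curve."""
    slopes = [(temps[i + 1] - temps[i]) / (times[i + 1] - times[i])
              for i in range(len(temps) - 1)]
    i_flat = min(range(len(slopes)), key=lambda i: abs(slopes[i]))
    return temps[i_flat]

ts = list(range(12))                                           # s
Ts = [20, 30, 40, 50, 60, 60.1, 60.2, 60.3, 70, 80, 90, 100]   # deg C
Tm = melting_temperature(ts, Ts)                               # ~60 C plateau
```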
Integrative Systems Biology for Data Driven Knowledge Discovery
Greene, Casey S.; Troyanskaya, Olga G.
2015-01-01
Integrative systems biology is an approach that brings together diverse high throughput experiments and databases to gain new insights into biological processes or systems at molecular through physiological levels. These approaches rely on diverse high-throughput experimental techniques that generate heterogeneous data by assaying varying aspects of complex biological processes. Computational approaches are necessary to provide an integrative view of these experimental results and enable data-driven knowledge discovery. Hypotheses generated from these approaches can direct definitive molecular experiments in a cost effective manner. Using integrative systems biology approaches, we can leverage existing biological knowledge and large-scale data to improve our understanding of yet unknown components of a system of interest and how its malfunction leads to disease. PMID:21044756
Characterization of Pleurotus ostreatus Biofilms by Using the Calgary Biofilm Device
Pesciaroli, Lorena; Petruccioli, Maurizio; Fedi, Stefano; Firrincieli, Andrea; Federici, Federico
2013-01-01
The adequacy of the Calgary biofilm device, often referred to as the MBEC system, as a high-throughput approach to the production and subsequent characterization of Pleurotus ostreatus biofilms was assessed. The hydroxyapatite-coating of pegs was necessary to enable biofilm attachment, and the standardization of vegetative inocula ensured a uniform distribution of P. ostreatus biofilms, which is necessary for high-throughput evaluations of several antimicrobials and exposure conditions. Scanning electron microscopy showed surface-associated growth, the occurrence of a complex aggregated growth organized in multilayers or hyphal bundles, and the encasement of hyphae within an extracellular matrix (ECM), the extent of which increased with time. Chemical analyses showed that biofilms differed from free-floating cultures for their higher contents of total sugars (TS) and ECM, with the latter being mainly composed of TS and, to a lesser extent, protein. Confocal laser scanning microscopy analysis of 4-day-old biofilms showed the presence of interspersed interstitial voids and water channels in the mycelial network, the density and compactness of which increased after a 7-day incubation, with the novel occurrence of ECM aggregates with an α-glucan moiety. In 4- and 7-day-old biofilms, tolerance to cadmium was increased by factors of 3.2 and 11.1, respectively, compared to coeval free-floating counterparts. PMID:23892744
Droplet-based microfluidic analysis and screening of single plant cells.
Yu, Ziyi; Boehm, Christian R; Hibberd, Julian M; Abell, Chris; Haseloff, Jim; Burgess, Steven J; Reyna-Llorens, Ivan
2018-01-01
Droplet-based microfluidics has been used to facilitate high-throughput analysis of individual prokaryote and mammalian cells. However, there is a scarcity of similar workflows applicable to rapid phenotyping of plant systems where phenotyping analyses typically are time-consuming and low-throughput. We report on-chip encapsulation and analysis of protoplasts isolated from the emergent plant model Marchantia polymorpha at processing rates of >100,000 cells per hour. We use our microfluidic system to quantify the stochastic properties of a heat-inducible promoter across a population of transgenic protoplasts to demonstrate its potential for assessing gene expression activity in response to environmental conditions. We further demonstrate on-chip sorting of droplets containing YFP-expressing protoplasts from wild type cells using dielectrophoresis force. This work opens the door to droplet-based microfluidic analysis of plant cells for applications ranging from high-throughput characterisation of DNA parts to single-cell genomics to selection of rare plant phenotypes.
History, applications, and challenges of immune repertoire research.
Liu, Xiao; Wu, Jinghua
2018-02-27
The diversity of T and B cells, in terms of their receptor sequences, is huge in the vertebrate immune system and provides broad protection against the vast diversity of pathogens. The immune repertoire is defined as the sum of the T cell receptors and B cell receptors (also named immunoglobulins) that make up the organism's adaptive immune system. Before the emergence of high-throughput sequencing, studies of the immune repertoire were limited by underdeveloped methodologies, since it was impossible to capture the whole picture with low-throughput tools. Massively parallel sequencing technology is perfectly suited to research on the immune repertoire. In this article, we review the history of immune repertoire studies in terms of technologies and research applications. In particular, we discuss several challenges in this field and highlight efforts to develop potential solutions in the era of high-throughput sequencing of the immune repertoire.
Wu, Fengfeng; Jin, Yamei; Li, Dandan; Zhou, Yuyi; Guo, Lunan; Zhang, Mengyue; Xu, Xueming; Yang, Na
2017-06-01
To improve the economic value of lignocellulosic biomasses, an innovative electrofluidic technology has been applied to the efficient hydrolysis of corncob. The system combines fluidic reactors with voltages induced via a magnetoelectric coupling effect. The excitation voltage had a positive impact on reducing sugar content (RSC), but increasing the voltage frequency over 400-700 Hz caused a slight decline in RSC. At higher temperatures (70-80°C), the electrical effect on hydrolysis was limited. Energy efficiency increased with the addition of metallic ions and a series of in-phase induced voltages to promote hydrolysis. In addition, a 4-series system with in-phase and reverse-phase induced voltages under synchronous magnetic flux had a significant influence on the RSC, with a maximum increase of 56%. High throughput could be achieved by increasing the number of series stages in a compact system. Electrofluid hydrolysis avoids electrochemical reactions, electrode corrosion, and sample contamination. Copyright © 2017 Elsevier Ltd. All rights reserved.
The growing impact of lyophilized cell-free protein expression systems
Hunt, J. Porter; Yang, Seung Ook; Wilding, Kristen M.; Bundy, Bradley C.
2017-01-01
ABSTRACT Recently reported shelf-stable, on-demand protein synthesis platforms are enabling new possibilities in biotherapeutics, biosensing, biocatalysis, and high throughput protein expression. Lyophilized cell-free protein expression systems not only overcome cold-storage limitations, but also enable stockpiling for on-demand synthesis and completely sterilize the protein synthesis platform. Recently reported high-yield synthesis of cytotoxic protein Onconase from lyophilized E. coli extract preparations demonstrates the utility of lyophilized cell-free protein expression and its potential for creating on-demand biotherapeutics, vaccines, biosensors, biocatalysts, and high throughput protein synthesis. PMID:27791452
Three applications of backscatter x-ray imaging technology to homeland defense
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2005-05-01
A brief review of backscatter x-ray imaging and a description of three systems currently applying it to homeland defense missions (BodySearch, ZBV and ZBP). These missions include detection of concealed weapons, explosives and contraband on personnel, in vehicles and large cargo containers. An overview of the x-ray imaging subsystems is provided as well as sample images from each system. Key features such as x-ray safety, throughput and detection are discussed. Recent trends in operational modes are described that facilitate 100% inspection at high throughput chokepoints.
A Domain Analysis Model for eIRB Systems: Addressing the Weak Link in Clinical Research Informatics
He, Shan; Narus, Scott P.; Facelli, Julio C.; Lau, Lee Min; Botkin, Jefferey R.; Hurdle, John F.
2014-01-01
Institutional Review Boards (IRBs) are a critical component of clinical research and can become a significant bottleneck due to the dramatic increase in both the volume and complexity of clinical research. Despite the interest in developing clinical research informatics (CRI) systems and supporting data standards to increase clinical research efficiency and interoperability, informatics research in the IRB domain has not attracted much attention in the scientific community. The lack of standardized and structured application forms across different IRBs causes inefficient and inconsistent proposal reviews and cumbersome workflows. These issues are even more prominent in multi-institutional clinical research, which is rapidly becoming the norm. This paper proposes and evaluates a domain analysis model for electronic IRB (eIRB) systems, paving the way for streamlined clinical research workflows via integration with other CRI systems and improved IRB application throughput via computer-assisted decision support. PMID:24929181
Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean
2016-10-01
To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measurement of multiple samples and conditions; thus, current automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, the cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required compared with single-sample automated cell counters. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
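The per-well arithmetic behind an AO/PI readout is simple enough to sketch: live and dead counts plus the imaged volume give concentration and viability. A minimal sketch; the function name and the 2 µL imaged volume are illustrative, not taken from the Celigo software:

```python
def cell_stats(live_count, dead_count, volume_ul):
    """Return (concentration in cells/mL, viability in %) from AO/PI counts.

    live_count: AO-positive, PI-negative cells (viable)
    dead_count: PI-positive cells (membrane-compromised)
    volume_ul:  imaged sample volume in microliters
    """
    total = live_count + dead_count
    concentration = total / (volume_ul * 1e-3)  # convert uL to mL
    viability = 100.0 * live_count / total if total else 0.0
    return concentration, viability

# Example: 1,900 live and 100 dead cells imaged in 2 uL
conc, viab = cell_stats(1900, 100, 2.0)  # 1.0e6 cells/mL, 95.0% viable
```

Normalizing assay data then amounts to scaling each well's signal by its measured concentration.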
Neurotechnology for intelligence analysts
NASA Astrophysics Data System (ADS)
Kruse, Amy A.; Boyd, Karen C.; Schulman, Joshua J.
2006-05-01
Geospatial Intelligence Analysts are currently faced with an enormous volume of imagery, only a fraction of which can be processed or reviewed in a timely operational manner. Computer-based target detection efforts have failed to yield the speed, flexibility and accuracy of the human visual system. Rather than focus solely on artificial systems, we hypothesize that the human visual system is still the best target detection apparatus currently in use, and with the addition of neuroscience-based measurement capabilities it can surpass the throughput of the unaided human severalfold. Using electroencephalography (EEG), Thorpe et al. [1] described a fast signal in the brain associated with the early detection of targets in static imagery using a Rapid Serial Visual Presentation (RSVP) paradigm. This finding suggests that it may be possible to extract target detection signals from complex imagery in real time utilizing non-invasive neurophysiological assessment tools. To transform this phenomenon into a capability for defense applications, the Defense Advanced Research Projects Agency (DARPA) is currently sponsoring an effort titled Neurotechnology for Intelligence Analysts (NIA). The vision of the NIA program is to revolutionize the way that analysts handle intelligence imagery, increasing both the throughput of imagery to the analyst and the overall accuracy of the assessments. Successful development of a neurobiologically based image triage system will enable image analysts to train more effectively and process imagery with greater speed and precision.
D. Lee Taylor; Michael G. Booth; Jack W. McFarland; Ian C. Herriott; Niall J. Lennon; Chad Nusbaum; Thomas G. Marr
2008-01-01
High throughput sequencing methods are widely used in analyses of microbial diversity but are generally applied to small numbers of samples, which precludes characterization of patterns of microbial diversity across space and time. We have designed a primer-tagging approach that allows pooling and subsequent sorting of numerous samples, which is directed to...
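A primer-tagged pool is sorted back into samples by matching each read's leading barcode. A minimal sketch of that demultiplexing step; the 4-base tags and sample names are invented for illustration, not the authors' primers:

```python
def demultiplex(reads, barcode_to_sample, bc_len=4):
    """Sort pooled reads into per-sample bins by their leading barcode tag."""
    bins = {sample: [] for sample in barcode_to_sample.values()}
    unassigned = []
    for read in reads:
        sample = barcode_to_sample.get(read[:bc_len])
        if sample is None:
            unassigned.append(read)       # tag unrecognized: keep for review
        else:
            bins[sample].append(read[bc_len:])  # strip the tag before storing
    return bins, unassigned

tags = {"ACGT": "soil_A", "TGCA": "soil_B"}
bins, rest = demultiplex(["ACGTGGGG", "TGCACCCC", "NNNNAAAA"], tags)
```

In practice a mismatch tolerance and quality filtering would be layered on top of this exact-match core.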
Optimization and high-throughput screening of antimicrobial peptides.
Blondelle, Sylvie E; Lohner, Karl
2010-01-01
While high-throughput screening is a well-established process for lead compound discovery in for-profit companies, it is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries, combined with easier and less expensive access to new technologies, has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple-target assays or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope for discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences, as well as the latest structural knowledge and optimization processes aimed at improving peptide selectivity.
A rocket-borne pulse-height analyzer for energetic particle measurements
NASA Technical Reports Server (NTRS)
Leung, W.; Smith, L. G.; Voss, H. D.
1979-01-01
The pulse-height analyzer basically resembles a time-sharing multiplexing data-acquisition system which acquires analog data (from energetic particle spectrometers) and converts them into digital code. The PHA simultaneously acquires pulse-height information from the analog signals of the four input channels and sequentially multiplexes the digitized data to a microprocessor. The PHA together with the microprocessor form an on-board real-time data-manipulation system. The system processes data obtained during the rocket flight and reduces the amount of data to be sent back to the ground station. Consequently the data-reduction process for the rocket experiments is speeded up. By using a time-sharing technique, the throughput rate of the microprocessor is increased. Moreover, data from several particle spectrometers are manipulated to share one information channel; consequently, the TM capacity is increased.
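The time-sharing scheme described, in which four digitized pulse-height channels are sequentially multiplexed onto one stream for the microprocessor, can be sketched as a round-robin merge of per-channel queues. The (channel, value) word format is an assumption for illustration:

```python
from collections import deque

def multiplex(channels):
    """Round-robin merge of per-channel pulse-height queues into a single
    (channel_id, value) stream, as a time-sharing multiplexer would."""
    queues = [deque(ch) for ch in channels]
    stream = []
    while any(queues):
        for ch_id, q in enumerate(queues):
            if q:  # empty channels are simply skipped this cycle
                stream.append((ch_id, q.popleft()))
    return stream

# Four spectrometer channels with unequal event rates
stream = multiplex([[10, 11], [20], [30, 31], []])
# -> [(0, 10), (1, 20), (2, 30), (0, 11), (2, 31)]
```

Tagging each word with its channel id is what lets several spectrometers share one telemetry channel, as the abstract notes.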
Networking Omic Data to Envisage Systems Biological Regulation.
Kalapanulak, Saowalak; Saithong, Treenut; Thammarongtham, Chinae
To understand how biological processes work, it is necessary to explore the systematic regulation governing their behavior. Besides driving the normal behavior of organisms, this systematic regulation evidently underlies both the temporal responses to surrounding environments (dynamics) and long-term phenotypic adaptation (evolution). The systematic regulation is, in effect, formulated from regulatory components that collaboratively work together as a network. In the drive to decipher this code of life, a spectrum of technologies has continuously been developed in the post-genomic era. With current advances, high-throughput sequencing technologies are tremendously powerful for facilitating genomics and systems biology studies in the attempt to understand systems regulation inside cells. The ability to explore relevant regulatory components, which infer the transcriptional and signaling regulation driving core cellular processes, is thus enhanced. This chapter reviews high-throughput sequencing technologies, including second- and third-generation sequencing technologies, which support the investigation of genomics and transcriptomics data. Utilization of these high-throughput data to form virtual networks of systems regulation is explained, particularly for transcriptional regulatory networks. Analysis of the resulting regulatory networks could lead to an understanding of cellular systems regulation at the mechanistic and dynamic levels. The great contribution of the biological networking approach to envisaging systems regulation is finally demonstrated by a broad range of examples.
Modeling and Simulation Reliable Spacecraft On-Board Computing
NASA Technical Reports Server (NTRS)
Park, Nohpill
1999-01-01
The proposed project will investigate modeling- and simulation-driven testing and fault tolerance schemes for spacecraft on-board computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within a wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space segment capacity, and distance-insensitive cost. To realize the capabilities mentioned above, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast, and cost-effective on-board computing system, which is known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay), and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before fault tolerance is employed. Testing and fault tolerance strategies should be driven by accurate performance models (i.e., throughput, delay, reliability, and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module, and a module for fault tolerance, all of which interact through a central graphical user interface.
Analysis of Container Yard Capacity In North TPK Using ARIMA Method
NASA Astrophysics Data System (ADS)
Sirajuddin; Cut Gebrina Hisbach, M.; Ekawati, Ratna; Ade Irman, SM
2018-03-01
The North container terminal, known as North TPK, is a container terminal located in the Indonesia Port Corporation area serving domestic container loading and unloading. It has 1006 ground slots with a total capacity of 5,544 TEUs, and its maximum container throughput is 539,616 TEUs/year. Container throughput at the North TPK is increasing year by year: in 2011-2012 it was 165,080 TEUs/year, and by 2015-2016 it had reached 213,147 TEUs/year. To avoid congestion and prevent possible losses in the future, this paper analyzes the flow of containers and the level of the Yard Occupation Ratio at the North TPK at Tanjung Priok Port. The method used is the Autoregressive Integrated Moving Average (ARIMA) model, which ignores independent variables entirely and forecasts from the past values of the series alone. The ARIMA results show that in 2016-2017 the total container throughput reached 234,006 TEUs/year with a field effectiveness of 43.4%, and in 2017-2018 it reached 249,417 TEUs/year with a field effectiveness of 46.2%.
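The field-effectiveness figures quoted are just forecast throughput divided by the 539,616 TEUs/year maximum capacity, and the simplest ARIMA family member, ARIMA(0,1,0) with drift, forecasts purely from past values. A sketch under that assumption; the drift model is illustrative and not necessarily the order the authors fitted:

```python
def drift_forecast(series, steps):
    """ARIMA(0,1,0)-with-drift forecast: last observed value plus the mean
    first difference, extrapolated `steps` periods ahead."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    drift = sum(diffs) / len(diffs)
    return series[-1] + drift * steps

def occupancy(throughput_teu, max_capacity_teu=539_616):
    """Yard occupancy as a fraction of the maximum annual throughput."""
    return throughput_teu / max_capacity_teu

# The abstract's 2017-2018 figure of 249,417 TEUs/year gives ~46.2%
ratio = occupancy(249_417)
```

Dividing the paper's forecast throughputs by the stated capacity reproduces its 43.4% and 46.2% effectiveness values.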
Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome
2014-04-25
In this paper we review the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of AutoDock4 [1], [2] to run a virtual high-throughput screen of one million compounds on the Jaguar Cray XK6 supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the pre-docking file preparation and post-docking analysis. A detailed tutorial, scripts, and source code for this MPI version of AutoDock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
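The task-parallel structure of such an MPI screen, where each rank docks a disjoint slice of the compound library, can be sketched without MPI itself; in the real code the rank and size would come from `MPI.COMM_WORLD`, and this round-robin partitioning is an illustrative assumption, not taken from the AutoDock MPI source:

```python
def my_compounds(n_compounds, rank, n_ranks):
    """Static round-robin assignment of compound indices to one rank."""
    return list(range(rank, n_compounds, n_ranks))

def all_covered(n_compounds, n_ranks):
    """Sanity check: every compound is docked exactly once across all ranks."""
    seen = sorted(i for r in range(n_ranks)
                  for i in my_compounds(n_compounds, r, n_ranks))
    return seen == list(range(n_compounds))

# Rank 0 of 4 ranks screening a 10-compound toy library
mine = my_compounds(10, 0, 4)  # [0, 4, 8]
```

With a million compounds, each of `n_ranks` workers dockets its slice independently, so throughput scales with the number of ranks until I/O dominates.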
Life in the fast lane: high-throughput chemistry for lead generation and optimisation.
Hunter, D
2001-01-01
The pharmaceutical industry has come under increasing pressure due to regulatory restrictions on the marketing and pricing of drugs, competition, and the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.
A Brief History of Airborne Self-Spacing Concepts
NASA Technical Reports Server (NTRS)
Abbott, Terence S.
2009-01-01
This paper presents a history of seven of the more significant airborne and airborne-assisted aircraft spacing concepts that have been developed and evaluated during the past 40 years. The primary focus of the earlier concepts was on enhancing airport terminal area productivity and reducing air traffic controller workload. The more recent efforts were designed to increase runway throughput through improved aircraft spacing precision at landing. The latest concepts are aimed at supporting more fuel efficient and lower community noise operations while maintaining or increasing runway throughput efficiency.
Opportunistic data locality for end user data analysis
NASA Astrophysics Data System (ADS)
Fischer, M.; Heidecker, C.; Kuehn, E.; Quast, G.; Giffels, M.; Schnepf, M.; Heiss, A.; Petzold, A.
2017-10-01
With the increasing data volume of LHC Run 2, user analyses are evolving towards higher data throughput. This evolution translates to higher requirements for efficiency and scalability of the underlying analysis infrastructure. We approach this issue with a new middleware to optimise data access: a layer of coordinated caches transparently provides data locality for high-throughput analyses. We demonstrated the feasibility of this approach with a prototype used for analyses of the CMS working groups at KIT. In this paper, we present our experience both with the approach in general and with our prototype in particular.
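A coordinated cache of this kind is, at its core, a read-through layer: hits are served locally while misses are fetched from remote storage and retained for later reads. A minimal sketch; the class name, stubbed remote fetch, and file path are illustrative, not from the KIT prototype:

```python
class ReadThroughCache:
    """Transparent cache layer: hits are served locally, misses are fetched
    from the backing store and kept so later reads gain data locality."""

    def __init__(self, fetch_remote):
        self._fetch = fetch_remote   # callable simulating remote storage
        self._store = {}
        self.hits = self.misses = 0

    def read(self, key):
        if key in self._store:
            self.hits += 1
        else:
            self.misses += 1
            self._store[key] = self._fetch(key)
        return self._store[key]

cache = ReadThroughCache(lambda path: f"data:{path}")
first = cache.read("/cms/file1.root")    # miss: fetched remotely
second = cache.read("/cms/file1.root")   # hit: served from the cache
```

The "coordinated" part of the real middleware lies in steering jobs to the worker that already holds their files, which this single-node sketch omits.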
Missile signal processing common computer architecture for rapid technology upgrade
NASA Astrophysics Data System (ADS)
Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul
2004-10-01
Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's-Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements.
This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Standardized development tools and third-party software upgrades are enabled, as well as rapid upgrades of processing components as improved algorithms are developed. The resulting weapon system will have superior processing capability over a custom approach at the time of deployment as a result of shorter development cycles and use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system, and can migrate between weapon system variants thanks to its modification simplicity. This paper presents a reference design using the new approach that utilizes an AltiVec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS) and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of an interceptor algorithm operating on this real-time platform are provided.
'PACLIMS': a component LIM system for high-throughput functional genomic analysis.
Donofrio, Nicole; Rajagopalon, Ravi; Brown, Douglas; Diener, Stephen; Windham, Donald; Nolin, Shelly; Floyd, Anna; Mitchell, Thomas; Galadima, Natalia; Tucker, Sara; Orbach, Marc J; Patel, Gayatri; Farman, Mark; Pampanwar, Vishal; Soderlund, Cari; Lee, Yong-Hwan; Dean, Ralph A
2005-04-12
Recent advances in sequencing techniques leading to cost reduction have resulted in the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotating an entire organism (in this application, the approximately 11,000 predicted genes of the rice blast fungus Magnaporthe grisea), an effective platform for tracking and storing both the biological materials created and the data produced across several participating institutions was required. The platform designed, named PACLIMS, was built to support our high-throughput pipeline for generating 50,000 random insertion mutants of Magnaporthe grisea. To be a useful tool for materials and data tracking and storage, PACLIMS was designed to be simple to use, modifiable to accommodate refinement of research protocols, and cost-efficient. Data entry into PACLIMS was simplified through the use of barcodes and scanners, thus reducing potential human error, time constraints, and labor. This platform was designed in concert with our experimental protocol so that it leads researchers through each step of the process, from mutant generation through phenotypic assays, ensuring that every mutant produced is handled in an identical manner and all necessary data are captured. Many sequenced eukaryotes have reached the point where computational analyses are no longer sufficient and require biological support for their predicted genes. Consequently, there is an increasing need for platforms that support high-throughput genome-wide mutational analyses. While PACLIMS was designed specifically for this project, the source and ideas present in its implementation can be used as a model for other high-throughput mutational endeavors.
PMID:15826298
IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.
Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis
2018-04-01
Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. By integrating the precision balance as an Internet of Things (IoT) device, it gains the ability to provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time or remotely for timely intervention of dispense errors. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
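The gravimetric check such a system performs reduces to differencing successive balance readings and flagging any dispense outside tolerance. A hedged sketch of that core logic; the 25 mg target and 0.5 mg tolerance are invented for illustration:

```python
def check_dispenses(balance_readings_mg, target_mg, tolerance_mg):
    """Turn successive balance readings into per-dispense weights and flag
    any dispense whose weight falls outside target +/- tolerance."""
    deltas = [b - a for a, b in zip(balance_readings_mg,
                                    balance_readings_mg[1:])]
    alerts = [i for i, d in enumerate(deltas)
              if abs(d - target_mg) > tolerance_mg]
    return deltas, alerts

# Three nominal 25 mg dispenses; the second one under-delivers
deltas, alerts = check_dispenses([0.0, 25.1, 40.0, 65.0],
                                 target_mg=25.0, tolerance_mg=0.5)
# deltas ~ [25.1, 14.9, 25.0]; alerts == [1]
```

In the described system these readings would arrive over the network from the Arduino-connected balance, and an alert would trigger timely operator intervention.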
Hunt, Rodney D.; Collins, Jack L.; Johnson, Jared A.; ...
2017-03-17
Hundreds of grams of calcined cerium dioxide (CeO2) microspheres were produced using the internal gelation process, with a focus on 75–150 µm and <75 µm diameter sizes. To achieve these small sizes, a modified internal gelation system was employed, which utilized a two-fluid nozzle, two static mixers for turbulent flow, and 2-ethyl-1-hexanol as the medium for gel formation at 333–338 K. This effort generated over 400 g of 75–150 µm and 300 g of <75 µm CeO2 microspheres. The typical product yields for the 75–150 µm and <75 µm microspheres that were collected and processed were 72 and 99%, respectively, with a typical throughput of 66–73 g of CeO2 microspheres per test, out of a maximum possible 78.6 g of CeO2. The higher yield of very small cerium spheres led to challenges and modifications, which are discussed in detail. Finally, as expected, when the <75 µm microspheres were targeted, losses to the system increased significantly.
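As a quick check of the throughput numbers, the quoted 66–73 g per test against the 78.6 g theoretical maximum corresponds to roughly 84–93% of theory; this is a back-of-the-envelope ratio, not a figure stated in the paper:

```python
def fraction_of_theory(actual_g, max_g=78.6):
    """Per-test throughput as a fraction of the theoretical maximum yield."""
    return actual_g / max_g

# The reported 66-73 g per test out of a possible 78.6 g
low, high = fraction_of_theory(66.0), fraction_of_theory(73.0)
```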
Machine vision for digital microfluidics
NASA Astrophysics Data System (ADS)
Shin, Yong-Jun; Lee, Jeong-Bong
2010-01-01
Machine vision is widely used in an industrial environment today. It can perform various tasks, such as inspecting and controlling production processes, that may require humanlike intelligence. The importance of imaging technology for biological research or medical diagnosis is greater than ever. For example, fluorescent reporter imaging enables scientists to study the dynamics of gene networks with high spatial and temporal resolution. Such high-throughput imaging is increasingly demanding the use of machine vision for real-time analysis and control. Digital microfluidics is a relatively new technology with expectations of becoming a true lab-on-a-chip platform. Utilizing digital microfluidics, only small amounts of biological samples are required and the experimental procedures can be automatically controlled. There is a strong need for the development of a digital microfluidics system integrated with machine vision for innovative biological research today. In this paper, we show how machine vision can be applied to digital microfluidics by demonstrating two applications: machine vision-based measurement of the kinetics of biomolecular interactions and machine vision-based droplet motion control. It is expected that digital microfluidics-based machine vision system will add intelligence and automation to high-throughput biological imaging in the future.
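The detection step of a vision-based droplet control loop can be sketched as intensity thresholding followed by a centroid computation; a real system would use a vision library on camera frames rather than this pure-Python toy grayscale grid:

```python
def droplet_centroid(frame, threshold):
    """Locate a droplet as the centroid of all above-threshold pixels.

    frame: 2-D list of grayscale intensities; returns (x, y) or None.
    """
    xs = ys = 0.0
    n = 0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v > threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # no droplet in the field of view
    return xs / n, ys / n

frame = [[0, 0, 0, 0],
         [0, 9, 9, 0],
         [0, 9, 9, 0],
         [0, 0, 0, 0]]
cx, cy = droplet_centroid(frame, threshold=5)  # (1.5, 1.5)
```

Closed-loop control then compares this measured position with the target electrode position and actuates the next electrode accordingly.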
Computational Modeling of Human Metabolism and Its Application to Systems Biomedicine.
Aurich, Maike K; Thiele, Ines
2016-01-01
Modern high-throughput techniques offer immense opportunities to investigate whole-systems behavior, such as those underlying human diseases. However, the complexity of the data presents challenges in interpretation, and new avenues are needed to address the complexity of both diseases and data. Constraint-based modeling is one formalism applied in systems biology. It relies on a genome-scale reconstruction that captures extensive biochemical knowledge regarding an organism. The human genome-scale metabolic reconstruction is increasingly used to understand normal cellular and disease states because metabolism is an important factor in many human diseases. The application of human genome-scale reconstruction ranges from mere querying of the model as a knowledge base to studies that take advantage of the model's topology and, most notably, to functional predictions based on cell- and condition-specific metabolic models built based on omics data. An increasing number and diversity of biomedical questions are being addressed using constraint-based modeling and metabolic models. One of the most successful biomedical applications to date is cancer metabolism, but constraint-based modeling also holds great potential for inborn errors of metabolism or obesity. In addition, it offers great prospects for individualized approaches to diagnostics and the design of disease prevention and intervention strategies. Metabolic models support this endeavor by providing easy access to complex high-throughput datasets. Personalized metabolic models have been introduced. Finally, constraint-based modeling can be used to model whole-body metabolism, which will enable the elucidation of metabolic interactions between organs and disturbances of these interactions as either causes or consequences of metabolic diseases. This chapter introduces constraint-based modeling and describes some of its contributions to systems biomedicine.
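Constraint-based models find steady-state flux distributions under stoichiometric and capacity constraints. For an unbranched pathway, mass balance forces every reaction to carry the same flux, so the linear program collapses to the bottleneck minimum; a toy sketch of that special case, with invented reaction names and bounds (general networks need an LP solver):

```python
def max_pathway_flux(upper_bounds):
    """Maximum steady-state flux through a linear (unbranched) pathway.

    Mass balance forces all reactions in the chain to carry equal flux,
    so the optimum is simply the tightest capacity bound.
    """
    return min(upper_bounds)

# Toy chain: uptake -> conversion -> secretion, capacities in mmol/gDW/h
bounds = {"uptake": 10.0, "conversion": 6.5, "secretion": 8.0}
flux = max_pathway_flux(bounds.values())  # limited by the 6.5 conversion step
```

Cell- or condition-specific models work the same way at scale: omics data tighten individual bounds, and the flux optimum shifts accordingly.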
Lu, Qin; Yi, Jing; Yang, Dianhai
2016-01-01
High-solid anaerobic digestion of sewage sludge achieves more efficient volatile solid reduction and greater production of volatile fatty acids (VFAs) and methane than conventional low-solid anaerobic digestion. In this study, the potential mechanisms behind the better performance of high-solid anaerobic digestion of sewage sludge were investigated by using 454 high-throughput pyrosequencing and real-time PCR to analyze the microbial characteristics in sewage sludge fermentation reactors. The 454 high-throughput pyrosequencing results revealed that the phyla Chloroflexi, Bacteroidetes, and Firmicutes were the dominant functional microorganisms in both high-solid and low-solid anaerobic systems. Meanwhile, the real-time PCR assays showed that high-solid anaerobic digestion significantly increased the number of total bacteria, which enhanced the hydrolysis and acidification of sewage sludge. Further study indicated that the number of total archaea (dominated by Methanosarcina) in the high-solid anaerobic fermentation reactor was also higher than that in the low-solid reactor, resulting in higher VFA consumption and methane production. Hence, the increase in key bacteria and methanogenic archaea involved in sewage sludge hydrolysis, acidification, and methanogenesis resulted in the better performance of high-solid anaerobic sewage sludge fermentation.
Mobility for GCSS-MC through virtual PCs
2017-06-01
their productivity. Mobile device access to GCSS-MC would allow Marines to access a required program for their mission using a form of computing ...network throughput applications with a device running on various operating systems with limited computational ability. The use of VPCs leads to a...reduced need for network throughput and faster overall execution. 14. SUBJECT TERMS GCSS-MC, enterprise resource planning, virtual personal computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cresap, D.A.; Halverson, D.S.
In the Fluorinel Dissolution Process (FDP) upgrade, excess hydrofluoric acid in the dissolver product must be complexed with aluminum nitrate (ANN) to eliminate corrosion concerns, adjusted with nitrate to facilitate extraction, and diluted with water to ensure solution stability. This is currently accomplished via batch processing in large vessels. However, to accommodate increases in projected throughput and reduce water production in a cost-effective manner, a semi-continuous system (In-line Complexing (ILC)) has been developed. The major conclusions drawn from tests demonstrating the feasibility of this concept are given in this report.
1992-10-01
Manual CI APPENDIX D: Drawing Navigator Field Test ...methods replace manual methods, the automation will handle the data for the designer, thus reducing error and increasing throughput. However, the two...actively move data from one automation tool (CADD) to the other (the analysis program). This intervention involves a manual rekeying of data already in
Kang, Lifeng; Chung, Bong Geun; Langer, Robert; Khademhosseini, Ali
2009-01-01
Microfluidic technologies’ ability to miniaturize assays and increase experimental throughput has generated significant interest in the drug discovery and development domain. These characteristics make microfluidic systems a potentially valuable tool for many drug discovery and development applications. Here, we review recent advances in microfluidic devices for drug discovery and development and highlight their applications at different stages of the process, including target selection, lead identification, preclinical tests, clinical trials, chemical synthesis, formulation studies, and product management. PMID:18190858
NASA Astrophysics Data System (ADS)
Yan, Yunxiang; Wang, Gang; Sun, Weimin; Luo, A.-Li; Ma, Zhenyu; Li, Jian; Wang, Shuqing
2017-04-01
Focal ratio degradation (FRD) is a major source of light loss, and hence reduced throughput, in a fibre spectroscopic telescope system. We combine the guided-mode theory of geometric optics with a well-known model, the power distribution model (PDM), to predict and explain the FRD dependence properties. We present a robust method, modifying the energy distribution method with an f-intercept, to control the input condition. This method provides a way to determine the proper position of the fibre end on the focal plane to improve energy utilization and FRD performance, raising the relative throughput to 95 per cent with variation of the output focal ratio of less than 2 per cent. This method can also help to optimize the arrangement of the focal-plane plate to enhance the coupling efficiency in a telescope. To investigate length properties, we modified the PDM by introducing a new parameter, the focal distance f, into the original model to make it applicable to a multiposition measurement system. The results show that the modified model is robust and feasible for measuring the key parameter d0 to simulate the transmission characteristics. The output focal ratio in the experiment does not follow the predicted trend but shows an interesting phenomenon: it first increases to a peak, then decreases, and finally remains stable as the fibre length increases beyond 15 m. This provides a reference for choosing the appropriate fibre length to improve the FRD performance when designing the fibre system of a telescope.
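The geometric relation behind FRD can be sketched numerically: a focal ratio F corresponds to a far-field cone half-angle θ via F = 1/(2·tan θ), so any scattering that broadens θ at the fibre output lowers the output focal ratio below the input one. The input F/5 beam and the 1° angular broadening below are assumed illustration values, not measurements from this paper.

```python
import math

def focal_ratio(theta_rad):
    """Focal ratio from the far-field cone half-angle: F = 1/(2 tan θ)."""
    return 1.0 / (2.0 * math.tan(theta_rad))

f_in = 5.0                                  # input focal ratio (assumed)
theta_in = math.atan(1.0 / (2.0 * f_in))    # input cone half-angle
theta_out = theta_in + math.radians(1.0)    # FRD broadens the cone (assumed 1°)
f_out = focal_ratio(theta_out)

# The output beam is "faster" (smaller F) than the input: light spills
# outside the spectrograph's acceptance cone, degrading throughput.
print(round(f_out, 2))
```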
High-throughput measurement of rice tillers using a conveyor equipped with x-ray computed tomography
NASA Astrophysics Data System (ADS)
Yang, Wanneng; Xu, Xiaochun; Duan, Lingfeng; Luo, Qingming; Chen, Shangbin; Zeng, Shaoqun; Liu, Qian
2011-02-01
Tillering is one of the most important agronomic traits because the number of shoots per plant determines panicle number, a key component of grain yield. The conventional method of counting tillers is still manual. In mass measurements, accuracy and efficiency gradually degrade as even experienced staff fatigue. Thus, manual measurement, including counting and recording, is not only time consuming but also lacks objectivity. To automate this process, we developed a high-throughput facility, dubbed high-throughput system for measuring automatically rice tillers (H-SMART), for measuring rice tillers based on a conventional x-ray computed tomography (CT) system and an industrial conveyor. Each pot-grown rice plant was delivered into the CT system for scanning via the conveyor equipment. A filtered back-projection algorithm was used to reconstruct the transverse section image of the rice culms. The number of tillers was then automatically extracted by image segmentation. To evaluate the accuracy of this system, three batches of rice at different growth stages (tillering, heading, or filling) were tested, yielding mean absolute errors of 0.22, 0.36, and 0.36, respectively. Subsequently, the complete machine was used under industry conditions to estimate its efficiency, which was 4320 pots per continuous 24 h workday. Thus, the H-SMART could determine the number of tillers of pot-grown rice plants, providing three advantages over the manual tillering method: absence of human disturbance, automation, and high throughput. This facility expands the application of agricultural photonics in plant phenomics.
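The segmentation step described above reduces, at its core, to counting disjoint bright regions (culm cross-sections) in the reconstructed transverse slice. A minimal sketch of that counting step, using a synthetic binary "image" and plain connected-component labeling (the paper's actual segmentation pipeline is not specified at this level of detail):

```python
# Synthetic 8x8 binary slice: 1 = culm material after thresholding the
# CT reconstruction, 0 = background. Four separate culm regions.
img = [
    [0,1,1,0,0,0,0,0],
    [0,1,1,0,0,1,1,0],
    [0,0,0,0,0,1,1,0],
    [0,0,0,0,0,0,0,0],
    [1,1,0,0,0,0,0,0],
    [1,1,0,0,0,1,0,0],
    [0,0,0,0,0,1,1,0],
    [0,0,0,0,0,0,0,0],
]

def count_components(grid):
    """Count 4-connected foreground components via iterative flood fill."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                     # new culm region found
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and grid[y][x] and not seen[y][x]):
                        seen[y][x] = True
                        stack += [(y+1, x), (y-1, x), (y, x+1), (y, x-1)]
    return count

print(count_components(img))  # number of distinct culm cross-sections
```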
A Next-Generation Hard X-Ray Nanoprobe Beamline for In Situ Studies of Energy Materials and Devices
NASA Astrophysics Data System (ADS)
Maser, Jörg; Lai, Barry; Buonassisi, Tonio; Cai, Zhonghou; Chen, Si; Finney, Lydia; Gleber, Sophie-Charlotte; Jacobsen, Chris; Preissner, Curt; Roehrig, Chris; Rose, Volker; Shu, Deming; Vine, David; Vogt, Stefan
2014-01-01
The Advanced Photon Source is developing a suite of new X-ray beamlines to study materials and devices across many length scales and under real conditions. One of the flagship beamlines of the APS upgrade is the In Situ Nanoprobe (ISN) beamline, which will provide in situ and operando characterization of advanced energy materials and devices under varying temperatures, gas ambients, and applied fields, at previously unavailable spatial resolution and throughput. Examples of materials systems include inorganic and organic photovoltaic systems, advanced battery systems, fuel cell components, nanoelectronic devices, advanced building materials and other scientifically and technologically relevant systems. To characterize these systems at very high spatial resolution and trace sensitivity, the ISN will use both nanofocusing mirrors and diffractive optics to achieve spot sizes as small as 20 nm. Nanofocusing mirrors in Kirkpatrick-Baez geometry will provide several orders of magnitude increase in photon flux at a spatial resolution of 50 nm. Diffractive optics such as zone plates and/or multilayer Laue lenses will provide the highest spatial resolution of 20 nm. Coherent diffraction methods will be used to study even small specimen features with sub-10 nm relevant length scales. A high-throughput data acquisition system will be employed to significantly increase the operational efficiency and usability of the instrument. The ISN will provide full spectroscopy capabilities to study the chemical state of most materials in the periodic table, and will enable X-ray fluorescence tomography. In situ electrical characterization will enable operando studies of energy and electronic devices such as photovoltaic systems and batteries. We describe the optical concept for the ISN beamline, the technical design, and the approach for enabling a broad variety of in situ studies.
We furthermore discuss the application of hard X-ray microscopy to studying defects in multi-crystalline solar cells, one of the lines of inquiry for which the ISN is being developed.
Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas
2017-01-01
Automated plant phenotyping has been established as a powerful new tool for studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive, imaging-based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high-throughput measurements of complex physiological parameters such as photosystem II efficiency, determined through kinetic chlorophyll fluorescence analysis, remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high-throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high-throughput-amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source, which caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high-throughput operating protocols enable the screening of up to 1080 small or 184 large plants per hour. The application of the implemented high-throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency.
The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis substantially extends the feature spectrum that can be assessed in the presented high-throughput automated plant phenotyping platforms, enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high-throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m in height) and architectures. A deeper understanding of the relations between plant architecture, biomass formation and photosynthetic efficiency has great potential for crop and yield improvement strategies.
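The two PSII parameters discussed above have standard definitions from kinetic chlorophyll fluorescence analysis: maximum PSII efficiency Fv/Fm = (Fm − F0)/Fm from a dark-adapted measurement, and PSII operating efficiency ΦPSII = (Fm′ − F′)/Fm′ in the light. A minimal sketch with illustrative fluorescence values (not data from this study):

```python
def fv_fm(f0, fm):
    """Maximum PSII quantum efficiency from dark-adapted F0 and Fm."""
    return (fm - f0) / fm

def phi_psii(f_prime, fm_prime):
    """PSII operating efficiency from light-adapted F' and Fm'."""
    return (fm_prime - f_prime) / fm_prime

# Illustrative raw fluorescence readings (arbitrary units, assumed):
print(round(fv_fm(300.0, 1500.0), 3))    # -> 0.8, typical of healthy leaves
print(round(phi_psii(450.0, 900.0), 3))  # -> 0.5
```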
A bipolar population counter using wave pipelining to achieve 2.5 x normal clock frequency
NASA Technical Reports Server (NTRS)
Wong, Derek C.; De Micheli, Giovanni; Flynn, Michael J.; Huston, Robert E.
1992-01-01
Wave pipelining is a technique for pipelining digital systems that can increase clock frequency in practical circuits without increasing the number of storage elements. In wave pipelining, multiple coherent waves of data are sent through a block of combinational logic by applying new inputs faster than the delay through the logic. The throughput of a 63-b CML population counter was increased from 97 to 250 MHz using wave pipelining. The internal circuit is flowthrough combinational logic. Novel CAD methods have balanced all input-to-output paths to about the same delay. This allows multiple data waves to propagate in sequence when the circuit is clocked faster than its propagation delay.
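The clocking argument above can be made concrete: a conventionally clocked block must allot the worst-case combinational delay per cycle, while a wave-pipelined block only needs to cover the *spread* between the longest and shortest input-to-output paths (plus register/skew overhead), since several data waves coexist in the logic; this is why the CAD path balancing is essential. The delay numbers below are assumed for illustration, not taken from the 63-b counter.

```python
# Illustrative timing budget (ns); all three values are assumptions.
t_max = 10.3   # longest input-to-output path delay
t_min = 6.8    # shortest path after CAD-based path balancing
t_ovh = 1.2    # register setup + clock skew overhead

# Conventional clocking: one wave at a time, period >= t_max + t_ovh.
f_conventional = 1e3 / (t_max + t_ovh)       # MHz

# Wave pipelining: period >= (t_max - t_min) + t_ovh, so the achievable
# frequency grows as paths are balanced (t_min -> t_max).
f_wave = 1e3 / (t_max - t_min + t_ovh)       # MHz

print(round(f_wave / f_conventional, 2))     # frequency improvement factor
```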
Towards high-throughput automated targeted femtosecond laser-based transfection of adherent cells
NASA Astrophysics Data System (ADS)
Antkowiak, Maciej; Torres-Mapa, Maria Leilani; Gunn-Moore, Frank; Dholakia, Kishan
2011-03-01
Femtosecond laser-induced cell membrane poration has proven to be an attractive alternative to the classical methods of drug and gene delivery. It is a selective, sterile, non-contact technique that offers highly localized operation, low toxicity and consistent performance. However, its broader application still requires the development of robust, high-throughput and user-friendly systems. We present a system capable of unassisted, enhanced, targeted optoinjection and phototransfection of adherent mammalian cells with a femtosecond laser. We demonstrate the advantages of a dynamic diffractive optical element, namely a spatial light modulator (SLM), for precise three-dimensional positioning of the beam. It enables the implementation of a "point-and-shoot" system in which, using the software interface, a user simply points at the cell and a predefined sequence of precisely positioned doses can be applied. We show that irradiation at three axial positions alleviates the problem of exact beam positioning on the cell membrane and doubles the number of viably optoinjected cells when compared with a single dose. The presented system enables untargeted raster-scan irradiation, which provides transfection of adherent cells at a throughput of 1 cell per second.
NASA Technical Reports Server (NTRS)
Bhasin, K. B.; Connolly, D. J.
1986-01-01
Future communications satellites are likely to use gallium arsenide (GaAs) monolithic microwave integrated-circuit (MMIC) technology in most, if not all, communications payload subsystems. Multiple-scanning-beam antenna systems are expected to use GaAs MMIC's to increase functional capability, to reduce volume, weight, and cost, and to greatly improve system reliability. RF and IF matrix switch technology based on GaAs MMIC's is also being developed for these reasons. MMIC technology, including gigabit-rate GaAs digital integrated circuits, offers substantial advantages in power consumption and weight over silicon technologies for high-throughput, on-board baseband processor systems. In this paper, current developments in GaAs MMIC technology are described, and the status and prospects of the technology are assessed.
Shulman, Nick; Bellew, Matthew; Snelling, George; Carter, Donald; Huang, Yunda; Li, Hongli; Self, Steven G.; McElrath, M. Juliana; De Rosa, Stephen C.
2008-01-01
Background Intracellular cytokine staining (ICS) by multiparameter flow cytometry is one of the primary methods for determining T cell immunogenicity in HIV-1 clinical vaccine trials. Data analysis requires considerable expertise and time. The amount of data is quickly increasing as more and larger trials are performed, and thus there is a critical need for high-throughput methods of data analysis. Methods A web-based flow cytometric analysis system, LabKey Flow, was developed for analyses of data from standardized ICS assays. A gating template was created manually in commercially available flow cytometric analysis software. Using this template, the system automatically compensated and analyzed all data sets. Quality control queries were designed to identify potentially incorrect sample collections. Results Comparison of the semi-automated analysis performed by LabKey Flow and the manual analysis performed using FlowJo software demonstrated excellent concordance (concordance correlation coefficient >0.990). Manual inspection of the analyses performed by LabKey Flow for 8-color ICS data files from several clinical vaccine trials indicates that template gates can appropriately be used for most data sets. Conclusions The semi-automated LabKey Flow analysis system can accurately analyze large ICS data files. Routine use of the system does not require specialized expertise. This high-throughput analysis will provide great utility for rapid evaluation of complex multiparameter flow cytometric measurements collected from large clinical trials. PMID:18615598
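The agreement statistic quoted above is Lin's concordance correlation coefficient (CCC), which penalizes both poor correlation and systematic bias between two raters. A minimal pure-Python implementation, applied to made-up paired measurements (the real comparison used per-sample cytokine-positive frequencies from the two analysis pipelines):

```python
def ccc(x, y):
    """Lin's concordance correlation coefficient between paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n          # variance of x
    vy = sum((b - my) ** 2 for b in y) / n          # variance of y
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # CCC = 2*cov / (vx + vy + (mean difference)^2)
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical paired results (e.g. % cytokine-positive cells per sample):
manual    = [0.10, 0.25, 0.40, 0.55, 0.80]
automated = [0.11, 0.24, 0.41, 0.54, 0.79]
print(round(ccc(manual, automated), 4))  # close to 1 = excellent concordance
```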
Duarte, José M; Barbier, Içvara; Schaerli, Yolanda
2017-11-17
Synthetic biologists increasingly rely on directed evolution to optimize engineered biological systems. Applying an appropriate screening or selection method for identifying the potentially rare library members with the desired properties is a crucial step for success in these experiments. Special challenges include substantial cell-to-cell variability and the requirement to check multiple states (e.g., being ON or OFF depending on the input). Here, we present a high-throughput screening method that addresses these challenges. First, we encapsulate single bacteria into microfluidic agarose gel beads. After incubation, they harbor monoclonal bacterial microcolonies (e.g., expressing a synthetic construct) and can be sorted according to their fluorescence by fluorescence-activated cell sorting (FACS). We determine enrichment rates and demonstrate that we can measure the average fluorescent signals of microcolonies containing phenotypically heterogeneous cells, obviating the problem of cell-to-cell variability. Finally, we apply this method to sort a pBAD promoter library at ON and OFF states.
A High-Throughput Arabidopsis Reverse Genetics System
Sessions, Allen; Burke, Ellen; Presting, Gernot; Aux, George; McElver, John; Patton, David; Dietrich, Bob; Ho, Patrick; Bacwaden, Johana; Ko, Cynthia; Clarke, Joseph D.; Cotton, David; Bullis, David; Snell, Jennifer; Miguel, Trini; Hutchison, Don; Kimmerly, Bill; Mitzel, Theresa; Katagiri, Fumiaki; Glazebrook, Jane; Law, Marc; Goff, Stephen A.
2002-01-01
A collection of Arabidopsis lines with T-DNA insertions in known sites was generated to increase the efficiency of functional genomics. A high-throughput modified thermal asymmetric interlaced (TAIL)-PCR protocol was developed and used to amplify DNA fragments flanking the T-DNA left borders from ∼100,000 transformed lines. A total of 85,108 TAIL-PCR products from 52,964 T-DNA lines were sequenced and compared with the Arabidopsis genome to determine the positions of T-DNAs in each line. Predicted T-DNA insertion sites, when mapped, showed a bias against predicted coding sequences. Predicted insertion mutations in genes of interest can be identified using Arabidopsis Gene Index name searches or by BLAST (Basic Local Alignment Search Tool) search. Insertions can be confirmed by simple PCR assays on individual lines. Predicted insertions were confirmed in 257 of 340 lines tested (76%). This resource has been named SAIL (Syngenta Arabidopsis Insertion Library) and is available to the scientific community at www.tmri.org. PMID:12468722
Dolado, Ignacio; Nieto, Joan; Saraiva, Maria João M; Arsequell, Gemma; Valencia, Gregori; Planas, Antoni
2005-01-01
Stabilization of tetrameric transthyretin (TTR) by binding of small ligands is a current strategy aimed at inhibiting amyloid fibrillogenesis in transthyretin-associated pathologies, such as senile systemic amyloidosis (SSA) and familial amyloidotic polyneuropathy (FAP). A kinetic assay is developed for rapid evaluation of compounds as potential in vitro inhibitors in a high-throughput screening format. It is based on monitoring the time-dependent increase of absorbance due to turbidity occurring by acid-induced protein aggregation. The method uses the highly amyloidogenic Y78F mutant of human transthyretin (heterologously expressed in Escherichia coli cells). Initial rates of protein aggregation at different inhibitor concentrations follow a monoexponential dose-response curve from which inhibition parameters are calculated. For the assay development, thyroid hormones and nonsteroidal anti-inflammatory drugs were chosen among other reference compounds. Some of them are already known to be in vitro inhibitors of TTR amyloidogenesis. Analysis time is optimized to last 1.5 h, and the method is implemented in microtiter plates for screening of libraries of potential fibrillogenesis inhibitors.
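A monoexponential dose-response of the kind described, v([I]) = v0·exp(−[I]/K), can be fitted by linear regression after taking logarithms of the initial aggregation rates. The sketch below recovers the parameters from noise-free synthetic data; the concentrations, v0, and K are assumed illustration values, and real assay data would warrant a nonlinear fit with error weighting.

```python
import math

# Synthetic "measured" initial aggregation rates at increasing inhibitor
# concentrations, generated from assumed true parameters v0 and K.
conc  = [0.0, 2.0, 4.0, 8.0, 16.0]     # inhibitor concentration (e.g. µM)
v0, K = 1.2, 5.0                       # "true" parameters (synthetic)
rates = [v0 * math.exp(-c / K) for c in conc]

# Linearize: ln v = ln v0 - c/K, then ordinary least squares.
xs, ys = conc, [math.log(v) for v in rates]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

v0_fit, K_fit = math.exp(intercept), -1.0 / slope
print(round(v0_fit, 3), round(K_fit, 3))  # recovered v0 and K
```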
Embedded Hyperchaotic Generators: A Comparative Analysis
NASA Astrophysics Data System (ADS)
Sadoudi, Said; Tanougast, Camel; Azzaz, Mohamad Salah; Dandache, Abbas
In this paper, we present a comparative analysis of FPGA implementation performance, in terms of throughput and resource cost, of five well-known autonomous continuous hyperchaotic systems. The goal of this analysis is to identify the embedded hyperchaotic generator that leads to designs with small logic area cost, satisfactory throughput rates, low power consumption and the low latency required for embedded applications such as secure digital communications between embedded systems. To implement the four-dimensional (4D) chaotic systems, we use a new structural hardware architecture based on a direct VHDL description of the fourth-order Runge-Kutta method (RK-4). The comparative analysis shows that the hyperchaotic Lorenz generator provides attractive performance compared to that of the others. In fact, its hardware implementation requires only 2067 CLB-slices, 36 multipliers and no block RAMs, and achieves a throughput rate of 101.6 Mbps at the output of the FPGA circuit, at a clock frequency of 25.315 MHz, with a low latency of 316 ns. Consequently, these implementation results make the embedded hyperchaotic Lorenz generator the best candidate for embedded communication applications.
PREVAIL: IBM's e-beam technology for next generation lithography
NASA Astrophysics Data System (ADS)
Pfeiffer, Hans C.
2000-07-01
PREVAIL (Projection Reduction Exposure with Variable Axis Immersion Lenses) represents the high-throughput e-beam projection approach to NGL that IBM is pursuing in cooperation with Nikon Corporation as an alliance partner. This paper discusses the challenges and accomplishments of the PREVAIL project. The supreme challenge facing all e-beam lithography approaches has been and still is throughput. Since the throughput of e-beam projection systems is severely limited by the available optical field size, the key to success is the ability to overcome this limitation. The PREVAIL technique overcomes field-limiting off-axis aberrations through the use of variable axis lenses, which electronically shift the optical axis simultaneously with the deflected beam so that the beam effectively remains on axis. The resist images obtained with the Proof-of-Concept (POC) system demonstrate that PREVAIL effectively eliminates off-axis aberrations affecting both the resolution and the placement accuracy of pixels. As part of the POC system, a high-emittance gun has been developed to provide uniform illumination of the patterned subfield and to fill the large-numerical-aperture projection optics designed to significantly reduce beam blur caused by Coulomb interaction.
Yun, Kyungwon; Lee, Hyunjae; Bang, Hyunwoo; Jeon, Noo Li
2016-02-21
This study proposes a novel way to achieve high-throughput image acquisition based on a computer-recognizable micro-pattern implemented on a microfluidic device. We integrated the QR code, a two-dimensional barcode system, onto the microfluidic device to simplify imaging of multiple ROIs (regions of interest). A standard QR code pattern was modified into arrays of cylindrical structures of polydimethylsiloxane (PDMS). Utilizing recognition of the micro-pattern, the proposed system enables: (1) device identification, which allows referencing additional information about the device, such as device imaging sequences or the ROIs, and (2) composing a coordinate system for an arbitrarily located microfluidic device with respect to the stage. Based on these functionalities, the proposed method performs one-step high-throughput imaging for data acquisition in microfluidic devices without further manual exploration and locating of the desired ROIs. In our experience, the proposed method significantly reduced the time needed to prepare an acquisition. We expect that the method will substantially improve data acquisition and analysis for prototype devices.
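The coordinate-registration idea in point (2) can be sketched as follows: once the micro-pattern fiducials are located in stage coordinates, an affine transform maps ROI positions from device-design coordinates to stage positions, however the chip happens to be placed. The fiducial positions, rotation, and offset below are made-up illustration values, and this is only a sketch of the registration concept, not the paper's algorithm.

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                fac = M[r][col] / M[col][col]
                M[r] = [x - fac * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_affine(device_pts, stage_pts):
    """Fit x' = a·x + b·y + c, y' = d·x + e·y + f from 3 point pairs."""
    A = [[x, y, 1.0] for x, y in device_pts]
    abc = solve3(A, [sx for sx, _ in stage_pts])
    def_ = solve3(A, [sy for _, sy in stage_pts])
    return abc, def_

# Fiducials at known design positions, observed on a rotated, shifted stage:
theta, tx, ty = math.radians(10), 5.0, -2.0
dev = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
obs = [(math.cos(theta) * x - math.sin(theta) * y + tx,
        math.sin(theta) * x + math.cos(theta) * y + ty) for x, y in dev]

(a, b, c), (d, e, f) = fit_affine(dev, obs)
roi = (4.0, 7.0)  # an ROI in device-design coordinates
stage_xy = (a * roi[0] + b * roi[1] + c, d * roi[0] + e * roi[1] + f)
print(tuple(round(v, 3) for v in stage_xy))  # stage position to image the ROI
```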
Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey
2011-11-01
In recent years high throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning, an effective Laboratory Information Management System (LIMS) can be developed and implemented in order to streamline all phases of a workflow. Just as important as data mining and analysis procedures at the end of complex processes is the tracking of individual steps of applications that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example is provided to highlight the important aspects of the design of a LIMS to effectively modulate all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling and interaction with investigators, statisticians and administrators. All these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers
Using Six Sigma and Lean methodologies to improve OR throughput.
Fairbanks, Catharine B
2007-07-01
Improving patient flow in the perioperative environment is challenging, but it has positive implications for both staff members and the facility. One facility in Vermont improved patient throughput by incorporating Six Sigma and Lean methodologies for patients undergoing elective procedures. The results of the project were significantly improved patient flow and increased teamwork and pride among perioperative staff members. (c) AORN, Inc, 2007.
High-throughput optofluidic system for the laser microsurgery of oocytes
NASA Astrophysics Data System (ADS)
Chandsawangbhuwana, Charlie; Shi, Linda Z.; Zhu, Qingyuan; Alliegro, Mark C.; Berns, Michael W.
2012-01-01
This study combines microfluidics with optical microablation in a microscopy system that allows for high-throughput manipulation of oocytes, automated media exchange, and long-term oocyte observation. The microfluidic component of the system transports oocytes from an inlet port into multiple flow channels. Within each channel, oocytes are confined against a microfluidic barrier using a steady fluid flow provided by an external computer-controlled syringe pump. This allows for easy media replacement without disturbing the oocyte location. The microfluidic and optical-laser microbeam ablation capabilities of the system were validated using surf clam (Spisula solidissima) oocytes that were immobilized in order to permit ablation of the 5 μm diameter nucleolinus within the oocyte nucleolus. Oocytes were then followed and assayed for polar body ejection.
Peroxisystem: harnessing systems cell biology to study peroxisomes.
Schuldiner, Maya; Zalckvar, Einat
2015-04-01
In recent years, high-throughput experimentation combined with quantitative analysis and modelling of cells, an approach recently dubbed systems cell biology, has been harnessed to study the organisation and dynamics of simple biological systems. Here, we suggest that the peroxisome, a fascinating dynamic organelle, is a good candidate for studying a complete biological system. We discuss several aspects of peroxisomes that can be studied using high-throughput systematic approaches and integrated into a predictive model. Such approaches can be used in the future to study and understand how a more complex biological system, like a cell and maybe even ultimately a whole organism, works. © 2015 Société Française des Microscopies and Société de Biologie Cellulaire de France. Published by John Wiley & Sons Ltd.
Jeudy, Christian; Adrian, Marielle; Baussard, Christophe; Bernard, Céline; Bernaud, Eric; Bourion, Virginie; Busset, Hughes; Cabrera-Bosquet, Llorenç; Cointault, Frédéric; Han, Simeng; Lamboeuf, Mickael; Moreau, Delphine; Pivato, Barbara; Prudent, Marion; Trouvelot, Sophie; Truong, Hoai Nam; Vernoud, Vanessa; Voisin, Anne-Sophie; Wipf, Daniel; Salon, Christophe
2016-01-01
In order to maintain high yields while saving water and preserving non-renewable resources, thus limiting the use of chemical fertilizer, it is crucial to select plants with more efficient root systems. This could be achieved through optimization of both root architecture and root uptake ability and/or through improvement of positive plant interactions with microorganisms in the rhizosphere. The development of devices suitable for high-throughput phenotyping of root structures remains a major bottleneck. Rhizotrons suitable for plant growth in controlled conditions and non-invasive image acquisition of plant shoot and root systems (RhizoTubes) are described. These RhizoTubes allow one to six plants, with a maximum height of 1.1 m, to be grown simultaneously for up to 8 weeks, depending on plant species. Both the shoot and root compartments can be imaged automatically and non-destructively throughout the experiment thanks to an imaging cabin (RhizoCab). The RhizoCab contains robots and imaging equipment for obtaining high-resolution pictures of plant roots. Using this versatile experimental setup, we illustrate how morphometric root traits can be determined for various species, including model (Medicago truncatula), crop (Pisum sativum, Brassica napus, Vitis vinifera, Triticum aestivum) and weed (Vulpia myuros) species, grown under non-limiting conditions or submitted to various abiotic and biotic constraints. The measurement of root phenotypic traits using this system was compared to that obtained using "classic" growth conditions in pots. This integrated system, which will comprise 1200 RhizoTubes, will allow high-throughput phenotyping of plant shoots and roots under various abiotic and biotic environmental conditions. Our system allows easy visualization or extraction of roots and measurement of root traits for high-throughput or kinetic analyses.
The utility of this system for studying root system architecture will greatly facilitate the identification of genetic and environmental determinants of key root traits involved in crop responses to stresses, including interactions with soil microorganisms.
A High-Throughput Automated Microfluidic Platform for Calcium Imaging of Taste Sensing.
Hsiao, Yi-Hsing; Hsu, Chia-Hsien; Chen, Chihchen
2016-07-08
The human enteroendocrine L cell line NCI-H716, expressing taste receptors and taste signaling elements, constitutes a unique model for studies of cellular responses to glucose, appetite regulation, gastrointestinal motility, and insulin secretion. Targeting these gut taste receptors may provide novel treatments for diabetes and obesity. However, NCI-H716 cells are cultured in suspension and tend to form multicellular aggregates, which prevents high-throughput calcium imaging because immobilization and stimulus delivery become laborious and introduce interference. Here, we have developed an automated microfluidic platform that is capable of trapping more than 500 single cells into microwells with a loading efficiency of 77% within two minutes, delivering multiple chemical stimuli, and performing calcium imaging with enhanced spatial and temporal resolution compared to bath perfusion systems. The results revealed heterogeneity in cellular responses to the type, concentration, and order of applied sweet and bitter stimuli. Sucralose and denatonium benzoate elicited robust increases in the intracellular Ca(2+) concentration. However, glucose evoked a rapid elevation of intracellular Ca(2+) followed by reduced responses to subsequent glucose stimulation. Using Gymnema sylvestre as a blocking agent for the sweet taste receptor confirmed that different taste receptors were utilized for sweet and bitter tastes. This automated microfluidic platform is cost-effective, easy to fabricate and operate, and may be generally applicable for high-throughput, high-content single-cell analysis and drug screening.
NASA Technical Reports Server (NTRS)
Prevot, Thomas; Mercer, Joey S.; Martin, Lynne Hazel; Homola, Jeffrey R.; Cabrall, Christopher D.; Brasil, Connie L.
2011-01-01
In this paper we discuss the development and evaluation of our prototype technologies and procedures for far-term air traffic control operations with automation for separation assurance, weather avoidance and schedule conformance. Controller-in-the-loop simulations in the Airspace Operations Laboratory at the NASA Ames Research Center in 2010 showed very promising results. We found the operations to provide high airspace throughput, excellent efficiency and schedule conformance. The simulation also highlighted areas for improvement: short-term conflict situations sometimes resulted in separation violations, particularly for transitioning aircraft in complex traffic flows, and the combination of heavy metering and growing weather resulted in an increased number of aircraft penetrating convective weather cells. To address these shortcomings, technologies and procedures have been improved and the operations are being re-evaluated with the same scenarios. In this paper we first describe the concept and technologies for automating separation assurance, weather avoidance, and schedule conformance. Second, the results from the 2010 simulation are reviewed. We report human-systems integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. Next, improvements are discussed that were made to address the identified shortcomings. We conclude that, with further refinements, air traffic control operations with ground-based automated separation assurance can routinely provide currently unachievable levels of traffic throughput in en route airspace.
Methods for the development of a bioregenerative life support system
NASA Technical Reports Server (NTRS)
Goldman, Michelle; Gomez, Shawn; Voorhees, Mike
1990-01-01
Presented here is a rudimentary approach to designing a life support system based on the utilization of plants and animals. The biggest stumbling block in the initial phases of developing a bioregenerative life support system is encountered in collecting and consolidating the data. If a database existed for the systems engineer so that he or she may have accurate data and a better understanding of biological systems in engineering terms, then the design process would be simplified. Also addressed is a means of evaluating the subsystems chosen. These subsystems are unified into a common metric, kilograms of mass, and normalized in relation to the throughput of a few basic elements. The initial integration of these subsystems is based on input/output masses and eventually balanced to a point of operation within the inherent performance ranges of the organisms chosen. At this point, it becomes necessary to go beyond the simplifying assumptions of simple mass relationships and further define for each organism the processes used to manipulate the throughput matter. Mainly considered here is the fact that these organisms perform input/output functions on differing timescales, thus establishing the need for buffer volumes or appropriate subsystem phasing. At each point in a systematic design it is necessary to disturb the system and discern its sensitivity to the disturbance. This can be done either through the introduction of a catastrophic failure or by applying a small perturbation to the system. One example is increasing the crew size. Here the wide range of performance characteristics once again shows that biological systems have an inherent advantage in responding to systemic perturbations. Since the design of any space-based system depends on mass, power, and volume requirements, each subsystem must be evaluated in these terms.
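The input/output mass balancing described above can be sketched as a simple ledger: each subsystem is reduced to daily input and output masses of a few tracked compounds, and the integrator checks how close the network is to closure. The subsystem names and all figures below are purely illustrative placeholders, not data from the study.

```python
# Hypothetical mass-balance sketch: each subsystem lists daily input/output
# masses (kg/day) for a few tracked compounds. All numbers are illustrative.
subsystems = {
    "crew":   {"in":  {"O2": 0.84, "food": 0.62, "H2O": 3.50},
               "out": {"CO2": 1.00, "waste": 0.11, "H2O": 3.85}},
    "plants": {"in":  {"CO2": 1.00, "H2O": 3.85, "waste": 0.11},
               "out": {"O2": 0.84, "food": 0.62, "H2O": 3.50}},
}

def net_flux(systems):
    """Sum outputs (positive) and inputs (negative) per compound, kg/day."""
    totals = {}
    for sub in systems.values():
        for compound, kg in sub["out"].items():
            totals[compound] = totals.get(compound, 0.0) + kg
        for compound, kg in sub["in"].items():
            totals[compound] = totals.get(compound, 0.0) - kg
    return totals

# A perfectly regenerative loop nets to zero for every compound; nonzero
# residuals show where buffer volumes or resupply would be needed.
residuals = net_flux(subsystems)
print(residuals)
```

In this contrived closed loop every residual is zero; in practice the residuals, together with the differing timescales of each organism, size the buffer volumes mentioned above.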
From Lab to Fab: Developing a Nanoscale Delivery Tool for Scalable Nanomanufacturing
NASA Astrophysics Data System (ADS)
Safi, Asmahan A.
The emergence of nanomaterials with unique properties at the nanoscale over the past two decades carries a capacity to impact society and transform or create new industries ranging from nanoelectronics to nanomedicine. However, a gap in nanomanufacturing technologies has prevented the translation of nanomaterials into real-world commercialized products. Bridging this gap requires a paradigm shift in methods for fabricating structured devices with nanoscale resolution in a repeatable fashion. This thesis explores new paradigms for fabricating nanoscale structures, devices, and systems for high-throughput, high-registration applications. We present a robust and scalable nanoscale delivery platform, the Nanofountain Probe (NFP), for parallel direct-write of functional materials. The design and microfabrication of the NFP are presented. The new generation addresses the challenges of throughput, resolution, and ink replenishment that characterize tip-based nanomanufacturing. To achieve these goals, an optimized probe geometry is integrated into the process along with channel sealing and cantilever bending. The capabilities of the newly fabricated probes are demonstrated through two types of delivery: protein nanopatterning and single-cell nanoinjection. The broad applications of the NFP for single-cell delivery are investigated. An external microfluidic packaging is developed to enable delivery in a liquid environment. The system is integrated with a combined atomic force microscope and inverted fluorescence microscope. Intracellular delivery is demonstrated by injecting a fluorescent dextran into HeLa cells in vitro while monitoring the injection forces. Such developments enable in vitro cellular delivery for single-cell studies and high-throughput gene expression. The nanomanufacturing capabilities of NFPs are then explored. Nanofabrication of carbon nanotube-based electronics presents all the manufacturing challenges characteristic of assembling nanomaterials precisely onto devices.
The presented study combines top-down and bottom-up approaches by integrating catalyst patterning and carbon nanotube growth directly on structures. Large arrays of iron-rich catalyst are patterned on a substrate for subsequent carbon nanotube synthesis. The dependence on probe geometry and substrate wetting is assessed by modeling and experimental studies. Finally, preliminary results on the synthesis of carbon nanotubes by catalyst-assisted chemical vapor deposition suggest that increasing the catalyst yield is critical. Such work will enable high-throughput nanomanufacturing of carbon nanotube-based devices.
High-throughput methods for characterizing the mechanical properties of coatings
NASA Astrophysics Data System (ADS)
Siripirom, Chavanin
The characterization of mechanical properties in a combinatorial and high-throughput workflow has been a bottleneck that slows the materials development process. High-throughput characterization of mechanical properties was applied in this research to reduce the amount of sample handling and to accelerate output. A puncture tester was designed and built to evaluate the toughness of materials using an innovative template design coupled with automation. The test takes the form of a circular free-film indentation. A single template contains 12 samples, which are tested in a rapid serial approach. Next, the operational principles of a novel parallel dynamic mechanical-thermal analysis instrument were analyzed in detail for potential sources of error. The test uses a model of circular bilayer fixed-edge plate deformation. A total of 96 samples can be analyzed simultaneously, which provides a tremendous increase in efficiency compared with a conventional dynamic test. The modulus values determined by the system showed considerable variation; the sources of error were identified and improvements to the system were made. A finite element analysis was used to assess the accuracy of the closed-form solution with respect to testing geometries, such as sample thickness. Good control of sample thickness proved crucial to the accuracy and precision of the output. An attempt was then made to correlate the high-throughput experiments with conventional coating testing methods. Automated nanoindentation in dynamic mode was found to provide information on the near-surface modulus and could potentially correlate with the pendulum hardness test through the loss tangent component. Lastly, surface characterization of stratified siloxane-polyurethane coatings was carried out with X-ray photoelectron spectroscopy, Rutherford backscattering spectroscopy, transmission electron microscopy, and nanoindentation.
The siloxane component segregates to the surface during curing. The distribution of siloxane as a function of thickness into the sample showed differences depending on the formulation parameters. The coatings which had higher siloxane content near the surface were those coatings found to perform well in field tests.
Hupert, Mateusz L; Jackson, Joshua M; Wang, Hong; Witek, Małgorzata A; Kamande, Joyce; Milowsky, Matthew I; Whang, Young E; Soper, Steven A
2014-10-01
Microsystem-based technologies are providing new opportunities in the area of in vitro diagnostics due to their ability to provide process automation enabling point-of-care operation. As an example, microsystems used for the isolation and analysis of circulating tumor cells (CTCs) from complex, heterogeneous samples in an automated fashion, with improved recoveries and selectivity, are providing new opportunities for this important biomarker. Unfortunately, many of the existing microfluidic systems lack the throughput capabilities and/or are too expensive to manufacture to warrant their widespread use in clinical testing scenarios. Here, we describe a disposable, all-polymer, microfluidic system for the high-throughput (HT) isolation of CTCs directly from whole blood inputs. The device employs an array of high-aspect-ratio (HAR), parallel, sinusoidal microchannels (25 µm × 150 µm; W × D; AR = 6.0) with walls covalently decorated with anti-EpCAM antibodies to provide affinity-based isolation of CTCs. The channel width, which is similar to an average CTC diameter (12-25 µm), plays a critical role in maximizing the probability of cell/wall interactions and allows for high CTC recovery. The extended channel depth allows for increased throughput at the optimized flow velocity (2 mm/s in a microchannel), maximizes cell recovery, and prevents clogging of the microfluidic channels during blood processing. Fluidic addressing of the microchannel array with a minimal device footprint is provided by large-cross-sectional-area feed and exit channels poised orthogonal to the network of sinusoidal capillary channels (the so-called Z-geometry). Computational modeling was used to confirm uniform addressing of the channels in the isolation bed. Devices with various numbers of parallel microchannels, ranging from 50 to 320, have been successfully constructed.
Cyclic olefin copolymer (COC) was chosen as the substrate material due to its superior properties during UV activation of the HAR microchannel surfaces prior to antibody attachment. Operation of the HT-CTC device has been validated by isolating CTCs directly from blood secured from patients with metastatic prostate cancer. High CTC sample purities (low numbers of contaminating white blood cells, WBCs) allowed for direct lysis and molecular profiling of the isolated CTCs.
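The geometry quoted above permits a quick back-of-the-envelope check of volumetric throughput: per-channel flow is the linear velocity times the channel cross-section, scaled by the number of parallel channels. The sketch below only restates that arithmetic with the figures from the abstract.

```python
# Back-of-the-envelope volumetric throughput of the sinusoidal-channel
# array, using only the figures quoted in the abstract.
velocity = 2e-3               # m/s, optimized linear flow velocity
width, depth = 25e-6, 150e-6  # m, channel cross-section (W x D)
n_channels = 320              # largest reported parallel array

q_channel = velocity * width * depth  # m^3/s through one channel
q_total = q_channel * n_channels      # m^3/s through the whole array

print(round(q_channel * 1e12, 2))      # 7.5  -> 7.5 nL/s per channel
print(round(q_total * 1e6 * 3600, 2))  # 8.64 -> 8.64 mL/h for 320 channels
```

At roughly 8.6 mL/h for the 320-channel device, a typical several-millilitre whole-blood input can be processed in well under an hour, which is consistent with the abstract's emphasis on clinically practical throughput.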
Elich, Thomas; Iskra, Timothy; Daniels, William; Morrison, Christopher J
2016-06-01
Effective cleaning of chromatography resin is required to prevent fouling and maximize the number of processing cycles which can be achieved. Optimization of resin cleaning procedures, however, can lead to prohibitive material, labor, and time requirements, even when using milliliter scale chromatography columns. In this work, high throughput (HT) techniques were used to evaluate cleaning agents for a monoclonal antibody (mAb) polishing step utilizing Fractogel(®) EMD TMAE HiCap (M) anion exchange (AEX) resin. For this particular mAb feed stream, the AEX resin could not be fully restored with traditional NaCl and NaOH cleaning solutions, resulting in a loss of impurity capacity with resin cycling. Miniaturized microliter scale chromatography columns and an automated liquid handling system (LHS) were employed to evaluate various experimental cleaning conditions. Cleaning agents were monitored for their ability to maintain resin impurity capacity over multiple processing cycles by analyzing the flowthrough material for turbidity and high molecular weight (HMW) content. HT experiments indicated that a 167 mM acetic acid strip solution followed by a 0.5 M NaOH, 2 M NaCl sanitization provided approximately 90% cleaning improvement over solutions containing solely NaCl and/or NaOH. Results from the microliter scale HT experiments were confirmed in subsequent evaluations at the milliliter scale. These results identify cleaning agents which may restore resin performance for applications involving fouling species in ion exchange systems. In addition, this work demonstrates the use of miniaturized columns operated with an automated LHS for HT evaluation of chromatographic cleaning procedures, effectively decreasing material requirements while simultaneously increasing throughput. Biotechnol. Bioeng. 2016;113: 1251-1259. © 2015 Wiley Periodicals, Inc.
Nanostructured plasmonic interferometers for ultrasensitive label-free biosensing
NASA Astrophysics Data System (ADS)
Gao, Yongkang
Optical biosensors that utilize the surface plasmon resonance (SPR) technique to analyze biomolecular interactions have been extensively explored in the last two decades and have become the gold standard for label-free biosensing. These powerful sensing tools allow fast, highly sensitive monitoring of the interaction between biomolecules in real time, without the need for laborious fluorescent labeling, and have found wide-ranging applications, from biomedical diagnostics and drug discovery to environmental sensing and food safety monitoring. However, the prism-coupling SPR geometry is complex and bulky, and has severely limited the integration of this technique into low-cost portable biomedical devices for point-of-care diagnostics and personal healthcare applications. The complex prism-coupling scheme also prevents the use of high-numerical-aperture (NA) optics to increase the spatial resolution for multi-channel, high-throughput detection in SPR imaging mode. This dissertation is focused on the design and fabrication of a promising new class of nanopatterned interferometric SPR sensors that integrate the strengths of miniaturized nanoplasmonic architectures with sensitive optical interferometry techniques to achieve bold advances in SPR biosensing. The nanosensor chips developed provide sensing performance comparable to conventional SPR systems, but employ a far simpler collinear optical transmission geometry, which greatly facilitates system integration, miniaturization, and low-cost production. Moreover, the fabricated nanostructure-based SPR sensors feature a very small sensor footprint, allowing massive multiplexing on a chip for high-throughput detection.
The successful transformation of SPR technique from bulky prism-coupling setup into this low-cost compact plasmonic platform would have a far-reaching impact on point-of-care diagnostic tools and also lead to advances in high-throughput sensing applications in proteomics, immunology, drug discovery, and fundamental cell biology research.
Identification of microRNAs in PCV2 subclinically infected pigs by high throughput sequencing.
Núñez-Hernández, Fernando; Pérez, Lester J; Muñoz, Marta; Vera, Gonzalo; Tomás, Anna; Egea, Raquel; Córdoba, Sarai; Segalés, Joaquim; Sánchez, Armand; Núñez, José I
2015-03-03
Porcine circovirus type 2 (PCV2) is the essential etiological infectious agent of PCV2-systemic disease and has been associated with other swine diseases, collectively known as porcine circovirus diseases. MicroRNAs (miRNAs) are a class of small non-coding RNAs that regulate gene expression post-transcriptionally and play a role in an increasing number of biological processes. The study of miRNA-mediated host-pathogen interactions has emerged in the last decade due to the important role that miRNAs play in antiviral defense. The objective of this study was to identify the miRNA expression pattern in PCV2 subclinically infected and non-infected pigs. For this purpose, an experimental PCV2 infection was carried out, and small-RNA libraries were constructed from tonsil and mediastinal lymph node (MLN) of infected and non-infected pigs. High-throughput sequencing revealed differences in miRNA expression in MLN between infected and non-infected pigs, while in tonsil a very conserved pattern was observed. In MLN, miR-126-3p, miR-126-5p, let-7d-3p, miR-129a and let-7b-3p were up-regulated, whereas miR-193a-5p, miR-574-5p and miR-34a were down-regulated. Functional prediction analysis showed that these miRNAs may be involved in pathways related to the immune system and in processes related to the pathogenesis of PCV2, although functional assays are needed to support these predictions. This is the first study of miRNA gene expression in pigs infected with PCV2 using a high-throughput sequencing approach, in which several host miRNAs were differentially expressed in response to PCV2 infection.
Laser processes and system technology for the production of high-efficient crystalline solar cells
NASA Astrophysics Data System (ADS)
Mayerhofer, R.; Hendel, R.; Zhu, Wenjie; Geiger, S.
2012-10-01
The laser as an industrial tool is an essential part of today's solar cell production. Due to the solar industry's ongoing efforts to increase cell efficiency, more and more laser-based processes, which have been discussed and tested at lab scale for many years, are now being implemented in mass production lines. In order to cope with throughput requirements, standard laser concepts have to be improved continuously with respect to available average power levels, repetition rates and beam profile. Some of the laser concepts that showed high potential in the past couple of years will be substituted by other, more economic laser types. Furthermore, requirements for processing with smaller heat-affected zones fuel the development of industry-ready ultra-short-pulse lasers with pulse widths even below the picosecond range. In 2011, the German Ministry of Education and Research (BMBF) launched the program "PV-Innovation Alliance", with the aim of supporting the rapid transfer of high-efficiency processes out of development departments and research institutes into solar cell production lines. Here, lasers play an important role as production tools, allowing the fast implementation of high-performance solar cell concepts. We will report on the results achieved within the joint project FUTUREFAB, where efficiency optimization, throughput enhancement and cost reduction are the main goals. The presentation will focus on laser processes such as selective emitter doping and ablation of dielectric layers. An indispensable part of the efforts towards cost reduction in solar cell production is the improvement of the wafer handling and throughput capabilities of the laser processing system. Therefore, the presentation will also elaborate on new developments in the design of complete production machines.
High-radiance LDP source for mask inspection and beam line applications (Conference Presentation)
NASA Astrophysics Data System (ADS)
Teramoto, Yusuke; Santos, Bárbara; Mertens, Guido; Kops, Ralf; Kops, Margarete; von Wezyk, Alexander; Bergmann, Klaus; Yabuta, Hironobu; Nagano, Akihisa; Ashizawa, Noritaka; Taniguchi, Yuta; Yamatani, Daiki; Shirai, Takahiro; Kasama, Kunihiko
2017-04-01
High-throughput actinic mask inspection tools are needed as EUVL begins to enter the volume production phase. One of the key technologies needed to realize such inspection tools is a high-radiance EUV source, with a radiance as high as 100 W/mm2/sr. Ushio is developing laser-assisted discharge-produced plasma (LDP) sources. Ushio's LDP source is able to provide sufficient radiance as well as cleanliness, stability and reliability. Radiance behind the debris mitigation system was confirmed to be 120 W/mm2/sr at 9 kHz, and peak radiance at the plasma was increased to over 200 W/mm2/sr in recent development, which supports high-throughput, high-precision mask inspection in current and future technology nodes. One of the unique features of Ushio's LDP source is cleanliness. Cleanliness evaluation using both grazing-incidence Ru mirrors and normal-incidence Mo/Si mirrors showed no considerable damage to the mirrors other than smooth sputtering of the surface at a pace of a few nm per Gpulse. In order to prove system reliability, several long-term tests were performed. Data recorded during the tests were analyzed to assess two-dimensional radiance stability. In addition, several operating parameters were monitored to determine which of them contribute to radiance stability. The latest model, which features a large opening angle, was recently developed so that the tool can utilize a large number of debris-free photons behind the debris shield. The model was designed both for beam-line applications and for high-throughput mask inspection. At the time of publication, the first product is expected to be in use at a customer site.
Jiang, Fan; Fu, Wei; Clarke, Anthony R; Schutze, Mark Kurt; Susanto, Agus; Zhu, Shuifang; Li, Zhihong
2016-11-01
Invasive species can be detrimental to a nation's ecology, economy and human health. Rapid and accurate diagnostics are critical to limit the establishment and spread of exotic organisms. The increasing rate of biological invasions relative to the taxonomic expertise available generates a demand for high-throughput, DNA-based diagnostic methods for identification. We designed species-specific qPCR primer and probe combinations for 27 economically important Tephritidae species in six genera (Anastrepha, Bactrocera, Carpomya, Ceratitis, Dacus and Rhagoletis) based on 935 COI DNA barcode haplotypes from 181 fruit fly species publicly available in BOLD, and then tested the specificity of each primer pair and probe through qPCR of 35 of those species. We then developed a standardized reaction system for detecting the 27 target species based on a microfluidic dynamic array and applied the method to identify unknown immature samples from port interceptions and field monitoring. This method achieved specific and simultaneous detection of all 27 species in 7.5 h, using only 0.2 μL of reaction mix in each reaction chamber. The approach successfully discriminated among species within complexes that had genetic similarities of up to 98.48%, and it identified all immature samples consistently with the subsequent morphological examination of adults reared from larvae of cohorts from the same samples. We present an accurate, rapid and high-throughput approach for detecting fruit flies of quarantine concern. This new method has broad potential to become an international standard for plant quarantine and invasive species detection. © 2016 John Wiley & Sons Ltd.
Development and applicability of a ready-to-use PCR system for GMO screening.
Rosa, Sabrina F; Gatto, Francesco; Angers-Loustau, Alexandre; Petrillo, Mauro; Kreysa, Joachim; Querci, Maddalena
2016-06-15
With the growing number of GMOs introduced to the market, testing laboratories have seen their workload increase significantly. Ready-to-use multi-target PCR-based detection systems, such as pre-spotted plates (PSPs), reduce analysis time while increasing capacity. This paper describes the development and applicability to GMO testing of a screening strategy involving a PSP and its associated web-based Decision Support System. The screening PSP was developed to detect all GMOs authorized in the EU in one single PCR experiment, through the combination of 16 validated assays. The screening strategy was successfully challenged in a wide inter-laboratory study on real-life food/feed samples. The positive outcome of this study could result in the adoption of the PSP screening strategy across the EU, a step that would increase the harmonization and quality of GMO testing in the EU. Furthermore, this system could represent a model for other official control areas where high-throughput DNA-based detection systems are needed. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen
2015-10-01
Upstream processes are rather complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimal system is not necessarily the one with the highest product titers, but the one resulting in superior overall process performance in up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using chromatography modeling software developed in-house (ChromX). The suggested in-silico-optimized operational modes for product capture were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance.  © 2015 Wiley Periodicals, Inc.
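The selection criterion described above (not the highest titer alone, but the best joint up- and downstream performance) can be sketched as a simple score over candidate conditions. The condition names and numbers below are hypothetical placeholders for illustration, not data from the study.

```python
# Hypothetical sketch of a combined up-/downstream selection criterion:
# pick the cultivation condition whose product of upstream titer and
# predicted downstream yield is highest, not the highest titer alone.
candidates = {
    # condition: (upstream titer in g/L, predicted downstream yield fraction)
    "A": (1.8, 0.55),
    "B": (1.5, 0.80),  # lower titer, but a much better capture step
    "C": (2.0, 0.40),  # highest titer, worst downstream recovery
}

def overall_output(titer, dsp_yield):
    """Grams of purified product per litre of culture."""
    return titer * dsp_yield

best = max(candidates, key=lambda c: overall_output(*candidates[c]))
print(best)  # "B": 1.5 * 0.80 = 1.20 g/L beats A (0.99) and C (0.80)
```

Condition C would win on titer alone, yet B delivers the most purified product, which is the point of optimizing both stages together.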
Xu, Chun-Xiu; Yin, Xue-Feng
2011-02-04
A chip-based microfluidic system for high-throughput single-cell analysis is described. The system integrated continuous introduction of individual cells, rapid dynamic lysis, capillary electrophoretic (CE) separation and laser-induced fluorescence (LIF) detection. A cross microfluidic chip with one sheath-flow channel located on each side of the sampling channel was designed. The labeled cells were hydrodynamically focused by sheath-flow streams and sequentially introduced into the cross section of the microchip under hydrostatic pressure generated by adjusting liquid levels in the reservoirs. Combined with the electric field applied to the separation channel, the aligned cells were driven into the separation channel and rapidly lysed within 33 ms at the entry of the separation channel by Triton X-100 added to the sheath-flow solution. The maximum rate for introducing individual cells into the separation channel was about 150 cells/min. The introduction of sheath-flow streams also significantly reduced the concentration of phosphate-buffered saline (PBS) injected into the separation channel along with single cells, thus reducing Joule heating during electrophoretic separation. The performance of this microfluidic system was evaluated by analysis of reduced glutathione (GSH) and reactive oxygen species (ROS) in single erythrocytes. A throughput of 38 cells/min was obtained. The proposed method is simple and robust for high-throughput single-cell analysis, allowing analysis of cell populations of considerable size to generate results with statistical significance. Copyright © 2010 Elsevier B.V. All rights reserved.
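At the rates reported above, the time needed to build a statistically useful sample is easy to estimate: divide the target population size by the throughput. The 500-cell target in the sketch below is an arbitrary illustration, not a figure from the study.

```python
# Time needed to process a cell population at the reported rates.
analysis_rate = 38   # cells/min, reported CE-LIF analysis throughput
intro_rate = 150     # cells/min, maximum cell-introduction rate

def minutes_for(n_cells, rate):
    """Minutes required to process n_cells at a given rate (cells/min)."""
    return n_cells / rate

# A 500-cell sample (arbitrary example size):
print(round(minutes_for(500, analysis_rate), 1))  # 13.2 min at the analysis rate
print(round(minutes_for(500, intro_rate), 1))     # 3.3 min if limited only by introduction
```

The gap between the two figures shows that the end-to-end throughput is set by separation and detection, not by how fast cells can be introduced.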
Demonstration of lithography patterns using reflective e-beam direct write
NASA Astrophysics Data System (ADS)
Freed, Regina; Sun, Jeff; Brodie, Alan; Petric, Paul; McCord, Mark; Ronse, Kurt; Haspeslagh, Luc; Vereecke, Bart
2011-04-01
Traditionally, e-beam direct-write lithography has been too slow for most lithography applications. E-beam direct-write lithography has been used for mask writing rather than wafer processing, since maximum-blur requirements limit column beam current, which drives e-beam throughput. Printing small features at a fine pitch with an e-beam tool requires a sacrifice in processing time unless one significantly increases the total number of beams on a single writing tool. Because of the uncertainty with regard to the optical lithography roadmap beyond the 22 nm technology node, the semiconductor equipment industry is in the process of designing and testing e-beam lithography tools with the potential for high-volume wafer processing. For this work, we report on the development and current status of a new maskless, direct-write e-beam lithography tool which has the potential for high-volume lithography at and below the 22 nm technology node. A Reflective Electron Beam Lithography (REBL) tool is being developed for high-throughput electron beam direct-write maskless lithography. The system is targeting critical patterning steps at the 22 nm node and beyond at a capital cost equivalent to conventional lithography. Reflective Electron Beam Lithography incorporates a number of novel technologies to generate and expose lithographic patterns with a throughput and footprint comparable to current 193 nm immersion lithography systems. A patented reflective electron optic, or Digital Pattern Generator (DPG), enables the unique approach. The Digital Pattern Generator is a CMOS ASIC chip with an array of small, independently controllable lens elements (lenslets), which act as an array of electron mirrors. In this way, the REBL system is capable of generating the pattern to be written using massively parallel exposure by ~1 million beams at extremely high data rates (~1 Tbps).
A rotary stage concept using a rotating platen carrying multiple wafers optimizes the writing strategy of the DPG to achieve the capability of high throughput for sparse pattern wafer levels. The lens elements on the DPG are fabricated at IMEC (Leuven, Belgium) under IMEC's CMORE program. The CMOS fabricated DPG contains ~ 1,000,000 lens elements, allowing for 1,000,000 individually controllable beamlets. A single lens element consists of 5 electrodes, each of which can be set at controlled voltage levels to either absorb or reflect the electron beam. A system using a linear movable stage and the DPG integrated into the electron optics module was used to expose patterns on device representative wafers. Results of these exposure tests are discussed.
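As a back-of-envelope check of the quoted figures (our arithmetic, not from the paper), spreading the aggregate data rate over the beamlet array implies a modest per-beamlet rate:

```python
# Implied per-beamlet data rate for the REBL DPG, using the round
# numbers quoted above (~1 million beamlets, ~1 Tbps aggregate).
num_beamlets = 1_000_000
aggregate_rate_bps = 1e12

per_beamlet_bps = aggregate_rate_bps / num_beamlets
print(f"per-beamlet rate: {per_beamlet_bps / 1e6:.1f} Mbps")  # 1.0 Mbps
```

Each individual electron mirror therefore only needs to be switched at roughly megahertz rates; the extreme aggregate rate comes entirely from the massive parallelism.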
Orthogonal strip HPGe planar SmartPET detectors in Compton configuration
NASA Astrophysics Data System (ADS)
Boston, H. C.; Gillam, J.; Boston, A. J.; Cooper, R. J.; Cresswell, J.; Grint, A. N.; Mather, A. R.; Nolan, P. J.; Scraggs, D. P.; Turk, G.; Hall, C. J.; Lazarus, I.; Berry, A.; Beveridge, T.; Lewis, R.
2007-10-01
The evolution of Germanium detector technology over the last decade has led to the possibility that such detectors can be employed in medical and security imaging. The excellent energy resolution coupled with the good position information that Germanium affords removes the necessity for the mechanical collimators that would be required in a conventional gamma camera system. By removing this constraint, the overall dose to the patient can be reduced or the throughput of the system can be increased. An additional benefit of excellent energy resolution is that tight gates can be placed on energies from either a multi-line gamma source or from multi-nuclide sources, increasing the number of sources that can be used in medical imaging. In terms of security imaging, segmented Germanium gives directionality and excellent spectroscopic information.
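In a two-plane Compton configuration of the kind described, the scattering angle is recovered from the two energy deposits via standard Compton kinematics (a textbook relation, not code from the paper; the example energies are illustrative):

```python
# Compton cone angle from the energy deposits in a two-detector
# Compton camera: the photon scatters in the first plane (depositing
# e_scatter) and is absorbed in the second (depositing e_absorb).
import math

M_E_C2 = 511.0  # electron rest energy, keV

def compton_angle_deg(e_scatter_kev, e_absorb_kev):
    e_total = e_scatter_kev + e_absorb_kev       # incident photon energy
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_absorb_kev - 1.0 / e_total)
    if not -1.0 <= cos_theta <= 1.0:
        return None  # kinematically forbidden event, reject it
    return math.degrees(math.acos(cos_theta))

# A 662 keV Cs-137 photon depositing 200 keV in the scatter plane:
print(compton_angle_deg(200.0, 462.0))
```

The good energy resolution of Germanium directly tightens this cone, which is why collimator-free imaging becomes practical.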
NASA Astrophysics Data System (ADS)
Regmi, Raju; Mohan, Kavya; Mondal, Partha Pratim
2014-09-01
Visualization of intracellular organelles is achieved using a newly developed high throughput imaging cytometry system. This system interrogates the microfluidic channel with a sheet of light rather than the existing point-based scanning techniques. The advantages of the developed system are many, including single-shot scanning of specimens flowing through the microfluidic channel at flow rates ranging from micro- to nanoliters per minute. Moreover, this opens up in-vivo imaging of sub-cellular structures and simultaneous cell counting in an imaging cytometry system. We recorded a maximum count of 2400 cells/min at a flow rate of 700 nl/min, with simultaneous visualization of the fluorescently-labeled mitochondrial network in HeLa cells during flow. The developed imaging cytometry system may find immediate application in biotechnology, fluorescence microscopy and nano-medicine.
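As a consistency check on the quoted figures (our arithmetic, not from the paper), the count rate and flow rate together imply the cell concentration of the flowing sample:

```python
# Implied sample concentration at the reported maximum count rate
# (2400 cells/min) and flow rate (700 nl/min).
count_rate = 2400.0      # cells per minute
flow_rate_nl = 700.0     # nanoliters per minute

cells_per_nl = count_rate / flow_rate_nl
cells_per_ml = cells_per_nl * 1e6   # 1 ml = 1e6 nl
print(f"{cells_per_ml:.2e} cells/ml")  # ~3.43e+06 cells/ml
```

This is a typical concentration for suspended cultured cells, so the reported count rate is limited by the flow rate rather than by an unusually dense sample.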
Velez‐Suberbie, M. Lourdes; Betts, John P. J.; Walker, Kelly L.; Robinson, Colin; Zoro, Barney
2017-01-01
High throughput automated fermentation systems have become a useful tool in early bioprocess development. In this study, we investigated a 24 x 15 mL single use microbioreactor system, ambr 15f, designed for microbial culture. We compared the fed‐batch growth and production capabilities of this system for two Escherichia coli strains, BL21 (DE3) and MC4100, and two industrially relevant molecules, hGH and scFv. In addition, different carbon sources were tested using bolus, linear or exponential feeding strategies, showing the capacity of the ambr 15f system to handle automated feeding. We used power per unit volume (P/V) as a scale criterion to compare the ambr 15f with 1 L stirred bioreactors which were previously scaled‐up to 20 L with a different biological system, thus showing a potential 1,300 fold scale comparability in terms of both growth and product yield. By exposing the cells grown in the ambr 15f system to a level of shear expected in an industrial centrifuge, we determined that the cells are as robust as those from a bench scale bioreactor. These results provide evidence that the ambr 15f system is an efficient high throughput microbial system that can be used for strain and molecule selection as well as rapid scale‐up. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 34:58–68, 2018 PMID:28748655
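The constant-P/V scale criterion mentioned above can be sketched with the standard stirred-tank power draw relation (a textbook correlation under geometric similarity, not the authors' exact procedure; all numbers are illustrative, not taken from the ambr 15f study):

```python
# Constant power-per-volume scale translation for stirred tanks.
# Assumes the ungassed, turbulent relation P = Np * rho * N**3 * D**5
# and geometric similarity, so the filled volume V scales as D**3.

def power_per_volume(Np, rho, N, D, V):
    """P/V in W/m^3 for an impeller of power number Np at speed N (1/s)."""
    return Np * rho * N**3 * D**5 / V

def matched_speed(N1, D1, D2):
    """Impeller speed at the new scale that keeps P/V constant.

    From N**3 * D**2 = const (since P/V ~ N**3 * D**2)."""
    return N1 * (D1 / D2) ** (2.0 / 3.0)

# Translating 20 1/s at a 2 cm impeller to a 20 cm impeller:
N2 = matched_speed(20.0, 0.02, 0.2)
print(round(N2, 2))  # 4.31
```

Matching P/V in this way is what makes growth and yield data from milliliter-scale runs comparable to liter-scale bioreactors.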
New high-throughput measurement systems for radioactive wastes segregation and free release.
Suran, J; Kovar, P; Smoldasova, J; Solc, J; Skala, L; Arnold, D; Jerome, S; de Felice, P; Pedersen, B; Bogucarska, T; Tzika, F; van Ammel, R
2017-12-01
This paper addresses two measurement facilities based on a single standardized concept characterized by unique, patented lead-free shielding: a segregation measurement system for pre-selecting waste materials prior to measurement for repository acceptance or possible free release, and a free release measurement system. The key objective is to improve the throughput, accuracy, reliability, modularity and mobility of segregation and free-release measurement. This will result in more reliable decision-making with regard to the safe release and disposal of radioactive wastes into the environment, with positive economic outcomes. The research was carried out within the "Metrology for Decommissioning Nuclear Facilities" (MetroDecom) project. Copyright © 2017 Elsevier Ltd. All rights reserved.
High throughput computing: a solution for scientific analysis
O'Donnell, M.
2011-01-01
handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).
The path of least resistance: is there a better route?
Loree, Ann; Maihack, Marcia; Powell, Marge
2003-01-01
In May 2000, the radiology department at Stanford University Medical Center embarked on a five-year journey toward complete digitization. While the end goal was known, there was much less certainty about the steps involved along the way. Stanford worked with a team from GE Medical Systems to implement Six Sigma process improvement methodologies and related change management techniques. The methodical and evidence-based framework of Six Sigma significantly organized the process of "going digital" by breaking it into manageable projects with clear objectives. Stanford identified five key areas where improvement could be made: MR outpatient throughput, CT inpatient throughput, CT outpatient throughput, report turnaround time, and Lucile Packard Children's Hospital CR/Ortho throughput and digitization. The CT project is presented in this article. Although labor intensive, collecting radiology data manually is often the best way to obtain the level of detail required, unless there is a robust RIS in place with solid data integrity. To gather the necessary information without unduly impacting staff and workflow at Stanford, the consultants working onsite handled the actual observation and recording of data. Some of the changes introduced through Six Sigma may appear, at least on the surface, to be common sense. It is only by presenting clear evidence in terms of data, however, that the improvements can actually be implemented and accepted. By converting all appointments to 30 minutes and expanding hours of operation, Stanford was able to boost diagnostic imaging productivity, volume and revenue. With the ability to scan over lunch breaks and rest periods, potential appointment capacity increased by 140 CT scans per month. Overall, the CT project increased potential for outpatient appointment capacity by nearly 75% and projected over $1.5 million in additional annual gross revenue. 
The complex process of moving toward a digital radiology department at Stanford demonstrates that healthcare cannot be healed by technology alone. The ability to optimize patient services revolves around a combination of leading edge technology, dedicated and well-trained staff, and careful examination of processes and productivity.
NASA Technical Reports Server (NTRS)
Haines, Richard F.
1990-01-01
As telescience systems become more and more complex, autonomous, and opaque to their operators it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human performance measurement issues are addressed as they relate to total system validation. The assumption is made that human interaction with the automated system will be required well into the Space Station Freedom era. Candidate human performance measurement-validation techniques are discussed for selected ground-to-space-to-ground and space-to-space situations. Most of these measures may be used in conjunction with an information throughput model presented elsewhere (Haines, 1990). Teleoperations, teleanalysis, teleplanning, teledesign, and teledocumentation are considered, as are selected illustrative examples of space related telescience activities.
Identification of functional modules using network topology and high-throughput data.
Ulitsky, Igor; Shamir, Ron
2007-01-26
With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values, (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in analysis of high throughput data.
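The framework described above can be sketched in miniature (our illustration, not the authors' algorithm, which uses more sophisticated module scoring): compute pairwise expression similarity, retain only network edges whose endpoints are highly similar, and report the connected components as candidate modules.

```python
# Toy version of similarity-guided module finding on a gene network.
# edges: list of (gene, gene) interactions; expression: gene -> profile.

def pearson(x, y):
    """Pairwise similarity of two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def modules(edges, expression, cutoff=0.8):
    # Keep only edges whose endpoints show similar expression.
    keep = [(u, v) for u, v in edges
            if pearson(expression[u], expression[v]) >= cutoff]
    # Union-find over the retained edges to get connected sub-networks.
    parent = {}
    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for u, v in keep:
        parent[find(u)] = find(v)
    comps = {}
    for u, v in keep:
        for node in (u, v):
            comps.setdefault(find(node), set()).add(node)
    return list(comps.values())
```

A real implementation scores modules statistically rather than thresholding edge by edge, but the integration idea, topology and high-throughput similarity constraining each other, is the same.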
Natarajan, A; Molnar, P; Sieverdes, K; Jamshidi, A; Hickman, J J
2006-04-01
The threat of environmental pollution, biological warfare agent dissemination and new diseases in recent decades has increased research into cell-based biosensors. The creation of this class of sensors could specifically aid the detection of toxic chemicals and their effects in the environment, such as pyrethroid pesticides. Pyrethroids are synthetic pesticides that have been used increasingly over the last decade to replace other pesticides like DDT. In this study we used a high-throughput method to detect pyrethroids by using multielectrode extracellular recordings from cardiac cells. The data from this cell-electrode hybrid system were compared to published results obtained with patch-clamp electrophysiology and also used as an alternative method to further understand pyrethroid effects. Our biosensor consisted of a confluent monolayer of cardiac myocytes cultured on microelectrode arrays (MEA) composed of 60 substrate-integrated electrodes. Spontaneous activity of these beating cells produced extracellular field potentials in the range of 100 microV to nearly 1200 microV with a beating frequency of 0.5-4 Hz. All of the tested pyrethroids (alpha-Cypermethrin, Tetramethrin and Tefluthrin) produced similar changes in the electrophysiological properties of the cardiac myocytes, namely reduced beating frequency and amplitude. The sensitivity of our toxin detection method was comparable to earlier patch-clamp studies, which indicates that, in specific applications, high-throughput extracellular methods can replace single-cell studies. Moreover, the similar effect of all three pyrethroids on the measured parameters suggests that not only detection of the toxins but also their classification might be possible with this method. Overall, our results support the idea that whole cell biosensors might be viable alternatives to current toxin detection methods.
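A minimal sketch of how beating frequency can be read out of such an extracellular trace (our illustration, not the authors' analysis pipeline; the 100 microvolt threshold is chosen from the signal range quoted above):

```python
# Estimate the beating frequency of a cardiac monolayer from an
# extracellular field potential trace by counting upward threshold
# crossings, then dividing by the recording duration.

def beat_frequency_hz(trace_uv, sample_rate_hz, threshold_uv=100.0):
    beats = sum(1 for prev, cur in zip(trace_uv, trace_uv[1:])
                if prev < threshold_uv <= cur)
    return beats / (len(trace_uv) / sample_rate_hz)

# Synthetic 5 s trace sampled at 1 kHz with a 500 uV spike every 0.5 s:
trace = [0.0] * 5000
for i in range(250, 5000, 500):
    trace[i] = 500.0
print(beat_frequency_hz(trace, 1000.0))  # 2.0
```

A drop in this frequency, or in the crossing amplitudes, is exactly the kind of parameter shift the study uses to flag pyrethroid exposure.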
WDM mid-board optics for chip-to-chip wavelength routing interconnects in the H2020 ICT-STREAMS
NASA Astrophysics Data System (ADS)
Kanellos, G. T.; Pleros, N.
2017-02-01
Multi-socket server boards have emerged to increase the processing power density at the board level and further flatten data center networks beyond leaf-spine architectures. Scaling the number of processors per board, however, challenges current electronic technologies, as it requires high bandwidth interconnects and high throughput switches with an increased number of ports that are currently unavailable. On-board optical interconnection has proved its potential to efficiently satisfy the bandwidth needs, but its use has been limited to parallel links without any smart routing functionality. With CWDM optical interconnects already a commodity, cyclical wavelength routing, proposed to suit datacom rack-to-rack and board-to-board communication, now becomes a promising on-board routing platform. ICT-STREAMS is a European research project that aims to combine WDM parallel on-board transceivers with a cyclical AWGR in order to create a new board-level, chip-to-chip interconnection paradigm that leverages WDM parallel transmission into a powerful wavelength routing platform capable of interconnecting multiple processors with unprecedented bandwidth and throughput capacity. Direct, any-to-any, on-board interconnection of multiple processors will significantly contribute to further flattening data centers and facilitating east-west communication. In the present communication, we present the ICT-STREAMS on-board wavelength routing architecture for multiple chip-to-chip interconnections and evaluate the overall system performance in terms of throughput and latency for several schemes and traffic profiles. We also review recent advances in the ICT-STREAMS platform's key-enabling technologies, which span from Si in-plane lasers and polymer-based electro-optical circuit boards to silicon photonics transceivers and photonic-crystal amplifiers.
Williams, James A; Eddleman, Laura; Pantone, Amy; Martinez, Regina; Young, Stephen; Van Der Pol, Barbara
2014-08-01
Next-generation diagnostics for Chlamydia trachomatis and Neisseria gonorrhoeae are available on semi- or fully-automated platforms. These systems require less hands-on time than older platforms and are user friendly. Four automated systems, the ABBOTT m2000 system, Becton Dickinson Viper System with XTR Technology, Gen-Probe Tigris DTS system, and Roche cobas 4800 system, were evaluated for total run time, hands-on time, and walk-away time. All of the systems evaluated in this time-motion study were able to complete a diagnostic test run within an 8-h work shift, instrument setup and operation were straightforward and uncomplicated, and walk-away time ranged from approximately 90 to 270 min in a head-to-head comparison of each system. All of the automated systems provide technical staff with increased time to perform other tasks during the run, offer easy expansion of the diagnostic test menu, and have the ability to increase specimen throughput. © 2013 Society for Laboratory Automation and Screening.
Optimizing multi-dimensional high throughput screening using zebrafish
Truong, Lisa; Bugel, Sean M.; Chlebowski, Anna; Usenko, Crystal Y.; Simonich, Michael T.; Massey Simonich, Staci L.; Tanguay, Robert L.
2016-01-01
The use of zebrafish for high throughput screening (HTS) for chemical bioactivity assessments is becoming routine in the fields of drug discovery and toxicology. Here we report current recommendations from our experiences in zebrafish HTS. We compared the effects of different high throughput chemical delivery methods on nominal water concentration, chemical sorption to multi-well polystyrene plates, transcription responses, and resulting whole animal responses. We demonstrate that digital dispensing consistently yields higher data quality and reproducibility compared to standard plastic tip-based liquid handling. Additionally, we illustrate the challenges in using this sensitive model for chemical assessment when test chemicals have trace impurities. Adaptation of these better practices for zebrafish HTS should increase reproducibility across laboratories. PMID:27453428
Combinatorial and high-throughput approaches in polymer science
NASA Astrophysics Data System (ADS)
Zhang, Huiqi; Hoogenboom, Richard; Meier, Michael A. R.; Schubert, Ulrich S.
2005-01-01
Combinatorial and high-throughput approaches have become topics of great interest in the last decade due to their potential ability to significantly increase research productivity. Recent years have witnessed a rapid extension of these approaches in many areas of the discovery of new materials including pharmaceuticals, inorganic materials, catalysts and polymers. This paper mainly highlights our progress in polymer research by using an automated parallel synthesizer, microwave synthesizer and ink-jet printer. The equipment and methodologies in our experiments, the high-throughput experimentation of different polymerizations (such as atom transfer radical polymerization, cationic ring-opening polymerization and emulsion polymerization) and the automated matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) sample preparation are described.
A high performance hardware implementation image encryption with AES algorithm
NASA Astrophysics Data System (ADS)
Farmani, Ali; Jafari, Mohamad; Miremadi, Seyed Sohrab
2011-06-01
This paper describes the implementation of a high-speed encryption algorithm with high throughput for encrypting images. We select the highly secure symmetric key encryption algorithm AES (Advanced Encryption Standard) and increase its speed and throughput using a four-stage pipeline, a control unit based on logic gates, an optimized design of the multiplier blocks in the MixColumns phase, and simultaneous generation of keys and rounds. This procedure makes AES suitable for fast image encryption. A 128-bit AES was implemented on an Altera FPGA, with the following results: a throughput of 6 Gbps at 471 MHz. Encrypting a 32×32 test image takes 1.15 ms.
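The quoted figures are mutually consistent under one assumption we add here (an initiation interval of ten cycles per block, i.e. roughly one AES-128 round per clock, which the abstract does not state):

```python
# Pipeline throughput check: one 128-bit block accepted every
# `cycles_per_block` clock cycles at the reported clock frequency.
block_bits = 128
clock_hz = 471e6
cycles_per_block = 10   # assumption: ~one AES-128 round per cycle

throughput_bps = block_bits * clock_hz / cycles_per_block
print(f"{throughput_bps / 1e9:.2f} Gbps")  # 6.03 Gbps
```

A fully unrolled pipeline accepting a block every cycle would instead give 128 bits x 471 MHz, about 60 Gbps, so the reported 6 Gbps suggests a round-iterated rather than fully unrolled datapath.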
Pediatric Glioblastoma Therapies Based on Patient-Derived Stem Cell Resources
2014-11-01
genomic DNA and then subjected to Illumina high-throughput sequencing. In this analysis, shRNAs lost in the GSC population represent candidate gene...PRISM 7900 Sequence Detection System (Genomics Resource, FHCRC). Relative transcript abundance was analyzed using the 2−ΔΔCt method. TRIzol (Invitrogen
Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter
2015-01-01
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
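As one generic example of the robustness measures such shared-HPC workflows need (our sketch, not code from the paper or its supporting files), a flaky workflow step can be retried with exponential backoff before the whole pipeline is failed:

```python
# Retry wrapper for one step of an automated processing workflow.
# Transient failures (node hiccups, filesystem stalls) get a few
# retries with exponential backoff; persistent failures abort loudly.
import subprocess
import time

def run_step(cmd, retries=3, base_delay_s=10.0):
    """Run one workflow step, retrying transient failures."""
    for attempt in range(retries + 1):
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode == 0:
            return result
        if attempt < retries:
            time.sleep(base_delay_s * 2 ** attempt)  # back off, then retry
    raise RuntimeError(f"step failed after {retries + 1} attempts: {cmd}")
```

Wrapping every pipeline stage this way is one of the simpler "tricks" that keeps a multi-day exome workflow from dying on a single transient error.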
On Data Transfers Over Wide-Area Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang
Dedicated wide-area network connections are employed in big data and high-performance computing scenarios, since the absence of cross-traffic promises to make it easier to analyze and optimize data transfers over them. However, nonlinear transport dynamics and end-system complexity due to multi-core hosts and distributed file systems make these tasks surprisingly challenging. We present an overview of methods to analyze memory and disk file transfers using extensive measurements over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory transfers, we derive performance profiles of TCP and UDT throughput as a function of RTT, which show concave regions in contrast to entirely convex regions predicted by previous models. These highly desirable concave regions can be expanded by utilizing large buffers and more parallel flows. We also present Poincaré maps and Lyapunov exponents of TCP and UDT throughput traces that indicate complex throughput dynamics. For disk file transfers, we show that throughput can be optimized using a combination of parallel I/O and network threads under direct I/O mode. Our initial throughput measurements of Lustre filesystems mounted over long-haul connections using LNet routers show convex profiles indicative of I/O limits.
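The role of large buffers and parallel flows noted above can be illustrated with the standard bandwidth-delay product sizing rule (a textbook relation, not the authors' model; the 10 Gbps rate and 366 ms RTT come from the abstract, while the 64 MiB per-flow buffer is an assumed example):

```python
# Bandwidth-delay product (BDP) sizing for a long-haul dedicated link.
import math

def bdp_bytes(rate_bps, rtt_s):
    """Bytes that must be in flight to keep a link of this rate full."""
    return rate_bps * rtt_s / 8

def min_parallel_flows(rate_bps, rtt_s, per_flow_buffer_bytes):
    """Flows needed when each flow's window is capped by its buffer."""
    return math.ceil(bdp_bytes(rate_bps, rtt_s) / per_flow_buffer_bytes)

# 10 Gbps at the maximum measured 366 ms RTT:
print(bdp_bytes(10e9, 0.366) / 1e6)                  # 457.5 (MB in flight)
print(min_parallel_flows(10e9, 0.366, 64 * 2**20))   # 7 flows at 64 MiB each
```

When the aggregate window across flows covers the BDP, the connection stays full at high RTT, which is one mechanism behind the expanded concave regions the measurements reveal.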
Re-engineering adenovirus vector systems to enable high-throughput analyses of gene function.
Stanton, Richard J; McSharry, Brian P; Armstrong, Melanie; Tomasec, Peter; Wilkinson, Gavin W G
2008-12-01
With the enhanced capacity of bioinformatics to interrogate extensive banks of sequence data, more efficient technologies are needed to test gene function predictions. Replication-deficient recombinant adenovirus (Ad) vectors are widely used in expression analysis since they provide for extremely efficient expression of transgenes in a wide range of cell types. To facilitate rapid, high-throughput generation of recombinant viruses, we have re-engineered an adenovirus vector (designated AdZ) to allow single-step, directional gene insertion using recombineering technology. Recombineering allows for direct insertion into the Ad vector of PCR products, synthesized sequences, or oligonucleotides encoding shRNAs without requirement for a transfer vector. Vectors were optimized for high-throughput applications by making them "self-excising" through incorporation of the I-SceI homing endonuclease into the vector, removing the need to linearize vectors prior to transfection into packaging cells. AdZ vectors allow genes to be expressed in their native form or with strep, V5, or GFP tags. Insertion of tetracycline operators downstream of the human cytomegalovirus major immediate early (HCMV MIE) promoter permits silencing of transgenes in helper cells expressing the tet repressor, thus making the vector compatible with the cloning of toxic gene products. The AdZ vector system is robust, straightforward, and suited to both sporadic and high-throughput applications.
Mock, Andreas; Chiblak, Sara; Herold-Mende, Christel
2014-01-01
A growing body of evidence suggests that glioma stem cells (GSCs) account for tumor initiation, therapy resistance, and the subsequent regrowth of gliomas. Thus, continuous efforts have been undertaken to further characterize this subpopulation of less differentiated tumor cells. Although we are able to enrich GSCs, we still lack a comprehensive understanding of GSC phenotypes and behavior. The advent of high-throughput technologies raised hope that incorporation of these newly developed platforms would help to tackle such questions. Since then a couple of comparative genome-, transcriptome- and proteome-wide studies on GSCs have been conducted giving new insights in GSC biology. However, lessons had to be learned in designing high-throughput experiments and some of the resulting conclusions fell short of expectations because they were performed on only a few GSC lines or at one molecular level instead of an integrative poly-omics approach. Despite these shortcomings, our knowledge of GSC biology has markedly expanded due to a number of survival-associated biomarkers as well as glioma-relevant signaling pathways and therapeutic targets being identified. In this article we review recent findings obtained by comparative high-throughput analyses of GSCs. We further summarize fundamental concepts of systems biology as well as its applications for glioma stem cell research.
Artificial intelligence and robotics in high throughput post-genomics.
Laghaee, Aroosha; Malcolm, Chris; Hallam, John; Ghazal, Peter
2005-09-15
The shift of post-genomics towards a systems approach has offered an ever-increasing role for artificial intelligence (AI) and robotics. Many disciplines (e.g. engineering, robotics, computer science) bear on the problem of automating the different stages involved in post-genomic research with a view to developing quality assured high-dimensional data. We review some of the latest contributions of AI and robotics to this end and note the limitations arising from the current independent, exploratory way in which specific solutions are being presented for specific problems without regard to how these could be eventually integrated into one comprehensible integrated intelligent system.
Nikcevic, Irena; Piruska, Aigars; Wehmeyer, Kenneth R; Seliskar, Carl J; Limbach, Patrick A; Heineman, William R
2010-08-01
Parallel separations using CE on a multilane microchip with multiplexed LIF detection is demonstrated. The detection system was developed to simultaneously record data on all channels using an expanded laser beam for excitation, a camera lens to capture emission, and a CCD camera for detection. The detection system enables monitoring of each channel continuously and distinguishing individual lanes without significant crosstalk between adjacent lanes. Multiple analytes can be determined in parallel lanes within a single microchip in a single run, leading to increased sample throughput. The pK(a) determination of small molecule analytes is demonstrated with the multilane microchip.
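One standard way CE mobility data yield pKa values (a textbook monoprotic-acid model fitted by a simple grid search, shown here as an illustration rather than the authors' exact procedure):

```python
# pKa from CE effective mobility measured across a set of buffer pHs.
# For a monoprotic acid HA, the effective mobility follows
#   mu_eff(pH) = mu_anion / (1 + 10**(pKa - pH)),
# where mu_anion is the mobility of the fully deprotonated form.

def mu_eff(ph, pka, mu_anion):
    return mu_anion / (1.0 + 10.0 ** (pka - ph))

def fit_pka(ph_values, mobilities, mu_anion):
    """Grid-search the pKa that best reproduces the measured mobilities."""
    best_pka, best_err = None, float("inf")
    for i in range(200, 1200):            # scan pKa from 2.00 to 11.99
        pka = i / 100.0
        err = sum((mu_eff(ph, pka, mu_anion) - m) ** 2
                  for ph, m in zip(ph_values, mobilities))
        if err < best_err:
            best_pka, best_err = pka, err
    return best_pka
```

Running the same fit in every lane of the multilane chip is what turns a routine single-analyte measurement into a parallel pKa screen.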