NASA Astrophysics Data System (ADS)
Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2018-02-01
Biophysical properties of cells can complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). These phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy, at >90% in the G1 and G2 phases and >80% in the S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
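The classification pipeline described above (multi-parameter biophysical phenotypes, t-SNE embedding, discriminant-analysis prediction of phase labels) can be sketched as follows. This is a minimal illustration with synthetic data, and linear discriminant analysis is used as a stand-in for the MANOVA discriminant analysis; none of the names or numbers come from the study.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic placeholders: 1000 cells x 24 biophysical phenotypes, with
# fluorescence-derived cell-cycle labels as ground truth.
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 24))
phase = rng.choice(["G1", "S", "G2"], size=1000)

# 2-D t-SNE embedding for visualising cell-cycle progression.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

# Label-free phase prediction; LDA stands in for MANOVA discriminant analysis.
acc = cross_val_score(LinearDiscriminantAnalysis(), features, phase, cv=5)
print("embedding shape:", embedding.shape)
print("cross-validated phase-prediction accuracy:", round(acc.mean(), 2))
```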
Throughput increase by adjustment of the BARC drying time with coat track process
NASA Astrophysics Data System (ADS)
Brakensiek, Nickolas L.; Long, Ryan
2005-05-01
Throughput of a coater module within the coater track is related to the solvent evaporation rate from the material being coated. The evaporation rate is controlled by the spin dynamics of the wafer and the airflow dynamics over the wafer, and balancing these effects is the key to achieving very uniform coatings across a flat unpatterned wafer. As today's coat tracks are pushed to higher throughputs to match the scanner, the coat module throughput must be increased as well. For chemical manufacturers, the evaporation rate of the material depends on the solvent used; one measure of relative evaporation rates is to compare the flash points of the solvents. The lower the flash point, the quicker the solvent will evaporate. It is possible to formulate products with these volatile solvents, although at a price: shipping and manufacturing a more flammable product increases the risk of fire, thereby increasing insurance premiums, and the end user of these chemicals has to take extra precautions in the fab and in storage. An alternative coat process is possible that allows higher throughput in a given coat module without sacrificing safety. The tradeoff is a more complicated coat process and a higher-viscosity chemical. The coat process exploits the fact that the evaporation rate depends on the spin dynamics of the wafer, using a series of spin speeds that first sets the thickness of the material, followed by a high spin speed to remove the residual solvent. This new process can yield a throughput of over 150 wafers per hour (wph) given two coat modules. Thickness uniformity remains excellent at less than 2 nm (3 sigma), while drying times shorter than 10 seconds achieve the 150 wph throughput target.
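A rough sketch of the throughput arithmetic behind the 150 wph figure with two coat modules is given below; only the sub-10-second drying step comes from the abstract, while the other step times are illustrative assumptions.

```python
# Coater-module throughput estimate: wafers per hour for n identical modules
# operating in parallel, given a per-wafer cycle time in one module.

def track_throughput_wph(dispense_s, spread_s, dry_s, handoff_s, n_modules):
    cycle_s = dispense_s + spread_s + dry_s + handoff_s   # per-wafer time in one module
    return 3600.0 / cycle_s * n_modules

# Hypothetical step times (s); only the <10 s drying step is taken from the text.
wph = track_throughput_wph(dispense_s=3, spread_s=15, dry_s=9, handoff_s=18, n_modules=2)
print(f"estimated throughput: {wph:.0f} wph")   # ~160 wph, consistent with the >150 wph claim
```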
High-throughput assay for optimising microbial biological control agent production and delivery
USDA-ARS?s Scientific Manuscript database
Lack of technologies to produce and deliver effective biological control agents (BCAs) is a major barrier to their commercialization. A myriad of variables associated with BCA cultivation, formulation, drying, storage, and reconstitution processes complicates agent quality maximization. An efficie...
Leaders in Future and Current Technology Teaming Up to Improve Ethanol
…and NREL expertise to: develop improvements in process throughput and water management for dry mill plants, and complete an overall process engineering model of the dry mill technology that identifies new ways to improve the operation of "dry mill" plants that currently produce ethanol from corn starch.
Delta-Doping at Wafer Level for High Throughput, High Yield Fabrication of Silicon Imaging Arrays
NASA Technical Reports Server (NTRS)
Hoenk, Michael E. (Inventor); Nikzad, Shouleh (Inventor); Jones, Todd J. (Inventor); Greer, Frank (Inventor); Carver, Alexander G. (Inventor)
2014-01-01
Systems and methods for producing high-quantum-efficiency silicon devices are described. A silicon MBE system has a preparation chamber that provides for cleaning silicon surfaces using an oxygen plasma to remove impurities, together with a gaseous (dry) NH3 + NF3 room-temperature oxide-removal process that leaves the silicon surface hydrogen-terminated. Devices can be fabricated on silicon wafers up to 8 inches in diameter using these cleaning procedures and MBE processing, including delta doping.
Pharmaceutical spray drying: solid-dose process technology platform for the 21st century.
Snyder, Herman E
2012-07-01
Requirements for precise control of solid-dosage particle properties, created with a scalable process technology, continue to expand in the pharmaceutical industry. Alternate methods of drug delivery, limited active drug substance solubility and the need to improve drug product stability under room-temperature conditions are some of the pharmaceutical applications that can benefit from spray-drying technology. Used widely for decades in other industries with production rates up to several tons per hour, pharmaceutical uses for spray drying are expanding beyond excipient production and solvent removal from crystalline material. Creation of active pharmaceutical-ingredient particles with combinations of unique target properties is now more common. This review of spray-drying technology fundamentals provides a brief perspective on the internal process 'mechanics', which combine with both the liquid and solid properties of a formulation to enable high-throughput, continuous manufacturing of powders with precisely controlled properties.
Introducing Discrete Frequency Infrared Technology for High-Throughput Biofluid Screening
NASA Astrophysics Data System (ADS)
Hughes, Caryn; Clemens, Graeme; Bird, Benjamin; Dawson, Timothy; Ashton, Katherine M.; Jenkinson, Michael D.; Brodbelt, Andrew; Weida, Miles; Fotheringham, Edeline; Barre, Matthew; Rowlette, Jeremy; Baker, Matthew J.
2016-02-01
Accurate early diagnosis is critical to patient survival, management and quality of life. Biofluids are key to early diagnosis due to their ease of collection and intimate involvement in human function. Large-scale mid-IR imaging of dried fluid deposits offers a high-throughput molecular analysis paradigm for the biomedical laboratory. The exciting advent of tuneable quantum cascade lasers allows the collection of discrete frequency infrared data on clinically relevant timescales. By scanning targeted frequencies, spectral quality, reproducibility and diagnostic potential can be maintained while significantly reducing acquisition time and processing requirements, sampling 16 serum spots with 0.6, 5.1 and 15% relative standard deviation (RSD) for 199, 14 and 9 discrete frequencies, respectively. We use this reproducible methodology to show proof-of-concept rapid diagnostics: 40 unique dried liquid biopsies from brain, breast, lung and skin cancer patients were classified in 2.4 cumulative seconds against 10 non-cancer controls with accuracies of up to 90%.
Advancing microwave technology for dehydration processing of biologics.
Cellemme, Stephanie L; Van Vorst, Matthew; Paramore, Elisha; Elliott, Gloria D
2013-10-01
Our prior work has shown that microwave processing can be effective as a method for dehydrating cell-based suspensions in preparation for anhydrous storage, yielding homogeneous samples with predictable and reproducible drying times. In the current work, an optimized microwave-based drying process was developed that expands upon this previous proof-of-concept. Utilization of a commercial microwave (CEM SAM 255, Matthews, NC) enabled continuous drying at variable low power settings. A new turntable was manufactured from ultra-high-molecular-weight polyethylene (UHMW-PE; Grainger, Lake Forest, IL) to provide for drying of up to 12 samples at a time. The new process enabled rapid and simultaneous drying of multiple samples in containment devices suitable for long-term storage and aseptic rehydration of the sample. To determine sample repeatability and consistency of drying within the microwave cavity, a concentration series of aqueous trehalose solutions was dried for specific intervals and the water content assessed using Karl Fischer titration at the end of each processing period. Samples were dried on Whatman S-14 conjugate release filters (Whatman, Maidstone, UK), a glass fiber membrane currently used in clinical laboratories. The filters were cut to size for use in a 13 mm Swinnex® syringe filter holder (Millipore™, Billerica, MA). Samples of 40 μL volume could be dehydrated to the equilibrium moisture content by continuous processing at 20% power with excellent sample-to-sample repeatability. The microwave-assisted procedure enabled high-throughput, repeatable drying of multiple samples, in a manner easily adaptable for drying a wide array of biological samples. Depending on the tolerance for sample heating, the drying time can be altered by changing the power level of the microwave unit.
High-Throughput Fabrication of Flexible and Transparent All-Carbon Nanotube Electronics.
Chen, Yong-Yang; Sun, Yun; Zhu, Qian-Bing; Wang, Bing-Wei; Yan, Xin; Qiu, Song; Li, Qing-Wen; Hou, Peng-Xiang; Liu, Chang; Sun, Dong-Ming; Cheng, Hui-Ming
2018-05-01
This study reports a simple and effective technique for the high-throughput fabrication of flexible all-carbon nanotube (CNT) electronics using a photosensitive dry film instead of traditional liquid photoresists. A 10 in. photosensitive dry film is laminated onto a flexible substrate by a roll-to-roll technology, and a 5 µm pattern resolution of the resulting CNT films is achieved for the construction of flexible and transparent all-CNT thin-film transistors (TFTs) and integrated circuits. The fabricated TFTs exhibit desirable electrical performance, including an on-off current ratio of more than 10⁵, a carrier mobility of 33 cm² V⁻¹ s⁻¹, and a small hysteresis. The standard deviations of on-current and mobility are, respectively, 5% and 2% of the average value, demonstrating the excellent reproducibility and uniformity of the devices, which allows the construction of an inverter circuit with a large noise margin and a voltage gain of 30. This study indicates that a photosensitive dry film is very promising for the low-cost, fast, reliable, and scalable fabrication of flexible and transparent CNT-based integrated circuits, and opens up opportunities for future high-throughput CNT-based printed electronics.
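For context, a saturation-regime carrier mobility such as the 33 cm² V⁻¹ s⁻¹ figure is typically extracted from a transfer curve as sketched below; the channel geometry, gate capacitance and synthetic I-V data are illustrative assumptions, not device values from the paper.

```python
import numpy as np

def mobility_sat_cm2(v_gs, i_d, width_m, length_m, c_i_F_per_m2):
    """Saturation mobility from a TFT transfer curve.

    In saturation I_D = (W / 2L) * mu * C_i * (V_GS - V_TH)^2, so a linear fit of
    sqrt(I_D) vs V_GS over the on-region gives mu = 2L / (W * C_i) * slope^2 (SI),
    returned here in cm^2 V^-1 s^-1.
    """
    slope, _intercept = np.polyfit(v_gs, np.sqrt(i_d), 1)
    return 2.0 * length_m / (width_m * c_i_F_per_m2) * slope**2 * 1e4

# Illustrative assumptions (not device values from the paper):
W, L, C_i, v_th, mu_true = 100e-6, 10e-6, 1.5e-4, 1.0, 30.0e-4   # SI units
v_gs = np.linspace(2.0, 10.0, 9)                                 # gate voltages in saturation (V)
i_d = (W / (2 * L)) * mu_true * C_i * (v_gs - v_th) ** 2         # synthetic drain current (A)

print(f"mu_sat = {mobility_sat_cm2(v_gs, i_d, W, L, C_i):.1f} cm^2/Vs")   # recovers ~30
```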
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
Selective phenotyping traits related to multiple stress and drought response in dry bean
USDA-ARS?s Scientific Manuscript database
Dry bean (Phaseolus vulgaris L.) tolerance to stressful environments is not well understood. Moreover, the increasing population sizes necessary for improving the genomic resolution of QTL conditioning stress response have made it difficult for phenotyping to keep pace with high-throughput genotyping. ...
Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc
2010-07-01
We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 years) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore cost, of generating DNA sequences from mushrooms and other fungi relative to traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field.
A High-Throughput Process for the Solid-Phase Purification of Synthetic DNA Sequences
Grajkowski, Andrzej; Cieślak, Jacek; Beaucage, Serge L.
2017-01-01
An efficient process for the purification of synthetic phosphorothioate and native DNA sequences is presented. The process is based on the use of an aminopropylated silica gel support functionalized with aminooxyalkyl functions to enable capture of DNA sequences through an oximation reaction with the keto function of a linker conjugated to the 5′-terminus of DNA sequences. Deoxyribonucleoside phosphoramidites carrying this linker, as a 5′-hydroxyl protecting group, have been synthesized for incorporation into DNA sequences during the last coupling step of a standard solid-phase synthesis protocol executed on a controlled pore glass (CPG) support. Solid-phase capture of the nucleobase- and phosphate-deprotected DNA sequences released from the CPG support is demonstrated to proceed near quantitatively. Shorter than full-length DNA sequences are first washed away from the capture support; the solid-phase purified DNA sequences are then released from this support upon reaction with tetra-n-butylammonium fluoride in dry dimethylsulfoxide (DMSO) and precipitated in tetrahydrofuran (THF). The purity of solid-phase-purified DNA sequences exceeds 98%. The simulated high-throughput and scalability features of the solid-phase purification process are demonstrated without sacrificing purity of the DNA sequences. PMID:28628204
Rapid high throughput amylose determination in freeze dried potato tuber samples
USDA-ARS?s Scientific Manuscript database
Approximately 80% of the fresh weight of a potato tuber is water; nearly all of the remaining dry matter is starch. Most of the starch (70%) is composed of amylopectin, while the remainder is amylose. The ratio between amylose and amylopectin is the most important property influencing the physical p...
A high-throughput core sampling device for the evaluation of maize stalk composition
2012-01-01
Background A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen maintaining the composition of the biomass sample for relatively long periods of time. Conclusions The CSD has been demonstrated to successfully produce homogeneous stalk core samples in a repeatable manner with a throughput substantially superior to the currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameter evaluated, it is expected that the CSD will have utility for other bioenergy crops as well. PMID:22548834
Ortiz, Diego; Litvin, Alexander G; Salas Fernandez, Maria G
2018-01-01
The development of high-yielding crops with drought tolerance is necessary to increase food, feed, fiber and fuel production. Methods that create similar environmental conditions for a large number of genotypes are essential to investigate plant responses to drought in gene discovery studies. Modern facilities that control water availability for each plant remain cost-prohibitive for some sections of the research community. We present an alternative cost-effective automated irrigation system scalable for a high-throughput and controlled dry-down treatment of plants. This system was tested in sorghum using two experiments. First, four genotypes were subjected to ten days of dry-down to achieve three final Volumetric Water Content (VWC) levels: drought (0.10 and 0.20 m³ m⁻³) and control (0.30 m³ m⁻³). The final average VWC was 0.11, 0.22, and 0.31 m³ m⁻³, respectively, and significant differences in biomass accumulation were observed between control and drought treatments. Second, 42 diverse sorghum genotypes were subjected to a seven-day dry-down treatment for a final drought stress of 0.15 m³ m⁻³ VWC. The final average VWC was 0.17 m³ m⁻³, and plants presented significant differences in photosynthetic rate during the drought period. These results demonstrate that cost-effective automation systems can successfully control substrate water content for each plant, to accurately compare their phenotypic responses to drought, and be scaled up for high-throughput phenotyping studies.
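The irrigation logic described above comes down to a simple per-pot control rule: during a dry-down, water is added only when a pot's measured volumetric water content (VWC) falls below its target. A minimal sketch follows; the pot volumes, sensor readings and target are illustrative assumptions, not values from the study.

```python
# Per-pot dry-down control: raise substrate VWC back up to the target only if the
# measurement has fallen below it; otherwise let the pot continue drying.

def water_to_add_ml(measured_vwc, target_vwc, substrate_volume_ml):
    """Water volume (mL) needed to bring VWC (m3/m3) up to the target, never negative."""
    deficit = max(0.0, target_vwc - measured_vwc)
    return deficit * substrate_volume_ml   # 1 m3 of water per m3 of substrate = 1 mL per mL

# One irrigation cycle over a few hypothetical pots during a dry-down to 0.15 m3/m3:
pots = {"pot_01": 0.21, "pot_02": 0.14, "pot_03": 0.17}   # measured VWC readings
for pot, vwc in pots.items():
    print(pot, round(water_to_add_ml(vwc, target_vwc=0.15, substrate_volume_ml=6000), 1), "mL")
```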
AIR TOXICS ASSESSMENT REFINEMENT IN RAPCA'S JURISDICTION - DAYTON, OH AREA
RAPCA has received two grants to conduct this project. As part of the original project, RAPCA has improved and expanded their point source inventory by converting the following area sources to point sources: dry cleaners, gasoline throughput processes and halogenated solvent clea...
The changes of dominant lactic acid bacteria and their metabolites during corn stover ensiling.
Xu, Zhenshang; Zhang, Susu; Zhang, Rongling; Li, Shixu; Kong, Jian
2018-05-15
Monitoring the succession of bacterial populations during corn stover ensiling is helpful for improving silage quality. Fermentation characteristics were assessed and bacterial communities were described over the course of the ensiling process. The ensiled corn stover exhibited chemical traits associated with well-ensiled forages, namely a low pH value (3.92 ± 0.02) and high levels of lactic acid (66.75 ± 1.97 g kg⁻¹ dry matter), as well as moderate concentrations of acetic acid (19.69 ± 1.51 g kg⁻¹ dry matter) and small amounts of 1,2-propanediol (4.4 ± 0.11 g kg⁻¹ dry matter). In the early stages of the ensiling process, a significant increase and then reduction in the abundance of the species Lactococcus lactis, Leuconostoc pseudomesenteroides, Pediococcus pentosaceus and Weissella sp. was observed. The Lactobacillus plantarum (Lb. plantarum) group and Lb. brevis grew vigorously, and Lb. farciminis and Lb. parafarraginis gradually increased over the course of ensiling. High-throughput sequencing was successfully used to describe bacterial communities throughout the process of corn stover ensiling. Knowledge of the ecological succession of the dominant lactic acid bacteria could lead to improved ensiling practices and the selection of corn stover silage inoculants.
Rappaz, Benjamin; Cano, Elena; Colomb, Tristan; Kühn, Jonas; Depeursinge, Christian; Simanis, Viesturs; Magistretti, Pierre J; Marquet, Pierre
2009-01-01
Digital holographic microscopy (DHM) is an optical technique which provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
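The dry-mass quantities mentioned above follow from the measured phase via the standard phase-to-dry-mass (Barer) relation; a minimal sketch is given below, where the wavelength, specific refractive increment, pixel size and phase map are illustrative assumptions rather than values from this study.

```python
import numpy as np

# Phase-to-dry-mass conversion: dry mass surface density sigma = lambda * dphi / (2*pi*alpha),
# with alpha the specific refractive increment (~0.18-0.21 um^3/pg for cellular dry matter).

LAMBDA_UM = 0.683          # illumination wavelength (um), assumed
ALPHA_UM3_PER_PG = 0.19    # specific refractive increment, assumed

def dry_mass_pg(phase_map_rad, pixel_area_um2):
    """Total dry mass (pg) from a background-subtracted phase image."""
    sigma = LAMBDA_UM * phase_map_rad / (2 * np.pi * ALPHA_UM3_PER_PG)   # pg/um^2 per pixel
    return float(np.sum(sigma) * pixel_area_um2)

# Toy example: a uniform 0.5 rad phase shift over a 10 x 10 um region, 0.25 um^2 pixels.
phase = np.full((20, 20), 0.5)
print(f"dry mass = {dry_mass_pg(phase, 0.25):.1f} pg")
```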
Fuzzy Logic-based expert system for evaluating cake quality of freeze-dried formulations.
Trnka, Hjalte; Wu, Jian X; Van De Weert, Marco; Grohganz, Holger; Rantanen, Jukka
2013-12-01
Freeze-drying of peptide- and protein-based pharmaceuticals is an increasingly important field of research. The diverse nature of these compounds, limited understanding of excipient functionality, and difficult-to-analyze quality attributes, together with the increasing importance of the biosimilarity concept, complicate the development phase of safe and cost-effective drug products. To streamline the development phase and to make high-throughput formulation screening possible, efficient solutions for analyzing critical quality attributes such as cake quality with minimal material consumption are needed. The aim of this study was to develop a fuzzy logic system based on image analysis (IA) for analyzing cake quality. Freeze-dried samples with different visual quality attributes were prepared in well plates. Imaging solutions together with image-analytical routines were developed for extracting critical visual features such as the degree of cake collapse, glassiness, and color uniformity. On the basis of the IA outputs, a fuzzy logic system for analysis of these freeze-dried cakes was constructed. After this development phase, the system was tested with a new screening well plate. The developed fuzzy logic-based system was found to give quality scores comparable with visual evaluation, making high-throughput classification of cake quality possible.
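A minimal sketch of how image-analysis outputs might be combined into a cake-quality score with fuzzy rules is given below; the feature names, membership breakpoints and rule outputs are invented for illustration and are not those used in the study.

```python
# Map two hypothetical image-analysis features (fraction of cake collapse, colour
# non-uniformity) to a 0-10 quality score with triangular memberships and two rules.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cake_quality_score(collapse_frac, colour_rsd):
    low_collapse  = tri(collapse_frac, -0.2, 0.0, 0.3)
    high_collapse = tri(collapse_frac,  0.2, 1.0, 1.8)
    uniform       = tri(colour_rsd, -0.1, 0.0, 0.15)
    mottled       = tri(colour_rsd,  0.10, 0.5, 1.0)

    # Two Mamdani-style rules, defuzzified as a weighted average of rule outputs.
    rules = [
        (min(low_collapse, uniform), 9.0),    # intact, uniform cake -> high score
        (max(high_collapse, mottled), 2.0),   # collapsed or mottled -> reject-level score
    ]
    weight_sum = sum(w for w, _ in rules) or 1.0
    return sum(w * out for w, out in rules) / weight_sum

print(cake_quality_score(collapse_frac=0.05, colour_rsd=0.03))   # high score -> acceptable cake
```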
NASA Astrophysics Data System (ADS)
Al-Shroofy, Mohanad; Zhang, Qinglin; Xu, Jiagang; Chen, Tao; Kaur, Aman Preet; Cheng, Yang-Tse
2017-06-01
We report a solvent-free dry powder coating process for making LiNi1/3Mn1/3Co1/3O2 (NMC) positive electrodes in lithium-ion batteries. This process eliminates volatile organic compound emission and reduces thermal curing time from hours to minutes. A mixture of NMC, carbon black, and poly(vinylidene difluoride) was electrostatically sprayed onto an aluminum current collector, forming a uniformly distributed electrode with controllable thickness and porosity. Charge/discharge cycling of the dry-powder-coated electrodes in lithium-ion half cells yielded a discharge specific capacity of 155 mAh g⁻¹ and capacity retention of 80% for more than 300 cycles when the electrodes were tested between 3.0 and 4.3 V at a rate of C/5. The long-term cycling performance and durability of dry-powder-coated electrodes are similar to those made by the conventional wet slurry-based method. This solvent-free dry powder coating process is a potentially lower-cost, higher-throughput, and more environmentally friendly manufacturing process compared with the conventional wet slurry-based electrode manufacturing method.
Perera, Rushini S.; Ding, Xavier C.; Tully, Frank; Oliver, James; Bright, Nigel; Bell, David; Chiodini, Peter L.; Gonzalez, Iveth J.; Polley, Spencer D.
2017-01-01
Background Accurate and efficient detection of sub-microscopic malaria infections is crucial for enabling rapid treatment and interruption of transmission. Commercially available malaria LAMP kits have excellent diagnostic performance, though throughput is limited by the need to prepare samples individually. Here, we evaluate the clinical performance of a newly developed high throughput (HTP) sample processing system for use in conjunction with the Eiken malaria LAMP kit. Methods The HTP system utilised dried blood spots (DBS) and liquid whole blood (WB), with parallel sample processing of 94 samples per run. The system was evaluated using 699 samples of known infection status pre-determined by gold-standard nested PCR. Results The sensitivity and specificity of WB-HTP-LAMP were 98.6% (95% CI, 95.7–100) and 99.7% (95% CI, 99.2–100); the sensitivity of DBS-HTP-LAMP was 97.1% (95% CI, 93.1–100) and its specificity 100% against PCR. At parasite densities greater than or equal to 2 parasites/μL, WB and DBS HTP-LAMP showed 100% sensitivity and specificity against PCR. At densities less than 2 p/μL, WB-HTP-LAMP sensitivity was 88.9% (95% CI, 77.1–100) and specificity was 99.7% (95% CI, 99.2–100); the sensitivity and specificity of DBS-HTP-LAMP were 77.8% (95% CI, 54.3–99.5) and 100%, respectively. Conclusions The HTP-LAMP system is a highly sensitive diagnostic test, with the potential to allow large-scale population screening in malaria elimination campaigns. PMID:28166235
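Sensitivity and specificity figures like those above are computed from the two-by-two table against the reference PCR, with a confidence interval on each proportion; the sketch below uses a simple normal-approximation interval and invented counts, not the study's data.

```python
from math import sqrt

def proportion_ci(successes, total, z=1.96):
    """Proportion with a normal-approximation (Wald) 95% CI, clipped to [0, 1]."""
    p = successes / total
    half = z * sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts against the PCR reference (not the study's data):
tp, fn = 138, 2     # test-positive / test-negative among PCR-positive samples
tn, fp = 557, 2     # test-negative / test-positive among PCR-negative samples

sens, s_lo, s_hi = proportion_ci(tp, tp + fn)
spec, p_lo, p_hi = proportion_ci(tn, tn + fp)
print(f"sensitivity {sens:.1%} (95% CI {s_lo:.1%}-{s_hi:.1%})")
print(f"specificity {spec:.1%} (95% CI {p_lo:.1%}-{p_hi:.1%})")
```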
Quantitative assessment of anthrax vaccine immunogenicity using the dried blood spot matrix.
Schiffer, Jarad M; Maniatis, Panagiotis; Garza, Ilana; Steward-Clark, Evelene; Korman, Lawrence T; Pittman, Phillip R; Mei, Joanne V; Quinn, Conrad P
2013-03-01
The collection, processing and transportation to a testing laboratory of large numbers of clinical samples during an emergency response situation present significant cost and logistical issues. Blood and serum are common clinical samples for diagnosis of disease. Serum preparation requires significant on-site equipment and facilities for immediate processing and cold storage, and significant costs for cold-chain transport to testing facilities. The dried blood spot (DBS) matrix offers an alternative to serum for rapid and efficient sample collection with fewer on-site equipment requirements and considerably lower storage and transport costs. We have developed and validated assay methods for using DBS in the quantitative anti-protective antigen IgG enzyme-linked immunosorbent assay (ELISA), one of the primary assays for assessing immunogenicity of anthrax vaccine and for confirmatory diagnosis of Bacillus anthracis infection in humans. We have also developed and validated high-throughput data analysis software to facilitate data handling for large clinical trials and emergency response.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antoniadis, H.
Reported are the development and demonstration of a 17% efficient 25mm x 25mm crystalline Silicon solar cell and a 16% efficient 125mm x 125mm crystalline Silicon solar cell, both produced by ink-jet printing Silicon Ink on a thin crystalline Silicon wafer. To achieve these objectives, processing approaches were developed to print the Silicon Ink in a predetermined pattern to form a high-efficiency selective emitter, remove the solvents in the Silicon Ink and fuse the deposited particle Silicon films. Additionally, standard solar cell manufacturing equipment with slightly modified processes was used to complete the fabrication of the Silicon Ink high-efficiency solar cells. Also reported are the development and demonstration of an 18.5% efficient 125mm x 125mm monocrystalline Silicon cell and a 17% efficient 125mm x 125mm multicrystalline Silicon cell, utilizing high-throughput ink-jet and screen printing technologies. To achieve these objectives, Innovalight developed new high-throughput processing tools to print and fuse both p- and n-type particle Silicon Inks in a predetermined pattern applied either on the front or the back of the cell. Additionally, customized ink-jet and screen printing systems, coupled with a customized substrate handling solution, customized printing algorithms, and a customized ink drying process, in combination with a purchased turn-key line, were used to complete the high-efficiency solar cells. This development work delivered a process capable of producing 18.5% efficient crystalline Silicon solar cells in high volume and enabled Innovalight to commercialize its technology by the summer of 2010.
Tiersch, Terrence R.; Yang, Huiping; Hu, E.
2011-01-01
With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies have been done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs for high-throughput technology for model fishes, proposes approaches for technical development, and gives an overview of future directions of this approach. PMID:21440666
Bacterial communities and their association with the bio-drying of sewage sludge.
Cai, Lu; Chen, Tong-Bin; Gao, Ding; Yu, Jie
2016-03-01
Bio-drying is a technology that aims to remove water from a material using the microbial heat originating from organic matter degradation. However, the evolution of bacterial communities that are associated with the drying process has not been researched systematically. This study was performed to investigate the variations of bacterial communities and the relationships among bacterial communities, water evaporation, water generation, and organic matter degradation during the bio-drying of sewage sludge. High-throughput pyrosequencing was used to analyze the bacterial communities, while water evaporation and water generation were determined based on an in situ water vapor monitoring device. The values of water evaporation, water generation, and volatile solids degradation were 412.9 g kg⁻¹ sewage sludge bio-drying material (SSBM), 65.0 g kg⁻¹ SSBM, and 70.2 g kg⁻¹ SSBM, respectively. Rarefaction curves and diversity indices showed that bacterial diversity plummeted after the temperature of the bio-drying pile dramatically increased on d 2, which coincided with a remarkable increase of water evaporation on d 2. Bacterial diversity increased when the pile cooled. During the thermophilic phase, in which Acinetobacter and Bacillus were the dominant genera, the rates of water evaporation, water generation, and VS degradation peaked. These results implied that the elevated temperature reshaped the bacterial communities, which played a key role in water evaporation, and the high temperature also contributed to the effective elimination of pathogens.
De Bruyn, Florac; Zhang, Sophia Jiyuan; Pothakos, Vasileios; Torres, Julio; Lambot, Charles; Moroni, Alice V; Callanan, Michael; Sybesma, Wilbert; Weckx, Stefan; De Vuyst, Luc
2017-01-01
The postharvest treatment and processing of fresh coffee cherries can impact the quality of the unroasted green coffee beans. In the present case study, freshly harvested Arabica coffee cherries were processed through two different wet and dry methods to monitor differences in the microbial community structure and in substrate and metabolite profiles. The changes were followed throughout the postharvest processing chain, from harvest to drying, by implementing up-to-date techniques, encompassing multiple-step metagenomic DNA extraction, high-throughput sequencing, and multiphasic metabolite target analysis. During wet processing, a cohort of lactic acid bacteria (i.e., Leuconostoc, Lactococcus, and Lactobacillus) was the most commonly identified microbial group, along with enterobacteria and yeasts (Pichia and Starmerella). Several of the metabolites associated with lactic acid bacterial metabolism (e.g., lactic acid, acetic acid, and mannitol) produced in the mucilage were also found in the endosperm. During dry processing, acetic acid bacteria (i.e., Acetobacter and Gluconobacter) were most abundant, along with Pichia and non-Pichia (Candida, Starmerella, and Saccharomycopsis) yeasts. Accumulation of associated metabolites (e.g., gluconic acid and sugar alcohols) took place in the drying outer layers of the coffee cherries. Consequently, both wet and dry processing methods significantly influenced the microbial community structures and hence the composition of the final green coffee beans. This systematic approach to dissecting the coffee ecosystem contributes to a deeper understanding of coffee processing and might constitute a state-of-the-art framework for the further analysis and subsequent control of this complex biotechnological process. Coffee production is a long process, starting with the harvest of coffee cherries and the on-farm drying of their beans. In a later stage, the dried green coffee beans are roasted and ground in order to brew a cup of coffee. The on-farm, postharvest processing method applied can impact the quality of the green coffee beans. In the present case study, freshly harvested Arabica coffee cherries were processed through wet and dry processing in four distinct variations. The microorganisms present and the chemical profiles of the coffee beans were analyzed throughout the postharvest processing chain. The up-to-date techniques implemented facilitated the investigation of differences related to the method applied. For instance, different microbial groups were associated with wet and dry processing methods. Additionally, metabolites associated with the respective microorganisms accumulated on the final green coffee beans.
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
NASA Astrophysics Data System (ADS)
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high-data-throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data, such as spectral analysis and histograms, to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM Systems on Chip, but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results of performance and throughput testing of four different ARM Cortex Systems on Chip are presented.
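As a concrete illustration of the kind of low-level monitoring functions mentioned above (histograms and spectral analysis on streamed raw data), a minimal sketch follows; the data block, sample rate and channel structure are invented, and this is not the ATLAS software.

```python
import numpy as np

def monitor_block(samples, sample_rate_hz, n_bins=64):
    """Histogram and power-spectrum summary for one block of raw detector samples."""
    hist, edges = np.histogram(samples, bins=n_bins)
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / sample_rate_hz)
    return hist, edges, spectrum, freqs

# Stand-in for one channel's raw data block (invented, not detector data):
block = np.random.default_rng(0).normal(size=4096)
hist, edges, spec, freqs = monitor_block(block, sample_rate_hz=40e6)
print(hist.sum(), freqs[np.argmax(spec[1:]) + 1])   # sample count and dominant non-DC frequency (Hz)
```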
Multilayer ultra thick resist development for MEMS
NASA Astrophysics Data System (ADS)
Washio, Yasushi; Senzaki, Takahiro; Masuda, Yasuo; Saito, Koji; Obiya, Hiroyuki
2005-05-01
MEMS (micro-electro-mechanical systems) are produced through a process technology called micromachining. There are two distinct methods to manufacture a MEMS product: one is to form a permanent film through photolithography, and the other is to form a non-permanent resist film through photolithography followed by an etch or plating process. The three-dimensional ultra-fine processing technology is based on photolithography and is assembled by processes such as anodic bonding and post-lithography processes such as etching and plating. Currently, ORDYL PR-100 (dry film type) is used for the permanent resist process. TOK has also developed TMMR S2000 (liquid type) and TMMF S2000 (dry film type), and has developed a new process utilizing these resists. Electroforming by photolithography has been developed as one method for enabling high-resolution, high-aspect-ratio formation. In recent years, it has become possible to manufacture previously difficult multilayer structures through our material and equipment (M&E) development project. As the material for electroforming, chemically amplified resist was confirmed to be optimal on the basis of its reaction mechanism, as it is easily removed by the cleaning solution. Moreover, multiple plating formations were enabled with the resist through a new process. As for the equipment, TOK developed an applicator (capable of coating 500 μm or more) and a developer that achieve high throughput and quality. Detailed plating formations with differing paths, as well as air wiring, are realizable through M&E. From the above results, in contrast to metallic mold plating, the resist-based electroforming method enables the formation of high-resolution, high-aspect-ratio patterns at low cost. It is thought that applying this process opens up broad possibilities.
The planning and establishment of a sample preparation laboratory for drug discovery
Dufresne, Claude
2000-01-01
Nature has always been a productive source of new drugs. With the advent of high-throughput screening, it has now become possible to rapidly screen large sample collections. In addition to seeking greater diversity from natural product sources (micro-organisms, plants, etc.), fractionation of the crude extracts prior to screening is becoming a more important part of our efforts. As sample preparation protocols become more involved, automation can help to achieve and maintain a desired sample throughput. To address the needs of our screening program, two robotic systems were designed. The first system processes crude extracts all the way to 96-well plates, containing solutions suitable for screening in biological and biochemical assays. The system can dissolve crude extracts, fractionate them on solid-phase extraction cartridges, dry and weigh each fraction, re-dissolve them to a known concentration, and prepare mother plates. The second system replicates mother plates into a number of daughter plates. PMID:18924691
High-Throughput Industrial Coatings Research at The Dow Chemical Company.
Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T
2016-09-12
At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., prepare, process, and evaluate samples in parallel), and miniaturization (i.e., reduce sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, surface defects, among others have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.
Next-Generation Molecular Testing of Newborn Dried Blood Spots for Cystic Fibrosis.
Lefterova, Martina I; Shen, Peidong; Odegaard, Justin I; Fung, Eula; Chiang, Tsoyu; Peng, Gang; Davis, Ronald W; Wang, Wenyi; Kharrazi, Martin; Schrijver, Iris; Scharfe, Curt
2016-03-01
Newborn screening for cystic fibrosis enables early detection and management of this debilitating genetic disease. Implementing comprehensive CFTR analysis using Sanger sequencing as a component of confirmatory testing of all screen-positive newborns has remained impractical due to relatively lengthy turnaround times and high cost. Here, we describe CFseq, a highly sensitive, specific, rapid (<3 days), and cost-effective assay for comprehensive CFTR gene analysis from dried blood spots, the common newborn screening specimen. The unique design of CFseq integrates optimized dried blood spot sample processing, a novel multiplex amplification method from as little as 1 ng of genomic DNA, and multiplex next-generation sequencing of 96 samples in a single run to detect all relevant CFTR mutation types. Sequence data analysis utilizes publicly available software supplemented by an expert-curated compendium of >2000 CFTR variants. Validation studies across 190 dried blood spots demonstrated 100% sensitivity and a positive predictive value of 100% for single-nucleotide variants and insertions and deletions and complete concordance across the polymorphic poly-TG and consecutive poly-T tracts. Additionally, we accurately detected both a known exon 2,3 deletion and a previously undetected exon 22,23 deletion. CFseq is thus able to replace all existing CFTR molecular assays with a single robust, definitive assay at significant cost and time savings and could be adapted to high-throughput screening of other inherited conditions.
Fabrication of solution-processed InSnZnO/ZrO2 thin film transistors.
Hwang, Soo Min; Lee, Seung Muk; Choi, Jun Hyuk; Lim, Jun Hyung; Joo, Jinho
2013-11-01
We fabricated InSnZnO (ITZO) thin-film transistors (TFTs) with a high-permittivity (high-k) ZrO2 gate insulator using a solution process and explored the microstructure and electrical properties. ZrO2 and ITZO (In:Sn:Zn = 2:1:1) precursor solutions were deposited using consecutive spin-coating and drying steps on a highly doped p-type Si substrate, followed by annealing at 700 °C in ambient air. The ITZO/ZrO2 TFT device showed n-channel depletion-mode characteristics, and it possessed a high saturation mobility of approximately 9.8 cm² V⁻¹ s⁻¹, a small subthreshold voltage swing of approximately 2.3 V/decade, and a negative V_TH of approximately 1.5 V, but a relatively low on/off current ratio of approximately 10(-3). These results were thought to be due to the use of the high-k crystallized ZrO2 dielectric (k approximately 21.8) as the gate insulator, which could permit low-voltage operation of the solution-processed ITZO TFT devices for applications in high-throughput, low-cost, flexible and transparent electronics.
Electron irradiation of dry food products
NASA Astrophysics Data System (ADS)
Grünewald, Th.
Industrial food producers are increasingly interested in having the irradiation facility installed within the food processing chain. The throughput of the irradiator should be high and the residence time of the product in the facility should be short. These conditions can be met by electron irradiators. To clarify the irradiation conditions, spices taken from the industrial process, as well as food-grade salt, sugar, and gums as models of dry food products, were irradiated. With a radiation dose of 10 kGy, the microbial load can be reduced by 10⁴ microorganisms/g. The sensory properties of the spices were not changed in an atypical way. For food-grade salt and sugar, changes of colour were observed, which are due to lattice defects or the initiation of browning. The irradiation of several gums led only in some cases to an improvement of the thickening properties in applications below 50°C; in most cases the thickening effect was reduced. The products were packaged before irradiation, but it would also be possible to irradiate the products without packaging by moving the product through the irradiation field in a closed conveyor system.
Tolu, Julie; Gerber, Lorenz; Boily, Jean-François; Bindler, Richard
2015-06-23
Molecular-level chemical information about organic matter (OM) in sediments helps to establish the sources of OM and the prevalent degradation/diagenetic processes, both essential for understanding the cycling of carbon (C) and of the elements associated with OM (toxic trace metals and nutrients) in lake ecosystems. Ideally, analytical methods for characterizing OM should allow high sample throughput, consume small amounts of sample and yield relevant chemical information, which are essential for multidisciplinary, high-temporal-resolution and/or large-spatial-scale investigations. We have developed a high-throughput analytical method based on pyrolysis-gas chromatography/mass spectrometry and automated data processing to characterize sedimentary OM. Our method consumes 200 μg of freeze-dried and ground sediment sample. Pyrolysis was performed at 450°C, which was found to avoid degradation of specific biomarkers (e.g., lignin compounds, fresh carbohydrates/cellulose) compared to 650°C, which is in the range of temperatures commonly applied for environmental samples. The optimization was conducted using the top ten sediment samples of an annually resolved sediment record (containing 16-18% and 1.3-1.9% of total carbon and nitrogen, respectively). Several hundred pyrolytic compound peaks were detected, of which over 200 were identified, representing different classes of organic compounds (i.e., n-alkanes, n-alkenes, 2-ketones, carboxylic acids, carbohydrates, proteins, other N compounds, (methoxy)phenols, (poly)aromatics, chlorophyll and steroids/hopanoids). Technical reproducibility, measured as the relative standard deviation of the identified peaks in triplicate analyses, was 5.5±4.3%, with 90% of the RSD values within 10% and 98% within 15%. Finally, a multivariate calibration model was calculated between the pyrolytic degradation compounds and the sediment depth (i.e., sediment age), which is a function of degradation processes and changes in OM source type. This allowed validation of the Py-GC/MS dataset against fundamental processes involved in OM cycling in aquatic ecosystems.
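The two data-processing steps described above, per-peak reproducibility across triplicates and a multivariate calibration of peak areas against sediment depth, can be sketched as follows; the arrays are synthetic placeholders, and partial least squares is used as a generic stand-in for whatever multivariate model the authors applied.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic placeholders: 30 sediment samples x 200 identified pyrolysis peaks,
# with depth as a proxy for sediment age. Not data from the study.
rng = np.random.default_rng(1)
peak_areas = rng.lognormal(mean=10.0, sigma=1.0, size=(30, 200))
depth_cm = np.linspace(0.0, 30.0, 30)

def peak_rsd_percent(triplicates):
    """RSD (%) per peak from replicate injections, shape (n_replicates, n_peaks)."""
    return 100.0 * triplicates.std(axis=0, ddof=1) / triplicates.mean(axis=0)

print("median peak RSD (%):", round(float(np.median(peak_rsd_percent(peak_areas[:3]))), 1))

# Multivariate calibration between peak areas and depth:
pls = PLSRegression(n_components=3).fit(peak_areas, depth_cm)
pred = pls.predict(peak_areas).ravel()
r2 = 1.0 - np.sum((depth_cm - pred) ** 2) / np.sum((depth_cm - depth_cm.mean()) ** 2)
print("calibration R^2 (training data):", round(float(r2), 2))
```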
HTP-NLP: A New NLP System for High Throughput Phenotyping.
Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L
2017-01-01
Secondary use of clinical data for research requires a method to process the data rapidly so that researchers can quickly extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.
Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.
2016-01-01
Quantification of the specificity of RNA-binding proteins and RNA-processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high-throughput sequencing and internal competition kinetics to simultaneously monitor the processing rate constants of thousands of substrates by RNA-processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA-processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and these are the main sources of imprecision in the quantified results when otherwise optimized guidelines are followed. PMID:27296633
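At the heart of internal competition kinetics is the relation k_i/k_ref = ln(f_i)/ln(f_ref), where f is the absolute fraction of a given substrate remaining at the sampled time point. The sketch below applies that relation to sequencing-derived substrate abundances; the read counts and overall reaction extent are invented, and this is a generic illustration rather than the authors' exact pipeline.

```python
import numpy as np

def relative_rate_constants(frac_reads_t0, frac_reads_t, total_fraction_remaining, ref_index=0):
    """k_i / k_ref from read fractions of the residual substrate pool.

    frac_reads_t0, frac_reads_t : relative abundance of each substrate variant in
        the substrate pool before the reaction and at time t (from read counts).
    total_fraction_remaining : overall fraction of substrate left at time t, measured
        separately, needed to put the per-substrate fractions on an absolute scale.
    """
    f = frac_reads_t / frac_reads_t0 * total_fraction_remaining   # absolute fraction remaining
    return np.log(f) / np.log(f[ref_index])

# Invented example: four substrate variants, half of the total pool consumed.
counts_before = np.array([1000, 1000, 1000, 1000], dtype=float)
counts_after  = np.array([400, 650, 250, 820], dtype=float)   # faster substrates deplete more
k_rel = relative_rate_constants(counts_before / counts_before.sum(),
                                counts_after / counts_after.sum(),
                                total_fraction_remaining=0.5)
print(np.round(k_rel, 2))   # rate constants relative to the first (reference) variant
```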
Lehotay, Steven J; Han, Lijun; Sapozhnikova, Yelena
2016-01-01
This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography-tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. Cleanup efficiencies and breakthrough volumes using different mini-SPE sorbents were compared using avocado, salmon, pork loin, and kale as representative matrices. The optimum extract load volume was 300 µL for the 45 mg mini-cartridges containing 20/12/12/1 (w/w/w/w) anh. MgSO4/PSA (primary secondary amine)/C18/CarbonX sorbents used in the final method. In method validation to demonstrate high-throughput capabilities and performance, 230 spiked extracts of 10 different foods (apple, kiwi, carrot, kale, orange, black olive, wheat grain, dried basil, pork, and salmon) underwent automated mini-SPE cleanup and analysis over the course of 5 days. In all, 325 analyses for 54 pesticides and 43 environmental contaminants (3 analyzed together) were conducted using the 10 min LPGC-MS/MS method without changing the liner or retuning the instrument. With merely 1 mg equivalent of sample injected, limits of quantification of <5 ng g⁻¹ were achieved. With the use of internal standards, method validation results showed that 91 of the 94 analytes, including pairs, achieved satisfactory results (70-120% recovery and RSD ≤ 25%) in the 10 tested food matrices (n = 160). Matrix effects were typically less than ±20%, mainly due to the use of analyte protectants, and minimal human review of software data processing was needed due to summation-function integration of analyte peaks. This study demonstrated that the automated mini-SPE + LPGC-MS/MS method yielded accurate results in rugged, high-throughput operations with minimal labor and data review.
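The acceptance criterion behind "91 of the 94 analytes achieved satisfactory results" reduces to a simple per-analyte check of mean recovery and replicate RSD; the sketch below implements that check with invented replicate recoveries, not the study's data.

```python
import numpy as np

def analyte_passes(recoveries_pct, rec_range=(70.0, 120.0), max_rsd=25.0):
    """True if mean recovery falls within rec_range (%) and the replicate RSD <= max_rsd (%)."""
    r = np.asarray(recoveries_pct, dtype=float)
    mean_rec = r.mean()
    rsd = 100.0 * r.std(ddof=1) / mean_rec
    return rec_range[0] <= mean_rec <= rec_range[1] and rsd <= max_rsd

print(analyte_passes([95, 102, 88, 110, 97]))    # True: mean ~98%, RSD ~8%
print(analyte_passes([55, 140, 60, 130, 65]))    # False: mean in range but RSD far above 25%
```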
Sun, Wei; Xia, Chunyu; Xu, Meiying; Guo, Jun; Sun, Guoping
2017-01-01
Water quality ranks as the most vital criterion for rivers serving as drinking water sources, and it changes periodically over the seasons. Such fluctuation is believed to be associated with state shifts of the bacterial community within. To date, however, seasonality effects on bacterioplankton community patterns in large rivers serving as drinking water sources are still poorly understood. Here we investigated the intra-annual bacterial community structure in the Dongjiang River, a drinking water source of Hong Kong, using high-throughput pyrosequencing in concert with geochemical property measurements during dry and wet seasons. Our results showed that Proteobacteria, Actinobacteria, and Bacteroidetes were the dominant phyla of the bacterioplankton communities, which varied in composition and distribution from dry to wet seasons and exhibited profound seasonal changes. Actinobacteria, Bacteroidetes, and Cyanobacteria seemed to be more associated with seasonality, in that the relative abundances of Actinobacteria and Bacteroidetes were significantly higher in the dry season than in the wet season (p < 0.01), while the relative abundance of Cyanobacteria was about 10-fold higher in the wet season than in the dry season. Temperature and [Formula: see text]-N concentration represented key contributing factors to the observed seasonal variations. These findings help understand the roles of various bacterioplankton and their interactions with the biogeochemical processes in the river ecosystem.
Zhang, Guozhu; Xie, Changsheng; Zhang, Shunping; Zhao, Jianwei; Lei, Tao; Zeng, Dawen
2014-09-08
A combinatorial high-throughput temperature-programmed method to obtain the optimal operating temperature (OOT) of gas sensor materials is demonstrated here for the first time. A material library consisting of SnO2, ZnO, WO3, and In2O3 sensor films was fabricated by screen printing. Temperature-dependent conductivity curves were obtained by scanning this gas sensor library from 300 to 700 K in different atmospheres (dry air, formaldehyde, carbon monoxide, nitrogen dioxide, toluene and ammonia), giving the OOT of each sensor formulation as a function of the carrier and analyte gases. A comparative study of the temperature-programmed method and a conventional method showed good agreement in measured OOT.
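The OOT determination described above can be reduced to finding the temperature at which the sensor response peaks. The sketch below assumes an n-type oxide and a reducing analyte, with the response defined as a conductance ratio; the curves are synthetic and the response definition is an assumption, not taken from the paper.

```python
import numpy as np

def optimal_operating_temperature(T, g_air, g_gas):
    """Pick the OOT as the temperature of maximum sensor response.

    Assumes an n-type oxide and a reducing analyte, with the response
    defined as the conductance ratio G_gas / G_air; other conventions
    (e.g., resistance ratios for oxidizing gases) just change the ratio.
    """
    response = np.asarray(g_gas) / np.asarray(g_air)
    return T[int(np.argmax(response))], float(response.max())

T = np.linspace(300, 700, 81)                               # K, matching the scanned range
g_air = 1e-6 * np.exp(-3000.0 / T)                          # toy thermally activated baseline
g_gas = g_air * (1 + 50 * np.exp(-((T - 550) / 60) ** 2))   # toy response peaking near 550 K
print(optimal_operating_temperature(T, g_air, g_gas))
```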
Ormes, James D; Zhang, Dan; Chen, Alex M; Hou, Shirley; Krueger, Davida; Nelson, Todd; Templeton, Allen
2013-02-01
There has been growing interest in amorphous solid dispersions for bioavailability enhancement in drug discovery. Spray drying, as shown in this study, is well suited to producing prototype amorphous dispersions in the Candidate Selection stage, where drug supply is limited. This investigation mapped the processing window of a micro-spray dryer to achieve desired particle characteristics and optimize throughput and yield. Effects of processing variables on the properties of hypromellose acetate succinate were evaluated by a fractional factorial design of experiments. Parameters studied included solid loading, atomization, nozzle size, and spray rate; response variables included particle size, morphology and yield. Unlike most other commercial small-scale spray dryers, the ProCepT was capable of producing particles over a relatively wide range of mean particle sizes, ca. 2-35 µm, allowing material properties to be tailored to support various applications. In addition, an optimized throughput of 35 g/hour with a yield of 75-95% was achieved, which is sufficient to support studies from lead identification/lead optimization through early safety studies. A regression model was constructed to quantify the relationship between processing parameters and the response variables. The response surface curves provide a useful tool for designing processing conditions, leading to a reduction in development time and drug usage to support drug discovery.
Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.
Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen
2018-07-20
Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals that attracts considerable attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) with the cell-killing capacity of highly cytotoxic small-molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that must be investigated, high-throughput methods have so far seen little use in ADC development. In this work, an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented using a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by means of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established, including feedback to the process. For conjugate characterization, a high-throughput-compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate its efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development effort and material demands and of shortening development timelines for antibody-drug conjugates.
Moore, Priscilla A; Kery, Vladimir
2009-01-01
High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are the high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, the ammonium sulfate precipitation is much less labor intensive and time consuming than the ultrafiltration.
High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret
NASA Astrophysics Data System (ADS)
Zheng, Guokuo
In this work, a simple and efficient needleless high-throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of high-quality fine (nano-sized) fibers, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce scaled-up amounts of nanofibers of the same or better quality than those produced with a single-needle laboratory setup, under varying concentration, voltage, and working distance. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were determined to be a voltage of 24 kV and a collector distance of 15 cm. More dilute solutions resulted in smaller-diameter fibers. Comparing the morphologies of the MoO3 nanofibers produced by the traditional and the high-throughput setups showed that they were very similar. Moreover, the nanofiber production rate is nearly 10 times that of traditional needle electrospinning. Thus, the high-throughput process has the potential to become an industrial nanomanufacturing process, and the materials processed by it may be used in filtration devices, in tissue engineering, and as sensors.
Polystyrene negative resist for high-resolution electron beam lithography
2011-01-01
We studied the exposure behavior of low molecular weight polystyrene as a negative tone electron beam lithography (EBL) resist, with the goal of finding the ultimate achievable resolution. It demonstrated fairly well-defined patterning of a 20-nm period line array and a 15-nm period dot array, which are the densest patterns ever achieved using organic EBL resists. Such dense patterns can be achieved both at 20 and 5 keV beam energies using different developers. In addition to its ultra-high resolution capability, polystyrene is a simple and low-cost resist with easy process control and practically unlimited shelf life. It is also considerably more resistant to dry etching than PMMA. With a low sensitivity, it would find applications where negative resist is desired and throughput is not a major concern. PMID:21749679
Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines
NASA Astrophysics Data System (ADS)
Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.
2017-01-01
Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU 'circular memory' data buffers that enable ready introduction of arbitrary functions into the processing path for 'streams' of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
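The "black box processes in a directed graph" idea can be illustrated with a few lines of ordinary Python. The sketch below wires two stages together through queues as a stand-in for Bifrost's ring buffers; it is not the Bifrost API (whose block and pipeline classes differ), only the design pattern, and all stage functions and data are invented.

```python
import queue
import threading

class Block(threading.Thread):
    """A processing stage that reads items from one stream and writes to another."""
    def __init__(self, fn, inq, outq):
        super().__init__(daemon=True)
        self.fn, self.inq, self.outq = fn, inq, outq

    def run(self):
        while True:
            item = self.inq.get()
            if item is None:                      # end-of-stream sentinel
                if self.outq is not None:
                    self.outq.put(None)
                break
            result = self.fn(item)
            if self.outq is not None:
                self.outq.put(result)

# build a two-stage pipeline: mean-subtract, then threshold-detect
q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
stages = [
    Block(lambda x: [v - sum(x) / len(x) for v in x], q0, q1),
    Block(lambda x: [v for v in x if v > 1.0], q1, q2),
]
for s in stages:
    s.start()

for chunk in ([1.0, 2.0, 3.0], [0.5, 4.0, 0.5]):  # a fake data "stream"
    q0.put(chunk)
q0.put(None)

while (out := q2.get()) is not None:
    print(out)
```

Because each stage only touches its input and output queues, adding parallelism or inserting a new function into the path is a local change, which is the property the abstract highlights.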
Shimada, Tsutomu; Kelly, Joan; LaMarr, William A; van Vlies, Naomi; Yasuda, Eriko; Mason, Robert W.; Mackenzie, William; Kubaski, Francyne; Giugliani, Roberto; Chinen, Yasutsugu; Yamaguchi, Seiji; Suzuki, Yasuyuki; Orii, Kenji E.; Fukao, Toshiyuki; Orii, Tadao; Tomatsu, Shunji
2014-01-01
Mucopolysaccharidoses (MPS) are caused by deficiency of one of a group of specific lysosomal enzymes, resulting in excessive accumulation of glycosaminoglycans (GAGs). We previously developed GAG assay methods using liquid chromatography tandem mass spectrometry (LC-MS/MS); however, these require 4–5 min per sample for analysis. For the large numbers of samples in a screening program, a more rapid process is desirable. The automated high-throughput mass spectrometry (HT-MS/MS) system (RapidFire) integrates a solid-phase extraction robot to concentrate and desalt samples prior to direct introduction into the MS/MS without chromatographic separation, thereby allowing each sample to be processed within ten seconds (enabling screening of more than one million samples per year). The aim of this study was to develop a higher-throughput system to assay heparan sulfate (HS) using HT-MS/MS, and to compare its reproducibility, sensitivity and specificity with conventional LC-MS/MS. HS levels were measured in blood (plasma and serum) from control subjects and patients with MPS II, III, or IV and in dried blood spots (DBS) from newborn controls and patients with MPS I, II, or III. Results obtained from HT-MS/MS showed 1) that there was a strong correlation between levels of disaccharides derived from HS in blood calculated using conventional LC-MS/MS and those measured by HT-MS/MS, 2) that levels of HS in blood were significantly elevated in patients with MPS II and III, but not in MPS IVA, 3) that the level of HS in patients with a severe form of MPS II was higher than that in an attenuated form, 4) that reduction of blood HS level was observed in MPS II patients treated with enzyme replacement therapy or hematopoietic stem cell transplantation, and 5) that levels of HS in newborn DBS were elevated in patients with MPS I, II or III, compared to control newborns. In conclusion, HT-MS/MS provides much higher throughput than LC-MS/MS-based methods with similar sensitivity and specificity in an HS assay, indicating that HT-MS/MS may be feasible for diagnosis, monitoring, and newborn screening of MPS. PMID:25092413
Starch Applications for Delivery Systems
NASA Astrophysics Data System (ADS)
Li, Jason
2013-03-01
Starch is one of the most abundant and economical renewable biopolymers in nature. Starch molecules are high molecular weight polymers of D-glucose linked by α-(1,4) and α-(1,6) glycosidic bonds, forming linear (amylose) and branched (amylopectin) structures. Octenyl succinic anhydride modified starches (OSA-starch) are designed by carefully choosing a proper starch source, path and degree of modification. This enables emulsion and micro-encapsulation delivery systems for oil-based flavors, micronutrients, fragrances, and pharmaceutical actives. A large percentage of flavors in today's industry are encapsulated by spray drying due to its high throughput. However, spray-drying encapsulation faces constant challenges with retention of volatile compounds, oxidation of sensitive compounds, and manufacturing yield. Specialty OSA-starches were developed to suit the complex dynamics of spray drying and to provide high encapsulation efficiency and high microcapsule quality. The surface activity, low viscosity and film-forming capability of OSA starch contribute to high volatile retention and low oxidation of actives. OSA starches exhibit superior performance, especially in high-solids and high-oil-load encapsulations, compared with other hydrocolloids. The submission is based on research and development of Ingredion
Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H
1997-01-01
The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min., about 30 times faster than manual solvent extraction or single cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (Ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.
Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich
2006-01-01
The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be quite comparable with other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have speeded up our drug discovery process greatly.
In Vivo High-Content Evaluation of Three-Dimensional Scaffolds Biocompatibility
Oliveira, Mariana B.; Ribeiro, Maximiano P.; Miguel, Sónia P.; Neto, Ana I.; Coutinho, Paula; Correia, Ilídio J.
2014-01-01
When developing tissue engineering strategies, the inflammatory response caused by biomaterials is an unavoidable aspect to be taken into consideration, as it may be an early limiting step of tissue regeneration approaches. We demonstrate the application of flat and flexible films exhibiting patterned high-contrast wettability regions as implantable platforms for the high-content in vivo study of the inflammatory response caused by biomaterials. Screening biomaterials using high-throughput platforms is a powerful method to detect hit spots with promising properties and to exclude uninteresting conditions for targeted applications. High-content analysis of biomaterials has been mostly restricted to in vitro tests, where crucial information is lost, as the in vivo environment is highly complex. Conventional biomaterial implantation requires high numbers of animals, leading to ethical questions and costly experimentation. The inflammatory response to biomaterials has also been largely neglected in high-throughput studies. We designed an array of 36 combinations of biomaterials based on an initial library of four polysaccharides. Biomaterials were dispensed onto biomimetic superhydrophobic platforms with wettable regions and processed as freeze-dried three-dimensional scaffolds with a high degree of control over the array configuration. These chips were afterward implanted subcutaneously in Wistar rats. Lymphocyte recruitment and activated macrophages were studied on-chip by performing immunocytochemistry on the miniaturized biomaterials after 24 h and 7 days of implantation. Histological sections of the tissue surrounding the implants were also analyzed. Localized and independent inflammatory responses were detected. The integration of these data with control data showed that these chips are robust platforms for the rapid screening of early-stage in vivo responses to biomaterials. PMID:24568682
The microbiota of marketed processed edible insects as revealed by high-throughput sequencing.
Garofalo, Cristiana; Osimani, Andrea; Milanović, Vesna; Taccari, Manuela; Cardinali, Federica; Aquilanti, Lucia; Riolo, Paola; Ruschioni, Sara; Isidoro, Nunzio; Clementi, Francesca
2017-04-01
Entomophagy has been linked to nutritional, economic, social and ecological benefits. However, scientific studies on the potential safety risks of eating edible insects need to be carried out for legislators, markets and consumers. In this context, the microbiota of edible insects deserves to be investigated in depth. The aim of this study was to elucidate the microbial species occurring in some processed marketed edible insects, namely powdered small crickets, whole dried small crickets (Acheta domesticus), whole dried locusts (Locusta migratoria), and whole dried mealworm larvae (Tenebrio molitor), through culture-dependent (classical microbiological analyses) and culture-independent (pyrosequencing) methods. Great bacterial diversity and variation among insects were observed. Relatively low counts of total mesophilic aerobes, Enterobacteriaceae, lactic acid bacteria, Clostridium perfringens spores, yeasts and moulds were found in all of the studied insect batches. Furthermore, several gut-associated bacteria, some of which may act as opportunistic pathogens in humans, were detected through pyrosequencing. Food spoilage bacteria were also identified, as was Spiroplasma spp. in mealworm larvae, which has been related to neurodegenerative diseases in animals and humans. Although viable pathogens such as Salmonella spp. and Listeria monocytogenes were not detected, the presence of Listeria spp., Staphylococcus spp., Clostridium spp. and Bacillus spp. (at low abundance) was also found through pyrosequencing. The results of this study contribute to the elucidation of the microbiota associated with edible insects and encourage further studies aimed at evaluating the influence of rearing and processing conditions on that microbiota.
Turning tumor-promoting copper into an anti-cancer weapon via high-throughput chemistry.
Wang, F; Jiao, P; Qi, M; Frezza, M; Dou, Q P; Yan, B
2010-01-01
Copper is an essential element for multiple biological processes. Its concentration is elevated to very high levels in cancer tissues, where it promotes cancer development through processes such as angiogenesis. Organic chelators of copper can passively reduce cellular copper and thereby serve as inhibitors of angiogenesis. However, they can also actively attack cellular targets such as the proteasome, which plays a critical role in cancer development and survival. The discovery of such molecules initially relied on step-by-step synthesis followed by biological assays. Today, high-throughput chemistry and high-throughput screening have significantly expedited the discovery of copper-binding molecules that turn "cancer-promoting" copper into anti-cancer agents.
A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers
Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.
2016-01-01
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput, as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentration and drying, have been eliminated to aid the consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties representing hardwoods, softwoods, and grasses. PMID:27534715
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput, as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentration and drying, have been eliminated to aid the consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H
2016-02-01
Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (Rdark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of Rdark is essential for agronomic and ecological studies. However, the methods currently used to measure Rdark in plant tissues are typically low-throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry to determine the accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of Rdark in detached leaf and root tissues over many hours. Its high-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than the other, conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of Rdark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant Rdark through dissection and simultaneous measurement of above- and below-ground organs. Variation in absolute Rdark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided values of Rdark similar to those from the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of Rdark on multiple samples simultaneously, irrespective of plant or tissue type.
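The core computation behind such an O2-consumption readout is simple: fit the decline of the O2 signal over time and take the slope as the respiration rate. The sketch below does this for many wells at once; it assumes an approximately linear decline and uses invented data, and it is not the instrument's own software.

```python
import numpy as np

def o2_consumption_rates(time_min, o2_matrix):
    """Estimate respiration rates for many samples at once.

    `o2_matrix` is (n_samples, n_timepoints) of O2 readings from a
    fluorophore plate; the rate is the negative of the linear slope of
    O2 vs time for each well (assumes the decline stays roughly linear
    over the fitted window).
    """
    slopes = np.polyfit(time_min, np.asarray(o2_matrix).T, deg=1)[0]
    return -slopes          # positive values = O2 consumed per minute

t = np.arange(0, 60, 5.0)                       # minutes
wells = 250.0 - np.outer([0.8, 1.5, 0.3], t)    # three toy wells
wells += np.random.default_rng(0).normal(0, 0.5, wells.shape)
print(o2_consumption_rates(t, wells))           # -> roughly [0.8, 1.5, 0.3]
```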
Fard, Ehsan Mohseni; Bakhshi, Behnam; Farsi, Mohammad; Kakhki, Amin Mirshamsi; Nikpay, Nava; Ebrahimi, Mohammad Ali; Mardi, Mohsen; Salekdeh, Ghasem Hosseini
2017-10-24
MicroRNAs (miRNAs) are small endogenous regulatory RNAs that are involved in a variety of biological processes related to proliferation, development, and response to biotic and abiotic stresses. miRNA profiles of rice (Oryza sativa L. cv. IR64) leaves in a partial root zone drying (PRD) system were analysed using a high-throughput sequencing approach to identify miRNAs associated with drought signalling. The treatments performed in this study were as follows: well-watered ("wet" roots, WW), wherein both halves of the pot were watered daily; drought ("dry" roots, DD), wherein water was withheld from both halves of the pot; and well-watered/drought ("wet" and "dry" roots, WD), wherein one half of each pot was watered daily, as in WW, and water was withheld from the other half, as in DD. High-throughput sequencing enabled us to detect novel miRNAs and study the differential expression of known miRNAs. A total of 209 novel miRNAs were detected in this study. Differential miRNA profiling of the DD, WD and WW conditions showed differential expression of 159 miRNAs, among which 83, 44 and 32 miRNAs showed differential expression under both DD and WD conditions. Detection of putative targets of the differentially expressed miRNAs and investigation of their functions showed that most of these genes encode transcription factors involved in growth and development, leaf morphology, regulation of hormonal homeostasis, and stress response. The most important differences between the DD and WD conditions involved regulation of the levels of hormones such as auxin, cytokinin, abscisic acid, and jasmonic acid, as well as regulation of phosphorus homeostasis. Overall, the miRNAs differentially expressed under WD conditions were found to differ from those under DD conditions, with such differences playing a role in adaptation and in restoring the normal condition. The mechanisms regulating hormonal homeostasis and those involved in energy production and consumption were found to be the most important regulatory pathways distinguishing the DD and WD conditions.
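A first-pass way to nominate differentially expressed miRNAs from two sequencing libraries is to normalize read counts and rank log2 fold changes, as sketched below. The counts are invented, and a replicate-aware statistical test, as used in studies of this kind, would be needed for real calls.

```python
import numpy as np

def log2_fold_changes(counts_a, counts_b, pseudo=1.0):
    """Counts-per-million normalization followed by log2 ratios; only a
    first-pass screen, not a substitute for replicate-based statistics."""
    cpm_a = 1e6 * counts_a / counts_a.sum()
    cpm_b = 1e6 * counts_b / counts_b.sum()
    return np.log2((cpm_b + pseudo) / (cpm_a + pseudo))

ww = np.array([1500, 20, 300, 5000])      # toy read counts, well-watered library
dd = np.array([300, 180, 310, 5200])      # toy read counts, drought library
lfc = log2_fold_changes(ww, dd)
print([i for i, v in enumerate(lfc) if abs(v) >= 1.0])   # candidate indices
```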
Metallized compliant 3D microstructures for dry contact thermal conductance enhancement
NASA Astrophysics Data System (ADS)
Cui, Jin; Wang, Jicheng; Zhong, Yang; Pan, Liang; Weibel, Justin A.
2018-05-01
Microstructured three-dimensional (3D) materials can be engineered to enable new capabilities for various engineering applications; however, microfabrication of large 3D structures is typically expensive due to the conventional top-down fabrication scheme. Herein we demonstrated the use of projection micro-stereolithography and electrodeposition as cost-effective and high-throughput methods to fabricate compliant 3D microstructures as a thermal interface material (TIM). This novel TIM structure consists of an array of metallized micro-springs designed to enhance the dry contact thermal conductance between nonflat surfaces under low interface pressures (10s-100s kPa). Mechanical compliance and thermal resistance measurements confirm that this dry contact TIM can achieve conformal contact between mating surfaces with a nonflatness of approximately 5 µm under low interface pressures.
Break-up of droplets in a concentrated emulsion flowing through a narrow constriction
NASA Astrophysics Data System (ADS)
Kim, Minkyu; Rosenfeld, Liat; Tang, Sindy; Tang Lab Team
2014-11-01
Droplet microfluidics has enabled a wide range of high-throughput screening applications. Compared with alternatives such as robotic screening, droplet microfluidics offers roughly 1000 times higher throughput, which makes it one of the most promising platforms for ultrahigh-throughput screening applications. Few studies, however, have considered the throughput of the droplet interrogation process. In this research, we show that the probability of break-up increases with increasing flow rate, entrance angle to the constriction, and drop size. Since single drops do not break at the highest flow rate used in the system, break-up arises primarily from interactions between highly packed droplets close to each other. Moreover, the probabilistic nature of the break-up process arises from stochastic variations in the packing configuration. Our results can be used to calculate the maximum throughput of the serial interrogation process. For 40 pL drops, the highest throughput with less than 1% droplet break-up was measured to be approximately 7,000 drops per second. In addition, the results are useful for understanding the behavior of concentrated emulsions in applications such as mobility control in enhanced oil recovery.
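To put the quoted interrogation limit in volumetric terms, the arithmetic below converts 7,000 drops per second at 40 pL per drop into a flow rate; this is back-of-the-envelope only.

```python
# Volumetric throughput implied by the quoted numbers (illustrative arithmetic).
drops_per_s = 7_000
drop_volume_pL = 40
uL_per_s = drops_per_s * drop_volume_pL * 1e-6      # 1 pL = 1e-6 µL
print(round(uL_per_s, 2), "µL/s ≈", round(uL_per_s * 3600 / 1000, 2), "mL/h")
# -> 0.28 µL/s ≈ 1.01 mL/h of emulsion interrogated with <1 % break-up
```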
Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R
2018-02-01
High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.
High-throughput measurement of polymer film thickness using optical dyes
NASA Astrophysics Data System (ADS)
Grunlan, Jaime C.; Mehrabi, Ali R.; Ly, Tien
2005-01-01
Optical dyes were added to polymer solutions in an effort to create a technique for high-throughput screening of dry polymer film thickness. Arrays of polystyrene films, cast from a toluene solution and containing methyl red or solvent green, were used to demonstrate the feasibility of this technique. Measurements of the peak visible absorbance of each film were converted to thickness using the Beer-Lambert relationship. These absorbance-based thickness calculations agreed within 10% of the thickness measured using a micrometer for polystyrene films that were 10-50 µm thick. At these thicknesses it is believed that the absorbance-based values are actually more accurate. At least for this solvent-based system, thickness was shown to be accurately measured in a high-throughput manner that could potentially be applied to other equivalent systems. Similar water-based films made with poly(sodium 4-styrenesulfonate) dyed with malachite green oxalate or congo red did not show the same level of agreement with the micrometer measurements. Extensive phase separation between polymer and dye resulted in inflated absorbance values and calculated thicknesses that were often more than 25% greater than those measured with the micrometer. Only at thicknesses below 15 µm could reasonable accuracy be achieved for the water-based films.
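The absorbance-to-thickness conversion is a direct rearrangement of Beer-Lambert, l = A/(εc). The helper below shows the step with placeholder values for the dye's molar absorptivity and its concentration in the dried film; these constants are not taken from the paper.

```python
def film_thickness_um(absorbance, epsilon_L_per_mol_cm, dye_conc_mol_per_L):
    """Convert a peak absorbance to dry-film thickness via Beer-Lambert
    (A = eps * c * l). The dye concentration is that of the dried film;
    both calibration constants here are illustrative placeholders."""
    path_cm = absorbance / (epsilon_L_per_mol_cm * dye_conc_mol_per_L)
    return path_cm * 1e4          # cm -> µm

# toy numbers: eps = 4e4 L mol^-1 cm^-1, 0.01 mol/L dye in the dried film
print(round(film_thickness_um(0.85, 4e4, 0.01), 1), "µm")   # -> 21.2 µm
```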
Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S
2015-11-01
High-throughput individual diagnostic dose assessment is essential for the medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step, multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose quite accurately up to 4-6 weeks post-exposure, but throughput remains a major issue and automation is essential. Throughput is limited both in terms of sample preparation and in the analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with internationally harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOPs) for robotic instruments, avoid data transcription errors during processing, and automate the analysis of chromosome aberrations using an image analysis platform. The efforts described in this paper are intended to bridge current technological gaps and enhance the potential application of DCA for dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards the critically needed increase in throughput.
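Dose estimation in a DCA workflow typically inverts a linear-quadratic dose response fitted to calibration data, Y = C + alpha*D + beta*D^2. The sketch below solves that quadratic for the dose; the coefficients are illustrative placeholders rather than a laboratory calibration, and real biodosimetry software also propagates uncertainty.

```python
import math

def dose_from_dicentric_yield(yield_per_cell, c=0.001, alpha=0.03, beta=0.06):
    """Invert Y = c + alpha*D + beta*D^2 for the absorbed dose D (Gy).
    Coefficients are placeholders, not a validated calibration curve."""
    disc = alpha ** 2 + 4.0 * beta * (yield_per_cell - c)
    if disc < 0:
        raise ValueError("yield below background for these coefficients")
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# e.g., 45 dicentrics scored in 500 cells -> yield of 0.09 per cell
print(round(dose_from_dicentric_yield(45 / 500), 2), "Gy (illustrative)")
```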
Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K
2018-01-01
The time and cost benefits of miniaturized fermentation platforms can only be realized by employing complementary techniques that facilitate high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole-bioprocess integration.
Optimization and high-throughput screening of antimicrobial peptides.
Blondelle, Sylvie E; Lohner, Karl
2010-01-01
While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries, combined with easy and less expensive access to new technologies, has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds to overcome the rapid rise of drug-resistant microorganisms, where multiple-target or cell-based assays are often required, has pushed scientists to focus on high-throughput technologies. Based on their presence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope for discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability of large amounts of high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences, as well as the latest structural knowledge and optimization processes aimed at improving peptide selectivity.
Changes in Phenolic Acid Content in Maize during Food Product Processing.
Butts-Wilmsmeyer, Carrie J; Mumm, Rita H; Rausch, Kent D; Kandhola, Gurshagan; Yana, Nicole A; Happ, Mary M; Ostezan, Alexandra; Wasmund, Matthew; Bohn, Martin O
2018-04-04
The notion that many nutrients and beneficial phytochemicals in maize are lost due to food product processing is common, but this has not been studied in detail for the phenolic acids. Information regarding changes in phenolic acid content throughout processing is highly valuable because some phenolic acids are chemopreventive agents of aging-related diseases. It is unknown when and why these changes in phenolic acid content might occur during processing, whether some maize genotypes might be more resistant to processing induced changes in phenolic acid content than other genotypes, or if processing affects the bioavailability of phenolic acids in maize-based food products. For this study, a laboratory-scale processing protocol was developed and used to process whole maize kernels into toasted cornflakes. High-throughput microscale wet-lab analyses were applied to determine the concentrations of soluble and insoluble-bound phenolic acids in samples of grain, three intermediate processing stages, and toasted cornflakes obtained from 12 ex-PVP maize inbreds and seven hybrids. In the grain, insoluble-bound ferulic acid was the most common phenolic acid, followed by insoluble-bound p-coumaric acid and soluble cinnamic acid, a precursor to the phenolic acids. Notably, the ferulic acid content was approximately 1950 μg/g, more than ten-times the concentration of many fruits and vegetables. Processing reduced the content of the phenolic acids regardless of the genotype. Most changes occurred during dry milling due to the removal of the bran. The concentration of bioavailable soluble ferulic and p-coumaric acid increased negligibly due to thermal stresses. Therefore, the current dry milling based processing techniques used to manufacture many maize-based foods, including breakfast cereals, are not conducive for increasing the content of bioavailable phenolics in processed maize food products. This suggests that while maize is an excellent source of phenolics, alternative or complementary processing methods must be developed before this nutritional resource can be utilized.
ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)
US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...
Accounting For Uncertainty in The Application Of High Throughput Datasets
The use of high throughput screening (HTS) datasets will need to adequately account for uncertainties in the data generation process and propagate these uncertainties through to ultimate use. Uncertainty arises at multiple levels in the construction of predictors using in vitro ...
High-throughput screening, predictive modeling and computational embryology - Abstract
High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest was developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
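The image-to-digital-trait step can be as simple as segmenting plant pixels and counting them, as in the toy function below. It uses a basic excess-green threshold and is only an illustration of the kind of trait a phenomics pipeline extracts, not Image Harvest's own algorithm.

```python
import numpy as np

def projected_shoot_area(rgb_image, excess_green_threshold=20):
    """Toy version of one digital trait (projected shoot area in pixels).

    Segments 'plant' pixels with a simple excess-green index
    (2G - R - B > threshold); real trait extraction is more
    sophisticated, so treat this only as an illustration of turning an
    image into a numeric trait.
    """
    img = rgb_image.astype(float)
    exg = 2 * img[..., 1] - img[..., 0] - img[..., 2]
    return int(np.count_nonzero(exg > excess_green_threshold))

# toy 4x4 "image": one green quadrant on a grey background
img = np.full((4, 4, 3), 120, dtype=np.uint8)
img[:2, :2] = (40, 180, 50)
print(projected_shoot_area(img))   # -> 4
```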
Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz
2018-01-01
High-throughput technologies generate considerable amounts of data that often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting data produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in terms of input file handling provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open source software (https://github.com/pmadanecki/htdp).
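For readers who want a feel for the merge-then-filter operation described here, the pandas sketch below applies per-gene criteria from an "external" table to a flattened variant table. HTDP itself is a Java GUI application, so this only illustrates the class of task; the columns, gene names and values are invented.

```python
import pandas as pd
from io import StringIO

# stand-ins for two tab-delimited inputs: a flattened VCF-like table and an
# external criteria file (gene-specific minimum read depth)
variants = pd.read_csv(
    StringIO("chrom\tpos\tgene\tdepth\n1\t101\tNF1\t35\n1\t202\tNF1\t8\n2\t50\tTP53\t60\n"),
    sep="\t")
criteria = pd.read_csv(StringIO("gene\tmin_depth\nNF1\t20\nTP53\t30\n"), sep="\t")

# merge the criteria onto the data, then keep rows meeting their own threshold
merged = variants.merge(criteria, on="gene", how="inner")
kept = merged[merged["depth"] >= merged["min_depth"]]
print(kept.to_csv(sep="\t", index=False))
```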
Lyon, Elaine; Laver, Thomas; Yu, Ping; Jama, Mohamed; Young, Keith; Zoccoli, Michael; Marlowe, Natalia
2010-01-01
Population screening has been proposed for Fragile X syndrome to identify premutation carrier females and affected newborns. We developed a PCR-based assay capable of quickly detecting the presence or absence of an expanded FMR1 allele with high sensitivity and specificity. This assay combines a triplet repeat primed PCR with high-throughput automated capillary electrophoresis. We evaluated assay performance using archived samples sent for Fragile X diagnostic testing, representing a range of Fragile X CGG-repeat expansions. Two hundred five previously genotyped samples were tested with the new assay. Data were analyzed for the presence of a trinucleotide “ladder” extending beyond 55 repeats, which was set as the cut-off to identify expanded FMR1 alleles. We identified expanded FMR1 alleles in 132 samples (59 premutation, 71 full mutation, 2 mosaics) and normal FMR1 alleles in 73 samples. We found 100% concordance with previous results from PCR and Southern blot analyses. In addition, we show the feasibility of using this assay with DNA extracted from dried blood spots. Using a single PCR combined with high-throughput fragment analysis on an automated capillary electrophoresis instrument, we developed a rapid and reproducible PCR-based laboratory assay that meets many of the requirements for a first-tier test for population screening. PMID:20431035
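One way to automate the 55-repeat cut-off is to convert the largest fragment in the triplet repeat primed PCR ladder into a repeat count and compare it with the threshold, as sketched below. The flanking-sequence length is assay-specific, so the value used here is only a placeholder.

```python
def classify_fmr1(ladder_fragment_sizes_bp, flanking_bp=220):
    """Flag a sample as carrying an expanded FMR1 allele when the ladder
    extends beyond the 55-repeat cut-off used in the study.
    `flanking_bp` (primer/flanking sequence length) is a placeholder,
    not the value from the published assay."""
    max_repeats = max((size - flanking_bp) / 3.0 for size in ladder_fragment_sizes_bp)
    return ("expanded" if max_repeats > 55 else "normal"), round(max_repeats)

# toy capillary electrophoresis ladders (fragment sizes in bp)
print(classify_fmr1([250, 280, 310, 340]))        # -> ('normal', 40)
print(classify_fmr1([250, 400, 520, 700]))        # -> ('expanded', 160)
```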
The stabilisation of purified, reconstituted P-glycoprotein by freeze drying with disaccharides.
Heikal, Adam; Box, Karl; Rothnie, Alice; Storm, Janet; Callaghan, Richard; Allen, Marcus
2009-02-01
The drug efflux pump P-glycoprotein (P-gp) (ABCB1) confers multidrug resistance, a major cause of failure in the chemotherapy of tumours, exacerbated by a shortage of potent and selective inhibitors. A high throughput assay using purified P-gp to screen and characterise potential inhibitors would greatly accelerate their development. However, long-term stability of purified reconstituted ABCB1 can only be reliably achieved with storage at -80 degrees C. For example, at 20 degrees C, the activity of ABCB1 was abrogated with a half-life of <1 day. The aim of this investigation was to stabilise purified, reconstituted ABCB1 to enable storage at higher temperatures and thereby enable design of a high throughput assay system. The ABCB1 purification procedure was optimised to allow successful freeze drying by substitution of glycerol with the disaccharides trehalose or maltose. Addition of disaccharides resulted in ATPase activity being retained immediately following lyophilisation with no significant difference between the two disaccharides. However, during storage trehalose preserved ATPase activity for several months regardless of the temperature (e.g. 60% retention at 150 days), whereas ATPase activity in maltose purified P-gp was affected by both storage time and temperature. The data provide an effective mechanism for the production of resilient purified, reconstituted ABCB1.
P-TRAP: a Panicle TRAit Phenotyping tool.
A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza
2013-08-29
In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operator, expert verification and well-known academic methods.
Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca
2015-04-01
According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of highly toxic and widespread mycotoxins, aflatoxins (AFs) and ochratoxin A (OTA), in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8%, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.
Noyes, Aaron; Huffman, Ben; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Sunasara, Khurram; Mukhopadhyay, Tarit
2015-08-01
The biotech industry is under increasing pressure to decrease both time to market and development costs. Simultaneously, regulators are expecting increased process understanding. High throughput process development (HTPD) employs small volumes, parallel processing, and high throughput analytics to reduce development costs and speed the development of novel therapeutics. As such, HTPD is increasingly viewed as integral to improving developmental productivity and deepening process understanding. Particle conditioning steps such as precipitation and flocculation may be used to aid the recovery and purification of biological products. In this first part of two articles, we describe an ultra scale-down (USD) system for high throughput particle conditioning (HTPC) composed of off-the-shelf components. The apparatus comprises a temperature-controlled microplate with magnetically driven stirrers, integrated with a Tecan liquid handling robot. With this system, 96 individual reaction conditions can be evaluated in parallel, including downstream centrifugal clarification. A comprehensive suite of high throughput analytics enables measurement of product titer, product quality, impurity clearance, clarification efficiency, and particle characterization. HTPC at the 1 mL scale was evaluated with fermentation broth containing a vaccine polysaccharide. The response profile was compared with the pilot-scale performance of a non-geometrically similar, 3 L reactor. An engineering characterization of the reactors and scale-up context examines theoretical considerations for comparing this USD system with larger scale stirred reactors. In the second paper, we will explore application of this system to industrially relevant vaccines and test different scale-up heuristics. © 2015 Wiley Periodicals, Inc.
Xie, Chen; Tang, Xiaofeng; Berlinghof, Marvin; Langner, Stefan; Chen, Shi; Späth, Andreas; Li, Ning; Fink, Rainer H; Unruh, Tobias; Brabec, Christoph J
2018-06-27
Development of high-quality organic nanoparticle inks is a significant scientific challenge for the industrial production of solution-processed organic photovoltaics (OPVs) with eco-friendly processing methods. In this work, we demonstrate a novel, robot-based, high-throughput procedure performing automatic poly(3-hexylthiophene-2,5-diyl) and indene-C60 bisadduct nanoparticle ink synthesis in nontoxic alcohols. A novel methodology to prepare particle dispersions for fully functional OPVs by manipulating the particle size and solvent system was studied in detail. The ethanol dispersion with a particle diameter of around 80-100 nm exhibits reduced degradation, yielding a power conversion efficiency of 4.52%, which is the highest performance reported so far for water/alcohol-processed OPV devices. By successfully deploying the high-throughput robot-based approach for organic nanoparticle ink preparation, we believe that the findings demonstrated in this work will trigger more research interest and effort on eco-friendly industrial production of OPVs.
Spitzer, James D; Hupert, Nathaniel; Duckart, Jonathan; Xiong, Wei
2007-01-01
Community-based mass prophylaxis is a core public health operational competency, but staffing needs may overwhelm the local trained health workforce. Just-in-time (JIT) training of emergency staff and computer modeling of workforce requirements represent two complementary approaches to address this logistical problem. Multnomah County, Oregon, conducted a high-throughput point of dispensing (POD) exercise to test JIT training and computer modeling to validate POD staffing estimates. The POD had 84% non-health-care worker staff and processed 500 patients per hour. Post-exercise modeling replicated observed staff utilization levels and queue formation, including development and amelioration of a large medical evaluation queue caused by lengthy processing times and understaffing in the first half-hour of the exercise. The exercise confirmed the feasibility of using JIT training for high-throughput antibiotic dispensing clinics staffed largely by nonmedical professionals. Patient processing times varied over the course of the exercise, with important implications for both staff reallocation and future POD modeling efforts. Overall underutilization of staff revealed the opportunity for greater efficiencies and even higher future throughputs.
Controlling high-throughput manufacturing at the nano-scale
NASA Astrophysics Data System (ADS)
Cooper, Khershed P.
2013-09-01
Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications: materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12 in. wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes, guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.
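To make the closed-loop idea concrete (an in-situ measurement feeding back into a process actuator), the toy sketch below runs a proportional controller against an invented first-order process response; the setpoint, gain and process model are placeholders rather than parameters of any real nanoimprint or roll-to-roll tool.

```python
# Toy closed-loop control of a monitored process output (arbitrary units).
# The first-order process response, gain and setpoint are illustrative only.
def run_control_loop(setpoint=100.0, steps=200, kp=0.4):
    """Proportional control driving an in-situ measurement toward a setpoint."""
    measurement = 0.0   # e.g., imprint depth reported by in-situ metrology
    actuator = 0.0      # e.g., imprint pressure command
    for _ in range(steps):
        error = setpoint - measurement
        actuator += kp * error                         # proportional correction
        measurement += 0.2 * (actuator - measurement)  # toy process response
    return measurement

if __name__ == "__main__":
    print(f"final reading: {run_control_loop():.2f} (target 100.00)")
```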
Zakaria, Rosita; Allen, Katrina J; Koplin, Jennifer J; Roche, Peter; Greaves, Ronda F
2016-12-01
Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential of dried blood spot based mass spectrometric applications. Analysts, on the other hand, face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn screening applications of dried blood spot quantification by mass spectrometry. To address these aims we performed a keyword search of the PubMed and MEDLINE online databases in conjunction with individual manual searches to gather information. Keywords for the initial search included "blood spot" and "mass spectrometry", while excluding "newborn" and "neonate". In addition, searches were restricted to English-language and human-specific studies. There was no time period limit applied. As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: 1) clinical applications; and 2) analytical considerations across the total testing process, namely pre-analytical, analytical and post-analytical considerations. DBS analysis using MS applications is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement, and further bridging experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for the dried blood spot sample matrix is required.
Use of High-Throughput Testing and Approaches for Evaluating Chemical Risk-Relevance to Humans
ToxCast is profiling the bioactivity of thousands of chemicals based on high-throughput screening (HTS) and computational models that integrate knowledge of biological systems and in vivo toxicities. Many of these assays probe signaling pathways and cellular processes critical to...
High-throughput sequence alignment using Graphics Processing Units
Schatz, Michael C; Trapnell, Cole; Delcher, Arthur L; Varshney, Amitabh
2007-01-01
Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from NVIDIA to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high-end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low-cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU. PMID:18070356
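To illustrate the exact-matching step only: MUMmerGPU aligns reads against a reference stored as a suffix tree and traversed on the GPU, whereas the toy CPU sketch below substitutes a simple k-mer hash index for the suffix tree. It is an assumption-laden illustration of exact seed matching, not the MUMmerGPU algorithm.

```python
# Toy CPU illustration of exact seed matching of reads against a reference.
# A k-mer hash index stands in for the suffix tree that MUMmerGPU traverses
# on the GPU; no CUDA is involved here.
from collections import defaultdict

def build_kmer_index(reference, k):
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def exact_seed_matches(read, index, k):
    """Return (read_offset, reference_offset) pairs of exact k-mer seeds."""
    hits = []
    for j in range(len(read) - k + 1):
        for ref_pos in index.get(read[j:j + k], ()):
            hits.append((j, ref_pos))
    return hits

reference = "ACGTACGTTGCACGTACGGATCCAGT"
index = build_kmer_index(reference, k=5)
print(exact_seed_matches("TTGCACGTA", index, k=5))
```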
Assessment of advanced coal gasification processes
NASA Technical Reports Server (NTRS)
Mccarthy, J.; Ferrall, J.; Charng, T.; Houseman, J.
1981-01-01
A technical assessment of the following advanced coal gasification processes is presented: the high-throughput gasification (HTG) process; the single-stage high-mass-flux (HMF) process; the CS/R hydrogasification process; and the catalytic coal gasification (CCG) process. Each process is evaluated for its potential to produce synthetic natural gas from a bituminous coal. Key similarities, differences, strengths, weaknesses, and potential improvements to each process are identified. The HTG and the HMF gasifiers share similarities with respect to: short residence time (SRT), high throughput rate, slagging, and syngas as the initial raw product gas. The CS/R hydrogasifier is also SRT, but is nonslagging and produces a raw gas high in methane content. The CCG gasifier is a long-residence-time, catalytic, fluid-bed reactor producing all of the raw product methane in the gasifier.
High throughput light absorber discovery, Part 1: An algorithm for automated Tauc analysis
Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.
2016-09-23
High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
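For readers unfamiliar with the construction, a minimal sketch of a textbook Tauc analysis is shown below: plot (αhν)^(1/r) against photon energy, fit the steepest linear region, and take the energy-axis intercept as the band-gap estimate. The synthetic spectrum, window size and direct-gap exponent are illustrative choices; the published algorithm's segmentation heuristics are more elaborate.

```python
# Minimal Tauc-analysis sketch: estimate a band gap by fitting the steepest
# linear region of (alpha*h*nu)^(1/r) versus photon energy and extrapolating
# to the energy axis. r = 1/2 for a direct allowed transition, r = 2 for an
# indirect one. The spectrum below is synthetic, with a 2.0 eV direct gap.
import numpy as np

def tauc_band_gap(energy_ev, alpha, r=0.5, window=15):
    y = (alpha * energy_ev) ** (1.0 / r)
    best = None
    for i in range(len(energy_ev) - window):
        slope, intercept = np.polyfit(energy_ev[i:i + window], y[i:i + window], 1)
        if slope > 0 and (best is None or slope > best[0]):
            best = (slope, intercept)
    slope, intercept = best
    return -intercept / slope            # energy-axis intercept = gap estimate

e = np.linspace(1.5, 3.0, 200)                          # photon energy, eV
alpha = np.clip(e - 2.0, 0.0, None) ** 0.5 / e + 1e-6   # synthetic absorption coefficient
print(f"estimated band gap: {tauc_band_gap(e, alpha, r=0.5):.2f} eV")
```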
Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli
2010-01-01
The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as the scale-down model for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale-down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(and 6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO2, HCO3-, and the buffer system, i.e., HEPES, we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the required %CO2 from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work shed light on mini-bioreactor scale-down model construction and paved the way for cell culture process development to improve productivity or product quality using high throughput systems. Copyright 2009 American Institute of Chemical Engineers
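The sketch below illustrates only the carbonic-acid/bicarbonate part of such a model via the Henderson-Hasselbalch relation; the pKa and CO2 solubility are typical 37 °C literature values, not parameters from the cited model, and HEPES buffering and bicarbonate additions are ignored.

```python
# Illustrative Henderson-Hasselbalch calculation relating incubator %CO2,
# bicarbonate concentration and medium pH. Constants (pKa ~ 6.1, CO2
# solubility ~ 0.0307 mmol/L/mmHg at 37 C) are typical literature values,
# not parameters taken from the cited model; HEPES buffering is ignored.
import math

PKA_CARBONIC = 6.1
CO2_SOLUBILITY = 0.0307   # mmol L-1 mmHg-1
ATM_MMHG = 760.0

def medium_ph(pct_co2, bicarbonate_mm):
    """pH of a bicarbonate-buffered medium at a given incubator %CO2."""
    p_co2 = (pct_co2 / 100.0) * ATM_MMHG        # CO2 partial pressure, mmHg
    dissolved_co2 = CO2_SOLUBILITY * p_co2      # mmol/L
    return PKA_CARBONIC + math.log10(bicarbonate_mm / dissolved_co2)

def required_pct_co2(target_ph, bicarbonate_mm):
    """%CO2 needed to hold a target pH at a given bicarbonate concentration."""
    dissolved_co2 = bicarbonate_mm / (10 ** (target_ph - PKA_CARBONIC))
    return 100.0 * dissolved_co2 / (CO2_SOLUBILITY * ATM_MMHG)

print(f"pH at 5% CO2, 24 mM HCO3-: {medium_ph(5.0, 24.0):.2f}")
print(f"%CO2 for pH 7.0 at 24 mM HCO3-: {required_pct_co2(7.0, 24.0):.1f}%")
```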
Bergander, Tryggve; Nilsson-Välimaa, Kristina; Oberg, Katarina; Lacki, Karol M
2008-01-01
Steadily increasing demand for more efficient and more affordable biomolecule-based therapies puts a significant burden on biopharma companies to reduce the cost of R&D activities associated with the introduction of a new drug to the market. Reducing the time required to develop a purification process would be one option to address the high cost issue. The reduction in time can be accomplished if more efficient methods/tools are available for process development work, including high-throughput techniques. This paper addresses the transition from traditional column-based process development to a modern high-throughput approach utilizing microtiter filter plates filled with a well-defined volume of chromatography resin. The approach is based on implementing the well-known batch uptake principle in microtiter plate geometry. Two variants of the proposed approach, allowing for either qualitative or quantitative estimation of dynamic binding capacity as a function of residence time, are described. Examples of quantitative estimation of dynamic binding capacities of human polyclonal IgG on MabSelect SuRe and of qualitative estimation of dynamic binding capacity of amyloglucosidase on a prototype of Capto DEAE weak ion exchanger are given. The proposed high-throughput method for determination of dynamic binding capacity significantly reduces time and sample consumption as compared to a traditional method utilizing packed chromatography columns, without sacrificing the accuracy of the data obtained.
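The batch-uptake principle reduces, per well, to a simple mass balance: protein lost from the supernatant is taken to be bound to the known resin volume. A minimal sketch with invented volumes and concentrations:

```python
# Batch-uptake mass balance for one well: protein removed from the liquid
# phase is attributed to the known resin volume. Numbers are illustrative.
def bound_capacity(c0_mg_ml, c_eq_mg_ml, liquid_ul, resin_ul):
    """Protein bound per mL of resin (mg/mL) from a single well."""
    bound_mg = (c0_mg_ml - c_eq_mg_ml) * liquid_ul / 1000.0
    return bound_mg / (resin_ul / 1000.0)

# Example well: 200 uL of 2.0 mg/mL feed over 20 uL of resin, with
# 0.4 mg/mL left in the supernatant after incubation.
print(f"{bound_capacity(2.0, 0.4, 200.0, 20.0):.1f} mg bound per mL resin")
```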
Wei, Sean T S; Lacap-Bugler, Donnabella C; Lau, Maggie C Y; Caruso, Tancredi; Rao, Subramanya; de Los Rios, Asunción; Archer, Stephen K; Chiu, Jill M Y; Higgins, Colleen; Van Nostrand, Joy D; Zhou, Jizhong; Hopkins, David W; Pointing, Stephen B
2016-01-01
The McMurdo Dry Valleys of Antarctica are an extreme polar desert. Mineral soils support subsurface microbial communities and translucent rocks support development of hypolithic communities on ventral surfaces in soil contact. Despite significant research attention, relatively little is known about taxonomic and functional diversity or their inter-relationships. Here we report a combined diversity and functional interrogation for soil and hypoliths of the Miers Valley in the McMurdo Dry Valleys of Antarctica. The study employed 16S rRNA fingerprinting and high throughput sequencing combined with the GeoChip functional microarray. The soil community was revealed as a highly diverse reservoir of bacterial diversity dominated by actinobacteria. Hypolithic communities were less diverse and dominated by cyanobacteria. Major differences in putative functionality were that soil communities displayed greater diversity in stress tolerance and recalcitrant substrate utilization pathways, whilst hypolithic communities supported greater diversity of nutrient limitation adaptation pathways. A relatively high level of functional redundancy in both soil and hypoliths may indicate adaptation of these communities to fluctuating environmental conditions.
Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen
2015-10-01
Upstream processes are rather complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimal system is not necessarily the one with the highest product titers, but the one resulting in an overall superior process performance in up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capture were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.
Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes
2012-01-01
Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution, and high-throughput image processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
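The boundary-determination step can be illustrated with a short sketch: smooth a one-dimensional expression profile along the antero-posterior axis with a spline and report where it crosses half of its maximum. The synthetic profile, the smoothing factor and the half-maximum criterion are illustrative assumptions, not the exact procedure of the published pipeline.

```python
# Sketch of the boundary-extraction step: smooth a 1-D expression profile
# along the antero-posterior (AP) axis with a spline and report where it
# crosses half of its maximum. Profile, smoothing and criterion are
# illustrative, not the published pipeline's exact settings.
import numpy as np
from scipy.interpolate import UnivariateSpline

def boundary_position(ap_percent, intensity, smoothing=0.5):
    spline = UnivariateSpline(ap_percent, intensity, s=smoothing)
    x = np.linspace(ap_percent.min(), ap_percent.max(), 2000)
    y = spline(x)
    below = np.where(y < 0.5 * y.max())[0]
    return x[below[0]] if below.size else None   # first half-maximum crossing

ap = np.linspace(0, 100, 101)                    # % egg length
rng = np.random.default_rng(0)
profile = 1.0 / (1.0 + np.exp((ap - 40.0) / 3.0)) + 0.02 * rng.random(101)
print(f"posterior boundary near {boundary_position(ap, profile):.1f}% AP")
```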
Image Harvest: an open-source platform for high-throughput plant image processing and analysis.
Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal
2016-05-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
High-throughput strategies for the discovery and engineering of enzymes for biocatalysis.
Jacques, Philippe; Béchet, Max; Bigan, Muriel; Caly, Delphine; Chataigné, Gabrielle; Coutte, François; Flahaut, Christophe; Heuson, Egon; Leclère, Valérie; Lecouturier, Didier; Phalip, Vincent; Ravallec, Rozenn; Dhulster, Pascal; Froidevaux, Rénato
2017-02-01
Innovations in novel enzyme discovery impact a wide range of industries for which biocatalysis and biotransformations represent a great challenge, e.g., the food, polymer and chemical industries. Key tools and technologies, such as bioinformatics tools to guide mutant library design, molecular biology tools to create mutant libraries, microfluidics/microplates, parallel mini-scale bioreactors and mass spectrometry technologies to create high-throughput screening methods, and experimental design tools for screening and optimization, make it possible to advance the discovery, development and implementation of enzymes and whole cells in (bio)processes. These technological innovations are also accompanied by the development and implementation of clean and sustainable integrated processes to meet the growing needs of the chemical, pharmaceutical, environmental and biorefinery industries. This review gives an overview of the benefits of the high-throughput screening approach, from the discovery and engineering of biocatalysts to cell culture for optimizing their production in integrated processes and their extraction/purification.
Future technologies for monitoring HIV drug resistance and cure.
Parikh, Urvi M; McCormick, Kevin; van Zyl, Gert; Mellors, John W
2017-03-01
Sensitive, scalable and affordable assays are critically needed for monitoring the success of interventions for preventing, treating and attempting to cure HIV infection. This review evaluates current and emerging technologies that are applicable for both surveillance of HIV drug resistance (HIVDR) and characterization of HIV reservoirs that persist despite antiretroviral therapy and are obstacles to curing HIV infection. Next-generation sequencing (NGS) has the potential to be adapted into high-throughput, cost-efficient approaches for HIVDR surveillance and monitoring during continued scale-up of antiretroviral therapy and rollout of preexposure prophylaxis. Similarly, improvements in PCR and NGS are resulting in higher throughput single genome sequencing to detect intact proviruses and to characterize HIV integration sites and clonal expansions of infected cells. Current population genotyping methods for resistance monitoring are high cost and low throughput. NGS, combined with simpler sample collection and storage matrices (e.g. dried blood spots), has considerable potential to broaden global surveillance and patient monitoring for HIVDR. Recent adaptions of NGS to identify integration sites of HIV in the human genome and to characterize the integrated HIV proviruses are likely to facilitate investigations of the impact of experimental 'curative' interventions on HIV reservoirs.
Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F
2010-06-01
The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits is sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.
Accelerating the design of solar thermal fuel materials through high throughput simulations.
Liu, Yun; Grossman, Jeffrey C
2014-12-10
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
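Only the final bookkeeping step is sketched below: converting a computed isomerization energy difference into a gravimetric energy density. The energies and the molar mass are placeholders, not results from the study, which obtains such energies from ab initio calculations.

```python
# Bookkeeping step of an STF screen: convert an isomerization energy
# difference into a gravimetric energy density. Energies and molar mass
# are placeholders, not results from the cited study.
AVOGADRO = 6.02214076e23
EV_TO_J = 1.602176634e-19

def energy_density_wh_per_kg(e_metastable_ev, e_ground_ev, molar_mass_g_mol):
    """Stored energy per unit mass for one isomerization event per molecule."""
    delta_j_per_mol = (e_metastable_ev - e_ground_ev) * EV_TO_J * AVOGADRO
    return delta_j_per_mol / (molar_mass_g_mol / 1000.0) / 3600.0   # Wh/kg

# Placeholder: a 1.0 eV isomerization enthalpy at azobenzene's molar mass
# (182.2 g/mol) corresponds to roughly 150 Wh/kg.
print(f"{energy_density_wh_per_kg(1.0, 0.0, 182.2):.0f} Wh/kg")
```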
Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela
2014-09-25
Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidinoacetic acid, orotic acid, uracil, creatinine and their respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy varied by <15%. Stability during storage at different temperatures was confirmed for three weeks. The limits of detection and quantification for each biomarker varied from 0.3 to 6.3 μmol/l and from 1.0 to 20.9 μmol/l, respectively. Analyses of urine specimens from affected patients revealed abnormal results. Targeted biomarkers in urine were detected in the first weeks of life. This rapid, simple and robust liquid chromatography/tandem mass spectrometry methodology is an efficient tool applicable to urine screening for inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen
2015-04-15
High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and sample, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well dimension to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, as sample consumption scales with resin volume and throughput scales with the number of experiments per microplate, these processes are limited in the cost and time they can save. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and to resin volumes as small as 0.1 μL, is introduced. A HTS batch isotherm process is described, utilizing this new method in combination with optical sample volume quantification for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
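A compact sketch of the bootstrap idea applied to isotherm fitting is given below, assuming a Langmuir model q = qmax*c/(Kd + c), synthetic data and 1000 resamples; the published workflow additionally propagates volumetric measurement errors by Monte Carlo simulation.

```python
# Bootstrap confidence bounds for Langmuir isotherm parameters fitted to
# synthetic batch-uptake data. Model, data and resampling settings are
# illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, qmax, kd):
    return qmax * c / (kd + c)

rng = np.random.default_rng(0)
c = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 4.0])      # liquid-phase conc., mg/mL
q_obs = langmuir(c, 60.0, 0.3) + rng.normal(0.0, 1.5, c.size)

boot = []
for _ in range(1000):
    idx = rng.integers(0, c.size, c.size)                # resample wells with replacement
    try:
        popt, _ = curve_fit(langmuir, c[idx], q_obs[idx], p0=[50.0, 0.5], maxfev=5000)
        boot.append(popt)
    except RuntimeError:
        continue                                         # skip degenerate resamples
boot = np.array(boot)

qmax_lo, qmax_hi = np.percentile(boot[:, 0], [2.5, 97.5])
kd_lo, kd_hi = np.percentile(boot[:, 1], [2.5, 97.5])
print(f"qmax 95% CI: [{qmax_lo:.1f}, {qmax_hi:.1f}] mg/mL resin")
print(f"Kd   95% CI: [{kd_lo:.2f}, {kd_hi:.2f}] mg/mL")
```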
Awan, Muaaz Gul; Saeed, Fahad
2016-05-15
Modern proteomics studies utilize high-throughput mass spectrometers which can produce data at an astonishing rate. These big mass spectrometry (MS) datasets can easily reach the peta-scale, creating storage and analytic problems for large-scale systems biology studies. Each spectrum consists of thousands of peaks which have to be processed to deduce the peptide. However, only a small percentage of peaks in a spectrum are useful for peptide deduction, as most of the peaks are either noise or not useful for a given spectrum. This redundant processing of non-useful peaks is a bottleneck for streaming high-throughput processing of big MS data. One way to reduce the amount of computation required in a high-throughput environment is to eliminate non-useful peaks. Existing noise-removal algorithms are limited in their data-reduction capability and are compute intensive, making them unsuitable for big data and high-throughput environments. In this paper we introduce a novel low-complexity technique based on classification, quantization and sampling of MS peaks. We present a novel data-reductive strategy for analysis of big MS data. Our algorithm, called MS-REDUCE, is capable of eliminating noisy peaks as well as peaks that do not contribute to peptide deduction before any peptide deduction is attempted. Our experiments have shown up to 100× speedup over existing state-of-the-art noise elimination algorithms while maintaining comparably high quality matches. Using our approach we were able to process a million spectra in just under an hour on a moderate server. The developed tool and strategy have been made available to the wider proteomics and parallel computing community; the code can be found at https://github.com/pcdslab/MSREDUCE. Contact: fahad.saeed@wmich.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
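The sketch below is a loose, toy illustration of the classify-quantize-sample idea (bin peaks into intensity classes and keep only a fraction from each class); it is not the MS-REDUCE algorithm, and the class count and retention fraction are arbitrary choices.

```python
# Toy illustration of thinning a peak list by quantizing peaks into intensity
# classes and keeping only a fraction of each class. This loosely mimics the
# classify-quantize-sample idea and is not the MS-REDUCE algorithm.
import random

def reduce_peaks(peaks, n_classes=4, keep_fraction=0.3):
    """peaks: list of (mz, intensity) tuples. Returns a thinned, mz-sorted list."""
    intensities = [i for _, i in peaks]
    lo, hi = min(intensities), max(intensities)
    width = (hi - lo) / n_classes or 1.0
    classes = {k: [] for k in range(n_classes)}
    for mz, inten in peaks:
        k = min(int((inten - lo) / width), n_classes - 1)
        classes[k].append((mz, inten))
    kept = []
    for members in classes.values():
        members.sort(key=lambda p: p[1], reverse=True)
        kept.extend(members[:max(1, int(keep_fraction * len(members)))])
    return sorted(kept)

rng = random.Random(7)
spectrum = [(100 + 0.37 * i, rng.randint(1, 1000)) for i in range(200)]
print(f"{len(spectrum)} peaks -> {len(reduce_peaks(spectrum))} peaks")
```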
Back, Alexandre; Rossignol, Tristan; Krier, François; Nicaud, Jean-Marc; Dhulster, Pascal
2016-08-23
Because the model yeast Yarrowia lipolytica can synthesize and store lipids in quantities up to 20% of its dry weight, it is a promising microorganism for oil production at an industrial scale. Typically, optimization of the lipid production process is performed in the laboratory and later scaled up for industrial production. However, the scale-up process can be complicated by genetic modifications that are optimized for one set of growing conditions but confer a less-than-optimal phenotype in a different environment. To address this issue, small cultivation systems have been developed that mimic the conditions in benchtop bioreactors. In this work, we used one such microbioreactor system, the BioLector, to develop high-throughput fermentation procedures that optimize growth and lipid accumulation in Y. lipolytica. Using this system, we were able to monitor lipid and biomass production in real time throughout the culture duration. The BioLector can monitor the growth of Y. lipolytica in real time by evaluating scattered light; this produced accurate measurements until cultures reached an equivalent of OD600nm = 115 and a cell dry weight of 100 g L-1. In addition, a lipid-specific fluorescent probe was applied which reliably monitored lipid production up to a concentration of 12 g L-1. Through screening various growing conditions, we determined that a carbon/nitrogen ratio of 35 was the most efficient for lipid production. Further screening showed that ammonium chloride and glycerol were the most valuable nitrogen and carbon sources, respectively, for growth and lipid production. Moreover, a carbon concentration above 1 M appeared to impair growth and lipid accumulation. Finally, we used these optimized conditions to screen engineered strains of Y. lipolytica with high lipid-accumulation capability. The growth and lipid content of the strains cultivated in the BioLector were compared to those grown in benchtop bioreactors. To our knowledge, this is the first time that the BioLector has been used to track lipid production in real time and to monitor the growth of Y. lipolytica. The present study also showed the efficacy of the BioLector in screening growing conditions and engineered strains prior to scale-up. The method described here could be applied to other oleaginous microorganisms.
Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas
2014-10-21
In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure-assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R² ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R² of 0.998) and precision (RSD of 9%). In conclusion, we present the proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.
Moresco, Karla Suzana; Silveira, Alexandre Kleber; Zeidán-Chuliá, Fares; Correa, Ana Paula Folmer; Oliveria, Rafael R; Borges, Adriana Giongo; Grun, Lucas; Barbé-Tuana, Florencia; Zmozinski, Ariane; Brandelli, Adriano; Vale, Maria Goretti Rodrigues; Gelain, Daniel Pens; Bassani, Valquiria Linck; Moreira, José Cláudio Fonseca
2017-01-01
Three Achyrocline satureioides (AS) inflorescence extracts were characterized: (i) a freeze-dried extract prepared from the aqueous extractive solution and (ii) a freeze-dried and (iii) a spray-dried extract prepared from a hydroethanol extractive solution (80% ethanol). The chemical profile, antioxidant potential, and antimicrobial activity against intestinal pathogenic bacteria of the AS extracts were evaluated. In vitro antioxidant activity was determined by the total reactive antioxidant potential (TRAP) assay. In vivo analysis and characterization of the intestinal microbiota were performed in male Wistar rats (saline versus animals treated with AS dried extracts) by high-throughput sequencing analysis (metabarcoding). Antimicrobial activity was tested in vitro by disc diffusion tests. Moisture content of the extracts ranged from 10 to 15%, and fluorine content from 5.7 to 17 mg kg-1. AS exhibited antioxidant activity, especially in its freeze-dried form, which also exhibited a wide spectrum of antimicrobial activity against intestinal pathogenic bacteria greater than that observed for the antibiotic amoxicillin when tested against Bacillus cereus and Staphylococcus aureus. Antioxidant and antimicrobial activities of the AS extracts seemed to be positively correlated with the amount of flavonoids present. These findings suggest a potential use of AS as a coadjuvant agent for treating bacterial-induced intestinal diseases with high rates of antibiotic resistance.
Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng
2018-06-04
In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly for high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose in single cells. Compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. More luminescence intensity was recorded from cell-retaining microwells than from the planar regions between the microwells, and this signal was correlated with the content of intracellular glucose. The successful monitoring of intracellular glucose in single cells using this PDMS chip will provide an alternative strategy for high-throughput single-cell analysis.
Chemical perturbation of vascular development is a putative toxicity pathway which may result in developmental toxicity. EPA’s high-throughput screening (HTS) ToxCast program contains assays which measure cellular signals and biological processes critical for blood vessel develop...
High-Throughput, Motility-Based Sorter for Microswimmers such as C. elegans
Yuan, Jinzhou; Zhou, Jessie; Raizen, David M.; Bau, Haim H.
2015-01-01
Animal motility varies with genotype, disease, aging, and environmental conditions. In many studies, it is desirable to carry out high throughput motility-based sorting to isolate rare animals for, among other things, forward genetic screens to identify genetic pathways that regulate phenotypes of interest. Many commonly used screening processes are labor-intensive, lack sensitivity, and require extensive investigator training. Here, we describe a sensitive, high throughput, automated, motility-based method for sorting nematodes. Our method is implemented in a simple microfluidic device capable of sorting thousands of animals per hour per module, and is amenable to parallelism. The device successfully enriches for known C. elegans motility mutants. Furthermore, using this device, we isolate low-abundance mutants capable of suppressing the somnogenic effects of the flp-13 gene, which regulates C. elegans sleep. By performing genetic complementation tests, we demonstrate that our motility-based sorting device efficiently isolates mutants for the same gene identified by tedious visual inspection of behavior on an agar surface. Therefore, our motility-based sorter is capable of performing high throughput gene discovery approaches to investigate fundamental biological processes. PMID:26008643
NASA Astrophysics Data System (ADS)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi
2010-06-01
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than that at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are carried out automatically based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
High-throughput electrical characterization for robust overlay lithography control
NASA Astrophysics Data System (ADS)
Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.
2017-03-01
Realizing sensitive, high throughput and robust overlay measurement is a challenge in current 14nm and upcoming advanced nodes with the transition to 300mm and upcoming 450mm semiconductor manufacturing, where slight deviations in overlay have a significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss limitations of current image- and diffraction-based overlay measurement techniques to meet these stringent processing requirements due to sensitivity, throughput and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance, where minima and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss challenges of integrating the electrical measurement based approach in semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for accessing simultaneous overlay as well as process window and margins from a robust, high throughput and electrical measurement approach.
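The fitting step can be illustrated with a short sketch: fit resistance versus programmed misalignment with a parabola and take the vertex as the overlay estimate. The test-structure response and noise level below are synthetic assumptions, not data from the paper.

```python
# Sketch of extracting overlay from electrical test-structure data: fit
# resistance versus programmed misalignment with a parabola and take the
# vertex as the overlay estimate. All numbers below are synthetic.
import numpy as np

offsets_nm = np.arange(-30.0, 31.0, 5.0)          # programmed misalignment
true_overlay_nm = 4.0
rng = np.random.default_rng(3)
resistance = 100.0 + 0.05 * (offsets_nm - true_overlay_nm) ** 2
resistance += rng.normal(0.0, 0.3, offsets_nm.size)

a, b, c = np.polyfit(offsets_nm, resistance, 2)   # R ~ a*x^2 + b*x + c
overlay_estimate = -b / (2.0 * a)                 # parabola vertex
print(f"estimated overlay: {overlay_estimate:.1f} nm (true {true_overlay_nm} nm)")
```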
Isolation and Characterization of a Novel, Highly Selective Astaxanthin-Producing Marine Bacterium.
Asker, Dalal
2017-10-18
A high-throughput screening approach for astaxanthin-producing bacteria led to the discovery of a novel, highly selective astaxanthin-producing marine bacterium (strain N-5). Phylogenetic analysis based on the partial 16S rRNA gene and phenotypic metabolic testing indicated that it belongs to the genus Brevundimonas. Therefore, it was designated Brevundimonas sp. strain N-5. To identify and quantify the carotenoids produced by strain N-5, HPLC-DAD and HPLC-MS methods were used. The culture conditions, including media, shaking, and time, had significant effects on cell growth and carotenoid production, including astaxanthin. The total carotenoids were ∼601.2 μg g-1 dry cells, including a remarkable amount (364.6 μg g-1 dry cells) of the optically pure astaxanthin (3S, 3'S) isomer, with high selectivity (∼60.6%) under medium aeration conditions. Notably, increasing the culture aeration enhanced astaxanthin production up to 85% of total carotenoids. This is the first report that describes a natural, highly selective astaxanthin-producing marine bacterium.
Mönch, Bettina; Becker, Roland; Nehls, Irene
2014-01-01
A combination of simultaneous milling and extraction, known as micropulverized extraction, was developed for the quantification of the alcohol marker ethyl glucuronide (EtG) in hair samples using a homogeneous reference material and a mixer mill. The best extraction results from 50 mg of hair were obtained with 2-mL plastic tubes containing two steel balls (∅ = 5 mm), 0.5 mL of water and an oscillating frequency of 30 s-1 over a period of 30 min. EtG was quantified employing a validated GC-MS procedure involving derivatization with pentafluoropropionic acid anhydride. This micropulverization procedure was compared with dry milling followed by separate aqueous extraction and with aqueous extraction after manual cutting to millimeter-size snippets. Micropulverization yielded 28.0 ± 1.70 pg/mg and was superior to manual cutting (23.0 ± 0.83 pg/mg) and equivalent to dry grinding (27.7 ± 1.71 pg/mg) with regard to completeness of EtG extraction. The option to process up to 20 samples simultaneously makes micropulverization especially valuable for the high-throughput processing of urgent samples.
Leulliot, Nicolas; Trésaugues, Lionel; Bremang, Michael; Sorel, Isabelle; Ulryck, Nathalie; Graille, Marc; Aboulfath, Ilham; Poupon, Anne; Liger, Dominique; Quevillon-Cheruel, Sophie; Janin, Joël; van Tilbeurgh, Herman
2005-06-01
Crystallization has long been regarded as one of the major bottlenecks in high-throughput structure determination by X-ray crystallography. Structural genomics projects have addressed this issue by using robots to set up automated crystal screens using nanodrop technology. This has moved the bottleneck from obtaining the first crystal hit to obtaining diffraction-quality crystals, as crystal optimization is a notoriously slow process that is difficult to automate. This article describes the high-throughput optimization strategies used in the Yeast Structural Genomics project, with selected successful examples.
Singh, Nitesh Kumar; Ernst, Mathias; Liebscher, Volkmar; Fuellen, Georg; Taher, Leila
2016-10-20
The biological relationships both between and within the functions, processes and pathways that operate within complex biological systems are only poorly characterized, making the interpretation of large scale gene expression datasets extremely challenging. Here, we present an approach that integrates gene expression and biological annotation data to identify and describe the interactions between biological functions, processes and pathways that govern a phenotype of interest. The product is a global, interconnected network, not of genes but of functions, processes and pathways, that represents the biological relationships within the system. We validated our approach on two high-throughput expression datasets describing organismal and organ development. Our findings are well supported by the available literature, confirming that developmental processes and apoptosis play key roles in cell differentiation. Furthermore, our results suggest that processes related to pluripotency and lineage commitment, which are known to be critical for development, interact mainly indirectly, through genes implicated in more general biological processes. Moreover, we provide evidence that supports the relevance of cell spatial organization in the developing liver for proper liver function. Our strategy can be viewed as an abstraction that is useful to interpret high-throughput data and devise further experiments.
High-throughput automatic defect review for 300mm blank wafers with atomic force microscope
NASA Astrophysics Data System (ADS)
Zandiatashbar, Ardavan; Kim, Byong; Yoo, Young-kook; Lee, Keibock; Jo, Ahjin; Lee, Ju Suk; Cho, Sang-Joon; Park, Sang-il
2015-03-01
As feature sizes in the lithography process continue to shrink, defect sizes on blank wafers become more comparable to device sizes. Defects with nm-scale characteristic size can be misclassified by automated optical inspection (AOI) and require post-processing for proper classification. The atomic force microscope (AFM) provides high lateral resolution and, through mechanical probing, the highest vertical resolution of any inspection technique. However, its low throughput and limited tip life, in addition to the laborious effort of locating defects, have been the major limitations of this technique. In this paper we introduce automatic defect review (ADR) AFM as a post-inspection metrology tool for defect study and classification on 300 mm blank wafers that overcomes the limitations stated above. ADR AFM provides a high-throughput, high-resolution, and non-destructive means of obtaining 3D information for nm-scale defect review and classification.
High-throughput method to predict extrusion pressure of ceramic pastes.
Cao, Kevin; Liu, Yang; Tucker, Christopher; Baumann, Michael; Grit, Grote; Lakso, Steven
2014-04-14
A new method was developed to measure the rheology of extrudable ceramic pastes using a Hamilton MicroLab Star liquid handler. The Hamilton instrument, normally used for high-throughput liquid processing, was expanded to function as a low-pressure capillary rheometer. Diluted ceramic pastes were forced through the modified pipettes, which produced pressure-drop data that were converted to standard rheology data. A known ceramic paste containing cellulose ether was made and diluted to various concentrations in water. The most dilute paste samples were tested in the Hamilton instrument, and the more typical, highly concentrated ceramic pastes were tested with a hydraulic ram extruder fitted with a capillary die and pressure measurement system. The rheology data from this study indicate that the dilute high-throughput method using the Hamilton instrument correlates with, and can predict, the rheology of concentrated ceramic pastes normally used in ceramic extrusion production processes.
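The abstract does not give the conversion equations used, but the standard capillary-rheometry relations (wall shear stress from the pressure drop, apparent shear rate from the volumetric flow rate) are the usual starting point. The sketch below illustrates that conversion with purely hypothetical tip dimensions and readings, without the Rabinowitsch or entrance corrections a full analysis would apply.

```python
# Minimal sketch of the standard capillary-rheometry conversion from
# pressure-drop data to apparent viscosity. The tip geometry and readings
# below are illustrative only; the study's actual corrections are not shown.
import math

def apparent_rheology(delta_p_pa, flow_rate_m3s, radius_m, length_m):
    """Return (wall shear stress [Pa], apparent shear rate [1/s], apparent viscosity [Pa.s])."""
    tau_w = delta_p_pa * radius_m / (2.0 * length_m)           # wall shear stress
    gamma_app = 4.0 * flow_rate_m3s / (math.pi * radius_m**3)  # apparent (Newtonian) shear rate
    return tau_w, gamma_app, tau_w / gamma_app

# Example: hypothetical pipette-tip capillary, 0.4 mm radius, 20 mm long
tau, gamma, eta = apparent_rheology(delta_p_pa=5.0e4,
                                    flow_rate_m3s=2.0e-9,
                                    radius_m=4.0e-4,
                                    length_m=2.0e-2)
print(f"tau_w = {tau:.1f} Pa, shear rate = {gamma:.1f} 1/s, eta_app = {eta:.2f} Pa.s")
```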
USDA-ARS?s Scientific Manuscript database
The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...
High-throughput sequencing methods to study neuronal RNA-protein interactions.
Ule, Jernej
2009-12-01
UV-cross-linking and RNase protection, combined with high-throughput sequencing, have provided global maps of RNA sites bound by individual proteins or ribosomes. Using a stringent purification protocol, UV-CLIP (UV-cross-linking and immunoprecipitation) was able to identify intronic and exonic sites bound by splicing regulators in mouse brain tissue. Ribosome profiling has been used to quantify ribosome density on budding yeast mRNAs under different environmental conditions. Post-transcriptional regulation in neurons requires high spatial and temporal precision, as is evident from the role of localized translational control in synaptic plasticity. It remains to be seen if the high-throughput methods can be applied quantitatively to study the dynamics of RNP (ribonucleoprotein) remodelling in specific neuronal populations during the neurodegenerative process. It is certain, however, that applications of new biochemical techniques followed by high-throughput sequencing will continue to provide important insights into the mechanisms of neuronal post-transcriptional regulation.
Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Y; Grossman, JC
2014-12-01
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
Engineering Interfacial Processes at Mini-Micro-Nano Scales Using Sessile Droplet Architecture.
Bansal, Lalit; Sanyal, Apratim; Kabi, Prasenjit; Pathak, Binita; Basu, Saptarshi
2018-03-01
Evaporating sessile functional droplets act as the fundamental building block that controls the cumulative outcome of many industrial and biological applications such as surface patterning, 3D printing, photonic crystals, and DNA sequencing, to name a few. Additionally, a drying single sessile droplet forms a high-throughput processing technique using low material volume which is especially suitable for medical diagnosis. A sessile droplet also provides an elementary platform to study and analyze fundamental interfacial processes at various length scales ranging from macroscopically observable wetting and evaporation to microfluidic transport to interparticle forces operating at a nanometric length scale. As an example, to ascertain the quality of 3D printing we must understand the fundamental interfacial processes at the droplet scale. In this article, we review the coupled physics of evaporation flow-contact-line-driven particle transport in sessile colloidal droplets and provide methodologies to control the same. Through natural alterations in droplet vaporization, one can change the evaporative pattern and contact line dynamics leading to internal flow which will modulate the final particle assembly in a nontrivial fashion. We further show that control over particle transport can also be exerted by external stimuli which can be thermal, mechanical oscillations, vapor confinement (walled or a fellow droplet), or chemical (surfactant-induced) in nature. For example, significant augmentation of an otherwise evaporation-driven particle transport in sessile droplets can be brought about simply through controlled interfacial oscillations. The ability to control the final morphologies by manipulating the governing interfacial mechanisms in the precursor stages of droplet drying makes it perfectly suitable for fabrication-, mixing-, and diagnostic-based applications.
Morphology control in polymer blend fibers—a high throughput computing approach
NASA Astrophysics Data System (ADS)
Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar
2016-08-01
Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent based fabrication. We focus on a PS: PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for specific morphology class. We finally demonstrate the potential for time dependent processing conditions to get desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.
ERIC Educational Resources Information Center
Grossman, Elly S.; Cleaton-Jones, Peter E.
2011-01-01
This retrospective study documents the Masters and PhD training of 131 Dental Research Institute (DRI) postgraduates (1954-2006) to establish demographics, throughput and research outcomes for future PhD pipeline strategies using the DRI database. Descriptive statistics show four degree-based groups of postgraduates: 18 PhDs; 55 MScs; 42 MDents…
Asif, Muhammad; Guo, Xiangzhou; Zhang, Jing; Miao, Jungang
2018-04-17
Digital cross-correlation is central to many applications including, but not limited to, digital image processing, satellite navigation and remote sensing. With recent advancements in digital technology, the computational demands of such applications have increased enormously. In this paper we present a high-throughput digital cross-correlator capable of processing a 1-bit digitized stream at rates of up to 2 GHz simultaneously on 64 channels, i.e., approximately 4 trillion correlation and accumulation operations per second. In order to achieve higher throughput, we have focused on frequency-based partitioning of our design and tried to minimize and localize high-frequency operations. This correlator is designed for a passive millimeter-wave imager intended for the detection of contraband items concealed on the human body. The goals are to increase the system bandwidth, achieve video-rate imaging, improve sensitivity and reduce the size. The design methodology is detailed in subsequent sections, elaborating the techniques enabling high throughput. The design is verified for a Xilinx Kintex UltraScale device in simulation, and the implementation results are given in terms of device utilization and power consumption estimates. Our results show considerable improvements in throughput as compared to our baseline design, while the correlator successfully meets the functional requirements.
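As an illustration of the core operation the FPGA parallelizes, the sketch below is a software reference model of 1-bit correlate-and-accumulate (XNOR agreement counting) in NumPy; it does not reproduce the paper's frequency-based partitioning or any hardware detail, and all values are synthetic.

```python
# Software reference model of a lag-zero 1-bit correlate-and-accumulate
# operation. Bit streams are represented as arrays of 0/1; the fraction of
# XNOR agreements approximates the correlation of the underlying signals.
import numpy as np

def one_bit_correlate(a_bits: np.ndarray, b_bits: np.ndarray) -> float:
    """Return a normalized 1-bit correlation in [-1, 1] for two 0/1 streams."""
    agree = np.count_nonzero(a_bits == b_bits)   # XNOR + accumulate
    return 2.0 * agree / a_bits.size - 1.0       # map agreement fraction to [-1, 1]

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=1_000_000, dtype=np.uint8)
noise_flips = rng.random(x.size) < 0.1           # flip 10% of bits to decorrelate
y = np.where(noise_flips, 1 - x, x)
print(one_bit_correlate(x, y))                   # ~0.8 for 10% flipped bits
```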
High Throughput Screen to Identify Novel Drugs that Inhibit Prostate Cancer Metastasis
2007-10-01
determined by electrophoresis on 8% denaturing polyacrylamide gel containing 7 M urea. A 10-bp 32P-labeled ladder and sequencing reaction performed with...isolated from nuclear lysates from P69 and C4-2 cells. The bands shown as rectangles were excised after staining of the proteins in the gel and then...the same primer on a genomic clone was used as a reference. The gel was dried, and the radioactive signals were identified by phosphorimaging (Storm
Cepero-Betancourt, Yamira; Oliva-Moresco, Patricio; Pasten-Contreras, Alexis; Tabilo-Munizaga, Gipsy; Pérez-Won, Mario; Moreno-Osorio, Luis; Lemus-Mondaca, Roberto
2017-10-01
Abalone (Haliotis spp.) is an exotic seafood product recognized as a protein source of high biological value. Traditional methods used to preserve foods, such as drying technology, can affect their nutritional quality (protein quality and digestibility). A 28-day rat feeding study was conducted to evaluate the effects of the drying process assisted by high-pressure impregnation (HPI) (350, 450, and 500 MPa × 5 min) on chemical proximate and amino acid compositions and nutritional parameters, such as protein efficiency ratio (PER), true digestibility (TD), net protein ratio, and protein digestibility corrected amino acid score (PDCAAS), of dried abalone. The HPI-assisted drying process ensured excellent protein quality based on PER values, regardless of the pressure level. At 350 and 500 MPa, the HPI-assisted drying process had no negative effect on TD and PDCAAS; therefore, based on the nutritional parameters analysed, we recommend the HPI-assisted drying process at 350 MPa × 5 min as the best process condition for drying abalone. Variations in nutritional parameters compared to casein protein were observed; nevertheless, the high protein quality and digestibility of HPI-assisted dried abalones were maintained to satisfy the metabolic demands of human beings.
Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter
2015-01-01
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko
2010-06-23
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor
2016-01-01
Following burgeoning genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain a large number of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-generating step, particularly when working with hundreds of targets, the automation of primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis and a Tm calculator for quick queries.
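The abstract does not state which melting-temperature model HTP-OligoDesigner uses, so the sketch below only illustrates the kind of quick Tm query such a tool answers, using two widely known first approximations (the Wallace rule and a GC-content formula).

```python
# Two common first-approximation melting-temperature estimates for primers.
# HTP-OligoDesigner's actual Tm model is not stated in the abstract; this is
# only an illustration of a quick Tm query.
def tm_wallace(seq: str) -> float:
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C); reasonable for primers shorter than ~14 nt."""
    s = seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def tm_gc(seq: str) -> float:
    """GC-content formula: Tm = 64.9 + 41*(G+C-16.4)/N; a rough estimate for longer oligos."""
    s = seq.upper()
    gc = s.count("G") + s.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(s)

primer = "ATGGCTAGCAAGGAGAAGCT"  # hypothetical 20-mer
print(f"Wallace: {tm_wallace(primer):.1f} C, GC formula: {tm_gc(primer):.1f} C")
```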
From drug to protein: using yeast genetics for high-throughput target discovery.
Armour, Christopher D; Lum, Pek Yee
2005-02-01
The budding yeast Saccharomyces cerevisiae has long been an effective eukaryotic model system for understanding basic cellular processes. The genetic tractability and ease of manipulation in the laboratory make yeast well suited for large-scale chemical and genetic screens. Several recent studies describing the use of yeast genetics for high-throughput drug target identification are discussed in this review.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., transfer racks, and equipment leaks. An owner or operator who is referred to this subpart for controlling regulated material emissions from storage vessels, process vents, low and high throughput transfer racks, or... racks. (i) For low throughput transfer racks, the owner or operator shall comply with the applicable...
High-throughput countercurrent microextraction in passive mode.
Xie, Tingliang; Xu, Cong
2018-05-15
Although microextraction is much more efficient than conventional macroextraction, its practical application has been limited by low throughputs and difficulties in constructing robust countercurrent microextraction (CCME) systems. In this work, a robust CCME process was established based on a novel passive microextractor with four units without any moving parts. The passive microextractor has internal recirculation and can efficiently mix two immiscible liquids. The hydraulic characteristics as well as the extraction and back-extraction performance of the passive CCME were investigated experimentally. The recovery efficiencies of the passive CCME were 1.43-1.68 times larger than the best values achieved using cocurrent extraction. Furthermore, the total throughput of the passive CCME developed in this work was about one to three orders of magnitude higher than that of other passive CCME systems reported in the literature. Therefore, a robust CCME process with high throughputs has been successfully constructed, which may promote the application of passive CCME in a wide variety of fields.
FPGA cluster for high-performance AO real-time control system
NASA Astrophysics Data System (ADS)
Geng, Deli; Goodsell, Stephen J.; Basden, Alastair G.; Dipper, Nigel A.; Myers, Richard M.; Saunter, Chris D.
2006-06-01
Whilst the high throughput and low latency requirements for the next generation of AO real-time control systems have posed a significant challenge to von Neumann architecture processor systems, the Field Programmable Gate Array (FPGA) has emerged as a long-term solution with high throughput performance and excellent latency predictability. Moreover, FPGA devices have highly capable programmable interfacing, which leads to more highly integrated systems. Nevertheless, a single FPGA is still not enough: multiple FPGA devices need to be clustered to perform the required subaperture processing and the reconstruction computation. In an AO real-time control system, the memory bandwidth is often the bottleneck of the system, simply because a vast amount of supporting data, e.g. pixel calibration maps and the reconstruction matrix, needs to be accessed within a short period. The cluster, as a general computing architecture, has excellent scalability in processing throughput, memory bandwidth, memory capacity, and communication bandwidth. Problems such as task distribution, node communication, and system verification are discussed.
Budavari, Tamas; Langmead, Ben; Wheelan, Sarah J.; Salzberg, Steven L.; Szalay, Alexander S.
2015-01-01
When computing alignments of DNA sequences to a large genome, a key element in achieving high processing throughput is to prioritize locations in the genome where high-scoring mappings might be expected. We formulated this task as a series of list-processing operations that can be efficiently performed on graphics processing unit (GPU) hardware. We followed this approach in implementing a read aligner called Arioc that uses GPU-based parallel sort and reduction techniques to identify high-priority locations where potential alignments may be found. We then carried out a read-by-read comparison of Arioc's reported alignments with the alignments found by several leading read aligners. With simulated reads, Arioc has comparable or better accuracy than the other read aligners we tested. With human sequencing reads, Arioc demonstrates significantly greater throughput than the other aligners we evaluated across a wide range of sensitivity settings. The Arioc software is available at https://github.com/RWilton/Arioc. It is released under a BSD open-source license. PMID:25780763
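As a rough illustration of the sort-and-reduce idea described above (not Arioc's GPU kernels), the NumPy sketch below sorts candidate genome positions suggested by seed hits and reduces them to per-location hit counts so the most promising locations can be examined first; the data are invented.

```python
# Illustration (CPU/NumPy, not Arioc's GPU implementation) of the general
# sort-and-reduce idea: collect candidate genome positions suggested by seed
# hits, sort them, and count hits per position so the highest-scoring
# locations can be aligned first.
import numpy as np

def prioritize_locations(candidate_positions: np.ndarray, top_k: int = 5):
    """Return (positions, hit_counts) of the top_k most frequently hit locations."""
    positions = np.sort(candidate_positions)                  # sort step
    uniq, counts = np.unique(positions, return_counts=True)   # reduction: hits per location
    order = np.argsort(counts)[::-1][:top_k]
    return uniq[order], counts[order]

# Hypothetical seed hits: four seeds agree on position 10_000, two stray hits elsewhere
hits = np.array([10_000, 10_000, 10_000, 52_331, 10_000, 998_877])
print(prioritize_locations(hits, top_k=2))
```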
Process in manufacturing high efficiency AlGaAs/GaAs solar cells by MO-CVD
NASA Technical Reports Server (NTRS)
Yeh, Y. C. M.; Chang, K. I.; Tandon, J.
1984-01-01
Manufacturing technology for mass-producing high-efficiency GaAs solar cells is discussed, including progress using a high-throughput MO-CVD reactor. Thickness and doping concentration uniformity of metal oxide chemical vapor deposition (MO-CVD) GaAs and AlGaAs layer growth are discussed. In addition, new tooling designs are given which increase the throughput of solar cell processing. To date, 2 cm x 2 cm AlGaAs/GaAs solar cells with efficiencies up to 16.5% have been produced. In order to meet throughput goals for mass-producing GaAs solar cells, a large MO-CVD system (Cambridge Instrument Model MR-200) with a susceptor initially capable of processing 20 wafers (up to 75 mm diameter) during a single growth run was installed. In the MR-200, the sequencing of the gases and the heating power are controlled by a microprocessor-based programmable control console. Hence, operator errors can be reduced, leading to a more reproducible production sequence.
A comparison of high-throughput techniques for assaying circadian rhythms in plants.
Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony
2015-01-01
Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.
Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y
2014-07-08
The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer - which measures the major and minor axial lengths of the imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and a quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and a noisy background that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
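The abstract does not state the volume formula SpheroidSizer applies to the measured axial lengths; a widely used convention for tumor spheroids is V = (pi/6) x length x width^2. The hypothetical sketch below shows that downstream calculation together with a spreadsheet-style CSV output of the kind the tool produces.

```python
# Sketch of the downstream calculation such a tool automates: convert measured
# major/minor axial lengths into a volume and write a spreadsheet-friendly CSV.
# The volume convention V = (pi/6) * length * width^2 is assumed here; the
# measurements are hypothetical.
import csv
import math

def spheroid_volume(major_um: float, minor_um: float) -> float:
    """Approximate spheroid volume in cubic micrometers."""
    return math.pi / 6.0 * major_um * minor_um ** 2

measurements = [                       # hypothetical (id, major, minor) in micrometers
    ("well_A01", 612.0, 580.0),
    ("well_A02", 540.5, 497.3),
]

with open("spheroid_volumes.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["spheroid_id", "major_um", "minor_um", "volume_um3"])
    for sid, major, minor in measurements:
        writer.writerow([sid, major, minor, f"{spheroid_volume(major, minor):.0f}"])
```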
Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B
2009-03-01
The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.
Bahrami-Samani, Emad; Vo, Dat T.; de Araujo, Patricia Rosa; Vogel, Christine; Smith, Andrew D.; Penalva, Luiz O. F.; Uren, Philip J.
2014-01-01
Co- and post-transcriptional regulation of gene expression is complex and multi-faceted, spanning the complete RNA lifecycle from genesis to decay. High-throughput profiling of the constituent events and processes is achieved through a range of technologies that continue to expand and evolve. Fully leveraging the resulting data is non-trivial, and requires the use of computational methods and tools carefully crafted for specific data sources and often intended to probe particular biological processes. Drawing upon databases of information pre-compiled by other researchers can further elevate analyses. Within this review, we describe the major co- and post-transcriptional events in the RNA lifecycle that are amenable to high-throughput profiling. We place specific emphasis on the analysis of the resulting data, in particular the computational tools and resources available, as well as looking towards future challenges that remain to be addressed. PMID:25515586
Keenan, Martine; Alexander, Paul W; Chaplin, Jason H; Abbott, Michael J; Diao, Hugo; Wang, Zhisen; Best, Wayne M; Perez, Catherine J; Cornwall, Scott M J; Keatley, Sarah K; Thompson, R C Andrew; Charman, Susan A; White, Karen L; Ryan, Eileen; Chen, Gong; Ioset, Jean-Robert; von Geldern, Thomas W; Chatelain, Eric
2013-10-01
Inhibitors of Trypanosoma cruzi with novel mechanisms of action are urgently required to diversify the current clinical and preclinical pipelines. Increasing the number and diversity of hits available for assessment at the beginning of the discovery process will help to achieve this aim. We report the evaluation of multiple hits generated from a high-throughput screen to identify inhibitors of T. cruzi and from these studies the discovery of two novel series currently in lead optimization. Lead compounds from these series potently and selectively inhibit growth of T. cruzi in vitro and the most advanced compound is orally active in a subchronic mouse model of T. cruzi infection. High-throughput screening of novel compound collections has an important role to play in diversifying the trypanosomatid drug discovery portfolio. A new T. cruzi inhibitor series with good drug-like properties and promising in vivo efficacy has been identified through this process.
Method of pyrolyzing brown coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michel, W.; Heberlein, I.; Ossowski, M.
A two-step method and apparatus based on the fluidized-bed principle are disclosed for the production of coke, rich gas and pyrolysis tar, with the object of executing the method in a compact apparatus arrangement with high energy efficiency and high throughput capacity. This is accomplished by a sequence in which the fine grains removed from the drying vapor mixture are excluded from the actual pyrolysis process, and a hot gas alien to the carbonization is used as the fluidization medium in the pyrolysis reactor. A high-performance hot-gas separator is used for dust separation from the pyrolysis gas, and the combustion exhaust gas produced in the combustion chamber is used for the indirect heating of the fluidization medium, for pre-heating the gas alien to the carbonization, and for direct heating in the dryer. The dryer has a double casing in the area of the fluidized bed, and a mixing chamber is arranged directly underneath its initial flow bottom, while the pyrolysis reactor is directly connected to the combustion chamber and the pre-heater.
Determining mechanism-based biomarkers that distinguish adaptive and adverse cellular processes is critical to understanding the health effects of environmental exposures. Shifting from in vivo, low-throughput toxicity studies to high-throughput screening (HTS) paradigms and risk...
Jia, Kun; Bijeon, Jean Louis; Adam, Pierre Michel; Ionescu, Rodica Elena
2013-02-21
A commercial TEM grid was used as a mask for the creation of extremely well-organized gold micro-/nano-structures on a glass substrate via a high temperature annealing process at 500 °C. The structured substrate was (bio)functionalized and used for the high throughput LSPR immunosensing of different concentrations of a model protein named bovine serum albumin.
Improving bed turnover time with a bed management system.
Tortorella, Frank; Ukanowicz, Donna; Douglas-Ntagha, Pamela; Ray, Robert; Triller, Maureen
2013-01-01
Efficient patient throughput requires a high degree of coordination and communication. Opportunities abound to improve the patient experience by eliminating waste from the process and improving communication among the multiple disciplines involved in facilitating patient flow. In this article, we demonstrate how an interdisciplinary team at a large tertiary cancer center implemented an electronic bed management system to improve the bed turnover component of the patient throughput process.
High-throughput imaging of heterogeneous cell organelles with an X-ray laser (CXIDB ID 25)
Hantke, Max F.
2014-11-17
Preprocessed detector images that were used for the paper "High-throughput imaging of heterogeneous cell organelles with an X-ray laser". The CXI file contains the entire recorded data - including both hits and blanks. It also includes down-sampled images and LCLS machine parameters. Additionally, the Cheetah configuration file is attached that was used to create the pre-processed data.
NASA Astrophysics Data System (ADS)
Jian, Wei; Estevez, Claudio; Chowdhury, Arshad; Jia, Zhensheng; Wang, Jianxin; Yu, Jianguo; Chang, Gee-Kung
2010-12-01
This paper presents an energy-efficient Medium Access Control (MAC) protocol for very-high-throughput millimeter-wave (mm-wave) wireless sensor communication networks (VHT-MSCNs) based on hybrid multiple access techniques of frequency division multiple access (FDMA) and time division multiple access (TDMA). An energy-efficient superframe for wireless sensor communication networks employing directional mm-wave wireless access technologies is proposed for systems that require very high throughput, such as high-definition video signals, for sensing, processing, transmitting, and actuating functions. Energy consumption modeling for each network element and comparisons among various multi-access technologies in terms of power and MAC-layer operations are investigated to evaluate the energy-efficiency improvement of the proposed MAC protocol.
NASA Astrophysics Data System (ADS)
Mousa, MoatazBellah Mahmoud
Atomic Layer Deposition (ALD) is a vapor phase nano-coating process that deposits very uniform and conformal thin film materials with sub-angstrom level thickness control on various substrates. These unique properties have made ALD a platform technology for numerous products and applications. However, most of these applications are limited to the lab scale due to the low process throughput relative to other deposition techniques, which hinders industrial adoption. In addition to the low throughput, process development for certain applications usually faces other obstacles, such as a required new processing mode (e.g., batch vs continuous) or process conditions (e.g., low temperature), the absence of an appropriate reactor design for a specific substrate, and sometimes the lack of a suitable chemistry. This dissertation studies different aspects of ALD process development for prospective applications in the semiconductor, textiles, and battery industries, as well as novel organic-inorganic hybrid materials. The investigation of a high pressure, low temperature ALD process for metal oxide deposition using multiple process chemistries revealed the vital importance of the gas velocity over the substrate for achieving fast depositions at these challenging processing conditions. Also in this work, two unique high throughput ALD reactor designs are reported. The first is a continuous roll-to-roll ALD reactor for ultra-fast coatings on porous, flexible substrates with very high surface area. The second is an ALD delivery head that allows for in loco ALD coatings that can be executed under ambient conditions (even outdoors) on large surfaces while still maintaining very high deposition rates. As a proof of concept, part of a parked automobile window was coated using the ALD delivery head. Another process development shown herein is the improvement achieved in the selective synthesis of organic-inorganic materials using an ALD-based process called sequential vapor infiltration. Finally, the development of a new ALD chemistry for novel metal deposition is discussed; it was used to deposit thin films of tin metal for the first time in the literature using an ALD process. The various challenges addressed in this work for the development of different ALD processes help move ALD closer to widespread use and industrial integration.
Hattrick-Simpers, Jason R.; Gregoire, John M.; Kusne, A. Gilad
2016-05-26
With their ability to rapidly elucidate composition-structure-property relationships, high-throughput experimental studies have revolutionized how materials are discovered, optimized, and commercialized. It is now possible to synthesize and characterize high-throughput libraries that systematically address thousands of individual cuts of fabrication parameter space. An unresolved issue remains transforming structural characterization data into phase mappings. This difficulty is related to the complex information present in diffraction and spectroscopic data and its variation with composition and processing. Here, we review the field of automated phase diagram attribution and discuss the impact that emerging computational approaches will have in the generation of phase diagrams and beyond.
NASA Astrophysics Data System (ADS)
Parisi, P.; Mani, A.; Perry-Sullivan, C.; Kopp, J.; Simpson, G.; Renis, M.; Padovani, M.; Severgnini, C.; Piacentini, P.; Piazza, P.; Beccalli, A.
2009-12-01
After-develop inspection (ADI) and photo-cell monitoring (PM) are part of a comprehensive lithography process monitoring strategy. Capturing defects of interest (DOI) in the lithography cell rather than at later process steps shortens the cycle time and allows for wafer re-work, reducing overall cost and improving yield. Low contrast DOI and multiple noise sources make litho inspection challenging. Broadband brightfield inspectors provide the highest sensitivity to litho DOI and are traditionally used for ADI and PM. However, a darkfield imaging inspector has shown sufficient sensitivity to litho DOI, providing a high-throughput option for litho defect monitoring. On the darkfield imaging inspector, a very high sensitivity inspection is used in conjunction with advanced defect binning to detect pattern issues and other DOI and minimize nuisance defects. For ADI, this darkfield inspection methodology enables the separation and tracking of 'color variation' defects that correlate directly to CD variations allowing a high-sampling monitor for focus excursions, thereby reducing scanner re-qualification time. For PM, the darkfield imaging inspector provides sensitivity to critical immersion litho defects at a lower cost-of-ownership. This paper describes litho monitoring methodologies developed and implemented for flash devices for 65nm production and 45nm development using the darkfield imaging inspector.
Quesada-Cabrera, Raul; Weng, Xiaole; Hyett, Geoff; Clark, Robin J H; Wang, Xue Z; Darr, Jawwad A
2013-09-09
High-throughput continuous hydrothermal flow synthesis was used to manufacture 66 unique nanostructured oxide samples in the Ce-Zr-Y-O system. This synthesis approach resulted in a significant increase in throughput compared to that of conventional batch or continuous hydrothermal synthesis methods. The as-prepared library samples were placed into a wellplate for both automated high-throughput powder X-ray diffraction and Raman spectroscopy data collection, which allowed comprehensive structural characterization and phase mapping. The data suggested that a continuous cubic-like phase field connects all three Ce-Zr-O, Ce-Y-O, and Y-Zr-O binary systems together with a smooth and steady transition between the structures of neighboring compositions. The continuous hydrothermal process led to as-prepared crystallite sizes in the range of 2-7 nm (as determined by using the Scherrer equation).
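For reference, the Scherrer estimate mentioned above relates crystallite size to X-ray peak broadening as D = K*lambda/(beta*cos(theta)). The short sketch below evaluates it with illustrative numbers (not values from the study) and lands in the reported 2-7 nm range.

```python
# The Scherrer estimate cited in the abstract: crystallite size
# D = K * lambda / (beta * cos(theta)), with beta the peak full width at half
# maximum in radians (instrumental broadening assumed already subtracted).
# The numerical inputs below are illustrative, not values from the study.
import math

def scherrer_size_nm(wavelength_nm: float, fwhm_deg: float, two_theta_deg: float,
                     shape_factor: float = 0.9) -> float:
    beta = math.radians(fwhm_deg)                 # FWHM in radians
    theta = math.radians(two_theta_deg / 2.0)     # Bragg angle
    return shape_factor * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation (0.15406 nm), a broad peak of 2.0 degrees FWHM at 2theta = 28.5 degrees
print(f"{scherrer_size_nm(0.15406, 2.0, 28.5):.1f} nm")  # ~4 nm, within the reported 2-7 nm range
```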
Automated crystallographic system for high-throughput protein structure determination.
Brunzelle, Joseph S; Shafaee, Padram; Yang, Xiaojing; Weigand, Steve; Ren, Zhong; Anderson, Wayne F
2003-07-01
High-throughput structural genomic efforts require software that is highly automated, distributive and requires minimal user intervention to determine protein structures. Preliminary experiments were set up to test whether automated scripts could utilize a minimum set of input parameters and produce a set of initial protein coordinates. From this starting point, a highly distributive system was developed that could determine macromolecular structures at a high throughput rate, warehouse and harvest the associated data. The system uses a web interface to obtain input data and display results. It utilizes a relational database to store the initial data needed to start the structure-determination process as well as generated data. A distributive program interface administers the crystallographic programs which determine protein structures. Using a test set of 19 protein targets, 79% were determined automatically.
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
Novel method for the high-throughput processing of slides for the comet assay
Karbaschi, Mahsa; Cooke, Marcus S.
2014-01-01
Single cell gel electrophoresis (the comet assay), continues to gain popularity as a means of assessing DNA damage. However, the assay's low sample throughput and laborious sample workup procedure are limiting factors to its application. “Scoring”, or individually determining DNA damage levels in 50 cells per treatment, is time-consuming, but with the advent of high-throughput scoring, the limitation is now the ability to process significant numbers of comet slides. We have developed a novel method by which multiple slides may be manipulated, and undergo electrophoresis, in batches of 25 rather than individually and, importantly, retains the use of standard microscope comet slides, which are the assay convention. This decreases assay time by 60%, and benefits from an electrophoresis tank with a substantially smaller footprint, and more uniform orientation of gels during electrophoresis. Our high-throughput variant of the comet assay greatly increases the number of samples analysed, decreases assay time, number of individual slide manipulations, reagent requirements and risk of damage to slides. The compact nature of the electrophoresis tank is of particular benefit to laboratories where bench space is at a premium. This novel approach is a significant advance on the current comet assay procedure. PMID:25425241
Jiang, Liren
2017-01-01
Background The aim was to develop scalable Whole Slide Imaging (sWSI), a WSI system based on mainstream smartphones coupled with regular optical microscopes. This ultra-low-cost solution should offer diagnostic-ready imaging quality on par with standalone scanners, supporting both oil and dry objective lenses of different magnifications, and reasonably high throughput. These performance metrics should be evaluated by expert pathologists and match those of high-end scanners. Objective The aim was to develop scalable Whole Slide Imaging (sWSI), a whole slide imaging system based on smartphones coupled with optical microscopes. This ultra-low-cost solution should offer diagnostic-ready imaging quality on par with standalone scanners, supporting both oil and dry object lens of different magnification. All performance metrics should be evaluated by expert pathologists and match those of high-end scanners. Methods In the sWSI design, the digitization process is split asynchronously between light-weight clients on smartphones and powerful cloud servers. The client apps automatically capture FoVs at up to 12-megapixel resolution and process them in real-time to track the operation of users, then give instant feedback of guidance. The servers first restitch each pair of FoVs, then automatically correct the unknown nonlinear distortion introduced by the lens of the smartphone on the fly, based on pair-wise stitching, before finally combining all FoVs into one gigapixel VS for each scan. These VSs can be viewed using Internet browsers anywhere. In the evaluation experiment, 100 frozen section slides from patients randomly selected among in-patients of the participating hospital were scanned by both a high-end Leica scanner and sWSI. All VSs were examined by senior pathologists whose diagnoses were compared against those made using optical microscopy as ground truth to evaluate the image quality. Results The sWSI system is developed for both Android and iPhone smartphones and is currently being offered to the public. The image quality is reliable and throughput is approximately 1 FoV per second, yielding a 15-by-15 mm slide under 20X object lens in approximately 30-35 minutes, with little training required for the operator. The expected cost for setup is approximately US $100 and scanning each slide costs between US $1 and $10, making sWSI highly cost-effective for infrequent or low-throughput usage. In the clinical evaluation of sample-wise diagnostic reliability, average accuracy scores achieved by sWSI-scan-based diagnoses were as follows: 0.78 for breast, 0.88 for uterine corpus, 0.68 for thyroid, and 0.50 for lung samples. The respective low-sensitivity rates were 0.05, 0.05, 0.13, and 0.25 while the respective low-specificity rates were 0.18, 0.08, 0.20, and 0.25. The participating pathologists agreed that the overall quality of sWSI was generally on par with that produced by high-end scanners, and did not affect diagnosis in most cases. Pathologists confirmed that sWSI is reliable enough for standard diagnoses of most tissue categories, while it can be used for quick screening of difficult cases. Conclusions As an ultra-low-cost alternative to whole slide scanners, diagnosis-ready VS quality and robustness for commercial usage is achieved in the sWSI solution. Operated on main-stream smartphones installed on normal optical microscopes, sWSI readily offers affordable and reliable WSI to resource-limited or infrequent clinical users. PMID:28916508
The Adverse Outcome Pathway (AOP) framework provides a systematic way to describe linkages between molecular and cellular processes and organism or population level effects. The current AOP assembly methods however, are inefficient. Our goal is to generate computationally-pr...
GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.
Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M
2010-04-05
Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
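As a hedged illustration of the kind of export GlycoExtractor performs (not its actual code), the sketch below flattens per-sample peak data (peak number, peak area and glucose-unit value) into JSON and CSV files; the field names and values are assumptions.

```python
# Not GlycoExtractor itself, just a sketch of the kind of export it performs:
# flattening per-sample peak data (peak number, peak area, glucose-unit value)
# into JSON and CSV for downstream analysis. Field names are assumptions.
import csv
import json

profiles = {                              # hypothetical processed HPLC glycan profiles
    "sample_001": [{"peak": 1, "area": 1520.4, "gu": 5.82},
                   {"peak": 2, "area": 980.1, "gu": 6.44}],
    "sample_002": [{"peak": 1, "area": 1710.9, "gu": 5.79}],
}

with open("glycan_profiles.json", "w") as fh:
    json.dump(profiles, fh, indent=2)

with open("glycan_profiles.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["sample", "peak", "area", "gu"])
    for sample, peaks in profiles.items():
        for p in peaks:
            writer.writerow([sample, p["peak"], p["area"], p["gu"]])
```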
NASA Astrophysics Data System (ADS)
Ahmad, Afandi; Roslan, Muhammad Faris; Amira, Abbes
2017-09-01
In high jump sports, the approach take-off speed and the force during take-off are the two main parameters needed to achieve a maximum jump. To measure both parameters, a wireless sensor network (WSN) containing a microcontroller and sensors is needed to report the speed and force for jumpers. Most microcontrollers exhibit transmission issues in terms of throughput, latency and cost. Thus, this study presents a comparison of wireless microcontrollers in terms of throughput, latency and cost; the microcontroller with the best performance and cost will be implemented in the high jump wearable device. In the experiments, three parts have been integrated - input, process and output. A force sensor (for the ankle) and a global positioning system (GPS) sensor (for the body waist) act as inputs for data transmission. These data were then processed by both microcontrollers, ESP8266 and Arduino Yun Mini, to transmit the data from the sensors to the server (host-PC) via the message queuing telemetry transport (MQTT) protocol. The server acts as the receiver, and the results were calculated from the MQTT log files. In the end, the results showed that the ESP8266 microcontroller was chosen since it achieved high throughput, low latency and is 11 times cheaper in terms of price compared to the Arduino Yun Mini microcontroller.
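For context, a minimal host-side sketch of the MQTT transport is shown below using the paho-mqtt Python package; the broker address, topic name and payload fields are hypothetical, and the study's actual ESP8266/Arduino firmware is not reproduced.

```python
# Host-side sketch of the MQTT transport: publish one timestamped force/GPS
# sample to a broker so that throughput and latency can later be estimated
# from the broker/server logs. Broker address, topic and payload fields are
# hypothetical; they are not taken from the paper.
import json
import time

import paho.mqtt.publish as publish

BROKER = "192.168.1.10"          # assumed host-PC broker address
TOPIC = "highjump/athlete01"     # assumed topic name

sample = {
    "t_ms": int(time.time() * 1000),   # send timestamp, later used to estimate latency
    "force_n": 1835.2,                 # hypothetical take-off force reading
    "lat": 3.1412, "lon": 101.6865,    # hypothetical GPS fix
}

# Publish one QoS-1 message; throughput can be estimated by repeating this in a
# loop and comparing send timestamps against the server's receive log.
publish.single(TOPIC, json.dumps(sample), qos=1, hostname=BROKER)
```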
Bharat, Amrita; Blanchard, Jan E.; Brown, Eric D.
2014-01-01
The synthesis of ribosomes is an essential process, which is aided by a variety of trans-acting factors in bacteria. Among these is a group of GTPases essential for bacterial viability and emerging as promising targets for new antibacterial agents. Herein, we describe a robust high-throughput screening process for inhibitors of one such GTPase, the Escherichia coli EngA protein. The primary screen employed an assay of phosphate production in 384-well density. Reaction conditions were chosen to maximize sensitivity for the discovery of competitive inhibitors while maintaining a strong signal amplitude and low noise. In a pilot screen of 31,800 chemical compounds, 44 active compounds were identified. Further, we describe the elimination of non-specific inhibitors that were detergent-sensitive or reactive, as well as those that interfered with the high-throughput phosphate assay. Four inhibitors survived these common counter-screens for non-specificity, but these chemicals were also inhibitors of the unrelated enzyme dihydrofolate reductase, suggesting that they too were promiscuously active. The high-throughput screen of the EngA protein described here provides a meticulous pilot study in the search for specific inhibitors of GTPases involved in ribosome biogenesis. PMID:23606650
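The abstract emphasizes strong signal amplitude and low noise; a standard way to quantify this in plate-based screens is the Z'-factor computed from positive and negative control wells. The sketch below shows that calculation with invented data; the abstract does not state that this exact metric was used.

```python
# Standard plate-quality metric for high-throughput screens: the Z'-factor
# computed from positive (uninhibited enzyme) and negative (no enzyme) control
# wells. Values here are made up; the paper does not report this calculation.
import numpy as np

def z_prime(pos: np.ndarray, neg: np.ndarray) -> float:
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|; > 0.5 indicates a robust assay."""
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

rng = np.random.default_rng(1)
pos_controls = rng.normal(1.00, 0.04, 32)   # hypothetical normalized phosphate signal
neg_controls = rng.normal(0.05, 0.03, 32)
print(f"Z' = {z_prime(pos_controls, neg_controls):.2f}")
```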
NASA Technical Reports Server (NTRS)
1981-01-01
The modified CG2000 crystal grower construction, installation, and machine check-out was completed. The process development check-out proceeded with several dry runs and one growth run. Several machine calibrations and functional problems were discovered and corrected. Several exhaust gas analysis system alternatives were evaluated and an integrated system approved and ordered. A contract presentation was made at the Project Integration Meeting at JPL, including cost-projections using contract projected throughput and machine parameters. Several growth runs on a development CG200 RC grower show that complete neck, crown, and body automated growth can be achieved with only one operator input. Work continued for melt level, melt temperature, and diameter sensor development.
Gikanga, Benson; Turok, Robert; Hui, Ada; Bowen, Mayumi; Stauch, Oliver B; Maa, Yuh-Fun
2015-01-01
Spray-dried monoclonal antibody (mAb) powders may offer applications more versatile than the freeze-dried cake, including preparing high-concentration formulations for subcutaneous administration. Published studies on this topic, however, are generally scarce. This study evaluates a pilot-scale spray dryer against a laboratory-scale dryer to spray-dry multiple mAbs in consideration of scale-up, impact on mAb stability, and feasibility of a high-concentration preparation. Under similar conditions, both dryers produced powders of similar properties-for example, water content, particle size and morphology, and mAb stability profile-despite a 4-fold faster output by the pilot-scale unit. All formulations containing arginine salt or a combination of arginine salt and trehalose were able to be spray-dried with high powder collection efficiency (>95%), but yield was adversely affected in formulations with high trehalose content due to powder sticking to the drying chamber. Spray-drying production output was dictated by the size of the dryer operated at an optimal liquid feed rate. Spray-dried powders could be reconstituted to high-viscosity liquids, >300 cP, substantially beyond what an ultrafiltration process can achieve. The molar ratio of trehalose to mAb needed to be reduced to 50:1 in consideration of isotonicity of the formulation with mAb concentration at 250 mg/mL. Even with this low level of sugar protection, long-term stability of spray-dried formulations remained superior to their liquid counterparts based on size variant and potency data. This study offers a commercially viable spray-drying process for biological bulk storage and an option for high-concentration mAb manufacturing. This study evaluates a pilot-scale spray dryer against a laboratory-scale dryer to spray-dry multiple monoclonal antibodies (mAbs) from the perspective of scale-up, impact on mAb stability, and feasibility of a high-concentration preparation. The data demonstrated that there is no process limitation in solution viscosity when high-concentration mAb formulations are prepared from spray-dried powder reconstitution compared with concentration via the conventional ultrafiltration process. This study offers a commercially viable spray-drying process for biological bulk storage and a high-concentration mAb manufacturing option for subcutaneous administration. The outcomes of this study will benefit scientists and engineers who develop high-concentration mAb products by providing a viable manufacturing alternative. © PDA, Inc. 2015.
NCBI GEO: archive for high-throughput functional genomic data.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron
2009-01-01
The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.
High-throughput determination of structural phase diagram and constituent phases using GRENDEL
NASA Astrophysics Data System (ADS)
Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.
2015-11-01
Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. To convert this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically, to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. The Sunburst radial tree map is also demonstrated as a tool to visualize material structure-property relationships found through graph-based analysis.
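A simplified stand-in can clarify the basic idea of extracting constituent-phase signatures from a combinatorial diffraction library. The sketch below uses plain non-negative matrix factorization on synthetic patterns; it is not the GRENDEL algorithm (no graph structure or physics constraints), and all array sizes are invented.

```python
# Simplified stand-in for endmember-style analysis of a combinatorial XRD library:
# non-negative matrix factorization (NMF) decomposes each diffraction pattern into
# a small set of non-negative "constituent phase" spectra and per-sample weights.
# This is NOT the GRENDEL algorithm (no graph structure or physics constraints);
# it only illustrates the basic decomposition step. Data here are synthetic.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_samples, n_2theta, n_phases = 60, 400, 3

# Synthetic library: every measured pattern is a non-negative mixture of 3 phases.
true_phases = rng.random((n_phases, n_2theta))
weights = rng.dirichlet(np.ones(n_phases), size=n_samples)
patterns = weights @ true_phases + 0.01 * rng.random((n_samples, n_2theta))

model = NMF(n_components=n_phases, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(patterns)   # per-sample phase fractions (unnormalized)
H = model.components_               # estimated constituent-phase patterns

# Label each composition point by its dominant phase -> crude phase-diagram regions.
labels = W.argmax(axis=1)
print("Samples per dominant phase:", np.bincount(labels, minlength=n_phases))
```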
High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics
NASA Astrophysics Data System (ADS)
Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine
2016-06-01
Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10^4-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with a low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.
40 CFR 65.151 - Condensers used as control devices.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the design evaluation for storage vessels and low-throughput transfer rack controls. As provided in... control device on a Group 1 process vent or a high-throughput transfer rack with a condenser used as a... 40 Protection of Environment 16 2014-07-01 2014-07-01 false Condensers used as control devices. 65...
40 CFR 65.151 - Condensers used as control devices.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the design evaluation for storage vessels and low-throughput transfer rack controls. As provided in... control device on a Group 1 process vent or a high-throughput transfer rack with a condenser used as a... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Condensers used as control devices. 65...
40 CFR 65.151 - Condensers used as control devices.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the design evaluation for storage vessels and low-throughput transfer rack controls. As provided in... control device on a Group 1 process vent or a high-throughput transfer rack with a condenser used as a... 40 Protection of Environment 15 2011-07-01 2011-07-01 false Condensers used as control devices. 65...
Low inlet gas velocity high throughput biomass gasifier
Feldmann, Herman F.; Paisley, Mark A.
1989-01-01
The present invention discloses a novel method of operating a gasifier for production of fuel gas from carbonaceous fuels. The process disclosed enables operating in an entrained mode using inlet gas velocities of less than 7 feet per second, feedstock throughputs exceeding 4000 lbs/ft²-hr, and pressures below 100 psia.
Rapid high-throughput cloning and stable expression of antibodies in HEK293 cells.
Spidel, Jared L; Vaessen, Benjamin; Chan, Yin Yin; Grasso, Luigi; Kline, J Bradford
2016-12-01
Single-cell based amplification of immunoglobulin variable regions is a rapid and powerful technique for cloning antigen-specific monoclonal antibodies (mAbs) for purposes ranging from general laboratory reagents to therapeutic drugs. From the initial screening process involving small quantities of hundreds or thousands of mAbs through in vitro characterization and subsequent in vivo experiments requiring large quantities of only a few, having a robust system for generating mAbs from cloning through stable cell line generation is essential. A protocol was developed to decrease the time, cost, and effort required by traditional cloning and expression methods by eliminating bottlenecks in these processes. Removing the clonal selection steps from the cloning process using a highly efficient ligation-independent protocol and from the stable cell line process by utilizing bicistronic plasmids to generate stable semi-clonal cell pools facilitated an increased throughput of the entire process from plasmid assembly through transient transfections and selection of stable semi-clonal cell pools. Furthermore, the time required by a single individual to clone, express, and select stable cell pools in a high-throughput format was reduced from 4 to 6 months to only 4 to 6 weeks. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Hu, E; Liao, T. W.; Tiersch, T. R.
2013-01-01
Cryopreservation of fish sperm has been studied for decades at a laboratory (research) scale. However, high-throughput cryopreservation of fish sperm has recently been developed to enable industrial-scale production. This study treated blue catfish (Ictalurus furcatus) sperm high-throughput cryopreservation as a manufacturing production line and initiated quality assurance plan development. The main objectives were to identify: 1) the main production quality characteristics; 2) the process features for quality assurance; 3) the internal quality characteristics and their specification designs; 4) the quality control and process capability evaluation methods; and 5) the directions for further improvements and applications. The essential product quality characteristics were identified as fertility-related characteristics. Specification design, which established the tolerance levels according to demand and process constraints, was performed based on these quality characteristics. Meanwhile, to ensure integrity throughout the process, internal quality characteristics (characteristics at each quality control point within the process) that could affect fertility-related quality characteristics were defined with specifications. Because the process features 100% inspection (quality inspection of every fish), cumulative sum (CUSUM) control charts were applied to monitor each quality characteristic. An overall process evaluation index, process capability, was analyzed based on the in-control process and the designed specifications, further integrating the quality assurance plan. With the established quality assurance plan, the process could operate stably and quality of products would be reliable. PMID:23872356
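The CUSUM monitoring described above can be illustrated with a minimal tabular CUSUM sketch. The quality characteristic, target value, slack parameter, and decision interval below are illustrative assumptions, not values from the published plan.

```python
# Minimal tabular CUSUM sketch for monitoring one quality characteristic
# (e.g., post-thaw motility per fish). Target mu0, slack k, and decision
# interval h are illustrative values, not those from the published plan.
def cusum(values, mu0, k, h):
    """Return upper/lower CUSUM tracks and indices of out-of-control points."""
    c_plus, c_minus = 0.0, 0.0
    upper, lower, alarms = [], [], []
    for i, x in enumerate(values):
        c_plus = max(0.0, c_plus + (x - mu0 - k))
        c_minus = min(0.0, c_minus + (x - mu0 + k))
        upper.append(c_plus)
        lower.append(c_minus)
        if c_plus > h or c_minus < -h:
            alarms.append(i)
    return upper, lower, alarms

motility = [52, 55, 50, 48, 51, 47, 44, 41, 43, 40]   # % motile, 100% inspection
up, low, alarms = cusum(motility, mu0=50, k=2.5, h=10)
print("Out-of-control at samples:", alarms)
```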
Plasma Enhanced Growth of Carbon Nanotubes For Ultrasensitive Biosensors
NASA Technical Reports Server (NTRS)
Cassell, Alan M.; Meyyappan, M.
2004-01-01
The multitude of considerations facing nanostructure growth and integration lends itself to combinatorial optimization approaches. Rapid optimization becomes even more important with wafer-scale growth and integration processes. Here we discuss methodology for developing plasma enhanced CVD growth techniques for achieving individual, vertically aligned carbon nanostructures that show excellent properties as ultrasensitive electrodes for nucleic acid detection. We utilize high throughput strategies for optimizing the upstream and downstream processing and integration of carbon nanotube electrodes as functional elements in various device types. An overview of ultrasensitive carbon nanotube based sensor arrays for electrochemical bio-sensing applications and the high throughput methodology utilized to combine novel electrode technology with conventional MEMS processing will be presented.
Plasma Enhanced Growth of Carbon Nanotubes For Ultrasensitive Biosensors
NASA Technical Reports Server (NTRS)
Cassell, Alan M.; Li, J.; Ye, Q.; Koehne, J.; Chen, H.; Meyyappan, M.
2004-01-01
The multitude of considerations facing nanostructure growth and integration lends itself to combinatorial optimization approaches. Rapid optimization becomes even more important with wafer-scale growth and integration processes. Here we discuss methodology for developing plasma enhanced CVD growth techniques for achieving individual, vertically aligned carbon nanostructures that show excellent properties as ultrasensitive electrodes for nucleic acid detection. We utilize high throughput strategies for optimizing the upstream and downstream processing and integration of carbon nanotube electrodes as functional elements in various device types. An overview of ultrasensitive carbon nanotube based sensor arrays for electrochemical biosensing applications and the high throughput methodology utilized to combine novel electrode technology with conventional MEMS processing will be presented.
Murlidhar, Vasudha; Zeinali, Mina; Grabauskiene, Svetlana; Ghannad-Rezaie, Mostafa; Wicha, Max S; Simeone, Diane M; Ramnath, Nithya; Reddy, Rishindra M; Nagrath, Sunitha
2014-12-10
Circulating tumor cells (CTCs) are believed to play an important role in metastasis, a process responsible for the majority of cancer-related deaths. However, their rarity in the bloodstream makes microfluidic isolation complex and time-consuming. Additionally, the low processing speeds can be a hindrance to obtaining higher yields of CTCs, limiting their potential use as biomarkers for early diagnosis. Here, a high-throughput microfluidic technology, the OncoBean Chip, is reported. It employs radial flow that introduces a varying shear profile across the device, enabling efficient cell capture by affinity at high flow rates. The recovery from whole blood is validated with cancer cell lines H1650 and MCF7, achieving a mean efficiency >80% at a throughput of 10 mL/h, in contrast to the flow rate of 1 mL/h typically reported with other microfluidic devices. Cells are recovered with a viability rate of 93% at these high speeds, increasing the ability to use captured CTCs for downstream analysis. Broad clinical application is demonstrated using comparable flow rates from blood specimens obtained from breast, pancreatic, and lung cancer patients. Comparable CTC numbers are recovered in all the samples at the two flow rates, demonstrating the ability of the technology to perform at high throughputs. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cernak, Tim; Gesmundo, Nathan J; Dykstra, Kevin; Yu, Yang; Wu, Zhicai; Shi, Zhi-Cai; Vachal, Petr; Sperbeck, Donald; He, Shuwen; Murphy, Beth Ann; Sonatore, Lisa; Williams, Steven; Madeira, Maria; Verras, Andreas; Reiter, Maud; Lee, Claire Heechoon; Cuff, James; Sherer, Edward C; Kuethe, Jeffrey; Goble, Stephen; Perrotto, Nicholas; Pinto, Shirly; Shen, Dong-Ming; Nargund, Ravi; Balkovec, James; DeVita, Robert J; Dreher, Spencer D
2017-05-11
Miniaturization and parallel processing play an important role in the evolution of many technologies. We demonstrate the application of miniaturized high-throughput experimentation methods to resolve synthetic chemistry challenges on the frontlines of a lead optimization effort to develop diacylglycerol acyltransferase (DGAT1) inhibitors. Reactions were performed on ∼1 mg scale in glass microvials, providing a miniaturized high-throughput experimentation capability that was used to study a challenging SNAr reaction. The availability of robust synthetic chemistry conditions discovered in these miniaturized investigations enabled the development of structure-activity relationships that ultimately led to the discovery of soluble, selective, and potent inhibitors of DGAT1.
Mathematical and Computational Modeling in Complex Biological Systems
Li, Wenyang; Zhu, Xiaoliang
2017-01-01
The biological processes and molecular functions involved in cancer progression remain difficult to understand for biologists and clinicians. Recent developments in high-throughput technologies are pushing systems biology toward more precise models of complex diseases. Computational and mathematical models are increasingly being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, there is an urgent need to bridge the gap between high-throughput technologies and systemic modeling of biological processes in cancer research. In this review, we first survey several typical mathematical modeling approaches for biological systems at different scales and analyze their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling are summarized. In conclusion, this review provides an update on important solutions using computational modeling approaches in systems biology. PMID:28386558
Mathematical and Computational Modeling in Complex Biological Systems.
Ji, Zhiwei; Yan, Ke; Li, Wenyang; Hu, Haigen; Zhu, Xiaoliang
2017-01-01
The biological processes and molecular functions involved in cancer progression remain difficult to understand for biologists and clinicians. Recent developments in high-throughput technologies are pushing systems biology toward more precise models of complex diseases. Computational and mathematical models are increasingly being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, there is an urgent need to bridge the gap between high-throughput technologies and systemic modeling of biological processes in cancer research. In this review, we first survey several typical mathematical modeling approaches for biological systems at different scales and analyze their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling are summarized. In conclusion, this review provides an update on important solutions using computational modeling approaches in systems biology.
High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.
Liu, Guangbo; Lanham, Clayton; Buchan, J Ross; Kaplan, Matthew E
2017-01-01
Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable, optimized high-throughput transformation protocol that carries almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid-handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high-throughput molecular and/or genetic analysis of yeast.
High-Throughput Toxicity Testing: New Strategies for ...
In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it
Microfluidic guillotine for single-cell wound repair studies
NASA Astrophysics Data System (ADS)
Blauch, Lucas R.; Gai, Ya; Khor, Jian Wei; Sood, Pranidhi; Marshall, Wallace F.; Tang, Sindy K. Y.
2017-07-01
Wound repair is a key feature distinguishing living from nonliving matter. Single cells are increasingly recognized to be capable of healing wounds. The lack of reproducible, high-throughput wounding methods has hindered single-cell wound repair studies. This work describes a microfluidic guillotine for bisecting single Stentor coeruleus cells in a continuous-flow manner. Stentor is used as a model due to its robust repair capacity and the ability to perform gene knockdown in a high-throughput manner. Local cutting dynamics reveals two regimes under which cells are bisected, one at low viscous stress where cells are cut with small membrane ruptures and high viability and one at high viscous stress where cells are cut with extended membrane ruptures and decreased viability. A cutting throughput up to 64 cells per minute—more than 200 times faster than current methods—is achieved. The method allows the generation of more than 100 cells in a synchronized stage of their repair process. This capacity, combined with high-throughput gene knockdown in Stentor, enables time-course mechanistic studies impossible with current wounding methods.
Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki
2014-12-01
As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.
Lu, Rong; Neff, Norma F.; Quake, Stephen R.; Weissman, Irving L.
2011-01-01
Disentangling cellular heterogeneity is a challenge in many fields, particularly in the stem cell and cancer biology fields. Here, we demonstrate how to combine viral genetic barcoding with high-throughput sequencing to track single cells in a heterogeneous population. We use this technique to track the in vivo differentiation of unitary hematopoietic stem cells (HSCs). The results are consistent with single cell transplantation studies, but require two orders of magnitude fewer mice. In addition to its high throughput, the high sensitivity of the technique allows for a direct examination of the clonality of sparse cell populations such as HSCs. We show how these capabilities offer a clonal perspective of the HSC differentiation process. In particular, our data suggest that HSCs do not contribute equally to blood cells after irradiation-mediated transplantation, and that two distinct HSC differentiation patterns co-exist in the same recipient mouse post irradiation. This technique can be applied to any virally accessible cell type for both in vitro and in vivo processes. PMID:21964413
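The core analysis step of such barcoding experiments is turning barcode read counts into per-clone contributions in each sorted population. The hedged sketch below uses synthetic read lists and an illustrative abundance cutoff; real pipelines also collapse sequencing errors and normalize against spike-ins.

```python
# Hedged sketch: turning barcode read counts into per-clone lineage contributions.
# Real pipelines also handle sequencing-error collapse and spike-in normalization;
# counts below are synthetic and the 0.1% threshold is an illustrative cutoff.
from collections import defaultdict

# reads[lineage] -> list of barcode sequences observed in that sorted population
reads = {
    "granulocyte": ["ACGT"] * 900 + ["TTGA"] * 90 + ["GGCC"] * 10,
    "B_cell":      ["ACGT"] * 300 + ["TTGA"] * 650 + ["GGCC"] * 50,
}

def clone_fractions(read_lists, min_fraction=0.001):
    """Per-lineage fraction contributed by each barcode (clone)."""
    table = defaultdict(dict)
    for lineage, barcodes in read_lists.items():
        total = len(barcodes)
        counts = defaultdict(int)
        for bc in barcodes:
            counts[bc] += 1
        for bc, n in counts.items():
            frac = n / total
            if frac >= min_fraction:
                table[bc][lineage] = round(frac, 3)
    return dict(table)

for barcode, profile in clone_fractions(reads).items():
    print(barcode, profile)   # lineage-biased vs. balanced clones are visible here
```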
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, L.J.; Uhler, A.D.
1988-09-01
In recent years materials not directly associated with food production, such as polychlorinated and brominated biphenyls, have been found in foods. According to the criteria to evaluate the likelihood for a chemical to contaminate food, the volatile halocarbons (VHCs) were selected as target compounds in an examination for potential contaminants in selected foods. The technique of multiple headspace extraction (MHE) was used in this study to minimize sample handling, and thereby reduce the potential for laboratory contamination and maximize throughput. Recently this laboratory reported findings of several VHCs in margarine, including PCE in four samples at levels above the usual background findings. Those samples had been obtained from a food store located immediately next to a dry-cleaning establishment. Follow-up investigation was conducted to determine the frequency of occurrence and levels of PCE that may be present in fatty foods purchased from stores located both near and distant from dry cleaners. Butter was chosen as a model food because it is a highly uniform product of very high fat content, which would be expected to act as a good absorber of the lipophilic VHCs. This paper presents results of these analyses and correlations between level of VHCs in butter and the proximity to dry cleaners of the food store where the butter was purchased.
BEAMS Lab: Novel approaches to finding a balance between throughput and sensitivity
NASA Astrophysics Data System (ADS)
Liberman, Rosa G.; Skipper, Paul L.; Prakash, Chandra; Shaffer, Christopher L.; Flarakos, Jimmy; Tannenbaum, Steven R.
2007-06-01
Development of 14C AMS has long pursued the twin goals of maximizing both sensitivity and precision in the interest, among others, of optimizing radiocarbon dating. Application of AMS to biomedical research is less constrained with respect to sensitivity requirements, but more demanding of high throughput. This work presents some technical and conceptual developments in sample processing and analytical instrumentation designed to streamline the process of extracting quantitative data from the various types of samples encountered in analytical biochemistry.
Nanosurveyor: a framework for real-time data processing
Daurer, Benedikt J.; Krishnan, Hari; Perciano, Talita; ...
2017-01-31
Background: The ever-improving brightness of accelerator-based sources is enabling novel observations and discoveries with faster frame rates, larger fields of view, higher resolution, and higher dimensionality. Results: Here we present an integrated software/algorithmic framework designed to capitalize on high-throughput experiments through efficient kernels and load-balanced workflows that are scalable by design. We describe the streamlined processing pipeline of ptychography data analysis. Conclusions: The pipeline provides throughput, compression, and resolution as well as rapid feedback to the microscope operators.
Kastner, Elisabeth; Kaur, Randip; Lowry, Deborah; Moghaddam, Behfar; Wilkinson, Alexander; Perrie, Yvonne
2014-12-30
Microfluidics has recently emerged as a new method of manufacturing liposomes, which allows for reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters in a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiment and multivariate data analysis were used for increased process understanding and development of predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes and correlated strongly with vesicle size, demonstrating the ability to control liposome size in-process; the resulting liposome size correlated with the FRR in the microfluidics process, with liposomes of 50 nm being reproducibly manufactured. Furthermore, we demonstrate the potential of high-throughput manufacturing of liposomes using microfluidics with a four-fold increase in the volumetric flow rate, maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and was modelled using predictive modeling. Mathematical modelling identified FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust and high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design-space for controlled particle characteristics. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
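The correlative modelling step can be pictured with a simple least-squares fit of vesicle size against FRR and TFR. The data points below are synthetic stand-ins; the published work used design-of-experiment and multivariate tools rather than this bare regression.

```python
# Hedged sketch of the correlative modelling step: an ordinary least-squares fit
# of vesicle size against flow rate ratio (FRR) and total flow rate (TFR).
# The data points are synthetic; the published work used DoE/multivariate tools.
import numpy as np

# columns: FRR (aqueous:solvent), TFR (mL/min), measured size (nm)
data = np.array([
    [1, 1, 190], [1, 2, 185], [3, 1, 110], [3, 2, 105],
    [5, 1,  72], [5, 2,  70], [7, 1,  55], [7, 2,  52],
], dtype=float)

X = np.column_stack([np.ones(len(data)), data[:, 0], data[:, 1]])  # intercept, FRR, TFR
y = data[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("size ~ {:.1f} + {:.1f}*FRR + {:.1f}*TFR".format(*coef))
# A large (negative) FRR coefficient relative to the TFR coefficient reflects the
# reported finding that FRR, not TFR, dominates liposome size.
```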
Automated solar cell assembly team process research
NASA Astrophysics Data System (ADS)
Nowlan, M. J.; Hogan, S. J.; Darkazalli, G.; Breen, W. F.; Murach, J. M.; Sutherland, S. F.; Patterson, J. S.
1994-06-01
This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas Instruments, agreed to evaluate the processes developed under this program.
Hu, E; Liao, T. W.; Tiersch, T. R.
2013-01-01
Emerging commercial-level technology for aquatic sperm cryopreservation has not been modeled by computer simulation. Commercially available software (ARENA, Rockwell Automation, Inc., Milwaukee, WI) was applied to simulate high-throughput sperm cryopreservation of blue catfish (Ictalurus furcatus) based on existing processing capabilities. The goal was to develop a simulation model suitable for production planning and decision making. The objectives were to: 1) predict the maximum output for an 8-hr workday; 2) analyze the bottlenecks within the process; and 3) estimate operational costs when run at daily maximum output. High-throughput cryopreservation was divided into six major steps modeled with time, resources and logic structures. The modeled production processed 18 fish and produced 1164 ± 33 (mean ± SD) 0.5-ml straws containing one billion cryopreserved sperm. Two such production lines could support all hybrid catfish production in the US and 15 such lines could support the entire channel catfish industry if it were to adopt artificial spawning techniques. Evaluations were made to improve efficiency, such as increasing scale, optimizing resources, and eliminating underutilized equipment. This model can serve as a template for other aquatic species and assist decision making in industrial application of aquatic germplasm in aquaculture, stock enhancement, conservation, and biomedical model fishes. PMID:25580079
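The same kind of discrete-event reasoning can be sketched without ARENA. The toy model below uses the SimPy library as a stand-in, with invented processing times, batch sizes, and freezer capacity; it only illustrates how steps, shared resources, and a simulated workday fit together.

```python
# Hedged stand-in for the ARENA model: a minimal discrete-event simulation of one
# bottleneck step (straw filling/freezing) using SimPy. Processing times, batch
# sizes, and capacities are illustrative, not the published line parameters.
import random
import simpy

STRAWS_PER_FISH = 65
frozen_straws = 0

def process_fish(env, fish_id, freezer):
    global frozen_straws
    yield env.timeout(random.uniform(5, 10))          # collection + dilution (min)
    with freezer.request() as req:                    # freezer is a shared resource
        yield req
        yield env.timeout(random.uniform(20, 30))     # freeze one fish's straws
        frozen_straws += STRAWS_PER_FISH

random.seed(1)
env = simpy.Environment()
freezer = simpy.Resource(env, capacity=2)
for fish_id in range(18):
    env.process(process_fish(env, fish_id, freezer))
env.run(until=480)                                    # one 8-hour workday
print(f"Straws frozen in 8 h: {frozen_straws}")
```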
A high-throughput microRNA expression profiling system.
Guo, Yanwen; Mastriano, Stephen; Lu, Jun
2014-01-01
As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low-cost, high-throughput, high detection specificity, and flexibility to profile both small and large numbers of samples, this protocol can be adapted in a wide range of laboratory settings.
DnaSAM: Software to perform neutrality testing for large datasets with complex null models.
Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B
2010-05-01
Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model that are stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
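The core computation can be illustrated with a hedged sketch: estimate nucleotide diversity and Watterson's theta from an alignment and compare the observed value against null values that, in the real tool, would come from ms coalescent simulations. The alignment and the "simulated" null values below are toy placeholders.

```python
# Hedged sketch of a DnaSAM-style computation: nucleotide diversity (pi) and
# Watterson's theta from an alignment, plus an empirical p-value for pi against
# null values that, in the real tool, come from ms coalescent simulations.
# The alignment and the "simulated" null values below are toy placeholders.
from itertools import combinations

def pairwise_pi(seqs):
    """Average number of pairwise differences per site."""
    n, L = len(seqs), len(seqs[0])
    diffs = sum(sum(a != b for a, b in zip(s1, s2))
                for s1, s2 in combinations(seqs, 2))
    return diffs / (n * (n - 1) / 2) / L

def watterson_theta(seqs):
    """Segregating sites scaled by the harmonic number a1, per site."""
    n, L = len(seqs), len(seqs[0])
    S = sum(len(set(col)) > 1 for col in zip(*seqs))
    a1 = sum(1.0 / i for i in range(1, n))
    return S / a1 / L

alignment = ["ACGTACGTAA", "ACGTACGTAT", "ACGAACGTAT", "ACGTACGCAT"]
pi_obs = pairwise_pi(alignment)
null_pi = [0.05, 0.08, 0.11, 0.02, 0.15, 0.07, 0.09, 0.04, 0.12, 0.06]  # placeholder
p_value = sum(x >= pi_obs for x in null_pi) / len(null_pi)
print(f"pi={pi_obs:.3f}  thetaW={watterson_theta(alignment):.3f}  "
      f"P(pi_null >= pi_obs)={p_value:.2f}")
```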
Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming
2015-01-01
High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system viz. the Advanced Microscale Bioreactor (ambr15(TM) ), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.
Kokel, David; Rennekamp, Andrew J; Shah, Asmi H; Liebel, Urban; Peterson, Randall T
2012-08-01
For decades, studying the behavioral effects of individual drugs and genetic mutations has been at the heart of efforts to understand and treat nervous system disorders. High-throughput technologies adapted from other disciplines (e.g., high-throughput chemical screening, genomics) are changing the scale of data acquisition in behavioral neuroscience. Massive behavioral datasets are beginning to emerge, particularly from zebrafish labs, where behavioral assays can be performed rapidly and reproducibly in 96-well, high-throughput format. Mining these datasets and making comparisons across different assays are major challenges for the field. Here, we review behavioral barcoding, a process by which complex behavioral assays are reduced to a string of numeric features, facilitating analysis and comparison within and across datasets. Copyright © 2012 Elsevier Ltd. All rights reserved.
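The barcoding idea itself reduces a complex behavioral trace to a short vector of numeric features on a common scale. The sketch below is a hedged illustration with synthetic activity traces and invented feature definitions, not the specific features used in the zebrafish literature.

```python
# Hedged sketch of behavioral barcoding: reduce each well's activity trace to a
# short vector of features, then z-score each feature across wells so different
# assays/compounds can be compared on a common scale. Traces here are synthetic.
import numpy as np

def barcode(trace, stim_frame):
    """Numeric feature string for one well: baseline, response, latency, variance."""
    baseline = trace[:stim_frame].mean()
    threshold = baseline + 2 * trace[:stim_frame].std()
    response = trace[stim_frame:].max() - baseline
    latency = int(np.argmax(trace[stim_frame:] > threshold))   # first frame above threshold
    return np.array([baseline, response, latency, trace.var()])

rng = np.random.default_rng(3)
traces = rng.poisson(2.0, size=(96, 300)).astype(float)   # 96 wells x 300 frames
traces[:48, 150:170] += 6.0                                # first 48 wells "respond"

features = np.vstack([barcode(t, stim_frame=150) for t in traces])
zscores = (features - features.mean(axis=0)) / features.std(axis=0)
print("Barcode of well 0:", np.round(zscores[0], 2))
print("Barcode of well 95:", np.round(zscores[95], 2))
```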
Shahini, Mehdi; Yeow, John T W
2011-08-12
We report on the enhancement of electrical cell lysis using carbon nanotubes (CNTs). Electrical cell lysis systems are widely utilized in microchips as they are well suited to integration into lab-on-a-chip devices. However, cell lysis based on electrical mechanisms has high voltage requirements. Here, we demonstrate that by incorporating CNTs into microfluidic electrolysis systems, the required voltage for lysis is reduced by half and the lysis throughput at low voltages is improved by ten times, compared to non-CNT microchips. In our experiment, E. coli cells are lysed while passing through an electric field in a microchannel. Based on the lightning rod effect, the electric field strengthened at the tip of the CNTs enhances cell lysis at lower voltage and higher throughput. This approach enables easy integration of cell lysis with other on-chip high-throughput sample-preparation processes.
Xu, Xianglong; Liu, Guohua; Wang, Yuanyuan; Zhang, Yuankai; Wang, Hao; Qi, Lu; Wang, Hongchen
2018-02-01
A sequencing batch reactor (SBR)-anaerobic ammonium oxidation (anammox) system was started up with paddy soil as the inoculated sludge. The key microbial community structure in the system along with the enrichment time was investigated by using molecular biology methods (e.g., high-throughput 16S rRNA gene sequencing and quantitative PCR). Meanwhile, the influent and effluent water quality was continuously monitored during the whole start-up stage. The results showed that the microbial diversity initially decreased with operation time and increased afterwards, and the microbial niches in the system were redistributed. The anammox bacterial community structure in the SBR-anammox system shifted during the enrichment; the most dominant anammox bacteria were Candidatus Jettenia. The maximum biomass of anammox bacteria reached 1.68×10^9 copies/g dry sludge during the enrichment period, and the highest TN removal rate reached around 75%. Copyright © 2017. Published by Elsevier B.V.
2017-01-01
Clofazimine, a lipophilic (log P = 7.66) riminophenazine antibiotic approved by the US Food and Drug Administration (FDA) with a good safety record, was recently identified as a lead hit for cryptosporidiosis through a high-throughput phenotypic screen. Cryptosporidiosis requires fast-acting treatment as it leads to severe symptoms which, if untreated, result in morbidity for infants and small children. Consequently, a fast-releasing oral formulation of clofazimine in a water-dispersible form for pediatric administration is highly desirable. In this work, clofazimine nanoparticles were prepared with three surface stabilizers, hypromellose acetate succinate (HPMCAS), lecithin, and zein, using the flash nanoprecipitation (FNP) process. Drug encapsulation efficiencies of over 92% were achieved. Lyophilization and spray-drying were applied and optimized to produce redispersible nanoparticle powders. The release kinetics of these clofazimine nanoparticle powders in biorelevant media were measured and compared with those of crystalline clofazimine and the currently marketed formulation Lamprene. Remarkably improved dissolution rates and clofazimine supersaturation levels up to 90 times equilibrium solubility were observed with all clofazimine nanoparticles tested. Differential scanning calorimetry indicated a reduction of crystallinity of clofazimine in nanoparticles. These results strongly suggest that the new clofazimine nanoparticles prepared with affordable materials in this low-cost nanoparticle formulation process can be used as viable cryptosporidiosis therapeutics. PMID:28929769
Song, Jiao; Liu, Xuejun; Wu, Jiejun; Meehan, Michael J; Blevitt, Jonathan M; Dorrestein, Pieter C; Milla, Marcos E
2013-02-15
We have developed an ultra-performance liquid chromatography-multiple reaction monitoring/mass spectrometry (UPLC-MRM/MS)-based, high-content, high-throughput platform that enables simultaneous profiling of multiple lipids produced ex vivo in human whole blood (HWB) on treatment with calcium ionophore and its modulation with pharmacological agents. HWB samples were processed in a 96-well plate format compatible with high-throughput sample processing instrumentation. We employed a scheduled MRM (sMRM) method, with a triple-quadrupole mass spectrometer coupled to a UPLC system, to measure absolute amounts of 122 distinct eicosanoids using deuterated internal standards. In a 6.5-min run, we resolved and detected with high sensitivity (lower limit of quantification in the range of 0.4-460 pg) all targeted analytes from a very small HWB sample (2.5 μl). Approximately 90% of the analytes exhibited a dynamic range exceeding 1000. We also developed a tailored software package that dramatically sped up the overall data quantification and analysis process with superior consistency and accuracy. Matrix effects from HWB and precision of the calibration curve were evaluated using this newly developed automation tool. This platform was successfully applied to the global quantification of changes on all 122 eicosanoids in HWB samples from healthy donors in response to calcium ionophore stimulation. Copyright © 2012 Elsevier Inc. All rights reserved.
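The quantification step in such stable-isotope-dilution assays amounts to a calibration curve of analyte-to-internal-standard response ratio versus concentration, then back-calculation for unknowns. The sketch below is a minimal, hedged illustration with invented numbers (real assays often use weighted fits).

```python
# Hedged sketch of stable-isotope-dilution quantification used in MRM assays:
# fit a calibration line of (analyte peak area / deuterated IS peak area) versus
# spiked concentration, then back-calculate unknowns. Numbers are illustrative.
import numpy as np

# Calibration standards: concentration (pg on column) and area ratio analyte/IS
conc = np.array([0.5, 1, 5, 10, 50, 100, 500])
ratio = np.array([0.011, 0.021, 0.10, 0.21, 1.02, 2.05, 10.3])

slope, intercept = np.polyfit(conc, ratio, 1)        # weighted fits are common too

def back_calculate(area_analyte, area_is):
    """Concentration of an unknown from its analyte/IS peak-area ratio."""
    return (area_analyte / area_is - intercept) / slope

print(f"slope={slope:.4f}, intercept={intercept:.4f}")
print("Unknown sample:", round(back_calculate(35600, 14800), 1), "pg")
```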
Heo, Young Jin; Lee, Donghyeon; Kang, Junsu; Lee, Keondo; Chung, Wan Kyun
2017-09-14
Imaging flow cytometry (IFC) is an emerging technology that acquires single-cell images at high throughput for analysis of a cell population. The rich information that comes from the high sensitivity and spatial resolution of a single-cell microscopic image is beneficial for single-cell analysis in various biological applications. In this paper, we present a fast image-processing pipeline (R-MOD: Real-time Moving Object Detector) based on deep learning for high-throughput microscopy-based label-free IFC in a microfluidic chip. The R-MOD pipeline acquires all single-cell images of cells in flow and identifies the acquired images in real time with minimal hardware, consisting of a microscope and a high-speed camera. Experiments show that R-MOD is fast and accurate (500 fps and 93.3% mAP) and is expected to serve as a powerful tool for biomedical and clinical applications.
Xi-cam: a versatile interface for data visualization and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
Xi-cam: a versatile interface for data visualization and analysis
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...
2018-05-31
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
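The graph-based workflow idea can be pictured with a tiny executor that runs node functions in topological order. This is a hedged sketch of the general pattern, not Xi-cam's actual plugin API; node names and the single-input simplification are assumptions (Python 3.9+ for graphlib).

```python
# Minimal sketch of a graph-based processing workflow in the spirit described
# above: nodes are functions, edges define data flow, and the graph is executed
# in topological order. This is not Xi-cam's actual plugin API.
from graphlib import TopologicalSorter

def load(_):        return [3.0, 1.0, 4.0, 1.0, 5.0]
def normalize(x):   m = max(x); return [v / m for v in x]
def integrate(x):   return sum(x)

nodes = {"load": load, "normalize": normalize, "integrate": integrate}
edges = {"normalize": {"load"}, "integrate": {"normalize"}, "load": set()}

def run_workflow(nodes, edges):
    results = {}
    for name in TopologicalSorter(edges).static_order():
        upstream = next(iter(edges[name]), None)          # single-input nodes only
        results[name] = nodes[name](results.get(upstream))
    return results

print(run_workflow(nodes, edges))   # {'load': [...], 'normalize': [...], 'integrate': ...}
```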
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-08-01
In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study in detail high-throughput cultivation processes and especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was hereby investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' allowed cultures with different growth kinetics in a microtiter plate to be induced automatically at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only +/- 7%. The third method 'biomass-specific replication' made it possible to generate equal initial biomass concentrations in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line-monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization.
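The 'biomass-specific replication' step reduces to a simple volume calculation: scale each transfer so every main-culture well starts at the same biomass. The sketch below is a hedged illustration with invented target values, volumes, and pipetting limits, not the platform's actual control code.

```python
# Hedged sketch of the 'biomass-specific replication' calculation: given on-line
# biomass readings (e.g., scattered-light units) for each preculture well, compute
# the transfer volume that gives every main-culture well the same starting biomass.
# Target values, volumes, and pipetting limits are illustrative.
def inoculum_volumes(preculture_biomass, target_biomass, main_volume_ul,
                     v_min=2.0, v_max=100.0):
    """Transfer volume per well: V = target * V_main / preculture reading."""
    volumes = {}
    for well, reading in preculture_biomass.items():
        v = target_biomass * main_volume_ul / reading
        volumes[well] = round(min(max(v, v_min), v_max), 1)  # clamp to pipettable range
    return volumes

precultures = {"A1": 18.5, "A2": 31.0, "A3": 9.7, "A4": 24.2}   # arbitrary units
print(inoculum_volumes(precultures, target_biomass=0.5, main_volume_ul=800))
# Slow-growing wells (low reading) get proportionally larger inocula.
```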
Beeman, Katrin; Baumgärtner, Jens; Laubenheimer, Manuel; Hergesell, Karlheinz; Hoffmann, Martin; Pehl, Ulrich; Fischer, Frank; Pieck, Jan-Carsten
2017-12-01
Mass spectrometry (MS) is known for its label-free detection of substrates and products from a variety of enzyme reactions. Recent hardware improvements have increased interest in the use of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS for high-throughput drug discovery. Despite interest in this technology, several challenges remain and must be overcome before MALDI-MS can be integrated as an automated "in-line reader" for high-throughput drug discovery. Two such hurdles include in situ sample processing and deposition, as well as integration of MALDI-MS for enzymatic screening assays that usually contain high levels of MS-incompatible components. Here we adapt our c-MET kinase assay to optimize for MALDI-MS compatibility and test its feasibility for compound screening. The pros and cons of the Echo (Labcyte) as a transfer system for in situ MALDI-MS sample preparation are discussed. We demonstrate that this method generates robust data in a 1536-grid format. We use the MALDI-MS to directly measure the ratio of c-MET substrate and phosphorylated product to acquire IC50 curves and demonstrate that the pharmacology is unaffected. The resulting IC50 values correlate well between the common label-based capillary electrophoresis and the label-free MALDI-MS detection method. We predict that label-free MALDI-MS-based high-throughput screening will become increasingly important and more widely used for drug discovery.
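The readout described above, percent conversion from the substrate and phospho-product intensities followed by an IC50 fit, can be sketched as below. The intensities are synthetic and the four-parameter logistic fit is a generic choice, not necessarily the exact model used in the published analysis.

```python
# Hedged sketch of the readout described above: percent conversion from the
# phospho-product / (substrate + product) intensity ratio, followed by a
# four-parameter logistic fit to estimate IC50. Intensities are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def conversion(product_intensity, substrate_intensity):
    return 100.0 * product_intensity / (product_intensity + substrate_intensity)

def four_pl(x, bottom, top, ic50, hill):
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

conc_nM = np.array([1, 3, 10, 30, 100, 300, 1000, 3000], dtype=float)
product = np.array([980, 950, 870, 640, 350, 150, 60, 30], dtype=float)
substrate = np.array([120, 160, 260, 480, 760, 930, 1020, 1080], dtype=float)

y = conversion(product, substrate)
popt, _ = curve_fit(four_pl, conc_nM, y, p0=[0, 100, 50, 1],
                    bounds=([-5, 50, 0.1, 0.2], [20, 120, 10000, 5]))
print(f"Fitted IC50 ~ {popt[2]:.1f} nM (Hill slope {popt[3]:.2f})")
```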
Ion channel drug discovery and research: the automated Nano-Patch-Clamp technology.
Brueggemann, A; George, M; Klau, M; Beckler, M; Steindl, J; Behrends, J C; Fertig, N
2004-01-01
Unlike the genomics revolution, which was largely enabled by a single technological advance (high throughput sequencing), rapid advancement in proteomics will require a broader effort to increase the throughput of a number of key tools for functional analysis of different types of proteins. In the case of ion channels, a class of (membrane) proteins of great physiological importance and potential as drug targets, the lack of adequate assay technologies is felt particularly strongly. The available, indirect, high throughput screening methods for ion channels clearly generate insufficient information. The best technology to study ion channel function and screen for compound interaction is the patch clamp technique, but patch clamping suffers from low throughput, which is not acceptable for drug screening. A first step towards a solution is presented here. The nano patch clamp technology, which is based on a planar, microstructured glass chip, enables automatic whole cell patch clamp measurements. The Port-a-Patch is an automated electrophysiology workstation, which uses planar patch clamp chips. This approach enables high quality and high content ion channel and compound evaluation on a one-cell-at-a-time basis. The presented automation of the patch process and its scalability to an array format are the prerequisites for any higher throughput electrophysiology instruments.
NASA Astrophysics Data System (ADS)
Hayasaki, Yoshio
2017-02-01
Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points will be required for fabricating actual structures at millimeter scale, and therefore the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method has advantages such as high throughput, high light use efficiency, and variable, instantaneous, and 3D patterning. Furthermore, the use of an SLM provides the ability to correct unknown imperfections of the optical system and inhomogeneity in a sample through in-system optimization of the CGH. In addition, the CGH can adaptively compensate for dynamic, unpredictable mechanical movements, air and liquid disturbances, and shape variation and deformation of the target sample, as well as provide adaptive wavefront control for environmental changes. Therefore, it is a powerful tool for processing biological cells and tissues, because they have free-form, variable, and deformable structures. In this paper, we present the principle and experimental setup of holographic femtosecond laser processing, and an effective way of processing biological samples. We demonstrate femtosecond laser processing of biological materials and the processing properties.
High throughput computing: a solution for scientific analysis
O'Donnell, M.
2011-01-01
handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).
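The pattern described above, farming independent jobs out to idle CPUs and automatically resubmitting failures, can be sketched on a single machine. This hedged analogue uses Python's standard process pool rather than a real HTC scheduler spanning a LAN; the job function, retry limit, and job count are all illustrative.

```python
# Hedged, single-machine analogue of the HTC pattern described above: farm many
# independent analysis jobs out to idle CPUs, collect results as they finish, and
# retry failures instead of resubmitting by hand. A real HTC system (e.g., a
# cluster scheduler) does this across a whole LAN; this sketch uses local cores.
import random
from concurrent.futures import ProcessPoolExecutor, as_completed

def analysis_job(job_id):
    """Placeholder for one independent model run; occasionally 'fails'."""
    if random.random() < 0.1:
        raise RuntimeError(f"job {job_id} interrupted")
    return job_id, sum(i * i for i in range(50_000))

def run_all(job_ids, max_retries=3):
    results, attempts = {}, {j: 0 for j in job_ids}
    pending = set(job_ids)
    with ProcessPoolExecutor() as pool:
        while pending:
            futures = {pool.submit(analysis_job, j): j for j in pending}
            pending = set()
            for fut in as_completed(futures):
                j = futures[fut]
                try:
                    results[j] = fut.result()[1]
                except Exception:
                    attempts[j] += 1
                    if attempts[j] < max_retries:
                        pending.add(j)          # automatic resubmission
    return results

if __name__ == "__main__":
    print(f"Completed {len(run_all(range(40)))} of 40 jobs")
```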
Forecasting Ecological Genomics: High-Tech Animal Instrumentation Meets High-Throughput Sequencing
Shafer, Aaron B. A.; Northrup, Joseph M.; Wikelski, Martin; Wittemyer, George; Wolf, Jochen B. W.
2016-01-01
Recent advancements in animal tracking technology and high-throughput sequencing are rapidly changing the questions and scope of research in the biological sciences. The integration of genomic data with high-tech animal instrumentation comes as a natural progression of traditional work in ecological genetics, and we provide a framework for linking the separate data streams from these technologies. Such a merger will elucidate the genetic basis of adaptive behaviors like migration and hibernation and advance our understanding of fundamental ecological and evolutionary processes such as pathogen transmission, population responses to environmental change, and communication in natural populations. PMID:26745372
A Family of LIC Vectors for High-Throughput Cloning and Purification of Proteins
Eschenfeldt, William H.; Stols, Lucy; Millard, Cynthia Sanville; Joachimiak, Andrzej; Donnelly, Mark I.
2009-01-01
Fifteen related ligation-independent cloning vectors were constructed for high-throughput cloning and purification of proteins. The vectors encode a TEV protease site for removal of tags that facilitate protein purification (his-tag) or improve solubility (MBP, GST). Specialized vectors allow coexpression and copurification of interacting proteins, or in vivo removal of MBP by TVMV protease to improve screening and purification. All target genes and vectors are processed by the same protocols, which we describe here. PMID:18988021
High-Throughput Printing Process for Flexible Electronics
NASA Astrophysics Data System (ADS)
Hyun, Woo Jin
Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.
Wu, Zhongchen; Chen, Huanwen; Wang, Weiling; Jia, Bin; Yang, Tianlin; Zhao, Zhanfeng; Ding, Jianhua; Xiao, Xuxian
2009-10-28
Without any sample pretreatment, mass spectral fingerprints of 486 dried sea cucumber slices were rapidly recorded in the mass range of m/z 50-800 by using surface desorption atmospheric pressure chemical ionization mass spectrometry (DAPCI-MS). A set of 162 individual sea cucumbers (Apostichopus japonicus Selenka) grown in 3 different geographical regions (Weihai: 59 individuals, 177 slices; Yantai: 53 individuals, 159 slices; Dalian: 50 individuals, 150 slices) in the northern China sea were successfully differentiated according to their habitats both by Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) of the mass spectral raw data, demonstrating that DAPCI-MS is a practically convenient tool for high-throughput differentiation of sea cucumber products. It was found that the difference between the body wall tissue and the epidermal tissue is heavily dependent on the habitats. The experimental data also show that the roughness of the sample surface contributes to the variance of the signal levels to a certain extent, but this variance does not prevent the differentiation of the dried sea cucumber samples.
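The chemometric step can be illustrated with a hedged sketch of PCA on a samples-by-m/z-bins intensity matrix; SIMCA class modelling is not reproduced here, and the spectra below are synthetic stand-ins rather than DAPCI-MS data.

```python
# Hedged sketch of the chemometric step: PCA on a (samples x m/z bins) intensity
# matrix, then a look at how habitat groups separate in score space. SIMCA class
# modelling is not reproduced here; the spectra are synthetic stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_per_site, n_bins = 50, 750
sites = ["Weihai", "Yantai", "Dalian"]

# Give each habitat a slightly different mean fingerprint plus noise.
X = np.vstack([rng.normal(loc=i * 0.3, scale=1.0, size=(n_per_site, n_bins))
               for i in range(len(sites))])
labels = np.repeat(sites, n_per_site)

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for site in sites:
    centroid = scores[labels == site].mean(axis=0)
    print(f"{site}: PC1/PC2 centroid = {np.round(centroid, 2)}")
```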
Development of a Bile Acid-Based Newborn Screen for Niemann-Pick C Disease
Jiang, Xuntian; Sidhu, Rohini; Mydock, Laurel; Hsu, Fong-Fu; Covey, Douglas F.; Scherrer, David E.; Earley, Brian; Gale, Sarah E.; Farhat, Nicole Y.; Porter, Forbes D.; Dietzen, Dennis J.; Orsini, Joseph J.; Berry-Kravis, Elizabeth; Zhang, Xiaokui; Reunert, Janice; Marquardt, Thorsten; Runz, Heiko; Giugliani, Roberto; Schaffer, Jean E.; Ory, Daniel S.
2017-01-01
Niemann-Pick disease type C (NPC) is a fatal, neurodegenerative, cholesterol storage disorder. With new therapeutics in clinical trials, it is imperative to improve diagnostics and facilitate early intervention. We used metabolomic profiling to identify potential markers and discovered three unknown bile acids that were increased in plasma from NPC but not control subjects. The bile acids most elevated in the NPC subjects were identified as 3β,5α,6β-trihydroxycholanic acid and its glycine conjugate, both of which were shown to be metabolites of cholestane-3β,5α,6β-triol, an oxysterol elevated in NPC. A high-throughput, mass spectrometry-based method was developed and validated to measure the glycine-conjugated bile acid in dried blood spots. Analysis of dried blood spots from 4992 controls, 134 NPC carriers, and 44 NPC subjects provided 100% sensitivity and specificity in the study samples. Quantification of the bile acid in dried blood spots, therefore, provides the basis for a newborn screen for NPC that is ready for piloting in newborn screening programs. PMID:27147587
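The reported 100% sensitivity and specificity follow directly from how many of the 44 NPC subjects and 4992 controls the bile-acid cutoff classifies correctly. The Python sketch below shows that bookkeeping; the function and the counts are illustrative (chosen to reproduce the quoted figures), not the authors' analysis code.

```python
# Minimal sketch (not the authors' pipeline): sensitivity and specificity of a
# dried-blood-spot screen computed from classification counts.

def screen_performance(true_pos, false_neg, true_neg, false_pos):
    """Return (sensitivity, specificity) as fractions."""
    sensitivity = true_pos / (true_pos + false_neg)   # fraction of affected subjects flagged
    specificity = true_neg / (true_neg + false_pos)   # fraction of controls correctly passed
    return sensitivity, specificity

# Hypothetical counts chosen to match the study outcome: all 44 NPC subjects above the
# cutoff and all 4992 controls below it.
sens, spec = screen_performance(true_pos=44, false_neg=0, true_neg=4992, false_pos=0)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```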
Grant, Yitzchak; Matejtschuk, Paul; Bird, Christopher; Wadhwa, Meenu; Dalby, Paul A
2012-04-01
The lyophilization of proteins in microplates, to assess and optimise formulations rapidly, has been applied for the first time to a therapeutic protein, in particular one that requires a cell-based biological assay, in order to demonstrate the broader usefulness of the approach. Factorial design of experiment methods were combined with lyophilization in microplates to identify optimum formulations that stabilised granulocyte colony-stimulating factor during freeze drying. An initial screen rapidly identified key excipients and potential interactions, and was followed by an optimisation experiment using a central composite face design. Human serum albumin and Tween 20 had significant effects on maintaining protein stability. As in previous work, the optimum formulation was then freeze-dried in stoppered vials to verify that the microscale data are relevant at pilot scale. To validate the approach further, the selected formulation was also assessed for solid-state shelf-life through accelerated stability studies. This approach allows a high-throughput assessment of excipient options early in product development, while also reducing costs in terms of time and quantity of materials required.
Ma, Nyuk Ling; Teh, Kit Yinn; Lam, Su Shiung; Kaben, Anne Marie; Cha, Thye San
2015-08-01
This study demonstrates the use of NMR techniques coupled with chemometric analysis as a high-throughput data mining method to identify and examine the efficiency of different disruption techniques tested on microalgae (Chlorella variabilis, Scenedesmus regularis and Ankistrodesmus gracilis). The yield and chemical diversity obtained from the disruptions, together with the effects of oven-drying and freeze-drying pre-treatments applied prior to disruption, were discussed. HCl extraction showed the highest recovery of oil compounds from the disrupted microalgae (up to 90%). In contrast, NMR analysis showed the highest intensity of bioactive metabolites for homogenized extracts pre-treated with freeze-drying, indicating that homogenizing is a more favorable approach to recover bioactive substances from the disrupted microalgae. The results show the potential of NMR as a useful metabolic fingerprinting tool for assessing compound diversity in complex microalgae extracts. Copyright © 2015 Elsevier Ltd. All rights reserved.
Collard, Marie; Teychené, Benoit; Lemée, Laurent
2017-12-01
The drying process aims to minimise the volume of wastewater sludge (WWS) before disposal; however, it can affect sludge characteristics. Owing to their high content of organic matter (OM) and lipids, sludges are mainly valorised by land farming but can also be considered as a feedstock for biodiesel production. As sludge composition is a major parameter in the choice of disposal techniques, the objective of this study was to determine the influence of the drying process. To reach this goal, three sludges obtained from solar, reed-bed and thermal drying processes were investigated at the global and molecular scales. Before the drying step the sludges presented similar physico-chemical characteristics (OM content, elemental analysis, pH, infrared spectra) and lipid contents. A strong influence of the drying process on lipid and humic-like substance contents was observed through OM fractionation. Thermochemolysis-GC-MS of raw sludge and lipids revealed a similar molecular content, mainly constituted of steroids and fatty acids. Molecular changes were noticeable for thermal drying through differences in the branched-to-linear fatty acid ratio. Finally, thermal drying induced a weakening of the OM whereas solar drying led to a complexification. These findings show that gentle drying processes such as solar or reed-bed drying are preferable for amendment production, whereas the thermal process leads to pellets with a high lipid content which could be considered for fuel production. Copyright © 2016 Elsevier Ltd. All rights reserved.
Moret, Sabrina; Scolaro, Marianna; Barp, Laura; Purcaro, Giorgia; Conte, Lanfranco S
2016-04-01
A high throughput, high-sensitivity procedure, involving simultaneous microwave-assisted extraction (MAS) and unsaponifiable extraction, followed by on-line liquid chromatography (LC)-gas chromatography (GC), has been optimised for rapid and efficient extraction and analytical determination of mineral oil saturated hydrocarbons (MOSH) and mineral oil aromatic hydrocarbons (MOAH) in cereal-based products of different composition. MAS has the advantage of eliminating fat before LC-GC analysis, allowing an increase in the amount of sample extract injected, and hence in sensitivity. The proposed method gave practically quantitative recoveries and good repeatability. Among the different cereal-based products analysed (dry semolina and egg pasta, bread, biscuits, and cakes), egg pasta packed in direct contact with recycled paperboard had on average the highest total MOSH level (15.9 mg kg(-1)), followed by cakes (10.4 mg kg(-1)) and bread (7.5 mg kg(-1)). About 50% of the pasta and bread samples and 20% of the biscuits and cake samples had detectable MOAH amounts. The highest concentrations were found in an egg pasta in direct contact with recycled paperboard (3.6 mg kg(-1)) and in a milk bread (3.6 mg kg(-1)). Copyright © 2015 Elsevier Ltd. All rights reserved.
A simple and sensitive high-throughput GFP screening in woody and herbaceous plants.
Hily, Jean-Michel; Liu, Zongrang
2009-03-01
Green fluorescent protein (GFP) has been used widely as a powerful bioluminescent reporter, but its visualization by existing methods in tissues or whole plants and its utilization for high-throughput screening remains challenging in many species. Here, we report a fluorescence image analyzer-based method for GFP detection and its utility for high-throughput screening of transformed plants. Of three detection methods tested, the Typhoon fluorescence scanner was able to detect GFP fluorescence in all Arabidopsis thaliana tissues and apple leaves, while regular fluorescence microscopy detected it only in Arabidopsis flowers and siliques but barely in the leaves of either Arabidopsis or apple. The hand-held UV illumination method failed in all tissues of both species. Additionally, the Typhoon imager was able to detect GFP fluorescence in both green and non-green tissues of Arabidopsis seedlings as well as in imbibed seeds, qualifying it as a high-throughput screening tool, which was further demonstrated by screening the seedlings of primary transformed T(0) seeds. Of the 30,000 germinating Arabidopsis seedlings screened, at least 69 GFP-positive lines were identified, accounting for an approximately 0.23% transformation efficiency. About 14,000 seedlings grown in 16 Petri plates could be screened within an hour, making the screening process significantly more efficient and robust than any other existing high-throughput screening method for transgenic plants.
Atmospheric Pressure Plasma Jet as a Dry Alternative to Inkjet Printing in Flexible Electronics
NASA Technical Reports Server (NTRS)
Gandhiraman, Ram Prasad; Lopez, Arlene; Koehne, Jessica; Meyyappan, M.
2016-01-01
We have developed an atmospheric pressure plasma jet printing system that works at room temperature to 50 deg C, unlike conventional aerosol-assisted techniques, which require a high temperature sintering step to obtain the desired thin films. Multiple jets can be configured to increase throughput or to deposit multiple materials, and the jet(s) can be moved across large areas using an x-y stage. The plasma jet has been used to deposit carbon nanotubes, graphene, silver nanowires, copper nanoparticles and other materials on substrates such as paper, cotton, plastic and thin metal foils.
2007-12-14
[Fragmentary record: only snippets survive, describing NCI extract plates containing dried residues of terrestrial plants, marine invertebrates, and fungi, with a table of plate numbers, sources, and extraction solvents (e.g. water, CH3OH-CH2Cl2) that could not be recovered.]
NASA Astrophysics Data System (ADS)
Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong
2018-01-01
Conventional methods for designing and preparing thin films based on wet processes remain challenging because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin film preparation based on chemical bath deposition (CBD). The method is well suited to preparing high-throughput combinatorial libraries of materials with low decomposition temperatures and high water or oxygen sensitivity at relatively high temperature. To validate this system, a Cu(In, Ga)Se (CIGS) thin film library doped with 0-19.04 at.% of antimony (Sb) was taken as an example to systematically evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can be used to identify the composition with the optimal grain orientation growth, microstructure and electrical properties by accurately monitoring the doping content and material composition. Based on the characterization results, a Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. In addition to the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin film material systems.
Solar thermal drum drying performance of prune and tomato pomaces
USDA-ARS?s Scientific Manuscript database
Fruit and vegetable pomaces are co-products of the food processing industry; they are underutilized in part because their high water activity (aw) renders them unstable. Drum drying is one method that can dry/stabilize pomaces, but current drum drying methods utilize conventional, high-environmental...
Zhang, Dong-Qing; He, Pin-Jing; Jin, Tai-Feng; Shao, Li-Ming
2008-12-01
To improve water content reduction in municipal solid waste with high water content, two operations were evaluated: supplementing a hydrolytic stage prior to aerobic degradation and inoculating with bio-drying products. A 'bio-drying index' was used to evaluate the bio-drying performance. For the aerobic processes, the inoculation accelerated organics degradation, enhanced the lignocellulose degradation rate by 10.4%, and lowered the water content by 7.0%. For the combined hydrolytic-aerobic processes, the inoculum addition had almost no positive effect on the bio-drying efficiency, but it enhanced the lignocellulose degradation rate by 9.6% and strengthened acidogenesis in the hydrolytic stage. Compared with the aerobic processes, the combined processes had a higher bio-drying index (4.20 for the non-inoculated and 3.67 for the inoculated trials). Moreover, the lowest final water content occurred in the combined process without inoculation (50.5%, decreased from an initial 72.0%).
High-throughput diagnosis of potato cyst nematodes in soil samples.
Reid, Alex; Evans, Fiona; Mulholland, Vincent; Cole, Yvonne; Pickup, Jon
2015-01-01
Potato cyst nematode (PCN) is a damaging soilborne pest of potatoes which can cause major crop losses. In 2010, a new European Union directive (2007/33/EC) on the control of PCN came into force. Under the new directive, seed potatoes can only be planted on land which has been found to be free from PCN infestation following an official soil test. A major consequence of the new directive was the introduction of a new harmonized soil sampling rate resulting in a threefold increase in the number of samples requiring testing. To manage this increase with the same staffing resources, we have replaced the traditional diagnostic methods. A system has been developed for the processing of soil samples, extraction of DNA from float material, and detection of PCN by high-throughput real-time PCR. Approximately 17,000 samples are analyzed each year using this method. This chapter describes the high-throughput processes for the production of float material from soil samples, DNA extraction from the entire float, and subsequent detection and identification of PCN within these samples.
A high-throughput semi-automated preparation for filtered synaptoneurosomes.
Murphy, Kathryn M; Balsor, Justin; Beshara, Simon; Siu, Caitlin; Pinto, Joshua G A
2014-09-30
Synaptoneurosomes have become an important tool for studying synaptic proteins. The filtered synaptoneurosomes preparation originally developed by Hollingsworth et al. (1985) is widely used and is an easy method to prepare synaptoneurosomes. The hand processing steps in that preparation, however, are labor intensive and have become a bottleneck for current proteomic studies using synaptoneurosomes. For this reason, we developed new steps for tissue homogenization and filtration that transform the preparation of synaptoneurosomes to a high-throughput, semi-automated process. We implemented a standardized protocol with easy to follow steps for homogenizing multiple samples simultaneously using a FastPrep tissue homogenizer (MP Biomedicals, LLC) and then filtering all of the samples in centrifugal filter units (EMD Millipore, Corp). The new steps dramatically reduce the time to prepare synaptoneurosomes from hours to minutes, increase sample recovery, and nearly double enrichment for synaptic proteins. These steps are also compatible with biosafety requirements for working with pathogen infected brain tissue. The new high-throughput semi-automated steps to prepare synaptoneurosomes are timely technical advances for studies of low abundance synaptic proteins in valuable tissue samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Life in the fast lane: high-throughput chemistry for lead generation and optimisation.
Hunter, D
2001-01-01
The pharmaceutical industry has come under increasing pressure due to regulatory restrictions on the marketing and pricing of drugs, competition, and the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.
Microarray Detection of Duplex and Triplex DNA Binders with DNA-Modified Gold Nanoparticles
Lytton-Jean, Abigail K. R.; Han, Min Su; Mirkin, Chad A.
2008-01-01
We have designed a chip-based assay, using microarray technology, for determining the relative binding affinities of duplex and triplex DNA binders. This assay combines the high discrimination capabilities afforded by DNA-modified Au nanoparticles with the high-throughput capabilities of DNA microarrays. The detection and screening of duplex DNA binders are important because these molecules, in many cases, are potential anticancer agents as well as toxins. Triplex DNA binders are also promising drug candidates. These molecules, in conjunction with triplex forming oligonucleotides, could potentially be used to achieve control of gene expression by interfering with transcription factors that bind to DNA. Therefore, the ability to screen for these molecules in a high-throughput fashion could dramatically improve the drug screening process. The assay reported here provides excellent discrimination between strong, intermediate, and weak duplex and triplex DNA binders in a high-throughput fashion. PMID:17614366
High-throughput GPU-based LDPC decoding
NASA Astrophysics Data System (ADS)
Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin
2010-08-01
Low-density parity-check (LDPC) codes are linear block codes known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, WI-FI and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.
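To make the decoding step concrete, here is a small NumPy sketch of the min-sum approximation to the iterative sum-product algorithm the abstract refers to. It runs on a toy parity-check matrix (a Hamming(7,4) matrix standing in for a real LDPC code) and hypothetical channel LLRs, and deliberately omits the GPU kernels and asynchronous transfers that give the paper its speedup.

```python
import numpy as np

def min_sum_decode(H, llr, max_iters=20):
    """Min-sum approximation of sum-product decoding on a binary parity-check matrix H.
    llr: channel log-likelihood ratios (positive means bit 0 is more likely)."""
    m, n = H.shape
    M = H * llr                                   # variable-to-check messages on each edge
    for _ in range(max_iters):
        E = np.zeros((m, n))                      # check-to-variable messages
        for i in range(m):
            idx = np.nonzero(H[i])[0]
            for j in idx:
                others = [k for k in idx if k != j]
                sign = np.prod(np.sign(M[i, others]))
                E[i, j] = sign * np.min(np.abs(M[i, others]))
        total = llr + E.sum(axis=0)               # posterior LLRs
        x_hat = (total < 0).astype(int)           # hard decision
        if not np.any(H @ x_hat % 2):             # all parity checks satisfied
            return x_hat, True
        for i in range(m):                        # variable-to-check update (extrinsic only)
            for j in np.nonzero(H[i])[0]:
                M[i, j] = llr[j] + E[:, j].sum() - E[i, j]
    return x_hat, False

# Toy example: Hamming(7,4) parity-check matrix and hypothetical channel LLRs.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
llr = np.array([2.5, -1.0, 3.0, 1.5, 2.0, 0.5, 1.0])
print(min_sum_decode(H, llr))
```

On a GPU, the per-check and per-variable updates in the inner loops are the parts that parallelize naturally, typically with one thread or warp per node.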
Study of data I/O performance on distributed disk system in mask data preparation
NASA Astrophysics Data System (ADS)
Ohara, Shuichiro; Odaira, Hiroyuki; Chikanaga, Tomoyuki; Hamaji, Masakazu; Yoshioka, Yasuharu
2010-09-01
Data volume is getting larger every day in Mask Data Preparation (MDP), while faster data handling is always required. An MDP flow typically introduces a Distributed Processing (DP) system to meet this demand, because using hundreds of CPUs is a reasonable solution. However, even if the number of CPUs is increased, throughput may saturate because hard disk I/O and network speeds can become bottlenecks. MDP therefore needs to invest heavily not only in hundreds of CPUs but also in storage and network devices to increase throughput. NCS introduces a new distributed processing system called "NDE". NDE is a distributed disk system that increases throughput without a large investment, because it is designed to use multiple conventional hard drives appropriately over the network. In this paper, NCS studies I/O performance with the OASIS® data format on NDE, which contributes to realizing high throughput.
Optimizing SIEM Throughput on the Cloud Using Parallelization.
Alam, Masoom; Ihsan, Asif; Khan, Muazzam A; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, Muhammad Khurram; Farooq, Sajid
2016-01-01
Processing large amounts of data in real time to identify security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of a security framework, OSTROM, built on the Esper complex event processing (CEP) engine under parallel and non-parallel computational frameworks. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of incoming events rather than by the queries being processed. The architecture in which 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory and CPU usage.
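The best-performing configuration described above (each instance receives a quarter of the events, every instance runs the full query set) amounts to data-parallel fan-out of the event stream. A minimal Python sketch of that partitioning idea is shown below; the event fields, the predicate "queries", and the worker count are hypothetical stand-ins for the Esper/OSTROM setup.

```python
# Minimal sketch (not the OSTROM/Esper implementation): fan events out to N workers,
# each of which evaluates the full query set on its share of the stream.
from multiprocessing import Pool

# Hypothetical "queries": simple predicates standing in for CEP rules.
QUERIES = {
    "failed_login": lambda e: e["type"] == "login" and not e["success"],
    "large_transfer": lambda e: e["type"] == "transfer" and e["bytes"] > 10_000_000,
}

def process_partition(events):
    """Run every query against one partition of the event stream."""
    alerts = []
    for event in events:
        for name, predicate in QUERIES.items():
            if predicate(event):
                alerts.append((name, event["id"]))
    return alerts

def partition(events, n_workers):
    """Round-robin split so each worker sees 1/n of the events."""
    return [events[i::n_workers] for i in range(n_workers)]

if __name__ == "__main__":
    # Hypothetical event stream.
    events = [{"id": i, "type": "login", "success": i % 7 != 0} for i in range(1000)] + \
             [{"id": 1000 + i, "type": "transfer", "bytes": i * 1_000_000} for i in range(50)]
    with Pool(4) as pool:
        results = pool.map(process_partition, partition(events, 4))
    alerts = [a for part in results for a in part]
    print(f"{len(alerts)} alerts from {len(events)} events")
```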
Characterisation of Aronia powders obtained by different drying processes.
Horszwald, Anna; Julien, Heritier; Andlauer, Wilfried
2013-12-01
Nowadays, the food industry is facing challenges connected with preserving the highest possible quality of fruit products obtained after processing. Attention has been drawn to Aronia fruits due to the numerous health-promoting properties of their products. However, processing of Aronia, like other berries, leads to difficulties that stem from the preparation process, as well as changes in the composition of bioactive compounds. Consequently, in this study, commercial Aronia juice was subjected to different drying techniques: spray drying, freeze drying and vacuum drying within the temperature range of 40-80 °C. All powders obtained had a high content of total polyphenols. Powders obtained by spray drying had the highest values, which corresponded to a high content of total flavonoids, total monomeric anthocyanins, cyanidin-3-glucoside and total proanthocyanidins. Analysis of the results revealed a correlation between selected bioactive compounds and their antioxidant capacity. In conclusion, drying technique has an impact on selected quality parameters, and different drying techniques cause changes in the content of the bioactives analysed. Spray drying can be recommended for preservation of bioactives in Aronia products. Powder quality depends mainly on the process applied and the parameters chosen. Therefore, Aronia powder production should be adapted to the requirements and design of the final product. Copyright © 2013 Elsevier Ltd. All rights reserved.
Toward reliable and repeatable automated STEM-EDS metrology with high throughput
NASA Astrophysics Data System (ADS)
Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii
2018-03-01
New materials and designs in complex 3D architectures in logic and memory devices have raised the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and show how electron exposure dose impacts EDS metrology reproducibility, either due to poor signal-to-noise ratio (SNR) at low dose or due to sample modification at high dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.
Large-scale Topographical Screen for Investigation of Physical Neural-Guidance Cues
NASA Astrophysics Data System (ADS)
Li, Wei; Tang, Qing Yuan; Jadhav, Amol D.; Narang, Ankit; Qian, Wei Xian; Shi, Peng; Pang, Stella W.
2015-03-01
A combinatorial approach was used to present primary neurons with a large library of topographical features in the form of a micropatterned substrate for high-throughput screening of physical neural-guidance cues that can effectively promote different aspects of neuronal development, including axonal and dendritic outgrowth. Notably, the neuronal-guidance capability of specific features was automatically identified using customized image processing software, thus significantly increasing the screening throughput with minimal subjective bias. Our results indicate that the anisotropic topographies promote axonal and, in some cases, dendritic extension relative to the isotropic topographies, while dendritic branching showed a preference for plain substrates over the microscale features. The results from this work can be readily applied towards engineering novel biomaterials with precise surface topography that can serve as guidance conduits for neuro-regenerative applications. This novel topographical screening strategy, combined with the automated processing capability, can also be used for high-throughput screening of chemical or genetic regulatory factors in primary neurons.
Assessment of Advanced Coal Gasification Processes
NASA Technical Reports Server (NTRS)
McCarthy, John; Ferrall, Joseph; Charng, Thomas; Houseman, John
1981-01-01
This report presents a technical assessment of the following advanced coal gasification processes: the AVCO High Throughput Gasification (HTG) Process; the Bell Single-Stage High Mass Flux (HMF) Process; the Cities Service/Rockwell (CS/R) Hydrogasification Process; and the Exxon Catalytic Coal Gasification (CCG) Process. Each process is evaluated for its potential to produce SNG from a bituminous coal. In addition to identifying the new technology these processes represent, key similarities/differences, strengths/weaknesses, and potential improvements to each process are identified. The AVCO HTG and the Bell HMF gasifiers share similarities with respect to short residence time (SRT), high throughput rate, slagging, and syngas as the initial raw product gas. The CS/R Hydrogasifier is also SRT but is non-slagging and produces a raw gas high in methane content. The Exxon CCG gasifier is a long-residence-time, catalytic, fluid-bed reactor producing all of the raw product methane in the gasifier. The report makes the following assessments: 1) while each process has significant potential as a coal gasifier, the CS/R and Exxon processes are better suited for SNG production; 2) the Exxon process is the closest to a commercial level for near-term SNG production; and 3) the SRT processes require significant development, including scale-up and turndown demonstration, char processing and/or utilization demonstration, and development of reactor control and safety features.
High-throughput cultivation and screening platform for unicellular phototrophs.
Tillich, Ulrich M; Wolter, Nick; Schulze, Katja; Kramer, Dan; Brödel, Oliver; Frohme, Marcus
2014-09-16
High-throughput cultivation and screening methods allow parallel, miniaturized and cost-efficient processing of many samples. These methods, however, have not been generally established for phototrophic organisms such as microalgae or cyanobacteria. In this work we describe and test high-throughput methods with the model organism Synechocystis sp. PCC6803. The required technical automation for these processes was achieved with a Tecan Freedom Evo 200 pipetting robot. Cultivation was performed in 2.2 ml deepwell microtiter plates within a cultivation chamber outfitted with programmable shaking conditions, variable illumination, variable temperature, and an adjustable CO2 atmosphere. Each microtiter well within the chamber functions as a separate cultivation vessel with reproducible conditions. The automated measurement of various parameters such as growth, full absorption spectrum, chlorophyll concentration and MALDI-TOF-MS, as well as a novel vitality measurement protocol, has already been established and can be monitored during cultivation. Measurements of growth parameters can be used as inputs to the system to allow periodic automatic dilutions and therefore semi-continuous cultivation of hundreds of cultures in parallel. The system also allows the automatic generation of mid- and long-term backups of cultures to repeat experiments or to retrieve strains of interest. The presented platform allows high-throughput cultivation and screening of Synechocystis sp. PCC6803. The platform should be usable for many phototrophic microorganisms as is, and be adaptable for even more. A variety of analyses are already established, and the platform is easily expandable both in quality, i.e. with further parameters to screen for additional targets, and in quantity, i.e. the size or number of processed samples.
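The periodic automatic dilutions driven by growth measurements can be expressed as a simple turbidostat-style rule: once a well's optical density passes a trigger value, replace just enough culture with fresh medium to return to a setpoint. The sketch below illustrates that rule; the OD values, setpoints, and well-volume handling are assumptions, not the authors' robot scripts.

```python
# Minimal sketch (assumed logic, not the Tecan worklists used by the authors): decide how
# much culture to replace with fresh medium in each deepwell to keep OD near a setpoint.

def dilution_volume(od_measured, od_target, well_volume_ml, od_trigger):
    """Volume (ml) of culture to swap for fresh medium; 0 if below the trigger OD."""
    if od_measured < od_trigger:
        return 0.0
    # After replacing v ml: od_new = od_measured * (well_volume - v) / well_volume
    v = well_volume_ml * (1.0 - od_target / od_measured)
    return round(min(v, well_volume_ml), 2)

# Hypothetical plate readings for four wells of a 2.2 ml deepwell plate.
readings = {"A1": 1.45, "A2": 0.62, "A3": 2.10, "A4": 0.95}
for well, od in readings.items():
    v = dilution_volume(od, od_target=0.8, well_volume_ml=2.2, od_trigger=1.0)
    print(f"{well}: OD {od:.2f} -> replace {v} ml with fresh medium")
```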
Bhagat, Ali Asgar S; Hou, Han Wei; Li, Leon D; Lim, Chwee Teck; Han, Jongyoon
2011-06-07
Blood is a highly complex bio-fluid with cellular components making up >40% of the total volume, thus making its analysis challenging and time-consuming. In this work, we introduce a high-throughput size-based separation method for processing diluted blood using inertial microfluidics. The technique takes advantage of the preferential cell focusing in high aspect-ratio microchannels coupled with pinched flow dynamics for isolating low abundance cells from blood. As an application of the developed technique, we demonstrate the isolation of cancer cells (circulating tumor cells (CTCs)) spiked in blood by exploiting the difference in size between CTCs and hematologic cells. The microchannel dimensions and processing parameters were optimized to enable high throughput and high resolution separation, comparable to existing CTC isolation technologies. Results from experiments conducted with MCF-7 cells spiked into whole blood indicate >80% cell recovery with an impressive 3.25 × 10(5) fold enrichment over red blood cells (RBCs) and 1.2 × 10(4) fold enrichment over peripheral blood leukocytes (PBL). In spite of a 20× sample dilution, the fast operating flow rate allows the processing of ∼10(8) cells min(-1) through a single microfluidic device. The device design can be easily customized for isolating other rare cells from blood including peripheral blood leukocytes and fetal nucleated red blood cells by simply varying the 'pinching' width. The advantage of simple label-free separation, combined with the ability to retrieve viable cells post enrichment and minimal sample pre-processing presents numerous applications for use in clinical diagnosis and conducting fundamental studies.
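The recovery and enrichment figures quoted above follow from straightforward ratios of cell counts before and after separation. The sketch below shows the arithmetic with hypothetical counts chosen to land near the reported >80% recovery and ~3.25 × 10(5)-fold RBC enrichment; it is illustrative only and uses none of the paper's raw data.

```python
# Minimal sketch with hypothetical counts: deriving recovery and fold-enrichment figures
# from a spiked-cell separation experiment.

def recovery(target_out, target_in):
    return target_out / target_in

def fold_enrichment(target_out, background_out, target_in, background_in):
    """Ratio of (target:background) after separation to (target:background) before."""
    return (target_out / background_out) / (target_in / background_in)

# Hypothetical spiking experiment (values chosen to approximate the quoted figures).
ctc_in, rbc_in = 500, 5e9        # cells loaded into the device
ctc_out, rbc_out = 410, 1.26e4   # cells collected at the CTC outlet
print(f"recovery       : {recovery(ctc_out, ctc_in):.0%}")
print(f"RBC enrichment : {fold_enrichment(ctc_out, rbc_out, ctc_in, rbc_in):.2e}-fold")
```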
NASA Technical Reports Server (NTRS)
Egen, N. B.; Twitty, G. E.; Bier, M.
1979-01-01
Isoelectric focusing is a high-resolution technique for separating and purifying large peptides, proteins, and other biomolecules. The apparatus described in the present paper constitutes a new approach to fluid stabilization and increased throughput. Stabilization is achieved by flowing the process fluid uniformly through an array of closely spaced filter elements oriented parallel both to the electrodes and the direction of the flow. This seems to overcome the major difficulties of parabolic flow and electroosmosis at the walls, while limiting the convection to chamber compartments defined by adjacent spacers. Increased throughput is achieved by recirculating the process fluid through external heat exchange reservoirs, where the Joule heat is dissipated.
NASA Astrophysics Data System (ADS)
Yu, Hao Yun; Liu, Chun-Hung; Shen, Yu Tian; Lee, Hsuan-Ping; Tsai, Kuen Yu
2014-03-01
Line edge roughness (LER) affecting the electrical performance of circuit components is a key challenge for electron-beam lithography (EBL) due to the continuous scaling of technology feature sizes. Controlling LER within an acceptable tolerance that satisfies International Technology Roadmap for Semiconductors requirements while achieving high throughput has become a challenging issue. Although lower dosage and more sensitive resist can be used to improve throughput, they result in serious LER-related problems because of the increasing relative fluctuation in the incident positions of electrons. Directed self-assembly (DSA) is a promising technique to relax LER-related pattern fidelity (PF) requirements because of its self-healing ability, which may benefit throughput. To quantify the potential throughput improvement in EBL from introducing DSA for post healing, rigorous numerical methods are proposed to maximize throughput by adjusting the writing parameters of EBL systems subject to relaxed LER-related PF requirements. A fast, continuous model for parameter sweeping and a hybrid model for more accurate patterning prediction are employed for the patterning simulation. The tradeoff between throughput and DSA self-healing ability is investigated. Preliminary results indicate that significant throughput improvements are achievable at certain process conditions.
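One way to see why relaxing the LER tolerance buys throughput is a simple shot-noise argument: if LER scales roughly as 1/sqrt(dose) and write time scales with dose, a tolerance relaxed by DSA healing permits a proportionally lower dose. The sketch below uses that assumed scaling with hypothetical reference numbers; it is not the rigorous patterning model used in the paper.

```python
# Minimal sketch under an assumed shot-noise scaling (LER ~ 1/sqrt(dose), write time ~ dose);
# reference LER and dose values below are hypothetical.
def dose_for_ler(ler_target_nm, ler_ref_nm=4.0, dose_ref_uc_cm2=60.0):
    """Dose needed to reach a target LER if LER scales as 1/sqrt(dose)."""
    return dose_ref_uc_cm2 * (ler_ref_nm / ler_target_nm) ** 2

# Hypothetical tolerances: 4 nm without DSA healing, relaxed to 5-6 nm with it.
for ler_nm in (4.0, 5.0, 6.0):
    dose = dose_for_ler(ler_nm)
    print(f"LER tolerance {ler_nm:.1f} nm -> dose ~{dose:.0f} uC/cm^2, "
          f"relative write time {dose / dose_for_ler(4.0):.2f}")
```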
IRAS: High-Throughput Identification of Novel Alternative Splicing Regulators.
Zheng, S
2016-01-01
Alternative splicing is a fundamental regulatory process of gene expression. Defects in alternative splicing can lead to various diseases, and modification of disease-causing splicing events holds great therapeutic promise. Splicing outcome is commonly affected by extracellular stimuli and signaling cascades that converge on RNA-binding splicing regulators. These trans-acting factors recognize cis-elements in pre-mRNA transcripts to affect spliceosome assembly and splice site choices. Identification of these splicing regulators and/or their upstream modulators has been difficult and traditionally done piecemeal. High-throughput screening strategies to find multiple regulators of exon splicing have great potential to accelerate the discovery process, but typically confront the low sensitivity and low specificity of screening assays. Here we describe a unique screening strategy, IRAS (identifying regulators of alternative splicing), which uses a pair of dual-output minigene reporters to allow sensitive detection of exon splicing changes. Each dual-output reporter produces green fluorescent protein (GFP) and red fluorescent protein (RFP) fluorescent signals to assay the two spliced isoforms exclusively. The two complementary minigene reporters alter their GFP/RFP output ratios in opposite directions in response to a splicing change. Applying IRAS in cell-based high-throughput screens allows sensitive and specific identification of splicing regulators and modulators for any alternative exons of interest. In comparison to previous high-throughput screening methods, IRAS substantially enhances the specificity of the screening assay. This strategy largely eliminates false positives without sacrificing sensitive identification of true regulators of splicing. © 2016 Elsevier Inc. All rights reserved.
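A plausible way to score such a dual-reporter screen (presented here as an assumption, not the authors' published analysis) is to require that the two complementary reporters shift their log GFP/RFP ratios in opposite directions, since shared artefacts such as toxicity or global expression changes move both ratios the same way. A small sketch with hypothetical fluorescence readings:

```python
# Minimal sketch (assumed hit-calling logic): flag wells where the two complementary
# dual-output reporters change in opposite directions beyond a threshold.
import math

def log2_ratio(gfp, rfp):
    return math.log2(gfp / rfp)

def is_hit(reporter_a, reporter_b, baseline_a, baseline_b, threshold=1.0):
    """True if both reporters shift past the threshold and in opposite directions."""
    delta_a = log2_ratio(*reporter_a) - log2_ratio(*baseline_a)
    delta_b = log2_ratio(*reporter_b) - log2_ratio(*baseline_b)
    return abs(delta_a) > threshold and abs(delta_b) > threshold and delta_a * delta_b < 0

# Hypothetical (GFP, RFP) readings for untreated baseline wells and one candidate well.
baseline_a, baseline_b = (1000, 1000), (1000, 1000)
candidate_a, candidate_b = (3400, 900), (450, 1500)
print(is_hit(candidate_a, candidate_b, baseline_a, baseline_b))   # expect True
```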
A high-throughput media design approach for high performance mammalian fed-batch cultures
Rouiller, Yolande; Périlleux, Arnaud; Collet, Natacha; Jordan, Martin; Stettler, Matthieu; Broly, Hervé
2013-01-01
An innovative high-throughput medium development method based on media blending was successfully used to improve the performance of a Chinese hamster ovary fed-batch medium in shaking 96-deepwell plates. Starting from a proprietary chemically-defined medium, 16 formulations testing 43 of 47 components at 3 different levels were designed. Media blending was performed following a custom-made mixture design of experiments considering binary blends, resulting in 376 different blends that were tested during both cell expansion and fed-batch production phases in one single experiment. Three approaches were chosen to provide the best output of the large amount of data obtained. A simple ranking of conditions was first used as a quick approach to select new formulations with promising features. Then, prediction of the best mixes was done to maximize both growth and titer using the Design Expert software. Finally, a multivariate analysis enabled identification of individual potential critical components for further optimization. Applying this high-throughput method on a fed-batch, rather than on a simple batch, process opens new perspectives for medium and feed development that enables identification of an optimized process in a short time frame. PMID:23563583
The Stanford Automated Mounter: Enabling High-Throughput Protein Crystal Screening at SSRL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, C.A.; Cohen, A.E.
2009-05-26
The macromolecular crystallography experiment lends itself perfectly to high-throughput technologies. The initial steps including the expression, purification, and crystallization of protein crystals, along with some of the later steps involving data processing and structure determination have all been automated to the point where some of the last remaining bottlenecks in the process have been crystal mounting, crystal screening, and data collection. At the Stanford Synchrotron Radiation Laboratory, a National User Facility that provides extremely brilliant X-ray photon beams for use in materials science, environmental science, and structural biology research, the incorporation of advanced robotics has enabled crystals to be screened in a true high-throughput fashion, thus dramatically accelerating the final steps. Up to 288 frozen crystals can be mounted by the beamline robot (the Stanford Auto-Mounting System) and screened for diffraction quality in a matter of hours without intervention. The best quality crystals can then be remounted for the collection of complete X-ray diffraction data sets. Furthermore, the entire screening and data collection experiment can be controlled from the experimenter's home laboratory by means of advanced software tools that enable network-based control of the highly automated beamlines.
Székely, Andrea; Szekrényes, Akos; Kerékgyártó, Márta; Balogh, Attila; Kádas, János; Lázár, József; Guttman, András; Kurucz, István; Takács, László
2014-08-01
Molecular heterogeneity of mAb preparations is the result of various co- and post-translational modifications and of contaminants related to the production process. Changes in molecular composition result in alterations of functional performance; therefore quality control and validation of therapeutic or diagnostic protein products is essential. A special case is the consistent production of mAb libraries (QuantiPlasma™ and PlasmaScan™) for proteome profiling, the quality control of which represents a challenge because of the high number of mAbs (>1000). Here, we devise a generally applicable multicapillary SDS-gel electrophoresis process for the analysis of fluorescently labeled mAb preparations for the high-throughput quality control of mAbs of the QuantiPlasma™ and PlasmaScan™ libraries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Numerical techniques for high-throughput reflectance interference biosensing
NASA Astrophysics Data System (ADS)
Sevenler, Derin; Ünlü, M. Selim
2016-06-01
We have developed a robust and rapid computational method for processing the raw spectral data collected from thin film optical interference biosensors. We have applied this method to Interference Reflectance Imaging Sensor (IRIS) measurements and observed a 10,000 fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. We also discuss an additional benefit that the LUT method can be used with a wider range of interference layer thickness and experimental configurations that are incompatible with methods that require fitting the spectral response.
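The lookup-table idea can be illustrated compactly: precompute model spectra on a dense grid of film thicknesses once, then assign each measured spectrum to the nearest table entry instead of running a per-pixel fit. The sketch below uses a simplified two-beam interference model and assumed optical constants, so it stands in for, rather than reproduces, the IRIS spectral model.

```python
# Minimal sketch of the lookup-table (LUT) approach (not the IRIS production code).
import numpy as np

wavelengths_nm = np.linspace(450, 650, 41)   # assumed spectral sampling of the sensor
n_film = 1.46                                # assumed refractive index (SiO2-like layer)

def model_spectrum(thickness_nm):
    """Normalized two-beam interference spectrum for one film thickness."""
    phase = 4.0 * np.pi * n_film * thickness_nm / wavelengths_nm
    return 0.5 * (1.0 + np.cos(phase))

# Build the lookup table once: one row per candidate thickness.
thickness_grid = np.arange(0.0, 200.0, 0.25)                   # nm
lut = np.stack([model_spectrum(d) for d in thickness_grid])    # (n_thickness, n_lambda)

def thickness_from_spectrum(measured):
    """Return the grid thickness whose model spectrum is closest in the least-squares sense."""
    errors = np.sum((lut - measured) ** 2, axis=1)
    return thickness_grid[np.argmin(errors)]

# Hypothetical measurement: a noisy 123.5 nm film.
rng = np.random.default_rng(0)
measured = model_spectrum(123.5) + rng.normal(0, 0.01, wavelengths_nm.size)
print(f"estimated thickness: {thickness_from_spectrum(measured):.2f} nm")
```

Because the table is built once and the per-spectrum work reduces to a vectorized nearest-neighbour search, the per-pixel cost drops sharply relative to iterative fitting, which is the source of the large speedup described above.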
High-throughput process development: I. Process chromatography.
Rathore, Anurag S; Bhambure, Rahul
2014-01-01
Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in the removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges faced during development of a chromatographic step. Traditional process development is performed as a trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform integrates miniaturization, automation, and parallelization, and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare), which is available in 96-well format with 2 or 6 μL well size. We also discuss the challenges that one faces when performing such experiments, as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data gathered from such a platform. A case study involving the use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that are representative of data obtained at the traditional lab scale. The agreement in the data is indeed very significant (regression coefficient 0.93). We think that this protocol will be of significant value to those involved in performing high-throughput process development of process chromatography.
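The scale-comparison step, checking that 96-well (PreDictor plate) results track the traditional lab-scale column, can be summarized by a simple regression such as the one sketched below. The yield numbers are hypothetical; only the idea of reporting a regression coefficient, like the 0.93 quoted above, comes from the text.

```python
# Minimal sketch with hypothetical data: regress lab-scale results on plate-scale results
# and report the coefficient of determination (R^2).
import numpy as np

plate_yield = np.array([52, 61, 66, 70, 74, 79, 83, 88, 91])    # % yield, 96-well conditions
column_yield = np.array([50, 63, 64, 72, 71, 81, 85, 86, 93])   # % yield, lab-scale runs

slope, intercept = np.polyfit(plate_yield, column_yield, 1)
predicted = slope * plate_yield + intercept
ss_res = np.sum((column_yield - predicted) ** 2)
ss_tot = np.sum((column_yield - column_yield.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"lab-scale ≈ {slope:.2f} * plate-scale + {intercept:.1f},  R^2 = {r_squared:.2f}")
```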
Xi-cam: Flexible High Throughput Data Processing for GISAXS
NASA Astrophysics Data System (ADS)
Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sarje, Abinav; Krishnan, Hari; Pellouchoud, Lenson; Ren, Fang; Fournier, Amanda; Jiang, Zhang; Tassone, Christopher; Mehta, Apurva; Sethian, James; Hexemer, Alexander
With increasing capabilities and data demand for GISAXS beamlines, supporting software is under development to handle larger data rates, volumes, and processing needs. We aim to provide a flexible and extensible approach to GISAXS data treatment as a solution to these rising needs. Xi-cam is the CAMERA platform for data management, analysis, and visualization. The core of Xi-cam is an extensible plugin-based GUI platform which provides users an interactive interface to processing algorithms. Plugins are available for SAXS/GISAXS data and data series visualization, as well as forward modeling and simulation through HipGISAXS. With Xi-cam's advanced mode, data processing steps are designed as a graph-based workflow, which can be executed locally or remotely. Remote execution utilizes HPC or de-localized resources, allowing for effective reduction of high-throughput data. Xi-cam is open-source and cross-platform. The processing algorithms in Xi-cam include parallel CPU and GPU processing optimizations, also taking advantage of external processing packages such as pyFAI. Xi-cam is available for download online.
High throughput nanoimprint lithography for semiconductor memory applications
NASA Astrophysics Data System (ADS)
Ye, Zhengmao; Zhang, Wei; Khusnatdinov, Niyaz; Stachowiak, Tim; Irving, J. W.; Longsine, Whitney; Traub, Matthew; Fletcher, Brian; Liu, Weijun
2017-03-01
Imprint lithography is a promising technology for replication of nano-scale features. For semiconductor device applications, Canon deposits a low viscosity resist on a field-by-field basis using jetting technology. A patterned mask is lowered into the resist fluid, which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are two critical components to meeting throughput requirements for imprint lithography. Using a similar approach to what is already done for many deposition and etch processes, imprint stations can be clustered to enhance throughput. The FPA-1200NZ2C is a four station cluster system designed for high volume manufacturing. For a single station, throughput includes overhead, resist dispense, resist fill time (or spread time), exposure and separation. Resist exposure time and mask/wafer separation are well understood processing steps with typical durations on the order of 0.10 to 0.20 seconds. To achieve a total process throughput of 17 wafers per hour (wph) for a single station, it is necessary to complete the fluid fill step in 1.2 seconds. For a throughput of 20 wph, fill time must be reduced to only 1.1 seconds. There are several parameters that can impact resist filling. Key parameters include resist drop volume (smaller is better), system controls (which address drop spreading after jetting), Design for Imprint or DFI (to accelerate drop spreading) and material engineering (to promote wetting between the resist and underlying adhesion layer). In addition, it is mandatory to maintain fast filling, even for edge field imprinting. In this paper, we address the improvements made in all of these parameters to first enable a 1.2 second filling process for a device-like pattern, and have demonstrated this capability for both full fields and edge fields. Non-fill defectivity is well under 1.0 defects/cm2 for both field types. Next, by further reducing drop volume and optimizing drop patterns, a fill time of 1.1 seconds was demonstrated.
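The relationship between per-field fill time and station throughput is simple budget arithmetic: the time available per wafer at a target throughput, minus per-wafer overhead, is divided across the imprint fields and then across the per-field steps. The sketch below works that budget backwards from the 17 and 20 wph targets; the field count, overhead, and the dispense/expose/separate times are assumptions, so the resulting fill budgets only approximate the quoted 1.2 s and 1.1 s.

```python
# Minimal sketch of the throughput budget; only the 17/20 wph targets come from the text,
# the other per-step values and the field count are hypothetical.

def fill_time_budget_s(target_wph, fields_per_wafer, wafer_overhead_s,
                       dispense_s, expose_s, separate_s):
    per_wafer_s = 3600.0 / target_wph
    per_field_s = (per_wafer_s - wafer_overhead_s) / fields_per_wafer
    return per_field_s - (dispense_s + expose_s + separate_s)

for wph in (17, 20):
    budget = fill_time_budget_s(target_wph=wph, fields_per_wafer=84, wafer_overhead_s=30.0,
                                dispense_s=0.4, expose_s=0.15, separate_s=0.15)
    print(f"{wph} wph per station -> fill step must finish in about {budget:.2f} s per field")
```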
Quality control methodology for high-throughput protein-protein interaction screening.
Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha
2011-01-01
Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, or its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basics of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherent imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interactions screens.
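One common way to quantify the error rates discussed in this chapter (given here as a generic illustration rather than the chapter's exact statistics) is to benchmark the assay against a positive reference set of well-documented interactions and a random reference set of protein pairs assumed not to interact, yielding empirical true- and false-positive rates:

```python
# Minimal sketch (hypothetical counts): empirical assay quality from reference sets.

def assay_error_rates(prs_tested, prs_positive, rrs_tested, rrs_positive):
    true_positive_rate = prs_positive / prs_tested    # sensitivity on known interacting pairs
    false_positive_rate = rrs_positive / rrs_tested   # background from random pairs
    return true_positive_rate, false_positive_rate

# Hypothetical benchmarking counts for one screening configuration.
tpr, fpr = assay_error_rates(prs_tested=92, prs_positive=21, rrs_tested=188, rrs_positive=1)
print(f"true-positive rate ~{tpr:.1%}, false-positive rate ~{fpr:.2%}")
```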
A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy
Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian
2016-01-01
Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS) technology. In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high-sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with a comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for the high-resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925
NASA Astrophysics Data System (ADS)
Ohene-Kwofie, Daniel; Otoo, Ekow
2015-10-01
The ATLAS detector, operated at the Large Hadron Collider (LHC), records proton-proton collisions at CERN every 50 ns, resulting in a sustained data flow of up to PB/s. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPUs/GPGPUs assembled for high-performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem then is whether one can implement an I/O subsystem infrastructure capable of meeting the computational speeds of advanced computing systems at the petascale and exascale level. We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data store for the Processing Unit (PU) of the upgraded Tile Calorimeter electronics, which is proposed to be used as a high-throughput general-purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memory of the PUs is aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput.
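Conceptually, the PGAS in-memory data store partitions one global key space across the physical memories of the PUs, and every read or write is routed to the owning node. The toy sketch below shows only that partitioning logic; the real system aggregates memory over RDMA-capable interconnects rather than Python dictionaries, and the key and value shapes are hypothetical.

```python
# Toy sketch of the partitioned-global-address-space idea (conceptual only; local dicts
# stand in for each PU's physical memory, and routing stands in for RDMA access).

class PGASStore:
    def __init__(self, n_nodes):
        self.nodes = [dict() for _ in range(n_nodes)]   # one local store per PU

    def _owner(self, key):
        return hash(key) % len(self.nodes)              # static partitioning of the key space

    def put(self, key, value):
        self.nodes[self._owner(key)][key] = value       # "remote write" to the owning node

    def get(self, key):
        return self.nodes[self._owner(key)][key]        # "remote read" from the owning node

store = PGASStore(n_nodes=8)
store.put(("event", 42, "channel", 7), [0.13, 0.18, 0.25])   # hypothetical calorimeter samples
print(store.get(("event", 42, "channel", 7)))
```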
Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.
Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras
2016-04-01
There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.
Networking Omic Data to Envisage Systems Biological Regulation.
Kalapanulak, Saowalak; Saithong, Treenut; Thammarongtham, Chinae
To understand how biological processes work, it is necessary to explore the systematic regulation governing the behavior of those processes. In addition to driving the normal behavior of organisms, this systematic regulation evidently underlies temporal responses to surrounding environments (dynamics) and long-term phenotypic adaptation (evolution). The systematic regulation is, in effect, formulated from regulatory components which work together collaboratively as a network. In the drive to decipher such a code of life, a spectrum of technologies has been developed continuously in the post-genomic era. With current advances, high-throughput sequencing technologies are tremendously powerful for facilitating genomics and systems biology studies in the attempt to understand systems regulation inside the cell. The ability to explore relevant regulatory components which infer the transcriptional and signaling regulation driving core cellular processes is thus enhanced. This chapter reviews high-throughput sequencing technologies, including second- and third-generation sequencing technologies, which support the investigation of genomics and transcriptomics data. Utilization of this high-throughput data to form virtual networks of systems regulation is explained, particularly transcriptional regulatory networks. Analysis of the resulting regulatory networks could lead to an understanding of cellular systems regulation at the mechanistic and dynamic levels. The great contribution of the biological networking approach to envisaging systems regulation is finally demonstrated by a broad range of examples.
Foam-mat drying technology: A review.
Hardy, Z; Jideani, V A
2017-08-13
This article reviews various aspects of foam-mat drying such as the foam-mat drying processing technique, the main additives used for foam-mat drying, foam-mat drying of liquid and solid foods, quality characteristics of foam-mat dried foods, and the economic and technical benefits of employing foam-mat drying. The foam-mat drying process is an alternative method that allows the removal of water from liquid and pureed materials. In this drying process, a liquid material is converted into a stable foam by whipping after adding an edible foaming agent. The stable foam is then spread out in a sheet or mat and dried using hot air (40-90°C) at atmospheric pressure. Methyl cellulose (0.25-2%), egg white (3-20%), maltodextrin (0.5-05%), and gum Arabic (2-9%) are the additives commonly utilized for foam-mat drying at the given ranges, either combined for their joint effectiveness or used for their individual effect. The foam-mat drying process is suitable for heat-sensitive, viscous, and sticky products that cannot be dried using other drying methods such as spray drying because of the state of the product. Interest in foam-mat drying has grown because of the simplicity, cost effectiveness, high drying speed, and improved product quality it provides.
Hegab, Hanaa M.; ElMekawy, Ahmed; Stakenborg, Tim
2013-01-01
Microbial fermentation process development pursues high production yields. This requires high-throughput screening and optimization of the microbial strains, which is nowadays commonly achieved by applying slow and labor-intensive submerged cultivation in shake flasks or microtiter plates. These methods are also limited to end-point measurements, provide low analytical data output, and offer limited control over the fermentation process. These drawbacks could be overcome by means of scaled-down microfluidic microbioreactors (μBR) that allow online control of cultivation data and automation, hence reducing cost and time. This review goes beyond previous work not only by providing a detailed update on current μBR fabrication techniques but also by comparing the operation and control of μBRs to large-scale fermentation reactors. PMID:24404006
Klukas, Christian; Chen, Dijun; Pape, Jean-Michel
2014-01-01
High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present the Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy, up to 0.98 and 0.95, respectively. In summary, IAP provides a comprehensive set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818
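As a rough illustration of the kind of block-based trait extraction described above (not IAP's actual code), the sketch below segments plant pixels with an excess-green index and reports the projected area as a crude digital-volume proxy; the threshold and calibration factor are assumptions.

```python
# Hedged sketch of one trait-extraction "block": excess-green segmentation and
# projected plant area from an RGB image scaled to [0, 1].
import numpy as np

def plant_mask(rgb):
    """Boolean mask of plant pixels (excess-green index above a fixed cutoff)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b          # excess-green index
    return exg > 0.1               # threshold is an assumption

def projected_area(rgb, mm_per_px=0.5):
    """Plant area in mm^2 from one camera view (calibration factor illustrative)."""
    return plant_mask(rgb).sum() * mm_per_px ** 2

# toy usage: a synthetic 100x100 image with a green square "plant"
img = np.zeros((100, 100, 3))
img[40:60, 40:60, 1] = 0.8
print(projected_area(img))         # 400 px * 0.25 mm^2/px = 100.0
```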
High performance hybrid magnetic structure for biotechnology applications
Humphries, David E; Pollard, Martin J; Elkin, Christopher J
2005-10-11
The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides means for separation and other biotechnology applications involving holding, manipulation, or separation of magnetizable molecular structures and targets. Also disclosed are: a method of assembling the hybrid magnetic plates, a high throughput protocol featuring the hybrid magnetic structure, and other embodiments of the ferromagnetic pole shape, attachment and adapter interfaces for adapting the use of the hybrid magnetic structure for use with liquid handling and other robots for use in high throughput processes.
High performance hybrid magnetic structure for biotechnology applications
Humphries, David E.; Pollard, Martin J.; Elkin, Christopher J.
2006-12-12
The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides for separation and other biotechnology applications involving holding, manipulation, or separation of magnetic or magnetizable molecular structures and targets. Also disclosed are: a method of assembling the hybrid magnetic plates, a high throughput protocol featuring the hybrid magnetic structure, and other embodiments of the ferromagnetic pole shape, attachment and adapter interfaces for adapting the use of the hybrid magnetic structure for use with liquid handling and other robots for use in high throughput processes.
Control structures for high speed processors
NASA Technical Reports Server (NTRS)
Maki, G. K.; Mankin, R.; Owsley, P. A.; Kim, G. M.
1982-01-01
A special processor was designed to function as a Reed-Solomon decoder with a throughput data rate in the MHz range. This data rate is significantly greater than is possible with conventional digital architectures. To achieve this rate, the processor design includes sequential, pipelined, distributed, and parallel processing. The processor was designed using a high-level register transfer language (RTL), which can be used to describe how the different processes are implemented by the hardware. One problem of special interest was the development of dependent processes, which are analogous to software subroutines. For greater flexibility, the RTL control structure was implemented in ROM. The special purpose hardware required approximately 1000 SSI and MSI components. The data rate throughput is 2.5 megabits/second, achieved through the use of pipelined and distributed processing. This data rate can be compared with 800 kilobits/second in a recently proposed very large scale integration design of a Reed-Solomon encoder.
Sakwanichol, Jarunee; Puttipipatkhachorn, Satit; Ingenerf, Gernot; Kleinebudde, Peter
2012-01-01
Different experimental factorial designs were employed to evaluate the granule properties obtained from an oscillating granulator and a roll mill. Four oscillating-granulator parameters were varied, i.e. rotor speed, oscillating angle, aperture of the mesh screen and rotor type. Six roll-mill parameters (throughput, speed ratio in the first and second stages, gap between the roll pair in both stages, and roll-surface texture) were also investigated. Afterwards, the granule properties obtained from the two milling types with similar median particle size were compared. All milling parameters in both milling types significantly affected the median particle size, size distribution and amount of fine particles (P < 0.05), except for the effect of the rotor type of the oscillating granulator on fines. Only three milling parameters significantly influenced the flowability (P < 0.05): the throughput and the gap size in the first stage of the roll mill, and the sieve size of the oscillating granulator. In comparing the milling types, the differences in granule properties were not practically relevant. However, the roll mill had about seven times higher capacity than the oscillating granulator, resulting in improved energy savings per unit of product. Consequently, the roll mill can be used instead of the oscillating granulator in the roll compaction/dry granulation technique.
Cai, Jinhai; Okamoto, Mamoru; Atieno, Judith; Sutton, Tim; Li, Yongle; Miklavcic, Stanley J.
2016-01-01
Leaf senescence, an indicator of plant age and ill health, is an important phenotypic trait for the assessment of a plant’s response to stress. Manual inspection of senescence, however, is time consuming, inaccurate and subjective. In this paper we propose an objective evaluation of plant senescence by color image analysis for use in a high throughput plant phenotyping pipeline. As high throughput phenotyping platforms are designed to capture whole-of-plant features, camera lenses and camera settings are inappropriate for the capture of fine detail. Specifically, plant colors in images may not represent true plant colors, leading to errors in senescence estimation. Our algorithm features a color distortion correction and image restoration step prior to a senescence analysis. We apply our algorithm to two time series of images of wheat and chickpea plants to quantify the onset and progression of senescence. We compare our results with senescence scores resulting from manual inspection. We demonstrate that our procedure is able to process images in an automated way for an accurate estimation of plant senescence even from color distorted and blurred images obtained under high throughput conditions. PMID:27348807
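For intuition only, here is a hedged sketch of a naive color-based senescence score: plant pixels are converted to HSV and those whose hue falls outside an assumed green band are counted as senescent. The hue cutoffs and the use of colorsys are illustrative assumptions, not the paper's algorithm.

```python
# Naive senescence index: fraction of plant pixels outside a "healthy green" hue band.
import colorsys
import numpy as np

def senescence_fraction(rgb, plant_mask):
    """Fraction of masked plant pixels whose hue falls outside the green band."""
    hues = []
    for r, g, b in rgb[plant_mask]:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        hues.append(h * 360.0)
    hues = np.asarray(hues)
    green = (hues >= 80) & (hues <= 160)   # assumed green hue band, degrees
    return 1.0 - green.mean() if hues.size else 0.0

# toy usage: half green, half yellowish pixels -> score ~0.5
img = np.zeros((2, 2, 3))
img[0, :, :] = [0.2, 0.8, 0.2]   # green
img[1, :, :] = [0.8, 0.8, 0.1]   # yellowish (senescent-looking)
mask = np.ones((2, 2), dtype=bool)
print(senescence_fraction(img, mask))
```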
High-throughput bioinformatics with the Cyrille2 pipeline system
Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ
2008-01-01
Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web based, graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines. PMID:18269742
A high throughput array microscope for the mechanical characterization of biomaterials
NASA Astrophysics Data System (ADS)
Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard
2015-02-01
In the last decade, the emergence of high throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties and the biochemical mechanisms controlling those properties contribute significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated array high-throughput microscope system that utilizes passive microbead diffusion to characterize the mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue and whose expression has been implicated in cancer progression.
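A hedged sketch of the underlying analysis: passive bead trajectories yield a mean squared displacement, from which a diffusion coefficient and, via the Stokes-Einstein relation, an apparent viscosity follow. The frame interval, bead radius and temperature below are placeholders, not instrument values.

```python
# Estimate apparent viscosity from passive bead tracking (MSD = 4*D*tau in 2-D).
import numpy as np

KB = 1.380649e-23          # Boltzmann constant, J/K

def apparent_viscosity(xy_um, dt_s, bead_radius_m, temp_k=298.0):
    """xy_um: (n_frames, 2) bead positions in micrometres."""
    xy = np.asarray(xy_um) * 1e-6                  # -> metres
    disp = xy[1:] - xy[:-1]
    msd_1 = np.mean(np.sum(disp ** 2, axis=1))     # MSD at lag = 1 frame
    D = msd_1 / (4.0 * dt_s)                       # diffusion coefficient, m^2/s
    return KB * temp_k / (6.0 * np.pi * D * bead_radius_m)   # Pa*s

# toy usage: simulated Brownian track of a 0.5 um radius bead in a 1 mPa*s fluid
rng = np.random.default_rng(1)
D_true = KB * 298.0 / (6 * np.pi * 1e-3 * 0.5e-6)
track = np.cumsum(rng.normal(scale=np.sqrt(2 * D_true * 0.02), size=(5000, 2)), axis=0)
print(apparent_viscosity(track * 1e6, dt_s=0.02, bead_radius_m=0.5e-6))  # ~1e-3 Pa*s
```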
Lee, Myung Gwon; Shin, Joong Ho; Bae, Chae Yun; Choi, Sungyoung; Park, Je-Kyun
2013-07-02
We report a contraction-expansion array (CEA) microchannel device that performs label-free high-throughput separation of cancer cells from whole blood at low Reynolds number (Re). The CEA microfluidic device utilizes a hydrodynamic field effect for cancer cell separation that combines two kinds of inertial effects, (1) inertial lift force and (2) Dean flow, which results in label-free size-based separation with high throughput. To avoid cell damage potentially caused by high shear stress in conventional inertial separation techniques, the CEA microfluidic device isolates the cells at low operational Re while maintaining high-throughput separation, using nondiluted whole blood samples (hematocrit ~45%). We characterized inertial particle migration and investigated the migration of blood cells and various cancer cells (MCF-7, SK-BR-3, and HCC70) in the CEA microchannel. The separation of cancer cells from whole blood was demonstrated with a cancer cell recovery rate of 99.1%, a blood cell rejection ratio of 88.9%, and a throughput of 1.1 × 10^8 cells/min. In addition, the blood cell rejection ratio was further improved to 97.3% by a two-step filtration process with two devices connected in series.
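For clarity, the two figures of merit quoted above are typically defined from inlet and outlet cell counts as in the small helper below; the counts used in the example are invented solely to mirror the reported magnitudes.

```python
# Back-of-envelope definitions of recovery rate and rejection ratio (illustrative).
def recovery_rate(cancer_out_target, cancer_in):
    """Fraction of spiked cancer cells collected at the target outlet."""
    return cancer_out_target / cancer_in

def rejection_ratio(blood_in, blood_out_target):
    """Fraction of blood cells diverted away from the target outlet."""
    return (blood_in - blood_out_target) / blood_in

# toy numbers chosen only to land near ~99.1% and ~88.9%
print(recovery_rate(991, 1000), rejection_ratio(10_000, 1_110))
```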
Wu, Zhenlong; Chen, Yu; Wang, Moran; Chung, Aram J
2016-02-07
Fluid inertia, which has conventionally been neglected in microfluidics, has been gaining much attention for particle and cell manipulation because inertia-based methods inherently provide simple, passive, precise and high-throughput characteristics. In particular, the inertial approach has been applied to blood separation for various biomedical research studies, mainly using spiral microchannels. For higher throughput, parallelization is essential; however, it is difficult to realize using spiral channels because of their large two-dimensional layouts. In this work, we present a novel inertial platform for continuous sheathless particle and blood cell separation in straight microchannels containing microstructures. Microstructures within straight channels exert secondary flows to manipulate particle positions similar to Dean flow in curved channels but with higher controllability. Through a balance between inertial lift force and microstructure-induced secondary flow, we deterministically position microspheres and cells based on their sizes to be separated downstream. Using our inertial platform, we successfully sorted microparticles and fractionated blood cells with high separation efficiencies, high purities and high throughputs. The inertial separation platform developed here can be operated to process diluted blood with a throughput of 10.8 mL min^-1 via radially arrayed single channels with one inlet and two rings of outlets.
Lens-free shadow image based high-throughput continuous cell monitoring technique.
Jin, Geonsoo; Yoo, In-Hwa; Pack, Seung Pil; Yang, Ji-Woon; Ha, Un-Hwan; Paek, Se-Hwan; Seo, Sungkyu
2012-01-01
A high-throughput continuous cell monitoring technique that does not require any labeling reagents or destruction of the specimen is demonstrated. More than 6000 human alveolar epithelial A549 cells are monitored for up to 72 h simultaneously and continuously with a single digital image within a cost- and space-effective lens-free shadow imaging platform. In an experiment performed within a custom-built incubator integrated with the lens-free shadow imaging platform, the cell nucleus division process could be successfully characterized by calculating the signal-to-noise ratios (SNRs) and the shadow diameters (SDs) of the cell shadow patterns. The versatile nature of this platform also enabled a single cell viability test followed by live cell counting. This study is the first to show that the lens-free shadow imaging technique can provide continuous cell monitoring without any staining/labeling reagents or destruction of the specimen. This high-throughput continuous cell monitoring technique based on lens-free shadow imaging may be widely utilized as a compact, low-cost, and high-throughput cell monitoring tool in the fields of drug and food screening or cell proliferation and viability testing. Copyright © 2012 Elsevier B.V. All rights reserved.
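One plausible (assumed, not the authors' exact definition) way to compute an SNR and shadow diameter for a single cell shadow pattern is sketched below; the pixel pitch is a placeholder value.

```python
# Illustrative SNR and shadow-diameter helpers for a cropped shadow patch.
import numpy as np

def shadow_snr(patch, background):
    """Peak shadow contrast over background noise, from grayscale arrays."""
    signal = np.abs(patch.astype(float) - background.mean()).max()
    noise = background.std()
    return signal / noise if noise > 0 else np.inf

def shadow_diameter(patch, threshold, px_um=2.2):
    """Equivalent-circle diameter (um) of pixels darker than a threshold."""
    area_px = np.count_nonzero(patch < threshold)
    return 2.0 * np.sqrt(area_px / np.pi) * px_um

# toy usage: a dark 5x5 shadow on a noisy background patch
rng = np.random.default_rng(5)
bg = rng.normal(120, 3, (40, 40))
cell = bg.copy()
cell[15:20, 15:20] -= 60
print(shadow_snr(cell, bg), shadow_diameter(cell, threshold=90))
```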
Li, Teng; Ye, Jianwen; Shen, Rui; Zong, Yeqing; Zhao, Xuejin; Lou, Chunbo; Chen, Guo-Qiang
2016-11-18
As the product of a multistep enzymatic reaction, accumulation of poly(3-hydroxybutyrate) (PHB) in Escherichia coli (E. coli) can be achieved by overexpression of the PHB synthesis pathway from a native producer, involving the three genes phbC, phbA, and phbB. Pathway optimization by adjusting the expression levels of the three genes can influence the properties of the final product. Here, we report a semirational approach for highly efficient PHB pathway optimization in E. coli based on a phbCAB operon cloned from the native producer Ralstonia eutropha (R. eutropha). Rationally designed ribosomal binding site (RBS) libraries with defined strengths for each of the three genes were constructed on high or low copy number plasmids in a one-pot reaction by an oligo-linker mediated assembly (OLMA) method. Strains with desired properties were evaluated and selected by three different methodologies, including visual selection, high-throughput screening, and detailed in-depth analysis. Applying this approach, strains accumulating 0-92% PHB content of cell dry weight (CDW) were achieved. PHB with various weight-average molecular weights (Mw) of 2.7-6.8 × 10^6 was also efficiently produced at relatively high content. These results suggest that the semirational approach combining library design, construction, and proper screening is an efficient way to optimize PHB and other multienzyme pathways.
Lee, Ju Hee; Chen, Hongxiang; Kolev, Vihren; Aull, Katherine H.; Jung, Inhee; Wang, Jun; Miyamoto, Shoko; Hosoi, Junichi; Mandinova, Anna; Fisher, David E.
2014-01-01
Skin pigmentation is a complex process including melanogenesis within melanocytes and melanin transfer to the keratinocytes. To develop a comprehensive screening method for novel pigmentation regulators, we used immortalized melanocytes and keratinocytes in co-culture to screen large numbers of compounds. High-throughput screening plates were subjected to digital automated microscopy to quantify the pigmentation via brightfield microscopy. Compounds with pigment suppression were secondarily tested for their effects on expression of MITF and several pigment regulatory genes, and further validated in terms of non-toxicity to keratinocytes/melanocytes and dose dependent activity. The results demonstrate a high-throughput, high-content screening approach, which is applicable to the analysis of large chemical libraries using a co-culture system. We identified candidate pigmentation inhibitors from 4,000 screened compounds including zoxazolamine, 3-methoxycatechol, and alpha-mangostin, which were also shown to modulate expression of MITF and several key pigmentation factors, and are worthy of further evaluation for potential translation to clinical use. PMID:24438532
A Robust Framework for Microbial Archaeology
Warinner, Christina; Herbig, Alexander; Mann, Allison; Yates, James A. Fellows; Weiß, Clemens L.; Burbano, Hernán A.; Orlando, Ludovic; Krause, Johannes
2017-01-01
Microbial archaeology is flourishing in the era of high-throughput sequencing, revealing the agents behind devastating historical plagues, identifying the cryptic movements of pathogens in prehistory, and reconstructing the ancestral microbiota of humans. Here, we introduce the fundamental concepts and theoretical framework of the discipline, then discuss applied methodologies for pathogen identification and microbiome characterization from archaeological samples. We give special attention to the process of identifying, validating, and authenticating ancient microbes using high-throughput DNA sequencing data. Finally, we outline standards and precautions to guide future research in the field. PMID:28460196
Correction of Microplate Data from High-Throughput Screening.
Wang, Yuhong; Huang, Ruili
2016-01-01
High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noise in the microplate-formatted data from HTS have unique characteristics, and they can generally be grouped into three categories: run-wise (temporal, multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noise, based mainly on pattern recognition and digital signal processing technologies.
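As a generic illustration of plate-wise background correction (a two-way median polish in the spirit of B-scoring, which may differ from the chapter's specific method), consider the following sketch.

```python
# Two-way median polish: remove row/column background trends from a plate matrix.
import numpy as np

def median_polish(plate, n_iter=10):
    """Return residuals after iteratively removing row and column medians."""
    resid = plate.astype(float).copy()
    for _ in range(n_iter):
        resid -= np.median(resid, axis=1)[:, None]   # subtract row medians
        resid -= np.median(resid, axis=0)[None, :]   # subtract column medians
    return resid

# toy usage: a 16x24 plate with an artificial column gradient
plate = np.random.default_rng(2).normal(100, 5, size=(16, 24)) + np.arange(24)
corrected = median_polish(plate)
print(corrected.std() < plate.std())   # gradient removed -> lower spread
```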
Ultra-high-throughput Production of III-V/Si Wafer for Electronic and Photonic Applications
Geum, Dae-Myeong; Park, Min-Su; Lim, Ju Young; Yang, Hyun-Duk; Song, Jin Dong; Kim, Chang Zoo; Yoon, Euijoon; Kim, SangHyeon; Choi, Won Jun
2016-01-01
Si-based integrated circuits have been intensively developed over the past several decades through ultimate device scaling. However, Si technology has reached the physical limitations of scaling. These limitations have fuelled the search for alternative active materials (for transistors) and the introduction of optical interconnects (called "Si photonics"). A series of attempts to circumvent the limits of Si technology are based on the use of III-V compound semiconductors due to their superior properties, such as high electron mobility and a direct bandgap. To use their physical properties on a Si platform, the formation of high-quality III-V films on Si (III-V/Si) is the basic technology; however, implementing this technology using a high-throughput process is not easy. Here, we report new concepts for the ultra-high-throughput heterogeneous integration of high-quality III-V films on Si using wafer bonding and the epitaxial lift-off (ELO) technique. We describe the ultra-fast ELO and also the re-use of the III-V donor wafer after III-V/Si formation. These approaches provide ultra-high-throughput fabrication of III-V/Si substrates with a high-quality film, which leads to a dramatic cost reduction. As proof-of-concept devices, this paper demonstrates GaAs-based high electron mobility transistors (HEMTs), solar cells, and hetero-junction phototransistors on Si substrates. PMID:26864968
Life Cycle Cost of Solar Biomass Hybrid Dryer Systems for Cashew Drying of Nuts in India
NASA Astrophysics Data System (ADS)
Dhanushkodi, Saravanan; Wilson, Vincent H.; Sudhakar, Kumarasamy
2015-12-01
Cashew nut farming in India is mostly carried out in small and marginal holdings. Energy consumption in the small-scale cashew nut processing industry is very high, mainly because of the energy-intensive drying process. The drying operation therefore offers considerable scope for energy savings and for substitution with other renewable energy sources. Renewable energy-based drying systems with a loading capacity of 40 kg were proposed for application in small-scale cashew nut processing industries. The main objective of this work is to assess the economic feasibility of substituting solar, biomass and hybrid dryers for conventional steam drying of cashew. Four economic indicators were used to assess the feasibility of the three renewable-based drying technologies. The payback time was 1.58 yr for solar, 1.32 for biomass and 1.99 for the hybrid drying system, whereas the cost-benefit estimates were 5.23 for solar, 4.15 for biomass and 3.32 for the hybrid system. It was concluded that developing solar-biomass hybrid dryers for small-scale processing industries is of paramount importance.
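For reference, the two simplest of these indicators are computed as below; the capital cost and savings figures are made-up placeholders chosen only to land near the reported magnitudes.

```python
# Simple economic indicators for a dryer investment (illustrative inputs only).
def payback_years(capital_cost, annual_savings):
    """Years needed for cumulative savings to repay the initial investment."""
    return capital_cost / annual_savings

def benefit_cost_ratio(total_discounted_benefits, total_discounted_costs):
    """Ratio of lifetime benefits to lifetime costs (both discounted)."""
    return total_discounted_benefits / total_discounted_costs

print(payback_years(capital_cost=50_000, annual_savings=31_600))   # ~1.58 yr
print(benefit_cost_ratio(262_000, 50_000))                          # ~5.2
```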
Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu
2013-08-01
High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/.
Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu
2013-01-01
High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/. PMID:23657089
Laser processes and system technology for the production of high-efficient crystalline solar cells
NASA Astrophysics Data System (ADS)
Mayerhofer, R.; Hendel, R.; Zhu, Wenjie; Geiger, S.
2012-10-01
The laser as an industrial tool is an essential part of today's solar cell production. Due to the on-going efforts in the solar industry to increase cell efficiency, more and more laser-based processes, which have been discussed and tested at lab scale for many years, are now being implemented in mass production lines. In order to cope with throughput requirements, standard laser concepts have to be improved continuously with respect to available average power levels, repetition rates or beam profile. Some of the laser concepts that showed high potential in the past couple of years will be substituted by other, more economical laser types. Furthermore, requirements for processing with smaller heat-affected zones fuel the development of industry-ready ultra-short pulsed lasers with pulse widths even below the picosecond range. In 2011, the German Ministry of Education and Research (BMBF) launched the program "PV-Innovation Alliance", with the aim of supporting the rapid transfer of high-efficiency processes out of development departments and research institutes into solar cell production lines. Here, lasers play an important role as production tools, allowing the fast implementation of high-performance solar cell concepts. We will report on the results achieved within the joint project FUTUREFAB, where efficiency optimization, throughput enhancement and cost reduction are the main goals. The presentation will focus on laser processes such as selective emitter doping and ablation of dielectric layers. An indispensable part of the efforts towards cost reduction in solar cell production is the improvement of the wafer handling and throughput capabilities of the laser processing system. Therefore, the presentation will also elaborate on new developments in the design of complete production machines.
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-01-01
Background In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only ± 7%. The third method, 'biomass-specific replication', enabled the generation of equal initial biomass concentrations in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. Conclusion The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization. PMID:19646274
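The 'biomass-specific induction' idea can be pictured as a simple control loop: poll an on-line biomass signal per well and trigger the robot to add inducer once a threshold is crossed. In the sketch below, read_biomass() and add_inducer() are hypothetical stand-ins for the BioLector and robot interfaces, not real vendor APIs.

```python
# Minimal control-loop sketch of biomass-triggered induction (assumptions only).
import time

def biomass_specific_induction(wells, threshold, read_biomass, add_inducer,
                               poll_s=300):
    induced = set()
    while len(induced) < len(wells):
        for well in wells:
            if well in induced:
                continue
            if read_biomass(well) >= threshold:   # scattered-light proxy for biomass
                add_inducer(well)                 # e.g. dispense IPTG into this well
                induced.add(well)
        time.sleep(poll_s)
    return induced

# toy usage with a fake, steadily rising signal
signal = {"A1": 0.0, "A2": 0.0}

def read_biomass(well):
    signal[well] += 0.4        # pretend growth between polls
    return signal[well]

def add_inducer(well):
    print("inducing", well)

biomass_specific_induction(["A1", "A2"], threshold=1.0,
                           read_biomass=read_biomass, add_inducer=add_inducer,
                           poll_s=0)
```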
Wen, X.; Datta, A.; Traverso, L. M.; Pan, L.; Xu, X.; Moon, E. E.
2015-01-01
Optical lithography, the enabling process for defining features, has been widely used in the semiconductor industry and many other nanotechnology applications. Advances in nanotechnology require the development of high-throughput optical lithography capabilities to overcome the optical diffraction limit and meet ever-decreasing device dimensions. We report our recent experimental advances in scaling up diffraction-unlimited optical lithography on a massive scale using the near-field nanolithography capabilities of bowtie apertures. A record number of near-field optical elements, an array of 1,024 bowtie antenna apertures, are simultaneously employed to generate a large number of patterns by carefully controlling their working distances over the entire array using an optical gap metrology system. Our experimental results reiterate the ability of massively parallel near-field devices to achieve high-throughput optical nanolithography, which can be promising for many important nanotechnology applications such as computation, data storage, communication, and energy. PMID:26525906
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
High-speed zero-copy data transfer for DAQ applications
NASA Astrophysics Data System (ADS)
Pisani, Flavio; Cámpora Pérez, Daniel Hugo; Neufeld, Niko
2015-05-01
The LHCb Data Acquisition (DAQ) will be upgraded in 2020 to a trigger-free readout. In order to achieve this goal we will need to connect around 500 nodes with a total network capacity of 32 Tb/s. To achieve such a high network capacity we are testing zero-copy technology in order to maximize the usable link throughput without adding excessive CPU and memory bandwidth overhead, leaving resources free for data processing and thereby reducing the power, space and cost required for the same result. We developed a modular test application which can be used with different transport layers. For the zero-copy implementation we chose the OFED IBVerbs API because it provides low-level access and high throughput. We present throughput and CPU usage measurements of 40 GbE solutions using Remote Direct Memory Access (RDMA), for several network configurations, to test the scalability of the system.
High-throughput sequencing: a failure mode analysis.
Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A
2005-01-04
Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.
Lo Cicero, Alessandra; Jaskowiak, Anne-Laure; Egesipe, Anne-Laure; Tournois, Johana; Brinon, Benjamin; Pitrez, Patricia R.; Ferreira, Lino; de Sandre-Giovannoli, Annachiara; Levy, Nicolas; Nissan, Xavier
2016-01-01
Hutchinson-Gilford progeria syndrome (HGPS) is a rare fatal genetic disorder that causes systemic accelerated aging in children. Thanks to the pluripotency and self-renewal properties of induced pluripotent stem cells (iPSC), HGPS iPSC-based modeling opens up the possibility of access to different relevant cell types for pharmacological approaches. In this study, 2800 small molecules were explored using high-throughput screening, looking for compounds that could potentially reduce the alkaline phosphatase activity of HGPS mesenchymal stem cells (MSCs) committed into osteogenic differentiation. The results revealed seven compounds that normalized the osteogenic differentiation process; among these, all-trans retinoic acid and 13-cis-retinoic acid also decreased progerin expression. This study highlights the potential of high-throughput drug screening using HGPS iPS-derived cells, in order to find therapeutic compounds for HGPS and, potentially, for other aging-related disorders. PMID:27739443
Lo Cicero, Alessandra; Jaskowiak, Anne-Laure; Egesipe, Anne-Laure; Tournois, Johana; Brinon, Benjamin; Pitrez, Patricia R; Ferreira, Lino; de Sandre-Giovannoli, Annachiara; Levy, Nicolas; Nissan, Xavier
2016-10-14
Hutchinson-Gilford progeria syndrome (HGPS) is a rare fatal genetic disorder that causes systemic accelerated aging in children. Thanks to the pluripotency and self-renewal properties of induced pluripotent stem cells (iPSC), HGPS iPSC-based modeling opens up the possibility of access to different relevant cell types for pharmacological approaches. In this study, 2800 small molecules were explored using high-throughput screening, looking for compounds that could potentially reduce the alkaline phosphatase activity of HGPS mesenchymal stem cells (MSCs) committed into osteogenic differentiation. The results revealed seven compounds that normalized the osteogenic differentiation process; among these, all-trans retinoic acid and 13-cis-retinoic acid also decreased progerin expression. This study highlights the potential of high-throughput drug screening using HGPS iPS-derived cells, in order to find therapeutic compounds for HGPS and, potentially, for other aging-related disorders.
Development and Application of a High Throughput Protein Unfolding Kinetic Assay
Wang, Qiang; Waterhouse, Nicklas; Feyijinmi, Olusegun; Dominguez, Matthew J.; Martinez, Lisa M.; Sharp, Zoey; Service, Rachel; Bothe, Jameson R.; Stollar, Elliott J.
2016-01-01
The kinetics of folding and unfolding underlie protein stability, and quantification of these rates provides important insights into the folding process. Here, we present a simple high-throughput protein unfolding kinetic assay using a plate reader that is applicable to studies of the majority of 2-state folding proteins. We validate the assay by measuring kinetic unfolding data for the SH3 (Src Homology 3) domain from Actin Binding Protein 1 (AbpSH3) and its stabilized mutants. The results of our approach are in excellent agreement with published values. We further combine our kinetic assay with a plate reader equilibrium assay to obtain indirect estimates of folding rates, and use these approaches to characterize an AbpSH3-peptide hybrid. Our high-throughput protein unfolding kinetic assays allow accurate screening of libraries of mutants by providing both kinetic and equilibrium measurements and provide a means for in-depth ϕ-value analyses. PMID:26745729
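In practice, the unfolding rate of a 2-state protein is commonly obtained by fitting a single-exponential decay to the post-mixing signal; the sketch below does this on synthetic plate-reader-like data (the trace and rate constant are invented for illustration, not the authors' scripts).

```python
# Fit a single-exponential unfolding trace to extract the rate constant k_u.
import numpy as np
from scipy.optimize import curve_fit

def single_exp(t, amplitude, k_u, offset):
    return amplitude * np.exp(-k_u * t) + offset

# synthetic trace: k_u = 0.05 s^-1 plus measurement noise
t = np.linspace(0, 120, 200)
y = single_exp(t, 1.0, 0.05, 0.2) + np.random.default_rng(3).normal(0, 0.01, t.size)

popt, pcov = curve_fit(single_exp, t, y, p0=(1.0, 0.1, 0.0))
print("fitted k_u = %.3f s^-1" % popt[1])
```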
Spotsizer: High-throughput quantitative analysis of microbial growth.
Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg
2016-10-01
Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
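A generic colony-area measurement, offered only as an illustration of what such tools compute and not as Spotsizer's implementation, can be as simple as thresholding, labeling connected spots and counting pixels per spot; the fixed threshold and 8-bit input are assumptions.

```python
# Threshold an arrayed-plate image, label connected colonies, report areas (px).
import numpy as np
from scipy import ndimage

def colony_areas(gray, threshold=60):
    mask = gray > threshold                    # colonies brighter than background
    labels, n = ndimage.label(mask)
    areas = np.bincount(labels.ravel())[1:]    # skip background label 0
    return areas

# toy plate: two square "colonies" on a dark background
img = np.zeros((50, 50), dtype=np.uint8)
img[5:10, 5:10] = 200
img[30:40, 30:40] = 180
print(colony_areas(img))    # [ 25 100 ]
```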
Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J
2017-07-14
In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench scale and the clinical manufacturing scale. Further, all measured product quality attributes are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), the comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Real-time traffic sign detection and recognition
NASA Astrophysics Data System (ADS)
Herbschleb, Ernst; de With, Peter H. N.
2009-01-01
The continuous growth of imaging databases increasingly requires analysis tools for the extraction of features. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images at resolutions up to 4,800×2,400 pixels. Because of the size of the database, high reliability as well as high throughput is required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining both color and specific spatial information. The first stage contains an area-limitation step which is performance-critical for both the detection rate and the overall processing time. The second stage locates candidate traffic signs using recently published feature processing. The third stage contains a validation step to enhance the reliability of the algorithm; during this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput for line-of-sight images of 800×600 pixels is 35 Hz and for panorama images it is 4 Hz. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.
Perspectives on Validation of High-Throughput Assays Supporting 21st Century Toxicity Testing
Judson, Richard; Kavlock, Robert; Martin, Matt; Reif, David; Houck, Keith; Knudsen, Thomas; Richard, Ann; Tice, Raymond R.; Whelan, Maurice; Xia, Menghang; Huang, Ruili; Austin, Christopher; Daston, George; Hartung, Thomas; Fowle, John R.; Wooge, William; Tong, Weida; Dix, David
2014-01-01
Summary In vitro high-throughput screening (HTS) assays are seeing increasing use in toxicity testing. HTS assays can simultaneously test many chemicals, but have seen limited use in the regulatory arena, in part because of the need to undergo rigorous, time-consuming formal validation. Here we discuss streamlining the validation process, specifically for prioritization applications in which HTS assays are used to identify a high-concern subset of a collection of chemicals. The high-concern chemicals could then be tested sooner rather than later in standard guideline bioassays. The streamlined validation process would continue to ensure the reliability and relevance of assays for this application. We discuss the following practical guidelines: (1) follow current validation practice to the extent possible and practical; (2) make increased use of reference compounds to better demonstrate assay reliability and relevance; (3) deemphasize the need for cross-laboratory testing; and (4) implement a web-based, transparent and expedited peer review process. PMID:23338806
Quijada, Narciso M; De Filippis, Francesca; Sanz, José Javier; García-Fernández, María Del Camino; Rodríguez-Lázaro, David; Ercolini, Danilo; Hernández, Marta
2018-04-01
"Chorizo de Léon" is a high-value Spanish dry fermented sausage traditionally manufactured without the use of starter cultures, owing to the activity of a house-specific autochthonous microbiota that naturally contaminates the meat from the environment, the equipment and the raw materials. Lactic acid bacteria (particularly Lactobacillus) and coagulase-negative cocci (mainly Staphylococcus) have been reported as the most important bacterial groups regarding the organoleptic and safety properties of the dry fermented sausages. In this study, samples from raw minced meat to final products were taken from five different producers and the microbial diversity was investigated by high-throughput sequencing of 16S rRNA gene amplicons. The diverse microbial composition observed during the first stages of "Chorizo de Léon" evolved during ripening to a microbiota mainly composed by Lactobacillus in the final product. Oligotyping performed on 16S rRNA gene sequences of Lactobacillus and Staphylococcus populations revealed sub-genus level diversity within the different manufacturers, likely responsible of the characteristic organoleptic properties of the products from different companies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Hiew, Tze Ning; Huang, Rongying; Popov, Ivan; Feldman, Yuri; Heng, Paul Wan Sia
2017-12-01
This study explored the potential of combining the use of moisture sorption isotherms and dielectric relaxation profiles of starch and sodium starch glycolate (SSG) to probe the location of moisture in dried and hydrated samples. Starch and SSG samples, dried and hydrated, were prepared. For hydrated samples, their moisture contents were determined. The samples were probed by dielectric spectroscopy using a frequency band of 0.1 Hz to 1 MHz to investigate their moisture-related relaxation profiles. The moisture sorption and desorption isotherms of starch and SSG were generated using a vapor sorption analyzer, and modeled using the Guggenheim-Anderson-de Boer equation. A clear high frequency relaxation process was detected in both dried and hydrated starches, while for dried starch, an additional slower low frequency process was also detected. The high frequency relaxation processes in hydrated and dried starches were assigned to the coupled starch-hydrated water relaxation. The low frequency relaxation in dried starch was attributed to the local chain motions of the starch backbone. No relaxation process associated with water was detected in both hydrated and dried SSG within the frequency and temperature range used in this study. The moisture sorption isotherms of SSG suggest the presence of high energy free water, which could have masked the relaxation process of the bound water during dielectric measurements. The combined study of moisture sorption isotherms and dielectric spectroscopy was shown to be beneficial and complementary in probing the effects of moisture on the relaxation processes of starch and SSG.
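For readers unfamiliar with the Guggenheim-Anderson-de Boer model mentioned above, the sketch below fits the GAB equation to an invented sorption data set; the data points and starting parameters are placeholders, not the study's measurements.

```python
# Fit the GAB sorption model M(aw) = M0*C*K*aw / ((1-K*aw)*(1-K*aw+C*K*aw)).
import numpy as np
from scipy.optimize import curve_fit

def gab(aw, m0, c, k):
    """Equilibrium moisture content at water activity aw (GAB equation)."""
    return m0 * c * k * aw / ((1 - k * aw) * (1 - k * aw + c * k * aw))

# invented isotherm data (g water / 100 g solids) for illustration only
aw = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8])
m = np.array([2.9, 4.3, 5.4, 6.5, 7.8, 9.5, 11.7, 15.0])

(m0, c, k), _ = curve_fit(gab, aw, m, p0=(5.0, 10.0, 0.8))
print(f"monolayer moisture M0 ~ {m0:.2f}, C ~ {c:.2f}, K ~ {k:.2f}")
```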
NASA Astrophysics Data System (ADS)
Muravev, Dmitri; Rakhmangulov, Aleksandr
2016-11-01
Currently, container shipping development is directly associated with an increase in warehouse areas for container storage. One of the most successful types of container terminal is an intermodal terminal called a dry port. The main pollution sources arising during the organization of intermodal transport are considered. A system of dry port parameters, recommended for evaluating different scenarios of seaport infrastructure development at the strategic planning stage, is proposed in this paper. The authors have developed a method for determining the optimal values of the main dry port parameters by simulation modeling in the AnyLogic software. The dependencies obtained from the modeling experiments confirm the adequacy of the selected dry port parameters for effectively evaluating scenarios of throughput and handling capacity at existing seaports at the strategic planning stage, and show that a rationally located dry port can improve the ecological situation in a port city.
NASA Technical Reports Server (NTRS)
Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.
2013-01-01
In this paper we discuss the high-throughput end-to-end post-fabrication processing of high-performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far-ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs), which are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).
Transcriptome Response Mediated by Cold Stress in Lotus japonicus.
Calzadilla, Pablo I; Maiale, Santiago J; Ruiz, Oscar A; Escaray, Francisco J
2016-01-01
Members of the Lotus genus are important as agricultural forage sources under marginal environmental conditions given their high nutritional value and tolerance of various abiotic stresses. However, their dry matter production is drastically reduced in cooler seasons, while their response to such conditions is not well studied. This paper analyzes cold acclimation of the genus by studying Lotus japonicus over a stress period of 24 h. High-throughput RNA sequencing was used to identify and classify 1077 differentially expressed genes, of which 713 were up-regulated and 364 were down-regulated. Up-regulated genes were principally related to lipid, cell wall, phenylpropanoid, sugar, and proline regulation, while down-regulated genes affected the photosynthetic process and chloroplast development. Together, a total of 41 cold-inducible transcription factors were identified, including members of the AP2/ERF, NAC, MYB, and WRKY families; two of them were described as putative novel transcription factors. Finally, DREB1/CBFs were described with respect to their cold stress expression profiles. This is the first transcriptome profiling of the model legume L. japonicus under cold stress. Data obtained may be useful in identifying candidate genes for breeding modified species of forage legumes that more readily acclimate to low temperatures.
NASA Astrophysics Data System (ADS)
Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.
2013-06-01
High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.
Integrative Systems Biology for Data Driven Knowledge Discovery
Greene, Casey S.; Troyanskaya, Olga G.
2015-01-01
Integrative systems biology is an approach that brings together diverse high throughput experiments and databases to gain new insights into biological processes or systems at molecular through physiological levels. These approaches rely on diverse high-throughput experimental techniques that generate heterogeneous data by assaying varying aspects of complex biological processes. Computational approaches are necessary to provide an integrative view of these experimental results and enable data-driven knowledge discovery. Hypotheses generated from these approaches can direct definitive molecular experiments in a cost effective manner. Using integrative systems biology approaches, we can leverage existing biological knowledge and large-scale data to improve our understanding of yet unknown components of a system of interest and how its malfunction leads to disease. PMID:21044756
ADMET in silico modelling: towards prediction paradise?
van de Waterbeemd, Han; Gifford, Eric
2003-03-01
Following studies in the late 1990s that indicated that poor pharmacokinetics and toxicity were important causes of costly late-stage failures in drug development, it has become widely appreciated that these areas should be considered as early as possible in the drug discovery process. However, in recent years, combinatorial chemistry and high-throughput screening have significantly increased the number of compounds for which early data on absorption, distribution, metabolism, excretion (ADME) and toxicity (T) are needed, which has in turn driven the development of a variety of medium and high-throughput in vitro ADMET screens. Here, we describe how in silico approaches will further increase our ability to predict and model the most relevant pharmacokinetic, metabolic and toxicity endpoints, thereby accelerating the drug discovery process.
High speed all-optical networks
NASA Technical Reports Server (NTRS)
Chlamtac, Imrich
1993-01-01
An inherent problem of conventional point-to-point WAN architectures is that they cannot translate optical transmission bandwidth into comparable user available throughput due to the limiting electronic processing speed of the switching nodes. This report presents the first solution to WDM based WAN networks that overcomes this limitation. The proposed Lightnet architecture takes into account the idiosyncrasies of WDM switching/transmission leading to an efficient and pragmatic solution. The Lightnet architecture trades the ample WDM bandwidth for a reduction in the number of processing stages and a simplification of each switching stage, leading to drastically increased effective network throughputs.
NASA Technical Reports Server (NTRS)
Pang, Jackson; Liddicoat, Albert; Ralston, Jesse; Pingree, Paula
2006-01-01
The current implementation of the Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays (TRIGA) is equipped with CFDP protocol and CCSDS Telemetry and Telecommand framing schemes to replace the CPU intensive software counterpart implementation for reliable deep space communication. We present the hardware/software co-design methodology used to accomplish high data rate throughput. The hardware CFDP protocol stack implementation is then compared against the two recent flight implementations. The results from our experiments show that TRIGA offers more than 3 orders of magnitude throughput improvement with less than one-tenth of the power consumption.
Payne, Andrew C; Andregg, Michael; Kemmish, Kent; Hamalainen, Mark; Bowell, Charlotte; Bleloch, Andrew; Klejwa, Nathan; Lehrach, Wolfgang; Schatz, Ken; Stark, Heather; Marblestone, Adam; Church, George; Own, Christopher S; Andregg, William
2013-01-01
We present "molecular threading", a surface independent tip-based method for stretching and depositing single and double-stranded DNA molecules. DNA is stretched into air at a liquid-air interface, and can be subsequently deposited onto a dry substrate isolated from solution. The design of an apparatus used for molecular threading is presented, and fluorescence and electron microscopies are used to characterize the angular distribution, straightness, and reproducibility of stretched DNA deposited in arrays onto elastomeric surfaces and thin membranes. Molecular threading demonstrates high straightness and uniformity over length scales from nanometers to micrometers, and represents an alternative to existing DNA deposition and linearization methods. These results point towards scalable and high-throughput precision manipulation of single-molecule polymers.
Localized analysis of paint-coat drying using dynamic speckle interferometry
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel
2018-07-01
Paint-coating is part of several industrial processes, including the automotive industry, architectural coatings, machinery and appliances. These paint-coatings must comply with high quality standards, and for this reason evaluation techniques for paint-coatings are in constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry; this technique allows for the temporal activity evaluation of the paint-coating drying process, providing localized information on drying. This localized information is relevant in order to address drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indicator of the different stages of the drying process. The experimental results presented were validated using gravimetric drying curves.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Shihui; Franden, Mary A; Yang, Qing
2018-04-04
The aim of this work was to identify inhibitors in pretreated lignocellulosic slurries, evaluate high-throughput screening strategies, and investigate the impact of inhibitors on potential hydrocarbon-producing microorganisms. Compounds present in slurries that could inhibit microbial growth were identified through a detailed analysis of saccharified slurries by applying a combination of approaches: high-performance liquid chromatography, GC-MS, LC-DAD-MS, and ICP-MS. Several high-throughput assays were then evaluated to generate toxicity profiles. Our results demonstrated that Bioscreen C was useful for analyzing bacterial toxicity but not for yeast. The AlamarBlue reduction assay can be a useful high-throughput assay for both bacterial and yeast strains as long as medium components do not interfere with fluorescence measurements. In addition, this work identified two major inhibitors (furfural and ammonium acetate) for three potential hydrocarbon-producing bacterial species, Escherichia coli, Cupriavidus necator, and Rhodococcus opacus PD630, which are also the primary inhibitors for ethanologens. This study strived to establish a pipeline to quantify inhibitory compounds in biomass slurries and high-throughput approaches to investigate the effect of inhibitors on microbial biocatalysts, which can be applied to various biomass slurries or hydrolyzates generated through different pretreatment and enzymatic hydrolysis processes, or to different microbial candidates.
Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.
Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai
2018-03-01
With a growing population and shrinking arable land, breeding has been considered an effective way to address the food crisis. As an important part of breeding, high-throughput phenotyping can accelerate the breeding process effectively. Light detection and ranging (LiDAR) is an active remote sensing technology that is capable of acquiring three-dimensional (3D) data accurately and has great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrates a LiDAR sensor, a high-resolution camera, a thermal camera and a hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data throughout the crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle and other parameters for plant biology and genomics analysis. In this paper, we describe the design, functions and testing results of the Crop 3D platform, and briefly discuss the potential applications and future development of the platform in phenotyping. We conclude that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of high-throughput crop phenotyping.
Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian
2005-01-01
This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead-time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid-handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.
Optimizing SIEM Throughput on the Cloud Using Parallelization
Alam, Masoom; Ihsan, Asif; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, M Khurram; Farooq, Sajid
2016-01-01
Processing large amounts of data in real time for identifying security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of a security framework, OSTROM, built on the Esper complex event processing (CEP) engine under parallel and non-parallel computational frameworks. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of incoming events rather than the queries being processed. The architecture in which one-quarter of the total events is submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory and CPU usage. PMID:27851762
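A minimal sketch of the partitioning idea described above, assuming hypothetical event and query names (OSTROM itself is built on the Java-based Esper engine; this Python fragment only illustrates the best-performing configuration in which each of four workers receives roughly one-quarter of the event stream and evaluates the full query set):

```python
# Illustrative sketch (not the authors' OSTROM/Esper code): round-robin
# partitioning of an event stream across 4 workers, each of which runs
# the full query set, mirroring the best-performing configuration above.
from multiprocessing import Process, Queue

N_WORKERS = 4  # each worker receives ~1/4 of the events

def run_all_queries(event):
    """Placeholder for evaluating every CEP query against one event."""
    return [q for q in ("failed_login_burst", "port_scan") if q in event.get("tags", [])]

def worker(inbox: Queue):
    while True:
        event = inbox.get()
        if event is None:          # poison pill -> shut down
            break
        run_all_queries(event)     # every unit processes all queries

def main(events):
    queues = [Queue() for _ in range(N_WORKERS)]
    procs = [Process(target=worker, args=(q,)) for q in queues]
    for p in procs:
        p.start()
    for i, event in enumerate(events):          # 1/4 of events per instance
        queues[i % N_WORKERS].put(event)
    for q in queues:
        q.put(None)
    for p in procs:
        p.join()

if __name__ == "__main__":
    main([{"tags": ["failed_login_burst"]} for _ in range(1000)])
```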
Inertial-ordering-assisted droplet microfluidics for high-throughput single-cell RNA-sequencing.
Moon, Hui-Sung; Je, Kwanghwi; Min, Jae-Woong; Park, Donghyun; Han, Kyung-Yeon; Shin, Seung-Ho; Park, Woong-Yang; Yoo, Chang Eun; Kim, Shin-Hyun
2018-02-27
Single-cell RNA-seq reveals the cellular heterogeneity inherent in a population of cells, which is very important in many clinical and research applications. Recent advances in droplet microfluidics have achieved the automatic isolation, lysis, and labeling of single cells in droplet compartments without complex instrumentation. However, barcoding errors that occur during cell encapsulation because of multiple beads in a droplet, and the insufficient throughput that results from keeping bead concentrations low to avoid them, remain important challenges for precise and efficient expression profiling of single cells. In this study, we developed a new droplet-based microfluidic platform that significantly improved throughput while reducing barcoding errors through deterministic encapsulation of inertially ordered beads. Highly concentrated beads containing oligonucleotide barcodes were spontaneously ordered in a spiral channel by an inertial effect and were in turn encapsulated in droplets one by one, while cells were simultaneously encapsulated in the droplets. The deterministic encapsulation of beads resulted in a high fraction of single-bead droplets and rare multiple-bead droplets even though the bead concentration was increased to 1000 μl⁻¹, which diminished barcoding errors and enabled accurate high-throughput barcoding. We successfully validated our device with single-cell RNA-seq. In addition, we found that multiple-bead droplets, generated using a normal Drop-Seq device with a high concentration of beads, underestimated transcript numbers and overestimated cell numbers. This accurate high-throughput platform can expand the capability and practicality of Drop-Seq in single-cell analysis.
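The throughput/error trade-off the authors address can be seen from simple Poisson loading statistics; the occupancy values below are illustrative assumptions, not figures from the paper:

```python
# Back-of-the-envelope sketch (values are illustrative assumptions, not from
# the paper): with random Poisson loading, raising the mean bead occupancy
# per droplet (lambda) to boost throughput quickly inflates the fraction of
# multi-bead droplets -- the barcoding-error source that inertial ordering avoids.
from math import exp, factorial

def poisson(k, lam):
    return lam**k * exp(-lam) / factorial(k)

for lam in (0.1, 0.5, 1.0):
    p_single = poisson(1, lam)
    p_multi = 1.0 - poisson(0, lam) - p_single
    print(f"lambda={lam:.1f}: single-bead {p_single:.3f}, multi-bead {p_multi:.3f}")
```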
Sensitive high-throughput screening for the detection of reducing sugars.
Mellitzer, Andrea; Glieder, Anton; Weis, Roland; Reisinger, Christoph; Flicker, Karlheinz
2012-01-01
The exploitation of renewable resources for the production of biofuels relies on efficient processes for the enzymatic hydrolysis of lignocellulosic materials. The development of enzymes and strains for these processes requires reliable and fast activity-based screening assays. Additionally, these assays are also required to operate on the microscale and on the high-throughput level. Herein, we report the development of a highly sensitive reducing-sugar assay in a 96-well microplate screening format. The assay is based on the formation of osazones from reducing sugars and para-hydroxybenzoic acid hydrazide. By using this sensitive assay, the enzyme loads and conversion times during lignocellulose hydrolysis can be reduced, thus allowing higher throughput. The assay is about five times more sensitive than the widely applied dinitrosalicylic acid based assay and can reliably detect reducing sugars down to 10 μM. The assay-specific variation over one microplate was determined for three different lignocellulolytic enzymes and ranges from 2 to 8%. Furthermore, the assay was combined with a microscale cultivation procedure for the activity-based screening of Pichia pastoris strains expressing functional Thermomyces lanuginosus xylanase A, Trichoderma reesei β-mannanase, or T. reesei cellobiohydrolase 2. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Xiao, Xiaodong; Douthwaite, Julie A; Chen, Yan; Kemp, Ben; Kidd, Sara; Percival-Alwyn, Jennifer; Smith, Alison; Goode, Kate; Swerdlow, Bonnie; Lowe, David; Wu, Herren; Dall'Acqua, William F; Chowdhury, Partha S
Phage display antibody libraries are a rich resource for the discovery of potential therapeutic antibodies. Single-chain variable fragment (scFv) libraries are the most common format due to the efficient display of scFv by phage particles and the ease with which soluble scFv antibodies can be expressed for high-throughput screening. Typically, a cascade of screening and triaging activities is performed, beginning with the assessment of large numbers of E. coli-expressed scFv, and progressing through additional assays with individual reformatting of the most promising scFv to full-length IgG. However, the use of high-throughput screening of scFv for the discovery of full-length IgG is not ideal because of the differences between these molecules. Furthermore, the reformatting step represents a bottleneck in the process because each antibody has to be handled individually to preserve the unique VH and VL pairing. These problems could be resolved if populations of scFv could be reformatted to full-length IgG before screening without disrupting the variable region pairing. Here, we describe a novel strategy that allows the reformatting of diverse populations of scFv from phage selections to full-length IgG in a batch format. The reformatting process maintains the diversity and variable region pairing with high fidelity, and the resulting IgG pool enables high-throughput expression of IgG in mammalian cells and cell-based functional screening. The improved process led to the discovery of potent candidates that are comparable to or better than those obtained by traditional methods. This strategy should also be readily applicable to Fab-based phage libraries. Our approach, Screening in Product Format (SiPF), represents a substantial improvement in the field of antibody discovery using phage display.
High speed micromachining with high power UV laser
NASA Astrophysics Data System (ADS)
Patel, Rajesh S.; Bovatsek, James M.
2013-03-01
Increasing demand for creating fine features with high accuracy in the manufacturing of electronic mobile devices has fueled growth for lasers in manufacturing. High-power, high-repetition-rate ultraviolet (UV) lasers provide an opportunity to implement a cost-effective, high-quality, high-throughput micromachining process in a 24/7 manufacturing environment. The energy available per pulse and the pulse repetition frequency (PRF) of diode-pumped solid-state (DPSS) nanosecond UV lasers have increased steadily over the years. Efficient use of the available energy from a laser is important to generate accurate fine features at high speed with high quality. To achieve maximum material removal and minimal thermal damage for any laser micromachining application, use of optimal process parameters, including energy density or fluence (J/cm²), pulse width, and repetition rate, is important. In this study we present the new high-power, high-PRF Quasar® 355-40 laser from Spectra-Physics with TimeShift™ technology for unique software-adjustable pulse width, pulse splitting, and pulse shaping capabilities. The benefits of these features for micromachining include improved throughput and quality. A specific example and results of silicon scribing are described to demonstrate the processing benefits of the Quasar's available power, PRF, and TimeShift technology.
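For reference, the per-pulse fluence quoted above follows from the average power, the PRF, and the focused spot area; these are generic relations (assuming a circular spot of radius w), not Quasar specifications:

```latex
E_{p} = \frac{P_{\mathrm{avg}}}{f_{\mathrm{PRF}}},\qquad
F = \frac{E_{p}}{A_{\mathrm{spot}}} = \frac{P_{\mathrm{avg}}}{f_{\mathrm{PRF}}\,\pi w^{2}}
```

At fixed average power, raising the PRF therefore lowers the per-pulse fluence, which is the trade-off that software-adjustable pulse splitting and shaping are meant to manage.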
Development of automated control system for wood drying
NASA Astrophysics Data System (ADS)
Sereda, T. G.; Kostarev, S. N.
2018-05-01
The article considers the parameters of convective wood drying, which allows changing the characteristics of the drying air at different stages: humidity, temperature, and the speed and direction of air movement. Despite the prevalence of this type of drying equipment, its main drawbacks are the high temperature and humidity, which negatively affect the working conditions of maintenance personnel when they enter the drying chambers. This makes automation of the wood drying process necessary. The synthesis of a finite-state machine control of the wood drying process is implemented on an Omron programmable logic device.
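A hypothetical sketch of the kind of finite-state drying control described above; the states, setpoints and thresholds below are illustrative assumptions (the abstract does not specify them), and an actual Omron PLC implementation would use ladder logic or structured text rather than Python:

```python
# Hypothetical finite-state drying schedule: each state fixes air temperature,
# humidity and fan direction; transitions are driven by sensor readings.

STATES = ("HEAT_UP", "DRYING", "CONDITIONING", "DONE")

def next_state(state, temp_c, moisture_pct):
    if state == "HEAT_UP" and temp_c >= 55:
        return "DRYING"
    if state == "DRYING" and moisture_pct <= 12:
        return "CONDITIONING"
    if state == "CONDITIONING" and moisture_pct <= 10:
        return "DONE"
    return state

def setpoints(state):
    # (air temperature C, relative humidity %, fan direction) per state
    return {
        "HEAT_UP":      (55, 80, "forward"),
        "DRYING":       (70, 45, "alternating"),
        "CONDITIONING": (60, 65, "forward"),
        "DONE":         (20, 60, "off"),
    }[state]

state = "HEAT_UP"
for temp_c, moisture_pct in [(40, 60), (57, 45), (68, 20), (70, 11), (60, 9)]:
    state = next_state(state, temp_c, moisture_pct)
    print(state, setpoints(state))
```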
DEVELOPMENT OF DEWATERING AIDS FOR MINERALS AND COAL FINES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roe-Hoam Yoon; Ramazan Asmatulu; Ismail Yildirim
2004-07-01
MCT has developed a suite of novel dewatering chemicals (or aids) that are designed to decrease the capillary pressure of the water trapped in a filter cake by (1) decreasing the surface tension of water, (2) increasing the contact angles of the particles to be dewatered, and (3) causing the particles to coagulate, all at the same time. The decrease in capillary pressure in turn causes an increase in the rate of filtration, an increase in throughput, and a decrease in the pressure drop required for filtration. The reagents are frequently used as blends of different chemicals in order to bring about changes in all of the process variables noted above. The minerals and coal samples tested in the present work included copper sulfide, lead sulfide, zinc sulfide, kaolin clay, talc, and silica. The laboratory-scale test work included studies of reagent types, drying cycle times, cake thickness, slurry temperature, conditioning intensity and time, solids content, and reagent dosages. To better understand the mechanisms involved, fundamental studies were also conducted. These included measurements of the contact angles of the particles to be dewatered (which are measures of particle hydrophobicity) and the surface tensions of the filtrates produced from dewatering tests. The results of the laboratory-scale filtration experiments showed that the use of the novel dewatering aids can reduce the moisture of the filter cake by 30 to 50% relative to what can be achieved using no dewatering aids. In many cases, such high levels of moisture reduction are sufficient to obviate the need for thermal drying, which is costly and energy intensive. Furthermore, the use of the novel dewatering aids causes a substantial increase in the kinetics of dewatering, which in turn results in increased throughput. As a result of these technological advantages, the novel dewatering aids have been licensed to Nalco, one of the largest mining chemicals companies in the world. At least one mineral company is currently using the technology in full-scale plant operation, which has resulted in the shutdown of a thermal dryer.
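The reasoning above maps directly onto the Young-Laplace expression for the capillary pressure holding water in a cake pore of effective radius r (a standard relation, not one taken from the report):

```latex
P_{c} = \frac{2\,\gamma \cos\theta}{r}
```

Lowering the surface tension γ, raising the particle contact angle θ toward 90°, and coagulating particles so that the effective pore radius r grows all reduce P_c, which is why the blended reagents act on the three variables simultaneously.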
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Hongqiang; Westover, Tyler L.; Cherry, Robert
Naturally occurring and introduced inorganic species (ash) in biomass feedstocks negatively impact thermochemical energy conversion processes such as pyrolysis, hydrothermal liquefaction, gasification and combustion to biopower. As such, it is desirable to better understand the cost:benefit ratios of various ash reduction processes. Here, a novel process simulation model was developed using AspenPlus to reduce the ash content of loblolly logging residues using both air classification and a dilute-acid leaching process. For costing purposes, a throughput of 25 tons/hour was selected. At this scale, the process cost for a standalone air classification process was $3 per ton of biomass feedstock. Ash reduction via dilute-acid leaching was simulated based on experimentally determined kinetics of ion diffusion at an acid concentration of 0.5% H2SO4 and a temperature of 75°F. The total estimated processing cost for leaching at these conditions was approximately $14/ton of dry biomass. Sensitivity analysis of three parameters on mineral reduction in the leaching process revealed that increasing the leaching temperature was not economically feasible, while it was viable to apply a longer retention time in leaching for higher ash removal or to achieve a lower water content in the final products with reasonable extra cost. In addition, scenarios combining air classification with leaching were examined. A whole-process cost of approximately $16/ton of biomass was estimated at a feedstock rate of 25 tons/hour, with 9% of the biomass classified as light fraction to be leached. The leaching operating costs constituted 75% of this amount, of which the dryer heating costs were 44%. This suggests that the process costs would be substantially reduced if more efficient drying methods are applied in the future.
Bioprinting towards Physiologically Relevant Tissue Models for Pharmaceutics.
Peng, Weijie; Unutmaz, Derya; Ozbolat, Ibrahim T
2016-09-01
Improving the ability to predict the efficacy and toxicity of drug candidates earlier in the drug discovery process will speed up the introduction of new drugs into clinics. 3D in vitro systems have significantly advanced the drug screening process as 3D tissue models can closely mimic native tissues and, in some cases, the physiological response to drugs. Among various in vitro systems, bioprinting is a highly promising technology possessing several advantages such as tailored microarchitecture, high-throughput capability, coculture ability, and low risk of cross-contamination. In this opinion article, we discuss the currently available tissue models in pharmaceutics along with their limitations and highlight the possibilities of bioprinting physiologically relevant tissue models, which hold great potential in drug testing, high-throughput screening, and disease modeling. Copyright © 2016 Elsevier Ltd. All rights reserved.
High-temperature drying of 7/4 yellow-poplar flitches for S-D-R studs
R. Sidney Boone; Robert R. Maeglin
1980-01-01
Yellow-poplar was dried as 7/4 flitches at high temperatures and subsequently ripped into studs to meet National Grading Rule Standards for STUD grade. The effects of growth stresses in these flitches from smaller logs appear to be minimized by this process. Dry bulb temperatures from 235° to 295° F were explored in five drying trials. Best results were by drying for...
Optima MDxt: A high throughput 335 keV mid-dose implanter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eisner, Edward; David, Jonathan; Justesen, Perry
2012-11-06
The continuing demand for both energy purity and implant angle control along with high wafer throughput drove the development of the Axcelis Optima MDxt mid-dose ion implanter. The system utilizes electrostatic scanning, an electrostatic parallelizing lens and an electrostatic energy filter to produce energetically pure beams with high angular integrity. Based on field-proven components, the Optima MDxt beamline architecture offers the high beam currents possible with singly charged species, including arsenic at energies up to 335 keV, as well as large currents from multiply charged species at energies extending over 1 MeV. Conversely, the excellent energy filtering capability allows high currents at low beam energies, since it is safe to utilize large deceleration ratios. This beamline is coupled with the >500 WPH capable endstation technology used on the Axcelis Optima XEx high-energy ion implanter. The endstation includes in-situ angle measurements of the beam in order to maintain excellent beam-to-wafer implant angle control in both the horizontal and vertical directions. The Optima platform control system provides a new-generation dose control system that assures excellent dosimetry and charge control. This paper describes the features and technologies that allow the Optima MDxt to provide superior process performance at the highest wafer throughput, and provides examples of the process performance achievable.
Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A
2017-08-01
Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of <12%. Using isobaric tags for relative and absolute quantitation (iTRAQ) 4-plex, >4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.
Vinícius de Melo, Gilberto
2018-01-01
Coffee bean fermentation is a spontaneous, on-farm process involving the action of different microbial groups, including bacteria and fungi. In this study, a high-throughput sequencing approach was employed to study the diversity and dynamics of bacteria associated with Brazilian coffee bean fermentation. The total DNA from fermenting coffee samples was extracted at different time points, and the 16S rRNA gene with segments around the V4 variable region was sequenced on the Illumina high-throughput platform. Using this approach, the presence of over eighty bacterial genera was determined, many of which have been detected for the first time during coffee bean fermentation, including Fructobacillus, Pseudonocardia, Pedobacter, Sphingomonas and Hymenobacter. The presence of Fructobacillus suggests an influence of these bacteria on fructose metabolism during coffee fermentation. Temporal analysis showed a strong dominance of lactic acid bacteria, with over 97% of read sequences at the end of fermentation, mainly represented by Leuconostoc and Lactococcus. The metabolism of lactic acid bacteria was associated with the high formation of lactic acid during fermentation, as determined by HPLC analysis. The results reported in this study confirm the underestimation of bacterial diversity associated with coffee fermentation. New microbial groups reported in this study may be explored as functional starter cultures for on-farm coffee processing.
Mass spectrometry-driven drug discovery for development of herbal medicine.
Zhang, Aihua; Sun, Hui; Wang, Xijun
2018-05-01
Herbal medicine (HM) has made a major contribution to the drug discovery process with regard to identifying natural product compounds. Currently, more attention is focused on drug discovery from natural compounds of HM. Despite the rapid advancement of modern analytical techniques, drug discovery is still a difficult and lengthy process. Fortunately, mass spectrometry (MS) can provide useful structural information for drug discovery, and has been recognized as a sensitive, rapid, and high-throughput technology for advancing drug discovery from HM in the post-genomic era. It is essential to develop an efficient, high-quality, high-throughput screening method integrated with an MS platform for early screening of candidate drug molecules from natural products. We have developed a new chinmedomics strategy reliant on MS that is capable of capturing candidate molecules and facilitating the identification of novel chemical structures in the early phase; chinmedomics-guided natural product discovery based on MS may provide an effective tool that addresses challenges in the early screening of effective constituents of herbs against disease. This critical review covers the use of MS with related techniques and methodologies for natural product discovery, biomarker identification, and determination of mechanisms of action. It also highlights high-throughput chinmedomics screening methods suitable for lead compound discovery, illustrated by recent successes. © 2016 Wiley Periodicals, Inc.
High speed all optical networks
NASA Technical Reports Server (NTRS)
Chlamtac, Imrich; Ganz, Aura
1990-01-01
An inherent problem of conventional point-to-point wide area network (WAN) architectures is that they cannot translate optical transmission bandwidth into comparable user available throughput due to the limiting electronic processing speed of the switching nodes. The first solution to wavelength division multiplexing (WDM) based WAN networks that overcomes this limitation is presented. The proposed Lightnet architecture takes into account the idiosyncrasies of WDM switching/transmission leading to an efficient and pragmatic solution. The Lightnet architecture trades the ample WDM bandwidth for a reduction in the number of processing stages and a simplification of each switching stage, leading to drastically increased effective network throughputs. The principle of the Lightnet architecture is the construction and use of virtual topology networks, embedded in the original network in the wavelength domain. For this construction Lightnets utilize the new concept of lightpaths which constitute the links of the virtual topology. Lightpaths are all-optical, multihop, paths in the network that allow data to be switched through intermediate nodes using high throughput passive optical switches. The use of the virtual topologies and the associated switching design introduce a number of new ideas, which are discussed in detail.
Quality assessment of dried okara as a source of production of gluten-free flour.
Ostermann-Porcel, María V; Rinaldoni, Ana N; Rodriguez-Furlán, Laura T; Campderrós, Mercedes E
2017-07-01
Okara is a by-product of soymilk and tofu production that is rich in protein, fiber and vegetable oils, making it a potential source of gluten-free flour. In order to take advantage of the nutritional characteristics of okara and to determine an appropriate drying methodology, microwave, rotary-dryer and freeze-drying were assessed. Furthermore, flour with an enzymatic treatment was characterized, as were its functional, physicochemical, and textural properties. The results showed that the physicochemical characteristics of the flour were affected by the drying process, reaching adequate water content and high protein and fiber content. The freeze-drying process produced clearer flours with a porous structure, high water absorption capacity, and higher protein denaturation. Okara dried by microwave and rotary dryer exhibited a denser structure with similar functional properties and improved textural characteristics such as firmness and consistency. The microwave-produced flour was darker due to non-enzymatic browning reactions. The enzymatic treatment employed improved the consistency of the flour. It was thus possible to choose the drying process to be applied according to the intended use of the flour, so as to preserve the favorable nutritional aspects of okara flour. Based on the results, it can be affirmed that the physicochemical properties and attributes of okara are influenced by the drying process employed. Okara dried by freeze-drying resulted in a better product because it had a low final moisture content and the highest whiteness index. The flour presented a porous structure with high solubility, which is an indicator of potential applications in food development. © 2016 Society of Chemical Industry.
Yu, Hong; Gao, Feng; Jiang, Liren; Ma, Shuoxin
2017-09-15
The aim was to develop scalable Whole Slide Imaging (sWSI), a whole-slide imaging system based on mainstream smartphones coupled with regular optical microscopes. This ultra-low-cost solution should offer diagnostic-ready imaging quality on par with standalone scanners, support both oil and dry objective lenses of different magnifications, and achieve reasonably high throughput. These performance metrics should be evaluated by expert pathologists and match those of high-end scanners. In the sWSI design, the digitization process is split asynchronously between lightweight clients on smartphones and powerful cloud servers. The client apps automatically capture FoVs at up to 12-megapixel resolution and process them in real time to track the operation of users, then give instant feedback and guidance. The servers first restitch each pair of FoVs, then automatically correct the unknown nonlinear distortion introduced by the smartphone lens on the fly, based on pair-wise stitching, before finally combining all FoVs into one gigapixel virtual slide (VS) for each scan. These VSs can be viewed using Internet browsers anywhere. In the evaluation experiment, 100 frozen-section slides from patients randomly selected among in-patients of the participating hospital were scanned by both a high-end Leica scanner and sWSI. All VSs were examined by senior pathologists, whose diagnoses were compared against those made using optical microscopy as ground truth to evaluate the image quality. The sWSI system is developed for both Android and iPhone smartphones and is currently being offered to the public. The image quality is reliable and throughput is approximately 1 FoV per second, so digitizing a 15-by-15 mm slide under a 20X objective lens takes approximately 30-35 minutes, with little training required for the operator. The expected setup cost is approximately US $100 and scanning each slide costs between US $1 and $10, making sWSI highly cost-effective for infrequent or low-throughput usage. In the clinical evaluation of sample-wise diagnostic reliability, average accuracy scores achieved by sWSI-scan-based diagnoses were as follows: 0.78 for breast, 0.88 for uterine corpus, 0.68 for thyroid, and 0.50 for lung samples. The respective low-sensitivity rates were 0.05, 0.05, 0.13, and 0.25, while the respective low-specificity rates were 0.18, 0.08, 0.20, and 0.25. The participating pathologists agreed that the overall quality of sWSI was generally on par with that produced by high-end scanners and did not affect diagnosis in most cases. Pathologists confirmed that sWSI is reliable enough for standard diagnoses of most tissue categories, while it can be used for quick screening of difficult cases. As an ultra-low-cost alternative to whole-slide scanners, the sWSI solution achieves diagnosis-ready virtual slide quality and robustness suitable for commercial usage. Operated on mainstream smartphones mounted on standard optical microscopes, sWSI readily offers affordable and reliable WSI to resource-limited or infrequent clinical users. ©Hong Yu, Feng Gao, Liren Jiang, Shuoxin Ma.
Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 15.09.2017.
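A quick consistency check on the reported scan time; the field-of-view size and stitching overlap below are assumptions (the paper does not state them), chosen only to show that ~1 FoV/s is compatible with 30-35 minutes for a 15 × 15 mm area:

```python
# Rough consistency check (FoV size and overlap are assumptions, not values
# reported by the authors): does ~1 FoV/s plausibly give 30-35 min for a
# 15 x 15 mm slide under a 20X objective?
slide_w_mm, slide_h_mm = 15.0, 15.0
fov_w_mm, fov_h_mm = 0.45, 0.34   # assumed smartphone-camera FoV at 20X
overlap = 0.15                    # assumed stitching overlap per edge

step_w = fov_w_mm * (1 - overlap)
step_h = fov_h_mm * (1 - overlap)
n_fovs = -(-slide_w_mm // step_w) * -(-slide_h_mm // step_h)  # ceiling division per axis
minutes = n_fovs / 60.0           # at ~1 FoV per second
print(f"{int(n_fovs)} FoVs at ~1 FoV/s -> about {minutes:.0f} minutes")
```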
Advanced Drying Process for Lower Manufacturing Cost of Electrodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmad, Iftikhar; Zhang, Pu
For this Vehicle Technologies Incubator/Energy Storage R&D topic, Lambda Technologies teamed with Navitas Systems and proposed a new advanced drying process that promised a 5X reduction in electrode drying time and a significant reduction in the cost of large-format lithium batteries used in PEVs. The operating principle of the proposed process is to use a penetrating radiant energy source, Variable Frequency Microwaves (VFM), which is selectively absorbed by the polar water or solvent molecules instantly throughout the entire volume of the electrode. The solvent molecules are thus driven out of the electrode thickness, making the process more efficient and much faster than the convective drying method. To evaluate the Advanced Drying Process (ADP), a hybrid prototype system utilizing VFM and hot air flow was designed and fabricated. While VFM drives the solvent out of the electrode thickness, the hot air flow exhausts the solvent vapors out of the chamber. The drying results from this prototype were very encouraging. For water-based anodes there is a 5X drying advantage (time and length of oven) in using ADP over a standard drying system, and for NMP-based cathodes the reduction in drying time has a 3X benefit. For energy savings, power consumption measurements were performed on the ADP prototype and compared with the standard convection drying oven. The data collected demonstrated over 40% savings in power consumption with ADP as compared to convection drying systems. The energy savings are one of the operational cost benefits possible with ADP. To further speed up the drying process, the ADP prototype was explored as a booster module before the convection oven, and for the electrode material being evaluated it was possible to increase the drying speed by a factor of 4, which could not be accomplished with the standard dryer without surface defects and cracks. The instantaneous penetration of microwaves into the entire slurry thickness showed a major advantage in rapid drying of the electrode materials. For existing electrode materials, the material analysis and cell characterization data from ADP-dried electrodes showed equivalent (or slightly better) performance. However, for high-loading and thicker electrode materials (for high energy densities) the ADP advantages are more prominent: there was less binder migration and the resistance was lower, hence the current capacities and retention of the battery cells were higher. The success of the project has enabled credible communications with commercial end users as well as battery coating line integrators. The goal is to scale ADP up for high-volume manufacturing of Li-ion battery electrodes. The implementation of ADP in high-volume manufacturing will reduce a high-cost production step and bring the overall price of Li-ion batteries down. This will ultimately have a positive impact on the public by making electric and hybrid vehicles more affordable.
Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott
2018-05-01
The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.
ToxCast Data Generation: Chemical Workflow
This page describes the process EPA follows to select chemicals, procure chemicals, register chemicals, conduct a quality review of the chemicals, and prepare the chemicals for high-throughput screening.
Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa
2017-10-18
High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
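As an illustration of the "local orientational order" metric mentioned above (this is not the authors' published pipeline code), a 2D nematic order parameter can be computed from fiber-segment orientations as follows:

```python
# Illustrative sketch (not the authors' image-analysis pipeline): a 2D nematic
# order parameter computed from fiber-segment orientations, of the kind used
# as a local orientational-order metric for fibrillar P3HT images.
import numpy as np

def order_parameter_2d(angles_rad: np.ndarray) -> float:
    """S = 1 for perfectly aligned segments, ~0 for isotropic orientations."""
    c = np.cos(2 * angles_rad).mean()
    s = np.sin(2 * angles_rad).mean()
    return float(np.hypot(c, s))

rng = np.random.default_rng(0)
aligned = rng.normal(0.0, 0.1, 5000)                  # tight angular spread
isotropic = rng.uniform(-np.pi / 2, np.pi / 2, 5000)  # random orientations
print(order_parameter_2d(aligned), order_parameter_2d(isotropic))
```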
As defined by Wikipedia (https://en.wikipedia.org/wiki/Metamodeling), “(a) metamodel or surrogate model is a model of a model, and metamodeling is the process of generating such metamodels.” The goals of metamodeling include, but are not limited to (1) developing func...
A Primer on High-Throughput Computing for Genomic Selection
Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel
2011-01-01
High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from the design of marker panels to realized genetic gain). Eventually, HTC may change our view of data analysis as well as decision-making in the post-genomic era of selection programs in animals and plants, or in the study of complex diseases in humans. PMID:22303303
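A minimal sketch of the pipelining idea, assuming a hypothetical per-trait model fit; a production HTC setup would instead submit each trait's job to a cluster scheduler such as HTCondor, but the principle of evaluating independent traits concurrently rather than sequentially is the same:

```python
# Minimal sketch (illustrative only): independent trait evaluations dispatched
# in parallel instead of sequentially, standing in for cluster batch jobs.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def fit_trait(seed: int) -> float:
    """Stand-in for one trait's genomic prediction model (e.g., a ridge fit)."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((500, 2000))       # 500 individuals x 2000 markers
    y = X[:, :50].sum(axis=1) + rng.standard_normal(500)
    beta = np.linalg.solve(X.T @ X + 100 * np.eye(2000), X.T @ y)  # ridge solution
    return float(np.corrcoef(X @ beta, y)[0, 1])                   # fit accuracy

if __name__ == "__main__":
    traits = range(8)                          # eight traits evaluated at once
    with ProcessPoolExecutor() as pool:
        accuracies = list(pool.map(fit_trait, traits))
    print(accuracies)
```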
Zhou, Jun; Yang, Jun; Yu, Qing; Yong, Xiaoyu; Xie, Xinxin; Zhang, Lijuan; Wei, Ping; Jia, Honghua
2017-11-01
The aim of this work was to investigate the mesophilic methane fermentation of rice straw at different organic loading rates (OLRs) in a 300 m³ bioreactor. It was found that biogas production increased when the OLR was below 2.00 kg VS_substrate/(m³·d). The average volumetric biogas production reached 0.86 m³/(m³·d) at an OLR of 2.00 kg VS_substrate/(m³·d). Biogas production rate was 323 m³/t dry rice straw over the whole process. The pH, chemical oxygen demand, volatile fatty acid, and NH₄⁺-N concentrations were all in optimal range at different OLRs. High-throughput sequencing analysis indicated that Firmicutes, Fibrobacteres, and Spirochaetes predominated in straw samples. Chloroflexi, Proteobacteria, and Planctomycetes were more abundant in the slurry. The hydrogenotrophic pathway was the main biochemical pathway of methanogenesis in the reactor. This study provides new information regarding the OLR and the differences in the spatial distribution of specific microbiota in a rice straw biogas plant. Copyright © 2017 Elsevier Ltd. All rights reserved.
Zebrafish: A marvel of high-throughput biology for 21st century toxicology
Bugel, Sean M.; Tanguay, Robert L.; Planchart, Antonio
2015-01-01
The evolutionary conservation of genomic, biochemical and developmental features between zebrafish and humans is gradually coming into focus with the end result that the zebrafish embryo model has emerged as a powerful tool for uncovering the effects of environmental exposures on a multitude of biological processes with direct relevance to human health. In this review, we highlight advances in automation, high-throughput (HT) screening, and analysis that leverage the power of the zebrafish embryo model for unparalleled advances in our understanding of how chemicals in our environment affect our health and wellbeing. PMID:25678986
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hattrick-Simpers, Jason R.; Gregoire, John M.; Kusne, A. Gilad
With their ability to rapidly elucidate composition-structure-property relationships, high-throughput experimental studies have revolutionized how materials are discovered, optimized, and commercialized. It is now possible to synthesize and characterize high-throughput libraries that systematically address thousands of individual cuts of fabrication parameter space. An unresolved issue remains transforming structural characterization data into phase mappings. This difficulty is related to the complex information present in diffraction and spectroscopic data and its variation with composition and processing. Here, we review the field of automated phase diagram attribution and discuss the impact that emerging computational approaches will have in the generation of phase diagrams and beyond.
Discovery of 100K SNP array and its utilization in sugarcane
USDA-ARS?s Scientific Manuscript database
Next-generation sequencing (NGS) enables us to identify thousands of single nucleotide polymorphism (SNP) markers for genotyping and fingerprinting. However, the process requires very precise bioinformatics analysis and filtering. High-throughput SNP arrays with predefined genomic location co...
Vanhoorne, V; Vanbillemont, B; Vercruysse, J; De Leersnyder, F; Gomes, P; Beer, T De; Remon, J P; Vervaet, C
2016-05-30
The aim of this study was to evaluate the potential of twin screw granulation for the continuous production of controlled release formulations with hydroxypropylmethylcellulose as hydrophilic matrix former. Metoprolol tartrate was included in the formulation as very water soluble model drug. A premix of metoprolol tartrate, hydroxypropylmethylcellulose and filler (ratio 20/20/60, w/w) was granulated with demineralized water via twin screw granulation. After oven drying and milling, tablets were produced on a rotary Modul™ P tablet press. A D-optimal design (29 experiments) was used to assess the influence of process (screw speed, throughput, barrel temperature and screw design) and formulation parameters (starch content of the filler) on the process (torque), granule (size distribution, shape, friability, density) and tablet (hardness, friability and dissolution) critical quality attributes. The torque was dominated by the number of kneading elements and throughput, whereas screw speed and filling degree only showed a minor influence on torque. Addition of screw mixing elements after a block of kneading elements improved the yield of the process before milling as it resulted in less oversized granules and also after milling as less fines were present. Temperature was also an important parameter to optimize as a higher temperature yielded less fines and positively influenced the aspect ratio. The shape of hydroxypropylmethylcellulose granules was comparable to that of immediate release formulations. Tensile strength and friability of tablets were not dependent on the process parameters. The use of starch as filler was not beneficial with regard to granule and tablet properties. Complete drug release was obtained after 16-20h and was independent of the design's parameters. Copyright © 2016 Elsevier B.V. All rights reserved.
Strategic and Operational Plan for Integrating Transcriptomics ...
Plans for incorporating high-throughput transcriptomics into the current high-throughput screening activities at NCCT; the details are in the attached slide presentation, given at the OECD meeting on June 23, 2016.
High-Throughput Experimental Approach Capabilities | Materials Science | NREL
Web-page fragment describing NREL's high-throughput experimental approach capabilities for materials research, including combinatorial sputtering chambers such as Combi-5 for nitride and oxynitride sputtering.
Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.
2016-01-01
Because RNA-protein interactions play a central role in a wide-array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time) including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240
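As an illustration of the binding-curve output described above (not the published HiTS-RAP analysis code, and with synthetic example numbers), a single-site binding isotherm can be fitted to one cluster's intensity-versus-concentration series:

```python
# Illustrative sketch: fitting a single-site binding isotherm,
# fraction bound = P / (Kd + P), to the intensity-vs-protein-concentration
# series measured for one RNA cluster. Data points are synthetic examples.
import numpy as np
from scipy.optimize import curve_fit

def isotherm(p_nM, kd_nM, fmax):
    return fmax * p_nM / (kd_nM + p_nM)

protein_nM = np.array([0.5, 1, 2, 5, 10, 25, 50, 100])                # titration points
signal = np.array([0.06, 0.11, 0.20, 0.37, 0.52, 0.71, 0.81, 0.88])   # example intensities

(kd_fit, fmax_fit), _ = curve_fit(isotherm, protein_nM, signal, p0=(10.0, 1.0))
print(f"Kd ~ {kd_fit:.1f} nM, saturating signal ~ {fmax_fit:.2f}")
```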
NASA Astrophysics Data System (ADS)
Mamat, K. A.; Yusof, M. S.; Yusoff, Wan Fauziah Wan; Zulafif Rahim, M.; Hassan, S.; Rahman, M. Qusyairi. A.; Karim, M. A. Abd
2017-05-01
The drying process is an essential step in producing instant noodles. Yet industry, especially Small and Medium Enterprises (SMEs), is seeking an efficient method to dry the noodles. This paper discusses the performance of an invented drying system which employs a heating and humidifying process. The drying system was tested using 30 kilograms of the raw noodle known as "Mee Siput". A temperature-controlled system was used in the study to control the temperature of the drying process and prevent the dried noodles from damage by maintaining a temperature of lower than 80°C. The analysis shows that the system drastically decreased the humidity from 80% to 40% after just 200 minutes of the drying process. The complete dehydration time of the noodles also decreased from 16 hours with the traditional drying system to only 4 hours, without sacrificing the quality of the dried noodles. Overall, the invented system is believed to increase the production capacity of the noodles and reduce the cost of production, which would be highly beneficial for Small and Medium Enterprises (SMEs) in Malaysia.
Single-shot femtosecond laser ablation of gold surface in air and isopropyl alcohol
NASA Astrophysics Data System (ADS)
Kudryashov, S. I.; Saraeva, I. N.; Lednev, V. N.; Pershin, S. M.; Rudenko, A. A.; Ionin, A. A.
2018-05-01
Single-shot IR femtosecond-laser ablation of gold surfaces in ambient air and liquid isopropyl alcohol was studied by scanning electron microscopy characterization of crater topographies and time-resolved optical emission spectroscopy of ablative plumes in regimes, typical for non-filamentary and non-fragmentation laser production of nanoparticle sols. Despite one order of magnitude shorter (few nanoseconds) lifetimes and almost two orders of magnitude lower intensities of the quenched ablative plume emission in the alcohol ambient at the same peak laser fluence, craters for the dry and wet conditions appeared with rather similar nanofoam-like spallative topographies and the same thresholds. These facts envision the underlying surface spallation as one of the basic ablation mechanisms relevant for both dry and wet advanced femtosecond laser surface nano/micro-machining and texturing, as well as for high-throughput femtosecond laser ablative production of colloidal nanoparticles by MHz laser-pulse trains via their direct nanoscale jetting from the nanofoam in air and fluid environments.
Haug, Megan T; King, Ellena S; Heymann, Hildegarde; Crisosto, Carlos H
2013-08-01
A trained sensory panel evaluated the 6 fig cultivars currently sold in the California dried fig market. The main flavor and aroma attributes determined by the sensory panel were "caramel," "honey," "raisin," and "fig," with additional aroma attributes: "common date," "dried plum," and "molasses." Sensory differences were observed between dried fig cultivars. All figs were processed by 2 commercial handlers. Processing included potassium sorbate as a preservative and SO2 application as an antibrowning agent for white cultivars. As a consequence of SO2 use during processing, high sulfite residues affected the sensory profiles of the white dried fig cultivars. Significant differences between dried fig cultivars and sources demonstrate perceived differences between processing and storage methods. The panel-determined sensory lexicon can help with California fig marketing. © 2013 The Regents of California, Davis Campus Department of Plant Sciences.
Tatsumi, E; Konishi, Y; Tsujiyama, S
2016-11-01
To examine the activities of residual enzymes in dried shiitake mushrooms, a traditional foodstuff in Japanese cuisine, for possible applications in food processing. Polysaccharide-degrading enzymes remained intact in dried shiitake mushrooms, and the activities of amylase, β-glucosidase and pectinase were high. Digestion of potato was tested using dried shiitake powder. The enzymes reacted with potato tuber specimens and solubilized sugars even under heterogeneous solid-state conditions, and their reaction modes differed at 38 and 50 °C. Dried shiitake mushrooms thus have potential use in food processing as an enzyme preparation.
Achieving High Throughput for Data Transfer over ATM Networks
NASA Technical Reports Server (NTRS)
Johnson, Marjory J.; Townsend, Jeffrey N.
1996-01-01
File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.
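To illustrate the basic idea of a best-effort, sequence-numbered transfer outside TCP, the toy sketch below sends fixed-size datagrams over UDP on the loopback interface and measures the observed loss fraction. It is an illustration of the concept only; the packet layout, sizes and addresses are assumptions, not the protocol developed in the paper.

```python
# Toy illustration of sequence-numbered, best-effort transfer over UDP
# (loopback in a single process; not the transfer protocol described above).
import socket, struct

HOST, PORT = "127.0.0.1", 54321
PAYLOAD = b"x" * 512           # fixed payload per datagram
NUM_PACKETS = 500

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind((HOST, PORT))
rx.settimeout(0.2)

tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(NUM_PACKETS):
    # A 4-byte sequence number header lets the receiver detect gaps.
    tx.sendto(struct.pack("!I", seq) + PAYLOAD, (HOST, PORT))

received = set()
try:
    while True:
        data, _ = rx.recvfrom(2048)
        received.add(struct.unpack("!I", data[:4])[0])
except socket.timeout:
    pass

# Even on loopback, datagrams may be dropped when the receive buffer fills;
# the reported loss fraction reflects exactly this best-effort behaviour.
loss = 1.0 - len(received) / NUM_PACKETS
print(f"received {len(received)}/{NUM_PACKETS} datagrams, loss = {loss:.1%}")
```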
Shinozuka, Hiroshi; Forster, John W
2016-01-01
Background. Multiplexed sequencing is commonly performed on massively parallel short-read sequencing platforms such as Illumina, and the efficiency of library normalisation can affect the quality of the output dataset. Although several library normalisation approaches have been established, none are ideal for highly multiplexed sequencing due to issues of cost and/or processing time. Methods. An inexpensive and high-throughput library quantification method has been developed, based on an adaptation of the melting curve assay. Sequencing libraries were subjected to the assay using the Bio-Rad Laboratories CFX Connect(TM) Real-Time PCR Detection System. The library quantity was calculated through summation of the reduction of relative fluorescence units between 86 and 95 °C. Results. PCR-enriched sequencing libraries are suitable for this quantification without pre-purification of DNA. Short DNA molecules, which ideally should be eliminated from the library for subsequent processing, were differentiated from the target DNA in a mixture on the basis of differences in melting temperature. Quantification results for long sequences targeted using the melting curve assay were correlated with those from existing methods (R² > 0.77), and with those observed from MiSeq sequencing (R² = 0.82). Discussion. The results of multiplexed sequencing suggested that the normalisation performance of the described method is equivalent to that of another recently reported high-throughput bead-based method, BeNUS. However, costs for the melting curve assay are considerably lower and processing times shorter than those of other existing methods, suggesting greater suitability for highly multiplexed sequencing applications.
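The quantification step described above reduces to summing the stepwise drops in relative fluorescence over the 86-95 °C window of the melt curve. A minimal sketch of that computation, using a synthetic melt curve rather than instrument output, is shown below.

```python
# Illustrative computation of the quantity used in the assay above:
# the summed drop in relative fluorescence units (RFU) between 86 and 95 degC.
# The melt-curve values below are mock numbers, not instrument output.
import numpy as np

temps = np.arange(80.0, 96.0, 0.5)                    # degC, typical melt ramp
rfu = 1000.0 / (1.0 + np.exp((temps - 88.0) / 1.2))   # synthetic melt curve

mask = (temps >= 86.0) & (temps <= 95.0)
r_sel = rfu[mask]

# Sum of stepwise decreases in RFU over the 86-95 degC window.
library_signal = np.sum(np.clip(-np.diff(r_sel), 0.0, None))
print(f"summed RFU reduction (86-95 degC): {library_signal:.1f}")
```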
The report describes a mini-pilot test program to investigate potential new sorbents and processes for dry SO2 removal. Initial tests showed that the 85 cu m/h pilot plant could be used successfully to evaluate both spray dryer and dry injection processes using traditional calciu...
NASA Astrophysics Data System (ADS)
Lan, Ding-Hung; Hong, Shao-Huan; Chou, Li-Hui; Wang, Xiao-Feng; Liu, Cheng-Liang
2018-06-01
Organometal halide perovskite materials have demonstrated tremendous advances in the photovoltaic field recently because of their advantageous features of simple fabrication and high power conversion efficiency. To meet the demand for high-throughput and cost-effective fabrication, we present a wet-process method that enables probing of the parameters for perovskite layer deposition through two-step sequential ultrasonic spray-coating. This paper describes a detailed investigation of the effects of modifying the spray precursor solution (PbI2 and CH3NH3I precursor concentrations and the solvents used) and the post-annealing conditions (temperature and time), which can be tuned to obtain optimal film quality and improve device efficiency. Through this systematic optimization, the inverted planar perovskite solar cells show reproducible photovoltaic properties, with a best power conversion efficiency (PCE) of 10.40% and an average PCE of 9.70 ± 0.40%. A continuous spray-coating technique for rapid fabrication of a total of 16 perovskite films was demonstrated, providing a viable alternative for high-throughput production of perovskite solar cells.
Lu, Zhi-Yan; Guo, Xiao-Jue; Li, Hui; Huang, Zhong-Zi; Lin, Kuang-Fei; Liu, Yong-Di
2015-01-01
A high-throughput screening system for moderately halophilic phenol-degrading bacteria from various habitats was developed to replace conventional strain screening owing to its higher efficiency. Bacterial enrichments were cultivated in 48 deep well microplates instead of shake flasks or tubes. Measurement of phenol concentrations was performed in 96-well microplates instead of using the conventional spectrophotometric method or high-performance liquid chromatography (HPLC). The high-throughput screening system was used to cultivate forty-three bacterial enrichments and yielded a halophilic bacterial community, E3, with the best phenol-degrading capability. Halomonas sp. strain 4-5 was isolated from the E3 community. Strain 4-5 was able to degrade more than 94% of the phenol (500 mg·L−1 starting concentration) over a range of 3%–10% NaCl. Additionally, the strain accumulated the compatible solute ectoine with increasing salt concentrations. PCR detection of the functional genes suggested that the largest subunit of multicomponent phenol hydroxylase (LmPH) and catechol 1,2-dioxygenase (C12O) were active in the phenol degradation process. PMID:26020478
Adaptation to high throughput batch chromatography enhances multivariate screening.
Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried
2015-09-01
High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
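A rough sketch of the combinatorial layout behind such an approach is shown below: each well of a 96-well plate receives its own combination of load, wash and elution conditions. The factor names and levels are hypothetical, not those used in the study.

```python
# Sketch of the combinatorial idea behind iterative batch chromatography:
# treat each well of a 96-well plate as one experiment with its own
# combination of load, wash and elution conditions. Factor levels are invented.
from itertools import product
from string import ascii_uppercase

loads    = [5, 10, 20, 40]        # protein load per well (hypothetical units)
wash_pH  = [5.0, 6.0, 7.0]
elute_mM = [50, 150, 300, 500]    # elution salt concentration (hypothetical)

conditions = list(product(loads, wash_pH, elute_mM))   # 4*3*4 = 48 combinations
wells = [f"{row}{col}" for row in ascii_uppercase[:8] for col in range(1, 13)]

plate_map = dict(zip(wells, conditions))               # fills 48 of the 96 wells
for well in ["A1", "A2", "D12"]:
    load, pH, salt = plate_map[well]
    print(f"{well}: load={load}, wash pH={pH}, elution={salt} mM")
```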
Lee, Chankyun; Cao, Xiaoyuan; Yoshikane, Noboru; Tsuritani, Takehiro; Rhee, June-Koo Kevin
2015-10-19
The feasibility of software-defined optical networking (SDON) for practical applications critically depends on the scalability of centralized control performance. In this paper, highly scalable routing and wavelength assignment (RWA) algorithms are investigated on an OpenFlow-based SDON testbed for a proof-of-concept demonstration. Efficient RWA algorithms are proposed to achieve high network capacity with reduced computation cost, a significant attribute for a scalable centrally controlled SDON. The proposed heuristic RWA algorithms differ in the order in which requests are processed and in the procedures for routing table updates. Combined with a shortest-path-based routing algorithm, a hottest-request-first processing policy that considers demand intensity and end-to-end distance information offers both the highest network throughput and acceptable computational scalability. We further investigate the trade-off between network throughput and computational complexity in the routing table update procedure through a simulation study.
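A simplified sketch of a hottest-request-first RWA heuristic of this kind is given below, using a toy topology, invented demands and first-fit wavelength assignment along shortest paths; the actual algorithms and heat metric in the paper may differ.

```python
# Simplified sketch of a "hottest-request-first" RWA heuristic:
# sort requests by a heat metric combining demand intensity and path length,
# then assign the first wavelength that is free on every link of the shortest path.
# Topology, demands and the heat metric are illustrative assumptions.
import networkx as nx

NUM_WAVELENGTHS = 4
G = nx.Graph()
G.add_weighted_edges_from([("A","B",1), ("B","C",1), ("A","C",3), ("C","D",1), ("B","D",2)])
free = {tuple(sorted(e)): set(range(NUM_WAVELENGTHS)) for e in G.edges()}

requests = [("A", "D", 5), ("B", "C", 2), ("A", "C", 7)]   # (src, dst, demand intensity)

def heat(req):
    src, dst, demand = req
    dist = nx.shortest_path_length(G, src, dst, weight="weight")
    return demand * dist            # "hottest": high demand over long paths first

for src, dst, demand in sorted(requests, key=heat, reverse=True):
    path = nx.shortest_path(G, src, dst, weight="weight")
    links = [tuple(sorted(hop)) for hop in zip(path, path[1:])]
    common = set.intersection(*(free[l] for l in links))   # wavelength continuity
    if common:
        wl = min(common)                                   # first-fit assignment
        for l in links:
            free[l].discard(wl)
        print(f"{src}->{dst}: path {path}, wavelength {wl}")
    else:
        print(f"{src}->{dst}: blocked")
```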
Thousands of untested chemicals in the environment require efficient characterization of carcinogenic potential in humans. A proposed solution is rapid testing of chemicals using in vitro high-throughput screening (HTS) assays for targets in pathways linked to disease processes ...
Metabolome progression during early gut microbial colonization of gnotobiotic mice
Marcobal, Angela; Yusufaly, Tahir; Higginbottom, Steven; Snyder, Michael; Sonnenburg, Justin L.; Mias, George I.
2015-01-01
The microbiome has been implicated directly in host health, especially host metabolic processes and development of immune responses. These are particularly important in infants where the gut first begins being colonized, and such processes may be modeled in mice. In this investigation we follow longitudinally the urine metabolome of ex-germ-free mice, which are colonized with two bacterial species, Bacteroides thetaiotaomicron and Bifidobacterium longum. High-throughput mass spectrometry profiling of urine samples revealed dynamic changes in the metabolome makeup, associated with the gut bacterial colonization, enabled by our adaptation of non-linear time-series analysis to urine metabolomics data. Results demonstrate both gradual and punctuated changes in metabolite production and that early colonization events profoundly impact the nature of small molecules circulating in the host. The identified small molecules are implicated in amino acid and carbohydrate metabolic processes, and offer insights into the dynamic changes occurring during the colonization process, using high-throughput longitudinal methodology. PMID:26118551
Modelling for Ship Design and Production
1991-09-01
the physical production process. The product has to be delivered within the chain of order processing. The process “ship production” is defined by the...environment is of increasing importance. Changing product types, complexity and parallelism of order processing, short throughput times and fixed due...specialized and high quality products under manufacturing conditions which ensure economic and effective order processing. Mapping these main
High Throughput Multispectral Image Processing with Applications in Food Science.
Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John
2015-01-01
Recently, machine vision has been gaining attention in food science as well as in the food industry concerning food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility, lower the cost of information extraction, and speed up quality assessment, without human intervention. The outcome of image processing is then propagated to downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we demonstrate its efficiency and robustness against currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
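As a rough illustration of the core segmentation step, the sketch below fits a Gaussian mixture model to the per-pixel spectra of a synthetic multispectral image and labels each pixel by its most likely component; band selection and the food-specific processing described above are omitted.

```python
# Minimal sketch of unsupervised segmentation of a multispectral image with a
# Gaussian mixture model, in the spirit of the approach described above
# (synthetic 3-band image; spectral band selection is omitted).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
h, w, bands = 64, 64, 3
image = rng.normal(0.2, 0.05, size=(h, w, bands))   # "background" spectra
image[16:48, 16:48, :] += 0.5                        # brighter "sample" region

pixels = image.reshape(-1, bands)                    # one spectrum per pixel
gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
labels = gmm.predict(pixels).reshape(h, w)

# Report the size of each segment (e.g. sample vs. background area in pixels).
for k in range(2):
    print(f"component {k}: {np.sum(labels == k)} pixels")
```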
Yi, Hoon; Hwang, Insol; Lee, Jeong Hyeon; Lee, Dael; Lim, Haneol; Tahk, Dongha; Sung, Minho; Bae, Won-Gyu; Choi, Se-Jin; Kwak, Moon Kyu; Jeong, Hoon Eui
2014-08-27
A simple yet scalable strategy for fabricating dry adhesives with mushroom-shaped micropillars is achieved by a combination of the roll-to-roll process and modulated UV-curable elastic poly(urethane acrylate) (e-PUA) resin. The e-PUA combines the major benefits of commercial PUA and poly(dimethylsiloxane) (PDMS). It not only can be cured within a few seconds like commercial PUA but also possesses good mechanical properties comparable to those of PDMS. A roll-type fabrication system equipped with a rollable mold and a UV exposure unit is also developed for the continuous process. By integrating the roll-to-roll process with the e-PUA, dry adhesives with spatulate tips in the form of a thin flexible film can be generated in a highly continuous and scalable manner. The fabricated dry adhesives with mushroom-shaped microstructures exhibit a strong pull-off strength of up to ∼38.7 N cm(-2) on the glass surface as well as high durability without any noticeable degradation. Furthermore, an automated substrate transportation system equipped with the dry adhesives can transport a 300 mm Si wafer over 10,000 repeating cycles with high accuracy.
High density circuit technology, part 3
NASA Technical Reports Server (NTRS)
Wade, T. E.
1982-01-01
Dry processing - both etching and deposition - and present/future trends in semiconductor technology are discussed. In addition to a description of the basic apparatus, terminology, advantages, glow discharge phenomena, gas-surface chemistries, and key operational parameters for both dry etching and plasma deposition processes, a comprehensive survey of dry processing equipment (via vendor listing) is also included. The following topics are also discussed: fine-line photolithography, low-temperature processing, packaging for dense VLSI die, the role of integrated optics, and VLSI and technology innovations.
[Investigation on Spray Drying Technology of Auricularia auricular Extract].
Zhou, Rong; Chen, Hui; Xie, Yuan; Chen, Peng; Wang, Luo-lin
2015-07-01
To investigate the feasibility of spray drying of Auricularia auricular extract and its optimum process. On the basis of single-factor tests, with the yield of dry extract and the content of polysaccharide as indexes, an orthogonal test was used to optimize the spray drying parameters: inlet air temperature, injection speed and crude drug content. Using ultraviolet spectrophotometry, thin layer chromatography (TLC) and pharmacodynamics as indicators, extracts prepared by the traditional alcohol-precipitation drying process and by spray drying were compared. Compared with the traditional preparation method, the extract prepared by spray drying differed little in polysaccharide content, TLC profile and the ability to reduce TG and TC, and the optimum process conditions were as follows: inlet air temperature of 180 °C, injection speed of 10 ml/min and crude drug content of 0.4 g/mL. Spray drying of Auricularia auricular extract is therefore stable and feasible, with high economic benefit.
Subnuclear foci quantification using high-throughput 3D image cytometry
NASA Astrophysics Data System (ADS)
Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.
2015-07-01
Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the site of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification and are therefore limited to counting a low number of foci per cell (about 5 foci per nucleus), because the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D, with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended maxima transform based algorithm. Our results suggest that, while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to the 2D techniques.
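The counting step can be illustrated with an extended-maxima (h-maxima) transform applied to a synthetic 3D volume, as sketched below; this is a conceptual analogue of the 3D algorithm described above, not the authors' pipeline.

```python
# Sketch of counting bright foci in a 3D image using an extended-maxima
# (h-maxima) transform; the volume here is synthetic, not microscope data.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.morphology import h_maxima
from skimage.measure import label

rng = np.random.default_rng(1)
vol = rng.normal(0.0, 0.02, size=(32, 64, 64))          # background noise (z, y, x)
for _ in range(20):                                      # place 20 point "foci"
    z, y, x = rng.integers(4, 28), rng.integers(8, 56), rng.integers(8, 56)
    vol[z, y, x] += 5.0
vol = gaussian_filter(vol, sigma=1.5)                    # blur foci into blobs

maxima = h_maxima(vol, h=0.05)        # keep only maxima at least h above surroundings
n_foci = label(maxima).max()          # connected maxima regions = detected foci
print(f"detected {n_foci} foci")      # close to 20 unless foci overlap
```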
High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Wei; Shabbir, Faizan; Gong, Chao
2015-04-13
We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.
Enabling a high throughput real time data pipeline for a large radio telescope array with GPUs
NASA Astrophysics Data System (ADS)
Edgar, R. G.; Clark, M. A.; Dale, K.; Mitchell, D. A.; Ord, S. M.; Wayth, R. B.; Pfister, H.; Greenhill, L. J.
2010-10-01
The Murchison Widefield Array (MWA) is a next-generation radio telescope currently under construction in the remote Western Australia Outback. Raw data will be generated continuously at 5 GiB s-1, grouped into 8 s cadences. This high throughput motivates the development of on-site, real time processing and reduction in preference to archiving, transport and off-line processing. Each batch of 8 s data must be completely reduced before the next batch arrives. Maintaining real time operation will require a sustained performance of around 2.5 TFLOP s-1 (including convolutions, FFTs, interpolations and matrix multiplications). We describe a scalable heterogeneous computing pipeline implementation, exploiting both the high computing density and FLOP-per-Watt ratio of modern GPUs. The architecture is highly parallel within and across nodes, with all major processing elements performed by GPUs. Necessary scatter-gather operations along the pipeline are loosely synchronized between the nodes hosting the GPUs. The MWA will be a frontier scientific instrument and a pathfinder for planned peta- and exa-scale facilities.
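The real-time budget quoted above can be restated with simple arithmetic, as in the back-of-envelope sketch below.

```python
# Back-of-envelope check of the real-time budget quoted above:
# 5 GiB/s of raw data grouped into 8 s cadences, processed at ~2.5 TFLOP/s.
GIB = 2**30
data_rate_Bps   = 5 * GIB            # bytes per second
cadence_s       = 8
sustained_flops = 2.5e12             # floating-point operations per second

bytes_per_cadence = data_rate_Bps * cadence_s
flop_per_cadence  = sustained_flops * cadence_s

print(f"data per 8 s cadence : {bytes_per_cadence / GIB:.0f} GiB")
print(f"work per 8 s cadence : {flop_per_cadence / 1e12:.0f} TFLOP")
print(f"rough arithmetic intensity: {flop_per_cadence / bytes_per_cadence:.0f} FLOP/byte")
```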
High-Throughput, Motility-Based Sorter for Microswimmers and Gene Discovery Platform
NASA Astrophysics Data System (ADS)
Yuan, Jinzhou; Raizen, David; Bau, Haim
2015-11-01
Animal motility varies with genotype, disease progression, aging, and environmental conditions. In many studies, it is desirable to carry out high throughput motility-based sorting to isolate rare animals for, among other things, forward genetic screens to identify genetic pathways that regulate phenotypes of interest. Many commonly used screening processes are labor-intensive, lack sensitivity, and require extensive investigator training. Here, we describe a sensitive, high throughput, automated, motility-based method for sorting nematodes. Our method was implemented in a simple microfluidic device capable of sorting many thousands of animals per hour per module, and is amenable to parallelism. The device successfully enriched for known C. elegans motility mutants. Furthermore, using this device, we isolated low-abundance mutants capable of suppressing the somnogenic effects of the flp-13 gene, which regulates sleep-like quiescence in C. elegans. Subsequent genomic sequencing led to the identification of a flp-13-suppressor gene. This research was supported, in part, by NIH NIA Grant 5R03AG042690-02.
Mapping the miRNA interactome by crosslinking ligation and sequencing of hybrids (CLASH)
Helwak, Aleksandra; Tollervey, David
2014-01-01
RNA-RNA interactions play critical roles in many cellular processes but studying them is difficult and laborious. Here, we describe an experimental procedure, termed crosslinking ligation and sequencing of hybrids (CLASH), which allows high-throughput identification of sites of RNA-RNA interaction. During CLASH, a tagged bait protein is UV crosslinked in vivo to stabilise RNA interactions and purified under denaturing conditions. RNAs associated with the bait protein are partially truncated, and the ends of RNA-duplexes are ligated together. Following linker addition, cDNA library preparation and high-throughput sequencing, the ligated duplexes give rise to chimeric cDNAs, which unambiguously identify RNA-RNA interaction sites independent of bioinformatic predictions. This protocol is optimized for studying miRNA targets bound by Argonaute proteins, but should be easily adapted for other RNA-binding proteins and classes of RNA. The protocol requires around 5 days to complete, excluding the time required for high-throughput sequencing and bioinformatic analyses. PMID:24577361
Ngo, Tony; Coleman, James L J; Smith, Nicola J
2015-01-01
Orphan G protein-coupled receptors represent an underexploited resource for drug discovery but pose a considerable challenge for assay development because their cognate G protein signaling pathways are often unknown. In this methodological chapter, we describe the use of constitutive activity, that is, the inherent ability of receptors to couple to their cognate G proteins in the absence of ligand, to inform the development of high-throughput screening assays for a particular orphan receptor. We specifically focus on a two-step process, whereby constitutive G protein coupling is first determined using yeast Gpa1/human G protein chimeras linked to growth and β-galactosidase generation. Coupling selectivity is then confirmed in mammalian cells expressing endogenous G proteins and driving accumulation of transcription factor-fused luciferase reporters specific to each of the classes of G protein. Based on these findings, high-throughput screening campaigns can be performed on the already miniaturized mammalian reporter system.
Burdick, David B; Cavnor, Chris C; Handcock, Jeremy; Killcoyne, Sarah; Lin, Jake; Marzolf, Bruz; Ramsey, Stephen A; Rovira, Hector; Bressler, Ryan; Shmulevich, Ilya; Boyle, John
2010-07-14
High throughput sequencing has become an increasingly important tool for biological research. However, the existing software systems for managing and processing these data have not provided the flexible infrastructure that research requires. Existing software solutions provide static and well-established algorithms in a restrictive package. However as high throughput sequencing is a rapidly evolving field, such static approaches lack the ability to readily adopt the latest advances and techniques which are often required by researchers. We have used a loosely coupled, service-oriented infrastructure to develop SeqAdapt. This system streamlines data management and allows for rapid integration of novel algorithms. Our approach also allows computational biologists to focus on developing and applying new methods instead of writing boilerplate infrastructure code. The system is based around the Addama service architecture and is available at our website as a demonstration web application, an installable single download and as a collection of individual customizable services.
Structuring intuition with theory: The high-throughput way
NASA Astrophysics Data System (ADS)
Fornari, Marco
2015-03-01
First principles methodologies have grown in accuracy and applicability to the point where large databases can be built, shared, and analyzed with the goal of predicting novel compositions, optimizing functional properties, and discovering unexpected relationships between the data. In order to be useful to a large community of users, data should be standardized, validated, and distributed. In addition, tools to easily manage large datasets should be made available to effectively lead to materials development. Within the AFLOW consortium we have developed a simple framework to expand, validate, and mine data repositories: the MTFrame. Our minimalistic approach complements AFLOW and other existing high-throughput infrastructures and aims to integrate data generation with data analysis. We present a few examples from our work on materials for energy conversion. Our intent is to pinpoint the usefulness of high-throughput methodologies for guiding the discovery process by quantitatively structuring scientific intuition. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.
Wojdyło, Aneta; Figiel, Adam; Oszmiański, Jan
2009-02-25
The objective of this study was to evaluate the application of vacuum-microwave drying (240, 360, and 480 W) in the production of dehydrated strawberry and to compare the quality of these dehydrated strawberries, in terms of their polyphenol compounds, concentration of some heat-labile components, and color, with that of freeze-dried, convective-dried, and vacuum-dried strawberry. The effect of vacuum-microwave drying and the other drying methods on the antioxidant activity of the berries was thus evaluated. Whole fresh and dried fruits were assessed for phenolics (anthocyanins, flavanols, hydroxycinnamic acids, and flavonols), ascorbic acid, and antioxidant activity (all parameters were calculated on a dry matter basis). Analysis of the data shows that changes in ellagic acid and flavanols were affected by drying technique and cultivar. Drying destroyed anthocyanins, flavanols, and ascorbic acid, and there was a significant decrease in antioxidant activity. The most striking result was that conventional and vacuum drying decreased antioxidant activity in both cultivars, whereas contradictory results were found for vacuum-microwave processed strawberry. This study has demonstrated that vacuum-microwave drying, especially at 240 W, can produce high-quality products, with the additional advantage of reduced processing times compared to other processes such as freeze-drying.
toxoMine: an integrated omics data warehouse for Toxoplasma gondii systems biology research
Rhee, David B.; Croken, Matthew McKnight; Shieh, Kevin R.; Sullivan, Julie; Micklem, Gos; Kim, Kami; Golden, Aaron
2015-01-01
Toxoplasma gondii (T. gondii) is an obligate intracellular parasite that must monitor for changes in the host environment and respond accordingly; however, it is still not fully known which genetic or epigenetic factors are involved in regulating virulence traits of T. gondii. There are on-going efforts to elucidate the mechanisms regulating the stage transition process via the application of high-throughput epigenomics, genomics and proteomics techniques. Given the range of experimental conditions and the typical yield from such high-throughput techniques, a new challenge arises: how to effectively collect, organize and disseminate the generated data for subsequent data analysis. Here, we describe toxoMine, which provides a powerful interface to support sophisticated integrative exploration of high-throughput experimental data and metadata, providing researchers with a more tractable means toward understanding how genetic and/or epigenetic factors play a coordinated role in determining pathogenicity of T. gondii. As a data warehouse, toxoMine allows integration of high-throughput data sets with public T. gondii data. toxoMine is also able to execute complex queries involving multiple data sets with straightforward user interaction. Furthermore, toxoMine allows users to define their own parameters during the search process that gives users near-limitless search and query capabilities. The interoperability feature also allows users to query and examine data available in other InterMine systems, which would effectively augment the search scope beyond what is available to toxoMine. toxoMine complements the major community database ToxoDB by providing a data warehouse that enables more extensive integrative studies for T. gondii. Given all these factors, we believe it will become an indispensable resource to the greater infectious disease research community. Database URL: http://toxomine.org PMID:26130662
Droplet-based microfluidic analysis and screening of single plant cells.
Yu, Ziyi; Boehm, Christian R; Hibberd, Julian M; Abell, Chris; Haseloff, Jim; Burgess, Steven J; Reyna-Llorens, Ivan
2018-01-01
Droplet-based microfluidics has been used to facilitate high-throughput analysis of individual prokaryote and mammalian cells. However, there is a scarcity of similar workflows applicable to rapid phenotyping of plant systems where phenotyping analyses typically are time-consuming and low-throughput. We report on-chip encapsulation and analysis of protoplasts isolated from the emergent plant model Marchantia polymorpha at processing rates of >100,000 cells per hour. We use our microfluidic system to quantify the stochastic properties of a heat-inducible promoter across a population of transgenic protoplasts to demonstrate its potential for assessing gene expression activity in response to environmental conditions. We further demonstrate on-chip sorting of droplets containing YFP-expressing protoplasts from wild type cells using dielectrophoresis force. This work opens the door to droplet-based microfluidic analysis of plant cells for applications ranging from high-throughput characterisation of DNA parts to single-cell genomics to selection of rare plant phenotypes.
Contreras-Padilla, Margarita; Gutiérrez-Cortez, Elsa; Valderrama-Bravo, María Del Carmen; Rojas-Molina, Isela; Espinosa-Arbeláez, Diego Germán; Suárez-Vargas, Raúl; Rodríguez-García, Mario Enrique
2012-03-01
Chemical proximate analysis was carried out to determine changes in the nutritional characteristics of nopal powders from three maturity stages (50, 100, and 150 days) obtained by three different drying processes: freeze drying, forced-air oven drying, and tunnel drying. Results indicate that nopal powder obtained by freeze drying retains higher contents of protein, soluble fiber, and fat than powders from the other two processes. The freeze-drying process also had the least effect on the colour hue variable. No changes were observed in insoluble fiber content, chroma, or lightness among the three drying processes. Furthermore, soluble fiber decreased with the age of the nopal, while insoluble fiber and ash content showed the opposite trend. In addition, luminosity and hue values did not differ among the maturity stages studied. The high dietary fiber content of nopal pad powder could make it an interesting source of these important components for human diets, and it could also be used in the food, cosmetics, and pharmaceutical industries.
Application of Solar Electric Propulsion to a Comet Surface Sample Return Mission
NASA Technical Reports Server (NTRS)
Cupples, Mike; Coverstone, Victoria; Woo, Byoungsam
2004-01-01
Propulsion systems based on the current NSTAR thruster (planned for the Discovery mission Dawn) and NASA's Evolutionary Xenon Thruster (NEXT) were compared for a comet surface sample return mission to Tempel 1. Mission and systems analyses were conducted over a range of array powers for each propulsion system, with an array of 12 kW EOL at 1 AU chosen as the baseline. Engine configurations investigated for NSTAR included 4 operational engines with 1 spare and 5 operational engines with 1 spare. The NEXT configuration investigated included 2 operational engines plus 1 spare, with performance estimated for high-thrust and high-Isp throttling modes. Figures of merit for this comparison include Solar Electric Propulsion dry mass, average engine throughput, and net non-propulsion payload returned to Earth flyby.
As defined by Wikipedia (https://en.wikipedia.org/wiki/Metamodeling), “(a) metamodel or surrogate model is a model of a model, and metamodeling is the process of generating such metamodels.” The goals of metamodeling include, but are not limited to (1) developing functional or st...
High-conversion hydrolysates and corn sweetener production in dry-grind corn process.
USDA-ARS?s Scientific Manuscript database
Most corn is processed to fuel ethanol and distillers’ grain animal feed using the dry grind process. However, wet milling is needed to refine corn starch. Corn starch is in turn processed to numerous products, including glucose and syrup. However, wet milling is a capital, labor, and energy intensi...
Printing Proteins as Microarrays for High-Throughput Function Determination
NASA Astrophysics Data System (ADS)
MacBeath, Gavin; Schreiber, Stuart L.
2000-09-01
Systematic efforts are currently under way to construct defined sets of cloned genes for high-throughput expression and purification of recombinant proteins. To facilitate subsequent studies of protein function, we have developed miniaturized assays that accommodate extremely low sample volumes and enable the rapid, simultaneous processing of thousands of proteins. A high-precision robot designed to manufacture complementary DNA microarrays was used to spot proteins onto chemically derivatized glass slides at extremely high spatial densities. The proteins attached covalently to the slide surface yet retained their ability to interact specifically with other proteins, or with small molecules, in solution. Three applications for protein microarrays were demonstrated: screening for protein-protein interactions, identifying the substrates of protein kinases, and identifying the protein targets of small molecules.
A field survey on coffee beans drying methods of Indonesian small holder farmers
NASA Astrophysics Data System (ADS)
Siagian, Parulian; Setyawan, Eko Y.; Gultom, Tumiur; Napitupulu, Farel H.; Ambarita, Himsar
2017-09-01
Drying agricultural products is a post-harvest process that consumes significant energy and can affect product quality. This paper presents a literature review and a field survey of the coffee bean drying methods used by Indonesian smallholder farmers. The objective is to provide the information needed to develop a continuous solar drier. The results show that the intermittent character of sun drying yields better quality coffee beans than constant convective drying. To use energy efficiently, the drying process should be divided into several stages: in the first stage, when the moisture content is high, a higher drying air temperature is more effective; afterwards, when the moisture content is low, a lower drying air temperature is better. The field survey of coffee bean drying in Sumatera Utara province reveals that the drying processes used are very traditional and can be divided into two modes depending on the coffee bean type. Arabica coffee is first fermented and dried to a moisture content of 80% by sun drying, then dried in a greenhouse-type dryer to a moisture content of about 12%, which typically takes 3 days. Robusta coffee, on the other hand, is dried by direct exposure to the sun without any treatment. After drying, the coffee beans are peeled. These findings can be used to develop a continuous solar dryer suitable for coffee bean drying.
Slatnar, Ana; Klancar, Urska; Stampar, Franci; Veberic, Robert
2011-11-09
Fresh figs were subjected to two different drying processes: sun-drying and oven-drying. To assess their effect on the nutritional and health-related properties of figs, sugars, organic acids, single phenolics, total phenolics, and antioxidant activity were determined before and after processing. Samples were analyzed three times in a year, and phenolic compounds were determined using high-performance liquid chromatography coupled with mass spectrometry (HPLC-MS). In figs, monomer sugars predominate, which is important nutritional information, and the content of sugars as well as organic acids in fresh figs was lower than in dried fruits. However, the best sugar/organic acid ratio was measured after the sun-drying process. Analysis of individual phenolic compounds revealed a higher content of all phenolic groups determined after the oven-drying process, with the exception of cyanidin-3-O-rutinoside. Similarly, higher total phenolic content and antioxidant activity were detected after the drying process. With these results it can be concluded that the differences in analyzed compounds in fresh and dried figs are significant. The differences between the sun-dried and oven-dried fruits were determined in organic acids, sugars, chlorogenic acid, catechin, epicatechin, kaempferol-3-O-glucoside, luteolin-8-C-glucoside, and total phenolic contents. The results indicate that properly dried figs can be used as a good source of phenolic compounds.
Affordable Imaging Lab for Noninvasive Analysis of Biomass and Early Vigour in Cereal Crops
2018-01-01
Plant phenotyping by imaging allows automated analysis of plants for various morphological and physiological traits. In this work, we developed a low-cost RGB imaging phenotyping lab (LCP lab) for low-throughput imaging and analysis using affordable imaging equipment and freely available software. The LCP lab, comprising an RGB imaging and analysis pipeline, was set up and demonstrated with early-vigour analysis in wheat. Using this lab, a few hundred pots can be photographed in a day, and the pots are tracked with QR codes. The software pipeline for both imaging and analysis is built from freely available software. The LCP lab was evaluated for early-vigour analysis of five wheat cultivars. A high coefficient of determination (R² = 0.94) was obtained between the dry weight and the projected leaf area of 20-day-old wheat plants, and an R² of 0.9 for the relative growth rate between 10 and 20 days of plant growth. A detailed description of how to set up such a lab is provided, together with custom scripts built for imaging and analysis. The LCP lab is an affordable alternative for analysis of cereal crops when access to a high-throughput phenotyping facility is unavailable or when experiments require growing plants in highly controlled climate chambers. The protocols described in this work are useful for building an affordable imaging system for small-scale research projects and for education. PMID:29850536
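The two quantities behind the reported R² values, projected leaf area and its regression against dry weight, can be illustrated with the short sketch below; the image, threshold and calibration data are synthetic, and this is not the LCP lab's actual pipeline.

```python
# Illustrative sketch of the two steps behind the reported R^2 values:
# (1) estimate projected leaf area as the count of "green" pixels in an RGB
#     image, and (2) regress dry weight on that area. The image and the dry
#     weights below are synthetic placeholders.
import numpy as np

def projected_leaf_area(rgb):
    """Count pixels where green clearly dominates both red and blue."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    return int(np.sum((g > 1.2 * r) & (g > 1.2 * b)))

rng = np.random.default_rng(0)
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[..., 0] = rng.integers(100, 150, size=(100, 100))   # brownish "soil": strong red
img[..., 1] = rng.integers(60, 100, size=(100, 100))
img[..., 2] = rng.integers(40, 80, size=(100, 100))
img[30:60, 30:70] = (20, 180, 30)                        # a green "plant" patch
print("projected leaf area (px):", projected_leaf_area(img))

# Mock calibration: dry weight vs. projected area for several pots.
area = np.array([800, 1500, 2300, 3100, 4000], dtype=float)   # px
dry_weight = np.array([0.11, 0.19, 0.31, 0.42, 0.55])         # g
slope, intercept = np.polyfit(area, dry_weight, 1)
pred = slope * area + intercept
r2 = 1 - np.sum((dry_weight - pred) ** 2) / np.sum((dry_weight - dry_weight.mean()) ** 2)
print(f"R^2 of dry weight vs. projected area: {r2:.3f}")
```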
Roll-to-roll nanopatterning using jet and flash imprint lithography
NASA Astrophysics Data System (ADS)
Ahn, Sean; Ganapathisubramanian, Maha; Miller, Mike; Yang, Jack; Choi, Jin; Xu, Frank; Resnick, Douglas J.; Sreenivasan, S. V.
2012-03-01
The ability to pattern materials at the nanoscale can enable a variety of applications ranging from high density data storage, displays, photonic devices and CMOS integrated circuits to emerging applications in the biomedical and energy sectors. These applications require varying levels of pattern control, short and long range order, and have varying cost tolerances. Extremely large area R2R manufacturing on flexible substrates is ubiquitous for applications such as paper and plastic processing. It combines the benefits of high speed and inexpensive substrates to deliver a commodity product at low cost. The challenge is to extend this approach to the realm of nanopatterning and realize similar benefits. The cost of manufacturing is typically driven by speed (or throughput), tool complexity, cost of consumables (materials used, mold or master cost, etc.), substrate cost, and the downstream processing required (annealing, deposition, etching, etc.). In order to achieve low cost nanopatterning, it is imperative to move towards high speed imprinting, less complex tools, near zero waste of consumables and low cost substrates. The Jet and Flash Imprint Lithography (J-FIL™) process uses drop dispensing of UV curable resists to assist high resolution patterning for subsequent dry etch pattern transfer. The technology is actively being used to develop solutions for memory markets including Flash memory and patterned media for hard disk drives. In this paper we address the key challenges for roll based nanopatterning by introducing a novel concept: Ink Jet based Roll-to-Roll Nanopatterning. To address this challenge, we have introduced a J-FIL based demonstrator product, the LithoFlex 100. Topics that are discussed in the paper include tool design and process performance. In addition, we have used the LithoFlex 100 to fabricate high performance wire grid polarizers on flexible polycarbonate (PC) films. Transmission of better than 80% and extinction ratios on the order of 4500 have been achieved.
NASA Astrophysics Data System (ADS)
Saxena, Shefali; Hawari, Ayman I.
2017-07-01
Digital signal processing techniques have been widely used in radiation spectrometry to provide improved stability and performance in a compact physical size compared with traditional analog signal processing. In this paper, field-programmable gate array (FPGA)-based adaptive digital pulse shaping techniques are investigated for real-time signal processing. A National Instruments (NI) NI 5761 14-bit, 250-MS/s adapter module is used for digitizing the preamplifier pulses of a high-purity germanium (HPGe) detector. Digital pulse processing algorithms are implemented on the NI PXIe-7975R reconfigurable FPGA (Kintex-7) using the LabVIEW FPGA module. Based on the time separation between successive input pulses, the adaptive shaping algorithm selects the optimum shaping parameters (rise time and flat-top time of the trapezoid-shaping filter) for each incoming signal. A digital Sallen-Key low-pass filter is implemented to enhance the signal-to-noise ratio and reduce baseline drift in trapezoid shaping. A recursive trapezoid-shaping filter algorithm is employed for pole-zero compensation of the exponentially decaying (two decay constants) preamplifier pulses of the HPGe detector. It allows extraction of pulse height information at the beginning of each pulse, thereby reducing pulse pileup and increasing throughput. Algorithms for the RC-CR² timing filter, baseline restoration, pile-up rejection, and pulse height determination are implemented digitally for radiation spectroscopy. Traditionally, under high-count-rate conditions a shorter shaping time is preferred to achieve high throughput, which degrades energy resolution. In this paper, experimental results are presented for varying count-rate and pulse shaping conditions. Using adaptive shaping, increased throughput is achieved while preserving the energy resolution observed with longer shaping times.
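The core trapezoid-shaping idea can be illustrated offline: deconvolve the exponential decay (pole-zero cancellation) and convolve the result with two boxcars whose lengths set the rise and flat-top times. The sketch below is a batch NumPy analogue of the streaming recursive filter used on the FPGA; the pulse parameters (single decay constant, noise level) are illustrative.

```python
# Offline sketch of trapezoidal shaping of an exponentially decaying
# preamplifier pulse. The FPGA design described above uses a streaming,
# recursive form of the equivalent filter; a batch version is shown here
# with NumPy for clarity. Pulse parameters are illustrative.
import numpy as np

tau  = 2000.0          # decay constant in samples
k, m = 100, 40         # rise (and fall) length and flat-top length in samples
A    = 1.0             # true pulse amplitude

n = np.arange(4000)
q = np.exp(-1.0 / tau)
pulse = A * q**n                                               # preamp pulse
pulse += np.random.default_rng(0).normal(0, 0.001, n.size)     # electronic noise

# Pole-zero cancellation: deconvolve the exponential into (nearly) an impulse.
impulses = pulse.copy()
impulses[1:] -= q * pulse[:-1]

# Trapezoid = impulse stream convolved with two boxcars (lengths k and k+m),
# giving a rise of k samples, a flat top of m samples, and a fall of k samples.
shaped = np.convolve(np.convolve(impulses, np.ones(k)), np.ones(k + m))[: n.size]

amplitude_estimate = shaped.max() / k      # flat-top height is approximately A*k
print(f"estimated amplitude: {amplitude_estimate:.3f} (true {A})")
```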
Logares, Ramiro; Haverkamp, Thomas H A; Kumar, Surendra; Lanzén, Anders; Nederbragt, Alexander J; Quince, Christopher; Kauserud, Håvard
2012-10-01
The incursion of High-Throughput Sequencing (HTS) in environmental microbiology brings unique opportunities and challenges. HTS now allows a high-resolution exploration of the vast taxonomic and metabolic diversity present in the microbial world, which can provide an exceptional insight on global ecosystem functioning, ecological processes and evolution. This exploration has also economic potential, as we will have access to the evolutionary innovation present in microbial metabolisms, which could be used for biotechnological development. HTS is also challenging the research community, and the current bottleneck is present in the data analysis side. At the moment, researchers are in a sequence data deluge, with sequencing throughput advancing faster than the computer power needed for data analysis. However, new tools and approaches are being developed constantly and the whole process could be depicted as a fast co-evolution between sequencing technology, informatics and microbiologists. In this work, we examine the most popular and recently commercialized HTS platforms as well as bioinformatics methods for data handling and analysis used in microbial metagenomics. This non-exhaustive review is intended to serve as a broad state-of-the-art guide to researchers expanding into this rapidly evolving field. Copyright © 2012 Elsevier B.V. All rights reserved.
In Vitro Toxicity Screening Technique for Volatile Substances ...
In 2007 the National Research Council envisioned the need for inexpensive, high throughput, cell based toxicity testing methods relevant to human health. High Throughput Screening (HTS) in vitro screening approaches have addressed these problems by using robotics. However the challenge is that many of these chemicals are volatile and not amenable to HTS robotic liquid handling applications. We assembled an in vitro cell culture apparatus capable of screening volatile chemicals for toxicity with potential for miniaturization for high throughput. BEAS-2B lung cells were grown in an enclosed culture apparatus under air-liquid interface (ALI) conditions, and exposed to an array of xenobiotics in 5% CO2. Use of ALI conditions allows direct contact of cells with a gas xenobiotic, as well as release of endogenous gaseous molecules without interference by medium on the apical surface. To identify potential xenobiotic-induced perturbations in cell homeostasis, we monitored for alterations of endogenously-produced gaseous molecules in air directly above the cells, termed “headspace”. Alterations in specific endogenously-produced gaseous molecules (e.g., signaling molecules nitric oxide (NO) and carbon monoxide (CO) in headspace is indicative of xenobiotic-induced perturbations of specific cellular processes. Additionally, endogenously produced volatile organic compounds (VOCs) may be monitored in a nonspecific, discovery manner to determine whether cell processes are
Nobrega, R Paul; Brown, Michael; Williams, Cody; Sumner, Chris; Estep, Patricia; Caffry, Isabelle; Yu, Yao; Lynaugh, Heather; Burnina, Irina; Lilov, Asparouh; Desroches, Jordan; Bukowski, John; Sun, Tingwan; Belk, Jonathan P; Johnson, Kirt; Xu, Yingda
2017-10-01
The state-of-the-art industrial drug discovery approach is the empirical interrogation of a library of drug candidates against a target molecule. The advantage of high-throughput kinetic measurements over equilibrium assessments is the ability to measure each of the kinetic components of binding affinity. Although high-throughput capabilities have improved with advances in instrument hardware, three bottlenecks in data processing remain: (1) intrinsic molecular properties that lead to poor biophysical quality in vitro are not accounted for in commercially available analysis models, (2) processing data through a user interface is time-consuming and not amenable to parallelized data collection, and (3) a commercial solution that includes historical kinetic data in the analysis of kinetic competition data does not exist. Herein, we describe a generally applicable method for the automated analysis, storage, and retrieval of kinetic binding data. This analysis can deconvolve poor quality data on-the-fly and store and organize historical data in a queryable format for use in future analyses. Such database-centric strategies afford greater insight into the molecular mechanisms of kinetic competition, allowing for the rapid identification of allosteric effectors and the presentation of kinetic competition data in absolute terms of percent bound to antigen on the biosensor.
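As an illustration of the kind of per-sensorgram fit such a pipeline automates, the sketch below extracts koff from a simulated dissociation phase and kon from the observed association rate of a 1:1 binding model; the data and parameters are simulated, not taken from any instrument or from the system described above.

```python
# Minimal sketch of extracting kinetic rate constants from a simulated 1:1
# binding sensorgram (association then dissociation). All values are invented.
import numpy as np
from scipy.optimize import curve_fit

kon_true, koff_true, rmax, conc = 1e5, 1e-3, 1.0, 50e-9   # 1/(M s), 1/s, RU, M
t = np.linspace(0, 600, 200)                               # seconds, each phase

kobs = kon_true * conc + koff_true
r_eq = rmax * kon_true * conc / kobs
rng = np.random.default_rng(0)
assoc  = r_eq * (1 - np.exp(-kobs * t)) + rng.normal(0, 0.005, t.size)
dissoc = assoc[-1] * np.exp(-koff_true * t) + rng.normal(0, 0.005, t.size)

# Fit the dissociation phase for koff, then the association phase for kobs,
# and recover kon from kobs = kon*C + koff.
(r0_fit, koff_fit), _ = curve_fit(lambda x, r0, koff: r0 * np.exp(-koff * x),
                                  t, dissoc, p0=[0.5, 1e-3])
(req_fit, kobs_fit), _ = curve_fit(lambda x, req, kobs_: req * (1 - np.exp(-kobs_ * x)),
                                   t, assoc, p0=[0.8, 5e-3])
kon_fit = (kobs_fit - koff_fit) / conc
print(f"koff = {koff_fit:.2e} 1/s, kon = {kon_fit:.2e} 1/(M s), "
      f"KD = {koff_fit / kon_fit:.2e} M")
```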
GPU Lossless Hyperspectral Data Compression System
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh I.; Keymeulen, Didier; Kiely, Aaron B.; Klimesh, Matthew A.
2014-01-01
Hyperspectral imaging systems onboard aircraft or spacecraft can acquire large amounts of data, putting a strain on limited downlink and storage resources. Onboard data compression can mitigate this problem but may require a system capable of a high throughput. In order to achieve a high throughput with a software compressor, a graphics processing unit (GPU) implementation of a compressor was developed targeting the current state-of-the-art GPUs from NVIDIA(R). The implementation is based on the fast lossless (FL) compression algorithm reported in "Fast Lossless Compression of Multispectral-Image Data" (NPO- 42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which operates on hyperspectral data and achieves excellent compression performance while having low complexity. The FL compressor uses an adaptive filtering method and achieves state-of-the-art performance in both compression effectiveness and low complexity. The new Consultative Committee for Space Data Systems (CCSDS) Standard for Lossless Multispectral & Hyperspectral image compression (CCSDS 123) is based on the FL compressor. The software makes use of the highly-parallel processing capability of GPUs to achieve a throughput at least six times higher than that of a software implementation running on a single-core CPU. This implementation provides a practical real-time solution for compression of data from airborne hyperspectral instruments.
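The gain from predictive compression can be illustrated with a toy band-to-band predictor, as sketched below: residuals after prediction have much lower empirical entropy than the raw samples. This is a simplification for illustration only, not the FL or CCSDS-123 algorithm.

```python
# Toy illustration of the idea underlying predictive lossless hyperspectral
# compression: predict each band from the previous one and encode only the
# residuals, which have far lower entropy than the raw samples.
import numpy as np

def entropy_bits(x):
    """Empirical entropy (bits/sample) of an integer array."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
bands, h, w = 16, 64, 64
base = rng.integers(0, 1024, size=(h, w))
cube = np.stack([(base * (1 + 0.02 * b) + rng.integers(-8, 9, size=(h, w))).astype(np.int32)
                 for b in range(bands)])              # spectrally correlated cube

raw_bits = entropy_bits(cube)
residuals = []
for b in range(1, bands):
    prev, cur = cube[b - 1].ravel(), cube[b].ravel()
    gain = np.dot(prev, cur) / np.dot(prev, prev)     # least-squares band gain
    residuals.append(np.rint(cur - gain * prev).astype(np.int32))
res_bits = entropy_bits(np.concatenate(residuals))

print(f"raw entropy      : {raw_bits:.2f} bits/sample")
print(f"residual entropy : {res_bits:.2f} bits/sample")
```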
Sources of PCR-induced distortions in high-throughput sequencing data sets
Kebschull, Justus M.; Zador, Anthony M.
2015-01-01
PCR permits the exponential and sequence-specific amplification of DNA, even from minute starting quantities. PCR is a fundamental step in preparing DNA samples for high-throughput sequencing. However, there are errors associated with PCR-mediated amplification. Here we examine the effects of four important sources of error—bias, stochasticity, template switches and polymerase errors—on sequence representation in low-input next-generation sequencing libraries. We designed a pool of diverse PCR amplicons with a defined structure, and then used Illumina sequencing to search for signatures of each process. We further developed quantitative models for each process, and compared predictions of these models to our experimental data. We find that PCR stochasticity is the major force skewing sequence representation after amplification of a pool of unique DNA amplicons. Polymerase errors become very common in later cycles of PCR but have little impact on the overall sequence distribution as they are confined to small copy numbers. PCR template switches are rare and confined to low copy numbers. Our results provide a theoretical basis for removing distortions from high-throughput sequencing data. In addition, our findings on PCR stochasticity will have particular relevance to quantification of results from single cell sequencing, in which sequences are represented by only one or a few molecules. PMID:26187991
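The dominant role of PCR stochasticity can be reproduced with a simple branching-process simulation, sketched below, in which every molecule is duplicated each cycle with probability equal to the per-cycle efficiency; the parameters are illustrative, not those of the study.

```python
# Simple simulation of PCR stochasticity: each copy of an amplicon is
# duplicated in each cycle with probability equal to the per-cycle efficiency,
# so amplicons that start as single molecules drift apart in copy number
# (a Galton-Watson branching process). Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_amplicons, cycles, efficiency = 10000, 20, 0.8

copies = np.ones(n_amplicons, dtype=np.int64)
for _ in range(cycles):
    # Each existing copy independently gains one duplicate with prob `efficiency`.
    copies += rng.binomial(copies, efficiency)

freq = copies / copies.sum()
cv = freq.std() / freq.mean()
print(f"mean copies per amplicon: {copies.mean():.0f}")
print(f"coefficient of variation of final representation: {cv:.2f}")
```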
Key composition optimization of meat processed protein source by vacuum freeze-drying technology.
Ma, Yan; Wu, Xingzhuang; Zhang, Qi; Giovanni, Vigna; Meng, Xianjun
2018-05-01
Vacuum freeze-drying is a knowledge-intensive technology in the field of drying: it requires the most complex equipment, consumes the most energy, and is the most expensive drying method. Because of the special character of the dried product, however, freeze-dried foods rehydrate well, retain their colour and lustre, lose few nutrients, are light and easy to transport, and can be preserved for long periods, so their quality is clearly superior to that of other dried foods; this has made freeze-drying a frontier of drying technology research and development. The freeze-drying processes of a Chinese-style ham and a Western-style tenderloin are studied in this paper. Their eutectic points, melting points and collapse temperatures, freeze-drying curves, and heat and mass transfer characteristics were obtained, from which the pre-cooling temperature and the maximum allowable temperature of the sublimation interface were determined. The effect of chamber pressure on the freeze-drying rate is discussed, and a cyclic pressure-regulation method is established.
Towards roll-to-roll manufacturing of polymer photonic devices
NASA Astrophysics Data System (ADS)
Subbaraman, Harish; Lin, Xiaohui; Ling, Tao; Guo, L. Jay; Chen, Ray T.
2014-03-01
Traditionally, polymer photonic devices are fabricated using clean-room processes such as photolithography, e-beam lithography, reactive ion etching (RIE) and lift-off methods, which lead to long fabrication times, low throughput and high cost. We have utilized a novel process for fabricating polymer photonic devices using a combination of imprinting and ink-jet printing methods, which provides high throughput on a variety of rigid and flexible substrates at low cost. We discuss the manufacturing challenges that need to be overcome in order to realize true roll-to-roll manufacturing of flexible polymer photonic systems. Several metrology and instrumentation challenges need to be addressed and overcome, including the availability of particulate-free, high-quality substrates; the development and implementation of high-speed in-line and off-line inspection and diagnostic tools with adaptive control for patterned and unpatterned material films; and the development of reliable hardware. Because the resolution requirements are extreme compared to print media, the burden of software and hardware tools on throughput also needs to be carefully determined. Moreover, the effects of web wander and variations in web speed must be accurately accounted for in the design of the system hardware and software. In this paper, we present solutions to a few of these challenges and use them to develop a high-rate R2R dual-stage ink-jet printer that provides alignment accuracy of <10 μm at a web speed of 5 m/min. The development of a roll-to-roll manufacturing system for polymer photonic systems opens limitless possibilities for the deployment of high-performance components in a variety of applications including communication, sensing, medicine, agriculture, energy and lighting.
Luan, Peng; Lee, Sophia; Paluch, Maciej; Kansopon, Joe; Viajar, Sharon; Begum, Zahira; Chiang, Nancy; Nakamura, Gerald; Hass, Philip E.; Wong, Athena W.; Lazar, Greg A.
2018-01-01
ABSTRACT To rapidly find “best-in-class” antibody therapeutics, it has become essential to develop high throughput (HTP) processes that allow rapid assessment of antibodies for functional and molecular properties. Consequently, it is critical to have access to sufficient amounts of high quality antibody, to carry out accurate and quantitative characterization. We have developed automated workflows using liquid handling systems to conduct affinity-based purification either in batch or tip column mode. Here, we demonstrate the capability to purify >2000 antibodies per day from microscale (1 mL) cultures. Our optimized, automated process for human IgG1 purification using MabSelect SuRe resin achieves ∼70% recovery over a wide range of antibody loads, up to 500 µg. This HTP process works well for hybridoma-derived antibodies that can be purified by MabSelect SuRe resin. For rat IgG2a, which is often encountered in hybridoma cultures and is challenging to purify via an HTP process, we established automated purification with GammaBind Plus resin. Using these HTP purification processes, we can efficiently recover sufficient amounts of antibodies from mammalian transient or hybridoma cultures with quality comparable to conventional column purification. PMID:29494273
Langsrud, S; Moen, B; Møretrø, T; Løype, M; Heir, E
2016-02-01
The microbiota surviving sanitation of salmon-processing conveyor belts was identified and its growth dynamics further investigated in a model mimicking processing surfaces in such plants. A diverse microbiota dominated by Gram-negative bacteria was isolated after regular sanitation in three salmon-processing plants. A cocktail of 14 bacterial isolates representing all genera isolated from conveyor belts (Listeria, Pseudomonas, Stenotrophomonas, Brochothrix, Serratia, Acinetobacter, Rhodococcus and Chryseobacterium) formed stable biofilms on steel coupons (12°C, salmon broth) of about 10(9) CFU cm(-2) after 2 days. High-throughput sequencing showed that Listeria monocytogenes represented 0·1-0·01% of the biofilm population and that Pseudomonas spp. dominated. Interestingly, both a Brochothrix sp. and a Pseudomonas sp. dominated in the surrounding suspension. The microbiota surviving sanitation is dominated by Pseudomonas spp. The background microbiota in biofilms inhibits, but does not eliminate, L. monocytogenes. The results highlight that sanitation procedures have to be improved in the salmon-processing industry, as high numbers of a diverse microbiota survived practical sanitation. High-throughput sequencing enables strain-level studies of population dynamics in biofilms. © 2015 The Society for Applied Microbiology.
Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas
2014-01-01
A quantitative characterization of root system architecture is currently being attempted for various reasons. Non-destructive, rapid analyses of root system architecture are difficult to perform due to the hidden nature of the root. Hence, improved methods to measure root architecture are necessary to support knowledge-based plant breeding and to analyse root growth responses to environmental changes. Here, we report on the development of a novel method to reveal the growth and architecture of maize root systems. The method is based on the cultivation of different root types within several layers of two-dimensional, large (50 × 60 cm) plates (rhizoslides). A central plexiglass screen stabilizes the system and is covered on both sides with germination paper providing water and nutrients for the developing roots, followed by a transparent cover foil to prevent the roots from drying out and to stabilize the system. The embryonic roots grow hidden between the plexiglass surface and the paper, whereas crown roots grow visibly between the paper and the transparent cover. Cultivation for up to 20 days (four fully developed leaves) with good image quality was achieved by suppressing fungi with a fungicide. Based on hyperspectral microscopy imaging, the quality of different germination papers was tested, and three provided sufficient contrast to distinguish between roots and background (segmentation). Illumination, image acquisition and segmentation were optimised to facilitate efficient root image analysis. Several software packages were evaluated with regard to their precision and the time investment needed to measure root system architecture. The software 'Smart Root' allowed precise evaluation of root development but needed substantial user intervention. 'GiaRoots' provided the best segmentation method for batch processing in combination with a good analysis of global root characteristics but overestimated root length due to thinning artefacts. 'WhinRhizo' offered the most rapid and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedling stages. However, automation of the scanning process and appropriate software remain the bottleneck for high-throughput analysis.
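Segmentation of roots from the germination-paper background is named above as a key optimisation step. The following is a minimal sketch of one common baseline, global Otsu thresholding on a synthetic grayscale image; it is not the pipeline or software used in the study.

import numpy as np

def otsu_threshold(gray):
    """Return the 8-bit threshold that maximizes between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Synthetic example: darker paper background with a brighter root-like stripe.
rng = np.random.default_rng(0)
img = rng.normal(80, 10, (200, 200))
img[:, 95:105] += 60
img = np.clip(img, 0, 255).astype(np.uint8)
t = otsu_threshold(img)
root_mask = img > t
print(f"threshold = {t}, root pixels = {root_mask.sum()}")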
Optimising the Encapsulation of an Aqueous Bitter Melon Extract by Spray-Drying
Tan, Sing Pei; Kha, Tuyen Chan; Parks, Sophie; Stathopoulos, Costas; Roach, Paul D.
2015-01-01
Our aim was to optimise the encapsulation of an aqueous bitter melon extract by spray-drying with maltodextrin (MD) and gum Arabic (GA). The response surface methodology models accurately predicted the process yield and retentions of bioactive concentrations and activity (R2 > 0.87). The optimal formulation was predicted and validated as 35% (w/w) stock solution (MD:GA, 1:1) and a ratio of 1.5:1 g/g of the extract to the stock solution. The spray-dried powder had a high process yield (66.2% ± 9.4%) and high retention (>79.5% ± 8.4%) and the quality of the powder was high. Therefore, the bitter melon extract was well encapsulated into a powder using MD/GA and spray-drying. PMID:28231214
Sanaie, Nooshafarin; Cecchini, Douglas; Pieracci, John
2012-10-01
Micro-scale chromatography formats are becoming more routinely used in purification process development because of their ability to rapidly screen a large number of process conditions at a time with minimal material. Given the usual constraints on development timelines and resources, these systems can provide a means to maximize process knowledge and process robustness compared to traditional packed-column formats. In this work, a high-throughput, 96-well filter plate format was used in the development of the cation exchange and hydrophobic interaction chromatography steps of a purification process designed to alter the glycoform distribution of a small protein. The significant input parameters affecting process performance were rapidly identified for both steps and preliminary operating conditions were identified. These ranges were verified in a packed chromatography column in order to assess the ability of the 96-well plate to predict packed-column performance. In both steps, the 96-well plate format consistently underestimated glycoform-enrichment levels and overestimated product recovery rates compared to the column-based approach. These studies demonstrate that the plate format can be used as a screening tool to narrow the operating ranges prior to further optimization on packed chromatography columns. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Lopez, Carlos; Watanabe, Takaichi; Cabral, Joao; Graham, Peter; Porcar, Lionel; Martel, Anne
2014-03-01
The coupling of microfluidics and small angle neutron scattering (SANS) is successfully demonstrated for the first time. We have developed novel microdevices with suitably low SANS background and high pressure compatibility for the investigation of flow-induced phenomena and high-throughput phase mapping of complex fluids. We successfully obtained scattering profiles from 50 micron channels with acquisition times of 10-100 s. The microfluidic geometry enables the variation of both flow type and magnitude, beyond traditional rheo-SANS setups, and is exceptionally well-suited for complex fluids due to the commensurability of the relevant time and lengthscales. We demonstrate our approach by studying model flow-responsive systems, including surfactant/co-surfactant/water mixtures with well-known equilibrium phase behaviour: sodium dodecyl sulfate (SDS)/octanol/brine, cetyltrimethyl ammonium chloride (C16TAC)/pentanol/water and a model microemulsion system (C10E4/decane/D2O), as well as polyelectrolyte solutions. Finally, using an online micromixer we are able to implement a high-throughput approach, scanning in excess of 10 scattering profiles/min for a continuous aqueous surfactant dilution over two decades in concentration.
NASA Astrophysics Data System (ADS)
Lashkov, V. A.; Levashko, E. I.; Safin, R. G.
2006-05-01
The heat and mass transfer in the process of drying of high-humidity materials by their depressurization has been investigated. The results of experimental investigation and mathematical simulation of the indicated process are presented. They allow one to determine the regularities of this process and predict the quality of the finished product. A technological scheme and an engineering procedure for calculating the drying of the liquid base of a soap are presented.
Haser, Abbe; Cao, Tu; Lubach, Joe; Listro, Tony; Acquarulo, Larry; Zhang, Feng
2017-05-01
Our hypothesis is that melt extrusion is a more suitable processing method than spray drying to prepare amorphous solid dispersions of drugs with a high crystallization tendency. Naproxen-povidone K25 was used as the model system in this study. Naproxen-povidone K25 solid dispersions at 30% and 60% drug loadings were characterized by modulated DSC, powder X-ray diffraction, FT-IR, and solid-state 13C NMR to identify phase separation and drug recrystallization during processing and storage. At 30% drug loading, hydrogen bond (H-bond) sites of povidone K25 were not saturated and the glass transition temperature (Tg) of the formulation was higher. As a result, both melt-extruded and spray-dried materials were amorphous initially and remained so after storage at 40°C. At 60% drug loading, H-bond sites were saturated, and Tg was low. We were not able to prepare amorphous materials. The initial crystallinity of the formulations was 0.4%±0.2% and 5.6%±0.6%, and increased to 2.7%±0.3% and 21.6%±1.0% for melt-extruded and spray-dried materials, respectively. Spray-dried material was more susceptible to re-crystallization during processing, due to the high diffusivity of naproxen molecules in the formulation matrix and lack of kinetic stabilization from polymer solution. A larger number of crystalline nucleation sites and high surface area made the spray-dried material more susceptible to recrystallization during storage. This study demonstrated the unique advantages of melt extrusion over spray drying for the preparation of amorphous solid dispersions of naproxen at high drug loading. Copyright © 2017 Elsevier B.V. All rights reserved.
Tumuluru, Jaya Shankar; Conner, Craig C.; Hoover, Amber N.
2016-01-01
A major challenge in the production of pellets is the high cost associated with drying biomass from 30 to 10% (w.b.) moisture content. At Idaho National Laboratory, a high-moisture pelleting process was developed to reduce the drying cost. In this process the biomass pellets are produced at higher feedstock moisture contents than conventional methods, and the high moisture pellets produced are further dried in energy efficient dryers. This process helps to reduce the feedstock moisture content by about 5-10% during pelleting, which is mainly due to frictional heat developed in the die. The objective of this research was to explore how binder addition influences the pellet quality and energy consumption of the high-moisture pelleting process in a flat die pellet mill. In the present study, raw corn stover was pelleted at moistures of 33, 36, and 39% (w.b.) by addition of 0, 2, and 4% pure corn starch. The partially dried pellets produced were further dried in a laboratory oven at 70 °C for 3-4 hr to lower the pellet moisture to less than 9% (w.b.). The high moisture and dried pellets were evaluated for their physical properties, such as bulk density and durability. The results indicated that increasing the binder percentage to 4% improved pellet durability and reduced the specific energy consumption by 20-40% compared to pellets with no binder. At higher binder addition (4%), the reduction in feedstock moisture during pelleting was <4%, whereas the reduction was about 7-8% without the binder. With 4% binder and 33% (w.b.) feedstock moisture content, the bulk density and durability values observed of the dried pellets were >510 kg/m3 and >98%, respectively, and the percent fine particles generated was reduced to <3%. PMID:27340875
Improved Protocols for Illumina Sequencing
Bronner, Iraad F.; Quail, Michael A.; Turner, Daniel J.; Swerdlow, Harold
2013-01-01
In this unit, we describe a set of improvements we have made to the standard Illumina protocols to make the sequencing process more reliable in a high-throughput environment, reduce amplification bias, narrow the distribution of insert sizes, and reliably obtain high yields of data. PMID:19582764
NASA Astrophysics Data System (ADS)
Lim, Jiseok; Vrignon, Jérémy; Gruner, Philipp; Karamitros, Christos S.; Konrad, Manfred; Baret, Jean-Christophe
2013-11-01
We demonstrate the use of a hybrid microfluidic-micro-optical system for the screening of enzymatic activity at the single cell level. Escherichia coli β-galactosidase activity is revealed by a fluorogenic assay in 100 pl droplets. Individual droplets containing cells are screened by measuring their fluorescence signal using a high-speed camera. The measurement is parallelized over 100 channels equipped with microlenses and analyzed by image processing. A reinjection rate of 1 ml of emulsion per minute was reached, corresponding to more than 10^5 droplets per second, an analytical throughput larger than that obtained using flow cytometry.
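As a minimal sketch of the signal reduction involved (synthetic data, not the authors' image-processing code), droplet events in a single channel's fluorescence trace can be counted by threshold crossings and extrapolated across the 100 parallel channels; the threshold and frame rate below are assumptions.

import numpy as np

def count_droplets(trace, threshold):
    """Count rising edges where the fluorescence signal crosses the threshold."""
    above = trace > threshold
    return int(np.flatnonzero(~above[:-1] & above[1:]).size)

rng = np.random.default_rng(0)
frame_rate = 10_000                                   # frames per second (assumed)
trace = rng.normal(0.05, 0.01, frame_rate)            # 1 s of baseline signal
droplet_frames = rng.choice(frame_rate, 800, replace=False)
trace[droplet_frames] += 1.0                          # 800 droplet events

n = count_droplets(trace, threshold=0.5)
print(f"{n} droplets detected in 1 s on this channel "
      f"(~{n * 100} droplets/s across 100 channels)")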
Analysis of High-Throughput ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Zangar, Richard C.
Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
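Quantitative ELISA analysis typically includes fitting a standard curve that relates measured signal to analyte concentration; the sketch below uses a four-parameter logistic (4PL) model, a common choice for this step, purely as an illustration and not as the ProMAT implementation.

import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic standard curve."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

# Synthetic standards: known concentrations and noisy measured signals.
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])
signal = four_pl(conc, 50, 3000, 8.0, -1.2)
signal += np.random.default_rng(1).normal(0, 30, conc.size)

params, _ = curve_fit(four_pl, conc, signal, p0=[50, 3000, 5, -1],
                      bounds=([0, 0, 1e-3, -10], [1e3, 1e4, 1e3, 10]))
bottom, top, ec50, hill = params

def back_calculate(y):
    """Invert the fitted curve to estimate concentration from a signal."""
    return ec50 * ((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill)

print(f"fitted EC50 = {ec50:.2f}; signal 1500 -> {back_calculate(1500):.2f} conc. units")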
High throughput screening technologies for ion channels
Yu, Hai-bo; Li, Min; Wang, Wei-ping; Wang, Xiao-liang
2016-01-01
Ion channels are involved in a variety of fundamental physiological processes, and their malfunction causes numerous human diseases. Therefore, ion channels represent a class of attractive drug targets and a class of important off-targets for in vitro pharmacological profiling. In the past decades, the rapid progress in developing functional assays and instrumentation has enabled high throughput screening (HTS) campaigns on an expanding list of channel types. Chronologically, HTS methods for ion channels include the ligand binding assay, flux-based assay, fluorescence-based assay, and automated electrophysiological assay. In this review we summarize the current HTS technologies for different ion channel classes and their applications. PMID:26657056
Quantitative high throughput screening identifies inhibitors of anthrax-induced cell death
Zhu, Ping Jun; Hobson, Peyton; Southall, Noel; Qiu, Cunping; Thomas, Craig J.; Lu, Jiamo; Inglese, James; Zheng, Wei; Leppla, Stephen H.; Bugge, Thomas H.; Austin, Christopher P.; Liu, Shihui
2009-01-01
Here, we report the results of a quantitative high-throughput screen (qHTS) measuring the endocytosis and translocation of a β-lactamase-fused-lethal factor and the identification of small molecules capable of obstructing the process of anthrax toxin internalization. Several small molecules protect RAW264.7 macrophages and CHO cells from anthrax lethal toxin and protected cells from an LF-Pseudomonas exotoxin fusion protein and diphtheria toxin. Further efforts demonstrated that these compounds impaired the PA heptamer pre-pore to pore conversion in cells expressing the CMG2 receptor, but not the related TEM8 receptor, indicating that these compounds likely interfere with toxin internalization. PMID:19540764
Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.
Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil
2015-07-17
In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.
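A back-of-envelope check of the sequencing scale involved: the plasmid size and per-run yield assumed below are illustrative guesses, not figures from the paper; only the construct count and coverage come from the abstract.

plasmid_size_bp = 10_000        # assumed average construct size
coverage = 15                   # coverage quoted in the abstract
n_plasmids = 4_000              # constructs per run quoted in the abstract
run_yield_gb = 15               # assumed approximate output of one MiSeq run

bases_needed = plasmid_size_bp * coverage * n_plasmids
print(f"data needed: {bases_needed / 1e9:.2f} Gb "
      f"({bases_needed / (run_yield_gb * 1e9):.0%} of the assumed run yield)")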
OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.
Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A
2016-01-01
High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.
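The end product of the workflow is a gene expression matrix (genes by samples). As a minimal illustration of that data structure, and not the OSG-GEM/Pegasus workflow itself, hypothetical per-sample count files could be merged into a matrix as follows.

import glob
import os
import pandas as pd

def build_gem(count_dir):
    """Merge hypothetical per-sample gene/count TSV files into one matrix."""
    columns = {}
    for path in sorted(glob.glob(os.path.join(count_dir, "*.counts.tsv"))):
        sample = os.path.basename(path).replace(".counts.tsv", "")
        counts = pd.read_csv(path, sep="\t", header=None,
                             names=["gene", sample], index_col="gene")
        columns[sample] = counts[sample]
    return pd.DataFrame(columns).fillna(0).astype(int)

# gem = build_gem("counts/")                        # hypothetical directory
# gem.to_csv("gene_expression_matrix.tsv", sep="\t")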
High-Throughput Platform for Synthesis of Melamine-Formaldehyde Microcapsules.
Çakir, Seda; Bauters, Erwin; Rivero, Guadalupe; Parasote, Tom; Paul, Johan; Du Prez, Filip E
2017-07-10
The synthesis of microcapsules via in situ polymerization is a labor-intensive and time-consuming process, where many composition and process factors affect the microcapsule formation and its morphology. Herein, we report a novel combinatorial technique for the preparation of melamine-formaldehyde microcapsules, using a custom-made and automated high-throughput platform (HTP). After performing validation experiments to ensure the accuracy and reproducibility of the novel platform, a design-of-experiments study was performed. The influence of different encapsulation parameters was investigated, such as the effect of the surfactant, surfactant type, surfactant concentration and core/shell ratio. As a result, this HTP platform is suitable for the synthesis of different types of microcapsules in an automated and controlled way, allowing the screening of different reaction parameters in a shorter time compared to manual synthetic techniques.
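The design-of-experiments study varies the encapsulation parameters named above; a sketch of enumerating a full-factorial design for dispatch to an automated platform is shown below, with the factor levels invented for illustration rather than taken from the study.

from itertools import product

factors = {
    "surfactant_type": ["SDS", "PVA"],            # assumed example surfactants
    "surfactant_conc_wt_pct": [0.5, 1.0, 2.0],    # assumed levels
    "core_shell_ratio": [1.0, 2.0, 3.0],          # assumed levels
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(f"{len(runs)} synthesis runs")              # 2 x 3 x 3 = 18
for i, run in enumerate(runs[:3], start=1):
    print(f"run {i}: {run}")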
Mapping of MPEG-4 decoding on a flexible architecture platform
NASA Astrophysics Data System (ADS)
van der Tol, Erik B.; Jaspers, Egbert G.
2001-12-01
In the field of consumer electronics, the advent of new features such as Internet, games, video conferencing, and mobile communication has triggered the convergence of television and computer technologies. This requires a generic media-processing platform that enables simultaneous execution of very diverse tasks such as high-throughput stream-oriented data processing and highly data-dependent irregular processing with complex control flows. As a representative application, this paper presents the mapping of a Main Visual profile MPEG-4 decoder for High-Definition (HD) video onto a flexible architecture platform. A stepwise approach is taken, going from the decoder application toward an implementation proposal. First, the application is decomposed into separate tasks with self-contained functionality, clear interfaces, and distinct characteristics. Next, a hardware-software partitioning is derived by analyzing the characteristics of each task such as the amount of inherent parallelism, the throughput requirements, the complexity of control processing, and the reuse potential over different applications and different systems. Finally, a feasible implementation is proposed that includes, among others, a very-long-instruction-word (VLIW) media processor, one or more RISC processors, and some dedicated processors. The mapping study of the MPEG-4 decoder proves the flexibility and extensibility of the media-processing platform. This platform enables an effective HW/SW co-design yielding a high performance density.
Fantuzzo, J. A.; Mirabella, V. R.; Zahn, J. D.
2017-01-01
Abstract Synapse formation analyses can be performed by imaging and quantifying fluorescent signals of synaptic markers. Traditionally, these analyses are done using simple or multiple thresholding and segmentation approaches or by labor-intensive manual analysis by a human observer. Here, we describe Intellicount, a high-throughput, fully-automated synapse quantification program which applies a novel machine learning (ML)-based image processing algorithm to systematically improve region of interest (ROI) identification over simple thresholding techniques. Through processing large datasets from both human and mouse neurons, we demonstrate that this approach allows image processing to proceed independently of carefully set thresholds, thus reducing the need for human intervention. As a result, this method can efficiently and accurately process large image datasets with minimal interaction by the experimenter, making it less prone to bias and less liable to human error. Furthermore, Intellicount is integrated into an intuitive graphical user interface (GUI) that provides a set of valuable features, including automated and multifunctional figure generation, routine statistical analyses, and the ability to run full datasets through nested folders, greatly expediting the data analysis process. PMID:29218324
Micro-patterned agarose gel devices for single-cell high-throughput microscopy of E. coli cells.
Priest, David G; Tanaka, Nobuyuki; Tanaka, Yo; Taniguchi, Yuichi
2017-12-21
High-throughput microscopy of bacterial cells has elucidated fundamental cellular processes including cellular heterogeneity and cell division homeostasis. Polydimethylsiloxane (PDMS)-based microfluidic devices provide advantages including precise positioning of cells and high throughput; however, device fabrication is time-consuming and requires specialised skills. Agarose pads are a popular alternative, but cells often clump together, which hinders single-cell quantitation. Here, we imprint agarose pads with micro-patterned 'capsules' to trap individual cells and 'lines' to direct cellular growth outwards in a straight line. We implement this micro-patterning into multi-pad devices called CapsuleHotel and LineHotel for high-throughput imaging. CapsuleHotel provides ~65,000 capsule structures per mm² that isolate individual Escherichia coli cells. In contrast, LineHotel provides ~300 line structures per mm that direct the growth of micro-colonies. With CapsuleHotel, a quantitative single-cell dataset of ~10,000 cells across 24 samples can be acquired and analysed in under 1 hour. LineHotel allows tracking the growth of >10 micro-colonies across 24 samples simultaneously for up to 4 generations. These easy-to-use devices can be provided in kit format, and will accelerate discoveries in diverse fields ranging from microbiology to systems and synthetic biology.
Yi, Jianyong; Zhou, Linyan; Bi, Jinfeng; Chen, Qinqin; Liu, Xuan; Wu, Xinye
2016-02-01
The effects of hot air drying (AD), freeze drying (FD), infrared drying (IR), microwave drying (MV), and vacuum drying (VD) as pre-drying treatments for explosion puff drying (EPD) on the quality of jackfruit chips were studied. The lowest total color differences (∆E) were found in the FD-, MV- and VD-EPD dried chips. A volume expansion effect (9.2%) was observed only in the FD-EPD dried chips, which corresponded to their well-expanded honeycomb microstructure and high rehydration rate. Compared with AD-, IR-, MV- and VD-EPD, the FD-EPD dried fruit chips exhibited lower hardness and higher crispness, indicative of a crispier texture. FD-EPD dried fruits also showed higher retentions of ascorbic acid, phenolics and carotenoids than the other puffed products. The results of sensory evaluation suggested that FD-EPD was the more beneficial combination because it enhanced the overall quality of the jackfruit chips. In conclusion, FD-EPD could be used as a novel combination drying method for processing valuable and/or high-quality fruit chips.
USDA-ARS?s Scientific Manuscript database
Plants are under continuous threat of infection by pathogens endowed with diverse strategies to colonize their host. Knowledge of plant susceptibility factors and the molecular processes involved in the infection process are critical for understanding plant-pathogen interactions. We used SuperSAGE t...
Drying characteristics of electro-osmosis dewatered sludge.
Ma, Degang; Qian, Jingjing; Zhu, Hongmin; Zhai, Jun
2016-12-01
Electro-osmotic dewatering (EDW) is one of the effective deep-dewatering technologies suitable for treating sludge with 55-80% moisture content. Considering EDW as a pre-treatment for drying or incineration, this article investigated the drying characteristics of electro-osmosis-dewatered sludge, including shear stress tests, drying curve analysis, model analysis, and energy balance calculations. After EDW pre-treatment, sludge adhesion was reduced. The sludge drying rate was higher than that of non-pre-treated sludge, especially at high temperatures (80-120°C). In addition, it is better to place the sludge cake with the cathode surface facing upward to improve the drying rate. An adjusted model based on the Logarithmic model could better describe the EDW sludge drying process. According to the energy balance calculation, EDW can save the energy consumed in sludge incineration and electricity generation and enable the system to run without extra energy input.
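The abstract reports that an adjusted Logarithmic model best described the drying curves; since the adjusted form is not given, the sketch below fits the standard logarithmic thin-layer drying model, MR = a*exp(-k*t) + c, to synthetic moisture-ratio data purely as an illustration.

import numpy as np
from scipy.optimize import curve_fit

def logarithmic_model(t, a, k, c):
    """Standard logarithmic thin-layer drying model."""
    return a * np.exp(-k * t) + c

t = np.linspace(0, 120, 25)                          # drying time in min (synthetic)
mr = logarithmic_model(t, 0.95, 0.04, 0.05)
mr += np.random.default_rng(2).normal(0, 0.01, t.size)

(a, k, c), _ = curve_fit(logarithmic_model, t, mr, p0=[1.0, 0.05, 0.0])
residuals = mr - logarithmic_model(t, a, k, c)
r2 = 1 - residuals.var() / mr.var()
print(f"a = {a:.3f}, k = {k:.3f} 1/min, c = {c:.3f}, R^2 = {r2:.4f}")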
NASA Astrophysics Data System (ADS)
Fang, Sheng-Po; Jao, PitFee; Senior, David E.; Kim, Kyoung-Tae; Yoon, Yong-Kyu
2017-12-01
High-throughput nanomanufacturing of photopatternable nanofibers and subsequent photopatterning is reported. For the production of high-density nanofibers, the tube nozzle electrospinning (TNE) process has been used, where an array of micronozzles on the sidewall of a plastic tube serves as spinnerets. By increasing the density of nozzles, the electric fields of adjacent nozzles confine the electrospinning cone and give a higher density of nanofibers. With TNE, higher-density nozzles are more easily achievable than with metallic nozzles; e.g., an inter-nozzle distance as small as 0.5 cm and an average semi-vertical repulsion angle of 12.28° for 8 nozzles were achieved. The nanofiber diameter distribution, mass throughput rate, and growth rate of nanofiber stacks under different operating conditions and with different numbers of nozzles (2, 4 and 8), as well as scalability with single- and double-tube configurations, are discussed. Nanofibers made of SU-8, a photopatternable epoxy, have been collected to a thickness of over 80 μm in 240 s of electrospinning, and a production rate of 0.75 g/h is achieved using the two-tube, 8-nozzle system, followed by photolithographic micropatterning. TNE is scalable to a large number of nozzles, and offers high-throughput production, plug-and-play capability with standard electrospinning equipment, and little waste of polymer.
A new fungal large subunit ribosomal RNA primer for high throughput sequencing surveys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, Rebecca C.; Gallegos-Graves, La Verne; Kuske, Cheryl R.
The inclusion of phylogenetic metrics in community ecology has provided insights into important ecological processes, particularly when combined with high-throughput sequencing methods; however, these approaches have not been widely used in studies of fungal communities relative to other microbial groups. Two obstacles have been considered: (1) the internal transcribed spacer (ITS) region has limited utility for constructing phylogenies and (2) most PCR primers that target the large subunit (LSU) ribosomal unit generate amplicons that exceed current limits of high-throughput sequencing platforms. We designed and tested a PCR primer (LR22R) to target an approximately 300–400 bp region of the D2 hypervariable region of the fungal LSU for use with the Illumina MiSeq platform. Both in silico and empirical analyses showed that the LR22R–LR3 pair captured a broad range of fungal taxonomic groups with a small fraction of non-fungal groups. Phylogenetic placement of publically available LSU D2 sequences showed broad agreement with taxonomic classification. Comparisons of the LSU D2 and the ITS2 ribosomal regions from environmental samples and known communities showed similar discriminatory abilities of the two primer sets. Altogether, these findings show that the LR22R–LR3 primer pair has utility for phylogenetic analyses of fungal communities using high-throughput sequencing methods.
NASA Astrophysics Data System (ADS)
Maier, A.; Schledjewski, R.
2016-07-01
For continuous manufacturing processes, mechanical preloading of the fibers occurs during the delivery of the fibers from the spool creel to the actual manufacturing process step. Moreover, preloading of the dry roving bundles might be mandatory, e.g. during winding, to be able to produce high-quality components. On the one hand, excessively high tensile loads within dry roving bundles might result in catastrophic failure; on the other hand, a part produced under too low a pre-tension might have low quality and poor mechanical properties. In this work, the load conditions influencing the mechanical properties of dry glass fiber bundles during continuous composite manufacturing processes were analyzed. Load conditions, i.e. fiber delivery speed, necessary pre-tension and other effects of the delivery system during continuous fiber winding, were chosen within process-typical ranges. First, the strain-rate dependency under static tensile load conditions was investigated. Furthermore, different free gauge lengths up to 1.2 m and interactions between fiber points of contact, with regard to the influence of sizing as well as impregnation, were tested, and the effect of twisting on the mechanical behavior of dry glass fiber bundles during fiber delivery was studied.
Cheng, Jerome; Hipp, Jason; Monaco, James; Lucas, David R; Madabhushi, Anant; Balis, Ulysses J
2011-01-01
Spatially invariant vector quantization (SIVQ) is a texture- and color-based image matching algorithm that queries the image space through the use of ring vectors. In prior studies, the selection of one or more optimal vectors for a particular feature of interest required a manual process, with the user initially stochastically selecting candidate vectors and subsequently testing them upon other regions of the image to verify the vector's sensitivity and specificity properties (typically by reviewing a resultant heat map). In carrying out the prior efforts, the SIVQ algorithm was noted to exhibit highly scalable computational properties, where each region of analysis can take place independently of others, making a compelling case for the exploration of its deployment on high-throughput computing platforms, with the hypothesis that such an exercise would result in performance gains that scale linearly with increasing processor count. An automated process was developed for the selection of optimal ring vectors to serve as the predicate matching operator in defining histopathological features of interest. Briefly, candidate vectors were generated from every possible coordinate origin within a user-defined vector selection area (VSA) and subsequently compared against user-identified positive and negative "ground truth" regions on the same image. Each vector from the VSA was assessed for its goodness-of-fit to both the positive and negative areas via the receiver operating characteristic (ROC) transfer function, with each assessment resulting in an associated area-under-the-curve (AUC) figure of merit. Use of the above-mentioned automated vector selection process was demonstrated in two use cases: first, to identify malignant colonic epithelium, and second, to identify soft tissue sarcoma. For both examples, a very satisfactory optimized vector was identified, as defined by the AUC metric. Finally, as an additional effort directed towards attaining high-throughput capability for the SIVQ algorithm, we demonstrated its successful incorporation with the MATrix LABoratory (MATLAB™) application interface. The SIVQ algorithm is suitable for automated vector selection settings and high-throughput computation.
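The selection metric described above, an AUC computed from a candidate vector's match scores on positive and negative ground-truth regions, can be sketched as follows with synthetic scores; this illustrates the ranking criterion only, not SIVQ itself.

import numpy as np

def auc(pos_scores, neg_scores):
    """ROC AUC via the rank-sum (Mann-Whitney) identity; assumes no ties."""
    scores = np.concatenate([pos_scores, neg_scores])
    ranks = scores.argsort().argsort() + 1            # 1-based ranks, ascending
    n_pos, n_neg = len(pos_scores), len(neg_scores)
    rank_sum_pos = ranks[:n_pos].sum()
    return (rank_sum_pos - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(3)
candidates = {f"vector_{i}": (rng.normal(0.6 + 0.05 * i, 0.1, 500),   # positives
                              rng.normal(0.5, 0.1, 500))              # negatives
              for i in range(5)}

ranked = sorted(((auc(p, n), name) for name, (p, n) in candidates.items()),
                reverse=True)
print("best candidate:", ranked[0][1], f"AUC = {ranked[0][0]:.3f}")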
High-Throughput, Adaptive FFT Architecture for FPGA-Based Spaceborne Data Processors
NASA Technical Reports Server (NTRS)
NguyenKobayashi, Kayla; Zheng, Jason X.; He, Yutao; Shah, Biren N.
2011-01-01
Exponential growth in microelectronics technology such as field-programmable gate arrays (FPGAs) has enabled high-performance spaceborne instruments with increasing onboard data processing capabilities. As a commonly used digital signal processing (DSP) building block, the fast Fourier transform (FFT) has been of great interest in onboard data processing applications, which need to strike a reasonable balance between high performance (throughput, block size, etc.) and low resource usage (power, silicon footprint, etc.). It is also desirable for a single design to be reusable and adaptable to instruments with different requirements. The Multi-Pass Wide-Kernel FFT (MPWK-FFT) architecture was developed, in which the high-throughput benefits of the parallel FFT structure and the low resource usage of Singleton's single-butterfly method are exploited. The result is a wide-kernel, multi-pass, adaptive FFT architecture. The 32K-point MPWK-FFT architecture includes 32 radix-2 butterflies, 64 FIFOs to store the real inputs, 64 FIFOs to store the imaginary inputs, complex twiddle-factor storage, and FIFO logic to route the outputs to the correct FIFO. The inputs are stored sequentially in the FIFOs, and the outputs of each butterfly are written sequentially first into the even FIFO, then the odd FIFO. Because of the order in which the outputs are written into the FIFOs, the depth of the even FIFOs (768 each) is 1.5 times that of the odd FIFOs (512 each). The total memory needed for data storage, assuming that each sample is 36 bits, is 2.95 Mbits. The twiddle factors are stored in internal ROM inside the FPGA for fast access; the total memory needed to store them is 589.9 kbits. This FFT structure combines the high throughput of parallel FFT kernels with the low resource usage of multi-pass FFT kernels and the desired adaptability. Space instrument missions that need onboard FFT capabilities, such as the proposed DESDynI, SWOT (Surface Water Ocean Topography), and Europa sounding radar missions, would greatly benefit from this technology, with significant reductions in non-recurring cost and risk.
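The quoted storage figures can be reproduced with simple arithmetic, assuming (as read from the description above) that the 64 even and 64 odd FIFOs span both the real and imaginary banks, that each stored sample is 36 bits, and that a 32K-point FFT keeps N/2 complex twiddle factors of 36 bits each.

sample_bits = 36
even_fifos, even_depth = 64, 768
odd_fifos, odd_depth = 64, 512

data_bits = (even_fifos * even_depth + odd_fifos * odd_depth) * sample_bits
print(f"data storage: {data_bits / 1e6:.2f} Mbit")        # ~2.95 Mbit

n = 32 * 1024
twiddle_bits = (n // 2) * sample_bits
print(f"twiddle storage: {twiddle_bits / 1e3:.1f} kbit")  # ~589.8 kbit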
Development of Low-cost, High Energy-per-unit-area Solar Cell Modules
NASA Technical Reports Server (NTRS)
Jones, G. T.; Chitre, S.; Rhee, S. S.
1978-01-01
The development of two hexagonal solar cell process sequences, a laser-scribing technique for scribing hexagonal and modified hexagonal solar cells, a large-throughput diffusion process, and two surface macrostructure processes suitable for large-scale production is reported. Experimental analyses were performed on automated spin-on anti-reflective coating equipment and high-pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of the optimum silicon utilization by modified hexagonal solar cells.
Listeria monocytogenes presence during fermentation, drying and storage of Petrovská klobása sausage
NASA Astrophysics Data System (ADS)
Janković, V.; Mitrović, R.; Lakićević, B.; Velebit, B.; Baltić, T.
2017-09-01
The majority of human listeriosis cases appear to be caused by consumption of ready-to-eat (RTE) foods contaminated at the time of consumption with high levels of Listeria monocytogenes. Although strategies to prevent growth of L. monocytogenes in RTE products are critical for reducing the incidence of human listeriosis, this pathogen is very difficult to control in fermented sausage processing environments due to its high tolerance to low pH and high salt concentration. The aims of the present study were to investigate the occurrence, presence and elimination of L. monocytogenes in Petrovská klobása sausage during processing, fermentation, drying and storage. L. monocytogenes, which was detected at the beginning of the production cycle, disappeared before day 30. The pathogen decline was much faster in sausages dried under controlled, industrial conditions than in those dried using the traditional household technique.
NASA Technical Reports Server (NTRS)
Chitre, S. R.
1978-01-01
The paper presents an experimentally developed surface macro-structuring process suitable for high-volume production of silicon solar cells. The process lends itself easily to automation for high throughput to meet low-cost solar array goals. The tetrahedron structures observed are 0.5-12 microns high. The surface has minimal pitting with virtually no or very few undeveloped areas across the surface. This process has been developed for (100)-oriented, as-cut silicon. Chemi-etched, hydrophobic and lapped surfaces were successfully texturized. A cost analysis as per SAMICS is presented.
High Throughput PBTK: Open-Source Data and Tools for ...
Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.
NASA Astrophysics Data System (ADS)
Saputra, Asep Handaya; Putri, Rizky Anggreini
2017-05-01
Water hyacinth is an aquatic weed whose very fast growth makes it a problem for the ecosystem. On the other hand, water hyacinth has a high fiber content (up to 20% by weight), which makes it a potential raw material for the composite and textile industries. As an aquatic plant, water hyacinth has a high initial moisture content, reaching more than 90%, whereas the moisture content of fiber used as a raw material for the composite and textile industries should not exceed 10% in order to maintain good product quality. Mixed adsorption drying is an innovative method that can replace conventional drying processes. Fluidization, which has commonly been used for drying agricultural and pharmaceutical products, can be enhanced by combining it with adsorption, as performed in this study. In the mixed fluidization-adsorption drying method, fly ash as adsorbent and water hyacinth fiber were placed together in the fluidization column, where the drying air evaporates the moisture in the fiber; in addition, the adsorbent adsorbs moisture from the drying air so that the humidity of the drying air remains low. The drying process was performed at various temperatures and compositions of water hyacinth and adsorbent in order to obtain the optimum drying condition. In addition, the effect of fly ash pellets versus fly ash powder on the drying process was also examined. The results show that a higher temperature and a larger amount of adsorbent result in a faster drying rate. Fly ash pellets show better adsorption since they have a smaller pore diameter and a larger surface area. The optimum temperature obtained from this study is 60°C and the optimum ratio of water hyacinth to fly ash is 50:50.
Shibata, Kazuhiro; Itoh, Masayoshi; Aizawa, Katsunori; Nagaoka, Sumiharu; Sasaki, Nobuya; Carninci, Piero; Konno, Hideaki; Akiyama, Junichi; Nishi, Katsuo; Kitsunai, Tokuji; Tashiro, Hideo; Itoh, Mari; Sumi, Noriko; Ishii, Yoshiyuki; Nakamura, Shin; Hazama, Makoto; Nishine, Tsutomu; Harada, Akira; Yamamoto, Rintaro; Matsumoto, Hiroyuki; Sakaguchi, Sumito; Ikegami, Takashi; Kashiwagi, Katsuya; Fujiwake, Syuji; Inoue, Kouji; Togawa, Yoshiyuki; Izawa, Masaki; Ohara, Eiji; Watahiki, Masanori; Yoneda, Yuko; Ishikawa, Tomokazu; Ozawa, Kaori; Tanaka, Takumi; Matsuura, Shuji; Kawai, Jun; Okazaki, Yasushi; Muramatsu, Masami; Inoue, Yorinao; Kira, Akira; Hayashizaki, Yoshihide
2000-01-01
The RIKEN high-throughput 384-format sequencing pipeline (RISA system) including a 384-multicapillary sequencer (the so-called RISA sequencer) was developed for the RIKEN mouse encyclopedia project. The RISA system consists of colony picking, template preparation, sequencing reaction, and the sequencing process. A novel high-throughput 384-format capillary sequencer system (RISA sequencer system) was developed for the sequencing process. This system consists of a 384-multicapillary auto sequencer (RISA sequencer), a 384-multicapillary array assembler (CAS), and a 384-multicapillary casting device. The RISA sequencer can simultaneously analyze 384 independent sequencing products. The optical system is a scanning system chosen after careful comparison with an image detection system for the simultaneous detection of the 384-capillary array. This scanning system can be used with any fluorescent-labeled sequencing reaction (chain termination reaction), including transcriptional sequencing based on RNA polymerase, which was originally developed by us, and cycle sequencing based on thermostable DNA polymerase. For long-read sequencing, 380 out of 384 sequences (99.2%) were successfully analyzed and the average read length, with more than 99% accuracy, was 654.4 bp. A single RISA sequencer can analyze 216 kb with >99% accuracy in 2.7 h (90 kb/h). For short-read sequencing to cluster the 3′ end and 5′ end sequencing by reading 350 bp, 384 samples can be analyzed in 1.5 h. We have also developed a RISA inoculator, RISA filtrator and densitometer, RISA plasmid preparator which can handle throughput of 40,000 samples in 17.5 h, and a high-throughput RISA thermal cycler which has four 384-well sites. The combination of these technologies allowed us to construct the RISA system consisting of 16 RISA sequencers, which can process 50,000 DNA samples per day. One haploid genome shotgun sequence of a higher organism, such as human, mouse, rat, domestic animals, and plants, can be revealed by seven RISA systems within one month. PMID:11076861
Robotic Patterning a Superhydrophobic Surface for Collective Cell Migration Screening.
Pang, Yonggang; Yang, Jing; Hui, Zhixin; Grottkau, Brian E
2018-04-01
Collective cell migration, in which cells migrate as a group, is fundamental in many biological and pathological processes. There is increasing interest in studying collective cell migration in high throughput. Cell scratching, insertion blockers, and gel-dissolving techniques are some of the methodologies used previously. However, these methods have the drawbacks of cell damage, substrate surface alteration, limitations in medium exchange, and solvent interference. The superhydrophobic surface, on which the water contact angle is greater than 150 degrees, has recently been utilized to generate patterned arrays. Independent cell culture areas can be generated on a substrate that functions the same as a conventional multiple-well plate. However, so far there has been no report on superhydrophobic patterning for the study of cell migration. In this study, we report the successful development of a robotically patterned superhydrophobic array for studying collective cell migration in high throughput. The array was developed on a rectangular single-well cell culture plate consisting of hydrophilic flat microwells separated by the superhydrophobic surface. The manufacturing process is robotic and includes patterning discrete protective masks onto the substrate using 3D printing, robotic spray coating of silica nanoparticles, robotic mask removal, robotic mini silicone blocker patterning, automatic cell seeding, and liquid handling. Compared with a standard 96-well plate, our system increases the throughput by 2.25-fold and generates a cell-free area in each well non-destructively. Our system also demonstrates higher efficiency than the conventional way of liquid handling using microwell plates, and a shorter processing time than manual operation in migration assays. The superhydrophobic surface had no negative impact on cell viability. Using our system, we studied the collective migration of human umbilical vein endothelial cells and cancer cells using assays of endpoint quantification, dynamic cell tracking, and migration quantification following varied drug treatments. This system provides a versatile platform to study collective cell migration in high throughput for a broad range of applications.
Automatic cassette to cassette radiant impulse processor
NASA Astrophysics Data System (ADS)
Sheets, Ronald E.
1985-01-01
Single-wafer rapid annealing using high-temperature isothermal processing has become increasingly popular in recent years. In addition to annealing, this process is also being investigated for silicide formation, passivation, glass reflow and alloying. Regardless of the application, there is a strong need to automate in order to maintain process control, repeatability, cleanliness and throughput. These requirements have been carefully addressed during the design and development of the Model 180 Radiant Impulse Processor, which is a totally automatic cassette-to-cassette wafer processing system. Process control and repeatability are maintained by a closed-loop optical pyrometer system which holds the wafer at the programmed temperature-time conditions. Programmed recipes containing up to 10 steps may be easily entered on the computer keyboard or loaded from a recipe library stored on a standard 5 1/4 in. floppy disk. Cold-wall heating chamber construction, a controlled environment (N2, Ar, forming gas) and quartz wafer carriers prevent contamination of the wafer during high-temperature processing. Throughputs of 150-240 wafers per hour are achieved by quickly heating the wafer to temperature (450-1400°C) in 3-6 s with a high-intensity, uniform (±1%) radiant flux of 100 W/cm2, a parallel wafer handling system and a wafer cool-down stage.
Pérez, Alberto J; González-Peña, Rolando J; Braga, Roberto; Perles, Ángel; Pérez-Marín, Eva; García-Diego, Fernando J
2018-01-11
Dynamic laser speckle (DLS) is used as a reliable sensor of activity for all types of materials. Traditional applications are based on high-rate captures (usually greater than 10 frames per second, fps). Even for drying processes in conservation treatments, where there is a high level of activity in the first moments after application and slower activity after some minutes or hours, the acquisition rate is conventionally kept the same in moments of high and low activity. In this work, we present an alternative approach to track the drying process of protective layers and other painting conservation processes that take a long time to reduce their levels of activity. We illuminate, using three lasers of different wavelengths, a temporary protector (cyclododecane) and a varnish, and monitor them using a low fps rate during long-term drying. The results are compared to the traditional method. This work also presents a monitoring method that uses portable equipment. The results demonstrate the feasibility of using the portable device and show the improved sensitivity of dynamic laser speckle when sensing the long-term drying of cyclododecane and varnish in conservation.
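One common way to quantify dynamic speckle activity from a stack of frames is a frame-difference index; the sketch below computes a mean absolute frame-to-frame difference on synthetic stacks as an illustration of the general approach, not of the processing used in the paper.

import numpy as np

def activity_index(stack):
    """Mean absolute intensity change between consecutive frames."""
    diffs = np.abs(np.diff(stack.astype(float), axis=0))
    return diffs.mean()

rng = np.random.default_rng(4)
wet = rng.integers(0, 256, (20, 64, 64))              # rapidly changing speckle
dry = (np.repeat(rng.integers(0, 256, (1, 64, 64)), 20, axis=0)
       + rng.integers(0, 5, (20, 64, 64)))            # nearly static speckle

print(f"activity while wet: {activity_index(wet):.1f}")
print(f"activity when dry:  {activity_index(dry):.1f}")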
NASA Astrophysics Data System (ADS)
Triyastuti, M. S.; Kumoro, A. C.; Djaeni, M.
2017-03-01
Roselle contains anthocyanins that have potential as a food colorant. Roselle extract is occasionally provided as a dry powder prepared at high temperature; in this case, the anthocyanin color degrades due to heat. Foam-mat drying with egg white is a potential method to speed up the drying process as well as minimize color degradation. This research aims to study the physical properties of roselle extract under foam-mat drying. As indicators, the powder size and color intensity were observed. The results showed that at high temperatures, roselle powder produced by foam-mat drying has a fine particle size with a porous structure. However, the higher the drying temperature, the lower the color retention.
Cardoso-Toset, F; Luque, I; Morales-Partera, A; Galán-Relaño, A; Barrero-Domínguez, B; Hernández, M; Gómez-Laguna, J
2017-02-01
Dry-cured hams, shoulders and loins of Iberian pigs are highly appreciated in national and international markets. Salting, additive addition and dehydration are the main strategies to produce these ready-to-eat products. Although the dry curing process is known to reduce the load of well-known food borne pathogens, studies evaluating the viability of other microorganisms in contaminated pork have not been performed. In this work, the efficacy of the dry curing process to eliminate three swine pathogens associated with pork carcass condemnation, Streptococcus suis, Streptococcus dysgalactiae and Trueperella pyogenes, was evaluated. Results of this study highlight that the dry curing process is a suitable method to obtain safe ready-to-eat products free of these microorganisms. Although salting of dry-cured shoulders had a moderate bactericidal effect, results of this study suggest that drying and ripening were the most important stages to obtain dry-cured products free of these microorganisms. Copyright © 2016 Elsevier Ltd. All rights reserved.
FBI Fingerprint Image Capture System High-Speed-Front-End throughput modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rathke, P.M.
1993-09-01
The Federal Bureau of Investigation (FBI) has undertaken a major modernization effort called the Integrated Automated Fingerprint Identification System (IAFIS). This system will provide centralized identification services using automated fingerprint, subject descriptor, mugshot, and document processing. A high-speed Fingerprint Image Capture System (FICS) is under development as part of the IAFIS program. The FICS will capture digital and microfilm images of FBI fingerprint cards for input into a central database. One FICS design supports two front-end scanning subsystems, known as the High-Speed Front End (HSFE) and the Low-Speed Front End, to supply image data to a common data processing subsystem. The production rate of the HSFE is critical to meeting the FBI's fingerprint card processing schedule. A model of the HSFE has been developed to help identify the issues driving the production rate, assist in the development of component specifications, and guide the evolution of an operations plan. A description of the model development is given, the assumptions are presented, and some HSFE throughput analysis is performed.
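A deterministic throughput estimate in the spirit of such a front-end model can be sketched as follows; every parameter value here is a hypothetical placeholder rather than a figure from the FICS/HSFE study.

def cards_per_shift(scan_s=2.0, handling_s=1.5, batch_size=50,
                    batch_reload_s=60.0, availability=0.85, shift_h=8):
    """Cards processed per shift from per-card cycle time and availability."""
    per_card_s = scan_s + handling_s + batch_reload_s / batch_size
    effective_s = shift_h * 3600 * availability
    return int(effective_s / per_card_s)

print(f"estimated throughput: {cards_per_shift()} cards per 8-hour shift")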
Huang, Yuhong; Willats, William G; Lange, Lene; Jin, Yanling; Fang, Yang; Salmeán, Armando A; Pedersen, Henriette L; Busk, Peter Kamp; Zhao, Hai
2016-01-01
Viscosity reduction has a great impact on the efficiency of ethanol production when using roots and tubers as feedstock. Plant cell wall-degrading enzymes have been successfully applied to overcome the challenges posed by high viscosity. However, the changes in cell wall polymers during the viscosity-reducing process are poorly characterized. Comprehensive microarray polymer profiling, which is a high-throughput microarray, was used for the first time to map changes in the cell wall polymers of sweet potato (Ipomoea batatas), cassava (Manihot esculenta), and Canna edulis Ker. over the entire viscosity-reducing process. The results indicated that the composition of cell wall polymers among these three roots and tubers was markedly different. The gel-like matrix and glycoprotein network in the C. edulis Ker. cell wall caused difficulty in viscosity reduction. The obvious viscosity reduction of the sweet potato and the cassava was attributed to the degradation of homogalacturonan and the released 1,4-β-d-galactan and 1,5-α-l-arabinan. © 2015 International Union of Biochemistry and Molecular Biology, Inc.
A bioinformatics roadmap for the human vaccines project.
Scheuermann, Richard H; Sinkovits, Robert S; Schenkelberg, Theodore; Koff, Wayne C
2017-06-01
Biomedical research has become a data intensive science in which high throughput experimentation is producing comprehensive data about biological systems at an ever-increasing pace. The Human Vaccines Project is a new public-private partnership, with the goal of accelerating development of improved vaccines and immunotherapies for global infectious diseases and cancers by decoding the human immune system. To achieve its mission, the Project is developing a Bioinformatics Hub as an open-source, multidisciplinary effort with the overarching goal of providing an enabling infrastructure to support the data processing, analysis and knowledge extraction procedures required to translate high throughput, high complexity human immunology research data into biomedical knowledge, to determine the core principles driving specific and durable protective immune responses.
Green, Anthony P; Turner, Nicholas J; O'Reilly, Elaine
2014-01-01
The widespread application of ω-transaminases as biocatalysts for chiral amine synthesis has been hampered by fundamental challenges, including unfavorable equilibrium positions and product inhibition. Herein, an efficient process that allows reactions to proceed in high conversion in the absence of by-product removal using only one equivalent of a diamine donor (ortho-xylylenediamine) is reported. This operationally simple method is compatible with the most widely used (R)- and (S)-selective ω-TAs and is particularly suitable for the conversion of substrates with unfavorable equilibrium positions (e.g., 1-indanone). Significantly, spontaneous polymerization of the isoindole by-product generates colored derivatives, providing a high-throughput screening platform to identify desired ω-TA activity. PMID:25138082
Dragas, Jelena; Jäckel, David; Hierlemann, Andreas; Franke, Felix
2017-01-01
Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction. PMID:25415989
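To make the bandwidth argument concrete (this is a generic sketch, not the referenced spike-sorting algorithm or its VLSI implementation), threshold-based detection that forwards only time stamps and short waveform cutouts might look as follows; the threshold factor and cutout length are assumed values:

```python
import numpy as np

def detect_spikes(signal, thresh_sd=5.0, cutout=32):
    """Threshold-crossing spike detection on one electrode channel.

    signal: 1-D numpy array of filtered samples. Returns spike time stamps
    (sample indices) and short waveform cutouts; transmitting only these,
    rather than the raw stream, is what reduces bandwidth and storage.
    """
    sigma = np.median(np.abs(signal)) / 0.6745        # robust noise estimate
    threshold = -thresh_sd * sigma                    # negative-going spikes
    crossings = np.flatnonzero((signal[1:] < threshold) & (signal[:-1] >= threshold)) + 1

    timestamps, waveforms = [], []
    last = -cutout
    for t in crossings:
        if t - last < cutout:                         # crude refractory handling
            continue
        if t + cutout <= len(signal):
            timestamps.append(t)
            waveforms.append(signal[t:t + cutout].copy())
            last = t
    return np.array(timestamps), np.array(waveforms)

# Synthetic example: Gaussian noise with two injected negative spikes
rng = np.random.default_rng(0)
x = rng.normal(0, 1, 5000)
x[1000:1010] -= 8
x[3000:3010] -= 8
ts, wf = detect_spikes(x)
print(ts)   # approximately [1000, 3000]
```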
Microscale Laminar Vortices for High-Purity Extraction and Release of Circulating Tumor Cells.
Hur, Soojung Claire; Che, James; Di Carlo, Dino
2017-01-01
Circulating tumor cells (CTCs) are disseminated tumor cells that reflect the tumors of origin and can provide a liquid biopsy that would potentially enable noninvasive tumor profiling, treatment monitoring, and identification of targeted treatments. Accurate and rapid purification of CTCs holds great potential to improve cancer care but the task remains technically challenging. Microfluidic isolation of CTCs within microscale vortices enables high-throughput and size-based purification of rare CTCs from bodily fluids. Collected cells are highly pure, viable, and easily accessible, allowing seamless integration with various downstream applications. Here, we describe how to fabricate the High-Throughput Vortex Chip (Vortex-HT) and to process diluted whole blood for CTC collection. Lastly, immunostaining and imaging protocols for CTC classification and corresponding CTC image galleries are reported.
Taylor, Jessica; Woodcock, Simon
2015-09-01
For more than a decade, RNA interference (RNAi) has brought about an entirely new approach to functional genomics screening. Enabling high-throughput loss-of-function (LOF) screens against the human genome, identifying new drug targets, and significantly advancing experimental biology, RNAi is a fast, flexible technology that is compatible with existing high-throughput systems and processes; however, the recent advent of clustered regularly interspaced short palindromic repeats (CRISPR)-Cas, a powerful new precise genome-editing (PGE) technology, has opened up vast possibilities for functional genomics. CRISPR-Cas is novel in its simplicity: one piece of easily engineered guide RNA (gRNA) is used to target a gene sequence, and Cas9 expression is required in the cells. The targeted double-strand break introduced by the gRNA-Cas9 complex is highly effective at removing gene expression compared to RNAi. Together with the reduced cost and complexity of CRISPR-Cas, there is the realistic opportunity to use PGE to screen for phenotypic effects in a total gene knockout background. This review summarizes the exciting development of CRISPR-Cas as a high-throughput screening tool, comparing its future potential to that of well-established RNAi screening techniques, and highlighting future challenges and opportunities within these disciplines. We conclude that the two technologies actually complement rather than compete with each other, enabling greater understanding of the genome in relation to drug discovery. © 2015 Society for Laboratory Automation and Screening.
Ghaemi, Reza; Selvaganapathy, Ponnambalam R
Drug discovery is a long and expensive process, which usually takes 12-15 years and could cost up to ~$1 billion. The conventional drug discovery process starts with high-throughput screening and selection of drug candidates that bind to a specific target associated with a disease condition. However, this process does not consider whether the chosen candidate is optimal not only for binding but also for ease of administration, distribution in the body, metabolism and any associated toxicity. A holistic approach, using model organisms early in the drug discovery process to select candidates that bind well, are suitable for administration and distribution, and are not toxic, is now considered a viable way of lowering the cost and time associated with drug discovery. However, conventional drug discovery assays using Drosophila are manual and require skilled operators, which makes them expensive and unsuitable for high-throughput screening. Recently, microfluidics has been used to automate many of the operations (e.g. sorting, positioning, drug delivery) associated with Drosophila drug discovery assays and thereby increase their throughput. This review highlights recent microfluidic devices that have been developed for Drosophila assays with primary application towards drug discovery for human diseases. The microfluidic devices reviewed in this paper are categorized based on the Drosophila life stage used. In each category, the microfluidic technologies behind each device are described and their potential biological applications are discussed.
Physico-chemical foundations underpinning microarray and next-generation sequencing experiments
Harrison, Andrew; Binder, Hans; Buhot, Arnaud; Burden, Conrad J.; Carlon, Enrico; Gibas, Cynthia; Gamble, Lara J.; Halperin, Avraham; Hooyberghs, Jef; Kreil, David P.; Levicky, Rastislav; Noble, Peter A.; Ott, Albrecht; Pettitt, B. Montgomery; Tautz, Diethard; Pozhitkov, Alexander E.
2013-01-01
Hybridization of nucleic acids on solid surfaces is a key process involved in high-throughput technologies such as microarrays and, in some cases, next-generation sequencing (NGS). A physical understanding of the hybridization process helps to determine the accuracy of these technologies. The goal of a widespread research program is to develop reliable transformations between the raw signals reported by the technologies and individual molecular concentrations from an ensemble of nucleic acids. This research has inputs from many areas, from bioinformatics and biostatistics, to theoretical and experimental biochemistry and biophysics, to computer simulations. A group of leading researchers met in Ploen, Germany, in 2011 to discuss present knowledge and limitations of our physico-chemical understanding of high-throughput nucleic acid technologies. This meeting inspired us to write this summary, which provides an overview of state-of-the-art, physico-chemically grounded approaches to modeling the hybridization of nucleic acids on solid surfaces. In addition, practical application of current knowledge is emphasized. PMID:23307556
NASA Astrophysics Data System (ADS)
Rohde, Christopher B.; Zeng, Fei; Gilleland, Cody; Samara, Chrysanthi; Yanik, Mehmet F.
2009-02-01
In recent years, the advantages of using small invertebrate animals as model systems for human disease have become increasingly apparent and have resulted in three Nobel Prizes in medicine or chemistry during the last six years for studies conducted on the nematode Caenorhabditis elegans (C. elegans). The availability of a wide array of species-specific genetic techniques, along with the transparency of the worm and its ability to grow in minute volumes make C. elegans an extremely powerful model organism. We present a suite of technologies for complex high-throughput whole-animal genetic and drug screens. We demonstrate a high-speed microfluidic sorter that can isolate and immobilize C. elegans in a well-defined geometry, an integrated chip containing individually addressable screening chambers for incubation and exposure of individual animals to biochemical compounds, and a device for delivery of compound libraries in standard multiwell plates to microfluidic devices. The immobilization stability obtained by these devices is comparable to that of chemical anesthesia and the immobilization process does not affect lifespan, progeny production, or other aspects of animal health. The high-stability enables the use of a variety of key optical techniques. We use this to demonstrate femtosecond-laser nanosurgery and three-dimensional multiphoton microscopy. Used alone or in various combinations these devices facilitate a variety of high-throughput assays using whole animals, including mutagenesis and RNAi and drug screens at subcellular resolution, as well as high-throughput high-precision manipulations such as femtosecond-laser nanosurgery for large-scale in vivo neural degeneration and regeneration studies.
Von Dreele, Robert B.; D'Amico, Kevin
2006-10-31
A process is provided for the high-throughput screening of the binding of ligands to macromolecules using high-resolution powder diffraction data. The process includes producing a first sample slurry of a selected polycrystalline macromolecule material and a solvent; producing a second sample slurry of the selected polycrystalline macromolecule material, one or more ligands, and the solvent; obtaining a high-resolution powder diffraction pattern for each of the first and second sample slurries; and comparing the two high-resolution powder diffraction patterns, whereby a difference between them provides a positive indication of the formation of a complex between the selected polycrystalline macromolecule material and at least one of the one or more ligands.
Payne, Matthew A; Miller, James B; Gellman, Andrew J
2016-09-12
Composition spread alloy films (CSAFs) are commonly used as libraries for high-throughput screening of composition-property relationships in multicomponent materials science. Because lateral gradients afford two degrees of freedom, an n-component CSAF can, in principle, contain any composition range falling on a continuous two-dimensional surface through an (n - 1)-dimensional composition space. However, depending on the complexity of the CSAF gradients, characterizing and graphically representing this composition range may not be straightforward when n ≥ 4. The standard approach for combinatorial studies performed using quaternary or higher-order CSAFs has been to use fixed stoichiometric ratios of one or more components to force the composition range to fall on some well-defined plane in the composition space. In this work, we explore the synthesis of quaternary Al-Fe-Ni-Cr CSAFs with a rotatable shadow mask CSAF deposition tool, in which none of the component ratios are fixed. On the basis of the unique gradient geometry produced by the tool, we show that the continuous quaternary composition range of the CSAF can be rigorously represented using a set of two-dimensional "pseudoternary" composition diagrams. We then perform a case study of (AlxFeyNi1-x-y)∼0.8Cr∼0.2 oxidation in dry air at 427 °C to demonstrate how such CSAFs can be used to screen an alloy property across a continuous two-dimensional subspace of a quaternary composition space. We identify a continuous boundary through the (AlxFeyNi1-x-y)∼0.8Cr∼0.2 subspace at which the oxygen uptake into the CSAF between 1 and 16 h oxidation time increases abruptly with decreasing Al content. The results are compared to a previous study of the oxidation of AlxFeyNi1-x-y CSAFs in dry air at 427 °C.
NASA Astrophysics Data System (ADS)
Ewers, B. E.; Pleban, J. R.; Aston, T.; Beverly, D.; Speckman, H. N.; Hosseini, A.; Bretfeld, M.; Edwards, C.; Yarkhunova, Y.; Weinig, C.; Mackay, D. S.
2017-12-01
Abiotic and biotic stresses reduce plant productivity, yet high-throughput characterization of plant responses across genotypes, species and stress conditions is limited by both instrumentation and data analysis techniques. Recent developments in chlorophyll a fluorescence measurement at leaf to landscape scales could improve our predictive understanding of plant responses to stressors. We analyzed the interaction of species and stress across two crop types, five gymnosperm and two angiosperm tree species from boreal and montane forests, grasses, forbs and shrubs from sagebrush steppe, and 30 tree species from seasonally wet tropical forest. We also analyzed chlorophyll fluorescence and gas exchange data from twelve Brassica rapa crop accessions and 120 recombinant inbred lines to investigate phenotypic responses to drought. These data represent more than 10,000 measurements of fluorescence and allow us to answer two questions: 1) are the measurements from high-throughput, handheld and drone-mounted instruments quantitatively similar to those from lower-throughput camera- and gas-exchange-mounted instruments, and 2) do the measurements detect differences in genotype, species and environmental stress on plants? We found through regression that the high- and low-throughput instruments agreed across both individual chlorophyll fluorescence components and calculated ratios, and were not different from a 1:1 relationship with correlation greater than 0.9. We used hierarchical Bayesian modeling to test the second question. We found a linear relationship between the fluorescence-derived quantum yield of PSII and the quantum yield of CO2 assimilation from gas exchange, with a slope of ca. 0.1, indicating that the efficiency of the entire photosynthetic process was about 10% of that of PSII across genotypes, species and drought stress. Posterior estimates of quantum yield revealed that drought-treatment, genotype and species differences were preserved when accounting for measurement uncertainty. High-throughput handheld or drone-based measurements of chlorophyll fluorescence provide high-quality, quantitative data that can be used not only to connect genotype to phenotype but also to quantify how vastly different plant species and genotypes respond to stress and change ecosystem productivity.
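For illustration only, the two statistical checks described (instrument agreement near a 1:1 line, and a slope of roughly 0.1 between CO2 and PSII quantum yields) can be reproduced on simulated data; the values below are synthetic, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# 1) Agreement between high- and low-throughput instruments (expect slope ~1, r > 0.9)
low = rng.uniform(0.2, 0.8, 200)                  # e.g. gas-exchange/camera system
high = low + rng.normal(0, 0.02, low.size)        # handheld or drone reading, same leaves
slope, intercept = np.polyfit(low, high, 1)
r = np.corrcoef(low, high)[0, 1]

# 2) Quantum yield of CO2 assimilation vs. quantum yield of PSII (expect slope ~0.1)
phi_psii = rng.uniform(0.1, 0.7, 200)
phi_co2 = 0.1 * phi_psii + rng.normal(0, 0.005, phi_psii.size)
slope_co2, _ = np.polyfit(phi_psii, phi_co2, 1)

print(round(slope, 2), round(r, 3), round(slope_co2, 2))
```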
Fan, Kai; Zhang, Min; Mujumdar, Arun S
2018-01-10
Microwave heating has been applied in the drying of high-value solids as it affords a number of advantages, including shorter drying time and better product quality. Freeze-drying at cryogenic temperature and extremely low pressure provides the advantage of high product quality, but at very high capital and operating costs due partly to very long drying time. Freeze-drying coupled with a microwave heat source speeds up the drying rate and yields good-quality products, provided the operating unit is designed and operated to avoid the development of hot spots. This review is a survey of recent developments in the modeling and experimental results on microwave-assisted freeze-drying (MFD) over the past decade. Owing to the high costs involved, so far all applications are limited to small-scale operations for the drying of high-value foods such as fruits and vegetables. In order to promote industrial-scale applications for a broader range of products, further research and development efforts are needed to offset the current limitations of the process. The needs and opportunities for future research and development are outlined.
Application of ToxCast High-Throughput Screening and ...
Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.
NASA Astrophysics Data System (ADS)
Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.
2012-12-01
Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.
Development of high-throughput assays for chemical screening and hazard identification is a pressing priority worldwide. One approach uses in vitro, cell-based assays which recapitulate biological events observed in vivo. Neurite outgrowth is one such critical cellular process un...
Air-stable ink for scalable, high-throughput layer deposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weil, Benjamin D; Connor, Stephen T; Cui, Yi
A method for producing and depositing air-stable, easily decomposable, vulcanized ink on any of a wide range of substrates is disclosed. The ink enables high-volume production of optoelectronic and/or electronic devices using scalable production methods, such as roll-to-roll transfer, fast rolling processes, and the like.
Rodríguez, Óscar; Eim, Valeria; Rosselló, Carmen; Femenia, Antoni; Cárcel, Juan A; Simal, Susana
2018-03-01
Drying gives rise to products with a long shelf life by reducing the water activity to a level that is sufficiently low to inhibit the growth of microorganisms, enzymatic reactions and other deteriorative reactions. Despite the benefits of this operation, the quality of heat-sensitive products is diminished when high temperatures are used. The use of low drying temperatures reduces the heat damage but, because of a longer drying time, oxidation reactions occur and a reduction of the quality is also observed. Thus, drying is a method that lends itself to being intensified. For this reason, alternative techniques are being studied. Power ultrasound is considered an emerging and promising technology in the food industry. The potential of this technology relies on its ability to accelerate the mass transfer processes in solid-liquid and solid-gas systems. Intensification of the drying process with power ultrasound can be achieved by modifying the product behavior during drying, using pre-treatments such as acoustically assisted soaking in a liquid medium, or by applying power ultrasound in the gaseous medium during the drying process itself. This review summarises the effects of the application of power ultrasound on the quality of different dried products, such as fruits and vegetables, when the acoustic energy is intended to intensify the drying process, whether it is applied as a pre-treatment or during drying. © 2017 Society of Chemical Industry.
Hu, Guangrong; Fan, Yong; Zhang, Lei; Yuan, Cheng; Wang, Jufang; Li, Wenjian; Hu, Qiang; Li, Fuli
2013-01-01
The unicellular green microalga Desmodesmus sp. S1 can produce more than 50% total lipid of cell dry weight under high light and nitrogen-limitation conditions. After irradiation with a heavy ¹²C⁶⁺ ion beam at 10, 30, 60, 90 or 120 Gy, followed by screening of the resulting mutants on 24-well microplates, more than 500 mutants were obtained. One of those, named D90G-19, exhibited a lipid productivity of 0.298 g L⁻¹ d⁻¹, 20.6% higher than the wild type, likely owing to an improved maximum quantum efficiency (Fv/Fm) of photosynthesis under stress. This work demonstrated that heavy-ion irradiation combined with high-throughput screening is an effective means for trait improvement. The resulting mutant D90G-19 may be used for enhanced lipid production.
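As a small worked sketch of the screening quantities mentioned (the numbers are hypothetical, chosen only to illustrate the reported scale), Fv/Fm and volumetric lipid productivity are computed as:

```python
def fv_over_fm(f0, fm):
    """Maximum quantum efficiency of PSII from dark-adapted fluorescence."""
    return (fm - f0) / fm

def lipid_productivity(lipid_end_g_per_l, lipid_start_g_per_l, days):
    """Volumetric lipid productivity in g per litre per day."""
    return (lipid_end_g_per_l - lipid_start_g_per_l) / days

# Hypothetical numbers, chosen only to illustrate the reported scale
print(round(fv_over_fm(f0=300, fm=1500), 2))          # 0.8
print(round(lipid_productivity(2.98, 0.0, 10), 3))    # 0.298 g/L/d
```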
Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister
2014-05-01
The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.
Parametric study on mixing process in an in-plane spiral micromixer utilizing chaotic advection.
Vatankhah, Parham; Shamloo, Amir
2018-08-31
Recent advances in the field of microfabrication have made the application of high-throughput microfluidics feasible. Mixing, which is an essential part of any miniaturized standalone system, remains the key challenge. This paper proposes a geometrically simple micromixer for efficient mixing in high-throughput microfluidic devices. The proposed micromixer utilizes a curved (spiral) microchannel to induce chaotic advection and enhance the mixing process. It is shown that the spiral microchannel is more efficient than a straight microchannel in terms of mixing. The pressure drop in the spiral microchannel is only slightly higher than that in the straight microchannel. Mixing in the spiral microchannel improves with increasing inlet velocity, unlike in the straight microchannel. The initial radius of the spiral microchannel also plays a prominent role in enhancing the mixing process. Among the cross sections studied, the square cross section yields the highest mixing quality. Copyright © 2018 Elsevier B.V. All rights reserved.
The JCSG high-throughput structural biology pipeline.
Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A
2010-10-01
The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.
High throughput DNA damage quantification of human tissue with home-based collection device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costes, Sylvain V.; Tang, Jonathan; Yannone, Steven M.
Kits, methods and systems for providing a service to provide a subject with information regarding the state of a subject's DNA damage. Collection, processing and analysis of samples are also described.
Thin-Film Material Science and Processing | Materials Science | NREL
A prime example of this research is thin-film photovoltaics (PV). Researchers have developed a quantitative high-throughput technique that can measure many barriers in parallel.
High-throughput electrophysiological assays for voltage gated ion channels using SyncroPatch 768PE.
Li, Tianbo; Lu, Gang; Chiang, Eugene Y; Chernov-Rogan, Tania; Grogan, Jane L; Chen, Jun
2017-01-01
Ion channels regulate a variety of physiological processes and represent an important class of drug target. Among the many methods of studying ion channel function, patch clamp electrophysiology is considered the gold standard, providing the ultimate precision and flexibility. However, its utility in ion channel drug discovery is impeded by low throughput. Additionally, characterization of endogenous ion channels in primary cells remains technically challenging. In recent years, many automated patch clamp (APC) platforms have been developed to overcome these challenges, albeit with varying throughput, data quality and success rate. In this study, we utilized the SyncroPatch 768PE, one of the latest-generation APC platforms, which conducts parallel recording from two 384-well modules with giga-seal data quality, to push these two boundaries. By optimizing various cell patching parameters and a two-step voltage protocol, we developed a high-throughput APC assay for the voltage-gated sodium channel Nav1.7. Testing the IC50 values of a group of Nav1.7 reference compounds showed this assay to be highly consistent with manual patch clamp (R > 0.9). In a pilot screening of 10,000 compounds, the success rate, defined by >500 MΩ seal resistance and >500 pA peak current, was 79%. The assay was robust, with a daily throughput of ~6,000 data points and a Z' factor of 0.72. Using the same platform, we also successfully recorded the endogenous voltage-gated potassium channel Kv1.3 in primary T cells. Together, our data suggest that the SyncroPatch 768PE provides a powerful platform for ion channel research and drug discovery.
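For context, IC50 values from such concentration-response data are conventionally obtained by fitting a Hill equation; a minimal sketch with hypothetical data (not the vendor's analysis software or the study's compounds) follows:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ic50, hill_slope):
    """Fraction of current remaining at a given blocker concentration."""
    return 1.0 / (1.0 + (conc / ic50) ** hill_slope)

# Hypothetical concentration-response data (uM, fraction of control current)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
resp = np.array([0.98, 0.95, 0.85, 0.60, 0.33, 0.12, 0.04])

(ic50, slope), _ = curve_fit(hill, conc, resp, p0=[0.5, 1.0])
print(f"IC50 ~ {ic50:.2f} uM, Hill slope ~ {slope:.2f}")
```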
High Throughput Screening For Hazard and Risk of Environmental Contaminants
High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...
Raspberry Pi-powered imaging for plant phenotyping.
Tovar, Jose C; Hoyer, J Steen; Lin, Andy; Tielking, Allison; Callen, Steven T; Elizabeth Castillo, S; Miller, Michael; Tessman, Monica; Fahlgren, Noah; Carrington, James C; Nusinow, Dmitri A; Gehan, Malia A
2018-03-01
Image-based phenomics is a powerful approach to capture and quantify plant diversity. However, commercial platforms that make consistent image acquisition easy are often cost-prohibitive. To make high-throughput phenotyping methods more accessible, low-cost microcomputers and cameras can be used to acquire plant image data. We used low-cost Raspberry Pi computers and cameras to manage and capture plant image data. Detailed here are three different applications of Raspberry Pi-controlled imaging platforms for seed and shoot imaging. Images obtained from each platform were suitable for extracting quantifiable plant traits (e.g., shape, area, height, color) en masse using open-source image processing software such as PlantCV. This protocol describes three low-cost platforms for image acquisition that are useful for quantifying plant diversity. When coupled with open-source image processing tools, these imaging platforms provide viable low-cost solutions for incorporating high-throughput phenomics into a wide range of research programs.
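As an illustration of the kind of trait extraction described (a generic OpenCV stand-in, not the PlantCV pipeline itself; the HSV bounds and file name are assumptions that would need tuning per imaging setup), projected plant area can be estimated by color thresholding:

```python
import cv2
import numpy as np

def plant_area_pixels(image_path):
    """Segment green plant tissue in a Raspberry Pi camera image and return
    its projected area in pixels. A generic OpenCV stand-in for the kind of
    thresholding an open-source pipeline such as PlantCV performs; the HSV
    bounds are illustrative only and need tuning for each imaging setup.
    """
    bgr = cv2.imread(image_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([35, 60, 40], dtype=np.uint8)    # assumed "green" range
    upper = np.array([85, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    return cv2.countNonZero(mask)

# area = plant_area_pixels("plant_2018-03-01.jpg")  # hypothetical file name
```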
TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.
Fimereli, Danai; Detours, Vincent; Konopka, Tomasz
2013-04-01
High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/.
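The extraction idea can be sketched generically as k-mer triage against the regions of interest; this illustrates the concept only and is not TriageTools' implementation, and the sequences and parameters are toy placeholders:

```python
def build_kmer_index(region_seqs, k=21):
    """Collect all k-mers occurring in the regions of interest."""
    kmers = set()
    for seq in region_seqs:
        for i in range(len(seq) - k + 1):
            kmers.add(seq[i:i + k])
    return kmers

def keep_read(read, kmers, k=21, min_hits=2):
    """Keep a read if enough of its k-mers occur in the target regions."""
    hits = sum(1 for i in range(len(read) - k + 1) if read[i:i + k] in kmers)
    return hits >= min_hits

# Toy usage: stream a FASTQ and write only the triaged reads to a small file
region = ["ACGT" * 30]                     # placeholder gene-of-interest sequence
index = build_kmer_index(region)
print(keep_read("ACGT" * 15, index))       # True
print(keep_read("TTTT" * 15, index))       # False
```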
Leszczynski, Dariusz; Nylund, Reetta; Joenväärä, Sakari; Reivinen, Jukka
2004-02-01
We argue that the use of high-throughput screening techniques, although expensive and laborious, is justified and necessary in studies that examine the biological effects of mobile phone radiation. The "case of hsp27 protein" presented here suggests that even proteins whose expression and activity are only modestly altered by exposure to mobile phone radiation might have an impact on cell physiology. However, this short communication does not attempt to present the full scientific evidence, which is far too extensive to be presented in a single article and is being prepared for publication in three separate research articles. The examples of experimental evidence presented here were designed to show the flow of the experimental process, demonstrating that high-throughput screening techniques might help in the rapid identification of responding proteins. This, in turn, can help speed up the process of determining whether these changes might affect human health.
High-throughput microfluidic single-cell digital polymerase chain reaction.
White, A K; Heyries, K A; Doolin, C; Vaninsberghe, M; Hansen, C L
2013-08-06
Here we present an integrated microfluidic device for the high-throughput digital polymerase chain reaction (dPCR) analysis of single cells. This device allows for the parallel processing of single cells and executes all steps of analysis, including cell capture, washing, lysis, reverse transcription, and dPCR analysis. The cDNA from each single cell is distributed into a dedicated dPCR array consisting of 1020 chambers, each having a volume of 25 pL, using surface-tension-based sample partitioning. The high density of this dPCR format (118,900 chambers/cm²) allows the analysis of 200 single cells per run, for a total of 204,000 PCR reactions using a device footprint of 10 cm². Experiments using RNA dilutions show this device achieves shot-noise-limited performance in quantifying single molecules, with a dynamic range of 10⁴. We performed over 1200 single-cell measurements, demonstrating the use of this platform in the absolute quantification of both high- and low-abundance mRNA transcripts, as well as micro-RNAs that are not easily measured using alternative hybridization methods. We further apply the specificity and sensitivity of single-cell dPCR to performing measurements of RNA editing events in single cells. High-throughput dPCR provides a new tool in the arsenal of single-cell analysis methods, with a unique combination of speed, precision, sensitivity, and specificity. We anticipate this approach will enable new studies where high-performance single-cell measurements are essential, including the analysis of transcriptional noise, allelic imbalance, and RNA processing.
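Absolute quantification in dPCR conventionally uses the Poisson correction for chambers that receive more than one molecule; a short sketch using the array size and chamber volume quoted in the abstract, with a hypothetical positive-chamber count:

```python
import math

def dpcr_copies(positive, total=1020, chamber_volume_pl=25.0):
    """Estimate template molecules loaded into one single-cell dPCR array.

    Standard Poisson correction for chambers receiving more than one
    molecule: lambda = -ln(1 - p), with p the fraction of positive chambers.
    Array size and chamber volume follow the abstract; the positive count
    used below is hypothetical.
    """
    p = positive / total
    lam = -math.log(1.0 - p)                    # mean molecules per chamber
    copies = lam * total
    total_volume_ul = total * chamber_volume_pl * 1e-6
    return copies, copies / total_volume_ul     # molecules, copies per microliter

copies, conc = dpcr_copies(positive=300)
print(f"{copies:.0f} molecules, {conc:.0f} copies/uL")
```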
Sakwari, Gloria; Mamuya, Simon H D; Bråtveit, Magne; Larsson, Lennart; Pehrson, Christina; Moen, Bente E
2013-03-01
Endotoxin exposure associated with organic dust exposure has been studied in several industries. Coffee cherries that are dried directly after harvest may differ in dust and endotoxin emissions from those that are peeled and washed before drying. The aim of this study was to measure personal total dust and endotoxin levels and to evaluate their determinants of exposure in coffee processing factories. Using Sidekick Casella pumps at a flow rate of 2 l/min, total dust levels were measured in the workers' breathing zone throughout the shift. Endotoxin was analyzed using the kinetic chromogenic Limulus amebocyte lysate assay. Separate linear mixed-effects models were used to evaluate exposure determinants for dust and endotoxin. Total dust and endotoxin exposure were significantly higher in Robusta than in Arabica coffee factories (geometric mean 3.41 mg/m³ and 10,800 EU/m³ versus 2.10 mg/m³ and 1,400 EU/m³, respectively). Dry pre-processed coffee and differences in work tasks explained 30% of the total variance for total dust and 71% of the variance for endotoxin exposure. High exposure in Robusta processing is associated with the dry pre-processing method used after harvest. Dust and endotoxin exposure is high, in particular when processing dry pre-processed coffee. Minimization of dust emissions and use of efficient dust exhaust systems are important to prevent the development of respiratory system impairment in workers.
Global Profiling of Reactive Oxygen and Nitrogen Species in Biological Systems
Zielonka, Jacek; Zielonka, Monika; Sikora, Adam; Adamus, Jan; Joseph, Joy; Hardy, Micael; Ouari, Olivier; Dranka, Brian P.; Kalyanaraman, Balaraman
2012-01-01
Herein we describe a high-throughput fluorescence and HPLC-based methodology for global profiling of reactive oxygen and nitrogen species (ROS/RNS) in biological systems. The combined use of HPLC and fluorescence detection is key to successful implementation and validation of this methodology. Included here are methods to specifically detect and quantitate the products formed from interaction between the ROS/RNS species and the fluorogenic probes, as follows: superoxide using hydroethidine, peroxynitrite using boronate-based probes, nitric oxide-derived nitrosating species with 4,5-diaminofluorescein, and hydrogen peroxide and other oxidants using 10-acetyl-3,7-dihydroxyphenoxazine (Amplex® Red) with and without horseradish peroxidase, respectively. In this study, we demonstrate real-time monitoring of ROS/RNS in activated macrophages using high-throughput fluorescence and HPLC methods. This global profiling approach, simultaneous detection of multiple ROS/RNS products of fluorescent probes, developed in this study will be useful in unraveling the complex role of ROS/RNS in redox regulation, cell signaling, and cellular oxidative processes and in high-throughput screening of anti-inflammatory antioxidants. PMID:22139901
Wu, Szu-Huei; Yao, Chun-Hsu; Hsieh, Chieh-Jui; Liu, Yu-Wei; Chao, Yu-Sheng; Song, Jen-Shin; Lee, Jinq-Chyi
2015-07-10
Sodium-dependent glucose co-transporter 2 (SGLT2) inhibitors are of current interest as a treatment for type 2 diabetes. Efforts have been made to discover phlorizin-related glycosides with good SGLT2 inhibitory activity. To increase structural diversity and better understand the role of non-glycoside SGLT2 inhibitors in glycemic control, we initiated a research program to identify non-glycoside hits from high-throughput screening. Here, we report the development of a novel, fluorogenic probe-based glucose uptake system based on a Cu(I)-catalyzed [3+2] cycloaddition. Its safer processes and cheaper reagents made the developed assay our first choice for large-scale primary screening, compared to the well-known [¹⁴C]-labeled α-methyl-D-glucopyranoside ([¹⁴C]-AMG) radioactive assay. This effort culminated in the identification of a benzimidazole non-glycoside SGLT2 hit with an EC50 value of 0.62 μM by high-throughput screening of 41,000 compounds. Copyright © 2015 Elsevier B.V. All rights reserved.
Fast multiclonal clusterization of V(D)J recombinations from high-throughput sequencing.
Giraud, Mathieu; Salson, Mikaël; Duez, Marc; Villenet, Céline; Quief, Sabine; Caillault, Aurélie; Grardel, Nathalie; Roumier, Christophe; Preudhomme, Claude; Figeac, Martin
2014-05-28
V(D)J recombinations in lymphocytes are essential for immunological diversity. They are also useful markers of pathologies. In leukemia, they are used to quantify the minimal residual disease during patient follow-up. However, the full breadth of lymphocyte diversity is not fully understood. We propose new algorithms that process high-throughput sequencing (HTS) data to extract unnamed V(D)J junctions and gather them into clones for quantification. This analysis is based on a seed heuristic and is fast and scalable because in the first phase, no alignment is performed with germline database sequences. The algorithms were applied to TR γ HTS data from a patient with acute lymphoblastic leukemia, and also on data simulating hypermutations. Our methods identified the main clone, as well as additional clones that were not identified with standard protocols. The proposed algorithms provide new insight into the analysis of high-throughput sequencing data for leukemia, and also to the quantitative assessment of any immunological profile. The methods described here are implemented in a C++ open-source program called Vidjil.
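A toy sketch of the alignment-free idea, grouping reads by an identical junction "window" located between V-side and J-side seed k-mers, is shown below; the seed sets, k-mer length and window length are placeholders, and this is not Vidjil's actual heuristic:

```python
from collections import defaultdict

def cluster_by_window(reads, v_seeds, j_seeds, k=10, window_len=40):
    """Group reads sharing an identical junction 'window' into putative clones.

    Sketch of the alignment-free idea only: find a k-mer from a V-side seed
    set and a later k-mer from a J-side seed set, and use the spanned sequence
    as the clone-defining window.
    """
    clones = defaultdict(list)
    for read in reads:
        v_pos = j_pos = None
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            if v_pos is None and kmer in v_seeds:
                v_pos = i
            elif v_pos is not None and kmer in j_seeds:
                j_pos = i
                break
        if v_pos is not None and j_pos is not None:
            clones[read[v_pos:j_pos + k][:window_len]].append(read)
    return clones

# Toy usage: two reads share a junction window, one differs
v_seeds = {"AAAAAAAAAA"}
j_seeds = {"TTTTTTTTTT"}
reads = ["AAAAAAAAAACGTGCA" + "TTTTTTTTTT",
         "AAAAAAAAAACGTGCA" + "TTTTTTTTTT",
         "AAAAAAAAAAGGGGGG" + "TTTTTTTTTT"]
clones = cluster_by_window(reads, v_seeds, j_seeds)
print({w: len(r) for w, r in clones.items()})   # two clones, sizes 2 and 1
```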
Bell, Robert T; Jacobs, Alan G; Sorg, Victoria C; Jung, Byungki; Hill, Megan O; Treml, Benjamin E; Thompson, Michael O
2016-09-12
A high-throughput method for characterizing the temperature dependence of material properties following microsecond to millisecond thermal annealing, exploiting the temperature gradients created by a lateral gradient laser spike anneal (lgLSA), is presented. Laser scans generate spatial thermal gradients of up to 5 °C/μm with peak temperatures ranging from ambient to in excess of 1400 °C, limited only by laser power and materials thermal limits. Discrete spatial property measurements across the temperature gradient are then equivalent to independent measurements after varying temperature anneals. Accurate temperature calibrations, essential to quantitative analysis, are critical and methods for both peak temperature and spatial/temporal temperature profile characterization are presented. These include absolute temperature calibrations based on melting and thermal decomposition, and time-resolved profiles measured using platinum thermistors. A variety of spatially resolved measurement probes, ranging from point-like continuous profiling to large area sampling, are discussed. Examples from annealing of III-V semiconductors, CdSe quantum dots, low-κ dielectrics, and block copolymers are included to demonstrate the flexibility, high throughput, and precision of this technique.
A high-throughput two channel discrete wavelet transform architecture for the JPEG2000 standard
NASA Astrophysics Data System (ADS)
Badakhshannoory, Hossein; Hashemi, Mahmoud R.; Aminlou, Alireza; Fatemi, Omid
2005-07-01
The Discrete Wavelet Transform (DWT) is increasingly adopted in image and video compression standards, as indicated by its use in JPEG2000. The lifting scheme is an alternative DWT implementation that has lower computational complexity and reduced resource requirements. The JPEG2000 standard introduces two lifting-scheme-based filter banks: the 5/3 and the 9/7. In this paper a high-throughput, two-channel DWT architecture for both JPEG2000 DWT filters is presented. The proposed pipelined architecture has two separate input channels that process the incoming samples simultaneously with minimal memory requirements for each channel. The architecture was implemented in VHDL and synthesized on a Xilinx Virtex2 XCV1000. The proposed architecture applies the DWT to a 2K by 1K image at 33 fps with a 75 MHz clock frequency. This performance is achieved with 70% fewer resources than two independent single-channel modules. The high throughput and reduced resource requirements make this architecture a proper choice for real-time applications such as Digital Cinema.
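For reference, the reversible 5/3 (LeGall) filter bank used in JPEG2000 is defined by a predict and an update lifting step; a 1-D software sketch is given below (the paper's pipelined two-channel VLSI architecture is not reproduced):

```python
def dwt53_1d(x):
    """One level of the reversible JPEG2000 5/3 lifting transform (1-D).

    Returns (lowpass, highpass) integer coefficients with symmetric boundary
    extension. This is a software reference for the filter; the paper's
    two-channel pipelined hardware is not reproduced here.
    """
    n = len(x)
    assert n % 2 == 0 and n >= 2
    x = list(x)

    # Predict step: high-pass (detail) coefficients from odd samples
    d = []
    for i in range(n // 2):
        left = x[2 * i]
        right = x[2 * i + 2] if 2 * i + 2 < n else x[2 * i]   # symmetric extension
        d.append(x[2 * i + 1] - ((left + right) >> 1))

    # Update step: low-pass (approximation) coefficients from even samples
    s = []
    for i in range(n // 2):
        d_left = d[i - 1] if i > 0 else d[0]                  # symmetric extension
        s.append(x[2 * i] + ((d_left + d[i] + 2) >> 2))

    return s, d

low, high = dwt53_1d([10, 12, 14, 16, 18, 20, 22, 24])
print(low, high)   # details are ~0 for a linear ramp, apart from the boundary
```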
Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey
2011-11-01
In recent years high throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning, an effective Laboratory Information Management System (LIMS) can be developed and implemented in order to streamline all phases of a workflow. Just as important as data mining and analysis procedures at the end of complex processes is the tracking of individual steps of applications that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example is provided to highlight the important aspects of the design of a LIMS to effectively modulate all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling and interaction with investigators, statisticians and administrators. All these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers
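As a hedged illustration of the kinds of entities such a LIMS tracks (inventory, workflow steps, plates, results), and not the authors' or FIMM's actual schema, a minimal data model might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Optional

@dataclass
class ReagentLot:                     # inventory management
    name: str
    lot_number: str
    quantity_ul: float

@dataclass
class WorkflowStep:                   # tracking of individual workflow steps
    name: str                         # e.g. "transfection", "readout"
    operator: str
    completed_at: Optional[datetime] = None

@dataclass
class ScreeningPlate:
    barcode: str
    layout: str                       # e.g. "384-well siRNA library plate 07"
    steps: List[WorkflowStep] = field(default_factory=list)
    results: Dict[str, float] = field(default_factory=dict)   # well -> readout

    def complete_step(self, name: str, operator: str) -> None:
        self.steps.append(WorkflowStep(name, operator, datetime.now()))

plate = ScreeningPlate(barcode="P000123", layout="384-well siRNA plate")
plate.complete_step("cell seeding", "operator_a")
plate.results["A01"] = 0.87
```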
High-Throughput Non-Contact Vitrification of Cell-Laden Droplets Based on Cell Printing
NASA Astrophysics Data System (ADS)
Shi, Meng; Ling, Kai; Yong, Kar Wey; Li, Yuhui; Feng, Shangsheng; Zhang, Xiaohui; Pingguan-Murphy, Belinda; Lu, Tian Jian; Xu, Feng
2015-12-01
Cryopreservation is the most promising way for long-term storage of biological samples e.g., single cells and cellular structures. Among various cryopreservation methods, vitrification is advantageous by employing high cooling rate to avoid the formation of harmful ice crystals in cells. Most existing vitrification methods adopt direct contact of cells with liquid nitrogen to obtain high cooling rates, which however causes the potential contamination and difficult cell collection. To address these limitations, we developed a non-contact vitrification device based on an ultra-thin freezing film to achieve high cooling/warming rate and avoid direct contact between cells and liquid nitrogen. A high-throughput cell printer was employed to rapidly generate uniform cell-laden microdroplets into the device, where the microdroplets were hung on one side of the film and then vitrified by pouring the liquid nitrogen onto the other side via boiling heat transfer. Through theoretical and experimental studies on vitrification processes, we demonstrated that our device offers a high cooling/warming rate for vitrification of the NIH 3T3 cells and human adipose-derived stem cells (hASCs) with maintained cell viability and differentiation potential. This non-contact vitrification device provides a novel and effective way to cryopreserve cells at high throughput and avoid the contamination and collection problems.
Kim, Youngmi; Mosier, Nathan; Ladisch, Michael R
2008-08-01
Distillers' grains (DG), a co-product of a dry grind ethanol process, is an excellent source of supplemental proteins in livestock feed. Studies have shown that, due to its high polymeric sugar contents and ease of hydrolysis, the distillers' grains have potential as an additional source of fermentable sugars for ethanol fermentation. The benefit of processing the distillers' grains to extract fermentable sugars lies in an increased ethanol yield without significant modification in the current dry grind technology. Three different potential configurations of process alternatives in which pretreated and hydrolyzed distillers' grains are recycled for an enhanced overall ethanol yield are proposed and discussed in this paper based on the liquid hot water (LHW) pretreatment of distillers' grains. Possible limitations of each proposed process are also discussed. This paper presents a compositional analysis of distillers' grains, as well as a simulation of the modified dry grind processes with recycle of distillers' grains. Simulated material balances for the modified dry grind processes are established based on the base case assumptions. These balances are compared to the conventional dry grind process in terms of ethanol yield, compositions of its co-products, and accumulation of fermentation inhibitors. Results show that 14% higher ethanol yield is achievable by processing and hydrolyzing the distillers' grains for additional fermentable sugars, as compared to the conventional dry grind process. Accumulation of fermentation by-products and inhibitory components in the proposed process is predicted to be 2-5 times higher than in the conventional dry grind process. The impact of fermentation inhibitors is reviewed and discussed. The final eDDGS (enhanced dried distillers' grains) from the modified processes has 30-40% greater protein content per mass than DDGS, and its potential as a value-added process is also analyzed. While the case studies used to illustrate the process simulation are based on LHW pretreated DG, the process simulation itself provides a framework for evaluation of the impact of other pretreatments.
Jiang, Hao; Zhang, Min; Mujumdar, Arun S; Lim, Rui-Xin
2014-07-01
To overcome the high energy consumption of freeze drying (FD) and the non-uniform drying of microwave freeze drying (MFD), pulse-spouted microwave vacuum drying (PSMVD) was developed. The results showed that the drying time can be dramatically shortened when microwaves are used as the heating source. In this experiment, both MFD and PSMVD shortened the drying time by 50% compared to the FD process. Depending on the heating method, MFD- and PSMVD-dried banana cubes tended to expand while FD-dried samples tended to shrink. Shrinkage also produced a denser structure and the highest fracturability among the three sample types dried by the different methods. The residual ascorbic acid content of PSMVD-dried samples can be as high as that of FD-dried samples, both of which were superior to MFD-dried samples. The tests confirmed that PSMVD could bring about better drying uniformity than MFD. In addition, compared with traditional MFD, PSMVD can provide better extrinsic features and improved nutritional quality because of the higher residual ascorbic acid content. © 2013 Society of Chemical Industry.
High Throughput Plasma Water Treatment
NASA Astrophysics Data System (ADS)
Mujovic, Selman; Foster, John
2016-10-01
The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).
[Drying characteristics and apparent change of sludge granules during drying].
Ma, Xue-Wen; Weng, Huan-Xin; Zhang, Jin-Jun
2011-08-01
Three different weight grades of sludge granules (2.5, 5 and 10 g) were dried at constant temperatures of 100, 200, 300, 400 and 500 degrees C. The characteristics of weight loss and the change of apparent form during sludge drying were then analyzed. Results showed that sludge drying at 100-200 degrees C proceeded in three stages: an acceleration phase, a constant-rate phase, and a falling-rate phase. At 300-500 degrees C there was no constant-rate phase, but because many cracks formed at the sludge surface, average drying rates were still high. There was a quadratic nonlinear relationship between average drying rate and drying temperature. At 100-200 degrees C, the drying processes of sludge granules of different weight grades were similar. At 300-500 degrees C, the drying processes of sludge granules of the same weight grade were similar. At 100-300 degrees C, little organic matter decomposed until the sludge began to burn, while at 400-500 degrees C some organic matter began to decompose at the start of drying.
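The reported quadratic dependence of average drying rate on temperature can be fitted with a second-order polynomial; the rates below are placeholders, not the published data:

```python
import numpy as np

# Hypothetical average drying rates at the five temperatures used in the study;
# the values are placeholders, not the published measurements.
temperature = np.array([100, 200, 300, 400, 500])       # degrees C
avg_rate = np.array([0.015, 0.045, 0.085, 0.14, 0.21])  # g water / (g dry solid * min)

a, b, c = np.polyfit(temperature, avg_rate, deg=2)      # rate = a*T^2 + b*T + c
rate_350 = a * 350**2 + b * 350 + c
print(f"rate(T) = {a:.2e}*T^2 + {b:.2e}*T + {c:.2e}; rate(350 C) ~ {rate_350:.3f}")
```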
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
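The segmentation step described above can be sketched with off-the-shelf tools: the snippet below computes SLIC superpixels, trains a Random Forest on per-superpixel color features, and paints a plant/background mask. It is a simplified stand-in for the authors' pipeline, and the images and labels are synthetic placeholders.

```python
import numpy as np
from skimage.segmentation import slic
from skimage.color import rgb2hsv
from sklearn.ensemble import RandomForestClassifier

def superpixel_features(image, n_segments=300):
    """SLIC superpixels plus a mean HSV colour feature per superpixel."""
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    hsv = rgb2hsv(image)
    feats = np.array([hsv[segments == s].mean(axis=0) for s in np.unique(segments)])
    return segments, feats

# Placeholder images and labels standing in for real plant photographs.
rng = np.random.default_rng(0)
train_image = rng.random((120, 120, 3))
new_image = rng.random((120, 120, 3))

segments, feats = superpixel_features(train_image)
labels = rng.integers(0, 2, size=feats.shape[0])   # 1 = plant, 0 = background

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(feats, labels)

# Inference: classify superpixels of a new image and derive a per-pixel mask.
segments_new, feats_new = superpixel_features(new_image)
mask = clf.predict(feats_new)[segments_new]
print("plant area (pixels):", int(mask.sum()))
```

Tracking the masked area per image over time gives the kind of growth-trend curve the system visualizes.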
Hughes, Stephen R; Butt, Tauseef R; Bartolett, Scott; Riedmuller, Steven B; Farrelly, Philip
2011-08-01
The molecular biological techniques for plasmid-based assembly and cloning of gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. High-throughput integrated robotic molecular biology platforms that have the capacity to rapidly clone and express heterologous gene open reading frames in bacteria and yeast and to screen large numbers of expressed proteins for optimized function are an important technology for improving microbial strains for biofuel production. The process involves the production of full-length complementary DNA libraries as a source of plasmid-based clones to express the desired proteins in active form for determination of their functions. Proteins that were identified by high-throughput screening as having desired characteristics are overexpressed in microbes to enable them to perform functions that will allow more cost-effective and sustainable production of biofuels. Because the plasmid libraries are composed of several thousand unique genes, automation of the process is essential. This review describes the design and implementation of an automated integrated programmable robotic workcell capable of producing complementary DNA libraries, colony picking, isolating plasmid DNA, transforming yeast and bacteria, expressing protein, and performing appropriate functional assays. These operations will allow tailoring microbial strains to use renewable feedstocks for production of biofuels, bioderived chemicals, fertilizers, and other coproducts for profitable and sustainable biorefineries. Published by Elsevier Inc.
Methods for processing high-throughput RNA sequencing data.
Ares, Manuel
2014-11-03
High-throughput sequencing (HTS) methods for analyzing RNA populations (RNA-Seq) are gaining rapid application to many experimental situations. The steps in an RNA-Seq experiment require thought and planning, especially because the expense in time and materials is currently higher and the protocols are far less routine than those used for other high-throughput methods, such as microarrays. As always, good experimental design will make analysis and interpretation easier. Having a clear biological question, an idea about the best way to do the experiment, and an understanding of the number of replicates needed will make the entire process more satisfying. Whether the goal is capturing transcriptome complexity from a tissue or identifying small fragments of RNA cross-linked to a protein of interest, conversion of the RNA to cDNA followed by direct sequencing using the latest methods is a developing practice, with new technical modifications and applications appearing every day. Even more rapid are the development and improvement of methods for analysis of the very large amounts of data that arrive at the end of an RNA-Seq experiment, making considerations regarding reproducibility, validation, visualization, and interpretation increasingly important. This introduction is designed to review and emphasize a pathway of analysis from experimental design through data presentation that is likely to be successful, with the recognition that better methods are right around the corner. © 2014 Cold Spring Harbor Laboratory Press.
Computational Tools for Stem Cell Biology
Bian, Qin; Cahan, Patrick
2016-01-01
For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the last several years, a new sub-discipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single-cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate. PMID:27318512
Target Discovery for Precision Medicine Using High-Throughput Genome Engineering.
Guo, Xinyi; Chitale, Poonam; Sanjana, Neville E
2017-01-01
Over the past few years, programmable RNA-guided nucleases such as the CRISPR/Cas9 system have ushered in a new era of precision genome editing in diverse model systems and in human cells. Functional screens using large libraries of RNA guides can interrogate a large hypothesis space to pinpoint particular genes and genetic elements involved in fundamental biological processes and disease-relevant phenotypes. Here, we review recent high-throughput CRISPR screens (e.g. loss-of-function, gain-of-function, and screens targeting noncoding elements) and highlight their potential for uncovering novel therapeutic targets, such as those involved in cancer resistance to small-molecule drugs and immunotherapies, tumor evolution, infectious disease, inborn genetic disorders, and other therapeutic challenges.
Computational Tools for Stem Cell Biology.
Bian, Qin; Cahan, Patrick
2016-12-01
For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the past several years, a new subdiscipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate. Copyright © 2016 Elsevier Ltd. All rights reserved.
Gene cassette knock-in in mammalian cells and zygotes by enhanced MMEJ.
Aida, Tomomi; Nakade, Shota; Sakuma, Tetsushi; Izu, Yayoi; Oishi, Ayu; Mochida, Keiji; Ishikubo, Harumi; Usami, Takako; Aizawa, Hidenori; Yamamoto, Takashi; Tanaka, Kohichi
2016-11-28
Although CRISPR/Cas enables one-step gene cassette knock-in, assembling targeting vectors containing long homology arms is a laborious process for high-throughput knock-in. We recently developed the CRISPR/Cas-based precise integration into the target chromosome (PITCh) system for a gene cassette knock-in without long homology arms mediated by microhomology-mediated end-joining. Here, we identified exonuclease 1 (Exo1) as an enhancer for PITCh in human cells. By combining the Exo1 and PITCh-directed donor vectors, we achieved convenient one-step knock-in of gene cassettes and floxed allele both in human cells and mouse zygotes. Our results provide a technical platform for high-throughput knock-in.
High Throughput Transcriptomics: From screening to pathways
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
A Study on Ohmic Contact to Dry-Etched p-GaN
NASA Astrophysics Data System (ADS)
Hu, Cheng-Yu; Ao, Jin-Ping; Okada, Masaya; Ohno, Yasuo
A low-power dry-etching process was adopted to study the influence of dry etching on Ohmic contacts to p-GaN. When the surface layer of as-grown p-GaN was removed by low-power SiCl4/Cl2 etching, no Ohmic contact could be formed on the etched p-GaN. The same dry-etching process was also applied to n-GaN to understand its influence. The Schottky barrier heights (SBHs) of p-GaN and n-GaN were measured by capacitance-voltage (C-V) measurements. Comparison of the changes in the measured SBHs of p-GaN and n-GaN suggested that etching damage is not the only reason for the degraded Ohmic contacts to dry-etched p-GaN: the original surface layer of as-grown p-GaN, which was removed by the dry-etching process, appears to have special properties required for Ohmic contact formation. To partially recover the original surface of as-grown p-GaN, high-temperature annealing (1000 °C, 30 s) was applied to the SiCl4/Cl2-etched p-GaN, and an Ohmic contact was obtained.
NASA Astrophysics Data System (ADS)
Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun
2017-12-01
Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy for discovering new battery materials and optimizing the performance of known ones. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only bulk capacity, voltage and volume change were considered. It is important to include more structure-property relationships, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
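The screening flow path proposed in the review can be pictured as a staged filter over computed candidate properties. The sketch below applies a first-pass bulk screen (voltage, capacity, volume change); the property names, values and thresholds are illustrative assumptions, not numbers from the review.

```python
# Hypothetical candidate records; formulas and thresholds are assumptions
# chosen for illustration only.
candidates = [
    {"formula": "LiMO2-a", "voltage_V": 3.9, "capacity_mAh_g": 180, "vol_change_pct": 2.5},
    {"formula": "LiMO2-b", "voltage_V": 2.4, "capacity_mAh_g": 150, "vol_change_pct": 1.0},
    {"formula": "LiMPO4-c", "voltage_V": 3.5, "capacity_mAh_g": 165, "vol_change_pct": 6.8},
]

def passes_screen(c, min_voltage=3.0, min_capacity=160, max_vol_change=5.0):
    """First-pass bulk screen on voltage, capacity and volume change only.
    A fuller screen, as the review argues, would also score point defects,
    surfaces/interfaces, doping and nanosize effects."""
    return (c["voltage_V"] >= min_voltage
            and c["capacity_mAh_g"] >= min_capacity
            and c["vol_change_pct"] <= max_vol_change)

shortlist = [c["formula"] for c in candidates if passes_screen(c)]
print(shortlist)   # -> ['LiMO2-a']
```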
Xu, Xiaohui Sophia; Rose, Anne; Demers, Roger; Eley, Timothy; Ryan, John; Stouffer, Bruce; Cojocaru, Laura; Arnold, Mark
2014-01-01
The determination of drug-protein binding is important in the pharmaceutical development process because of the impact of protein binding on both the pharmacokinetics and pharmacodynamics of drugs. Equilibrium dialysis is the preferred method to measure the free drug fraction because it is considered to be more accurate. The throughput of equilibrium dialysis has recently been improved by implementing a 96-well format plate. Results/methodology: This manuscript illustrates the successful application of a 96-well rapid equilibrium dialysis (RED) device in the determination of atazanavir plasma-protein binding. This RED method of measuring free fraction was successfully validated and then applied to the analysis of clinical plasma samples taken from HIV-infected pregnant women administered atazanavir. Combined with LC-MS/MS detection, the 96-well format equilibrium dialysis device was suitable for measuring the free and bound concentration of pharmaceutical molecules in a high-throughput mode.
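For context, the quantity reported from an equilibrium dialysis experiment is the fraction unbound, conventionally taken as the ratio of the buffer-side to the plasma-side concentration at equilibrium. The short sketch below computes it from hypothetical LC-MS/MS readouts; the numbers are not values from the atazanavir study.

```python
def fraction_unbound(conc_buffer_ng_ml, conc_plasma_ng_ml):
    """Fraction unbound from a rapid equilibrium dialysis (RED) experiment:
    only free drug equilibrates across the membrane, so fu = C_buffer / C_plasma."""
    return conc_buffer_ng_ml / conc_plasma_ng_ml

# Hypothetical LC-MS/MS readouts (ng/mL), for illustration only.
c_buffer, c_plasma = 42.0, 310.0
fu = fraction_unbound(c_buffer, c_plasma)
print(f"free fraction: {fu:.1%}, bound: {1 - fu:.1%}")
```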
Cache-Oblivious parallel SIMD Viterbi decoding for sequence search in HMMER.
Ferreira, Miguel; Roma, Nuno; Russo, Luis M S
2014-05-30
HMMER is a commonly used bioinformatics tool based on Hidden Markov Models (HMMs) to analyze and process biological sequences. One of its main homology engines is based on the Viterbi decoding algorithm, which was already highly parallelized and optimized using Farrar's striped processing pattern with the Intel SSE2 instruction-set extension. A new SIMD vectorization of the Viterbi decoding algorithm is proposed, based on an SSE2 inter-task parallelization approach similar to the DNA alignment algorithm proposed by Rognes. Besides this alternative vectorization scheme, the proposed implementation also introduces a new partitioning of the Markov model that allows a significantly more efficient exploitation of cache locality. This optimization, together with improved loading of the emission scores, achieves a constant processing throughput, regardless of the innermost-cache size and of the dimension of the considered model. The proposed optimized vectorization of the Viterbi decoding algorithm was extensively evaluated and compared with the HMMER3 decoder on DNA and protein datasets, proving to be a competitive alternative implementation. Consistently faster than the already highly optimized ViterbiFilter implementation of HMMER3, the proposed Cache-Oblivious Parallel SIMD Viterbi (COPS) implementation provides a constant throughput and a processing speedup of up to two times, depending on the model's size.
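For readers unfamiliar with the baseline algorithm, the sketch below is a plain, textbook log-space Viterbi decoder in NumPy. It is not the striped or inter-task SIMD vectorization discussed above, and the toy two-state model is purely illustrative.

```python
import numpy as np

def viterbi(log_start, log_trans, log_emit, observations):
    """Reference (non-SIMD) log-space Viterbi decoding.
    log_start: (S,), log_trans: (S, S), log_emit: (S, V), observations: (T,)."""
    S = log_start.shape[0]
    T = len(observations)
    score = log_start + log_emit[:, observations[0]]
    backptr = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + log_trans          # cand[prev, next]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + log_emit[:, observations[t]]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):                  # backtrack best path
        path.append(int(backptr[t, path[-1]]))
    return path[::-1], float(score.max())

# Toy 2-state, 2-symbol model (all probabilities hypothetical).
logp = lambda p: np.log(np.asarray(p))
states, best = viterbi(logp([0.6, 0.4]),
                       logp([[0.7, 0.3], [0.4, 0.6]]),
                       logp([[0.9, 0.1], [0.2, 0.8]]),
                       np.array([0, 1, 1, 0]))
print(states, best)
```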
Reddy, Jay Poorna; Jones, John W; Wray, Patrick S; Dennis, Andrew B; Brown, Jonathan; Timmins, Peter
2018-04-25
Form changes during drug product processing can be a risk to the final product quality in terms of chemical stability and bioavailability. In this study, online Raman spectroscopy was used to monitor form changes in real time during high shear wet granulation of Compound A, a highly soluble drug present at a high drug load in an extended release formulation. The effects of water content, temperature, wet massing time and drying technique on the degree of drug transformation were examined. A designed set of calibration standards was employed to develop quantitative partial least squares regression models to predict the concentration of each drug form during both wet granulation and the drying process. Throughout all our experiments we observed complex changes of drug form during granulation, manifested as conversions between the initial non-solvated form of Compound A, the hemi-hydrate form and the "apparent" amorphous form (dissolved drug). The online Raman data demonstrate that the non-solvated form converts to an "apparent" amorphous form (dissolved drug) due to drug dissolution, with no appearance of the hemi-hydrate form during the water-addition stage. The extent of conversion of the non-solvated form was governed by the amount of water added, and the rate of conversion was accelerated at higher temperatures. Interestingly, in the wet massing zone, the formation of the hemi-hydrate form was observed at a rate equivalent to the rate of depletion of the non-solvated form, with no change in the level of the "apparent amorphous" form generated. The level of hemi-hydrate increased with an increase in wet massing time. The drying process had a significant effect on the proportion of each form. During tray drying, changes in drug form continued for hours. In contrast, fluid-bed drying appeared to lock in the final proportions of drug form attained during granulation, with comparatively small changes observed during drying. In conclusion, it was possible to simultaneously monitor the three forms in real time during wet granulation and drying using online Raman spectroscopy. The results regarding the effect of process parameters on the degree of transformation are critical for designing a robust process that ensures a consistent form in the final drug product. Copyright © 2018 Elsevier B.V. All rights reserved.
High-throughput full-length single-cell mRNA-seq of rare cells.
Ooi, Chin Chun; Mantalas, Gary L; Koh, Winston; Neff, Norma F; Fuchigami, Teruaki; Wong, Dawson J; Wilson, Robert J; Park, Seung-Min; Gambhir, Sanjiv S; Quake, Stephen R; Wang, Shan X
2017-01-01
Single-cell characterization techniques, such as mRNA-seq, have been applied to a diverse range of applications in cancer biology, yielding great insight into mechanisms leading to therapy resistance and tumor clonality. While single-cell techniques can yield a wealth of information, a common bottleneck is the lack of throughput, with many current processing methods being limited to the analysis of small volumes of single cell suspensions with cell densities on the order of 10^7 per mL. In this work, we present a high-throughput full-length mRNA-seq protocol incorporating a magnetic sifter and magnetic nanoparticle-antibody conjugates for rare cell enrichment, and Smart-seq2 chemistry for sequencing. We evaluate the efficiency and quality of this protocol with a simulated circulating tumor cell system, whereby non-small-cell lung cancer cell lines (NCI-H1650 and NCI-H1975) are spiked into whole blood, before being enriched for single-cell mRNA-seq by EpCAM-functionalized magnetic nanoparticles and the magnetic sifter. We obtain high efficiency (> 90%) capture and release of these simulated rare cells via the magnetic sifter, with reproducible transcriptome data. In addition, while mRNA-seq data is typically only used for gene expression analysis of transcriptomic data, we demonstrate the use of full-length mRNA-seq chemistries like Smart-seq2 to facilitate variant analysis of expressed genes. This enables the use of mRNA-seq data for differentiating cells in a heterogeneous population by both their phenotypic and variant profile. In a simulated heterogeneous mixture of circulating tumor cells in whole blood, we utilize this high-throughput protocol to differentiate these heterogeneous cells by both their phenotype (lung cancer versus white blood cells), and mutational profile (H1650 versus H1975 cells), in a single sequencing run. This high-throughput method can help facilitate single-cell analysis of rare cell populations, such as circulating tumor or endothelial cells, with demonstrably high-quality transcriptomic data.
High Throughput Experimental Materials Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakutayev, Andriy; Perkins, John; Schwarting, Marcus
The mission of the High Throughput Experimental Materials Database (HTEM DB) is to enable discovery of new materials with useful properties by releasing large amounts of high-quality experimental data to the public. The HTEM DB contains information about materials obtained from high-throughput experiments at the National Renewable Energy Laboratory (NREL).
A pH-based biosensor for detection of arsenic in drinking water.
de Mora, K; Joshi, N; Balint, B L; Ward, F B; Elfick, A; French, C E
2011-05-01
Arsenic contaminated groundwater is estimated to affect over 100 million people worldwide, with Bangladesh and West Bengal being among the worst affected regions. A simple, cheap, accurate and disposable device is required for arsenic field testing. We have previously described a novel biosensor for arsenic in which the output is a change in pH, which can be detected visually as a colour change by the use of a pH indicator. Here, we present an improved formulation allowing sensitive and accurate detection of less than 10 ppb arsenate with static overnight incubation. Furthermore, we describe a cheap and simple high-throughput system for simultaneous monitoring of pH in multiple assays over time. Up to 50 samples can be monitored continuously over the desired time period. Cells can be stored and distributed in either air-dried or freeze-dried form. This system was successfully tested on arsenic-contaminated groundwater samples from the South East region of Hungary. We hope to continue to develop this sensor to produce a device suitable for field trials.
Analysis of biofluids in aqueous environment based on mid-infrared spectroscopy.
Fabian, Heinz; Lasch, Peter; Naumann, Dieter
2005-01-01
In this study we describe a semiautomatic Fourier transform infrared spectroscopic methodology for the analysis of liquid serum samples, which combines simple sample introduction with high sample throughput. The applicability of this new infrared technology to the analysis of liquid serum samples from a cohort of cattle naturally infected with bovine spongiform encephalopathy and from controls was explored in comparison to the conventional approach based on transmission infrared spectroscopy of dried serum films. Artificial neural network analysis of the infrared data was performed to differentiate between bovine spongiform encephalopathy-negative controls and animals in the late stage of the disease. After training of artificial neural network classifiers, infrared spectra of sera from an independent external validation data set were analyzed. In this way, sensitivities between 90 and 96% and specificities between 84 and 92% were achieved, depending upon the strategy of data collection and data analysis. Based on these results, the advantages and limitations of the liquid sample technique and the dried film approach for routine analysis of biofluids are discussed. 2005 Society of Photo-Optical Instrumentation Engineers.
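The classification workflow (train a neural network on labelled spectra, then report sensitivity and specificity on held-out samples) can be sketched as below. The spectra and labels are synthetic placeholders, and a generic scikit-learn MLP stands in for the authors' artificial neural network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for preprocessed serum IR spectra: 200 samples x 400
# absorbance values, with binary disease labels (all values hypothetical).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 400))
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=1, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

# Sensitivity and specificity, as reported in diagnostic studies.
tp = np.sum((pred == 1) & (y_test == 1))
fn = np.sum((pred == 0) & (y_test == 1))
tn = np.sum((pred == 0) & (y_test == 0))
fp = np.sum((pred == 1) & (y_test == 0))
print(f"sensitivity: {tp / (tp + fn):.2f}, specificity: {tn / (tn + fp):.2f}")
```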
Phahom, Traiphop; Phoungchandang, Singhanat; Kerr, William L
2017-08-01
Dried Thunbergia laurifolia leaves are usually prepared using tray drying, resulting in products that have lost substantial amounts of bioactive compounds and antioxidant activity. The maturity of the raw material, blanching techniques and drying methods were investigated in order to select the best conditions for producing high-quality dried T. laurifolia leaves. Leaves at the first stage of maturity were selected, and steam-microwave blanching (SMB) for 4 min was adequate for blanching, giving the maximum recovery of bioactive compounds. The modified Halsey model was the best desorption isotherm model. A new drying model proposed in this study fitted the drying curves better than five common drying models. Moisture diffusivities increased with drying temperature when SMB was combined with heat pump-dehumidified drying. Microwave heat pump-dehumidified drying (MHPD) provided the shortest drying time and a high specific moisture extraction rate (SMER); it reduced drying time by 67.5% and increased caffeic acid and quercetin by 51.24% and 60.89%, respectively. MHPD was therefore the best drying method, providing the highest antioxidant activity and bioactive compound content, high SMER and short drying time. © 2016 Society of Chemical Industry.
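Thin-layer drying curves such as those compared in this study are commonly fitted with empirical models. The sketch below fits the widely used Page model to hypothetical moisture-ratio data as a stand-in; the paper's own proposed model and the modified Halsey isotherm are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def page_model(t, k, n):
    """Page thin-layer drying model: moisture ratio MR = exp(-k * t**n)."""
    return np.exp(-k * t**n)

# Hypothetical drying-curve data: time (min) vs. moisture ratio.
t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
mr = np.array([0.90, 0.78, 0.60, 0.38, 0.24, 0.12, 0.06])

(k, n), _ = curve_fit(page_model, t, mr, p0=(0.01, 1.0))
print(f"fitted Page parameters: k={k:.4f}, n={n:.3f}")
```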
An evaluation of MPI message rate on hybrid-core processors
Barrett, Brian W.; Brightwell, Ron; Grant, Ryan; ...
2014-11-01
Power and energy concerns are motivating chip manufacturers to consider future hybrid-core processor designs that may combine a small number of traditional cores optimized for single-thread performance with a large number of simpler cores optimized for throughput performance. This trend is likely to impact the way in which compute resources for network protocol processing functions are allocated and managed. In particular, the performance of MPI match processing is critical to achieving high message throughput. In this paper, we analyze the ability of simple and more complex cores to perform MPI matching operations for various scenarios in order to gain insight into how MPI implementations for future hybrid-core processors should be designed.
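A minimal way to measure MPI message rate from user code is a small-message ping-pong loop between two ranks, sketched below with mpi4py. This is a generic microbenchmark under assumed parameters, not the matching analysis performed in the paper; the suggested filename is hypothetical.

```python
# Run with, e.g.: mpirun -n 2 python msgrate.py
from mpi4py import MPI
import time

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
n_msgs = 10000
payload = bytearray(8)          # small message, so matching/overhead dominates

comm.Barrier()
start = time.perf_counter()
for i in range(n_msgs):
    if rank == 0:
        comm.Send(payload, dest=1, tag=i)
        comm.Recv(payload, source=1, tag=i)
    elif rank == 1:
        comm.Recv(payload, source=0, tag=i)
        comm.Send(payload, dest=0, tag=i)
elapsed = time.perf_counter() - start

if rank == 0:
    # Two messages per iteration (one in each direction).
    print(f"message rate: {2 * n_msgs / elapsed:.0f} msgs/s")
```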
The MaNGA Integral Field Unit Fiber Feed System for the Sloan 2.5 m Telescope
NASA Astrophysics Data System (ADS)
Drory, N.; MacDonald, N.; Bershady, M. A.; Bundy, K.; Gunn, J.; Law, D. R.; Smith, M.; Stoll, R.; Tremonti, C. A.; Wake, D. A.; Yan, R.; Weijmans, A. M.; Byler, N.; Cherinka, B.; Cope, F.; Eigenbrot, A.; Harding, P.; Holder, D.; Huehnerhoff, J.; Jaehnig, K.; Jansen, T. C.; Klaene, M.; Paat, A. M.; Percival, J.; Sayres, C.
2015-02-01
We describe the design, manufacture, and performance of bare-fiber integral field units (IFUs) for the SDSS-IV survey Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) on the Sloan 2.5 m telescope at Apache Point Observatory. MaNGA is a luminosity-selected integral-field spectroscopic survey of 10^4 local galaxies covering 360-1030 nm at R ~ 2200. The IFUs have hexagonal dense packing of fibers with packing regularity of 3 μm (rms), and throughput of 96 ± 0.5% from 350 nm to 1 μm in the lab. Their sizes range from 19 to 127 fibers (3-7 hexagonal layers) using Polymicro FBP 120:132:150 μm core:clad:buffer fibers to reach a fill fraction of 56%. High throughput (and low focal-ratio degradation (FRD)) is achieved by maintaining the fiber cladding and buffer intact, ensuring excellent surface polish, and applying a multi-layer anti-reflection (AR) coating to the input and output surfaces. In operations on-sky, the IFUs show only an additional 2.3% FRD-related variability in throughput despite repeated mechanical stressing during plate plugging (although other losses are present). The IFUs achieve on-sky throughput 5% above the single-fiber feeds used in SDSS-III/BOSS, attributable to equivalent performance compared to single fibers and additional gains from the AR coating. The manufacturing process is geared toward mass production of high-multiplex systems. The low-stress process involves a precision ferrule with a hexagonal inner shape designed to lead inserted fibers to settle into a dense hexagonal pattern. The ferrule ID is tapered at progressively shallower angles toward its tip, and the final 2 mm are straight and only a few microns larger than necessary to hold the desired number of fibers. Our IFU manufacturing process scales easily to accommodate other fiber sizes and can produce IFUs with substantially larger fiber counts. To assure quality, automated testing in a simple and inexpensive system enables complete characterization of throughput and fiber metrology. Future applications include larger IFUs, higher fill factors with stripped buffer, de-cladding, and lenslet coupling.
20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
Evaluation of Sequencing Approaches for High-Throughput Transcriptomics - (BOSC)
Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. The generation of high-throughput global gene expression...
The US EPA ToxCast Program: Moving from Data Generation ...
The U.S. EPA ToxCast program is entering its tenth year. Significant learning and progress have occurred towards collection, analysis, and interpretation of the data. The library of ~1,800 chemicals has been subject to ongoing characterization (e.g., identity, purity, stability) and is unique in its scope, structural diversity, and use scenarios making it ideally suited to investigate the underlying molecular mechanisms of toxicity. The ~700 high-throughput in vitro assay endpoints cover 327 genes and 293 pathways as well as other integrated cellular processes and responses. The integrated analysis of high-throughput screening data has shown that most environmental and industrial chemicals are very non-selective in the biological targets they perturb, while a small subset of chemicals are relatively selective for specific biological targets. The selectivity of a chemical informs interpretation of the screening results while also guiding future mode-of-action or adverse outcome pathway approaches. Coupling the high-throughput in vitro assays with medium-throughput pharmacokinetic assays and reverse dosimetry allows conversion of the potency estimates to an administered dose. Comparison of the administered dose to human exposure provides a risk-based context. The lessons learned from this effort will be presented and discussed towards application to chemical safety decision making and the future of the computational toxicology program at the U.S. EPA. SOT pr
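The dose conversion mentioned above (reverse dosimetry) is often implemented by scaling an in vitro potency by a predicted steady-state plasma concentration. The sketch below shows that arithmetic with hypothetical numbers; the function name and values are assumptions for illustration, not ToxCast results.

```python
def oral_equivalent_dose(ac50_uM, css_uM_per_mg_kg_day):
    """Reverse dosimetry in the style used with high-throughput screening data:
    scale an in vitro potency (AC50) by the steady-state plasma concentration
    predicted for a 1 mg/kg/day dose to get an administered-dose equivalent."""
    return ac50_uM / css_uM_per_mg_kg_day

# Hypothetical values, for illustration only.
aed = oral_equivalent_dose(ac50_uM=3.0, css_uM_per_mg_kg_day=1.5)
print(f"oral equivalent dose: {aed:.1f} mg/kg/day")
# Comparing this dose to an exposure estimate (mg/kg/day) gives the
# risk-based context described in the abstract.
```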
Adverse outcome pathway networks II: Network analytics
The US EPA is developing more cost effective and efficient ways to evaluate chemical safety using high throughput and computationally based testing strategies. An important component of this approach is the ability to translate chemical effects on fundamental biological processes...
Laurens, L M L; Wolfrum, E J
2013-12-18
One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate high-quality predictions for an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good predictions relative to a ring-cup configuration, and thus spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus the use of multivariate statistical modeling approaches remains necessary.
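A multivariate calibration of the kind described here can be sketched with partial least squares regression (a common choice for NIR data, used here as a stand-in for the authors' multivariate linear regression): spectra as predictors, lipid/protein/carbohydrate fractions as responses. The data below are synthetic placeholders, so the printed R^2 only demonstrates usage.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR spectra (samples x wavelengths) and measured
# lipid, protein, and carbohydrate fractions; all values are placeholders.
rng = np.random.default_rng(2)
X = rng.normal(size=(96, 300))                 # one 96-well plate of spectra
Y = rng.uniform(0.05, 0.5, size=(96, 3))       # lipid, protein, carbohydrate

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=2)
pls = PLSRegression(n_components=8)
pls.fit(X_tr, Y_tr)
print("R^2 on held-out samples:", round(pls.score(X_te, Y_te), 3))
```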
Automated Microfluidic Instrument for Label-Free and High-Throughput Cell Separation.
Zhang, Xinjie; Zhu, Zhixian; Xiang, Nan; Long, Feifei; Ni, Zhonghua
2018-03-20
Microfluidic technologies for cell separation have been reported frequently in recent years. However, a compact microfluidic instrument enabling fully automated cell separation is still rarely reported, owing to the difficulty of integrating macro-scale fluid-control systems with micro-scale microfluidic devices. In this work, we propose a novel, automated microfluidic instrument to realize size-based separation of cancer cells in a label-free and high-throughput manner. Briefly, the instrument is equipped with a fully integrated microfluidic device and a set of robust fluid-driving and control units, and the instrument functions of precise fluid infusion and high-throughput cell separation are guaranteed by a flow regulatory chip and two cell separation chips, which are the key components of the microfluidic device. With optimized control programs, the instrument is successfully applied to automatically sort the human breast adenocarcinoma cell line MCF-7 from 5 mL of diluted human blood with a high recovery ratio of ∼85% within a rapid processing time of ∼23 min. We envision that our microfluidic instrument will be potentially useful in many biomedical applications, especially cell separation, enrichment, and concentration for the purpose of cell culture and analysis.
Automated Purification of Recombinant Proteins: Combining High-throughput with High Yield
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Chiann Tso; Moore, Priscilla A.; Auberry, Deanna L.
2006-05-01
Protein crystallography, the mapping of protein interactions and other approaches of current functional genomics require not only purifying large numbers of proteins but also obtaining sufficient yield and homogeneity for downstream high-throughput applications. There is a need for robust automated high-throughput protein expression and purification processes to meet these requirements. We developed and compared two alternative workflows for automated purification of recombinant proteins based on expression of bacterial genes in Escherichia coli: first, a filtration-separation protocol based on expression in 800 mL E. coli cultures followed by filtration purification using Ni2+-NTA Agarose (Qiagen); second, a smaller-scale magnetic separation method based on expression in 25 mL cultures of E. coli followed by 96-well purification on MagneHis Ni2+ Agarose (Promega). Both workflows provided comparable average yields of about 8 µg of purified protein per unit of OD at 600 nm of bacterial culture. We discuss advantages and limitations of the automated workflows, which can provide proteins of more than 90% purity in amounts of 100 µg to 45 mg per purification run, as well as strategies for optimization of these protocols.
Lee, Dennis; Barnes, Stephen
2010-01-01
The need for new pharmacological agents is unending. Yet the drug discovery process has changed substantially over the past decade and continues to evolve in response to new technologies. There is presently a high demand to reduce discovery time by improving specific lab disciplines and developing new technology platforms in the area of cell-based assay screening. Here we present the developmental concept and early stage testing of the Ab-Sniffer, a novel fiber optic fluorescence device for high-throughput cytotoxicity screening using an immobilized whole cell approach. The fused silica fibers are chemically functionalized with biotin to provide interaction with fluorescently labeled, streptavidin functionalized alginate-chitosan microspheres. The microspheres are also functionalized with Concanavalin A to facilitate binding to living cells. By using lymphoma cells and rituximab in an adaptation of a well-known cytotoxicity protocol we demonstrate the utility of the Ab-Sniffer for functional screening of potential drug compounds rather than indirect, non-functional screening via binding assay. The platform can be extended to any assay capable of being tied to a fluorescence response including multiple target cells in each well of a multi-well plate for high-throughput screening.
An Efficient Semi-supervised Learning Approach to Predict SH2 Domain Mediated Interactions.
Kundu, Kousik; Backofen, Rolf
2017-01-01
Src homology 2 (SH2) domains are an important subclass of modular protein domains that play an indispensable role in several biological processes in eukaryotes. SH2 domains specifically bind to phosphotyrosine residues of their binding peptides to facilitate various molecular functions. To determine the subtle binding specificities of SH2 domains, it is very important to understand the intriguing mechanisms by which these domains recognize their target peptides in a complex cellular environment. Several attempts have been made to predict SH2-peptide interactions using high-throughput data. However, these high-throughput data are often affected by a low signal-to-noise ratio. Furthermore, the prediction methods have several additional shortcomings, such as the linearity problem and high computational complexity. Thus, computational identification of SH2-peptide interactions using high-throughput data remains challenging. Here, we propose a machine learning approach based on an efficient semi-supervised learning technique for the prediction of interactions mediated by 51 SH2 domains in the human proteome. In our study, we successfully employed several strategies to tackle the major problems in computational identification of SH2-peptide interactions.
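As a generic illustration of semi-supervised prediction from a small labelled set plus many unlabelled examples, the sketch below uses scikit-learn's self-training wrapper around an SVM. It is not the authors' specific technique, and the peptide features and labels are synthetic placeholders.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.semi_supervised import SelfTrainingClassifier

# Synthetic stand-in for peptide feature vectors: a few labelled interactions
# (1) and non-interactions (0), plus many unlabelled peptides marked with -1.
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 40))
y = np.full(500, -1)                               # -1 means unlabelled
labelled = rng.choice(500, size=60, replace=False)
y[labelled] = rng.integers(0, 2, size=60)

base = SVC(probability=True, kernel="rbf")          # probabilities drive self-training
model = SelfTrainingClassifier(base, threshold=0.8)
model.fit(X, y)                                      # unlabelled points get pseudo-labels
print("predicted interaction labels:", model.predict(X[:5]))
```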
Automatic Segmentation of High-Throughput RNAi Fluorescent Cellular Images
Yan, Pingkum; Zhou, Xiaobo; Shah, Mubarak; Wong, Stephen T. C.
2010-01-01
High-throughput genome-wide RNA interference (RNAi) screening is emerging as an essential tool to assist biologists in understanding complex cellular processes. The large number of images produced in each study makes manual analysis intractable; hence, automatic cellular image analysis becomes an urgent need, in which segmentation is the first and one of the most important steps. In this paper, a fully automatic method for segmentation of cells from genome-wide RNAi screening images is proposed. Nuclei are first extracted from the DNA channel by using a modified watershed algorithm. Cells are then extracted by modeling the interaction between them as well as combining both gradient and region information in the Actin and Rac channels. A new energy functional is formulated based on a novel interaction model for segmenting tightly clustered cells with significant intensity variance and specific phenotypes. The energy functional is minimized by using a multiphase level set method, which leads to a highly effective cell segmentation method. Promising experimental results demonstrate that automatic segmentation of high-throughput genome-wide multichannel screening images can be achieved by using the proposed method, which may also be extended to other multichannel image segmentation problems. PMID:18270043
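The nucleus-extraction step can be approximated with a standard marker-controlled watershed, as sketched below. The authors' modified watershed and multiphase level-set refinement are not reproduced, and the input image is a synthetic placeholder.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu, gaussian
from skimage.feature import peak_local_max
from skimage.segmentation import watershed
from skimage.measure import label

def segment_nuclei(dna_channel):
    """Marker-controlled watershed on a DNA-channel image: smooth, threshold,
    distance transform, local maxima as seeds, then watershed."""
    smoothed = gaussian(dna_channel, sigma=2)
    mask = smoothed > threshold_otsu(smoothed)
    distance = ndi.distance_transform_edt(mask)
    peaks = peak_local_max(distance, min_distance=10, labels=label(mask))
    markers = np.zeros_like(distance, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=mask)

# Placeholder image standing in for one RNAi screening field of view.
rng = np.random.default_rng(4)
img = gaussian(rng.random((256, 256)), sigma=5)
seg = segment_nuclei(img)
print("nuclei found:", seg.max())
```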
Leung, Elo; Huang, Amy; Cadag, Eithon; ...
2016-01-20
In this study, we introduce the Protein Sequence Annotation Tool (PSAT), a web-based, sequence annotation meta-server for performing integrated, high-throughput, genome-wide sequence analyses. Our goals in building PSAT were to (1) create an extensible platform for integration of multiple sequence-based bioinformatics tools, (2) enable functional annotations and enzyme predictions over large input protein FASTA data sets, and (3) provide a web interface for convenient execution of the tools. In this paper, we demonstrate the utility of PSAT by annotating the predicted peptide gene products of Herbaspirillum sp. strain RV1423, importing the results of PSAT into EC2KEGG, and using the resulting functional comparisons to identify a putative catabolic pathway, thereby distinguishing RV1423 from a well-annotated Herbaspirillum species. This analysis demonstrates that high-throughput enzyme predictions, provided by PSAT processing, can be used to identify metabolic potential in an otherwise poorly annotated genome. Lastly, PSAT is a meta-server that combines the results from several sequence-based annotation and function prediction codes, and is available at http://psat.llnl.gov/psat/. PSAT stands apart from other sequence-based genome annotation systems in providing a high-throughput platform for rapid de novo enzyme predictions and sequence annotations over large input protein sequence data sets in FASTA format. PSAT is most appropriately applied in the annotation of large protein FASTA sets that may or may not be associated with a single genome.
2014-01-01
Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312
Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang
2014-03-05
RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.