Erickson, Heidi S
2012-09-28
The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Lee, Hochul; Ebrahimi, Farbod; Amiri, Pedram Khalili; Wang, Kang L.
2017-05-01
A true random number generator based on perpendicularly magnetized voltage-controlled magnetic tunnel junction devices (MRNG) is presented. Unlike MTJs used in memory applications, where a stable bit is needed to store information, in this work the MTJ is intentionally designed with small perpendicular magnetic anisotropy (PMA). This allows one to take advantage of the thermally activated fluctuations of its free layer as a stochastic noise source. Furthermore, we take advantage of the voltage dependence of anisotropy to temporarily change the MTJ state into an unstable state when a voltage is applied. Since the MTJ has two energetically stable states, the final state is randomly chosen by thermal fluctuation. The voltage-controlled magnetic anisotropy (VCMA) effect is used to generate the metastable state of the MTJ by lowering its energy barrier. The proposed MRNG achieves a high throughput (32 Gbps) by integrating a 64 × 64 MTJ array with CMOS circuits and executing operations in a parallel manner. Furthermore, the circuit consumes very low energy to generate a random bit (31.5 fJ/bit) due to the high energy efficiency of the voltage-controlled MTJ switching.
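A quick back-of-the-envelope check relates the reported throughput and energy figures. This is a sketch only; it assumes all 4096 cells of the 64 × 64 array generate one bit per cycle fully in parallel, which the abstract implies but does not state.

```python
# Back-of-the-envelope check of the reported MRNG figures (sketch only).
# Assumption (implied but not stated in the abstract): all 64 x 64 cells
# produce one random bit per generation cycle, fully in parallel.

array_bits = 64 * 64                # 4096 MTJ cells
throughput_bps = 32e9               # reported 32 Gbps aggregate throughput
energy_per_bit_j = 31.5e-15         # reported 31.5 fJ per generated bit

cycle_rate_hz = throughput_bps / array_bits      # per-cell generation rate
power_w = throughput_bps * energy_per_bit_j      # total generation power

print(f"per-cell rate: {cycle_rate_hz / 1e6:.2f} MHz")   # ~7.81 MHz
print(f"total power:   {power_w * 1e3:.2f} mW")          # ~1.01 mW
```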
Quantitative high-throughput population dynamics in continuous-culture by automated microscopy.
Merritt, Jason; Kuehn, Seppe
2016-09-12
We present a high-throughput method to measure abundance dynamics in microbial communities sustained in continuous-culture. Our method uses custom epi-fluorescence microscopes to automatically image single cells drawn from a continuously-cultured population while precisely controlling culture conditions. For clonal populations of Escherichia coli our instrument reveals history-dependent resilience and growth rate dependent aggregation.
State of the Art High-Throughput Approaches to Genotoxicity: Flow Micronucleus, Ames II, GreenScreen and Comet (presented by Dr. Marilyn J. Aardema, Chief Scientific Advisor, Toxicology, Dr. Leon Stankowski, et al., 6/28/2012)
Duarte, José M; Barbier, Içvara; Schaerli, Yolanda
2017-11-17
Synthetic biologists increasingly rely on directed evolution to optimize engineered biological systems. Applying an appropriate screening or selection method for identifying the potentially rare library members with the desired properties is a crucial step for success in these experiments. Special challenges include substantial cell-to-cell variability and the requirement to check multiple states (e.g., being ON or OFF depending on the input). Here, we present a high-throughput screening method that addresses these challenges. First, we encapsulate single bacteria into microfluidic agarose gel beads. After incubation, they harbor monoclonal bacterial microcolonies (e.g., expressing a synthetic construct) and can be sorted according to their fluorescence by fluorescence-activated cell sorting (FACS). We determine enrichment rates and demonstrate that we can measure the average fluorescent signals of microcolonies containing phenotypically heterogeneous cells, obviating the problem of cell-to-cell variability. Finally, we apply this method to sort a pBAD promoter library at ON and OFF states.
Pseudouridines have context-dependent mutation and stop rates in high-throughput sequencing.
Zhou, Katherine I; Clark, Wesley C; Pan, David W; Eckwahl, Matthew J; Dai, Qing; Pan, Tao
2018-05-11
The abundant RNA modification pseudouridine (Ψ) has been mapped transcriptome-wide by chemically modifying pseudouridines with carbodiimide and detecting the resulting reverse transcription stops in high-throughput sequencing. However, these methods have limited sensitivity and specificity, in part due to the use of reverse transcription stops. We sought to use mutations rather than just stops in sequencing data to identify pseudouridine sites. Here, we identify reverse transcription conditions that allow read-through of carbodiimide-modified pseudouridine (CMC-Ψ), and we show that pseudouridines in carbodiimide-treated human ribosomal RNA have context-dependent mutation and stop rates in high-throughput sequencing libraries prepared under these conditions. Furthermore, accounting for the context-dependence of mutation and stop rates can enhance the detection of pseudouridine sites. Similar approaches could contribute to the sequencing-based detection of many RNA modifications.
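The per-site quantities underlying this kind of analysis reduce to simple ratios over aligned reads. The sketch below is only a generic illustration with hypothetical count inputs and names, not the authors' pipeline.

```python
# Illustrative per-site mutation and stop rates from pileup-style counts.
# Hypothetical inputs and names; this is not the authors' analysis pipeline.
def site_rates(readthrough, mismatches, stops):
    """readthrough: reads that read through the site; mismatches: read-through
    reads with a non-reference base at the site; stops: reads whose reverse
    transcription stop maps to the site."""
    reads_reaching_site = readthrough + stops
    mutation_rate = mismatches / readthrough if readthrough else 0.0
    stop_rate = stops / reads_reaching_site if reads_reaching_site else 0.0
    return mutation_rate, stop_rate

# Example: a putative CMC-modified site reached by 1000 reads in total.
print(site_rates(readthrough=950, mismatches=230, stops=50))  # (~0.24, 0.05)
```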
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 CFR, Protection of Environment (2010-07-01): Table 9 to Subpart EEEE of Part 63—Continuous Compliance With Operating Limits—High Throughput Transfer Racks. As stated in §§ 63.2378(a) and (b) ...
Ryall, Karen A; Shin, Jimin; Yoo, Minjae; Hinz, Trista K; Kim, Jihye; Kang, Jaewoo; Heasley, Lynn E; Tan, Aik Choon
2015-12-01
Targeted kinase inhibitors have dramatically improved cancer treatment, but kinase dependency for an individual patient or cancer cell can be challenging to predict. Kinase dependency does not always correspond with gene expression and mutation status. High-throughput drug screens are powerful tools for determining kinase dependency, but drug polypharmacology can make results difficult to interpret. We developed Kinase Addiction Ranker (KAR), an algorithm that integrates high-throughput drug screening data, comprehensive kinase inhibition data and gene expression profiles to identify kinase dependency in cancer cells. We applied KAR to predict kinase dependency of 21 lung cancer cell lines and 151 leukemia patient samples using published datasets. We experimentally validated KAR predictions of FGFR and MTOR dependence in lung cancer cell line H1581, showing synergistic reduction in proliferation after combining ponatinib and AZD8055. KAR can be downloaded as a Python function or a MATLAB script along with example inputs and outputs at http://tanlab.ucdenver.edu/KAR/. Contact: aikchoon.tan@ucdenver.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
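The abstract does not spell out the KAR scoring function, so the sketch below is only a generic illustration of the kind of data integration described, weighting drug-kinase inhibition profiles by measured drug sensitivity. It is not the published algorithm, and all names and numbers are hypothetical.

```python
import numpy as np

# Generic illustration of ranking kinases by combining drug-screen sensitivity
# with drug-kinase inhibition profiles. This is NOT the published KAR scoring
# function; it is a hypothetical weighted-evidence heuristic.
def rank_kinases(sensitivity, inhibition, kinases):
    """sensitivity: (n_drugs,) array, higher = cells more sensitive to the drug.
    inhibition: (n_drugs, n_kinases) array, fraction of each kinase inhibited by each drug.
    kinases: list of kinase names, length n_kinases."""
    # Weight each drug's kinase-inhibition profile by how sensitive the cells were.
    evidence = sensitivity @ inhibition                  # (n_kinases,)
    # Normalize by how often each kinase is hit across the library (polypharmacology control).
    evidence /= inhibition.sum(axis=0) + 1e-9
    order = np.argsort(evidence)[::-1]
    return [(kinases[i], float(evidence[i])) for i in order]

sens = np.array([0.9, 0.1, 0.7])                         # toy drug-sensitivity scores
inhib = np.array([[0.8, 0.1], [0.2, 0.9], [0.7, 0.0]])   # toy drug x kinase inhibition
print(rank_kinases(sens, inhib, ["FGFR1", "MTOR"]))
```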
40 CFR Table 3 to Subpart EEEE of... - Operating Limits-High Throughput Transfer Racks
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 CFR, Protection of Environment (2010-07-01): Table 3 to Subpart EEEE of Part 63—Operating Limits—High Throughput Transfer Racks. As stated in § 63.2346(e), you must comply with the operating limits for existing ...
High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT)
USDA-ARS's Scientific Manuscript database
Implementation of molecular methods in hop breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. Diversity Arrays Technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of...
We demonstrate a computational network model that integrates 18 in vitro, high-throughput screening assays measuring estrogen receptor (ER) binding, dimerization, chromatin binding, transcriptional activation and ER-dependent cell proliferation. The network model uses activity pa...
Tome, Jacob M; Ozer, Abdullah; Pagano, John M; Gheba, Dan; Schroth, Gary P; Lis, John T
2014-06-01
RNA-protein interactions play critical roles in gene regulation, but methods to quantitatively analyze these interactions at a large scale are lacking. We have developed a high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay by adapting a high-throughput DNA sequencer to quantify the binding of fluorescently labeled protein to millions of RNAs anchored to sequenced cDNA templates. Using HiTS-RAP, we measured the affinity of mutagenized libraries of GFP-binding and NELF-E-binding aptamers to their respective targets and identified critical regions of interaction. Mutations additively affected the affinity of the NELF-E-binding aptamer, whose interaction depended mainly on a single-stranded RNA motif, but not that of the GFP aptamer, whose interaction depended primarily on secondary structure.
2015-01-01
High-throughput production of nanoparticles (NPs) with controlled quality is critical for their clinical translation into effective nanomedicines for diagnostics and therapeutics. Here we report a simple and versatile coaxial turbulent jet mixer that can synthesize a variety of NPs at high throughput up to 3 kg/d, while maintaining the advantages of homogeneity, reproducibility, and tunability that are normally accessible only in specialized microscale mixing devices. The device fabrication does not require specialized machining and is easy to operate. As one example, we show reproducible, high-throughput formulation of siRNA-polyelectrolyte polyplex NPs that exhibit effective gene knockdown but exhibit significant dependence on batch size when formulated using conventional methods. The coaxial turbulent jet mixer can accelerate the development of nanomedicines by providing a robust and versatile platform for preparation of NPs at throughputs suitable for in vivo studies, clinical trials, and industrial-scale production. PMID:24824296
Reichman, Melvin; Schabdach, Amanda; Kumar, Meera; Zielinski, Tom; Donover, Preston S; Laury-Kleintop, Lisa D; Lowery, Robert G
2015-12-01
Ras homologous (Rho) family GTPases act as molecular switches controlling cell growth, movement, and gene expression by cycling between inactive guanosine diphosphate (GDP)- and active guanosine triphosphate (GTP)-bound conformations. Guanine nucleotide exchange factors (GEFs) positively regulate Rho GTPases by accelerating GDP dissociation to allow formation of the active, GTP-bound complex. Rho proteins are directly involved in cancer pathways, especially cell migration and invasion, and inhibiting GEFs holds potential as a therapeutic strategy to diminish Rho-dependent oncogenesis. Methods for measuring GEF activity suitable for high-throughput screening (HTS) are limited. We developed a simple, generic biochemical assay method for measuring GEF activity based on the fact that GDP dissociation is generally the rate-limiting step in the Rho GTPase catalytic cycle, and thus addition of a GEF causes an increase in steady-state GTPase activity. We used the Transcreener GDP Assay, which relies on selective immunodetection of GDP, to measure the GEF-dependent stimulation of steady-state GTP hydrolysis by small GTPases using Dbs (Dbl's big sister) as a GEF for Cdc42, RhoA, and RhoB. The assay is well suited for HTS, with a homogenous format and far red fluorescence polarization (FP) readout, and it should be broadly applicable to diverse Rho GEF/GTPase pairs. © 2015 Society for Laboratory Automation and Screening.
USDA-ARS's Scientific Manuscript database
Recent developments in high-throughput sequencing technology have made low-cost sequencing an attractive approach for many genome analysis tasks. Increasing read lengths, improving quality and the production of increasingly larger numbers of usable sequences per instrument-run continue to make whole...
Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister
2014-05-01
The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland (FIMM) was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate-based chemical screening and high-content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening are performed at different scales, primarily in multiwell plate-based assays with a wide range of readout possibilities and a focus on ultraminiaturization to allow affordable screening for academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biology platforms for functional profiling of patient cells in personalized and precision medicine projects.
High throughput system for magnetic manipulation of cells, polymers, and biomaterials
Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.
2008-01-01
In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of SR2+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357
High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials
United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...
Under the ExpoCast program, United States Environmental Protection Agency (EPA) researchers have developed a high-throughput (HT) framework for estimating aggregate exposures to chemicals from multiple pathways to support rapid prioritization of chemicals. Here, we present method...
High-Throughput Models for Exposure-Based Chemical Prioritization in the ExpoCast Project
The United States Environmental Protection Agency (U.S. EPA) must characterize potential risks to human health and the environment associated with manufacture and use of thousands of chemicals. High-throughput screening (HTS) for biological activity allows the ToxCast research pr...
TCP Throughput Profiles Using Measurements over Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata
Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincaré map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
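The paper's throughput model itself is not reproduced in the abstract; the toy sketch below (hypothetical parameters throughout) only illustrates the ramp-up/sustainment decomposition: when the ramp-up phase is short relative to a large transfer, average throughput falls off slowly with RTT, which is the qualitative origin of the concave regions described.

```python
import math

# Toy ramp-up + sustainment abstraction of a large TCP transfer over a dedicated
# link (hypothetical parameters; not the model from the paper). The window doubles
# each RTT until the peak rate is reached, then the flow is sustained at peak rate.
def avg_throughput_gbps(rtt_s, transfer_gbits=1000.0, peak_gbps=10.0, w0_gbits=0.01):
    n_ramp = max(0, math.ceil(math.log2(peak_gbps * rtt_s / w0_gbits)))
    ramp_time = n_ramp * rtt_s
    ramp_data = sum(min(w0_gbits * 2 ** k, peak_gbps * rtt_s) for k in range(n_ramp))
    remaining = max(0.0, transfer_gbits - ramp_data)
    total_time = ramp_time + remaining / peak_gbps
    return transfer_gbits / total_time

for rtt_ms in (1, 10, 100, 366):
    print(rtt_ms, "ms RTT ->", round(avg_throughput_gbps(rtt_ms / 1000.0), 2), "Gbps")
```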
Lai, Y W; Hamann, S; Ehmann, M; Ludwig, A
2011-06-01
We report the development of an advanced high-throughput stress characterization method for thin film materials libraries sputter-deposited on micro-machined cantilever arrays consisting of around 1500 cantilevers on 4-inch silicon-on-insulator wafers. A low-cost custom-designed digital holographic microscope (DHM) is employed to simultaneously monitor the thin film thickness, the surface topography and the curvature of each of the cantilevers before and after deposition. The variation in stress state across the thin film materials library is then calculated by Stoney's equation based on the obtained radii of curvature of the cantilevers and film thicknesses. DHM with nanometer-scale out-of-plane resolution allows stress measurements in a wide range, at least from several MPa to several GPa. By using an automatic x-y translation stage, the local stresses within a 4-inch materials library are mapped with high accuracy within 10 min. The speed of measurement is greatly improved compared with the prior laser scanning approach that needs more than an hour of measuring time. A high-throughput stress measurement of an as-deposited Fe-Pd-W materials library was evaluated for demonstration. The fast characterization method is expected to accelerate the development of (functional) thin films, e.g., (magnetic) shape memory materials, whose functionality is greatly stress dependent. © 2011 American Institute of Physics
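For reference, the curvature-to-stress step is the standard Stoney relation. Below is a minimal sketch, with illustrative substrate constants typical of silicon rather than values taken from the paper.

```python
# Film stress from cantilever curvature via Stoney's equation (illustrative
# numbers only; substrate properties approximate those of (100) silicon).
def stoney_stress(R_after_m, R_before_m, t_substrate_m, t_film_m,
                  E_substrate_pa=130e9, poisson_substrate=0.28):
    """Returns the biaxial film stress in Pa; sign follows the curvature change."""
    biaxial_modulus = E_substrate_pa / (1.0 - poisson_substrate)
    curvature_change = 1.0 / R_after_m - 1.0 / R_before_m
    return biaxial_modulus * t_substrate_m ** 2 * curvature_change / (6.0 * t_film_m)

# Example: 5 um thick cantilever, 200 nm film, curvature goes from nearly flat
# to a 0.2 m radius after deposition.
stress_pa = stoney_stress(R_after_m=0.2, R_before_m=1e6, t_substrate_m=5e-6, t_film_m=200e-9)
print(round(stress_pa / 1e6, 1), "MPa")
```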
In vitro, high-throughput approaches have been widely recommended as an approach to screen chemicals for the potential to cause developmental neurotoxicity and prioritize them for additional testing. The choice of cellular models for such an approach will have important ramificat...
High Throughput Transcriptomics @ USEPA (Toxicology ...
The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.
Valdés, Julio J; Bonham-Carter, Graeme
2006-03-01
A computational intelligence approach is used to explore the problem of detecting internal state changes in time-dependent processes described by heterogeneous, multivariate time series with imprecise data and missing values. Such processes are approximated by collections of time-dependent non-linear autoregressive models represented by a special kind of neuro-fuzzy neural network. Grid and high throughput computing model-mining procedures based on neuro-fuzzy networks and genetic algorithms generate: (i) collections of models composed of sets of time lag terms from the time series, and (ii) prediction functions represented by neuro-fuzzy networks. The composition of the models and their prediction capabilities allow the identification of changes in the internal structure of the process. These changes are associated with the alternation of steady and transient states, zones with abnormal behavior, instability, and other situations. This approach is general, and its sensitivity for detecting subtle changes of state is revealed by simulation experiments. Its potential in the study of complex processes in earth sciences and astrophysics is illustrated with applications using paleoclimate and solar data.
Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C
2016-01-01
Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput for such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, has helped attract new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.
Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.
Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S
1994-01-01
The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.
Bastianini, Stefano; Alvente, Sara; Berteotti, Chiara; Lo Martire, Viviana; Silvani, Alessandro; Swoap, Steven J; Valli, Alice; Zoccoli, Giovanna; Cohen, Gary
2017-01-31
A major limitation in the study of sleep breathing disorders in mouse models of pathology is the need to combine whole-body plethysmography (WBP) to measure respiration with electroencephalography/electromyography (EEG/EMG) to discriminate wake-sleep states. However, murine wake-sleep states may be discriminated from breathing and body movements registered by the WBP signal alone. Our goal was to compare the EEG/EMG-based and the WBP-based scoring of wake-sleep states of mice, and provide formal guidelines for the latter. EEG, EMG, blood pressure and WBP signals were simultaneously recorded from 20 mice. Wake-sleep states were scored based either on EEG/EMG or on WBP signals and sleep-dependent respiratory and cardiovascular estimates were calculated. We found that the overall agreement between the 2 methods was 90%, with a high Cohen's Kappa index (0.82). The inter-rater agreement between 2 expert investigators, and between 1 expert and 1 naïve investigator, gave similar results. Sleep-dependent respiratory and cardiovascular estimates did not depend on the scoring method. We show that non-invasive discrimination of the wake-sleep states of mice based on visual inspection of the WBP signal is accurate, reliable and reproducible. This work may set the stage for non-invasive high-throughput experiments evaluating sleep and breathing patterns in mouse models of pathophysiology.
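The agreement statistics quoted here are straightforward to reproduce for any pair of scorings of the same epochs; a minimal sketch with toy labels (not the study's data) is shown below.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Illustrative agreement check between two scorings of the same epochs
# (toy labels; W = wake, NR = NREM sleep, R = REM sleep). Not the study's data.
eeg_emg = np.array(["W", "W", "NR", "NR", "NR", "R", "W", "NR", "R", "NR"])
wbp     = np.array(["W", "W", "NR", "NR", "R",  "R", "W", "NR", "R", "NR"])

agreement = np.mean(eeg_emg == wbp)        # overall percent agreement
kappa = cohen_kappa_score(eeg_emg, wbp)    # chance-corrected agreement
print(f"agreement = {agreement:.0%}, Cohen's kappa = {kappa:.2f}")
```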
Using high-content imaging data from ToxCast to analyze toxicological tipping points (TDS)
Translating results obtained from high-throughput screening to risk assessment is vital for reducing dependence on animal testing. We studied the effects of 976 chemicals (ToxCast Phase I and II) in HepG2 cells using high-content imaging (HCI) to measure dose and time-depende...
High count-rate study of two TES x-ray microcalorimeters with different transition temperatures
NASA Astrophysics Data System (ADS)
Lee, Sang-Jun; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.; Sadleir, John E.; Smith, Stephen J.; Wassell, Edward J.
2017-10-01
We have developed transition-edge sensor (TES) microcalorimeter arrays with high count-rate capability and high energy resolution to carry out x-ray imaging spectroscopy observations of various astronomical sources and the Sun. We have studied the dependence of the energy resolution and throughput (fraction of processed pulses) on the count rate for such microcalorimeters with two different transition temperatures (Tc). Devices with both transition temperatures were fabricated within a single microcalorimeter array directly on top of a solid substrate where the thermal conductance of the microcalorimeter is dependent upon the thermal boundary resistance between the TES sensor and the dielectric substrate beneath. Because the thermal boundary resistance is highly temperature dependent, the two types of device with different Tc values had very different thermal decay times, approximately one order of magnitude different. In our earlier report, we achieved energy resolutions of 1.6 and 2.3 eV at 6 keV from the lower- and higher-Tc devices, respectively, using a standard analysis method based on optimal filtering in the low flux limit. We have now measured the same devices at elevated x-ray fluxes ranging from 50 Hz to 1000 Hz per pixel. In the high flux limit, however, the standard optimal filtering scheme nearly breaks down because of x-ray pile-up. To achieve the highest possible energy resolution for a fixed throughput, we have developed an analysis scheme based on the so-called event grade method. Using the new analysis scheme, we achieved 5.0 eV FWHM with 96% throughput for 6 keV x-rays at 1025 Hz per pixel with the higher-Tc (faster) device, and 5.8 eV FWHM with 97% throughput with the lower-Tc (slower) device at 722 Hz.
Optimizing transformations for automated, high throughput analysis of flow cytometry data.
Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael
2010-11-04
In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Our results indicate that the preferred transformation for fluorescence channels is a parameter-optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available.
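As a concrete illustration of the maximum-likelihood idea, the sketch below fits the cofactor of a one-parameter arcsinh transform by assuming the transformed events are approximately normal and including the Jacobian term. It is a simplified stand-in for what the flowTrans package does, not its actual code, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Choose the cofactor `a` of y = arcsinh(a * x) by maximum likelihood, assuming
# the transformed events are approximately normal. The Jacobian term log|dy/dx|
# keeps likelihoods comparable across different values of `a`.
def neg_log_likelihood(log_a, x):
    a = np.exp(log_a)
    y = np.arcsinh(a * x)
    log_jacobian = np.log(a) - 0.5 * np.log1p((a * x) ** 2)   # log dy/dx
    return 0.5 * x.size * np.log(np.var(y)) - log_jacobian.sum()

def fit_arcsinh_cofactor(x):
    res = minimize_scalar(neg_log_likelihood, args=(x,), bounds=(-10.0, 10.0), method="bounded")
    return float(np.exp(res.x))

# Synthetic "fluorescence" data whose true cofactor is 150.
rng = np.random.default_rng(0)
x = np.sinh(rng.normal(0.0, 1.5, size=5000)) / 150.0
print("estimated cofactor:", fit_arcsinh_cofactor(x))   # should land near 150
```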
Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome
2014-04-25
In this paper we give the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one-million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the predocking file preparation and postdocking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
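The actual MPI scripts and source are available at the URL above; the sketch below is only a generic master-worker task farm in mpi4py (hypothetical file names, and the docking call itself left as a comment), illustrating how a large ligand list can be distributed across ranks. It assumes more ligands than worker ranks and is not the Autodock4 MPI code.

```python
from mpi4py import MPI

# Generic master-worker task farm (illustrative only; not the Autodock4 MPI
# code). Assumes more ligand files than worker ranks. File names are hypothetical.
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
TAG_WORK, TAG_DONE, TAG_STOP = 1, 2, 3

if rank == 0:
    ligands = [f"ligand_{i:06d}.dpf" for i in range(1000)]
    status, results, next_task = MPI.Status(), [], 0
    for worker in range(1, size):                      # seed every worker once
        comm.send(ligands[next_task], dest=worker, tag=TAG_WORK)
        next_task += 1
    while len(results) < len(ligands):                 # collect, then refill or stop
        results.append(comm.recv(source=MPI.ANY_SOURCE, tag=TAG_DONE, status=status))
        if next_task < len(ligands):
            comm.send(ligands[next_task], dest=status.Get_source(), tag=TAG_WORK)
            next_task += 1
        else:
            comm.send(None, dest=status.Get_source(), tag=TAG_STOP)
    print(f"docked {len(results)} ligands")
else:
    status = MPI.Status()
    while True:
        task = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == TAG_STOP:
            break
        # ... run one docking job on `task` here (e.g. invoke autodock4) ...
        comm.send((task, "ok"), dest=0, tag=TAG_DONE)
```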
Khan, Arifa S; Vacante, Dominick A; Cassart, Jean-Pol; Ng, Siemon H S; Lambert, Christophe; Charlebois, Robert L; King, Kathryn E
Several nucleic-acid based technologies have recently emerged with capabilities for broad virus detection. One of these, high throughput sequencing, has the potential for novel virus detection because this method does not depend upon prior viral sequence knowledge. However, the use of high throughput sequencing for testing biologicals poses greater challenges as compared to other newly introduced tests due to its technical complexities and big data bioinformatics. Thus, the Advanced Virus Detection Technologies Users Group was formed as a joint effort by regulatory and industry scientists to facilitate discussions and provide a forum for sharing data and experiences using advanced new virus detection technologies, with a focus on high throughput sequencing technologies. The group was initiated as a task force that was coordinated by the Parenteral Drug Association and subsequently became the Advanced Virus Detection Technologies Interest Group to continue efforts for using new technologies for detection of adventitious viruses with broader participation, including international government agencies, academia, and technology service providers. © PDA, Inc. 2016.
Mathur, Priya; Guo, Su
2011-06-01
Zebrafish, a vertebrate model organism amenable to high throughput screening, is an attractive system to model and study the mechanisms underlying human diseases. Alcoholism and alcoholic medical disorders are among the most debilitating diseases, yet the mechanisms by which ethanol produces these disease states are not well understood. In recent years zebrafish behavior assays have been used to study learning and memory, fear and anxiety, and social behavior. It is important to characterize the effects of ethanol on zebrafish behavioral repertoires in order to successfully harness the strengths of zebrafish for alcohol research. One prominent effect of alcohol in humans is its effect on anxiety, with acute intermediate doses relieving anxiety and withdrawal from chronic exposure increasing anxiety, both of which contribute significantly to alcohol dependence. In this study, we assess the effects of both acute and chronic ethanol exposure on anxiety-like behaviors in zebrafish, using two behavioral paradigms, the Novel Tank Diving Test and the Light/Dark Choice Assay. Acute ethanol exposure exerted significant dose-dependent anxiolytic effects. However, withdrawal from repeated intermittent ethanol exposure disabled recovery from heightened anxiety. These results demonstrate that zebrafish exhibit different anxiety-like behavioral responses to acute and chronic ethanol exposure, which are remarkably similar to the effects of alcohol in humans. Because zebrafish are accessible to high throughput screening, our results suggest that genes and small molecules identified in zebrafish will be relevant to understanding how acute versus chronic alcohol exposure has opposing effects on the state of anxiety in humans. Copyright © 2011 Elsevier B.V. All rights reserved.
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...
2017-03-28
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.
Modeling Protein Expression and Protein Signaling Pathways
Telesca, Donatello; Müller, Peter; Kornblau, Steven M.; Suchard, Marc A.; Ji, Yuan
2015-01-01
High-throughput functional proteomic technologies provide a way to quantify the expression of proteins of interest. Statistical inference centers on identifying the activation state of proteins and their patterns of molecular interaction formalized as dependence structure. Inference on dependence structure is particularly important when proteins are selected because they are part of a common molecular pathway. In that case, inference on dependence structure reveals properties of the underlying pathway. We propose a probability model that represents molecular interactions at the level of hidden binary latent variables that can be interpreted as indicators for active versus inactive states of the proteins. The proposed approach exploits available expert knowledge about the target pathway to define an informative prior on the hidden conditional dependence structure. An important feature of this prior is that it provides an instrument to explicitly anchor the model space to a set of interactions of interest, favoring a local search approach to model determination. We apply our model to reverse-phase protein array data from a study on acute myeloid leukemia. Our inference identifies relevant subpathways in relation to the unfolding of the biological process under study. PMID:26246646
Hu, Peng; Fabyanic, Emily; Kwon, Deborah Y; Tang, Sheng; Zhou, Zhaolan; Wu, Hao
2017-12-07
Massively parallel single-cell RNA sequencing can precisely resolve cellular diversity in a high-throughput manner at low cost, but unbiased isolation of intact single cells from complex tissues such as adult mammalian brains is challenging. Here, we integrate sucrose-gradient-assisted purification of nuclei with droplet microfluidics to develop a highly scalable single-nucleus RNA-seq approach (sNucDrop-seq), which is free of enzymatic dissociation and nucleus sorting. By profiling ∼18,000 nuclei isolated from cortical tissues of adult mice, we demonstrate that sNucDrop-seq not only accurately reveals neuronal and non-neuronal subtype composition with high sensitivity but also enables in-depth analysis of transient transcriptional states driven by neuronal activity, at single-cell resolution, in vivo. Copyright © 2017 Elsevier Inc. All rights reserved.
High-throughput screening in niche-based assay identifies compounds to target preleukemic stem cells
Gerby, Bastien; Veiga, Diogo F.T.; Krosl, Jana; Nourreddine, Sami; Ouellette, Julianne; Haman, André; Lavoie, Geneviève; Fares, Iman; Tremblay, Mathieu; Litalien, Véronique; Ottoni, Elizabeth; Geoffrion, Dominique; Maddox, Paul S.; Chagraoui, Jalila; Hébert, Josée; Sauvageau, Guy; Kwok, Benjamin H.; Roux, Philippe P.
2016-01-01
Current chemotherapies for T cell acute lymphoblastic leukemia (T-ALL) efficiently reduce tumor mass. Nonetheless, disease relapse attributed to survival of preleukemic stem cells (pre-LSCs) is associated with poor prognosis. Herein, we provide direct evidence that pre-LSCs are much less chemosensitive to existing chemotherapy drugs than leukemic blasts because of a distinctive lower proliferative state. Improving therapies for T-ALL requires the development of strategies to target pre-LSCs that are absolutely dependent on their microenvironment. Therefore, we designed a robust protocol for high-throughput screening of compounds that target primary pre-LSCs maintained in a niche-like environment, on stromal cells that were engineered for optimal NOTCH1 activation. The multiparametric readout takes into account the intrinsic complexity of primary cells in order to specifically monitor pre-LSCs, which were induced here by the SCL/TAL1 and LMO1 oncogenes. We screened a targeted library of compounds and determined that the estrogen derivative 2-methoxyestradiol (2-ME2) disrupted both cell-autonomous and non–cell-autonomous pathways. Specifically, 2-ME2 abrogated pre-LSC viability and self-renewal activity in vivo by inhibiting translation of MYC, a downstream effector of NOTCH1, and preventing SCL/TAL1 activity. In contrast, normal hematopoietic stem/progenitor cells remained functional. These results illustrate how recapitulating tissue-like properties of primary cells in high-throughput screening is a promising avenue for innovation in cancer chemotherapy. PMID:27797342
2005-09-01
This research explores the need for a high-throughput, high-speed network for use in a network-centric wartime environment and how commercial ... Automated Digital Network System (ADNS).
Pinto, Nicolas; Doukhan, David; DiCarlo, James J; Cox, David D
2009-11-01
While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.
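In outline, the screening loop amounts to sampling many parameter instantiations, scoring each, and keeping the best for further analysis. The sketch below is a deliberately simplified, hypothetical version of that idea; in the actual work the evaluation step trains and tests full vision models on GPUs and Cell processors.

```python
import random

# Minimal illustration of the screening idea: randomly sample many model
# configurations, evaluate each, and keep the promising ones for closer study.
# The parameter grid and scoring function here are hypothetical.
PARAM_SPACE = {
    "units_per_layer": [64, 128, 256, 512],
    "pool_size": [3, 5, 7, 9],
    "normalization_exponent": [1.0, 1.5, 2.0],
    "learning_rate": [1e-4, 1e-3, 1e-2],
}

def sample_config(rng):
    return {name: rng.choice(values) for name, values in PARAM_SPACE.items()}

def screen(evaluate, n_candidates=1000, keep=10, seed=0):
    rng = random.Random(seed)
    scored = [(evaluate(cfg), cfg) for cfg in (sample_config(rng) for _ in range(n_candidates))]
    return sorted(scored, key=lambda s: s[0], reverse=True)[:keep]

# Toy stand-in for "train and test on an object-recognition task".
toy_eval = lambda cfg: cfg["units_per_layer"] / 512 - abs(cfg["learning_rate"] - 1e-3)
print(screen(toy_eval)[:3])
```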
Cornell, Thomas A; Fu, Jing; Newland, Stephanie H; Orner, Brendan P
2013-11-06
Proteins that form cage-like structures have been of much recent cross-disciplinary interest due to their application to bioconjugate and materials chemistry, their biological functions spanning multiple essential cellular processes, and their complex structure, often defined by highly symmetric protein–protein interactions. Thus, establishing the fundamentals of their formation, through detecting and quantifying important protein–protein interactions, could be crucial to understanding essential cellular machinery, and for further development of protein-based technologies. Herein we describe a method to monitor the assembly of protein cages by detecting specific, oligomerization-state-dependent, protein–protein interactions. Our strategy relies on engineering protein monomers to include cysteine pairs that are presented proximally if the cage state assembles. These assembled pairs of cysteines act as binding sites for the fluorescent reagent FlAsH, which, once bound, provides a readout for successful oligomerization. As a proof of principle, we applied this technique to the iron-storage protein DNA-binding protein from starved cells (Dps) of E. coli. Several linker lengths and conformations for the presentation of the cysteine pairs were screened to optimize the engineered binding sites. We confirmed that our designs were successful both in lysates and with purified proteins, and that FlAsH binding was dependent upon cage assembly. Following successful characterization of the assay, its throughput was expanded. A two-dimensional matrix of pH and denaturing buffer conditions was screened to optimize nanocage stability. We intend to use this method for the high throughput screening of protein cage libraries and of conditions for the generation of inorganic nanoparticles within the cavity of these and other cage proteins.
NASA Astrophysics Data System (ADS)
Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2018-02-01
Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including diseases pathogenesis.
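A minimal sketch of the downstream analysis on a cells-by-features phenotype matrix is shown below, using synthetic data and scikit-learn, with LDA standing in for the MANOVA-based discriminant analysis described. These are assumptions for illustration, not the authors' code or data.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Illustrative analysis of a biophysical-phenotype matrix (cells x features),
# on synthetic data; LDA is a simple stand-in for MANOVA discriminant analysis.
rng = np.random.default_rng(1)
n_per_phase, n_features = 300, 24
phases = np.repeat(["G1", "S", "G2"], n_per_phase)
# Synthetic feature shifts between phases (e.g., increasing cell size / dry mass).
offsets = {"G1": 0.0, "S": 0.6, "G2": 1.2}
X = np.vstack([rng.normal(offsets[p], 1.0, size=(n_per_phase, n_features))
               for p in ["G1", "S", "G2"]])

# 2-D embedding for visualizing cell-cycle progression.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
print("t-SNE embedding shape:", embedding.shape)

# Label-free-style phase prediction accuracy from the biophysical features alone.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, phases, cv=5).mean()
print(f"cross-validated phase prediction accuracy: {acc:.2f}")
```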
Annotare--a tool for annotating high-throughput biomedical investigations and resulting data.
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J; Ball, Catherine A
2010-10-01
Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows.
Mapping specificity landscapes of RNA-protein interactions by high throughput sequencing.
Jankowsky, Eckhard; Harris, Michael E
2017-04-15
To function in a biological setting, RNA binding proteins (RBPs) have to discriminate between alternative binding sites in RNAs. This discrimination can occur in the ground state of an RNA-protein binding reaction, in its transition state, or in both. The extent to which RBPs discriminate at these reaction states defines RBP specificity landscapes. Here, we describe the HiTS-Kin and HiTS-EQ techniques, which combine kinetic and equilibrium binding experiments with high throughput sequencing to quantitatively assess substrate discrimination for large numbers of substrate variants at ground and transition states of RNA-protein binding reactions. We discuss experimental design, practical considerations and data analysis and outline how a combination of HiTS-Kin and HiTS-EQ allows the mapping of RBP specificity landscapes. Copyright © 2017 Elsevier Inc. All rights reserved.
Label-free probing of genes by time-domain terahertz sensing.
Haring Bolivar, P; Brucherseifer, M; Nagel, M; Kurz, H; Bosserhoff, A; Büttner, R
2002-11-07
A sensing approach for the label-free characterization of genetic material with terahertz (THz) electromagnetic waves is presented. Time-resolved THz analysis of polynucleotides demonstrates a strong dependence of the complex refractive index of DNA molecules in the THz frequency range on their hybridization state. By monitoring THz signals one can thus infer the binding state (hybridized or denatured) of oligo- and polynucleotides, enabling the label-free determination of the genetic composition of unknown DNA sequences. A broadband experimental proof-of-principle in a free-space analytic configuration, as well as a higher-sensitivity approach using integrated THz sensors reaching femtomole detection levels and demonstrating the capability to detect single-base mutations, are presented. The potential application for next generation high-throughput label-free genetic analytic systems is discussed.
Benjamin, Elfrida R; Pruthi, Farhana; Olanrewaju, Shakira; Ilyin, Victor I; Crumley, Gregg; Kutlina, Elena; Valenzano, Kenneth J; Woodward, Richard M
2006-02-01
Voltage-gated sodium channels (NaChs) are relevant targets for pain, epilepsy, and a variety of neurological and cardiac disorders. Traditionally, it has been difficult to develop structure-activity relationships for NaCh inhibitors due to rapid channel kinetics and state-dependent compound interactions. Membrane potential (Vm) dyes in conjunction with a high-throughput fluorescence imaging plate reader (FLIPR) offer a satisfactory 1st-tier solution. Thus, the authors have developed a FLIPR Vm assay of the rat Nav1.2 NaCh. Channels were opened by addition of veratridine, and Vm dye responses were measured. The IC50 values from various structural classes of compounds were compared to the resting state binding constant (Kr) and inactivated state binding constant (Ki) obtained using patch-clamp electrophysiology (EP). The FLIPR values correlated with Ki but not Kr. FLIPR IC50 values fell within 0.1- to 1.5-fold of EP Ki values, indicating that the assay generally reports use-dependent inhibition rather than resting state block. The Library of Pharmacologically Active Compounds (LOPAC, Sigma) was screened. Confirmed hits arose from diverse classes such as dopamine receptor antagonists, serotonin transport inhibitors, and kinase inhibitors. These data suggest that NaCh inhibition is inherent in a diverse set of biologically active molecules and may warrant counterscreening NaChs to avoid unwanted secondary pharmacology.
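For context, concentration-response fitting behind IC50 values of this kind is commonly a four-parameter Hill fit. The sketch below uses synthetic, normalized responses; it is not the authors' analysis, and the parameter values are placeholders.

```python
# Hedged sketch: fit a four-parameter Hill (logistic) model to normalized
# plate-reader responses to estimate an IC50; the data here are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # micromolar
resp = np.array([0.98, 0.95, 0.90, 0.75, 0.52, 0.28, 0.12, 0.05])  # normalized signal

popt, _ = curve_fit(hill, conc, resp, p0=[0.0, 1.0, 1.0, 1.0])
print(f"Estimated IC50 ~ {popt[2]:.2f} uM, Hill slope ~ {popt[3]:.2f}")
```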
Lee, Ju Hee; Chen, Hongxiang; Kolev, Vihren; Aull, Katherine H.; Jung, Inhee; Wang, Jun; Miyamoto, Shoko; Hosoi, Junichi; Mandinova, Anna; Fisher, David E.
2014-01-01
Skin pigmentation is a complex process including melanogenesis within melanocytes and melanin transfer to the keratinocytes. To develop a comprehensive screening method for novel pigmentation regulators, we used immortalized melanocytes and keratinocytes in co-culture to screen large numbers of compounds. High-throughput screening plates were subjected to digital automated microscopy to quantify the pigmentation via brightfield microscopy. Compounds with pigment suppression were secondarily tested for their effects on expression of MITF and several pigment regulatory genes, and further validated in terms of non-toxicity to keratinocytes/melanocytes and dose dependent activity. The results demonstrate a high-throughput, high-content screening approach, which is applicable to the analysis of large chemical libraries using a co-culture system. We identified candidate pigmentation inhibitors from 4,000 screened compounds including zoxazolamine, 3-methoxycatechol, and alpha-mangostin, which were also shown to modulate expression of MITF and several key pigmentation factors, and are worthy of further evaluation for potential translation to clinical use. PMID:24438532
Critical evaluation of methods to incorporate entropy loss upon binding in high-throughput docking.
Salaniwal, Sumeet; Manas, Eric S; Alvarez, Juan C; Unwalla, Rayomand J
2007-02-01
Proper accounting of the positional/orientational/conformational entropy loss associated with protein-ligand binding is important to obtain reliable predictions of binding affinity. Herein, we critically examine two simplified statistical mechanics-based approaches, namely a constant penalty per rotor method, and a more rigorous method, referred to here as the partition function-based scoring (PFS) method, to account for such entropy losses in high-throughput docking calculations. Our results on the estrogen receptor beta and dihydrofolate reductase proteins demonstrate that, while the constant penalty method over-penalizes molecules for their conformational flexibility, the PFS method behaves in a more "DeltaG-like" manner by penalizing different rotors differently depending on their residual entropy in the bound state. Furthermore, in contrast to no entropic penalty or the constant penalty approximation, the PFS method does not exhibit any bias towards either rigid or flexible molecules in the hit list. Preliminary enrichment studies using a lead-like random molecular database suggest that an accurate representation of the "true" energy landscape of the protein-ligand complex is critical for reliable predictions of relative binding affinities by the PFS method. Copyright 2006 Wiley-Liss, Inc.
Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hura, Greg L.; Menon, Angeli L.; Hammel, Michal
2009-07-20
We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.
Single-Cell Analysis of Experience-Dependent Transcriptomic States in Mouse Visual Cortex
Hrvatin, Sinisa; Hochbaum, Daniel R.; Nagy, M. Aurel; Cicconet, Marcelo; Robertson, Keiramarie; Cheadle, Lucas; Zilionis, Rapolas; Ratner, Alex; Borges-Monroy, Rebeca; Klein, Allon M.; Sabatini, Bernardo L.; Greenberg, Michael E.
2017-01-01
Activity-dependent transcriptional responses shape cortical function. However, we lack a comprehensive understanding of the diversity of these responses across the full range of cortical cell types, and how these changes contribute to neuronal plasticity and disease. Here we applied high-throughput single-cell RNA-sequencing to investigate the breadth of transcriptional changes that occur across cell types in mouse visual cortex following exposure to light. We identified significant and divergent transcriptional responses to stimulation in each of the 30 cell types characterized, revealing 611 stimulus-responsive genes. Excitatory pyramidal neurons exhibit inter- and intra-laminar heterogeneity in the induction of stimulus responsive genes. Non-neuronal cells demonstrated clear transcriptional responses that may regulate experience-dependent changes in neurovascular coupling and myelination. Together, these results reveal the dynamic landscape of stimulus-dependent transcriptional changes that occur across cell types in visual cortex, which are likely critical for cortical function and may be sites of de-regulation in developmental brain disorders. PMID:29230054
A high throughput array microscope for the mechanical characterization of biomaterials
NASA Astrophysics Data System (ADS)
Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard
2015-02-01
In the last decade, the emergence of high throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties and the biochemical mechanisms controlling those properties contribute significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated high-throughput array microscope system that utilizes passive microbead diffusion to characterize mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue whose expression has been implicated in cancer progression.
High throughput protein production screening
Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA
2009-09-08
Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.
McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam
2011-07-01
The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core was established in 2002 at the University of Kansas with support from an NIH grant and the state of Kansas. It collaborates with investigators from national and international academic, nonprofit and pharmaceutical organizations in executing HTS-ready assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization. This is part two of a contribution from the KU HTS laboratory.
NASA Technical Reports Server (NTRS)
Baxley, Brian; Swieringa, Kurt; Berckefeldt, Rick; Boyle, Dan
2017-01-01
NASA's first Air Traffic Management Technology Demonstration (ATD-1) subproject successfully completed a 19-day flight test of an Interval Management (IM) avionics prototype. The prototype was built based on IM standards, integrated into two test aircraft, and then flown in real-world conditions to determine if the goals of improving aircraft efficiency and airport throughput during high-density arrival operations could be met. The ATD-1 concept of operation integrates advanced arrival scheduling, controller decision support tools, and the IM avionics to enable multiple time-based arrival streams into a high-density terminal airspace. IM contributes by calculating airspeeds that enable an aircraft to achieve a spacing interval behind the preceding aircraft. The IM avionics uses its data (route of flight, position, etc.) and Automatic Dependent Surveillance-Broadcast (ADS-B) state data from the Target aircraft to calculate this airspeed. The flight test demonstrated that the IM avionics prototype met the spacing accuracy design goal for three of the four IM operation types tested. The primary issue requiring attention for future IM work is the high rate of IM speed commands and speed reversals. In total, during this flight test, the IM avionics prototype showed significant promise in contributing to the goals of improving aircraft efficiency and airport throughput.
CellCognition: time-resolved phenotype annotation in high-throughput live cell imaging.
Held, Michael; Schmitz, Michael H A; Fischer, Bernd; Walter, Thomas; Neumann, Beate; Olma, Michael H; Peter, Matthias; Ellenberg, Jan; Gerlich, Daniel W
2010-09-01
Fluorescence time-lapse imaging has become a powerful tool to investigate complex dynamic processes such as cell division or intracellular trafficking. Automated microscopes generate time-resolved imaging data at high throughput, yet tools for quantification of large-scale movie data are largely missing. Here we present CellCognition, a computational framework to annotate complex cellular dynamics. We developed a machine-learning method that combines state-of-the-art classification with hidden Markov modeling for annotation of the progression through morphologically distinct biological states. Incorporation of time information into the annotation scheme was essential to suppress classification noise at state transitions and confusion between different functional states with similar morphology. We demonstrate generic applicability in different assays and perturbation conditions, including a candidate-based RNA interference screen for regulators of mitotic exit in human cells. CellCognition is published as open source software, enabling live-cell imaging-based screening with assays that directly score cellular dynamics.
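The combination of per-frame classification with hidden Markov modeling described above can be pictured with a small sketch: noisy frame-by-frame labels are smoothed by Viterbi decoding over a state-transition model, which suppresses spurious flips at state transitions. The two-state model and all probabilities below are illustrative assumptions, not CellCognition's implementation.

```python
# Hedged sketch: temporal smoothing of noisy per-frame class labels with a
# two-state hidden Markov model and Viterbi decoding; numbers are illustrative.
import numpy as np

states = ["interphase", "mitosis"]
log_trans = np.log([[0.95, 0.05],   # P(next state | current state)
                    [0.10, 0.90]])
log_emit = np.log([[0.8, 0.2],      # P(classifier label | true state)
                   [0.3, 0.7]])
obs = [0, 1, 0, 1, 1, 1, 0, 0]      # noisy frame-by-frame classifier output

def viterbi(obs, log_trans, log_emit, log_prior=np.log([0.5, 0.5])):
    dp = log_prior + log_emit[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = dp[:, None] + log_trans + log_emit[:, o][None, :]
        back.append(np.argmax(scores, axis=0))
        dp = np.max(scores, axis=0)
    path = [int(np.argmax(dp))]
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    return path[::-1]

print([states[s] for s in viterbi(obs, log_trans, log_emit)])
```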
Annotare—a tool for annotating high-throughput biomedical investigations and resulting data
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J.; Ball, Catherine A.
2010-01-01
Summary: Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Availability and Implementation: Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows. Contact: rshankar@stanford.edu PMID:20733062
Principles of Chromosome Architecture Revealed by Hi-C.
Eagen, Kyle P
2018-06-01
Chromosomes are folded and compacted in interphase nuclei, but the molecular basis of this folding is poorly understood. Chromosome conformation capture methods, such as Hi-C, combine chemical crosslinking of chromatin with fragmentation, DNA ligation, and high-throughput DNA sequencing to detect neighboring loci genome-wide. Hi-C has revealed the segregation of chromatin into active and inactive compartments and the folding of DNA into self-associating domains and loops. Depletion of CTCF, cohesin, or cohesin-associated proteins was recently shown to affect the majority of domains and loops in a manner that is consistent with a model of DNA folding through extrusion of chromatin loops. Compartmentation was not dependent on CTCF or cohesin. Hi-C contact maps represent the superimposition of CTCF/cohesin-dependent and -independent folding states. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Xiao, Ruijuan; Li, Hong; Chen, Liquan
2015-09-01
Looking for solid state electrolytes with fast lithium ion conduction is an important prerequisite for developing all-solid-state lithium secondary batteries. By combining simulation techniques at different levels of accuracy, e.g. the bond-valence (BV) method and density functional theory (DFT), a high-throughput design and optimization scheme is proposed for searching for fast lithium ion conductors as candidate solid state electrolytes for lithium rechargeable batteries. The screening of more than 1000 compounds is performed with the BV-based method, and the ability to predict a reliable tendency of the Li+ migration energy barriers is confirmed by comparing with results from DFT calculations. β-Li3PS4 is taken as a model system to demonstrate the application of this combined method in optimizing properties of solid electrolytes. By employing high-throughput DFT simulations of more than 200 structures of the doping derivatives of β-Li3PS4, the effects of doping on the ionic conductivities in this material are predicted by the BV calculations. The O-doping scheme is proposed as a promising way to improve the kinetic properties of this material, and the validity of the optimization is proved by first-principles molecular dynamics (FPMD) simulations.
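The tiered screening strategy described above (a cheap bond-valence estimate over the full library, with expensive DFT reserved for the shortlist) can be summarized in a hedged sketch; the barrier function, structure records, and 0.6 eV cutoff below are placeholders, not the authors' code.

```python
# Hedged sketch of a tiered high-throughput screen: a cheap bond-valence (BV)
# estimate filters a large library, and only low-barrier candidates are
# forwarded to costly DFT refinement. Functions and the cutoff are assumptions.

def bv_migration_barrier(structure):
    """Placeholder for a BV-based Li+ migration barrier estimate (eV)."""
    return structure["bv_barrier_eV"]

def screen(structures, cutoff_eV=0.6):
    shortlist = []
    for s in structures:
        if bv_migration_barrier(s) <= cutoff_eV:
            shortlist.append(s["name"])   # candidate for DFT refinement
    return shortlist

if __name__ == "__main__":
    library = [
        {"name": "beta-Li3PS4", "bv_barrier_eV": 0.35},
        {"name": "hypothetical-A", "bv_barrier_eV": 0.95},
    ]
    print("Forward to DFT:", screen(library))
```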
NASA Astrophysics Data System (ADS)
Alexander, Kristen; Hampton, Meredith; Lopez, Rene; Desimone, Joseph
2009-03-01
When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.
NASA Astrophysics Data System (ADS)
Alexander, Kristen; Lopez, Rene; Hampton, Meredith; Desimone, Joseph
2008-10-01
When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.
Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa
2015-01-27
Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.
Automated Transition State Theory Calculations for High-Throughput Kinetics.
Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H
2017-09-21
A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparison against high-level theoretical calculations shows that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will improve by accounting for internal rotor contributions and by improving methods to determine molecular symmetry.
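The quantity such a pipeline ultimately delivers is a canonical transition state theory rate constant, written in its Eyring form as k(T) = (k_B T / h) exp(-ΔG‡ / RT). The sketch below simply evaluates that expression for an assumed free energy of activation; the 80 kJ/mol barrier is illustrative and not taken from the paper.

```python
# Hedged sketch: evaluate an Eyring/canonical-TST rate constant from an
# assumed Gibbs free energy of activation; the barrier value is illustrative.
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def tst_rate(delta_g_activation_kj_mol, temperature_k=298.15):
    """k(T) = (kB*T/h) * exp(-dG_activation / (R*T)), in s^-1 for a unimolecular step."""
    dg = delta_g_activation_kj_mol * 1000.0
    return (KB * temperature_k / H) * math.exp(-dg / (R * temperature_k))

print(f"k(298 K) ~ {tst_rate(80.0):.3e} s^-1")   # assumed 80 kJ/mol barrier
```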
High-throughput automatic defect review for 300mm blank wafers with atomic force microscope
NASA Astrophysics Data System (ADS)
Zandiatashbar, Ardavan; Kim, Byong; Yoo, Young-kook; Lee, Keibock; Jo, Ahjin; Lee, Ju Suk; Cho, Sang-Joon; Park, Sang-il
2015-03-01
While feature sizes in the lithography process continue to shrink, defect sizes on blank wafers are becoming more comparable to device sizes. Defects with nm-scale characteristic sizes can be misclassified by automated optical inspection (AOI) and require post-processing for proper classification. The atomic force microscope (AFM) is known to provide high lateral resolution and, through mechanical probing, the highest vertical resolution of any technique. However, its low throughput and limited tip life, in addition to the laborious effort of locating defects, have been the major limitations of this technique. In this paper we introduce automatic defect review (ADR) AFM as a post-inspection metrology tool for defect study and classification on 300 mm blank wafers, intended to overcome the limitations stated above. The ADR AFM provides a high-throughput, high-resolution, and non-destructive means of obtaining 3D information for nm-scale defect review and classification.
Development of a high-throughput screen to detect inhibitors of TRPS1 sumoylation.
Brandt, Martin; Szewczuk, Lawrence M; Zhang, Hong; Hong, Xuan; McCormick, Patricia M; Lewis, Tia S; Graham, Taylor I; Hung, Sunny T; Harper-Jones, Amber D; Kerrigan, John J; Wang, Da-Yuan; Dul, Edward; Hou, Wangfang; Ho, Thau F; Meek, Thomas D; Cheung, Mui H; Johanson, Kyung O; Jones, Christopher S; Schwartz, Benjamin; Kumar, Sanjay; Oliff, Allen I; Kirkpatrick, Robert B
2013-06-01
Small ubiquitin-like modifier (SUMO) belongs to the family of ubiquitin-like proteins (Ubls) that can be reversibly conjugated to target-specific lysines on substrate proteins. Although covalently sumoylated products are readily detectible in gel-based assays, there has been little progress toward the development of robust quantitative sumoylation assay formats for the evaluation of large compound libraries. In an effort to identify inhibitors of ubiquitin carrier protein 9 (Ubc9)-dependent sumoylation, a high-throughput fluorescence polarization assay was developed, which allows detection of Lys-1201 sumoylation, corresponding to the major site of functional sumoylation within the transcriptional repressor trichorhino-phalangeal syndrome type I protein (TRPS1). A minimal hexapeptide substrate peptide, TMR-VVK₁₂₀₁TEK, was used in this assay format to afford high-throughput screening of the GlaxoSmithKline diversity compound collection. A total of 728 hits were confirmed but no specific noncovalent inhibitors of Ubc9 dependent trans-sumoylation were found. However, several diaminopyrimidine compounds were identified as inhibitors in the assay with IC₅₀ values of 12.5 μM. These were further characterized to be competent substrates which were subject to sumoylation by SUMO-Ubc9 and which were competitive with the sumoylation of the TRPS1 peptide substrates.
Development of a high-throughput assay for rapid screening of butanologenic strains.
Agu, Chidozie Victor; Lai, Stella M; Ujor, Victor; Biswas, Pradip K; Jones, Andy; Gopalan, Venkat; Ezeji, Thaddeus Chukwuemeka
2018-02-21
We report a Thermotoga hypogea (Th) alcohol dehydrogenase (ADH)-dependent spectrophotometric assay for quantifying the amount of butanol in growth media, an advance that will facilitate rapid high-throughput screening of hypo- and hyper-butanol-producing strains of solventogenic Clostridium species. While a colorimetric nitroblue tetrazolium chloride-based assay for quantitating butanol in acetone-butanol-ethanol (ABE) fermentation broth has been described previously, we determined that Saccharomyces cerevisiae (Sc) ADH used in this earlier study exhibits approximately 13-fold lower catalytic efficiency towards butanol than ethanol. Any Sc ADH-dependent assay for primary quantitation of butanol in an ethanol-butanol mixture is therefore subject to "ethanol interference". To circumvent this limitation and better facilitate identification of hyper-butanol-producing Clostridia, we searched the literature for native ADHs that preferentially utilize butanol over ethanol and identified Th ADH as a candidate. Indeed, recombinant Th ADH exhibited a 6-fold higher catalytic efficiency with butanol than ethanol, as measured using the reduction of NADP+ to NADPH that accompanies alcohol oxidation. Moreover, the assay sensitivity was not affected by the presence of acetone, acetic acid or butyric acid (typical ABE fermentation products). We broadened the utility of our assay by adapting it to a high-throughput microtiter plate-based format, and piloted it successfully in an ongoing metabolic engineering initiative.
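For context, an NADPH-coupled readout like this is usually converted to a butanol estimate through the Beer-Lambert law, assuming 1:1 NADPH:butanol stoichiometry. In the sketch below, the 6.22 mM^-1 cm^-1 extinction coefficient is the standard value for NADPH at 340 nm, while the well path length and absorbance readings are assumptions, not values from the paper.

```python
# Hedged sketch: convert A340 readings from an ADH-coupled microtiter assay to
# butanol concentrations via Beer-Lambert, assuming 1:1 NADPH:butanol
# stoichiometry; the 0.6 cm well path length and readings are assumptions.

EXT_NADPH_mM_cm = 6.22   # standard molar absorptivity of NADPH at 340 nm

def butanol_mM(a340_final, a340_blank, path_cm=0.6):
    delta_a = a340_final - a340_blank
    nadph_mM = delta_a / (EXT_NADPH_mM_cm * path_cm)
    return nadph_mM          # 1:1 with butanol oxidized

wells = {"strain_A": (0.95, 0.08), "strain_B": (0.41, 0.08)}
for strain, (final, blank) in wells.items():
    print(f"{strain}: ~{butanol_mM(final, blank):.2f} mM butanol")
```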
A High Throughput Model of Post-Traumatic Osteoarthritis using Engineered Cartilage Tissue Analogs
Mohanraj, Bhavana; Meloni, Gregory R.; Mauck, Robert L.; Dodge, George R.
2014-01-01
(1) Objective A number of in vitro models of post-traumatic osteoarthritis (PTOA) have been developed to study the effect of mechanical overload on the processes that regulate cartilage degeneration. While such frameworks are critical for the identification of therapeutic targets, existing technologies are limited in their throughput capacity. Here, we validate a test platform for high-throughput mechanical injury incorporating engineered cartilage. (2) Method We utilized a high-throughput mechanical testing platform to apply injurious compression to engineered cartilage and determined their strain- and strain rate-dependent responses to injury. Next, we validated this response by applying the same injury conditions to cartilage explants. Finally, we conducted a pilot screen of putative PTOA therapeutic compounds. (3) Results The engineered cartilage response to injury was strain dependent, with a 2-fold increase in GAG loss at 75% compared to 50% strain. Extensive cell death was observed adjacent to fissures, with membrane rupture corroborated by marked increases in LDH release. Testing of established PTOA therapeutics showed that the pan-caspase inhibitor (ZVF) was effective at reducing cell death, while the amphiphilic polymer (P188) and the free-radical scavenger (NAC) reduced GAG loss as compared to injury alone. (4) Conclusions The injury response in this engineered cartilage model replicated key features of the response from cartilage explants, validating this system for application of physiologically relevant injurious compression. This study establishes a novel tool for the discovery of mechanisms governing cartilage injury, as well as a screening platform for the identification of new molecules for the treatment of PTOA. PMID:24999113
NASA Astrophysics Data System (ADS)
Miyata, Masanobu; Ozaki, Taisuke; Takeuchi, Tsunehiro; Nishino, Shunsuke; Inukai, Manabu; Koyano, Mikio
2018-06-01
The electron transport properties of 809 sulfides have been investigated using density functional theory (DFT) calculations in the relaxation time approximation, and a material design rule established for high-performance sulfide thermoelectric (TE) materials. Benchmark electron transport calculations were performed for Cu12Sb4S13 and Cu26V2Ge6S32, revealing that the ratio of the scattering probability of electrons and phonons (κ_lat τ_el^-1) was constant at about 2 × 10^14 W K^-1 m^-1 s^-1. The calculated thermopower S dependence of the theoretical dimensionless figure of merit ZT_DFT of the 809 sulfides showed a maximum at 140 μV K^-1 to 170 μV K^-1. Under the assumption of a constant κ_lat τ_el^-1 of 2 × 10^14 W K^-1 m^-1 s^-1 and constant group velocity v of electrons, a slope of the density of states of 8.6 states eV^-2 to 10 states eV^-2 is suitable for high-ZT sulfide TE materials. The Lorenz number L dependence of ZT_DFT for the 809 sulfides showed a maximum at L of approximately 2.45 × 10^-8 V^2 K^-2. This result demonstrates that the potential of high-ZT sulfide materials is highest when the electron thermal conductivity κ_el of the symmetric band is equal to that of the asymmetric band.
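The figure of merit underlying such screens is ZT = S^2 σ T / (κ_el + κ_lat), with κ_el commonly estimated from the Wiedemann-Franz relation κ_el = L σ T. A minimal sketch with illustrative numbers (not values from the paper) follows.

```python
# Hedged sketch: compute a thermoelectric figure of merit ZT = S^2*sigma*T/kappa,
# estimating the electronic thermal conductivity via Wiedemann-Franz
# (kappa_el = L*sigma*T). All input numbers below are illustrative.

def zt(seebeck_V_per_K, sigma_S_per_m, kappa_lat_W_per_mK, T_K, lorenz=2.44e-8):
    kappa_el = lorenz * sigma_S_per_m * T_K
    return (seebeck_V_per_K ** 2) * sigma_S_per_m * T_K / (kappa_el + kappa_lat_W_per_mK)

# Example: S = 150 uV/K, sigma = 5e4 S/m, kappa_lat = 0.8 W/(m K), T = 600 K
print(f"ZT ~ {zt(150e-6, 5e4, 0.8, 600):.2f}")
```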
Jiang, Guangli; Liu, Leibo; Zhu, Wenping; Yin, Shouyi; Wei, Shaojun
2015-09-04
This paper proposes a real-time feature extraction VLSI architecture for high-resolution images based on the accelerated KAZE algorithm. Firstly, a new system architecture is proposed. It increases the system throughput, provides flexibility in image resolution, and offers trade-offs between speed and scaling robustness. The architecture consists of a two-dimensional pipeline array that fully utilizes computational similarities in octaves. Secondly, a substructure (block-serial discrete-time cellular neural network) that can realize a nonlinear filter is proposed. This structure decreases the memory demand through the removal of data dependency. Thirdly, a hardware-friendly descriptor is introduced in order to overcome the hardware design bottleneck through the polar sample pattern; a simplified method to realize rotation invariance is also presented. Finally, the proposed architecture is designed in TSMC 65 nm CMOS technology. The experimental results show a performance of 127 fps in full HD resolution at 200 MHz frequency. The peak performance reaches 181 GOPS and the throughput is double the speed of other state-of-the-art architectures.
Quantum-Sequencing: Fast electronic single DNA molecule sequencing
NASA Astrophysics Data System (ADS)
Casamada Ribot, Josep; Chatterjee, Anushree; Nagpal, Prashant
2014-03-01
A major goal of third-generation sequencing technologies is to develop a fast, reliable, enzyme-free, high-throughput and cost-effective, single-molecule sequencing method. Here, we present the first demonstration of unique ``electronic fingerprint'' of all nucleotides (A, G, T, C), with single-molecule DNA sequencing, using Quantum-tunneling Sequencing (Q-Seq) at room temperature. We show that the electronic state of the nucleobases shift depending on the pH, with most distinct states identified at acidic pH. We also demonstrate identification of single nucleotide modifications (methylation here). Using these unique electronic fingerprints (or tunneling data), we report a partial sequence of beta lactamase (bla) gene, which encodes resistance to beta-lactam antibiotics, with over 95% success rate. These results highlight the potential of Q-Seq as a robust technique for next-generation sequencing.
Abdiche, Yasmina Noubia; Miles, Adam; Eckman, Josh; Foletti, Davide; Van Blarcom, Thomas J.; Yeung, Yik Andy; Pons, Jaume; Rajpal, Arvind
2014-01-01
Here, we demonstrate how array-based label-free biosensors can be applied to the multiplexed interaction analysis of large panels of analyte/ligand pairs, such as the epitope binning of monoclonal antibodies (mAbs). In this application, the larger the number of mAbs that are analyzed for cross-blocking in a pairwise and combinatorial manner against their specific antigen, the higher the probability of discriminating their epitopes. Since cross-blocking of two mAbs is necessary but not sufficient for them to bind an identical epitope, high-resolution epitope binning analysis determined by high-throughput experiments can enable the identification of mAbs with similar but unique epitopes. We demonstrate that a mAb's epitope and functional activity are correlated, thereby strengthening the relevance of epitope binning data to the discovery of therapeutic mAbs. We evaluated two state-of-the-art label-free biosensors that enable the parallel analysis of 96 unique analyte/ligand interactions and nearly ten thousand total interactions per unattended run. The IBIS-MX96 is a microarray-based surface plasmon resonance imager (SPRi) integrated with continuous flow microspotting technology whereas the Octet-HTX is equipped with disposable fiber optic sensors that use biolayer interferometry (BLI) detection. We compared their throughput, versatility, ease of sample preparation, and sample consumption in the context of epitope binning assays. We conclude that the main advantages of the SPRi technology are its exceptionally low sample consumption, facile sample preparation, and unparalleled unattended throughput. In contrast, the BLI technology is highly flexible because it allows for the simultaneous interaction analysis of 96 independent analyte/ligand pairs, ad hoc sensor replacement and on-line reloading of an analyte- or ligand-array. Thus, the complementary use of these two platforms can expedite applications that are relevant to the discovery of therapeutic mAbs, depending upon the sample availability, and the number and diversity of the interactions being studied. PMID:24651868
RAS - Screens & Assays - Drug Discovery
The RAS Drug Discovery group aims to develop assays that will reveal aspects of RAS biology upon which cancer cells depend. Successful assay formats are made available for high-throughput screening programs to yield potentially effective drug compounds.
High-Throughput Assessment of Cellular Mechanical Properties.
Darling, Eric M; Di Carlo, Dino
2015-01-01
Traditionally, cell analysis has focused on using molecular biomarkers for basic research, cell preparation, and clinical diagnostics; however, new microtechnologies are enabling evaluation of the mechanical properties of cells at throughputs that make them amenable to widespread use. We review the current understanding of how the mechanical characteristics of cells relate to underlying molecular and architectural changes, describe how these changes evolve with cell-state and disease processes, and propose promising biomedical applications that will be facilitated by the increased throughput of mechanical testing: from diagnosing cancer and monitoring immune states to preparing cells for regenerative medicine. We provide background about techniques that laid the groundwork for the quantitative understanding of cell mechanics and discuss current efforts to develop robust techniques for rapid analysis that aim to implement mechanophenotyping as a routine tool in biomedicine. Looking forward, we describe additional milestones that will facilitate broad adoption, as well as new directions not only in mechanically assessing cells but also in perturbing them to passively engineer cell state.
Wu, Jianglai; Tang, Anson H. L.; Mok, Aaron T. Y.; Yan, Wenwei; Chan, Godfrey C. F.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2017-01-01
Apart from spatial resolution enhancement, scaling the temporal resolution, equivalently the imaging throughput, of fluorescence microscopy is of equal importance in advancing cell biology and clinical diagnostics. Yet, this attribute has mostly been overlooked because of the inherent speed limitation of existing imaging strategies. To address the challenge, we employ an all-optical laser-scanning mechanism, enabled by an array of reconfigurable spatiotemporally-encoded virtual sources, to demonstrate ultrafast fluorescence microscopy at a line-scan rate as high as 8 MHz. We show that this technique enables high-throughput single-cell microfluidic fluorescence imaging at 75,000 cells/second and high-speed cellular 2D dynamical imaging at 3,000 frames per second, outperforming state-of-the-art high-speed cameras and the gold-standard laser scanning strategies. Together with its wide compatibility with existing imaging modalities, this technology could empower new forms of high-throughput, high-speed biological fluorescence microscopy that were previously out of reach. PMID:28966855
McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam
2011-01-01
The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core was established in 2002 at the University of Kansas with support from an NIH grant and the state of Kansas. It collaborates with investigators from national and international academic, nonprofit and pharmaceutical organizations in executing HTS-ready assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization. This is part two of a contribution from the KU HTS laboratory. PMID:21806374
Implementation of context independent code on a new array processor: The Super-65
NASA Technical Reports Server (NTRS)
Colbert, R. O.; Bowhill, S. A.
1981-01-01
The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65 but within bounds, the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.
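The core idea of context-independent code, replacing a data-dependent branch with arithmetic that every processing element executes identically, can be illustrated with a hedged modern sketch; this is not the original Super-65 or 6502 code, and the example function is purely illustrative.

```python
# Hedged sketch: a data-dependent branch rewritten as branch-free ("context
# independent") arithmetic, so every processing element can execute the same
# instruction stream in lockstep. Modern illustrative example only.

def clip_with_branch(x, limit):
    if x > limit:           # context-dependent branch: PEs may diverge here
        return limit
    return x

def clip_branch_free(x, limit):
    mask = int(x > limit)   # 0 or 1, computed identically on every PE
    return mask * limit + (1 - mask) * x

assert all(clip_with_branch(v, 10) == clip_branch_free(v, 10) for v in range(-5, 20))
print("branch-free clip agrees with the branching version")
```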
High Throughput, Polymeric Aqueous Two-Phase Printing of Tumor Spheroids
Atefi, Ehsan; Lemmo, Stephanie; Fyffe, Darcy; Luker, Gary D.; Tavana, Hossein
2014-01-01
This paper presents a new 3D culture microtechnology for high throughput production of tumor spheroids and validates its utility for screening anti-cancer drugs. We use two immiscible polymeric aqueous solutions and microprint a submicroliter drop of the “patterning” phase containing cells into a bath of the “immersion” phase. Selecting proper formulations of biphasic systems using a panel of biocompatible polymers results in the formation of a round drop that confines cells to facilitate spontaneous formation of a spheroid without any external stimuli. Adapting this approach to robotic tools enables straightforward generation and maintenance of spheroids of well-defined size in standard microwell plates and biochemical analysis of spheroids in situ, which is not possible with existing techniques for spheroid culture. To enable high throughput screening, we establish a phase diagram to identify minimum cell densities within specific volumes of the patterning drop to result in a single spheroid. Spheroids show normal growth over long-term incubation and dose-dependent decrease in cellular viability when treated with drug compounds, but present significant resistance compared to monolayer cultures. The unprecedented ease of implementing this microtechnology and its robust performance will benefit high throughput studies of drug screening against cancer cells with physiologically-relevant 3D tumor models. PMID:25411577
Hebbard, Carleigh F F; Wang, Yan; Baker, Catherine J; Morrissey, James H
2014-08-11
Inorganic polyphosphates, linear polymers of orthophosphate, occur naturally throughout biology and have many industrial applications. Their biodegradable nature makes them attractive for a multitude of uses, and it would be important to understand how polyphosphates are turned over enzymatically. Studies of inorganic polyphosphatases are, however, hampered by the lack of high-throughput methods for detecting and quantifying rates of polyphosphate degradation. We now report chromogenic and fluorogenic polyphosphate substrates that permit spectrophotometric monitoring of polyphosphate hydrolysis and allow for high-throughput analyses of both endopolyphosphatase and exopolyphosphatase activities, depending on assay configuration. These substrates contain 4-nitrophenol or 4-methylumbelliferone moieties that are covalently attached to the terminal phosphates of polyphosphate via phosphoester linkages formed during reactions mediated by EDAC (1-ethyl-3-(3-(dimethylamino)propyl)carbodiimide). This report identifies Nudt2 as an inorganic polyphosphatase and also adds to the known coupling chemistry for polyphosphates, permitting facile covalent linkage of alcohols with the terminal phosphates of inorganic polyphosphate.
A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy
Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian
2016-01-01
Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS) technology. In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high-sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with a comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high-resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925
Combinatorial and high-throughput screening of materials libraries: review of state of the art.
Potyrailo, Radislav; Rajan, Krishna; Stoewe, Klaus; Takeuchi, Ichiro; Chisholm, Bret; Lam, Hubert
2011-11-14
Rational materials design based on prior knowledge is attractive because it promises to avoid time-consuming synthesis and testing of numerous materials candidates. However with the increase of complexity of materials, the scientific ability for the rational materials design becomes progressively limited. As a result of this complexity, combinatorial and high-throughput (CHT) experimentation in materials science has been recognized as a new scientific approach to generate new knowledge. This review demonstrates the broad applicability of CHT experimentation technologies in discovery and optimization of new materials. We discuss general principles of CHT materials screening, followed by the detailed discussion of high-throughput materials characterization approaches, advances in data analysis/mining, and new materials developments facilitated by CHT experimentation. We critically analyze results of materials development in the areas most impacted by the CHT approaches, such as catalysis, electronic and functional materials, polymer-based industrial coatings, sensing materials, and biomaterials.
Development and Application of a High Throughput Protein Unfolding Kinetic Assay
Wang, Qiang; Waterhouse, Nicklas; Feyijinmi, Olusegun; Dominguez, Matthew J.; Martinez, Lisa M.; Sharp, Zoey; Service, Rachel; Bothe, Jameson R.; Stollar, Elliott J.
2016-01-01
The kinetics of folding and unfolding underlie protein stability and quantification of these rates provides important insights into the folding process. Here, we present a simple high throughput protein unfolding kinetic assay using a plate reader that is applicable to the studies of the majority of 2-state folding proteins. We validate the assay by measuring kinetic unfolding data for the SH3 (Src Homology 3) domain from Actin Binding Protein 1 (AbpSH3) and its stabilized mutants. The results of our approach are in excellent agreement with published values. We further combine our kinetic assay with a plate reader equilibrium assay, to obtain indirect estimates of folding rates and use these approaches to characterize an AbpSH3-peptide hybrid. Our high throughput protein unfolding kinetic assays allow accurate screening of libraries of mutants by providing both kinetic and equilibrium measurements and provide a means for in-depth ϕ-value analyses. PMID:26745729
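For a 2-state folder, plate-reader traces like those described above are typically reduced to an unfolding rate constant by a single-exponential fit. The sketch below fits synthetic data with an assumed model; it is not the published analysis.

```python
# Hedged sketch: extract an unfolding rate constant k_u by fitting a
# single-exponential decay to a fluorescence trace; the data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def single_exp(t, amplitude, k_u, baseline):
    return amplitude * np.exp(-k_u * t) + baseline

t = np.linspace(0, 200, 50)                           # seconds
signal = single_exp(t, 1.0, 0.03, 0.1)                # "true" trace
signal += np.random.default_rng(0).normal(0, 0.02, t.size)  # measurement noise

popt, _ = curve_fit(single_exp, t, signal, p0=[1.0, 0.01, 0.0])
print(f"fitted k_u ~ {popt[1]:.3f} s^-1 (half-life ~ {np.log(2)/popt[1]:.0f} s)")
```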
NASA Technical Reports Server (NTRS)
Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.
2013-01-01
In this paper we discuss the high throughput end-to-end post fabrication processing of high performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far ultraviolet and ultraviolet quantum efficiency (QE) in a photon counting, detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs) that are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering of Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).
High-throughput bioinformatics with the Cyrille2 pipeline system
Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ
2008-01-01
Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web based, graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution, and; 3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines. PMID:18269742
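The Scheduler/Executor split described above can be pictured with a toy dependency-aware loop: jobs whose inputs are complete are scheduled, then executed (locally here rather than on a compute cluster). The job names and structure below are illustrative placeholders, not Cyrille2 code.

```python
# Hedged sketch of a scheduler/executor split: a toy dependency-aware scheduler
# emits jobs whose prerequisites are done, and an "executor" runs them locally.
# Job names and the run functions are illustrative placeholders.

jobs = {
    "trim_reads": {"deps": [], "run": lambda: "reads.trimmed"},
    "align":      {"deps": ["trim_reads"], "run": lambda: "aligned.bam"},
    "call_snps":  {"deps": ["align"], "run": lambda: "variants.vcf"},
}

def schedule_and_execute(jobs):
    done = set()
    while len(done) < len(jobs):
        ready = [n for n, j in jobs.items()
                 if n not in done and all(d in done for d in j["deps"])]
        for name in ready:            # executor step (sequential here)
            print(f"running {name} -> {jobs[name]['run']()}")
            done.add(name)

schedule_and_execute(jobs)
```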
Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter
2015-01-01
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
High-throughput state-machine replication using software transactional memory.
Zhao, Wenbing; Yang, William; Zhang, Honglei; Yang, Jack; Luo, Xiong; Zhu, Yueqin; Yang, Mary; Luo, Chaomin
2016-11-01
State-machine replication is a common way of constructing general purpose fault tolerance systems. To ensure replica consistency, requests must be executed sequentially according to some total order at all non-faulty replicas. Unfortunately, this could severely limit the system throughput. This issue has been partially addressed by identifying non-conflicting requests based on application semantics and executing these requests concurrently. However, identifying and tracking non-conflicting requests require intimate knowledge of application design and implementation, and a custom fault tolerance solution developed for one application cannot be easily adopted by other applications. Software transactional memory offers a new way of constructing concurrent programs. In this article, we present the mechanisms needed to retrofit existing concurrency control algorithms designed for software transactional memory for state-machine replication. The main benefit for using software transactional memory in state-machine replication is that general purpose concurrency control mechanisms can be designed without deep knowledge of application semantics. As such, new fault tolerance systems based on state-machine replications with excellent throughput can be easily designed and maintained. In this article, we introduce three different concurrency control mechanisms for state-machine replication using software transactional memory, namely, ordered strong strict two-phase locking, conventional timestamp-based multiversion concurrency control, and speculative timestamp-based multiversion concurrency control. Our experiments show that speculative timestamp-based multiversion concurrency control mechanism has the best performance in all types of workload, the conventional timestamp-based multiversion concurrency control offers the worst performance due to high abort rate in the presence of even moderate contention between transactions. The ordered strong strict two-phase locking mechanism offers the simplest solution with excellent performance in low contention workload, and fairly good performance in high contention workload.
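One way to picture the speculative, version-based approach described above is a toy validate-then-commit loop: requests are executed speculatively, then validated and applied in the agreed total order, retrying on conflict. This single-process sketch is illustrative only and is not the authors' implementation; in a real replica, concurrent workers could invalidate a read set between speculation and validation.

```python
# Hedged sketch: speculative execution with commit in total order, a toy
# version of the validate-then-commit idea behind version/timestamp schemes
# for state-machine replication. Single-process simulation; illustrative only.

store = {"x": 0, "y": 0}        # replicated state
versions = {"x": 0, "y": 0}     # per-object version counters

def speculate(txn):
    """Run txn against current state, recording read versions and writes."""
    read_set = {k: versions[k] for k in txn["reads"]}
    writes = txn["apply"]({k: store[k] for k in txn["reads"]})
    return read_set, writes

def commit_in_order(txns):
    """Validate and commit transactions in the agreed total order."""
    for txn in txns:
        while True:
            read_set, writes = speculate(txn)
            if all(versions[k] == v for k, v in read_set.items()):
                for k, v in writes.items():   # validation passed: apply
                    store[k] = v
                    versions[k] += 1
                break                          # otherwise: abort and retry

txns = [
    {"reads": ["x"], "apply": lambda r: {"x": r["x"] + 1}},
    {"reads": ["x", "y"], "apply": lambda r: {"y": r["x"] * 2}},
]
commit_in_order(txns)
print(store)   # {'x': 1, 'y': 2}
```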
Mahan, Alison E; Tedesco, Jacquelynne; Dionne, Kendall; Baruah, Kavitha; Cheng, Hao D; De Jager, Philip L; Barouch, Dan H; Suscovich, Todd; Ackerman, Margaret; Crispin, Max; Alter, Galit
2015-02-01
The N-glycan of the IgG constant region (Fc) plays a central role in tuning and directing multiple antibody functions in vivo, including antibody-dependent cellular cytotoxicity, complement deposition, and the regulation of inflammation, among others. However, traditional methods of N-glycan analysis, including HPLC and mass spectrometry, are technically challenging and ill suited to handle the large numbers of low concentration samples analyzed in clinical or animal studies of the N-glycosylation of polyclonal IgG. Here we describe a capillary electrophoresis-based technique to analyze plasma-derived polyclonal IgG-glycosylation quickly and accurately in a cost-effective, sensitive manner that is well suited for high-throughput analyses. Additionally, because a significant fraction of polyclonal IgG is glycosylated on both Fc and Fab domains, we developed an approach to separate and analyze domain-specific glycosylation in polyclonal human, rhesus and mouse IgGs. Overall, this protocol allows for the rapid, accurate, and sensitive analysis of Fc-specific IgG glycosylation, which is critical for population-level studies of how antibody glycosylation may vary in response to vaccination or infection, and across disease states ranging from autoimmunity to cancer in both clinical and animal studies. Copyright © 2014 Elsevier B.V. All rights reserved.
Probabilistic Assessment of High-Throughput Wireless Sensor Networks
Kim, Robin E.; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F.; Song, Junho
2016-01-01
Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieving a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess the network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a larger number of packets are delivered within the communication, such an approach is not sufficient. Instead, in this study, a model that can assess, probabilistically, the long-term performance of the network is proposed. The proposed model is based on readily available measured data sets that represent communication quality during high-throughput data transfer. Then, an empirical limit-state function is determined, which is further used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small and a full-bridge wireless network. By performing the proposed analysis in complex sensor networks, an optimized sensor topology can be achieved. PMID:27258270
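The abstract does not give the form of the empirical limit-state function; the sketch below shows only the generic Monte Carlo step it describes, with a made-up limit-state model standing in for the one fitted from measured communication data.

```python
import numpy as np

rng = np.random.default_rng(0)


def limit_state(packet_success_rate, required_rate=0.95):
    """Illustrative limit-state function g(x): positive means the link meets
    the required delivery rate, negative means communication failure.
    The real function in the paper is fit from measured data; the threshold
    here is a placeholder."""
    return packet_success_rate - required_rate


def sample_success_rate(n, mean_rssi=-80.0, rssi_sd=4.0):
    """Hypothetical empirical model: per-link packet success rate as a noisy
    logistic function of measured signal quality (placeholder coefficients)."""
    rssi = rng.normal(mean_rssi, rssi_sd, size=n)
    logistic = 1.0 / (1.0 + np.exp(-(rssi + 87.0) / 2.0))
    return np.clip(logistic + rng.normal(0.0, 0.02, size=n), 0.0, 1.0)


# Crude Monte Carlo estimate of the probability of communication failure.
n_samples = 100_000
g = limit_state(sample_success_rate(n_samples))
p_fail = np.mean(g < 0.0)
half_width = 1.96 * np.sqrt(p_fail * (1 - p_fail) / n_samples)
print(f"Estimated P(failure) = {p_fail:.4f} (+/- {half_width:.4f})")
```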
Iuso, Arcangela; Repp, Birgit; Biagosch, Caroline; Terrile, Caterina; Prokisch, Holger
2017-01-01
Working with isolated mitochondria is the gold standard approach to investigate the function of the electron transport chain in tissues, free from the influence of other cellular factors. In this chapter, we outline a detailed protocol to measure the rate of oxygen consumption (OCR) with the high-throughput analyzer Seahorse XF96. More importantly, this protocol aims to provide practical tips for handling many different samples at once and for taking real advantage of a high-throughput system. As a proof of concept, we have isolated mitochondria from brain, heart, liver, muscle, kidney, and lung of a wild-type mouse, and measured basal respiration (State II), ADP-stimulated respiration (State III), non-ADP-stimulated respiration (State IVo), and FCCP-stimulated respiration (State IIIu) using respiratory substrates specific to the respiratory chain complex I (RCCI) and complex II (RCCII).
High Throughput Transcriptomics @ USEPA (Toxicology Forum)
The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest...
A high-throughput, multi-channel photon-counting detector with picosecond timing
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.
2009-06-01
High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pienkos, Philip T.
2013-11-01
This project is part of the overall effort by and among NREL, Colorado State University, University of Colorado, and Colorado School of Mines known as the Colorado Center for Biorefining and Biofuels. This is part of a larger statewide effort provided for in House Bill 06-1322, establishing a Colorado Collaboratory that envisions these four institutions working together as part of the state's energy plan. This individual project with Colorado School of Mines is the first of many envisioned in this overall effort. The project focuses on development of high throughput procedures aimed at rapidly isolating and purifying novel microalgal strains (specifically green algae and diatoms) from water samples obtained from unique aquatic environments.
Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions
NASA Technical Reports Server (NTRS)
Simons, Rainee N.
2015-01-01
NASA's plan to launch several spacecraft into low Earth Orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high data rate link between the LEO spacecraft and ground via relay satellites in geostationary orbit (GEO); second, a high data rate direct-to-ground link from LEO. Next, the paper presents results from computer simulations carried out for both types of links, taking into consideration spacecraft transmitter frequency, EIRP, and waveform; elevation-angle-dependent path loss through Earth's atmosphere; and ground station receiver G/T.
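The simulations described above combine EIRP, path loss, and ground-station G/T; as a point of reference, a textbook link-budget relation (not taken from the paper, and with placeholder numbers) can be sketched as follows.

```python
import math

BOLTZMANN_DBW = -228.6  # 10*log10(Boltzmann's constant), dBW/(K*Hz)


def free_space_path_loss_db(distance_km, freq_ghz):
    """FSPL = 20*log10(4*pi*d*f/c), written for d in km and f in GHz."""
    return 92.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz)


def link_budget(eirp_dbw, gt_dbk, distance_km, freq_ghz, atm_loss_db, rate_bps):
    """Return (C/N0 in dB-Hz, Eb/N0 in dB) for an illustrative RF link.
    All parameter values used below are placeholders, not figures from the paper."""
    fspl = free_space_path_loss_db(distance_km, freq_ghz)
    cn0 = eirp_dbw + gt_dbk - fspl - atm_loss_db - BOLTZMANN_DBW
    ebn0 = cn0 - 10 * math.log10(rate_bps)
    return cn0, ebn0


# Example: hypothetical Ka-band LEO direct-to-ground pass near zenith.
cn0, ebn0 = link_budget(eirp_dbw=45.0, gt_dbk=30.0, distance_km=1200.0,
                        freq_ghz=26.0, atm_loss_db=3.0, rate_bps=2e9)
print(f"C/N0 = {cn0:.1f} dB-Hz, Eb/N0 = {ebn0:.1f} dB at 2 Gb/s")
```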
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combs, S.K.; Foust, C.R.; Qualls, A.L.
Pellet injection systems for the next-generation fusion devices, such as the proposed International Thermonuclear Experimental Reactor (ITER), will require feed systems capable of providing a continuous supply of hydrogen ice at high throughputs. A straightforward concept in which multiple extruder units operate in tandem has been under development at the Oak Ridge National Laboratory. A prototype with three large-volume extruder units has been fabricated and tested in the laboratory. In experiments, it was found that each extruder could provide volumetric ice flow rates of up to ~1.3 cm³/s (for ~10 s), which is sufficient for fueling fusion reactors at the gigawatt power level. With the three extruders of the prototype operating in sequence, a steady rate of ~0.33 cm³/s was maintained for a duration of 1 h. Even steady-state rates approaching the full ITER design value (~1 cm³/s) may be feasible with the prototype. However, additional extruder units (1–3) would facilitate operations at the higher throughputs and reduce the duty cycle of each unit. The prototype can easily accommodate steady-state pellet fueling of present large tokamaks or other near-term plasma experiments.
Urasaki, Yasuyo; Fiscus, Ronald R; Le, Thuc T
2016-04-01
We describe an alternative approach to classifying fatty liver by profiling protein post-translational modifications (PTMs) with high-throughput capillary isoelectric focusing (cIEF) immunoassays. Four strains of mice were studied, with fatty livers induced by different causes, such as ageing, genetic mutation, acute drug usage, and high-fat diet. Nutrient-sensitive PTMs of a panel of 12 liver metabolic and signalling proteins were simultaneously evaluated with cIEF immunoassays, using nanograms of total cellular protein per assay. Changes to liver protein acetylation, phosphorylation, and O-N-acetylglucosamine glycosylation were quantified and compared between normal and diseased states. Fatty liver tissues could be distinguished from one another by distinctive protein PTM profiles. Fatty liver is currently classified by morphological assessment of lipid droplets, without identifying the underlying molecular causes. In contrast, high-throughput profiling of protein PTMs has the potential to provide molecular classification of fatty liver. Copyright © 2016 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.
Multiplexed mass cytometry profiling of cellular states perturbed by small-molecule regulators
Bodenmiller, Bernd; Zunder, Eli R.; Finck, Rachel; Chen, Tiffany J.; Savig, Erica S.; Bruggner, Robert V.; Simonds, Erin F.; Bendall, Sean C.; Sachs, Karen; Krutzik, Peter O.; Nolan, Garry P.
2013-01-01
The ability to comprehensively explore the impact of bio-active molecules on human samples at the single-cell level can provide great insight for biomedical research. Mass cytometry enables quantitative single-cell analysis with deep dimensionality, but currently lacks high-throughput capability. Here we report a method termed mass-tag cellular barcoding (MCB) that increases mass cytometry throughput by sample multiplexing. 96-well format MCB was used to characterize human peripheral blood mononuclear cell (PBMC) signaling dynamics, cell-to-cell communication, the signaling variability between 8 donors, and to define the impact of 27 inhibitors on this system. For each compound, 14 phosphorylation sites were measured in 14 PBMC types, resulting in 18,816 quantified phosphorylation levels from each multiplexed sample. This high-dimensional systems-level inquiry allowed analysis across cell-type and signaling space, reclassified inhibitors, and revealed off-target effects. MCB enables high-content, high-throughput screening, with potential applications for drug discovery, pre-clinical testing, and mechanistic investigation of human disease. PMID:22902532
Effective mass and Fermi surface complexity factor from ab initio band structure calculations
NASA Astrophysics Data System (ADS)
Gibbs, Zachary M.; Ricci, Francesco; Li, Guodong; Zhu, Hong; Persson, Kristin; Ceder, Gerbrand; Hautier, Geoffroy; Jain, Anubhav; Snyder, G. Jeffrey
2017-02-01
The effective mass is a convenient descriptor of the electronic band structure used to characterize the density of states and electron transport based on a free electron model. While effective mass is an excellent first-order descriptor in real systems, the exact value can have several definitions, each of which describe a different aspect of electron transport. Here we use Boltzmann transport calculations applied to ab initio band structures to extract a density-of-states effective mass from the Seebeck Coefficient and an inertial mass from the electrical conductivity to characterize the band structure irrespective of the exact scattering mechanism. We identify a Fermi Surface Complexity Factor:
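The abstract is cut off immediately before the definition. For orientation only, and treated here as an assumption rather than a quotation, the Fermi surface complexity factor in this line of work is usually written as an effective valley degeneracy times an effective anisotropy factor, obtained from the two masses described above:

```latex
% Assumed form (not quoted from the truncated abstract): effective valley
% degeneracy N_v^* times effective anisotropy K^*, computable from the
% Seebeck (density-of-states) mass m_S^* and the conductivity (inertial)
% mass m_c^*.
\[
  N_v^{*} K^{*} \;=\; \left( \frac{m_S^{*}}{m_c^{*}} \right)^{3/2}
\]
```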
Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.
Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani
2007-03-01
Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that if understood could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer if an interesting collection of genes is 'enriched' for a particular set of a priori Gene Ontology (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine if conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
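For context, the "conventional approach" referred to above typically tests each GO class with an independent hypergeometric (Fisher-type) test; a minimal sketch of that baseline, not of the authors' dependent-null derivation, is:

```python
from scipy.stats import hypergeom


def go_enrichment_pvalue(n_genome, n_in_class, n_selected, n_overlap):
    """One-sided hypergeometric p-value that a selected gene list contains
    at least `n_overlap` members of a GO class, assuming each class is
    tested independently (the assumption the paper re-examines)."""
    # P(X >= n_overlap), X ~ Hypergeom(N=n_genome, K=n_in_class, n=n_selected)
    return hypergeom.sf(n_overlap - 1, n_genome, n_in_class, n_selected)


# Toy numbers: 20,000 annotated genes, a GO class of 150 genes, and a hit
# list of 300 genes containing 12 members of the class.
print(go_enrichment_pvalue(20_000, 150, 300, 12))
```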
High-throughput cell-based screening reveals a role for ZNF131 as a repressor of ERalpha signaling
Han, Xiao; Guo, Jinhai; Deng, Weiwei; Zhang, Chenying; Du, Peige; Shi, Taiping; Ma, Dalong
2008-01-01
Background: Estrogen receptor α (ERα) is a transcription factor whose activity is affected by multiple regulatory cofactors. In an effort to identify the human genes involved in the regulation of ERα, we constructed a high-throughput, cell-based, functional screening platform by linking a response element (ERE) with a reporter gene. This allowed the cellular activity of ERα, in cells cotransfected with the candidate gene, to be quantified in the presence or absence of its cognate ligand E2. Results: From a library of 570 human cDNA clones, we identified zinc finger protein 131 (ZNF131) as a repressor of ERα-mediated transactivation. ZNF131 is a typical member of the BTB/POZ family of transcription factors, and shows both ubiquitous expression and a high degree of sequence conservation. The luciferase reporter gene assay revealed that ZNF131 inhibits ligand-dependent transactivation by ERα in a dose-dependent manner. Electrophoretic mobility shift assay clearly demonstrated that the interaction between ZNF131 and ERα interrupts or prevents ERα binding to the estrogen response element (ERE). In addition, ZNF131 was able to suppress the expression of pS2, an ERα target gene. Conclusion: We suggest that the functional screening platform we constructed can be applied for high-throughput genomic screening of candidate ERα-related genes. This in turn may provide new insights into the underlying molecular mechanisms of ERα regulation in mammalian cells. PMID:18847501
Cotney, Justin L; Noonan, James P
2015-02-02
Chromatin immunoprecipitation coupled with high-throughput sequencing (ChIP-Seq) is a powerful method used to identify genome-wide binding patterns of transcription factors and distribution of various histone modifications associated with different chromatin states. In most published studies, ChIP-Seq has been performed on cultured cells grown under controlled conditions, allowing generation of large amounts of material in a homogeneous biological state. Although such studies have provided great insight into the dynamic landscapes of animal genomes, they do not allow the examination of transcription factor binding and chromatin states in adult tissues, developing embryonic structures, or tumors. Such knowledge is critical to understanding the information required to create and maintain a complex biological tissue and to identify noncoding regions of the genome directly involved in tissues affected by complex diseases such as autism. Studying these tissue types with ChIP-Seq can be challenging due to the limited availability of tissues and the lack of complex biological states able to be achieved in culture. These inherent differences require alterations of standard cross-linking and chromatin extraction typically used in cell culture. Here we describe a general approach for using small amounts of animal tissue to perform ChIP-Seq directed at histone modifications and transcription factors. Tissue is homogenized before treatment with formaldehyde to ensure proper cross-linking, and a two-step nuclear isolation is performed to increase extraction of soluble chromatin. Small amounts of soluble chromatin are then used for immunoprecipitation (IP) and prepared for multiplexed high-throughput sequencing. © 2015 Cold Spring Harbor Laboratory Press.
Rogers, George W.; Brand, Martin D.; Petrosyan, Susanna; Ashok, Deepthi; Elorza, Alvaro A.; Ferrick, David A.; Murphy, Anne N.
2011-01-01
Recently developed technologies have enabled multi-well measurement of O2 consumption, facilitating the rate of mitochondrial research, particularly regarding the mechanism of action of drugs and proteins that modulate metabolism. Among these technologies, the Seahorse XF24 Analyzer was designed for use with intact cells attached in a monolayer to a multi-well tissue culture plate. In order to have a high throughput assay system in which both energy demand and substrate availability can be tightly controlled, we have developed a protocol to expand the application of the XF24 Analyzer to include isolated mitochondria. Acquisition of optimal rates requires assay conditions that are unexpectedly distinct from those of conventional polarography. The optimized conditions, derived from experiments with isolated mouse liver mitochondria, allow multi-well assessment of rates of respiration and proton production by mitochondria attached to the bottom of the XF assay plate, and require extremely small quantities of material (1–10 µg of mitochondrial protein per well). Sequential measurement of basal, State 3, State 4, and uncoupler-stimulated respiration can be made in each well through additions of reagents from the injection ports. We describe optimization and validation of this technique using isolated mouse liver and rat heart mitochondria, and apply the approach to discover that inclusion of phosphatase inhibitors in the preparation of the heart mitochondria results in a specific decrease in rates of Complex I-dependent respiration. We believe this new technique will be particularly useful for drug screening and for generating previously unobtainable respiratory data on small mitochondrial samples. PMID:21799747
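Given the State 3, State 4, and uncoupler-stimulated rates measured per well, derived coupling metrics are commonly computed afterwards; the sketch below shows one hedged way to do that from exported OCR values. The segment names and the averaging scheme are illustrative, not part of the instrument's software.

```python
def respiration_summary(ocr_by_segment):
    """Summarize one well's OCR trace (pmol O2/min), split by assay segment.

    `ocr_by_segment` maps segment names to lists of measurement points, e.g.
    {'basal': [...], 'state3': [...], 'state4o': [...], 'state3u': [...]}.
    """
    mean = {k: sum(v) / len(v) for k, v in ocr_by_segment.items()}
    return {
        "RCR (state3/state4o)": mean["state3"] / mean["state4o"],
        "coupled respiration (state3 - state4o)": mean["state3"] - mean["state4o"],
        "spare capacity (state3u - basal)": mean["state3u"] - mean["basal"],
    }


# Toy readings for a single well (values are made up).
well = {"basal": [95, 102, 98], "state3": [410, 395, 402],
        "state4o": [88, 84, 86], "state3u": [455, 470, 462]}
print(respiration_summary(well))
```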
Physics-informed machine learning for inorganic scintillator discovery
NASA Astrophysics Data System (ADS)
Pilania, G.; McClellan, K. J.; Stanek, C. R.; Uberuaga, B. P.
2018-06-01
Applications of inorganic scintillators—activated with lanthanide dopants, such as Ce and Eu—are found in diverse fields. As a strict requirement to exhibit scintillation, the 4f ground state (with the electronic configuration of [Xe]4f^n 5d^0) and 5d^1 lowest excited state (with the electronic configuration of [Xe]4f^(n-1) 5d^1) levels induced by the activator must lie within the host bandgap. Here we introduce a new machine learning (ML) based search strategy for high-throughput chemical space explorations to discover and design novel inorganic scintillators. Building upon well-known physics-based chemical trends for the host dependent electron binding energies within the 4f and 5d^1 energy levels of lanthanide ions and available experimental data, the developed ML model—coupled with knowledge of the vacuum referred valence and conduction band edges computed from first principles—can rapidly and reliably estimate the relative positions of the activator's energy levels relative to the valence and conduction band edges of any given host chemistry. Using perovskite oxides and elpasolite halides as examples, the presented approach has been demonstrated to be able to (i) capture systematic chemical trends across host chemistries and (ii) effectively screen promising compounds in a high-throughput manner. While a number of other application-specific performance requirements need to be considered for a viable scintillator, the scheme developed here can be a practically useful tool to systematically down-select the most promising candidate materials in a first line of screening for a subsequent in-depth investigation.
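The screening criterion described above reduces, for each host, to checking whether the predicted 4f and 5d^1 levels fall inside the bandgap; a sketch of that down-selection step with hypothetical candidates and field names is:

```python
from dataclasses import dataclass


@dataclass
class HostCandidate:
    name: str
    vbm: float    # valence band maximum, eV on a vacuum-referred scale
    cbm: float    # conduction band minimum, eV
    e_4f: float   # ML-predicted activator 4f ground level, eV
    e_5d1: float  # ML-predicted activator 5d^1 lowest excited level, eV


def passes_level_criterion(c: HostCandidate, margin: float = 0.1) -> bool:
    """Keep hosts in which both activator levels sit inside the bandgap,
    with a small margin to absorb model uncertainty (the margin is a choice
    made here, not a value from the paper)."""
    inside = lambda e: c.vbm + margin < e < c.cbm - margin
    return inside(c.e_4f) and inside(c.e_5d1)


candidates = [
    HostCandidate("hypothetical elpasolite A", -8.2, -1.9, -6.8, -2.6),
    HostCandidate("hypothetical perovskite B", -7.5, -3.4, -7.9, -3.0),
]
shortlist = [c.name for c in candidates if passes_level_criterion(c)]
print(shortlist)   # only candidates with both levels inside the gap survive
```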
NASA Astrophysics Data System (ADS)
Maher, Robert; Alvarado, Alex; Lavery, Domaniç; Bayvel, Polina
2016-02-01
Optical fibre underpins the global communications infrastructure and has experienced an astonishing evolution over the past four decades, with current commercial systems transmitting data rates in excess of 10 Tb/s over a single fibre core. The continuation of this dramatic growth in throughput has become constrained due to a power dependent nonlinear distortion arising from a phenomenon known as the Kerr effect. The mitigation of fibre nonlinearities is an area of intense research. However, even in the absence of nonlinear distortion, the practical limit on the transmission throughput of a single fibre core is dominated by the finite signal-to-noise ratio (SNR) afforded by current state-of-the-art coherent optical transceivers. Therefore, the key to maximising the number of information bits that can be reliably transmitted over a fibre channel hinges on the simultaneous optimisation of the modulation format and code rate, based on the SNR achieved at the receiver. In this work, we use an information theoretic approach based on the mutual information and the generalised mutual information to characterise a state-of-the-art dual polarisation m-ary quadrature amplitude modulation transceiver and subsequently apply this methodology to a 15-carrier super-channel to achieve the highest throughput (1.125 Tb/s) ever recorded using a single coherent receiver.
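The characterization above rests on estimating the mutual information achievable at the received SNR; the sketch below shows a Monte Carlo estimate for square m-QAM over a complex AWGN channel, which is a simplification of the optical channel and uses illustrative parameters rather than the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)


def qam_constellation(m):
    """Unit-energy square m-QAM constellation (m must be a perfect square)."""
    k = int(np.sqrt(m))
    levels = np.arange(-(k - 1), k, 2, dtype=float)
    points = np.array([complex(i, q) for i in levels for q in levels])
    return points / np.sqrt(np.mean(np.abs(points) ** 2))


def mi_awgn_qam(m, snr_db, n=100_000):
    """Monte Carlo estimate of I(X;Y) in bits/symbol for uniform m-QAM over
    a complex AWGN channel at the given per-symbol SNR."""
    x_alphabet = qam_constellation(m)
    sigma2 = 10 ** (-snr_db / 10)                       # noise variance (Es = 1)
    x = rng.choice(x_alphabet, size=n)
    noise = np.sqrt(sigma2 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    y = x + noise
    # Likelihoods p(y|x') for every constellation point, up to a common factor
    # that cancels in the ratio below.
    d2 = np.abs(y[:, None] - x_alphabet[None, :]) ** 2
    lik = np.exp(-d2 / sigma2)
    p_y_given_sent = np.exp(-np.abs(y - x) ** 2 / sigma2)
    info = np.log2(m * p_y_given_sent / lik.sum(axis=1))
    return info.mean()


print(f"64-QAM at 18 dB SNR: ~{mi_awgn_qam(64, 18.0):.2f} bits/symbol")
```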
Maher, Robert; Alvarado, Alex; Lavery, Domaniç; Bayvel, Polina
2016-01-01
Optical fibre underpins the global communications infrastructure and has experienced an astonishing evolution over the past four decades, with current commercial systems transmitting data rates in excess of 10 Tb/s over a single fibre core. The continuation of this dramatic growth in throughput has become constrained due to a power dependent nonlinear distortion arising from a phenomenon known as the Kerr effect. The mitigation of fibre nonlinearities is an area of intense research. However, even in the absence of nonlinear distortion, the practical limit on the transmission throughput of a single fibre core is dominated by the finite signal-to-noise ratio (SNR) afforded by current state-of-the-art coherent optical transceivers. Therefore, the key to maximising the number of information bits that can be reliably transmitted over a fibre channel hinges on the simultaneous optimisation of the modulation format and code rate, based on the SNR achieved at the receiver. In this work, we use an information theoretic approach based on the mutual information and the generalised mutual information to characterise a state-of-the-art dual polarisation m-ary quadrature amplitude modulation transceiver and subsequently apply this methodology to a 15-carrier super-channel to achieve the highest throughput (1.125 Tb/s) ever recorded using a single coherent receiver. PMID:26864633
Throughput increase by adjustment of the BARC drying time with coat track process
NASA Astrophysics Data System (ADS)
Brakensiek, Nickolas L.; Long, Ryan
2005-05-01
Throughput of a coater module within the coater track is related to the solvent evaporation rate from the material that is being coated. Evaporation rate is controlled by the spin dynamics of the wafer and airflow dynamics over the wafer. Balancing these effects is the key to achieving very uniform coatings across a flat unpatterned wafer. As today's coat tracks are being pushed to higher throughputs to match the scanner, the coat module throughput must be increased as well. For chemical manufacturers, the evaporation rate of the material depends on the solvent used. One measure of relative evaporation rates is to compare flash points of a solvent. The lower the flash point, the quicker the solvent will evaporate. It is possible to formulate products with these volatile solvents, although at a price. Shipping and manufacturing a more flammable product increases the chances of fire, thereby increasing insurance premiums. Also, the end user of these chemicals will have to take extra precautions in the fab and in storage of these more flammable chemicals. An alternative coat process is possible which would allow higher throughput in a distinct coat module without sacrificing safety. A tradeoff is required for this process, that being a more complicated coat process and a higher viscosity chemical. The coat process uses the fact that evaporation rate depends on the spin dynamics of the wafer by utilizing a series of spin speeds that first would set the thickness of the material followed by a high spin speed to remove the residual solvent. This new process can yield a throughput of over 150 wafers per hour (wph) given two coat modules. The thickness uniformity of less than 2 nm (3 sigma) is still excellent, while drying times are shorter than 10 seconds to achieve the 150 wph throughput targets.
Product Deformulation to Inform High-throughput Exposure Predictions (SOT)
The health risks posed by the thousands of chemicals in our environment depend on both chemical hazard and exposure. However, relatively few chemicals have estimates of exposure intake, limiting the understanding of risks. We have previously developed a heuristics-based exposur...
GPU Lossless Hyperspectral Data Compression System
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh I.; Keymeulen, Didier; Kiely, Aaron B.; Klimesh, Matthew A.
2014-01-01
Hyperspectral imaging systems onboard aircraft or spacecraft can acquire large amounts of data, putting a strain on limited downlink and storage resources. Onboard data compression can mitigate this problem but may require a system capable of a high throughput. In order to achieve a high throughput with a software compressor, a graphics processing unit (GPU) implementation of a compressor was developed targeting the current state-of-the-art GPUs from NVIDIA(R). The implementation is based on the fast lossless (FL) compression algorithm reported in "Fast Lossless Compression of Multispectral-Image Data" (NPO- 42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which operates on hyperspectral data and achieves excellent compression performance while having low complexity. The FL compressor uses an adaptive filtering method and achieves state-of-the-art performance in both compression effectiveness and low complexity. The new Consultative Committee for Space Data Systems (CCSDS) Standard for Lossless Multispectral & Hyperspectral image compression (CCSDS 123) is based on the FL compressor. The software makes use of the highly-parallel processing capability of GPUs to achieve a throughput at least six times higher than that of a software implementation running on a single-core CPU. This implementation provides a practical real-time solution for compression of data from airborne hyperspectral instruments.
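The FL/CCSDS 123 compressor predicts each sample adaptively and entropy-codes the residuals; the sketch below is a deliberately simplified, non-adaptive spectral predictor meant only to illustrate the residual-coding idea, not the FL algorithm itself.

```python
import numpy as np


def spectral_prediction_residuals(cube):
    """Predict each band from the previous band (a simplified stand-in for the
    adaptive filtering used by the FL / CCSDS 123 compressor) and return the
    residuals that an entropy coder would then compress.

    `cube` has shape (bands, rows, cols) with integer samples."""
    cube = cube.astype(np.int64)
    residuals = np.empty_like(cube)
    residuals[0] = cube[0]                 # first band stored as-is
    residuals[1:] = cube[1:] - cube[:-1]   # inter-band differences
    return residuals


def reconstruct(residuals):
    """Exact inverse of the predictor above: lossless by construction."""
    return np.cumsum(residuals, axis=0)


rng = np.random.default_rng(7)
toy = rng.integers(0, 4096, size=(8, 64, 64))   # toy 12-bit hyperspectral cube
res = spectral_prediction_residuals(toy)
assert np.array_equal(reconstruct(res), toy)    # round-trips losslessly
```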
Zottig, Ximena; Meddeb-Mouelhi, Fatma; Beauregard, Marc
2016-03-01
A fluorescence-based assay for the determination of lipase activity using rhodamine B as an indicator, and natural substrates such as olive oil, is described. It is based on the use of a rhodamine B-natural substrate emulsion in liquid state, which is advantageous over agar plate assays. This high-throughput method is simple and rapid and can be automated, making it suitable for screening and metagenomics application. Reaction conditions such as pH and temperature can be varied and controlled. Using triolein or olive oil as a natural substrate allows monitoring of lipase activity in reaction conditions that are closer to those used in industrial settings. The described method is sensitive over a wide range of product concentrations and offers good reproducibility. Copyright © 2015 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allaire, Marc, E-mail: allaire@bnl.gov; Moiseeva, Natalia; Botez, Cristian E.
The correlation coefficients calculated between raw powder diffraction profiles can be used to identify ligand-bound/unbound states of lysozyme. The discovery of ligands that bind specifically to a targeted protein benefits from the development of generic assays for high-throughput screening of a library of chemicals. Protein powder diffraction (PPD) has been proposed as a potential method for use as a structure-based assay for high-throughput screening applications. Building on this effort, powder samples of bound/unbound states of soluble hen-egg white lysozyme precipitated with sodium chloride were compared. The correlation coefficients calculated between the raw diffraction profiles were consistent with the known binding properties of the ligands and suggested that the PPD approach can be used even prior to a full description using stereochemically restrained Rietveld refinement.
High-throughput screening and small animal models, where are we?
Giacomotto, Jean; Ségalat, Laurent
2010-01-01
Current high-throughput screening methods for drug discovery rely on the existence of targets. Moreover, most of the hits generated during screenings turn out to be invalid after further testing in animal models. To by-pass these limitations, efforts are now being made to screen chemical libraries on whole animals. One of the most commonly used animal models in biology is the murine model Mus musculus. However, its cost limits its use in large-scale therapeutic screening. In contrast, the nematode Caenorhabditis elegans, the fruit fly Drosophila melanogaster, and the fish Danio rerio are gaining momentum as screening tools. These organisms combine genetic amenability, low cost and culture conditions that are compatible with large-scale screens. Their main advantage is to allow high-throughput screening in a whole-animal context. Moreover, their use is not dependent on the prior identification of a target and permits the selection of compounds with an improved safety profile. This review surveys the versatility of these animal models for drug discovery and discusses the options available today. PMID:20423335
Wu, Szu-Huei; Yao, Chun-Hsu; Hsieh, Chieh-Jui; Liu, Yu-Wei; Chao, Yu-Sheng; Song, Jen-Shin; Lee, Jinq-Chyi
2015-07-10
Sodium-dependent glucose co-transporter 2 (SGLT2) inhibitors are of current interest as a treatment for type 2 diabetes. Efforts have been made to discover phlorizin-related glycosides with good SGLT2 inhibitory activity. To increase structural diversity and better understand the role of non-glycoside SGLT2 inhibitors on glycemic control, we initiated a research program to identify non-glycoside hits from high-throughput screening. Here, we report the development of a novel, fluorogenic probe-based glucose uptake system based on a Cu(I)-catalyzed [3+2] cycloaddition. The safer processes and cheaper substances made the developed assay our first priority for large-scale primary screening as compared to the well-known [(14)C]-labeled α-methyl-D-glucopyranoside ([(14)C]-AMG) radioactive assay. This effort culminated in the identification of a benzimidazole, non-glycoside SGLT2 hit with an EC50 value of 0.62 μM by high-throughput screening of 41,000 compounds. Copyright © 2015 Elsevier B.V. All rights reserved.
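Potency in such a screen is reported as an EC50 from concentration-response data; a standard four-parameter logistic (Hill) fit, shown here with made-up readings rather than the paper's data, looks like this.

```python
import numpy as np
from scipy.optimize import curve_fit


def hill(conc, bottom, top, ec50, slope):
    """Four-parameter logistic concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** slope)


# Made-up readings (fraction of maximal response) versus concentration (uM).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
signal = np.array([0.03, 0.07, 0.14, 0.33, 0.58, 0.82, 0.93, 0.97])

params, _ = curve_fit(hill, conc, signal, p0=[0.0, 1.0, 0.6, 1.0], maxfev=10_000)
bottom, top, ec50, slope = params
print(f"fitted EC50 ~ {ec50:.2f} uM, Hill slope ~ {slope:.2f}")
```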
High-throughput imaging of adult fluorescent zebrafish with an LED fluorescence macroscope
Blackburn, Jessica S; Liu, Sali; Raimondi, Aubrey R; Ignatius, Myron S; Salthouse, Christopher D; Langenau, David M
2011-01-01
Zebrafish are a useful vertebrate model for the study of development, behavior, disease and cancer. A major advantage of zebrafish is that large numbers of animals can be economically used for experimentation; however, high-throughput methods for imaging live adult zebrafish had not been developed. Here, we describe protocols for building a light-emitting diode (LED) fluorescence macroscope and for using it to simultaneously image up to 30 adult animals that transgenically express a fluorescent protein, are transplanted with fluorescently labeled tumor cells or are tagged with fluorescent elastomers. These protocols show that the LED fluorescence macroscope is capable of distinguishing five fluorescent proteins and can image unanesthetized swimming adult zebrafish in multiple fluorescent channels simultaneously. The macroscope can be built and used for imaging within 1 day, whereas creating fluorescently labeled adult zebrafish requires 1 hour to several months, depending on the method chosen. The LED fluorescence macroscope provides a low-cost, high-throughput method to rapidly screen adult fluorescent zebrafish and it will be useful for imaging transgenic animals, screening for tumor engraftment, and tagging individual fish for long-term analysis. PMID:21293462
Bell, Robert T; Jacobs, Alan G; Sorg, Victoria C; Jung, Byungki; Hill, Megan O; Treml, Benjamin E; Thompson, Michael O
2016-09-12
A high-throughput method for characterizing the temperature dependence of material properties following microsecond to millisecond thermal annealing, exploiting the temperature gradients created by a lateral gradient laser spike anneal (lgLSA), is presented. Laser scans generate spatial thermal gradients of up to 5 °C/μm with peak temperatures ranging from ambient to in excess of 1400 °C, limited only by laser power and materials thermal limits. Discrete spatial property measurements across the temperature gradient are then equivalent to independent measurements after varying temperature anneals. Accurate temperature calibrations, essential to quantitative analysis, are critical and methods for both peak temperature and spatial/temporal temperature profile characterization are presented. These include absolute temperature calibrations based on melting and thermal decomposition, and time-resolved profiles measured using platinum thermistors. A variety of spatially resolved measurement probes, ranging from point-like continuous profiling to large area sampling, are discussed. Examples from annealing of III-V semiconductors, CdSe quantum dots, low-κ dielectrics, and block copolymers are included to demonstrate the flexibility, high throughput, and precision of this technique.
Genome-scale measurement of off-target activity using Cas9 toxicity in high-throughput screens.
Morgens, David W; Wainberg, Michael; Boyle, Evan A; Ursu, Oana; Araya, Carlos L; Tsui, C Kimberly; Haney, Michael S; Hess, Gaelen T; Han, Kyuho; Jeng, Edwin E; Li, Amy; Snyder, Michael P; Greenleaf, William J; Kundaje, Anshul; Bassik, Michael C
2017-05-05
CRISPR-Cas9 screens are powerful tools for high-throughput interrogation of genome function, but can be confounded by nuclease-induced toxicity at both on- and off-target sites, likely due to DNA damage. Here, to test potential solutions to this issue, we design and analyse a CRISPR-Cas9 library with 10 variable-length guides per gene and thousands of negative controls targeting non-functional, non-genic regions (termed safe-targeting guides), in addition to non-targeting controls. We find this library has excellent performance in identifying genes affecting growth and sensitivity to the ricin toxin. The safe-targeting guides allow for proper control of toxicity from on-target DNA damage. Using this toxicity as a proxy to measure off-target cutting, we demonstrate with tens of thousands of guides both the nucleotide position-dependent sensitivity to single mismatches and the reduction of off-target cutting using truncated guides. Our results demonstrate a simple strategy for high-throughput evaluation of target specificity and nuclease toxicity in Cas9 screens.
Genome-scale measurement of off-target activity using Cas9 toxicity in high-throughput screens
Morgens, David W.; Wainberg, Michael; Boyle, Evan A.; Ursu, Oana; Araya, Carlos L.; Tsui, C. Kimberly; Haney, Michael S.; Hess, Gaelen T.; Han, Kyuho; Jeng, Edwin E.; Li, Amy; Snyder, Michael P.; Greenleaf, William J.; Kundaje, Anshul; Bassik, Michael C.
2017-01-01
CRISPR-Cas9 screens are powerful tools for high-throughput interrogation of genome function, but can be confounded by nuclease-induced toxicity at both on- and off-target sites, likely due to DNA damage. Here, to test potential solutions to this issue, we design and analyse a CRISPR-Cas9 library with 10 variable-length guides per gene and thousands of negative controls targeting non-functional, non-genic regions (termed safe-targeting guides), in addition to non-targeting controls. We find this library has excellent performance in identifying genes affecting growth and sensitivity to the ricin toxin. The safe-targeting guides allow for proper control of toxicity from on-target DNA damage. Using this toxicity as a proxy to measure off-target cutting, we demonstrate with tens of thousands of guides both the nucleotide position-dependent sensitivity to single mismatches and the reduction of off-target cutting using truncated guides. Our results demonstrate a simple strategy for high-throughput evaluation of target specificity and nuclease toxicity in Cas9 screens. PMID:28474669
Development and use of molecular markers: past and present.
Grover, Atul; Sharma, P C
2016-01-01
Molecular markers, due to their stability, cost-effectiveness and ease of use, provide an immensely popular tool for a variety of applications including genome mapping, gene tagging, genetic diversity analysis, phylogenetic analysis and forensic investigations. In the last three decades, a number of molecular marker techniques have been developed and exploited worldwide in different systems. However, only a handful of these techniques, namely RFLPs, RAPDs, AFLPs, ISSRs, SSRs and SNPs, have received global acceptance. A recent revolution in DNA sequencing techniques has taken the discovery and application of molecular markers to high-throughput and ultrahigh-throughput levels. Although the choice of marker will obviously depend on the targeted use, microsatellites, SNPs and genotyping by sequencing (GBS) largely fulfill most of the user requirements. Further, modern transcriptomic and functional markers, in combination with other high-throughput techniques, will support high-density genetic map construction, identification of QTLs, and breeding and conservation strategies in times to come. This review presents an overview of different marker technologies and their variants with a comparative account of their characteristic features and applications.
Experimental and Study Design Considerations for Uncovering Oncometabolites.
Haznadar, Majda; Mathé, Ewy A
2017-01-01
Metabolomics as a field has gained attention due to its potential for biomarker discovery, namely because it directly reflects disease phenotype and is the downstream effect of posttranslational modifications. The field provides a "top-down," integrated view of biochemistry in complex organisms, as opposed to the traditional "bottom-up" approach that aims to analyze networks of interactions between genes, proteins and metabolites. It also allows for the detection of thousands of endogenous metabolites in various clinical biospecimens in a high-throughput manner, including tissue and biofluids such as blood and urine. Of note, because biological fluid samples can be collected relatively easily, the time-dependent fluctuations of metabolites can be readily studied in detail.In this chapter, we aim to provide an overview of (1) analytical methods that are currently employed in the field, and (2) study design concepts that should be considered prior to conducting high-throughput metabolomics studies. While widely applicable, the concepts presented here are namely applicable to high-throughput untargeted studies that aim to search for metabolite biomarkers that are associated with a particular human disease.
Okagbare, Paul I.; Soper, Steven A.
2011-01-01
Microfluidics represents a viable platform for performing High Throughput Screening (HTS) due to its ability to automate fluid handling and generate fluidic networks with high number densities over small footprints appropriate for the simultaneous optical interrogation of many screening assays. While most HTS campaigns depend on fluorescence, readers typically use point detection and serially address the assay results significantly lowering throughput or detection sensitivity due to a low duty cycle. To address this challenge, we present here the fabrication of a high density microfluidic network packed into the imaging area of a large field-of-view (FoV) ultrasensitive fluorescence detection system. The fluidic channels were 1, 5 or 10 μm (width), 1 μm (depth) with a pitch of 1–10 μm and each fluidic processor was individually addressable. The fluidic chip was produced from a molding tool using hot embossing and thermal fusion bonding to enclose the fluidic channels. A 40X microscope objective (numerical aperture = 0.75) created a FoV of 200 μm, providing the ability to interrogate ~25 channels using the current fluidic configuration. An ultrasensitive fluorescence detection system with a large FoV was used to transduce fluorescence signals simultaneously from each fluidic processor onto the active area of an electron multiplying charge-coupled device (EMCCD). The utility of these multichannel networks for HTS was demonstrated by carrying out the high throughput monitoring of the activity of an enzyme, APE1, used as a model screening assay. PMID:20872611
High-throughput sequencing reveals unprecedented diversities of Aspergillus species in outdoor air.
Lee, S; An, C; Xu, S; Lee, S; Yamamoto, N
2016-09-01
This study used the Illumina MiSeq to analyse compositions and diversities of Aspergillus species in outdoor air. The seasonal air samplings were performed at two locations in Seoul, South Korea. The results showed the relative abundances of all Aspergillus species combined ranging from 0·20 to 18% and from 0·19 to 21% based on the number of the internal transcribed spacer 1 (ITS1) and β-tubulin (BenA) gene sequences respectively. Aspergillus fumigatus was the most dominant species with the mean relative abundances of 1·2 and 5·5% based on the number of the ITS1 and BenA sequences respectively. A total of 29 Aspergillus species were detected and identified down to the species rank, among which nine species were known opportunistic pathogens. Remarkably, eight of the nine pathogenic species were detected by either one of the two markers, suggesting the need of using multiple markers and/or primer pairs when the assessments are made based on the high-throughput sequencing. Due to diversity of species within the genus Aspergillus, the high-throughput sequencing was useful to characterize their compositions and diversities in outdoor air, which are thought to be difficult to be accurately characterized by conventional culture and/or Sanger sequencing-based techniques. Aspergillus is a diverse genus of fungi with more than 300 species reported in literature. Aspergillus is important since some species are known allergens and opportunistic human pathogens. Traditionally, growth-dependent methods have been used to detect Aspergillus species in air. However, these methods are limited in the number of isolates that can be analysed for their identities, resulting in inaccurate characterizations of Aspergillus diversities. This study used the high-throughput sequencing to explore Aspergillus diversities in outdoor, which are thought to be difficult to be accurately characterized by traditional growth-dependent techniques. © 2016 The Society for Applied Microbiology.
Genome sequencing in microfabricated high-density picolitre reactors.
Margulies, Marcel; Egholm, Michael; Altman, William E; Attiya, Said; Bader, Joel S; Bemben, Lisa A; Berka, Jan; Braverman, Michael S; Chen, Yi-Ju; Chen, Zhoutao; Dewell, Scott B; Du, Lei; Fierro, Joseph M; Gomes, Xavier V; Godwin, Brian C; He, Wen; Helgesen, Scott; Ho, Chun Heen; Ho, Chun He; Irzyk, Gerard P; Jando, Szilveszter C; Alenquer, Maria L I; Jarvie, Thomas P; Jirage, Kshama B; Kim, Jong-Bum; Knight, James R; Lanza, Janna R; Leamon, John H; Lefkowitz, Steven M; Lei, Ming; Li, Jing; Lohman, Kenton L; Lu, Hong; Makhijani, Vinod B; McDade, Keith E; McKenna, Michael P; Myers, Eugene W; Nickerson, Elizabeth; Nobile, John R; Plant, Ramona; Puc, Bernard P; Ronan, Michael T; Roth, George T; Sarkis, Gary J; Simons, Jan Fredrik; Simpson, John W; Srinivasan, Maithreyan; Tartaro, Karrie R; Tomasz, Alexander; Vogt, Kari A; Volkmer, Greg A; Wang, Shally H; Wang, Yong; Weiner, Michael P; Yu, Pengguang; Begley, Richard F; Rothberg, Jonathan M
2005-09-15
The proliferation of large-scale DNA-sequencing projects in recent years has driven a search for alternative methods to reduce time and cost. Here we describe a scalable, highly parallel sequencing system with raw throughput significantly greater than that of state-of-the-art capillary electrophoresis instruments. The apparatus uses a novel fibre-optic slide of individual wells and is able to sequence 25 million bases, at 99% or better accuracy, in one four-hour run. To achieve an approximately 100-fold increase in throughput over current Sanger sequencing technology, we have developed an emulsion method for DNA amplification and an instrument for sequencing by synthesis using a pyrosequencing protocol optimized for solid support and picolitre-scale volumes. Here we show the utility, throughput, accuracy and robustness of this system by shotgun sequencing and de novo assembly of the Mycoplasma genitalium genome with 96% coverage at 99.96% accuracy in one run of the machine.
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes - neural connectivity maps of the brain - using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems - reads to parallel disk arrays and writes to solid-state storage - to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
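One common way to "partition a spatial index" for 3-D image cuboids is a Morton (Z-order) key over cuboid coordinates; the sketch below illustrates that general idea and is not a description of the exact openconnecto.me scheme.

```python
def morton3d(x, y, z, bits=21):
    """Interleave the low `bits` of x, y, z into a single Z-order (Morton) key,
    so that cuboids that are close in 3-D space tend to get nearby keys."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key


def node_for_cuboid(x, y, z, n_nodes, cuboid=(128, 128, 16)):
    """Map a voxel coordinate to a cluster node: quantize to cuboid indices,
    compute the Morton key, then assign keys to nodes (simple modulo placement
    here; a range partition over keys is another common choice)."""
    cx, cy, cz = x // cuboid[0], y // cuboid[1], z // cuboid[2]
    return morton3d(cx, cy, cz) % n_nodes


# Voxels within the same 128 x 128 x 16 cuboid map to the same key and hence
# the same node, which keeps spatially local reads on a small set of machines.
print(node_for_cuboid(10_000, 20_000, 500, n_nodes=8))
print(node_for_cuboid(10_100, 20_050, 505, n_nodes=8))
```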
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes—neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
High-throughput technology for novel SO2 oxidation catalysts
Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F
2011-01-01
We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also overview some elements used for prescreening and those remaining after the screening of the first catalyst generations. PMID:27877427
Bazopoulou, Daphne; Chaudhury, Amrita R; Pantazis, Alexandros; Chronis, Nikos
2017-08-24
Discovery of molecular targets or compounds that alter neuronal function can lead to therapeutic advances that ameliorate age-related neurodegenerative pathologies. Currently, there is a lack of in vivo screening technologies for the discovery of compounds that affect the age-dependent neuronal physiology. Here, we present a high-throughput, microfluidic-based assay for automated manipulation and on-chip monitoring and analysis of stimulus-evoked calcium responses of intact C. elegans at various life stages. First, we successfully applied our technology to quantify the effects of aging and age-related genetic and chemical factors in the calcium transients of the ASH sensory neuron. We then performed a large-scale screen of a library of 107 FDA-approved compounds to identify hits that prevented the age-dependent functional deterioration of ASH. The robust performance of our assay makes it a valuable tool for future high-throughput applications based on in vivo functional imaging.
Tong, Zhi-Bin; Hogberg, Helena; Kuo, David; Sakamuru, Srilatha; Xia, Menghang; Smirnova, Lena; Hartung, Thomas; Gerhold, David
2017-02-01
More than 75 000 man-made chemicals contaminate the environment; many of these have not been tested for toxicities. These chemicals demand quantitative high-throughput screening assays to assess them for causative roles in neurotoxicities, including Parkinson's disease and other neurodegenerative disorders. To facilitate high throughput screening for cytotoxicity to neurons, three human neuronal cellular models were compared: SH-SY5Y neuroblastoma cells, LUHMES conditionally-immortalized dopaminergic neurons, and Neural Stem Cells (NSC) derived from human fetal brain. These three cell lines were evaluated for rapidity and degree of differentiation, and sensitivity to 32 known or candidate neurotoxicants. First, expression of neural differentiation genes was assayed during a 7-day differentiation period. Of the three cell lines, LUHMES showed the highest gene expression of neuronal markers after differentiation. Both in the undifferentiated state and after 7 days of neuronal differentiation, LUHMES cells exhibited greater cytotoxic sensitivity to most of the 32 suspected or known neurotoxicants than SH-SY5Y or NSCs. LUHMES cells were also unique in being more susceptible to several compounds in the differentiating state than in the undifferentiated state, including the known neurotoxicants colchicine, methyl-mercury (II), and vincristine. Gene expression results suggest that differentiating LUHMES cells may be susceptible to apoptosis because they express low levels of anti-apoptotic genes BCL2 and BIRC5/survivin, whereas SH-SY5Y cells may be resistant to apoptosis because they express high levels of BCL2, BIRC5/survivin, and BIRC3 genes. Thus, LUHMES cells exhibited favorable characteristics for neuro-cytotoxicity screening: rapid differentiation into neurons that exhibit high-level expression of neuronal marker genes, and marked sensitivity of LUHMES cells to known neurotoxicants. Copyright © 2016 John Wiley & Sons, Ltd.
TimeXNet Web: Identifying cellular response networks from diverse omics time-course data.
Tan, Phit Ling; López, Yosvany; Nakai, Kenta; Patil, Ashwini
2018-05-14
Condition-specific time-course omics profiles are frequently used to study cellular response to stimuli and identify associated signaling pathways. However, few online tools allow users to analyze multiple types of high-throughput time-course data. TimeXNet Web is a web server that extracts a time-dependent gene/protein response network from time-course transcriptomic, proteomic or phospho-proteomic data, and an input interaction network. It classifies the given genes/proteins into time-dependent groups based on the time of their highest activity and identifies the most probable paths connecting genes/proteins in consecutive groups. The response sub-network is enriched in activated genes/proteins and contains novel regulators that do not show any observable change in the input data. Users can view the resultant response network and analyze it for functional enrichment. TimeXNet Web supports the analysis of high-throughput data from multiple species by providing high quality, weighted protein-protein interaction networks for 12 model organisms. http://txnet.hgc.jp/. ashwini@hgc.jp. Supplementary data are available at Bioinformatics online.
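The first step described above, classifying genes/proteins into time-dependent groups by the time of their highest activity, can be sketched as follows (toy values, not TimeXNet's code).

```python
def group_by_peak_time(profiles, timepoints):
    """Assign each gene/protein to the time point at which its measured
    activity (e.g., |log2 fold change|) is highest.

    `profiles` maps gene name -> list of activity values, one per time point."""
    groups = {t: [] for t in timepoints}
    for gene, values in profiles.items():
        peak_idx = max(range(len(values)), key=lambda i: abs(values[i]))
        groups[timepoints[peak_idx]].append(gene)
    return groups


# Made-up log2 fold changes at three time points after stimulation.
profiles = {
    "EGR1":  [3.1, 1.2, 0.4],   # early responder
    "SOCS3": [0.5, 2.7, 1.1],   # intermediate
    "MMP9":  [0.2, 0.8, 2.4],   # late
}
print(group_by_peak_time(profiles, ["15min", "1h", "6h"]))
```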
Performance Evaluation of Bluetooth Low Energy: A Systematic Review.
Tosi, Jacopo; Taffoni, Fabrizio; Santacatterina, Marco; Sannino, Roberto; Formica, Domenico
2017-12-13
Small, compact and embedded sensors are a pervasive technology in everyday life for a wide number of applications (e.g., wearable devices, domotics, e-health systems, etc.). In this context, wireless transmission plays a key role, and among available solutions, Bluetooth Low Energy (BLE) is gaining more and more popularity. BLE merges together good performance, low-energy consumption and widespread diffusion. The aim of this work is to review the main methodologies adopted to investigate BLE performance. The first part of this review is an in-depth description of the protocol, highlighting the main characteristics and implementation details. The second part reviews the state of the art on BLE characteristics and performance. In particular, we analyze throughput, maximum number of connectable sensors, power consumption, latency and maximum reachable range, with the aim of identifying the current limits of BLE technology. The main results can be summarized as follows: throughput may theoretically reach the limit of ~230 kbps, but actual applications analyzed in this review show throughputs limited to ~100 kbps; the maximum reachable range is strictly dependent on the radio power, and it goes up to a few tens of meters; the maximum number of nodes in the network depends on connection parameters, on the network architecture and specific device characteristics, but it is usually lower than 10; power consumption and latency are largely modeled and analyzed and are strictly dependent on a huge number of parameters. Most of these characteristics are based on analytical models, but there is a need for rigorous experimental evaluations to understand the actual limits.
Performance Evaluation of Bluetooth Low Energy: A Systematic Review
Taffoni, Fabrizio; Santacatterina, Marco; Sannino, Roberto
2017-01-01
Small, compact and embedded sensors are a pervasive technology in everyday life for a wide number of applications (e.g., wearable devices, domotics, e-health systems, etc.). In this context, wireless transmission plays a key role, and among available solutions, Bluetooth Low Energy (BLE) is gaining more and more popularity. BLE merges together good performance, low-energy consumption and widespread diffusion. The aim of this work is to review the main methodologies adopted to investigate BLE performance. The first part of this review is an in-depth description of the protocol, highlighting the main characteristics and implementation details. The second part reviews the state of the art on BLE characteristics and performance. In particular, we analyze throughput, maximum number of connectable sensors, power consumption, latency and maximum reachable range, with the aim of identifying the current limits of BLE technology. The main results can be summarized as follows: throughput may theoretically reach the limit of ~230 kbps, but actual applications analyzed in this review show throughputs limited to ~100 kbps; the maximum reachable range is strictly dependent on the radio power, and it goes up to a few tens of meters; the maximum number of nodes in the network depends on connection parameters, on the network architecture and specific device characteristics, but it is usually lower than 10; power consumption and latency are largely modeled and analyzed and are strictly dependent on a huge number of parameters. Most of these characteristics are based on analytical models, but there is a need for rigorous experimental evaluations to understand the actual limits. PMID:29236085
Khalid, Ruzelan; Nawawi, Mohd Kamal M; Kawsar, Luthful A; Ghani, Noraida A; Kamil, Anton A; Mustafa, Adli
2013-01-01
M/G/C/C state-dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete-event simulation (DES) software. We designed an approach to address this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time and the expected number of entities in a complex network topology. Results indicated that, for each network, there is a range of arrival rates in which the simulation results fluctuate drastically across replications, causing discrepancies between the simulation and analytical results. Detailed results showing how closely the simulation results tally with the analytical results, in both tabular and graphical form, together with scientific justifications, have been documented and discussed.
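To illustrate what a state-dependent queuing model computes, the sketch below simulates a Markovian loss queue whose per-entity service rate slows as occupancy grows and reports throughput and blocking probability. The linear congestion law, capacity, and rates are assumptions; a full M/G/C/C treatment additionally allows general service-time distributions.

```python
import random

def simulate_state_dependent_queue(arrival_rate, capacity, base_service_rate,
                                   horizon, seed=1):
    """Simulate a state-dependent loss queue (Markovian simplification of
    M/G/C/C): each of the n residing entities is served at a rate that
    decreases linearly as the system fills. Returns (throughput, P_block)."""
    rng = random.Random(seed)

    def per_entity_rate(n):
        # Hypothetical congestion law: full speed when nearly empty, slower when crowded.
        return base_service_rate * (capacity - n + 1) / capacity

    t, n = 0.0, 0
    arrivals = blocked = departures = 0
    while t < horizon:
        total_rate = arrival_rate + n * per_entity_rate(n)
        t += rng.expovariate(total_rate)
        if rng.random() < arrival_rate / total_rate:   # next event is an arrival
            arrivals += 1
            if n < capacity:
                n += 1
            else:
                blocked += 1                            # entity lost (blocked)
        else:                                           # next event is a departure
            n -= 1
            departures += 1
    return departures / horizon, blocked / max(arrivals, 1)

throughput, p_block = simulate_state_dependent_queue(
    arrival_rate=5.0, capacity=20, base_service_rate=1.0, horizon=10_000)
print(f"throughput ~ {throughput:.2f} entities/unit time, P(block) ~ {p_block:.3f}")
```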
Zhang, Guozhu; Xie, Changsheng; Zhang, Shunping; Zhao, Jianwei; Lei, Tao; Zeng, Dawen
2014-09-08
A combinatorial high-throughput temperature-programmed method to obtain the optimal operating temperature (OOT) of gas sensor materials is demonstrated here for the first time. A material library consisting of SnO2, ZnO, WO3, and In2O3 sensor films was fabricated by screen printing. Temperature-dependent conductivity curves were obtained by scanning this gas sensor library from 300 to 700 K in different atmospheres (dry air, formaldehyde, carbon monoxide, nitrogen dioxide, toluene and ammonia), giving the OOT of each sensor formulation as a function of the carrier and analyte gases. A comparative study of the temperature-programmed method and a conventional method showed good agreement in measured OOT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PANDOLFI, RONALD; KUMAR, DINESH; VENKATAKRISHNAN, SINGANALLUR
Xi-CAM aims to provide a community-driven platform for multimodal analysis in synchrotron science. The platform core provides a robust plugin infrastructure for extensibility, allowing continuing development to simply add further functionality. Current modules include tools for characterization with (GI)SAXS, Tomography, and XAS. This will continue to serve as a development base as algorithms for multimodal analysis develop. Seamless remote data access, visualization and analysis are key elements of Xi-CAM, and will become critical to synchrotron data infrastructure as expectations for future data volume and acquisition rates rise with continuously increasing throughputs. The highly interactive design elements of Xi-CAM will similarly support a generation of users who depend on immediate data-quality feedback during high-throughput or burst acquisition modes.
NASA Astrophysics Data System (ADS)
Zhuang, Huidong; Zhang, Xiaodong
2013-08-01
In large tokamaks, disruption of high-current plasma would damage plasma-facing component (PFC) surfaces or other inner components due to the high heat load, electromagnetic force load and runaway electrons. It would also influence the subsequent plasma discharge due to the production of impurities during disruptions. So the avoidance and mitigation of disruptions is essential for the next generation of tokamaks, such as ITER. Massive gas injection (MGI) is a promising method of disruption mitigation. A new fast valve has been developed successfully on EAST. The valve can be opened in 0.5 ms, and the duration of the open state is largely dependent on the gas pressure and capacitor voltage. The throughput of the valve can be adjusted from 0 mbar·L to 700 mbar·L by changing the capacitor voltage and gas pressure. The response time and throughput of the fast valve can meet the requirements of disruption mitigation on EAST. In the last round of EAST and HT-7 campaigns in 2010, the fast valve operated successfully. He and Ar were used for disruption mitigation on HT-7. By injecting the proper amount of gas, the current quench rate could be slowed down, and the impurity radiation would be greatly improved. In elongated plasmas of EAST discharges, the experimental data are contrary to expectations.
Verzotto, Davide; M Teo, Audrey S; Hillmer, Axel M; Nagarajan, Niranjan
2016-01-01
Resolution of complex repeat structures and rearrangements in the assembly and analysis of large eukaryotic genomes is often aided by a combination of high-throughput sequencing and genome-mapping technologies (for example, optical restriction mapping). In particular, mapping technologies can generate sparse maps of large DNA fragments (150 kilo base pairs (kbp) to 2 Mbp) and thus provide a unique source of information for disambiguating complex rearrangements in cancer genomes. Despite their utility, combining high-throughput sequencing and mapping technologies has been challenging because of the lack of efficient and sensitive map-alignment algorithms for robustly aligning error-prone maps to sequences. We introduce a novel seed-and-extend glocal (short for global-local) alignment method, OPTIMA (and a sliding-window extension for overlap alignment, OPTIMA-Overlap), which is the first to create indexes for continuous-valued mapping data while accounting for mapping errors. We also present a novel statistical model, agnostic with respect to technology-dependent error rates, for conservatively evaluating the significance of alignments without relying on expensive permutation-based tests. We show that OPTIMA and OPTIMA-Overlap outperform other state-of-the-art approaches (1.6-2 times more sensitive) and are more efficient (170-200 %) and precise in their alignments (nearly 99 % precision). These advantages are independent of the quality of the data, suggesting that our indexing approach and statistical evaluation are robust, provide improved sensitivity and guarantee high precision.
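The indexing idea behind such map aligners can be illustrated with a toy seed step: quantize runs of consecutive fragment sizes so that noisy, continuous-valued measurements hash to the same key. The bin width, k-mer length, and map values below are assumptions for illustration; OPTIMA's actual composite seeds, error model, and glocal dynamic-programming extension are considerably more involved.

```python
from collections import defaultdict

BIN_KBP = 5.0  # assumed sizing-error tolerance for hashing fragment sizes

def quantize(fragments):
    """Discretize continuous-valued fragment sizes (kbp) so noisy
    measurements of the same fragments fall into the same hash key."""
    return tuple(round(f / BIN_KBP) for f in fragments)

def build_seed_index(reference_map, k=3):
    """Index every run of k consecutive reference fragments by quantized size."""
    index = defaultdict(list)
    for i in range(len(reference_map) - k + 1):
        index[quantize(reference_map[i:i + k])].append(i)
    return index

def candidate_anchors(query_map, index, k=3):
    """Seed step only: return (query_pos, ref_pos) pairs sharing a quantized
    k-mer; a real aligner extends each anchor glocally and scores sizing
    and missing/extra-cut errors."""
    return [(j, i)
            for j in range(len(query_map) - k + 1)
            for i in index.get(quantize(query_map[j:j + k]), [])]

# Toy maps: fragment sizes in kbp (illustrative values only).
reference = [150.2, 48.9, 210.5, 75.3, 99.8, 160.1, 52.4]
query = [49.5, 209.0, 76.1]                 # noisy copy of reference[1:4]
print(candidate_anchors(query, build_seed_index(reference)))  # -> [(0, 1)]
```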
2014-01-01
Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312
Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang
2014-03-05
RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
Background: High-throughput in vitro screening is an important tool for evaluating the potential biological activity of the thousands of existing chemicals in commerce and the hundreds more introduced each year. Among the assay technologies available, high-content imaging...
Hoang, Van-An; Subramaniyam, Sathiyamoorthy; Kang, Jong-Pyo; Kang, Chang Ho; Yang, Deok-Chun
2016-01-01
Traditional molecular methods have been used to examine bacterial communities in ginseng-cultivated soil samples in a time-dependent manner. Despite these efforts, our understanding of the bacterial community is still inadequate. Therefore, in this study, a high-throughput sequencing approach was employed to investigate bacterial diversity in various ginseng field soil samples over cultivation times of 2, 4, and 6 years in the first and second rounds of cultivation. We used non-cultivated soil samples to perform a comparative study. Moreover, this study assessed changes in the bacterial community associated with soil depth and the health state of the ginseng. Bacterial richness decreased through years of cultivation. This study detected differences in relative abundance of bacterial populations between the first and second rounds of cultivation, years of cultivation, and health states of ginseng. These bacterial populations were mainly distributed in the classes Acidobacteria, Alphaproteobacteria, Deltaproteobacteria, Gammaproteobacteria, and Sphingobacteria. In addition, we found that pH, available phosphorus, and exchangeable Ca+ seemed to have high correlations with bacterial class in ginseng cultivated soil. PMID:27187071
Nguyen, Ngoc-Lan; Kim, Yeon-Ju; Hoang, Van-An; Subramaniyam, Sathiyamoorthy; Kang, Jong-Pyo; Kang, Chang Ho; Yang, Deok-Chun
2016-01-01
Traditional molecular methods have been used to examine bacterial communities in ginseng-cultivated soil samples in a time-dependent manner. Despite these efforts, our understanding of the bacterial community is still inadequate. Therefore, in this study, a high-throughput sequencing approach was employed to investigate bacterial diversity in various ginseng field soil samples over cultivation times of 2, 4, and 6 years in the first and second rounds of cultivation. We used non-cultivated soil samples to perform a comparative study. Moreover, this study assessed changes in the bacterial community associated with soil depth and the health state of the ginseng. Bacterial richness decreased through years of cultivation. This study detected differences in relative abundance of bacterial populations between the first and second rounds of cultivation, years of cultivation, and health states of ginseng. These bacterial populations were mainly distributed in the classes Acidobacteria, Alphaproteobacteria, Deltaproteobacteria, Gammaproteobacteria, and Sphingobacteria. In addition, we found that pH, available phosphorus, and exchangeable Ca+ seemed to have high correlations with bacterial class in ginseng cultivated soil.
High throughput DNA damage quantification of human tissue with home-based collection device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costes, Sylvain V.; Tang, Jonathan; Yannone, Steven M.
Kits, methods, and systems for providing a subject with information regarding the state of the subject's DNA damage. Collection, processing, and analysis of samples are also described.
Prediction of Chemical Function: Model Development and Application
The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...
THE TOXCAST PROGRAM FOR PRIORITIZING TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS
The United States Environmental Protection Agency (EPA) is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals...
Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.
Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong
2008-04-01
The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.
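As a rough illustration of image-domain (rather than spot-matching) registration, the sketch below recovers a purely translational offset between two gel images by phase correlation. This is only a toy stand-in: RAIN itself uses a multi-resolution, volume-invariant B-spline model that also corrects expression inhomogeneity.

```python
import numpy as np

def phase_correlation_shift(fixed, moving):
    """Estimate a translational offset between two images by phase
    correlation, working directly on pixel intensities rather than on
    symbolically detected spots."""
    F = np.fft.fft2(fixed)
    M = np.fft.fft2(moving)
    cross_power = F * np.conj(M)
    cross_power /= np.maximum(np.abs(cross_power), 1e-12)
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around indices to signed shifts.
    if dy > fixed.shape[0] // 2:
        dy -= fixed.shape[0]
    if dx > fixed.shape[1] // 2:
        dx -= fixed.shape[1]
    return dy, dx

# Synthetic check: shift an image by (5, -3) and recover the offset.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
shifted = np.roll(img, shift=(5, -3), axis=(0, 1))
print(phase_correlation_shift(shifted, img))  # expect (5, -3)
```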
VIRTEX-5 Fpga Implementation of Advanced Encryption Standard Algorithm
NASA Astrophysics Data System (ADS)
Rais, Muhammad H.; Qasim, Syed M.
2010-06-01
In this paper, we present an implementation of Advanced Encryption Standard (AES) cryptographic algorithm using state-of-the-art Virtex-5 Field Programmable Gate Array (FPGA). The design is coded in Very High Speed Integrated Circuit Hardware Description Language (VHDL). Timing simulation is performed to verify the functionality of the designed circuit. Performance evaluation is also done in terms of throughput and area. The design implemented on Virtex-5 (XC5VLX50FFG676-3) FPGA achieves a maximum throughput of 4.34 Gbps utilizing a total of 399 slices.
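A back-of-envelope way to relate clock frequency and per-block latency to the reported throughput figure; the cycle count and clock below are assumptions chosen only to land near the cited 4.34 Gbps, not values taken from the paper.

```python
def aes_throughput_gbps(f_max_mhz, cycles_per_block, block_bits=128):
    """Throughput of an iterative (non-pipelined) AES core: one 128-bit
    block completes every `cycles_per_block` clock cycles."""
    blocks_per_second = f_max_mhz * 1e6 / cycles_per_block
    return blocks_per_second * block_bits / 1e9

# Example: a ~339 MHz clock and 10 cycles per block gives ~4.34 Gbps,
# in the same ballpark as the reported Virtex-5 result (assumed figures).
print(f"{aes_throughput_gbps(339, 10):.2f} Gbps")
```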
Optimal forwarding ratio on dynamical networks with heterogeneous mobility
NASA Astrophysics Data System (ADS)
Gan, Yu; Tang, Ming; Yang, Hanxin
2013-05-01
Since the discovery of non-Poisson statistics of human mobility trajectories, more attention has been paid to understanding the role of these patterns in different dynamics. In this study, we first introduce the heterogeneous mobility of mobile agents into dynamical networks, and then investigate packet forwarding strategies on the heterogeneous dynamical networks. We find that a faster speed and a higher proportion of high-speed agents can enhance the network throughput and reduce the mean traveling time under random forwarding. A hierarchical structure is observed in the dependence on the high speed: the network throughput remains unchanged at both small and large high-speed values. It is also interesting to find that slightly preferential forwarding to high-speed agents can maximize the network capacity. Through theoretical analysis and numerical simulations, we show that the optimal forwarding ratio stems from the local structural heterogeneity of low-speed agents.
2012-01-01
Current therapies to enhance CNS cholinergic function rely primarily on extracellular acetylcholinesterase (AChE) inhibition, a pharmacotherapeutic strategy that produces dose-limiting side effects. The Na+-dependent, high-affinity choline transporter (CHT) is an unexplored target for cholinergic medication development. Although functional at the plasma membrane, CHT at steady-state is localized to synaptic vesicles such that vesicular fusion can support a biosynthetic response to neuronal excitation. To identify allosteric potentiators of CHT activity, we mapped endocytic sequences in the C-terminus of human CHT, identifying transporter mutants that exhibit significantly increased transport function. A stable HEK-293 cell line was generated from one of these mutants (CHT LV-AA) and used to establish a high-throughput screen (HTS) compatible assay based on the electrogenic nature of the transporter. We established that the addition of choline to these cells, at concentrations appropriate for high-affinity choline transport at presynaptic terminals, generates a hemicholinium-3 (HC-3)-sensitive, membrane depolarization that can be used for the screening of CHT inhibitors and activators. Using this assay, we discovered that staurosporine increased CHT LV-AA choline uptake activity, an effect mediated by a decrease in choline KM with no change in Vmax. As staurosporine did not change surface levels of CHT, nor inhibit HC-3 binding, we propose that its action is directly or indirectly allosteric in nature. Surprisingly, staurosporine reduced choline-induced membrane depolarization, suggesting that increased substrate coupling to ion gradients, arising at the expense of nonstoichiometric ion flow, accompanies a shift of CHT to a higher-affinity state. Our findings provide a new approach for the identification of CHT modulators that is compatible with high-throughput screening approaches and presents a novel model by which small molecules can enhance substrate flux through enhanced gradient coupling. PMID:23077721
Ruggiero, Alicia M; Wright, Jane; Ferguson, Shawn M; Lewis, Michelle; Emerson, Katie S; Iwamoto, Hideki; Ivy, Michael T; Holmstrand, Ericka C; Ennis, Elizabeth A; Weaver, C David; Blakely, Randy D
2012-10-17
Current therapies to enhance CNS cholinergic function rely primarily on extracellular acetylcholinesterase (AChE) inhibition, a pharmacotherapeutic strategy that produces dose-limiting side effects. The Na(+)-dependent, high-affinity choline transporter (CHT) is an unexplored target for cholinergic medication development. Although functional at the plasma membrane, CHT at steady-state is localized to synaptic vesicles such that vesicular fusion can support a biosynthetic response to neuronal excitation. To identify allosteric potentiators of CHT activity, we mapped endocytic sequences in the C-terminus of human CHT, identifying transporter mutants that exhibit significantly increased transport function. A stable HEK-293 cell line was generated from one of these mutants (CHT LV-AA) and used to establish a high-throughput screen (HTS) compatible assay based on the electrogenic nature of the transporter. We established that the addition of choline to these cells, at concentrations appropriate for high-affinity choline transport at presynaptic terminals, generates a hemicholinium-3 (HC-3)-sensitive, membrane depolarization that can be used for the screening of CHT inhibitors and activators. Using this assay, we discovered that staurosporine increased CHT LV-AA choline uptake activity, an effect mediated by a decrease in choline K(M) with no change in V(max). As staurosporine did not change surface levels of CHT, nor inhibit HC-3 binding, we propose that its action is directly or indirectly allosteric in nature. Surprisingly, staurosporine reduced choline-induced membrane depolarization, suggesting that increased substrate coupling to ion gradients, arising at the expense of nonstoichiometric ion flow, accompanies a shift of CHT to a higher-affinity state. Our findings provide a new approach for the identification of CHT modulators that is compatible with high-throughput screening approaches and presents a novel model by which small molecules can enhance substrate flux through enhanced gradient coupling.
SMARTIV: combined sequence and structure de-novo motif discovery for in-vivo RNA binding data.
Polishchuk, Maya; Paz, Inbal; Yakhini, Zohar; Mandel-Gutfreund, Yael
2018-05-25
Gene expression regulation is highly dependent on binding of RNA-binding proteins (RBPs) to their RNA targets. Growing evidence supports the notion that both RNA primary sequence and its local secondary structure play a role in specific Protein-RNA recognition and binding. Despite the great advance in high-throughput experimental methods for identifying sequence targets of RBPs, predicting the specific sequence and structure binding preferences of RBPs remains a major challenge. We present a novel webserver, SMARTIV, designed for discovering and visualizing combined RNA sequence and structure motifs from high-throughput RNA-binding data, generated from in-vivo experiments. The uniqueness of SMARTIV is that it predicts motifs from enriched k-mers that combine information from ranked RNA sequences and their predicted secondary structure, obtained using various folding methods. Consequently, SMARTIV generates Position Weight Matrices (PWMs) in a combined sequence and structure alphabet with assigned P-values. SMARTIV concisely represents the sequence and structure motif content as a single graphical logo, which is informative and easy for visual perception. SMARTIV was examined extensively on a variety of high-throughput binding experiments for RBPs from different families, generated from different technologies, showing consistent and accurate results. Finally, SMARTIV is a user-friendly webserver, highly efficient in run-time and freely accessible via http://smartiv.technion.ac.il/.
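A toy sketch of building a position weight matrix over a combined sequence-structure alphabet from aligned, enriched k-mers. The two-character symbols (base plus paired/unpaired state), pseudocount, and equal k-mer weighting are assumptions; SMARTIV's actual alphabet, enrichment weighting, and P-value assignment differ.

```python
import math
from collections import Counter

def pwm_from_kmers(kmers, pseudocount=0.5):
    """Build a position weight matrix (log-odds vs. a uniform background)
    from equally weighted, aligned k-mers over a combined alphabet."""
    alphabet = sorted({sym for kmer in kmers for sym in kmer})
    background = 1.0 / len(alphabet)
    pwm = []
    for pos in range(len(kmers[0])):
        counts = Counter(kmer[pos] for kmer in kmers)
        total = len(kmers) + pseudocount * len(alphabet)
        pwm.append({sym: math.log2((counts[sym] + pseudocount) / total / background)
                    for sym in alphabet})
    return pwm

# Combined alphabet: "Au" = A unpaired, "Gp" = G paired, etc. (illustrative).
kmers = [("Au", "Gp", "Gp", "Au"),
         ("Au", "Gp", "Gp", "Cu"),
         ("Uu", "Gp", "Gp", "Au")]
for pos, column in enumerate(pwm_from_kmers(kmers)):
    best = max(column, key=column.get)
    print(pos, best, round(column[best], 2))
```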
Pauli, Daniela; Seyfarth, Michael; Dibbelt, Leif
2005-01-01
Applying basic potentiometric and photometric assays, we evaluated the fully automated random-access chemistry analyzer Architect c8000, a new member of the Abbott Architect system family, with respect to both its analytical and operational performance and compared it to an established high-throughput chemistry platform, the Abbott Aeroset. Our results demonstrate that intra- and inter-assay imprecision, inaccuracy, lower limit of detection and linear range of the c8000 generally meet current requirements of laboratory diagnostics; there were only rare exceptions, e.g. assays for plasma lipase or urine uric acid, which apparently need to be improved by additional rinsing of reagent pipettors. Even with plasma exhibiting CK activities as high as 40,000 U/l, sample carryover by the c8000 could not be detected. Comparison of methods run on the c8000 and the Aeroset revealed correlation coefficients of 0.98-1.00; if identical chemistries were applied on both analyzers, slopes of regression lines approached unity. With typical laboratory workloads including 10-20% STAT samples and up to 10% samples with high analyte concentrations demanding dilutional reruns, steady-state throughput numbers of 700 to 800 tests per hour were obtained with the c8000. The system generally responded to STAT orders within 2 minutes, yielding analytical STAT order completion times of 5 to 15 minutes depending on the type and number of assays requested per sample. Due to its extended test and sample processing capabilities and user-friendly software, the c8000 may meet the varying needs of clinical laboratories rather well.
Yip, Kenneth W.; Cuddy, Michael; Pinilla, Clemencia; Giulanotti, Marc; Heynen-Genel, Susanne; Matsuzawa, Shu-ichi; Reed, John C.
2014-01-01
PML is a tumor suppressor that promotes apoptosis through both p53-dependent and -independent mechanisms, participates in Rb-mediated cell cycle arrest, inhibits neoangiogenesis, and contributes to maintenance of genomic stability. PML also plays a role in host defense against viruses, conferring antiviral activity. When active, PML localizes to subnuclear structures named PML oncogenic domains (PODs) or PML nuclear bodies (PML-NBs), whereas inactive PML is located diffusely throughout the nucleus of cells, thus providing a morphological indicator. Known activators of PML include arsenicals and interferons; however, these agents induce a plethora of toxic effects, limiting their effectiveness. The objective of the current study was to develop a high content screening (HCS) assay for the identification of chemical activators of PML. We describe methods for automated analysis of POD formation using high throughput microscopy (HTM) to localize PML immunofluorescence in conjunction with image analysis software for POD quantification. Using this HCS assay in 384-well format, we performed pilot screens of a small synthetic chemical library and mixture-based combinatorial libraries, demonstrating the robust performance of the assay. HCS counter-screening assays were also developed for hit characterization, based on immunofluorescence analyses of the subcellular location of phosphorylated H2AX or phosphorylated CHK1, which increase in a punctate nuclear pattern in response to DNA damage. Thus, the HCS assay devised here represents a high throughput screen that can be utilized to discover POD-inducing compounds that may restore the tumor suppressor activity of PML in cancers or possibly promote anti-viral states. PMID:21233309
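A minimal sketch of the POD-quantification step in such a high-content pipeline: threshold the PML channel inside a segmented nucleus and count connected bright spots. The threshold and synthetic image are assumptions; production pipelines add per-cell segmentation and size/intensity filters.

```python
import numpy as np
from scipy import ndimage as ndi

def count_puncta(nuclear_mask, pml_image, punctum_threshold):
    """Count nuclear PML puncta: threshold the immunofluorescence channel
    within the segmented nucleus and label connected components."""
    spots = (pml_image > punctum_threshold) & nuclear_mask
    labels, n_puncta = ndi.label(spots)
    return n_puncta

# Synthetic nucleus with three bright puncta.
img = np.zeros((64, 64))
mask = np.zeros((64, 64), dtype=bool)
mask[10:50, 10:50] = True
for y, x in [(20, 20), (30, 35), (40, 25)]:
    img[y - 1:y + 2, x - 1:x + 2] = 1.0
print(count_puncta(mask, img, punctum_threshold=0.5))  # -> 3
```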
Strategic and Operational Plan for Integrating Transcriptomics ...
Plans for incorporating high throughput transcriptomics into the current high throughput screening activities at NCCT; the details are in the attached slide presentation, given at the OECD meeting on June 23, 2016
High-Throughput Experimental Approach Capabilities | Materials Science |
NREL's high-throughput experimental approach capabilities include combinatorial sputtering chambers, e.g., (…,Te) and oxysulfide sputtering, and Combi-5 for nitride and oxynitride sputtering, among several others.
A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.
Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham
2017-08-01
Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have developed a new, fully automated, high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
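A toy version of the orientation analysis: given detected head and tail coordinates for each fish, compute headings and the fraction oriented into the current. The coordinates, upstream direction, and angular tolerance are assumptions, not the paper's parameters.

```python
import math

def rheotaxis_fraction(head_tail_pairs, upstream_deg, tolerance_deg=45.0):
    """Fraction of detected fish oriented head-into-current: the heading
    from tail to head must lie within `tolerance_deg` of the upstream
    direction."""
    oriented = 0
    for (hx, hy), (tx, ty) in head_tail_pairs:
        heading = math.degrees(math.atan2(hy - ty, hx - tx)) % 360.0
        diff = abs((heading - upstream_deg + 180.0) % 360.0 - 180.0)
        if diff <= tolerance_deg:
            oriented += 1
    return oriented / len(head_tail_pairs)

# Three fish; flow runs in the +x direction, so upstream is 180 degrees.
fish = [((10, 5), (20, 5)),    # facing -x: oriented upstream
        ((12, 8), (14, 15)),   # facing up and to the left
        ((30, 2), (22, 2))]    # facing +x: swept with the flow
print(rheotaxis_fraction(fish, upstream_deg=180.0))  # -> 1/3
```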
Jaramillo, Thomas F; Baeck, Sung-Hyeon; Kleiman-Shwarsctein, Alan; Choi, Kyoung-Shin; Stucky, Galen D; McFarland, Eric W
2005-01-01
High-throughput electrochemical methods have been developed for the investigation of Zn1-xCo(x)O films for photoelectrochemical hydrogen production from water. A library of 120 samples containing 27 different compositions (0
Toxicokinetic and Dosimetry Modeling Tools for Exposure ...
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster and can provide mechanistic insights on chemical action. However, extrapolating from in vitro chemical concentrations to target tissue or blood concentrations in vivo is fraught with uncertainties, and modeling is dependent upon pharmacokinetic variables not measured in in vitro assays. To address this need, new tools have been created for characterizing, simulating, and evaluating chemical toxicokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissue microdosimetry PK models relate whole-body chemical exposures to cell-scale concentrations. These tools rely on high-throughput in vitro measurements, and successful methods exist for pharmaceutical compounds that determine PK from limited in vitro measurements and chemical structure-derived property predictions. These high throughput (HT) methods provide a more rapid and less resource–intensive alternative to traditional PK model development. We have augmented these in vitro data with chemical structure-based descriptors and mechanistic tissue partitioning models to construct HTPBPK models for over three hundred environmental and pharmace
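A minimal sketch of the simplest model in this family: a one-compartment pharmacokinetic curve with first-order absorption and elimination (the Bateman function). Parameter values are illustrative, and real HT-PBPK models add tissue compartments and derive clearance from in vitro measurements and structure-based property predictions.

```python
import numpy as np

def one_compartment_conc(t_h, dose_mg, v_d_l, ka_per_h, ke_per_h, f_abs=1.0):
    """Plasma concentration (mg/L) after an oral dose, assuming a single
    well-mixed compartment with first-order absorption (ka) and
    elimination (ke); requires ka != ke."""
    t = np.asarray(t_h, dtype=float)
    return (f_abs * dose_mg * ka_per_h / (v_d_l * (ka_per_h - ke_per_h))
            * (np.exp(-ke_per_h * t) - np.exp(-ka_per_h * t)))

times = np.linspace(0, 24, 7)  # hours
conc = one_compartment_conc(times, dose_mg=100, v_d_l=40, ka_per_h=1.0, ke_per_h=0.1)
print(np.round(conc, 2))  # concentration at 0, 4, 8, ... 24 h
```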
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanic, Vesna; Broadbent, Charlotte; DiMasi, Elaine
2016-11-14
The interactions of mixtures of anionic and amphoteric surfactants with sugar amphiphiles were studied via high throughput small angle x-ray scattering (SAXS). The sugar amphiphile was composed of Caprate, Caprylate, and Oleate mixed ester of methyl glucoside, MeGCCO. Surfactant combinations with desirable physical properties are sought, and they must be identified in a cost-effective manner that can access the large phase space of possible molecular combinations. X-ray scattering patterns obtained via high throughput SAXS can probe a combinatorial sample space and reveal the incorporation of MeGCCO into the micelles and the molecular associations between surfactant molecules. Such data make it possible to efficiently assess the effects of the new amphiphiles in the formulation. A specific finding of this study is that formulations containing comparatively monodisperse and homogeneous surfactant mixtures can be reliably tuned by addition of NaCl, which swells the surfactant micelles with a monotonic dependence on salt concentration. In contrast, the presence of multiple different surfactants destroys clear correlations with NaCl concentration, even in otherwise similar series of formulations.
NASA Astrophysics Data System (ADS)
Nikzad, Shouleh; Jewell, April D.; Hoenk, Michael E.; Jones, Todd J.; Hennessy, John; Goodsall, Tim; Carver, Alexander G.; Shapiro, Charles; Cheng, Samuel R.; Hamden, Erika T.; Kyne, Gillian; Martin, D. Christopher; Schiminovich, David; Scowen, Paul; France, Kevin; McCandliss, Stephan; Lupu, Roxana E.
2017-07-01
Exciting concepts are under development for flagship, probe class, explorer class, and suborbital class NASA missions in the ultraviolet/optical spectral range. These missions will depend on high-performance silicon detector arrays being delivered affordably and in high numbers. To that end, we have advanced delta-doping technology to high-throughput and high-yield wafer-scale processing, encompassing a multitude of state-of-the-art silicon-based detector formats and designs. We have embarked on a number of field observations, instrument integrations, and independent evaluations of delta-doped arrays. We present recent data and innovations from JPL's Advanced Detectors and Systems Program, including two-dimensional doping technology, JPL's end-to-end postfabrication processing of high-performance UV/optical/NIR arrays and advanced coatings for detectors. While this paper is primarily intended to provide an overview of past work, developments are identified and discussed throughout. Additionally, we present examples of past, in-progress, and planned observations and deployments of delta-doped arrays.
He, Guo-qing; Liu, Tong-jie; Sadiq, Faizan A.; Gu, Jing-si; Zhang, Guo-hua
2017-01-01
Chinese traditional fermented foods have a very long history dating back thousands of years and have become an indispensable part of Chinese dietary culture. A plethora of research has been conducted to unravel the composition and dynamics of microbial consortia associated with Chinese traditional fermented foods using culture-dependent as well as culture-independent methods, like different high-throughput sequencing (HTS) techniques. These HTS techniques enable us to understand the relationship between a food product and its microbes to a greater extent than ever before. Considering the importance of Chinese traditional fermented products, the objective of this paper is to review the diversity and dynamics of microbiota in Chinese traditional fermented foods revealed by HTS approaches. PMID:28378567
Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.
Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli
2018-01-23
So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (i.e., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand.
Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic
Kerkhove, Dwight; Tian, Le; Munteanu, Adrian; De Poorter, Eli
2018-01-01
So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (i.e., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand. PMID:29360798
ToxCast: Using high throughput screening to identify profiles of biological activity
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...
Applications of high throughput screening to identify profiles of biological activity
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...
High-throughput combinatorial cell co-culture using microfluidics.
Tumarkin, Ethan; Tzadu, Lsan; Csaszar, Elizabeth; Seo, Minseok; Zhang, Hong; Lee, Anna; Peerani, Raheem; Purpura, Kelly; Zandstra, Peter W; Kumacheva, Eugenia
2011-06-01
Co-culture strategies are foundational in cell biology. These systems, which serve as mimics of in vivo tissue niches, are typically poorly defined in terms of cell ratios, local cues and supportive cell-cell interactions. In the stem cell niche, the ability to screen cell-cell interactions and identify local supportive microenvironments has a broad range of applications in transplantation, tissue engineering and wound healing. We present a microfluidic platform for the high-throughput generation of hydrogel microbeads for cell co-culture. Encapsulation of different cell populations in microgels was achieved by introducing in a microfluidic device two streams of distinct cell suspensions, emulsifying the mixed suspension, and gelling the precursor droplets. The cellular composition in the microgels was controlled by varying the volumetric flow rates of the corresponding streams. We demonstrate one of the applications of the microfluidic method by co-encapsulating factor-dependent and responsive blood progenitor cell lines (MBA2 and M07e cells, respectively) at varying ratios, and show that in-bead paracrine secretion can modulate the viability of the factor dependent cells. Furthermore, we show the application of the method as a tool to screen the impact of specific growth factors on a primary human heterogeneous cell population. Co-encapsulation of IL-3 secreting MBA2 cells with umbilical cord blood cells revealed differential sub-population responsiveness to paracrine signals (CD14+ cells were particularly responsive to locally delivered IL-3). This microfluidic co-culture platform should enable high throughput screening of cell co-culture conditions, leading to new strategies to manipulate cell fate. This journal is © The Royal Society of Chemistry 2011
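Assuming the gel beads load cells approximately according to Poisson statistics (a common assumption for droplet encapsulation, not a result stated in the abstract), the expected co-encapsulation fractions for two populations can be sketched as follows; the mean occupancies would be set in practice by the cell densities and relative flow rates of the two streams.

```python
from math import exp

def coencapsulation_fractions(lambda_a, lambda_b):
    """Poisson loading sketch: with mean occupancies lambda_a and lambda_b
    per bead for the two cell populations, estimate the fraction of beads
    that are empty, contain only one population, or contain both."""
    p_a0 = exp(-lambda_a)   # P(no A cells in a bead)
    p_b0 = exp(-lambda_b)   # P(no B cells in a bead)
    return {
        "empty": p_a0 * p_b0,
        "only_A": (1 - p_a0) * p_b0,
        "only_B": p_a0 * (1 - p_b0),
        "co-encapsulated": (1 - p_a0) * (1 - p_b0),
    }

print(coencapsulation_fractions(lambda_a=0.5, lambda_b=0.5))
```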
Winpenny, David; Clark, Mellissa
2016-01-01
Background and Purpose Biased GPCR ligands are able to engage with their target receptor in a manner that preferentially activates distinct downstream signalling and offers potential for next generation therapeutics. However, accurate quantification of ligand bias in vitro is complex, and current best practice is not amenable to testing large numbers of compounds. We have therefore sought to apply ligand bias theory to an industrial scale screening campaign for the identification of new biased μ receptor agonists. Experimental Approach μ receptor assays with appropriate dynamic range were developed for both Gαi-dependent signalling and β-arrestin2 recruitment. Δlog(Emax/EC50) analysis was validated as an alternative to the operational model of agonism in calculating pathway bias towards Gαi-dependent signalling. The analysis was applied to a high throughput screen to characterize the prevalence and nature of pathway bias among a diverse set of compounds with μ receptor agonist activity. Key Results A high throughput screening campaign yielded 440 hits with greater than 10-fold bias relative to DAMGO. To validate these results, we quantified pathway bias of a subset of hits using the operational model of agonism. The high degree of correlation across these biased hits confirmed that Δlog(Emax/EC50) was a suitable method for identifying genuine biased ligands within a large collection of diverse compounds. Conclusions and Implications This work demonstrates that using Δlog(Emax/EC50), drug discovery can apply the concept of biased ligand quantification on a large scale and accelerate the deliberate discovery of novel therapeutics acting via this complex pharmacology. PMID:26791140
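A sketch of the Δlog(Emax/EC50) arithmetic: each ligand's activity in a pathway is expressed relative to the reference agonist, and the difference between pathways (ΔΔlog) gives a fold-bias. The Emax and EC50 values below are purely illustrative.

```python
import math

def delta_log_emax_ec50(emax_test, ec50_test, emax_ref, ec50_ref):
    """Pathway activity of a test ligand relative to a reference agonist
    (e.g., DAMGO): log10(Emax/EC50)_test - log10(Emax/EC50)_ref."""
    return math.log10(emax_test / ec50_test) - math.log10(emax_ref / ec50_ref)

def bias_factor(test_g, ref_g, test_arr, ref_arr):
    """DeltaDelta-log(Emax/EC50) between the G-protein and beta-arrestin2
    pathways; 10**DeltaDelta-log is the fold-bias."""
    ddlog = (delta_log_emax_ec50(*test_g, *ref_g)
             - delta_log_emax_ec50(*test_arr, *ref_arr))
    return ddlog, 10 ** ddlog

# (Emax %, EC50 nM) pairs for the test and reference ligand in each pathway.
ddlog, fold = bias_factor(test_g=(95, 12), ref_g=(100, 8),
                          test_arr=(40, 300), ref_arr=(100, 25))
print(f"DDlog = {ddlog:.2f}, ~{fold:.0f}-fold G-protein bias")
```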
Karmaus, Agnes L; Toole, Colleen M; Filer, Dayne L; Lewis, Kenneth C; Martin, Matthew T
2016-04-01
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2060 chemical samples on steroidogenesis via high-performance liquid chromatography followed by tandem mass spectrometry quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a 3 stage screening strategy. The first stage established the maximum tolerated concentration (MTC; ≥ 70% viability) per sample. The second stage quantified changes in hormone levels at the MTC whereas the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were prestimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2060 chemical samples evaluated, 524 samples were selected for 6-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into 5 distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A distinct pattern was observed between imidazole and triazole fungicides suggesting potentially distinct mechanisms of action. From a chemical testing and prioritization perspective, this assay platform provides a robust model for high-throughput screening of chemicals for effects on steroidogenesis. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
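A sketch of the curve-fitting step behind such concentration-response screening: fit a four-parameter Hill (logistic) model to a 6-point series and report the potency estimate. The data points and starting guesses are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    """Four-parameter logistic for a decreasing concentration-response
    series (e.g., hormone fold-change vs. chemical concentration)."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

# Hypothetical 6-point CR data: fold-change in 17beta-estradiol vs. uM chemical.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
response = np.array([1.02, 0.97, 0.85, 0.55, 0.30, 0.22])
params, _ = curve_fit(hill, conc, response, p0=[0.2, 1.0, 2.0, 1.0], maxfev=5000)
bottom, top, ic50, slope = params
print(f"IC50 ~ {ic50:.2f} uM, Hill slope ~ {slope:.2f}")
```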
Toole, Colleen M.; Filer, Dayne L.; Lewis, Kenneth C.; Martin, Matthew T.
2016-01-01
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2060 chemical samples on steroidogenesis via high-performance liquid chromatography followed by tandem mass spectrometry quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a 3 stage screening strategy. The first stage established the maximum tolerated concentration (MTC; ≥ 70% viability) per sample. The second stage quantified changes in hormone levels at the MTC whereas the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were prestimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2060 chemical samples evaluated, 524 samples were selected for 6-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into 5 distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A distinct pattern was observed between imidazole and triazole fungicides suggesting potentially distinct mechanisms of action. From a chemical testing and prioritization perspective, this assay platform provides a robust model for high-throughput screening of chemicals for effects on steroidogenesis. PMID:26781511
Suzuki, Miho; Sakata, Ichiro; Sakai, Takafumi; Tomioka, Hiroaki; Nishigaki, Koichi; Tramier, Marc; Coppey-Moisan, Maïté
2015-12-15
Cytometry is a versatile and powerful method applicable to different fields, particularly pharmacology and biomedical studies. Based on the data obtained, cytometric studies are classified into high-throughput (HTP) or high-content screening (HCS) groups. However, assays combining the advantages of both are required to facilitate research. In this study, we developed a high-throughput system to profile cellular populations in terms of time- or dose-dependent responses to apoptotic stimuli, because apoptotic inducers are potent anticancer drugs. We previously established assay systems involving proteases to monitor live cells for apoptosis using tunable fluorescence resonance energy transfer (FRET)-based bioprobes. These assays can be used for microscopic analyses or fluorescence-activated cell sorting. In this study, we developed FRET-based bioprobes to detect the activity of the apoptotic markers caspase-3 and caspase-9 via changes in bioprobe fluorescence lifetimes, using a flow cytometer for direct estimation of FRET efficiencies. Different patterns of changes in the fluorescence lifetimes of these markers during apoptosis were observed, indicating a relationship between discrete steps in the apoptosis process. The findings demonstrate the feasibility of evaluating collective cellular dynamics during apoptosis. Copyright © 2015 Elsevier Inc. All rights reserved.
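The lifetime-to-FRET conversion underlying this readout is compact enough to sketch directly: efficiency follows from the donor lifetime with and without the acceptor, E = 1 - τ_DA/τ_D. The lifetimes below are illustrative assumptions.

```python
def fret_efficiency_from_lifetimes(tau_donor_ns, tau_donor_acceptor_ns):
    """FRET efficiency from donor fluorescence lifetimes: E = 1 - tau_DA/tau_D.
    When a caspase cleaves the linker of a FRET-based probe, energy transfer
    is lost, tau_DA recovers toward tau_D, and E falls toward zero."""
    return 1.0 - tau_donor_acceptor_ns / tau_donor_ns

# Illustrative lifetimes (ns): intact probe vs. probe cleaved by caspase-3.
print(fret_efficiency_from_lifetimes(2.5, 1.6))  # intact  -> E ~ 0.36
print(fret_efficiency_from_lifetimes(2.5, 2.4))  # cleaved -> E ~ 0.04
```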
Microfluidic Imaging Flow Cytometry by Asymmetric-detection Time-stretch Optical Microscopy (ATOM).
Tang, Anson H L; Lai, Queenie T K; Chung, Bob M F; Lee, Kelvin C M; Mok, Aaron T Y; Yip, G K; Shum, Anderson H C; Wong, Kenneth K Y; Tsia, Kevin K
2017-06-28
Scaling the number of measurable parameters, which allows for multidimensional data analysis and thus higher-confidence statistical results, has been the main trend in the advanced development of flow cytometry. Notably, adding high-resolution imaging capabilities allows for the complex morphological analysis of cellular/sub-cellular structures. This is not possible with standard flow cytometers. However, it is valuable for advancing our knowledge of cellular functions and can benefit life science research, clinical diagnostics, and environmental monitoring. Incorporating imaging capabilities into flow cytometry compromises the assay throughput, primarily due to the limitations on speed and sensitivity in the camera technologies. To overcome this speed or throughput challenge facing imaging flow cytometry while preserving the image quality, asymmetric-detection time-stretch optical microscopy (ATOM) has been demonstrated to enable high-contrast, single-cell imaging with sub-cellular resolution, at an imaging throughput as high as 100,000 cells/s. Based on the imaging concept of conventional time-stretch imaging, which relies on all-optical image encoding and retrieval through the use of ultrafast broadband laser pulses, ATOM further advances imaging performance by enhancing the image contrast of unlabeled/unstained cells. This is achieved by accessing the phase-gradient information of the cells, which is spectrally encoded into single-shot broadband pulses. Hence, ATOM is particularly advantageous in high-throughput measurements of single-cell morphology and texture - information indicative of cell types, states, and even functions. Ultimately, this could become a powerful imaging flow cytometry platform for the biophysical phenotyping of cells, complementing the current state-of-the-art biochemical-marker-based cellular assay. This work describes a protocol to establish the key modules of an ATOM system (from optical frontend to data processing and visualization backend), as well as the workflow of imaging flow cytometry based on ATOM, using human cells and micro-algae as the examples.
Genecentric: a package to uncover graph-theoretic structure in high-throughput epistasis data.
Gallant, Andrew; Leiserson, Mark D M; Kachalov, Maxim; Cowen, Lenore J; Hescott, Benjamin J
2013-01-18
New technology has resulted in high-throughput screens for pairwise genetic interactions in yeast and other model organisms. For each pair in a collection of non-essential genes, an epistasis score is obtained, representing how much sicker (or healthier) the double-knockout organism will be compared to what would be expected from the sickness of the component single knockouts. Recent algorithmic work has identified graph-theoretic patterns in this data that can indicate functional modules, and even sets of genes that may occur in compensatory pathways, such as a BPM-type schema first introduced by Kelley and Ideker. However, to date, any algorithms for finding such patterns in the data were implemented internally, with no software being made publicly available. Genecentric is a new package that implements a parallelized version of the Leiserson et al. algorithm (J Comput Biol 18:1399-1409, 2011) for generating generalized BPMs from high-throughput genetic interaction data. Given a matrix of weighted epistasis values for a set of double knock-outs, Genecentric returns a list of generalized BPMs that may represent compensatory pathways. Genecentric also has an extension, GenecentricGO, to query FuncAssociate (Bioinformatics 25:3043-3044, 2009) to retrieve GO enrichment statistics on generated BPMs. Python is the only dependency, and our web site provides working examples and documentation. We find that Genecentric can be used to find coherent functional and perhaps compensatory gene sets from high throughput genetic interaction data. Genecentric is made freely available for download under the GPLv2 from http://bcb.cs.tufts.edu/genecentric.
USDA-ARS's Scientific Manuscript database
The amount of visible and near infrared light reflected by plants varies depending on their health. In this study, multispectral images were acquired by quadcopter for detecting tomato spot wilt virus amongst twenty genetic varieties of peanuts. The plants were visually assessed to acquire ground ...
Baseline Survey of Root-Associated Microbes of Taxus chinensis (Pilger) Rehd
Sun, Guiling; Wilson, Iain W.; Wu, Jianqiang; Hoffman, Angela; Cheng, Junwen; Qiu, Deyou
2015-01-01
Taxol (paclitaxel), a diterpenoid, is one of the most effective anticancer drugs identified. Biosynthesis of taxol was considered restricted to the Taxus genera until Stierle et al. discovered that an endophytic fungus isolated from Taxus brevifolia could independently synthesize taxol. Little is known about the mechanism of taxol biosynthesis in microbes, but it has been speculated that its biosynthesis may differ from plants. The microbiome from the roots of Taxus chinensis has been extensively investigated with culture-dependent methods to identify taxol-synthesizing microbes, but not using culture-independent methods. Using bar-coded high-throughput sequencing in combination with a metagenomics approach, we surveyed the microbial diversity and gene composition of the root-associated microbiome from Taxus chinensis (Pilger) Rehd. High-throughput amplicon sequencing revealed 187 fungal OTUs, more than previously reported using culture-dependent methods, suggesting that T. chinensis roots harbor novel and diverse fungi. Some operational taxonomic units (OTUs) identified were identical to reported microbe strains possessing the ability to synthesize taxol, and several genes previously associated with taxol biosynthesis were identified through metagenomics analysis. PMID:25821956
Lee, Chankyun; Cao, Xiaoyuan; Yoshikane, Noboru; Tsuritani, Takehiro; Rhee, June-Koo Kevin
2015-10-19
The feasibility of software-defined optical networking (SDON) for a practical application critically depends on the scalability of centralized control performance. In this paper, highly scalable routing and wavelength assignment (RWA) algorithms are investigated on an OpenFlow-based SDON testbed for proof-of-concept demonstration. Efficient RWA algorithms are proposed to achieve high network capacity with reduced computation cost, a significant attribute of a scalable, centrally controlled SDON. The proposed heuristic RWA algorithms differ in the order in which requests are processed and in the procedures for routing table updates. Combined with a shortest-path-based routing algorithm, a hottest-request-first processing policy that considers demand intensity and end-to-end distance information offers both the highest network throughput and acceptable computational scalability. We further investigate the trade-off between network throughput and the computational complexity of the routing table update procedure in a simulation study.
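The paper's exact RWA heuristics are not reproduced here, but the sketch below illustrates the general flavor under stated assumptions: requests are ordered "hottest first" (here approximated by shortest-path length), routed over a shortest path, and assigned the first wavelength that is free on every link. The function names, the toy topology, and the ordering proxy are illustrative.

```python
from collections import deque

def shortest_path(adj, src, dst):
    """Breadth-first shortest (hop-count) path on an adjacency dict {u: [v, ...]}."""
    prev, q = {src: None}, deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj.get(u, []):
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None

def rwa_hottest_first(adj, requests, n_wavelengths):
    """Hottest-request-first RWA sketch: longer-path requests are served first,
    and each gets the first wavelength free on every link of its path."""
    link_used = set()                                 # (u, v, wavelength) triples
    paths = {r: shortest_path(adj, *r) for r in requests}
    ordered = sorted(requests, key=lambda r: len(paths[r] or []), reverse=True)
    assigned, blocked = {}, []
    for req in ordered:
        path = paths[req]
        if path is None:
            blocked.append(req)
            continue
        links = list(zip(path, path[1:]))
        for w in range(n_wavelengths):                # first-fit wavelength assignment
            if all((u, v, w) not in link_used and (v, u, w) not in link_used
                   for u, v in links):
                link_used.update((u, v, w) for u, v in links)
                assigned[req] = (path, w)
                break
        else:
            blocked.append(req)
    return assigned, blocked

if __name__ == "__main__":
    net = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
    demands = [("A", "D"), ("A", "C"), ("B", "D")]
    ok, blk = rwa_hottest_first(net, demands, n_wavelengths=2)
    print(ok, blk)
```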
ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...
In 2007, EPA launched ToxCast™ in order to develop a cost-effective approach for prioritizing the toxicity testing of large numbers of chemicals in a short period of time. Using data from state-of-the-art high throughput screening (HTS) bioassays developed in the pharmaceutical i...
40 CFR 65.166 - Periodic reports.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., including a halogen reduction device for a low-throughput transfer rack, is used to control emissions from storage vessels or low-throughput transfer racks, the periodic report shall identify and state the cause...-throughput transfer racks, periodic reports shall include the following information: (1) Periodic reports...
Adaptive Packet Combining Scheme in Three State Channel Model
NASA Astrophysics Data System (ADS)
Saring, Yang; Bulo, Yaka; Bhunia, Chandan Tilak
2018-01-01
The two popular packet-combining-based error correction schemes are the Packet Combining (PC) scheme and the Aggressive Packet Combining (APC) scheme. The PC and APC schemes have their own merits and demerits; the PC scheme has better throughput than the APC scheme, but suffers from a higher packet error rate. The wireless channel state changes continuously. Because of this random, time-varying nature of the wireless channel, individual application of the SR ARQ, PC, or APC scheme cannot deliver the desired throughput. Better throughput can be achieved if the appropriate transmission scheme is used based on the condition of the channel. Based on this approach, an adaptive packet combining scheme has been proposed to achieve better throughput. The proposed scheme adapts to the channel condition, carrying out transmission using the PC, APC, or SR ARQ scheme to achieve better throughput. Experimentally, it was observed that the error correction capability and throughput of the proposed scheme were significantly better than those of the SR ARQ, PC, and APC schemes.
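A minimal sketch of the adaptive idea, assuming illustrative packet-error-rate thresholds (the abstract does not specify actual values), is given below: the transmitter switches between SR ARQ, PC, and APC depending on the estimated channel state.

```python
def choose_scheme(packet_error_rate, low=0.01, high=0.10):
    """Pick a retransmission/combining scheme from the estimated channel state.

    Thresholds are illustrative only: plain SR ARQ when the channel is good,
    Packet Combining (PC) for moderate error rates, and Aggressive Packet
    Combining (APC) when the channel is bad.
    """
    if packet_error_rate < low:
        return "SR-ARQ"
    if packet_error_rate < high:
        return "PC"
    return "APC"

def run(per_trace):
    """Walk through a trace of estimated packet error rates and report the
    scheme the adaptive transmitter would use in each slot."""
    return [(per, choose_scheme(per)) for per in per_trace]

if __name__ == "__main__":
    print(run([0.001, 0.05, 0.2, 0.008]))
```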
Zhang, Douglas; Lee, Junmin; Kilian, Kristopher A
2017-10-01
Cells in tissue receive a host of soluble and insoluble signals in a context-dependent fashion, where integration of these cues through a complex network of signal transduction cascades will define a particular outcome. Biomaterials scientists and engineers are tasked with designing materials that can at least partially recreate this complex signaling milieu towards new materials for biomedical applications. In this progress report, recent advances in high throughput techniques and high content imaging approaches that are facilitating the discovery of efficacious biomaterials are described. From microarrays of synthetic polymers, peptides and full-length proteins, to designer cell culture systems that present multiple biophysical and biochemical cues in tandem, it is discussed how the integration of combinatorics with high content imaging and analysis is essential to extracting biologically meaningful information from large scale cellular screens to inform the design of next generation biomaterials. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Awan, Muaaz Gul; Saeed, Fahad
2016-05-15
Modern proteomics studies utilize high-throughput mass spectrometers which can produce data at an astonishing rate. These big mass spectrometry (MS) datasets can easily reach peta-scale, creating storage and analytic problems for large-scale systems biology studies. Each spectrum consists of thousands of peaks which have to be processed to deduce the peptide. However, only a small percentage of peaks in a spectrum are useful for peptide deduction, as most of the peaks are either noise or not useful for a given spectrum. This redundant processing of non-useful peaks is a bottleneck for streaming high-throughput processing of big MS data. One way to reduce the amount of computation required in a high-throughput environment is to eliminate non-useful peaks. Existing noise-removal algorithms are limited in their data-reduction capability and are compute intensive, making them unsuitable for big data and high-throughput environments. In this paper we introduce a novel low-complexity technique based on classification, quantization and sampling of MS peaks, and present a data-reductive strategy for the analysis of big MS data. Our algorithm, called MS-REDUCE, is capable of eliminating noisy peaks as well as peaks that do not contribute to peptide deduction before any peptide deduction is attempted. Our experiments have shown up to 100× speed-up over existing state-of-the-art noise elimination algorithms while maintaining comparably high-quality matches. Using our approach we were able to process a million spectra in just under an hour on a moderate server. The developed tool and strategy have been made available to the wider proteomics and parallel computing community; the code can be found at https://github.com/pcdslab/MSREDUCE. Contact: fahad.saeed@wmich.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
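MS-REDUCE's actual classification, quantization, and sampling steps are more refined; the toy sketch below only illustrates the general strategy of ranking peaks by intensity, splitting them into quantile classes, and keeping a class-dependent fraction. The class count and keep fractions are invented for illustration.

```python
import random

def reduce_spectrum(peaks, n_classes=4, keep_fraction=(1.0, 0.5, 0.25, 0.1), seed=0):
    """Toy peak reduction by intensity quantization and per-class sampling.

    `peaks` is a list of (mz, intensity) pairs. Peaks are ranked by intensity,
    split into `n_classes` classes, and a class-dependent fraction is kept
    (all of the strongest class, few of the weakest).
    """
    rng = random.Random(seed)
    ranked = sorted(peaks, key=lambda p: p[1], reverse=True)
    per_class = max(1, len(ranked) // n_classes)
    kept = []
    for c in range(n_classes):
        chunk = (ranked[c * per_class:(c + 1) * per_class]
                 if c < n_classes - 1 else ranked[(n_classes - 1) * per_class:])
        k = max(1, int(round(keep_fraction[c] * len(chunk)))) if chunk else 0
        kept.extend(rng.sample(chunk, k))
    return sorted(kept)   # back in m/z order

if __name__ == "__main__":
    spectrum = [(100 + i, inten) for i, inten in enumerate([5, 80, 3, 950, 40, 2, 300, 7])]
    print(reduce_spectrum(spectrum))
```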
Nebula: reconstruction and visualization of scattering data in reciprocal space.
Reiten, Andreas; Chernyshov, Dmitry; Mathiesen, Ragnvald H
2015-04-01
Two-dimensional solid-state X-ray detectors can now operate at considerable data throughput rates that allow full three-dimensional sampling of scattering data from extended volumes of reciprocal space within second-to-minute timescales. For such experiments, simultaneous analysis and visualization allows for remeasurements and a more dynamic measurement strategy. A new software package, Nebula, is presented. It efficiently reconstructs X-ray scattering data, generates three-dimensional reciprocal space data sets that can be visualized interactively, and aims to enable real-time processing in high-throughput measurements by employing parallel computing on commodity hardware.
Wang, Xiao; Gu, Jinghua; Hilakivi-Clarke, Leena; Clarke, Robert; Xuan, Jianhua
2017-01-15
The advent of high-throughput DNA methylation profiling techniques has enabled the possibility of accurate identification of differentially methylated genes for cancer research. The large number of measured loci facilitates whole-genome methylation study, yet poses great challenges for differential methylation detection due to the high variability in tumor samples. We have developed a novel probabilistic approach, Differential Methylation detection using a hierarchical Bayesian model exploiting Local Dependency (DM-BLD), to detect differentially methylated genes based on a Bayesian framework. The DM-BLD approach features a joint model to capture both the local dependency of measured loci and the dependency of methylation change in samples. Specifically, the local dependency is modeled by a Leroux conditional autoregressive structure; the dependency of methylation changes is modeled by a discrete Markov random field. A hierarchical Bayesian model is developed to fully take into account the local dependency for differential analysis, in which differential states are embedded as hidden variables. Simulation studies demonstrate that DM-BLD outperforms existing methods for differential methylation detection, particularly when the methylation change is moderate and the variability of methylation in samples is high. DM-BLD has been applied to breast cancer data to identify important methylated genes (such as polycomb target genes and genes involved in transcription factor activity) associated with breast cancer recurrence. A Matlab package of DM-BLD is available at http://www.cbil.ece.vt.edu/software.htm. Contact: Xuan@vt.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Precise, High-throughput Analysis of Bacterial Growth.
Kurokawa, Masaomi; Ying, Bei-Wen
2017-09-19
Bacterial growth is a central concept in the development of modern microbial physiology, as well as in the investigation of cellular dynamics at the systems level. Recent studies have reported correlations between bacterial growth and genome-wide events, such as genome reduction and transcriptome reorganization. Correctly analyzing bacterial growth is crucial for understanding the growth-dependent coordination of gene functions and cellular components. Accordingly, the precise quantitative evaluation of bacterial growth in a high-throughput manner is required. Emerging technological developments offer new experimental tools that allow updates of the methods used for studying bacterial growth. The protocol introduced here employs a microplate reader with a highly optimized experimental procedure for the reproducible and precise evaluation of bacterial growth. This protocol was used to evaluate the growth of several previously described Escherichia coli strains. The main steps of the protocol are as follows: the preparation of a large number of cell stocks in small vials for repeated tests with reproducible results, the use of 96-well plates for high-throughput growth evaluation, and the manual calculation of two major parameters (i.e., maximal growth rate and population density) representing the growth dynamics. In comparison to the traditional colony-forming unit (CFU) assay, which counts the cells that are cultured in glass tubes over time on agar plates, the present method is more efficient and provides more detailed temporal records of growth changes, but has a stricter detection limit at low population densities. In summary, the described method is advantageous for the precise and reproducible high-throughput analysis of bacterial growth, which can be used to draw conceptual conclusions or to make theoretical observations.
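The protocol computes the maximal growth rate and maximal population density manually; a minimal sketch of one common way to do this computationally (the steepest sliding-window slope of ln(OD), with an assumed window size and blank value) is shown below. It is not the authors' exact calculation.

```python
import math

def growth_parameters(times, ods, window=3, blank=0.0):
    """Estimate the maximal specific growth rate and maximal density from an
    OD600 time series (times in hours). The growth rate is taken as the
    steepest slope of ln(OD) over a sliding window of `window` points."""
    xs = [(t, math.log(od - blank)) for t, od in zip(times, ods) if od - blank > 0]
    best = 0.0
    for i in range(len(xs) - window + 1):
        seg = xs[i:i + window]
        n = len(seg)
        sx = sum(t for t, _ in seg); sy = sum(y for _, y in seg)
        sxx = sum(t * t for t, _ in seg); sxy = sum(t * y for t, y in seg)
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # least-squares slope
        best = max(best, slope)
    return best, max(ods) - blank    # (mu_max per hour, maximal density)

if __name__ == "__main__":
    t = [0, 1, 2, 3, 4, 5, 6, 7, 8]
    od = [0.02, 0.03, 0.05, 0.09, 0.17, 0.30, 0.45, 0.55, 0.58]
    print(growth_parameters(t, od))
```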
Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli
2013-01-01
M/G/C/C state-dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern Discrete Event Simulation (DES) software. We designed an approach to cater for this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time, and the expected number of entities in a complex network topology. Results indicated that there is a range of arrival rates for each network where the simulation results fluctuate drastically across replications, which causes the simulation results and analytical results to exhibit discrepancies. Detailed results showing how closely the simulation results tally with the analytical results, in both abstract and graphical forms, together with scientific justifications, have been documented and discussed. PMID:23560037
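Arena is a commercial DES package, but the state-dependent idea can be illustrated with a short, self-contained simulation sketch. The version below is a simplification (exponential service, so closer to an M/M(n)/C/C model than M/G/C/C) with an invented walking-speed-style slowdown function; it reports throughput and blocking probability for a single run.

```python
import random

def simulate_state_dependent(arrival_rate, capacity, rate_of, horizon=10_000.0, seed=1):
    """Toy simulation of a state-dependent, capacity-C loss system: each of the
    n occupants completes service at rate `rate_of(n)`, and arrivals finding
    n == capacity are blocked."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    served = blocked = arrived = 0
    while t < horizon:
        lam = arrival_rate
        mu_total = n * rate_of(n) if n else 0.0
        total = lam + mu_total
        t += rng.expovariate(total)
        if rng.random() < lam / total:        # next event is an arrival
            arrived += 1
            if n < capacity:
                n += 1
            else:
                blocked += 1
        else:                                 # next event is a service completion
            n -= 1
            served += 1
    return served / horizon, blocked / max(arrived, 1)

if __name__ == "__main__":
    # walking-speed style slowdown: per-entity rate falls as the corridor fills
    rate = lambda n: 1.0 * (1.0 - (n - 1) / 40.0)
    thr, p_block = simulate_state_dependent(arrival_rate=5.0, capacity=40, rate_of=rate)
    print(f"throughput = {thr:.2f} per time unit, blocking probability = {p_block:.3f}")
```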
Throughput Calibration of the 52x0.2E1 Aperture
NASA Astrophysics Data System (ADS)
Heap, Sara
2009-07-01
The Next Generation Spectral Library (NGSL) is a library of low-dispersion STIS spectra extending from 0.2-1.0 microns. So far, 378 stars with a wide range in metallicity have been observed. Despite their high S/N>100, many NGSL spectra have 5-10% systematic errors in their spectral energy distributions, which can be traced to throughput variations in the 52x0.2E1 aperture caused by vignetting of a wavelength-dependent asymmetric PSF. We propose to obtain STIS spectra of the HST standard star, BD+75D325, at several positions in the 52x0.2E1 aperture, which will enable us to calibrate the NGSL spectra properly.
NASA Technical Reports Server (NTRS)
Lee, Shihyan; Meister, Gerhard
2017-01-01
Since the launch of the Moderate Resolution Imaging Spectroradiometer (MODIS) on Aqua in 2002, the radiometric system gains of the reflective solar bands have been degrading, indicating changes in the system's optical throughput. To estimate the optical throughput degradation, the electronic gain changes were estimated and removed from the measured system gain. The derived optical throughput degradation shows a rate that is much faster at shorter wavelengths than at longer wavelengths. The wavelength-dependent optical throughput degradation modulated the relative spectral response (RSR) of the bands. In addition, the optical degradation is also scan-angle-dependent due to large changes in response versus scan angle over time. We estimated the modulated RSR as a function of time and scan angle and its impacts on sensor radiometric calibration for ocean science. Our results show that the calibration bias could be up to 1.8% for band 8 (412 nm) due to its larger out-of-band response. For the other ocean bands, the calibration biases are at least an order of magnitude smaller.
Zhou, Yangbo; Fox, Daniel S; Maguire, Pierce; O’Connell, Robert; Masters, Robert; Rodenburg, Cornelia; Wu, Hanchun; Dapor, Maurizio; Chen, Ying; Zhang, Hongzhou
2016-01-01
Two-dimensional (2D) materials usually have a layer-dependent work function, which requires fast and accurate detection for the evaluation of their device performance. A detection technique with high throughput and high spatial resolution has not yet been explored. Using a scanning electron microscope, we have developed and implemented a quantitative analytical technique which allows effective extraction of the work function of graphene. This technique uses the secondary electron contrast and has nanometre-resolved layer information. The measurement of few-layer graphene flakes shows the variation of work function between graphene layers with a precision of less than 10 meV. It is expected that this technique will prove extremely useful for researchers in a broad range of fields due to its revolutionary throughput and accuracy. PMID:26878907
You, Zhu-Hong; Li, Shuai; Gao, Xin; Luo, Xin; Ji, Zhen
2014-01-01
Protein-protein interactions (PPIs) are the basis of biological functions, and studying these interactions on a molecular level is of crucial importance for understanding the functionality of a living cell. During the past decade, biosensors have emerged as an important tool for the high-throughput identification of proteins and their interactions. However, high-throughput experimental methods for identifying PPIs are both time-consuming and expensive, and high-throughput PPI data are often associated with high false-positive and high false-negative rates. To address these problems, we propose a method for PPI detection by integrating biosensor-based PPI data with a novel computational model. This method was developed based on the extreme learning machine algorithm combined with a novel representation of protein sequence descriptors. When performed on a large-scale human protein interaction dataset, the proposed method achieved 84.8% prediction accuracy with 84.08% sensitivity at a specificity of 85.53%. We conducted more extensive experiments to compare the proposed method with a state-of-the-art technique, the support vector machine. The achieved results demonstrate that our approach is very promising for detecting new PPIs, and it can be a helpful supplement to biosensor-based PPI detection.
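The authors' descriptor and extreme learning machine are not reproduced here; the sketch below only illustrates the overall pipeline under stated assumptions: a simple amino-acid-composition descriptor for each protein pair and a deliberately simple stand-in classifier (nearest centroid), into which any binary classifier could be slotted.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """20-dimensional amino acid composition descriptor of one protein."""
    counts = np.array([seq.count(a) for a in AA], dtype=float)
    return counts / max(len(seq), 1)

def pair_descriptor(seq_a, seq_b):
    """Concatenate the two single-protein descriptors into a 40-d pair vector."""
    return np.concatenate([composition(seq_a), composition(seq_b)])

class NearestCentroid:
    """A deliberately simple stand-in for the learning step (the paper uses an
    extreme learning machine; any binary classifier can replace this class)."""
    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
        return self
    def predict(self, X):
        X = np.asarray(X)
        labels = list(self.centroids_)
        d = np.stack([np.linalg.norm(X - self.centroids_[c], axis=1) for c in labels])
        return np.array(labels)[d.argmin(axis=0)]

if __name__ == "__main__":
    pos = [("MKV" * 30, "ACDE" * 20)]      # toy "interacting" pair
    neg = [("GGGG" * 25, "PPLL" * 25)]     # toy "non-interacting" pair
    X = [pair_descriptor(a, b) for a, b in pos + neg]
    y = [1, 0]
    clf = NearestCentroid().fit(X, y)
    print(clf.predict([pair_descriptor("MKVACDE" * 15, "ACDEMKV" * 15)]))
```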
High Throughput PBTK: Open-Source Data and Tools for ...
Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.
Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela
2014-09-25
Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidineacetic acid, orotic acid, uracil, creatinine and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy varied by <15%. Stability during storage at different temperatures was confirmed for three weeks. The limits of detection and quantification for each biomarker varied from 0.3 to 6.3 μmol/l and from 1.0 to 20.9 μmol/l, respectively. Analyses of urine specimens from affected patients revealed abnormal results. Targeted biomarkers in urine were detected in the first weeks of life. This rapid, simple and robust liquid chromatography/tandem mass spectrometry methodology is an efficient tool applicable to urine screening for inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
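Limits of detection and quantification of the kind quoted above are commonly estimated from a calibration curve with the 3.3*sigma/S and 10*sigma/S convention (sigma = residual standard deviation of the fit, S = slope). The sketch below shows that calculation on invented calibration data and is not the authors' exact validation procedure.

```python
import numpy as np

def lod_loq(concentrations, responses):
    """Estimate limits of detection/quantification from a linear calibration
    curve using the common 3.3*sigma/S and 10*sigma/S rule."""
    x = np.asarray(concentrations, dtype=float)
    y = np.asarray(responses, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    sigma = residuals.std(ddof=2)          # two fitted parameters
    return 3.3 * sigma / slope, 10 * sigma / slope

if __name__ == "__main__":
    conc = [1, 2, 5, 10, 20, 50]           # hypothetical standards, umol/L
    resp = [1.1, 2.1, 5.3, 9.8, 20.5, 49.0]
    lod, loq = lod_loq(conc, resp)
    print(f"LOD ~ {lod:.2f} umol/L, LOQ ~ {loq:.2f} umol/L")
```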
Oono, Ryoko
2017-01-01
High-throughput sequencing technology has helped microbial community ecologists explore ecological and evolutionary patterns at unprecedented scales. The benefits of a large sample size still typically outweigh that of greater sequencing depths per sample for accurate estimations of ecological inferences. However, excluding or not sequencing rare taxa may mislead the answers to the questions 'how and why are communities different?' This study evaluates the confidence intervals of ecological inferences from high-throughput sequencing data of foliar fungal endophytes as case studies through a range of sampling efforts, sequencing depths, and taxonomic resolutions to understand how technical and analytical practices may affect our interpretations. Increasing sampling size reliably decreased confidence intervals across multiple community comparisons. However, the effects of sequencing depths on confidence intervals depended on how rare taxa influenced the dissimilarity estimates among communities and did not significantly decrease confidence intervals for all community comparisons. A comparison of simulated communities under random drift suggests that sequencing depths are important in estimating dissimilarities between microbial communities under neutral selective processes. Confidence interval analyses reveal important biases as well as biological trends in microbial community studies that otherwise may be ignored when communities are only compared for statistically significant differences.
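One simple way to obtain confidence intervals of the kind discussed is to bootstrap over samples; the sketch below does this for the mean Bray-Curtis dissimilarity between two groups of OTU abundance vectors. The resampling unit, iteration count, and toy OTU tables are assumptions for illustration, not the study's exact analysis.

```python
import random

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(a) + sum(b)
    return num / den if den else 0.0

def bootstrap_ci(samples_a, samples_b, n_boot=1000, alpha=0.05, seed=0):
    """Bootstrap a confidence interval for the mean between-group dissimilarity
    by resampling samples (not reads) with replacement."""
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        ra = [rng.choice(samples_a) for _ in samples_a]
        rb = [rng.choice(samples_b) for _ in samples_b]
        ds = [bray_curtis(x, y) for x in ra for y in rb]
        stats.append(sum(ds) / len(ds))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

if __name__ == "__main__":
    group1 = [[10, 0, 5, 1], [8, 1, 6, 0]]     # OTU tables for two samples
    group2 = [[0, 9, 1, 7], [1, 12, 0, 5]]
    print(bootstrap_ci(group1, group2))
```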
Modeling Steroidogenesis Disruption Using High-Throughput ...
Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the
Lu, Xin; Zhang, Xu-Xiang; Wang, Zhu; Huang, Kailong; Wang, Yuan; Liang, Weigang; Tan, Yunfei; Liu, Bo; Tang, Junying
2015-01-01
This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that the Arcobacter genus occupied over 43.42% of the total abundance of potential pathogens in the STP. At the species level, the potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumoniae dominated in raw sewage, which was also confirmed by quantitative real-time PCR. Illumina sequencing also revealed the prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency mainly depended on the oxidation ditch. Compared with sand filtration, magnetic resin seemed to achieve higher removals of most of the potential pathogens and virulence factors. However, the presence of residual A. butzleri in the final effluent still deserves more concern. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of the high-throughput sequencing technologies is considered a reliable method for a deep and comprehensive overview of environmental bacterial virulence. PMID:25938416
Reverse Toxicokinetics: From In Vitro Concentration to In Vivo Dose
This talk provided an update to an international audience about the state of the science to relate results from high-throughput bioactivity screening efforts out to an external exposure that would be required to achieve blood concentrations at which these bioactivities may be obs...
Identification and characterization of a new ampelovirus infecting cultivated and wild blackberries
USDA-ARS's Scientific Manuscript database
A novel ampelovirus from blackberry was identified recently in Mississippi and characterized in the framework of NIFA-funded Specialty Crop Research Initiative (SCRI) Project on viruses affecting blackberries in the southeastern United States. The virus sequence was obtained from high throughput se...
Modeling Reproductive Toxicity for Chemical Prioritization into an Integrated Testing Strategy
The EPA ToxCast research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I tested 309 well-characterized chemicals in over 500 assays of different molecular targets, cellular responses and cell-states. Of th...
Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.
Bühler, Jonas; von Lieres, Eric; Huber, Gregor J
2018-01-01
Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase of sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule, depending on the required statistical reliability of data acquired in future experiments.
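A rotating measurement schedule of the kind proposed can be generated in a few lines; the sketch below interleaves plants round-robin so that each plant's tracer time series contains regular gaps. The slot length and plant count are illustrative, and the real experimental design in the paper additionally optimizes schedules against model-based parameter uncertainties.

```python
def rotating_schedule(n_plants, n_cycles, slot_minutes=10):
    """Round-robin measurement plan for interrupted tracer time series.

    Each slot measures one plant, so consecutive measurements of the same plant
    are separated by (n_plants - 1) * slot_minutes during which the other plants
    are measured. Returns (start_minute, plant_id) tuples."""
    plan = []
    for cycle in range(n_cycles):
        for plant in range(n_plants):
            slot = cycle * n_plants + plant
            plan.append((slot * slot_minutes, plant))
    return plan

if __name__ == "__main__":
    for start, plant in rotating_schedule(n_plants=4, n_cycles=3):
        print(f"t = {start:3d} min -> plant {plant}")
    # With 10-minute slots and 4 plants, each plant is revisited every 40 min,
    # so several plants progress in parallel within the same measurement session.
```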
Herington, Jennifer L.; Swale, Daniel R.; Brown, Naoko; Shelton, Elaine L.; Choi, Hyehun; Williams, Charles H.; Hong, Charles C.; Paria, Bibhash C.; Denton, Jerod S.; Reese, Jeff
2015-01-01
The uterine myometrium (UT-myo) is a therapeutic target for preterm labor, labor induction, and postpartum hemorrhage. Stimulation of intracellular Ca2+-release in UT-myo cells by oxytocin is a final pathway controlling myometrial contractions. The goal of this study was to develop a dual-addition assay for high-throughput screening of small molecular compounds, which could regulate Ca2+-mobilization in UT-myo cells, and hence, myometrial contractions. Primary murine UT-myo cells in 384-well plates were loaded with a Ca2+-sensitive fluorescent probe, and then screened for inducers of Ca2+-mobilization and inhibitors of oxytocin-induced Ca2+-mobilization. The assay exhibited robust screening statistics (Z´ = 0.73), DMSO-tolerance, and was validated for high-throughput screening against 2,727 small molecules from the Spectrum, NIH Clinical I and II collections of well-annotated compounds. The screen revealed a hit-rate of 1.80% for agonist and 1.39% for antagonist compounds. Concentration-dependent responses of hit-compounds demonstrated an EC50 less than 10μM for 21 hit-antagonist compounds, compared to only 7 hit-agonist compounds. Subsequent studies focused on hit-antagonist compounds. Based on the percent inhibition and functional annotation analyses, we selected 4 confirmed hit-antagonist compounds (benzbromarone, dipyridamole, fenoterol hydrobromide and nisoldipine) for further analysis. Using an ex vivo isometric contractility assay, each compound significantly inhibited uterine contractility, at different potencies (IC50). Overall, these results demonstrate for the first time that high-throughput small-molecules screening of myometrial Ca2+-mobilization is an ideal primary approach for discovering modulators of uterine contractility. PMID:26600013
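The screening statistics quoted above rest on the standard Z'-factor; a minimal sketch of that calculation on invented positive (oxytocin) and negative (vehicle) control wells is shown below.

```python
import statistics

def z_prime(positive_controls, negative_controls):
    """Z'-factor for assay quality:
    Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are usually taken to indicate a robust screening assay."""
    mp, mn = statistics.mean(positive_controls), statistics.mean(negative_controls)
    sp, sn = statistics.stdev(positive_controls), statistics.stdev(negative_controls)
    return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)

if __name__ == "__main__":
    oxytocin_wells = [9800, 10100, 9950, 10200, 9900]   # Ca2+ signal, arbitrary units
    vehicle_wells = [1200, 1100, 1250, 1150, 1180]
    print(f"Z' = {z_prime(oxytocin_wells, vehicle_wells):.2f}")
```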
Computational Approaches to Phenotyping
Lussier, Yves A.; Liu, Yang
2007-01-01
The recent completion of the Human Genome Project has made possible a high-throughput “systems approach” for accelerating the elucidation of molecular underpinnings of human diseases, and subsequent derivation of molecular-based strategies to more effectively prevent, diagnose, and treat these diseases. Although altered phenotypes are among the most reliable manifestations of altered gene functions, research using systematic analysis of phenotype relationships to study human biology is still in its infancy. This article focuses on the emerging field of high-throughput phenotyping (HTP) phenomics research, which aims to capitalize on novel high-throughput computation and informatics technology developments to derive genomewide molecular networks of genotype–phenotype associations, or “phenomic associations.” The HTP phenomics research field faces the challenge of technological research and development to generate novel tools in computation and informatics that will allow researchers to amass, access, integrate, organize, and manage phenotypic databases across species and enable genomewide analysis to associate phenotypic information with genomic data at different scales of biology. Key state-of-the-art technological advancements critical for HTP phenomics research are covered in this review. In particular, we highlight the power of computational approaches to conduct large-scale phenomics studies. PMID:17202287
High-throughput screening of small-molecule adsorption in MOF-74
NASA Astrophysics Data System (ADS)
Thonhauser, T.; Canepa, P.
2014-03-01
Using high-throughput screening coupled with state-of-the-art van der Waals density functional theory, we investigate the adsorption properties of four important molecules, H2, CO2, CH4, and H2O, in MOF-74-M with M = Be, Mg, Al, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sr, Zr, Nb, Ru, Rh, Pd, La, W, Os, Ir, and Pt. We show that high-throughput techniques can aid in speeding up the development and refinement of effective materials for hydrogen storage, carbon capture, and gas separation. The exploration of the configurational adsorption space allows us to extract crucial information concerning, for example, the competition of water with CO2 for the adsorption binding sites. We find that only a few noble metals--Rh, Pd, Os, Ir, and Pt--favor the adsorption of CO2 and hence are potential candidates for effective carbon-capture materials. Our findings further reveal significant differences in the binding characteristics of H2, CO2, CH4, and H2O within the MOF structure, indicating that molecular blends can be successfully separated by these nano-porous materials. Supported by DOE DE-FG02-08ER46491.
Aryee, Martin J.; Jaffe, Andrew E.; Corrada-Bravo, Hector; Ladd-Acosta, Christine; Feinberg, Andrew P.; Hansen, Kasper D.; Irizarry, Rafael A.
2014-01-01
Motivation: The recently released Infinium HumanMethylation450 array (the ‘450k’ array) provides a high-throughput assay to quantify DNA methylation (DNAm) at ∼450 000 loci across a range of genomic features. Although less comprehensive than high-throughput sequencing-based techniques, this product is more cost-effective and promises to be the most widely used DNAm high-throughput measurement technology over the next several years. Results: Here we describe a suite of computational tools that incorporate state-of-the-art statistical techniques for the analysis of DNAm data. The software is structured to easily adapt to future versions of the technology. We include methods for preprocessing, quality assessment and detection of differentially methylated regions from the kilobase to the megabase scale. We show how our software provides a powerful and flexible development platform for future methods. We also illustrate how our methods empower the technology to make discoveries previously thought to be possible only with sequencing-based methods. Availability and implementation: http://bioconductor.org/packages/release/bioc/html/minfi.html. Contact: khansen@jhsph.edu; rafa@jimmy.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24478339
The current state of drug discovery and a potential role for NMR metabolomics.
Powers, Robert
2014-07-24
The pharmaceutical industry has significantly contributed to improving human health. Drugs have been attributed to both increasing life expectancy and decreasing health care costs. Unfortunately, there has been a recent decline in the creativity and productivity of the pharmaceutical industry. This is a complex issue with many contributing factors resulting from the numerous mergers, increase in out-sourcing, and the heavy dependency on high-throughput screening (HTS). While a simple solution to such a complex problem is unrealistic and highly unlikely, the inclusion of metabolomics as a routine component of the drug discovery process may provide some solutions to these problems. Specifically, as the binding affinity of a chemical lead is evolved during the iterative structure-based drug design process, metabolomics can provide feedback on the selectivity and the in vivo mechanism of action. Similarly, metabolomics can be used to evaluate and validate HTS leads. In effect, metabolomics can be used to eliminate compounds with potential efficacy and side effect problems while prioritizing well-behaved leads with druglike characteristics.
Emergence of a catalytic tetrad during evolution of a highly active artificial aldolase.
Obexer, Richard; Godina, Alexei; Garrabou, Xavier; Mittl, Peer R E; Baker, David; Griffiths, Andrew D; Hilvert, Donald
2017-01-01
Designing catalysts that achieve the rates and selectivities of natural enzymes is a long-standing goal in protein chemistry. Here, we show that an ultrahigh-throughput droplet-based microfluidic screening platform can be used to improve a previously optimized artificial aldolase by an additional factor of 30 to give a >10⁹ rate enhancement that rivals the efficiency of class I aldolases. The resulting enzyme catalyses a reversible aldol reaction with high stereoselectivity and tolerates a broad range of substrates. Biochemical and structural studies show that catalysis depends on a Lys-Tyr-Asn-Tyr tetrad that emerged adjacent to a computationally designed hydrophobic pocket during directed evolution. This constellation of residues is poised to activate the substrate by Schiff base formation, promote mechanistically important proton transfers and stabilize multiple transition states along a complex reaction coordinate. The emergence of such a sophisticated catalytic centre shows that there is nothing magical about the catalytic activities or mechanisms of naturally occurring enzymes, or the evolutionary process that gave rise to them.
Application of ToxCast High-Throughput Screening and ...
Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.
TCP throughput adaptation in WiMax networks using replicator dynamics.
Anastasopoulos, Markos P; Petraki, Dionysia K; Kannan, Rajgopal; Vasilakos, Athanasios V
2010-06-01
The high-frequency segment (10-66 GHz) of the IEEE 802.16 standard seems promising for the implementation of wireless backhaul networks carrying large volumes of Internet traffic. In contrast to wireline backbone networks, where channel errors seldom occur, the TCP protocol in IEEE 802.16 Worldwide Interoperability for Microwave Access networks is conditioned exclusively by wireless channel impairments rather than by congestion. This renders a cross-layer design approach between the transport and physical layers more appropriate during fading periods. In this paper, an adaptive coding and modulation (ACM) scheme for TCP throughput maximization is presented. In the current approach, Internet traffic is modulated and coded employing an adaptive scheme that is mathematically equivalent to the replicator dynamics model. The stability of the proposed ACM scheme is proven, and the dependence of the speed of convergence on various physical-layer parameters is investigated. It is also shown that convergence to the strategy that maximizes TCP throughput may be further accelerated by increasing the amount of information from the physical layer.
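The replicator-dynamics update at the heart of the proposed ACM scheme can be written in a few lines; the sketch below evolves the population shares of three hypothetical modulation/coding profiles toward the one with the highest fitness (here taken as achievable goodput under the current fading state). The scheme names, fitness values, step size, and iteration count are illustrative assumptions.

```python
def replicator_step(shares, fitness, dt=0.01):
    """One Euler step of the replicator dynamics: strategies (modulation/coding
    profiles) whose fitness exceeds the population average gain share."""
    avg = sum(s * f for s, f in zip(shares, fitness))
    new = [s + dt * s * (f - avg) for s, f in zip(shares, fitness)]
    total = sum(new)
    return [s / total for s in new]

def evolve(shares, fitness, steps=500):
    for _ in range(steps):
        shares = replicator_step(shares, fitness)
    return shares

if __name__ == "__main__":
    # three hypothetical ACM profiles; 64-QAM suffers from errors in this fading
    # state, so its effective goodput is below that of 16-QAM
    schemes = ["QPSK 1/2", "16-QAM 3/4", "64-QAM 5/6"]
    goodput = [10.0, 22.0, 18.0]          # Mb/s, illustrative values
    mix = evolve([1 / 3] * 3, goodput)
    print({s: round(m, 3) for s, m in zip(schemes, mix)})
```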
Morphology control in polymer blend fibers—a high throughput computing approach
NASA Astrophysics Data System (ADS)
Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar
2016-08-01
Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications, as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high-throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent-based fabrication. We focus on a PS:PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for a specific morphology class. We finally demonstrate the potential for time-dependent processing conditions to get desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high-throughput computing.
High Throughput Screening For Hazard and Risk of Environmental Contaminants
High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...
Advanced continuous cultivation methods for systems microbiology.
Adamberg, Kaarel; Valgepea, Kaspar; Vilu, Raivo
2015-09-01
Increasing the throughput of systems biology-based experimental characterization of in silico-designed strains has great potential for accelerating the development of cell factories. For this, analysis of metabolism in the steady state is essential as only this enables the unequivocal definition of the physiological state of cells, which is needed for the complete description and in silico reconstruction of their phenotypes. In this review, we show that for a systems microbiology approach, high-resolution characterization of metabolism in the steady state--growth space analysis (GSA)--can be achieved by using advanced continuous cultivation methods termed changestats. In changestats, an environmental parameter is continuously changed at a constant rate within one experiment whilst maintaining cells in the physiological steady state similar to chemostats. This increases the resolution and throughput of GSA compared with chemostats, and, moreover, enables following of the dynamics of metabolism and detection of metabolic switch-points and optimal growth conditions. We also describe the concept, challenge and necessary criteria of the systematic analysis of steady-state metabolism. Finally, we propose that such systematic characterization of the steady-state growth space of cells using changestats has value not only for fundamental studies of metabolism, but also for systems biology-based metabolic engineering of cell factories.
'Enzyme Test Bench': A biochemical application of the multi-rate modeling
NASA Astrophysics Data System (ADS)
Rachinskiy, K.; Schultze, H.; Boy, M.; Büchs, J.
2008-11-01
In the expanding field of 'white biotechnology', enzymes are frequently applied to catalyze the biochemical reaction from a resource material to a valuable product. Evolutionarily designed to catalyze the metabolism in any life form, they selectively accelerate complex reactions under physiological conditions. Modern techniques, such as directed evolution, have been developed to satisfy the increasing demand for enzymes. Applying these techniques together with rational protein design, we aim at improving enzymes' activity, selectivity and stability. To tap the full potential of these techniques, it is essential to combine them with adequate screening methods. Nowadays a great number of high-throughput colorimetric and fluorescent enzyme assays are applied to measure the initial enzyme activity with high throughput. However, the prediction of enzyme long-term stability within short experiments is still a challenge. A new high-throughput technique for enzyme characterization with specific attention to long-term stability, called the 'Enzyme Test Bench', is presented. The concept of the Enzyme Test Bench consists of short-term enzyme tests conducted under partly extreme conditions to predict the enzyme's long-term stability under moderate conditions. The technique is based on the mathematical modeling of temperature-dependent enzyme activation and deactivation. By adapting the temperature profiles in sequential experiments through optimum non-linear experimental design, the long-term deactivation effects can be purposefully accelerated and detected within hours. During the experiment the enzyme activity is measured online to estimate the model parameters from the obtained data. Thus, the enzyme activity and long-term stability can be calculated as a function of temperature. The results of the characterization, based on microliter-format experiments lasting hours, are in good agreement with the results of long-term experiments in 1 L format. Thus, the new technique allows for both enzyme screening with regard to long-term stability and the choice of the optimal process temperature. The presented article gives a successful example of the application of multi-rate modeling, experimental design and parameter estimation within biochemical engineering. At the same time, it shows the limitations of the methods at the current state of the art and directs the remaining open problems to the applied mathematics community.
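A minimal sketch of the kind of temperature-dependent activation/deactivation model described, assuming simple Arrhenius laws and invented pre-exponential factors and activation energies, is given below; the actual Enzyme Test Bench couples such a model to optimal experimental design and online parameter estimation.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius(A, Ea, T):
    """Rate constant from the Arrhenius law, k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

def simulate_activity(temp_profile, dt=1.0,
                      A_cat=1e7, Ea_cat=40e3,       # turnover "activation" (assumed)
                      A_dead=1e13, Ea_dead=110e3):  # irreversible deactivation (assumed)
    """Integrate a minimal two-effect model: momentary activity rises with
    temperature (Arrhenius turnover) while the fraction of intact enzyme decays
    with a temperature-dependent first-order deactivation rate."""
    intact, trace = 1.0, []
    for T in temp_profile:                 # temperatures in kelvin, one per dt seconds
        k_cat = arrhenius(A_cat, Ea_cat, T)
        k_d = arrhenius(A_dead, Ea_dead, T)
        intact *= math.exp(-k_d * dt)      # exact for a constant-temperature step
        trace.append(intact * k_cat)       # observable activity signal
    return trace, intact

if __name__ == "__main__":
    # a deliberately harsh ramp (40 C -> 70 C) to accelerate deactivation
    profile = [313.15 + 0.05 * t for t in range(600)]
    trace, remaining = simulate_activity(profile)
    print(f"intact enzyme fraction after ramp: {remaining:.3f}")
```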
Data-dependent bucketing improves reference-free compression of sequencing reads.
Patro, Rob; Kingsford, Carl
2015-09-01
The storage and transmission of high-throughput sequencing data consumes significant resources. As our capacity to produce such data continues to increase, this burden will only grow. One approach to reduce storage and transmission requirements is to compress this sequencing data. We present a novel technique to boost the compression of sequencing reads that is based on the concept of bucketing similar reads so that they appear nearby in the file. We demonstrate that, by adopting a data-dependent bucketing scheme and employing a number of encoding ideas, we can achieve substantially better compression ratios than existing de novo sequence compression tools, including other bucketing and reordering schemes. Our method, Mince, achieves up to a 45% reduction in file sizes (28% on average) compared with existing state-of-the-art de novo compression schemes. Mince is written in C++11, is open source and has been made available under the GPLv3 license. It is available at http://www.cs.cmu.edu/∼ckingsf/software/mince. Contact: carlk@cs.cmu.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
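Mince's actual bucketing criterion and encodings are more sophisticated; the sketch below only conveys the underlying idea of grouping similar reads (here by a lexicographic minimizer) so that each bucket compresses well. The k-mer length and example reads are illustrative.

```python
def minimizer(read, k=8):
    """Lexicographically smallest k-mer of a read; similar reads tend to share it."""
    return min(read[i:i + k] for i in range(len(read) - k + 1))

def bucket_reads(reads, k=8):
    """Group reads by their minimizer so that similar reads end up adjacent.
    Downstream, each bucket compresses much better than the shuffled input."""
    buckets = {}
    for r in reads:
        buckets.setdefault(minimizer(r, k), []).append(r)
    return buckets

if __name__ == "__main__":
    reads = [
        "ACGTACGTACGTTTTT",
        "GGGGACGTACGTACGT",   # shares the ACGTACGT core with the first read
        "TTTTTTTTCCCCCCCC",
    ]
    for key, group in bucket_reads(reads).items():
        print(key, group)
```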
Han, Xiaoping; Chen, Haide; Huang, Daosheng; Chen, Huidong; Fei, Lijiang; Cheng, Chen; Huang, He; Yuan, Guo-Cheng; Guo, Guoji
2018-04-05
Human pluripotent stem cells (hPSCs) provide powerful models for studying cellular differentiations and unlimited sources of cells for regenerative medicine. However, a comprehensive single-cell level differentiation roadmap for hPSCs has not been achieved. We use high throughput single-cell RNA-sequencing (scRNA-seq), based on optimized microfluidic circuits, to profile early differentiation lineages in the human embryoid body system. We present a cellular-state landscape for hPSC early differentiation that covers multiple cellular lineages, including neural, muscle, endothelial, stromal, liver, and epithelial cells. Through pseudotime analysis, we construct the developmental trajectories of these progenitor cells and reveal the gene expression dynamics in the process of cell differentiation. We further reprogram primed H9 cells into naïve-like H9 cells to study the cellular-state transition process. We find that genes related to hemogenic endothelium development are enriched in naïve-like H9. Functionally, naïve-like H9 show higher potency for differentiation into hematopoietic lineages than primed cells. Our single-cell analysis reveals the cellular-state landscape of hPSC early differentiation, offering new insights that can be harnessed for optimization of differentiation protocols.
MINER: exploratory analysis of gene interaction networks by machine learning from expression data.
Kadupitige, Sidath Randeni; Leung, Kin Chun; Sellmeier, Julia; Sivieng, Jane; Catchpoole, Daniel R; Bain, Michael E; Gaëta, Bruno A
2009-12-03
The reconstruction of gene regulatory networks from high-throughput "omics" data has become a major goal in the modelling of living systems. Numerous approaches have been proposed, most of which attempt only "one-shot" reconstruction of the whole network with no intervention from the user, or offer only simple correlation analysis to infer gene dependencies. We have developed MINER (Microarray Interactive Network Exploration and Representation), an application that combines multivariate non-linear tree learning of individual gene regulatory dependencies, visualisation of these dependencies as both trees and networks, and representation of known biological relationships based on common Gene Ontology annotations. MINER allows biologists to explore the dependencies influencing the expression of individual genes in a gene expression data set in the form of decision, model or regression trees, using their domain knowledge to guide the exploration and formulate hypotheses. Multiple trees can then be summarised in the form of a gene network diagram. MINER is being adopted by several of our collaborators and has already led to the discovery of a new significant regulatory relationship with subsequent experimental validation. Unlike most gene regulatory network inference methods, MINER allows the user to start from genes of interest and build the network gene-by-gene, incorporating domain expertise in the process. This approach has been used successfully with RNA microarray data but is applicable to other quantitative data produced by high-throughput technologies such as proteomics and "next generation" DNA sequencing.
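MINER's multivariate tree learning is not reproduced here, but the general idea of learning a regression tree that predicts one gene's expression from candidate regulators can be sketched with scikit-learn (assumed available) on synthetic data, as below; the gene names and the simulated dependency structure are invented.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# toy expression matrix: rows = samples, columns = candidate regulator genes
rng = np.random.default_rng(0)
regulators = ["tf_A", "tf_B", "tf_C"]
X = rng.normal(size=(200, 3))
# the "target" gene is roughly constant when tf_A is high, and follows tf_B otherwise
y = np.where(X[:, 0] > 0,
             2.0 + 0.1 * rng.normal(size=200),
             1.5 * X[:, 1] + 0.1 * rng.normal(size=200))

# a shallow regression tree exposes the conditional dependency as readable rules
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=regulators))
```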
Goodman High Throughput Spectrograph | SOAR
The Goodman High Throughput Spectrograph on SOAR covers the 320-850 nm wavelength range; the instrument is described in Clemens et al. (2004, SPIE).
The National Research Council of the United States National Academies of Science has recently released a document outlining a long-range vision and strategy for transforming toxicity testing from largely whole animal-based testing to one based on in vitro assays. “Toxicity Testin...
Comparisons of high throughput screening data to human exposures assume that media concentrations are equivalent to steady-state blood concentrations. This assumes the partitioning of the chemical between media and cells is equivalent to the partitioning of the chemical between b...
Microengineering methods for cell-based microarrays and high-throughput drug-screening applications.
Xu, Feng; Wu, JinHui; Wang, ShuQi; Durmus, Naside Gozde; Gurkan, Umut Atakan; Demirci, Utkan
2011-09-01
Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.
Microengineering Methods for Cell Based Microarrays and High-Throughput Drug Screening Applications
Xu, Feng; Wu, JinHui; Wang, ShuQi; Durmus, Naside Gozde; Gurkan, Umut Atakan; Demirci, Utkan
2011-01-01
Screening for effective therapeutic agents from millions of drug candidates is costly, time-consuming and often faces ethical concerns due to extensive use of animals. To improve cost-effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. In addition, there is a need for three-dimensional (3D) cell-based drug-screening models, which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility. PMID:21725152
Microfluidic cell chips for high-throughput drug screening
Chi, Chun-Wei; Ahmed, AH Rezwanuddin; Dereli-Korkut, Zeynep; Wang, Sihong
2016-01-01
The current state of screening methods for drug discovery is still riddled with several inefficiencies. Although some widely used high-throughput screening platforms may enhance the drug screening process, their cost and oversimplification of cell–drug interactions pose a translational difficulty. Microfluidic cell-chips resolve many issues found in conventional HTS technology, providing benefits such as reduced sample quantity and integration of 3D cell culture physically more representative of the physiological/pathological microenvironment. In this review, we introduce the advantages of microfluidic devices in drug screening, and outline the critical factors which influence device design, highlighting recent innovations and advances in the field including a summary of commercialization efforts on microfluidic cell chips. Future perspectives of microfluidic cell devices are also provided based on considerations of present technological limitations and translational barriers. PMID:27071838
The interdependence between screening methods and screening libraries.
Shelat, Anang A; Guy, R Kiplin
2007-06-01
The most common methods for discovery of chemical compounds capable of manipulating biological function involve some form of screening. The success of such screens is highly dependent on the chemical materials - commonly referred to as libraries - that are assayed. Classic methods for the design of screening libraries have depended on knowledge of target structure and relevant pharmacophores for target focus, and on simple count-based measures to assess other properties. The recent proliferation of two novel screening paradigms, structure-based screening and high-content screening, prompts a profound rethink about the ideal composition of small-molecule screening libraries. We suggest that currently utilized libraries are not optimal for addressing new targets by high-throughput screening, or complex phenotypes by high-content screening.
High Throughput Transcriptomics: From screening to pathways
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
Adenylylation of small RNA sequencing adapters using the TS2126 RNA ligase I.
Lama, Lodoe; Ryan, Kevin
2016-01-01
Many high-throughput small RNA next-generation sequencing protocols use 5' preadenylylated DNA oligonucleotide adapters during cDNA library preparation. Preadenylylation of the DNA adapter's 5' end makes ligation of the adapter to RNA collections independent of ATP, thereby avoiding ATP-dependent side reactions. However, preadenylylation of the DNA adapters can be costly and difficult. The currently available method for chemical adenylylation of DNA adapters is inefficient and uses techniques not typically practiced in laboratories profiling cellular RNA expression. An alternative enzymatic method using a commercial RNA ligase was recently introduced, but this enzyme works best as a stoichiometric adenylylating reagent rather than a catalyst and can therefore prove costly when several variant adapters are needed or during scale-up or high-throughput adenylylation procedures. Here, we describe a simple, scalable, and highly efficient method for the 5' adenylylation of DNA oligonucleotides using the thermostable RNA ligase 1 from bacteriophage TS2126. Adapters with 3' blocking groups are adenylylated at >95% yield at catalytic enzyme-to-adapter ratios and need not be gel purified before ligation to RNA acceptors. Experimental conditions are also reported that enable DNA adapters with free 3' ends to be 5' adenylylated at >90% efficiency. © 2015 Lama and Ryan; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
High-throughput monitoring of major cell functions by means of lensfree video microscopy
Kesavan, S. Vinjimore; Momey, F.; Cioni, O.; David-Watine, B.; Dubrulle, N.; Shorte, S.; Sulpice, E.; Freida, D.; Chalmond, B.; Dinten, J. M.; Gidrol, X.; Allier, C.
2014-01-01
Quantification of basic cell functions is a preliminary step to understand complex cellular mechanisms, e.g., to test compatibility of biomaterials, to assess the effectiveness of drugs and siRNAs, and to control cell behavior. However, commonly used quantification methods are label-dependent end-point assays. As an alternative, using our lensfree video microscopy platform to perform high-throughput real-time monitoring of cell culture, we introduce specifically devised metrics that are capable of non-invasive quantification of cell functions such as cell-substrate adhesion, cell spreading, cell division, cell division orientation and cell death. Unlike existing methods, our platform and associated metrics embrace the entire population of thousands of cells whilst monitoring the fate of every single cell within the population. This results in a high-content description of cell functions that typically contains 25,000-900,000 measurements per experiment depending on cell density and period of observation. As proof of concept, we monitored cell-substrate adhesion and spreading kinetics of human Mesenchymal Stem Cells (hMSCs) and primary human fibroblasts, we determined the cell division orientation of hMSCs, and we observed the effect of transfection of siCellDeath (siRNA known to induce cell death) on hMSCs and human Osteo Sarcoma (U2OS) Cells. PMID:25096726
NASA Astrophysics Data System (ADS)
Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun
2017-12-01
Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
Scott, Daniel J; Kummer, Lutz; Egloff, Pascal; Bathgate, Ross A D; Plückthun, Andreas
2014-11-01
The largest single class of drug targets is the G protein-coupled receptor (GPCR) family. Modern high-throughput methods for drug discovery require working with pure protein, but this has been a challenge for GPCRs, and thus the success of screening campaigns targeting soluble, catalytic protein domains has not yet been realized for GPCRs. Therefore, most GPCR drug screening has been cell-based, whereas the strategy of choice for drug discovery against soluble proteins is HTS using purified proteins coupled to structure-based drug design. While recent developments are increasing the chances of obtaining GPCR crystal structures, the feasibility of screening directly against purified GPCRs in the unbound state (apo-state) remains low. GPCRs exhibit low stability in detergent micelles, especially in the apo-state, over the time periods required for performing large screens. Recent methods for generating detergent-stable GPCRs, however, offer the potential for researchers to manipulate GPCRs almost like soluble enzymes, opening up new avenues for drug discovery. Here we apply cellular high-throughput encapsulation, solubilization and screening (CHESS) to the neurotensin receptor 1 (NTS1) to generate a variant that is stable in the apo-state when solubilized in detergents. This high stability facilitated the crystal structure determination of this receptor and also allowed us to probe the pharmacology of detergent-solubilized, apo-state NTS1 using robotic ligand binding assays. NTS1 is a target for the development of novel antipsychotics, and thus CHESS-stabilized receptors represent exciting tools for drug discovery. Copyright © 2014 Elsevier B.V. All rights reserved.
High Throughput Experimental Materials Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakutayev, Andriy; Perkins, John; Schwarting, Marcus
The mission of the High Throughput Experimental Materials Database (HTEM DB) is to enable discovery of new materials with useful properties by releasing large amounts of high-quality experimental data to the public. The HTEM DB contains information about materials obtained from high-throughput experiments at the National Renewable Energy Laboratory (NREL).
Hawkins, Liam J; Storey, Kenneth B
2017-01-01
Common Western-blot imaging systems have previously been adapted to measure signals from luminescent microplate assays. This can be a cost-saving measure, as Western-blot imaging systems are common laboratory equipment and could substitute for a dedicated luminometer if one is not otherwise available. One previously unrecognized limitation is that the signals captured by the cameras in these systems are not equal for all wells. Signals are dependent on the angle of incidence to the camera, and thus the location of the well on the microplate. Here we show that: • The position of a well on a microplate significantly affects the signal captured by a common Western-blot imaging system from a luminescent assay. • The effect of well position can easily be corrected for. • This method can be applied to commercially available luminescent assays, allowing for high-throughput quantification of a wide range of biological processes and biochemical reactions.
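A minimal sketch of the kind of position correction described above: divide each well's luminescence by a per-well factor derived from a reference plate in which every well holds the same luminescent standard. The numbers, plate layout and radial bias model below are assumptions for illustration, not the authors' published procedure.

```python
# Sketch: correct position-dependent bias in microplate luminescence imaged
# with a gel-documentation camera, using a uniform reference plate imaged on
# the same system. All values are synthetic.
import numpy as np

rng = np.random.default_rng(1)

true_signal = rng.uniform(500, 5000, size=(8, 12))      # hypothetical 96-well plate

# Assumed position effect: wells far from the plate centre appear dimmer.
rows, cols = np.indices((8, 12))
bias = 1.0 - 0.04 * np.sqrt((rows - 3.5) ** 2 + (cols - 5.5) ** 2)

measured = true_signal * bias                            # what the camera reports
reference = 1000.0 * bias                                # uniform standard, same bias

# Per-well correction factor derived from the reference plate.
corrected = measured * reference.mean() / reference

ratio_before = measured / true_signal                    # varies with well position
ratio_after = corrected / true_signal                    # ~constant across the plate
print("well-to-well CV before:", round(ratio_before.std() / ratio_before.mean(), 3))
print("well-to-well CV after: ", round(ratio_after.std() / ratio_after.mean(), 3))
```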
Control structures for high speed processors
NASA Technical Reports Server (NTRS)
Maki, G. K.; Mankin, R.; Owsley, P. A.; Kim, G. M.
1982-01-01
A special processor was designed to function as a Reed Solomon decoder with a throughput data rate in the MHz range. This data rate is significantly greater than is possible with conventional digital architectures. To achieve this rate, the processor design includes sequential, pipelined, distributed, and parallel processing. The processor was designed using a high-level register transfer language (RTL). The RTL can be used to describe how the different processes are implemented by the hardware. One problem of special interest was the development of dependent processes which are analogous to software subroutines. For greater flexibility, the RTL control structure was implemented in ROM. The special purpose hardware required approximately 1000 SSI and MSI components. The data rate throughput is 2.5 megabits/second. This data rate is achieved through the use of pipelined and distributed processing. This data rate can be compared with 800 kilobits/second in a recently proposed very large scale integration design of a Reed Solomon encoder.
Large-Scale Discovery of Induced Point Mutations With High-Throughput TILLING
Till, Bradley J.; Reynolds, Steven H.; Greene, Elizabeth A.; Codomo, Christine A.; Enns, Linda C.; Johnson, Jessica E.; Burtner, Chris; Odden, Anthony R.; Young, Kim; Taylor, Nicholas E.; Henikoff, Jorja G.; Comai, Luca; Henikoff, Steven
2003-01-01
TILLING (Targeting Induced Local Lesions in Genomes) is a general reverse-genetic strategy that provides an allelic series of induced point mutations in genes of interest. High-throughput TILLING allows the rapid and low-cost discovery of induced point mutations in populations of chemically mutagenized individuals. As chemical mutagenesis is widely applicable and mutation detection for TILLING is dependent only on sufficient yield of PCR products, TILLING can be applied to most organisms. We have developed TILLING as a service to the Arabidopsis community known as the Arabidopsis TILLING Project (ATP). Our goal is to rapidly deliver allelic series of ethylmethanesulfonate-induced mutations in target 1-kb loci requested by the international research community. In the first year of public operation, ATP has discovered, sequenced, and delivered >1000 mutations in >100 genes ordered by Arabidopsis researchers. The tools and methodologies described here can be adapted to create similar facilities for other organisms. PMID:12618384
Cellular resolution functional imaging in behaving rats using voluntary head restraint
Scott, Benjamin B.; Brody, Carlos D.; Tank, David W.
2013-01-01
High-throughput operant conditioning systems for rodents provide efficient training on sophisticated behavioral tasks. Combining these systems with technologies for cellular resolution functional imaging would provide a powerful approach to study neural dynamics during behavior. Here we describe an integrated two-photon microscope and behavioral apparatus that allows cellular resolution functional imaging of cortical regions during epochs of voluntary head restraint. Rats were trained to initiate periods of restraint up to 8 seconds in duration, which provided the mechanical stability necessary for in vivo imaging while allowing free movement between behavioral trials. A mechanical registration system repositioned the head to within a few microns, allowing the same neuronal populations to be imaged on each trial. In proof-of-principle experiments, calcium-dependent fluorescence transients were recorded from GCaMP-labeled cortical neurons. In contrast to previous methods for head restraint, this system can also be incorporated into high-throughput operant conditioning systems. PMID:24055015
TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.
Fimereli, Danai; Detours, Vincent; Konopka, Tomasz
2013-04-01
High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/.
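The sketch below illustrates the general idea of triaging reads against a region of interest by exact k-mer matching; the k-mer size, sequences and data structures are assumptions for illustration and not TriageTools' actual algorithm or parameters.

```python
# Sketch: keep only sequencing reads that share at least one k-mer with a
# region of interest. Illustrative only; not TriageTools' implementation.
K = 15

def kmers(seq, k=K):
    """Return the set of all k-mers in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Hypothetical region of interest (e.g. an exon of a gene being studied).
region = "ACGTACGTTAGCAGGCTTAACGGATCGATCGATTACGCTAGCTAGGCTA"
region_index = kmers(region)

reads = [
    "TTAGCAGGCTTAACGGATCGATCGATTACGCTAGC",   # overlaps the region -> kept
    "GGGGGGCCCCCCAAAAAATTTTTTGGGGGGCCCCC",   # unrelated -> discarded
]

selected = [r for r in reads if not region_index.isdisjoint(kmers(r))]
print(f"kept {len(selected)} of {len(reads)} reads")
```

In practice the retained subset would then be passed to a full aligner, so that only a fraction of the dataset needs to be processed and stored.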
Optimizing ultrafast illumination for multiphoton-excited fluorescence imaging
Stoltzfus, Caleb R.; Rebane, Aleksander
2016-01-01
We study the optimal conditions for high throughput two-photon excited fluorescence (2PEF) and three-photon excited fluorescence (3PEF) imaging using femtosecond lasers. We derive relations that allow maximization of the rate of imaging depending on the average power, pulse repetition rate, and noise characteristics of the laser, as well as on the size and structure of the sample. We perform our analysis using ~100 MHz, ~1 MHz and 1 kHz pulse rates and using both a tightly-focused illumination beam with diffraction-limited image resolution, as well loosely focused illumination with a relatively low image resolution, where the latter utilizes separate illumination and fluorescence detection beam paths. Our theoretical estimates agree with the experiments, which makes our approach especially useful for optimizing high throughput imaging of large samples with a field-of-view up to 10x10 cm2. PMID:27231620
Kondrashova, Olga; Love, Clare J.; Lunke, Sebastian; Hsu, Arthur L.; Waring, Paul M.; Taylor, Graham R.
2015-01-01
Whilst next generation sequencing can report point mutations in fixed tissue tumour samples reliably, the accurate determination of copy number is more challenging. The conventional Multiplex Ligation-dependent Probe Amplification (MLPA) assay is an effective tool for measurement of gene dosage, but is restricted to around 50 targets due to size resolution of the MLPA probes. By switching from a size-resolved format to a sequence-resolved format, we developed a scalable, high-throughput, quantitative assay. MLPA-seq is capable of detecting deletions, duplications, and amplifications in as little as 5 ng of genomic DNA, including from formalin-fixed paraffin-embedded (FFPE) tumour samples. We show that this method can detect BRCA1, BRCA2, ERBB2 and CCNE1 copy number changes in DNA extracted from snap-frozen and FFPE tumour tissue, with 100% sensitivity and >99.5% specificity. PMID:26569395
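The quantification step in a sequence-resolved dosage assay of this kind amounts to comparing normalised probe read counts between a test sample and a reference. The sketch below shows one generic way to do that; the probe names, counts and thresholds are invented for illustration and are not the published MLPA-seq analysis.

```python
# Sketch: estimate relative gene dosage from probe read counts by normalising
# each target probe to copy-number-stable reference probes, then comparing a
# test sample against a control. Counts, names and thresholds are invented.
test_counts    = {"BRCA1_p1": 120, "BRCA1_p2": 110, "REF_p1": 980, "REF_p2": 1020}
control_counts = {"BRCA1_p1": 500, "BRCA1_p2": 480, "REF_p1": 1000, "REF_p2": 990}

def normalised(counts, probe, ref_probes=("REF_p1", "REF_p2")):
    ref_total = sum(counts[p] for p in ref_probes)
    return counts[probe] / ref_total

for probe in ("BRCA1_p1", "BRCA1_p2"):
    ratio = normalised(test_counts, probe) / normalised(control_counts, probe)
    call = "deletion" if ratio < 0.7 else "amplification" if ratio > 1.3 else "normal"
    print(f"{probe}: dosage ratio {ratio:.2f} -> {call}")
```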
"Gadd45b" Knockout Mice Exhibit Selective Deficits in Hippocampus-Dependent Long-Term Memory
ERIC Educational Resources Information Center
Leach, Prescott T.; Poplawski, Shane G.; Kenney, Justin W.; Hoffman, Barbara; Liebermann, Dan A.; Abel, Ted; Gould, Thomas J.
2012-01-01
Growth arrest and DNA damage-inducible beta ("Gadd45b") has been shown to be involved in DNA demethylation and may be important for cognitive processes. "Gadd45b" is abnormally expressed in subjects with autism and psychosis, two disorders associated with cognitive deficits. Furthermore, several high-throughput screens have identified "Gadd45b"…
Castration Resistance in Prostate Cancer Is Mediated by the Kinase NEK6. | Office of Cancer Genomics
In prostate cancer, the development of castration resistance is pivotal in progression to aggressive disease. However, understanding of the pathways involved remains incomplete. In this study, we performed a high-throughput genetic screen to identify kinases that enable tumor formation by androgen-dependent prostate epithelial (LHSR-AR) cells under androgen-deprived conditions.
Using High Throughput Screens to Identify Lead Compounds for Alzheimer’s Disease Therapeutics
2008-11-01
showed that six red wine polyphenols (myricetin, morin, quercetin, kaempferol, (+)-catechin and (-)-epicatechin) were able to dose-dependently inhibit formation...
van der Gaast—de Jongh, Christa E.; Diavatopoulos, Dimitri A.; de Jonge, Marien I.
2017-01-01
The respiratory pathogen Streptococcus pneumoniae is a major cause of diseases such as otitis media, pneumonia, sepsis and meningitis. The first step towards infection is colonization of the nasopharynx. Recently, it was shown that agglutinating antibodies play an important role in the prevention of mucosal colonization with S. pneumoniae. Here, we present a novel method to quantify antibody-dependent pneumococcal agglutination in a high-throughput manner using flow cytometry. We found that the concentration of agglutinating antibodies against the pneumococcal capsule is directly correlated with changes in the size and complexity of bacterial aggregates, as measured by flow cytometry and confirmed by light microscopy. Using the increase in size, we determined the agglutination index. The cutoff value was set by measuring a series of non-agglutinating antibodies. With this method, we show that not only anti-polysaccharide capsule antibodies are able to induce agglutination but that also anti-PspA protein antibodies have agglutinating capabilities. In conclusion, we have described and validated a novel method to quantify pneumococcal agglutination, which can be used to screen sera from murine or human vaccination studies in a high-throughput manner. PMID:28288168
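A minimal sketch of turning flow-cytometry size measurements into an agglutination readout: compare the mean forward scatter of antibody-treated bacteria with that of non-agglutinating controls, and set the cutoff from the control distribution. The values, the exact index definition and the mean-plus-3-SD cutoff rule are assumptions for illustration, not the authors' validated protocol.

```python
# Sketch: compute a simple agglutination index from forward-scatter (FSC)
# values, using non-agglutinating control sera to set the cutoff.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical FSC measurements (arbitrary units) per bacterial event.
control_fsc = [rng.normal(100, 10, 5000) for _ in range(6)]   # non-agglutinating sera
sample_fsc = rng.normal(160, 30, 5000)                        # test serum

baseline = np.mean([c.mean() for c in control_fsc])
agglutination_index = sample_fsc.mean() / baseline            # fold increase in size

# Cutoff: mean + 3 SD of the control indices (an assumed rule, for illustration).
control_indices = np.array([c.mean() / baseline for c in control_fsc])
cutoff = control_indices.mean() + 3 * control_indices.std()

print(f"agglutination index = {agglutination_index:.2f}, cutoff = {cutoff:.2f}")
print("agglutinating" if agglutination_index > cutoff else "not agglutinating")
```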
Long-lasting, experience-dependent alcohol preference in Drosophila
Peru y Colón de Portugal, Raniero L.; Ojelade, Shamsideen A.; Penninti, Pranav S.; Dove, Rachel J.; Nye, Matthew J.; Acevedo, Summer F.; Lopez, Antonio; Rodan, Aylin R.; Rothenfluh, Adrian
2013-01-01
To understand the molecular and neural mechanisms underlying alcohol addiction, many models ranging from vertebrates to invertebrates have been developed. In Drosophila melanogaster, behavioral paradigms from assaying acute responses to alcohol, to behaviors more closely modeling addiction, have emerged in recent years. However, both the CAFÉ assay, similar to a 2-bottle choice consumption assay, as well as conditioned odor preference, where ethanol is used as the reinforcer, are labor intensive and have low throughput. To address this limitation, we have established a novel ethanol consumption preference assay, called FRAPPÉ, which allows for fast, high throughput measurement of consumption in individual flies, using a fluorescence plate reader. We show that naïve flies do not prefer to consume ethanol, but various pre-exposures, such as ethanol vapor or voluntary ethanol consumption, induce ethanol preference. This ethanol-primed preference is long lasting and is not driven by calories contained in ethanol during the consumption choice. Our novel experience-dependent model of ethanol preference in Drosophila – a highly genetically tractable organism – therefore recapitulates salient features of human alcohol abuse and will facilitate the molecular understanding of the development of alcohol preference. PMID:24164972
20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
Evaluation of Sequencing Approaches for High-Throughput Transcriptomics - (BOSC)
Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. The generation of high-throughput global gene expression...
Fault-Tolerant and Elastic Streaming MapReduce with Decentralized Coordination
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumbhare, Alok; Frincu, Marc; Simmhan, Yogesh
2015-06-29
The MapReduce programming model, due to its simplicity and scalability, has become an essential tool for processing large data volumes in distributed environments. Recent Stream Processing Systems (SPS) extend this model to provide low-latency analysis of high-velocity continuous data streams. However, integrating MapReduce with streaming poses challenges: first, the runtime variations in data characteristics such as data-rates and key-distribution cause resource overload, which in turn leads to fluctuations in the Quality of Service (QoS); and second, the stateful reducers, whose state depends on the complete tuple history, necessitate efficient fault-recovery mechanisms to maintain the desired QoS in the presence of resource failures. We propose an integrated streaming MapReduce architecture leveraging the concept of consistent hashing to support runtime elasticity along with locality-aware data and state replication to provide efficient load-balancing with low-overhead fault-tolerance and parallel fault-recovery from multiple simultaneous failures. Our evaluation on a private cloud shows up to 2.8x improvement in peak throughput compared to Apache Storm SPS, and a low recovery latency of 700-1500 ms from multiple failures.
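The load-balancing idea mentioned above rests on consistent hashing: keys map onto a hash ring, so adding or removing a reducer reassigns only a small fraction of keys. Below is a generic, self-contained ring sketch; the node names and virtual-node count are illustrative and this is not the paper's system.

```python
# Sketch: a consistent-hash ring with virtual nodes, of the kind used to keep
# key-to-reducer assignment stable when workers are added or removed.
import bisect
import hashlib

def _hash(value: str) -> int:
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes, vnodes=50):
        self.vnodes = vnodes
        self.ring = []                      # sorted list of (hash, node)
        for node in nodes:
            self.add(node)

    def add(self, node):
        for i in range(self.vnodes):
            bisect.insort(self.ring, (_hash(f"{node}#{i}"), node))

    def lookup(self, key):
        h = _hash(key)
        idx = bisect.bisect(self.ring, (h, chr(0x10FFFF)))
        return self.ring[idx % len(self.ring)][1]   # next node clockwise

ring = HashRing(["reducer-1", "reducer-2", "reducer-3"])
keys = [f"key-{i}" for i in range(1000)]
before = {k: ring.lookup(k) for k in keys}
ring.add("reducer-4")                       # elastic scale-out
moved = sum(before[k] != ring.lookup(k) for k in keys)
print(f"{moved} of {len(keys)} keys reassigned after adding a node")
```

Only roughly a quarter of the keys move when a fourth node is added, which is what keeps rebalancing (and the associated state migration) cheap.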
The HEP.TrkX Project: deep neural networks for HL-LHC online and offline tracking
Farrell, Steven; Anderson, Dustin; Calafiura, Paolo; ...
2017-08-08
Particle track reconstruction in dense environments such as the detectors of the High Luminosity Large Hadron Collider (HL-LHC) is a challenging pattern recognition problem. Traditional tracking algorithms such as the combinatorial Kalman Filter have been used with great success in LHC experiments for years. However, these state-of-the-art techniques are inherently sequential and scale poorly with the expected increases in detector occupancy in the HL-LHC conditions. The HEP.TrkX project is a pilot project with the aim to identify and develop cross-experiment solutions based on machine learning algorithms for track reconstruction. Machine learning algorithms bring a lot of potential to this problem thanks to their capability to model complex non-linear data dependencies, to learn effective representations of high-dimensional data through training, and to parallelize easily on high-throughput architectures such as GPUs. This contribution will describe our initial explorations into this relatively unexplored idea space. Furthermore, we will discuss the use of recurrent (LSTM) and convolutional neural networks to find and fit tracks in toy detector data.
The HEP.TrkX Project: deep neural networks for HL-LHC online and offline tracking
NASA Astrophysics Data System (ADS)
Farrell, Steven; Anderson, Dustin; Calafiura, Paolo; Cerati, Giuseppe; Gray, Lindsey; Kowalkowski, Jim; Mudigonda, Mayur; Prabhat; Spentzouris, Panagiotis; Spiropoulou, Maria; Tsaris, Aristeidis; Vlimant, Jean-Roch; Zheng, Stephan
2017-08-01
Particle track reconstruction in dense environments such as the detectors of the High Luminosity Large Hadron Collider (HL-LHC) is a challenging pattern recognition problem. Traditional tracking algorithms such as the combinatorial Kalman Filter have been used with great success in LHC experiments for years. However, these state-of-the-art techniques are inherently sequential and scale poorly with the expected increases in detector occupancy in the HL-LHC conditions. The HEP.TrkX project is a pilot project with the aim to identify and develop cross-experiment solutions based on machine learning algorithms for track reconstruction. Machine learning algorithms bring a lot of potential to this problem thanks to their capability to model complex non-linear data dependencies, to learn effective representations of high-dimensional data through training, and to parallelize easily on high-throughput architectures such as GPUs. This contribution will describe our initial explorations into this relatively unexplored idea space. We will discuss the use of recurrent (LSTM) and convolutional neural networks to find and fit tracks in toy detector data.
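As a rough illustration of the recurrent-network idea, the sketch below trains a small LSTM to predict the hit position on the next detector layer from the positions on the previous layers, using toy straight-line "tracks". The architecture, toy data and dimensions are assumptions for illustration only and do not reproduce the HEP.TrkX models.

```python
# Sketch: an LSTM that, given hit positions on successive detector layers,
# predicts the position on the next layer. Toy straight-line tracks only.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_layers, n_tracks = 10, 512
slope = torch.rand(n_tracks, 1) * 2 - 1                 # toy track parameters
intercept = torch.rand(n_tracks, 1) * 2 - 1
z = torch.arange(n_layers, dtype=torch.float32)
hits = (intercept + slope * z).unsqueeze(-1)            # (tracks, layers, 1)

x, y = hits[:, :-1, :], hits[:, 1:, :]                  # predict next-layer hit

class TrackLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, seq):
        out, _ = self.lstm(seq)
        return self.head(out)

model = TrackLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
print(f"final MSE on toy tracks: {loss.item():.4f}")
```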
Jeudy, Christian; Adrian, Marielle; Baussard, Christophe; Bernard, Céline; Bernaud, Eric; Bourion, Virginie; Busset, Hughes; Cabrera-Bosquet, Llorenç; Cointault, Frédéric; Han, Simeng; Lamboeuf, Mickael; Moreau, Delphine; Pivato, Barbara; Prudent, Marion; Trouvelot, Sophie; Truong, Hoai Nam; Vernoud, Vanessa; Voisin, Anne-Sophie; Wipf, Daniel; Salon, Christophe
2016-01-01
In order to maintain high yields while saving water and preserving non-renewable resources and thus limiting the use of chemical fertilizer, it is crucial to select plants with more efficient root systems. This could be achieved through an optimization of both root architecture and root uptake ability and/or through the improvement of positive plant interactions with microorganisms in the rhizosphere. The development of devices suitable for high-throughput phenotyping of root structures remains a major bottleneck. Rhizotrons suitable for plant growth in controlled conditions and non-invasive image acquisition of plant shoot and root systems (RhizoTubes) are described. These RhizoTubes allow one to six plants with a maximum height of 1.1 m to be grown simultaneously for up to 8 weeks, depending on plant species. Both the shoot and root compartments can be imaged automatically and non-destructively throughout the experiment thanks to an imaging cabin (RhizoCab). RhizoCab contains robots and imaging equipment for obtaining high-resolution pictures of plant roots. Using this versatile experimental setup, we illustrate how some morphometric root traits can be determined for various species including model (Medicago truncatula), crop (Pisum sativum, Brassica napus, Vitis vinifera, Triticum aestivum) and weed (Vulpia myuros) species grown under non-limiting conditions or submitted to various abiotic and biotic constraints. The measurement of root phenotypic traits using this system was compared to that obtained using "classic" growth conditions in pots. This integrated system, which will eventually include 1200 RhizoTubes, will allow high-throughput phenotyping of plant shoots and roots under various abiotic and biotic environmental conditions. Our system allows easy visualization or extraction of roots and measurement of root traits for high-throughput or kinetic analyses. The utility of this system for studying root system architecture will greatly facilitate the identification of genetic and environmental determinants of key root traits involved in crop responses to stresses, including interactions with soil microorganisms.
Usefulness of heterologous promoters in the Pseudozyma flocculosa gene expression system.
Avis, Tyler J; Anguenot, Raphaël; Neveu, Bertrand; Bolduc, Sébastien; Zhao, Yingyi; Cheng, Yali; Labbé, Caroline; Belzile, François; Bélanger, Richard R
2008-02-01
The basidiomycetous fungus Pseudozyma flocculosa represents a promising new host for the expression of complex recombinant proteins. Two novel heterologous promoter sequences, the Ustilago maydis glyceraldehyde-3-phosphate dehydrogenase (GPD) and Pseudozyma tsukubaensis alpha-glucosidase promoters, were tested for their ability to provide expression in P. flocculosa. In liquid medium, these two promoters produced lower levels of intracellular green fluorescent protein (GFP) as compared to the U. maydis hsp70 promoter. However, GPD and alpha-glucosidase sequences behaved as constitutive promoters whereas the hsp70 promoter appeared to be morphology-dependent. When using the hsp70 promoter, the expression of GFP increased proportionally to the concentration of hygromycin in the culture medium, indicating possible induction of the promoter by the antibiotic. Optimal solid-state culture conditions were designed for high throughput screening of hygromycin-resistant transformants with the hsp70 promoter in P. flocculosa.
Dynamic Environmental Photosynthetic Imaging Reveals Emergent Phenotypes
Cruz, Jeffrey A.; Savage, Linda J.; Zegarac, Robert; ...
2016-06-22
Understanding and improving the productivity and robustness of plant photosynthesis requires high-throughput phenotyping under environmental conditions that are relevant to the field. Here we demonstrate the dynamic environmental photosynthesis imager (DEPI), an experimental platform for integrated, continuous, and high-throughput measurements of photosynthetic parameters during plant growth under reproducible yet dynamic environmental conditions. Using parallel imagers obviates the need to move plants or sensors, reducing artifacts and allowing simultaneous measurement on large numbers of plants. As a result, DEPI can reveal phenotypes that are not evident under standard laboratory conditions but emerge under progressively more dynamic illumination. We show examples of such "emergent phenotypes" in Arabidopsis mutants that are highly transient and heterogeneous, appearing in different leaves under different conditions and depending in complex ways on both environmental conditions and plant developmental age. Finally, these emergent phenotypes appear to be caused by a range of phenomena, suggesting that such previously unseen processes are critical for plant responses to dynamic environments.
Mapping DNA polymerase errors by single-molecule sequencing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, David F.; Lu, Jenny; Chang, Seungwoo
Genomic integrity is compromised by DNA polymerase replication errors, which occur in a sequence-dependent manner across the genome. Accurate and complete quantification of a DNA polymerase's error spectrum is challenging because errors are rare and difficult to detect. We report a high-throughput sequencing assay to map in vitro DNA replication errors at the single-molecule level. Unlike previous methods, our assay is able to rapidly detect a large number of polymerase errors at base resolution over any template substrate without quantification bias. To overcome the high error rate of high-throughput sequencing, our assay uses a barcoding strategy in which each replication product is tagged with a unique nucleotide sequence before amplification. This allows multiple sequencing reads of the same product to be compared so that sequencing errors can be found and removed. We demonstrate the ability of our assay to characterize the average error rate, error hotspots and lesion bypass fidelity of several DNA polymerases.
Dolado, Ignacio; Nieto, Joan; Saraiva, Maria João M; Arsequell, Gemma; Valencia, Gregori; Planas, Antoni
2005-01-01
Stabilization of tetrameric transthyretin (TTR) by binding of small ligands is a current strategy aimed at inhibiting amyloid fibrillogenesis in transthyretin-associated pathologies, such as senile systemic amyloidosis (SSA) and familial amyloidotic polyneuropathy (FAP). A kinetic assay is developed for rapid evaluation of compounds as potential in vitro inhibitors in a high-throughput screening format. It is based on monitoring the time-dependent increase of absorbance due to turbidity occurring by acid-induced protein aggregation. The method uses the highly amyloidogenic Y78F mutant of human transthyretin (heterologously expressed in Escherichia coli cells). Initial rates of protein aggregation at different inhibitor concentrations follow a monoexponential dose-response curve from which inhibition parameters are calculated. For the assay development, thyroid hormones and nonsteroidal anti-inflammatory drugs were chosen among other reference compounds. Some of them are already known to be in vitro inhibitors of TTR amyloidogenesis. Analysis time is optimized to last 1.5 h, and the method is implemented in microtiter plates for screening of libraries of potential fibrillogenesis inhibitors.
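The sketch below shows one generic way such data can be reduced: fit a monoexponential dose-response curve to the initial aggregation rates measured at increasing inhibitor concentrations and extract a half-inhibition concentration. The data points and the exact functional form are invented for illustration and are not the authors' analysis.

```python
# Sketch: fit a monoexponential dose-response to initial aggregation rates
# measured at increasing inhibitor concentrations. Data and model form are
# invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def monoexp(conc, v0, c50):
    """Initial rate decaying exponentially with inhibitor concentration;
    c50 is the concentration giving a half-maximal rate."""
    return v0 * np.exp(-np.log(2) * conc / c50)

conc = np.array([0.0, 0.1, 0.25, 0.5, 1.0, 2.0, 5.0])               # uM (hypothetical)
rate = np.array([0.100, 0.074, 0.051, 0.025, 0.007, 0.001, 0.000])  # dAbs/min

(v0_fit, c50_fit), _ = curve_fit(monoexp, conc, rate, p0=(0.1, 0.5))
print(f"uninhibited rate ~ {v0_fit:.3f} dAbs/min, half-inhibition at ~ {c50_fit:.2f} uM")
```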
Mapping DNA polymerase errors by single-molecule sequencing
Lee, David F.; Lu, Jenny; Chang, Seungwoo; ...
2016-05-16
Genomic integrity is compromised by DNA polymerase replication errors, which occur in a sequence-dependent manner across the genome. Accurate and complete quantification of a DNA polymerase's error spectrum is challenging because errors are rare and difficult to detect. We report a high-throughput sequencing assay to map in vitro DNA replication errors at the single-molecule level. Unlike previous methods, our assay is able to rapidly detect a large number of polymerase errors at base resolution over any template substrate without quantification bias. To overcome the high error rate of high-throughput sequencing, our assay uses a barcoding strategy in which each replication product is tagged with a unique nucleotide sequence before amplification. This allows multiple sequencing reads of the same product to be compared so that sequencing errors can be found and removed. We demonstrate the ability of our assay to characterize the average error rate, error hotspots and lesion bypass fidelity of several DNA polymerases.
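A minimal sketch of the barcode-based error removal described above: group reads by their barcode (i.e., reads deriving from the same replication product) and take a per-position majority vote, so isolated sequencing errors are outvoted. Barcode handling and read lengths are simplified assumptions, not the published pipeline.

```python
# Sketch: collapse reads that share a barcode into a consensus sequence by
# per-position majority vote, so isolated sequencing errors are removed.
from collections import Counter, defaultdict

# (barcode, read) pairs; the second read of BC01 carries one sequencing error.
reads = [
    ("BC01", "ACGTACGTAC"),
    ("BC01", "ACGTACCTAC"),   # error at position 6
    ("BC01", "ACGTACGTAC"),
    ("BC02", "TTGCATGCAA"),
    ("BC02", "TTGCATGCAA"),
]

by_barcode = defaultdict(list)
for barcode, seq in reads:
    by_barcode[barcode].append(seq)

def consensus(seqs):
    """Majority vote at each position across equal-length reads."""
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))

for barcode, seqs in by_barcode.items():
    print(barcode, consensus(seqs), f"({len(seqs)} reads)")
```

Differences that remain after consensus calling are then attributable to the polymerase under study rather than to the sequencer.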
Bhardwaj, Vinay; Srinivasan, Supriya; McGoron, Anthony J
2015-06-21
High-throughput intracellular delivery strategies (electroporation, and passive and TATHA2-facilitated diffusion of colloidal silver nanoparticles, AgNPs) are investigated for cellular toxicity and uptake using state-of-the-art analytical techniques. The TATHA2-facilitated approach efficiently delivered a high payload with no toxicity, prerequisites for intracellular applications of plasmonic metal nanoparticles (PMNPs) in sensing and therapeutics.
High Throughput Determination of Critical Human Dosing Parameters (SOT)
High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...
High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)
High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...
Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos
Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...
A Simple Model of Nitrogen Concentration, Throughput, and Denitrification in Estuaries
The Estuary Nitrogen Model (ENM) is a mass balance model that includes calculation of nitrogen losses within bays and estuaries using system flushing time. The model has been used to demonstrate the dependence of throughput and denitrification of nitrogen in bays and estuaries on...
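A generic flushing-time mass balance of the kind described above can be sketched as follows: nitrogen entering the estuary is either denitrified (treated as a first-order loss) or exported, and the fraction lost grows with the ratio of flushing time to the loss time constant. The rate constant and loads below are invented numbers, and the formulation is a textbook well-mixed simplification, not necessarily the ENM's own equations.

```python
# Sketch: steady-state, well-mixed nitrogen mass balance for an estuary with
# first-order denitrification and export set by the flushing time.
def nitrogen_budget(load_kg_per_day, volume_m3, flushing_time_days, k_denit_per_day):
    # Steady state: load = export (V*C/tau) + denitrification (k*V*C)
    conc = (load_kg_per_day * flushing_time_days
            / (volume_m3 * (1 + k_denit_per_day * flushing_time_days)))   # kg/m^3
    frac_denitrified = (k_denit_per_day * flushing_time_days
                        / (1 + k_denit_per_day * flushing_time_days))
    throughput = load_kg_per_day * (1 - frac_denitrified)                 # N exported
    return conc, frac_denitrified, throughput

conc, frac, export = nitrogen_budget(load_kg_per_day=1000, volume_m3=5e8,
                                     flushing_time_days=20, k_denit_per_day=0.02)
print(f"concentration ~ {conc * 1e6:.1f} mg/m^3, "
      f"{frac:.0%} denitrified, {export:.0f} kg/day exported")
```

The sketch makes the qualitative point in the abstract explicit: the longer the flushing time relative to 1/k, the larger the fraction of the nitrogen load removed by denitrification rather than passed through.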
Yoon, Miyoung; Clewell, Harvey J; Andersen, Melvin E
2013-02-01
High throughput in vitro biochemical and cell-based assays have the promise to provide more mechanism-based assessments of the adverse effects of large numbers of chemicals. One of the most challenging hurdles for interpreting in vitro toxicity findings is the need for reverse dosimetry tools that estimate the exposures that will give concentrations in vivo similar to the active concentrations in vitro. Recent experience using IVIVE approaches to estimate in vivo pharmacokinetics (Wetmore et al., 2012) identified the need to develop a hepatic clearance equation that explicitly accounted for a broader set of protein binding and membrane transport processes and did not depend on a well-mixed description of the liver compartment. Here we derive an explicit steady-state hepatic clearance equation that includes these factors. In addition to the derivation, we provide simple computer code to calculate steady-state extraction for any combination of blood flow, membrane transport processes and plasma protein-chemical binding rates. This expanded equation provides a tool to estimate hepatic clearance for a more diverse array of compounds. Copyright © 2012 Elsevier Ltd. All rights reserved.
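A sketch of the kind of steady-state calculation discussed above: the conventional well-stirred hepatic clearance plus a generic "extended clearance" variant that folds in sinusoidal uptake and efflux transport. These are standard textbook forms, shown only to illustrate how blood flow, protein binding and transport rates combine; they are not the expanded equation the authors derive, and the parameter values are invented.

```python
# Sketch: steady-state hepatic clearance and extraction ratio. Textbook
# well-stirred and extended-clearance expressions, with invented values;
# not the paper's expanded derivation.
def well_stirred_cl(q_h, fu_b, cl_int):
    """Classic well-stirred hepatic clearance (L/h)."""
    return q_h * fu_b * cl_int / (q_h + fu_b * cl_int)

def extended_cl(q_h, fu_b, ps_influx, ps_efflux, cl_int):
    """Extended clearance: metabolism gated by sinusoidal uptake/efflux (L/h)."""
    cl_int_app = ps_influx * cl_int / (ps_efflux + cl_int)
    return q_h * fu_b * cl_int_app / (q_h + fu_b * cl_int_app)

q_h = 90.0     # hepatic blood flow, L/h (typical adult value)
fu_b = 0.1     # unbound fraction in blood (hypothetical)
cl_int = 200.0 # intrinsic metabolic clearance, L/h (hypothetical)

cl_ws = well_stirred_cl(q_h, fu_b, cl_int)
cl_ext = extended_cl(q_h, fu_b, ps_influx=50.0, ps_efflux=20.0, cl_int=cl_int)
print(f"well-stirred: CLh = {cl_ws:.1f} L/h, extraction = {cl_ws / q_h:.2f}")
print(f"extended:     CLh = {cl_ext:.1f} L/h, extraction = {cl_ext / q_h:.2f}")
```

Comparing the two outputs shows why explicitly including transport matters: limited uptake can lower the predicted extraction well below the well-stirred estimate for the same intrinsic clearance.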
Soulard, Patricia; McLaughlin, Meg; Stevens, Jessica; Connolly, Brendan; Coli, Rocco; Wang, Leyu; Moore, Jennifer; Kuo, Ming-Shang T; LaMarr, William A; Ozbal, Can C; Bhat, B Ganesh
2008-10-03
Several recent reports suggest that stearoyl-CoA desaturase 1 (SCD1), the rate-limiting enzyme in monounsaturated fatty acid synthesis, plays an important role in regulating lipid homeostasis and lipid oxidation in metabolically active tissues. As several manifestations of type 2 diabetes and related metabolic disorders are associated with alterations in intracellular lipid partitioning, pharmacological manipulation of SCD1 activity might be of benefit in the treatment of these disease states. In an effort to identify small molecule inhibitors of SCD1, we have developed a mass spectrometry based high-throughput screening (HTS) assay using deuterium labeled stearoyl-CoA substrate and induced rat liver microsomes. The methodology developed allows the use of a nonradioactive substrate, which avoids interference by the endogenous SCD1 substrate and/or product that exist in the non-purified enzyme source. Throughput of the assay was up to twenty 384-well assay plates per day. The assay was linear with protein concentration and time, and was saturable for stearoyl-CoA substrate (K(m)=10.5 microM). The assay was highly reproducible, with an average Z' value=0.6. Conjugated linoleic acid and sterculic acid, known inhibitors of SCD1, exhibited IC(50) values of 0.88 and 0.12 microM, respectively. High-throughput mass spectrometry screening of over 1.7 million compounds in compressed format demonstrated that the enzyme target is druggable. A total of 2515 hits were identified (0.1% hit rate), and 346 were confirmed active (>40% inhibition of total SCD activity at 20 microM; 14% confirmation rate). Of the confirmed hits, 172 had IC(50) values of <10 microM, including 111 <1 microM and 48 <100 nM. A large number of potent drug-like (MW<450) hits representing six different chemical series were identified. The application of mass spectrometry to high-throughput screening permitted the development of a high-quality screening protocol for an otherwise intractable target, SCD1. Further medicinal chemistry and characterization of SCD inhibitors should lead to the development of reagents to treat metabolic disorders.
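The assay-quality figure quoted above (Z' = 0.6) is the standard Z'-factor statistic computed from positive and negative control wells; below is a generic sketch of that calculation with invented control readings, not the authors' screening data.

```python
# Sketch: compute the Z'-factor assay-quality statistic from positive and
# negative control wells. Control readings are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
positive = rng.normal(1000, 60, 32)   # e.g. uninhibited activity (full signal)
negative = rng.normal(150, 40, 32)    # e.g. fully inhibited / background

z_prime = 1 - 3 * (positive.std(ddof=1) + negative.std(ddof=1)) \
              / abs(positive.mean() - negative.mean())
print(f"Z' = {z_prime:.2f}")   # values above ~0.5 generally indicate a robust HTS assay
```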
Jung, Seung-Yong; Notton, Timothy; Fong, Erika; ...
2015-01-07
Particle sorting using acoustofluidics has enormous potential but widespread adoption has been limited by complex device designs and low throughput. Here, we report high-throughput separation of particles and T lymphocytes (600 μL min⁻¹) by altering the net sonic velocity to reposition acoustic pressure nodes in a simple two-channel device. Finally, the approach is generalizable to other microfluidic platforms for rapid, high-throughput analysis.
Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun
2017-01-01
Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure–property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure–property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure–property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials. PMID:28458737
High-throughput screening (HTS) and modeling of the retinoid ...
Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.
Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)
High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...
High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes
USDA-ARS?s Scientific Manuscript database
High-throughput SNP genotyping provides a rapid way of developing resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...
Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)
Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons. National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA's ToxCast program utilizes a wide variety of high-throughput s...
A quantitative literature-curated gold standard for kinase-substrate pairs
2011-01-01
We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431
High-Throughput Industrial Coatings Research at The Dow Chemical Company.
Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T
2016-09-12
At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot-life, and surface defects, among others, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.
Tiersch, Terrence R.; Yang, Huiping; Hu, E.
2011-01-01
With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs for high-throughput technology for model fishes, proposes approaches for technical development, and overviews future directions of this approach. PMID:21440666
Next Generation Sequencing at the University of Chicago Genomics Core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faber, Pieter
2013-04-24
The University of Chicago Genomics Core provides University of Chicago investigators (and external clients) access to State-of-the-Art genomics capabilities: next generation sequencing, Sanger sequencing / genotyping and micro-arrays (gene expression, genotyping, and methylation). The current presentation will highlight our capabilities in the area of ultra-high throughput sequencing analysis.
This is a presentation describing CSS research on HT predictive methods for modeling exposure and predicting functional substitutes. It will be presented at a forum co-sponsored by the State of California and UC Berkeley on evaluation of chemical alternatives for food contact ch...
High-throughput biosensors for multiplexed foodborne pathogen detection
USDA-ARS?s Scientific Manuscript database
Incidental contamination of foods by harmful bacteria (such as E. coli and Salmonella) and the toxins that they produce is a serious threat to public health and the economy in the United States. The presence of such bacteria and toxins in foods must be rapidly determined at various stages of food pr...
USDA-ARS?s Scientific Manuscript database
This protocol describes a method by which a large collection of the leafy green vegetable lettuce (Lactuca sativa L.) germplasm was screened for likely drought-tolerance traits. Fresh water availability for agricultural use is a growing concern across the United States as well as many regions of th...
QPatch: the missing link between HTS and ion channel drug discovery.
Mathes, Chris; Friis, Søren; Finley, Michael; Liu, Yi
2009-01-01
The conventional patch clamp has long been considered the best approach for studying ion channel function and pharmacology. However, its low throughput has been a major hurdle to overcome for ion channel drug discovery. The recent emergence of higher throughput, automated patch clamp technology begins to break this bottleneck by providing medicinal chemists with high-quality, information-rich data in a more timely fashion. As such, these technologies have the potential to bridge a critical missing link between high-throughput primary screening and meaningful ion channel drug discovery programs. One of these technologies, the QPatch automated patch clamp system developed by Sophion Bioscience, records whole-cell ion channel currents from 16 or 48 individual cells in a parallel fashion. Here, we review the general applicability of the QPatch to studying a wide variety of ion channel types (voltage-/ligand-gated cationic/anionic channels) in various expression systems. The success rate of gigaseals, formation of the whole-cell configuration and usable cells ranged from 40-80%, depending on a number of factors including the cell line used, ion channel expressed, assay development or optimization time and expression level in these studies. We present detailed analyses of the QPatch features and results in case studies in which secondary screening assays were successfully developed for a voltage-gated calcium channel and a ligand-gated TRP channel. The increase in throughput compared to conventional patch clamp with the same cells was approximately 10-fold. We conclude that the QPatch, combining high data quality and speed with user friendliness and suitability for a wide array of ion channels, resides on the cutting edge of automated patch clamp technology and plays a pivotal role in expediting ion channel drug discovery.
High-throughput methods for electron crystallography.
Stokes, David L; Ubarretxena-Belandia, Iban; Gonen, Tamir; Engel, Andreas
2013-01-01
Membrane proteins play a tremendously important role in cell physiology and serve as a target for an increasing number of drugs. Structural information is key to understanding their function and for developing new strategies for combating disease. However, the complex physical chemistry associated with membrane proteins has made them more difficult to study than their soluble cousins. Electron crystallography has historically been a successful method for solving membrane protein structures and has the advantage of providing a native lipid environment for these proteins. Specifically, when membrane proteins form two-dimensional arrays within a lipid bilayer, electron microscopy can be used to collect images and diffraction patterns, and the corresponding data can be combined to produce a three-dimensional reconstruction, which under favorable conditions can extend to atomic resolution. Like X-ray crystallography, the quality of the structures is very much dependent on the order and size of the crystals. However, unlike X-ray crystallography, high-throughput methods for screening crystallization trials for electron crystallography are not in general use. In this chapter, we describe two alternative methods for high-throughput screening of membrane protein crystallization within the lipid bilayer. The first method relies on the conventional use of dialysis for removing detergent and thus reconstituting the bilayer; an array of dialysis wells in the standard 96-well format allows the use of a liquid-handling robot and greatly increases throughput. The second method relies on titration of cyclodextrin as a chelating agent for detergent; a specialized pipetting robot has been designed not only to add cyclodextrin in a systematic way, but to use light scattering to monitor the reconstitution process. In addition, the use of liquid-handling robots for making negatively stained grids and methods for automatically imaging samples in the electron microscope are described.
Mahendran, Shalini M; Oikonomopoulou, Katerina; Diamandis, Eleftherios P; Chandran, Vinod
Synovial fluid (SF) is a protein-rich fluid produced into the joint cavity by cells of the synovial membrane. Due to its direct contact with articular cartilage, surfaces of the bone, and the synoviocytes of the inner membrane, it provides a promising reflection of the biochemical state of the joint under varying physiological and pathophysiological conditions. This property of SF has been exploited within numerous studies in search of unique biomarkers of joint pathologies with the ultimate goal of developing minimally invasive clinical assays to detect and/or monitor disease states. Several proteomic methodologies have been employed to mine the SF proteome. From elementary immunoassays to high-throughput analyses using mass spectrometry-based techniques, each has demonstrated distinct advantages and disadvantages in the identification and quantification of SF proteins. This review will explore the role of SF in the elucidation of the arthritis proteome and the extent to which high-throughput techniques have facilitated the discovery and validation of protein biomarkers from osteoarthritis (OA), rheumatoid arthritis (RA), psoriatic arthritis (PsA), and juvenile idiopathic arthritis (JIA) patients.
Sanchez-Luque, Francisco J; Richardson, Sandra R; Faulkner, Geoffrey J
2016-01-01
Mobile genetic elements (MGEs) are of critical importance in genomics and developmental biology. Polymorphic and somatic MGE insertions have the potential to impact the phenotype of an individual, depending on their genomic locations and functional consequences. However, the identification of polymorphic and somatic insertions among the plethora of copies residing in the genome presents a formidable technical challenge. Whole genome sequencing has the potential to address this problem; however, its efficacy depends on the abundance of cells carrying the new insertion. Robust detection of somatic insertions present in only a subset of cells within a given sample can also be prohibitively expensive due to a requirement for high sequencing depth. Here, we describe retrotransposon capture sequencing (RC-seq), a sequence capture approach in which Illumina libraries are enriched for fragments containing the 5' and 3' termini of specific MGEs. RC-seq allows the detection of known polymorphic insertions present in an individual, as well as the identification of rare or private germline insertions not previously described. Furthermore, RC-seq can be used to detect and characterize somatic insertions, providing a valuable tool to elucidate the extent and characteristics of MGE activity in healthy tissues and in various disease states.
Rahi, Praveen; Prakash, Om; Shouche, Yogesh S.
2016-01-01
Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) based biotyping is an emerging technique for high-throughput and rapid microbial identification. Owing to its relatively higher accuracy, comprehensive database of clinically important microorganisms, and low cost compared to other microbial identification methods, MALDI-TOF MS has started replacing existing practices prevalent in clinical diagnosis. However, applicability of MALDI-TOF MS in the area of microbial ecology research is still limited, mainly due to the lack of data on non-clinical microorganisms. Intense research activities on cultivation of microbial diversity by conventional as well as innovative and high-throughput methods have substantially increased the number of microbial species known today. This important area of research is in urgent need of rapid and reliable method(s) for characterization and de-replication of microorganisms from various ecosystems. MALDI-TOF MS based characterization, in our opinion, appears to be the most suitable technique for such studies. Reliability of MALDI-TOF MS based identification depends mainly on the accuracy and breadth of reference databases, which need continuous expansion and improvement. In this review, we propose a common strategy to generate MALDI-TOF MS spectral databases and advocate their sharing, and also discuss the role of MALDI-TOF MS based high-throughput microbial identification in microbial ecology studies. PMID:27625644
Zhao, Yuzheng; Wang, Aoxue; Zou, Yejun; Su, Ni; Loscalzo, Joseph; Yang, Yi
2016-08-01
NADH and its oxidized form NAD(+) have a central role in energy metabolism, and their concentrations are often considered to be among the most important readouts of metabolic state. Here, we present a detailed protocol to image and monitor NAD(+)/NADH redox state in living cells and in vivo using a highly responsive, genetically encoded fluorescent sensor known as SoNar (sensor of NAD(H) redox). The chimeric SoNar protein was initially developed by inserting circularly permuted yellow fluorescent protein (cpYFP) into the NADH-binding domain of Rex protein from Thermus aquaticus (T-Rex). It functions by binding to either NAD(+) or NADH, thus inducing protein conformational changes that affect its fluorescent properties. We first describe steps for how to establish SoNar-expressing cells, and then discuss how to use the system to quantify the intracellular redox state. This approach is sensitive, accurate, simple and able to report subtle perturbations of various pathways of energy metabolism in real time. We also detail the application of SoNar to high-throughput chemical screening of candidate compounds targeting cell metabolism in a microplate-reader-based assay, along with in vivo fluorescence imaging of tumor xenografts expressing SoNar in mice. Typically, the approximate time frame for fluorescence imaging of SoNar is 30 min for living cells and 60 min for living mice. For high-throughput chemical screening in a 384-well-plate assay, the whole procedure generally takes no longer than 60 min to assess the effects of 380 compounds on cell metabolism.
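As a purely illustrative sketch of how ratiometric SoNar readings from a microplate reader might be turned into screening calls, the Python snippet below normalizes per-well excitation-ratio values against vehicle-control wells and flags outliers with a robust z-score. The two-channel readout, the control-well layout, and all numbers are assumptions for illustration, not the authors' published analysis pipeline.

```python
# Hedged sketch: score a ratiometric SoNar plate read against vehicle controls.
# The two excitation channels, control-well positions, and values are assumed.
import numpy as np

def sonar_ratio(f_420, f_485):
    """Per-well ratio of the two excitation channels (after background subtraction)."""
    return np.asarray(f_420, float) / np.asarray(f_485, float)

def score_plate(ratios, control_idx, z_cutoff=3.0):
    """Robust z-scores against vehicle-control wells; returns (z, hit mask)."""
    ctrl = ratios[control_idx]
    med = np.median(ctrl)
    mad = np.median(np.abs(ctrl - med))
    z = (ratios - med) / (1.4826 * mad)   # MAD-based robust z-score
    return z, np.abs(z) >= z_cutoff

# Hypothetical 384-well plate; wells 352-383 are DMSO controls.
rng = np.random.default_rng(0)
f_485 = rng.normal(1000, 30, 384)
f_420 = rng.normal(1000, 30, 384)
f_420[10] *= 1.6                          # one simulated redox-perturbing compound
ratios = sonar_ratio(f_420, f_485)
z, hits = score_plate(ratios, control_idx=np.arange(352, 384))
print("hit wells:", np.flatnonzero(hits))
```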
Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich
2006-01-01
The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be quite comparable with that of other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. Over the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have greatly sped up our drug discovery process.
Sha, Shankar Prasad; Jani, Kunal; Sharma, Avinash; Anupma, Anu; Pradhan, Pooja; Shouche, Yogesh; Tamang, Jyoti Prakash
2017-09-08
Marcha and thiat are traditionally prepared amylolytic starters used for the production of various ethnic alcoholic beverages in the Sikkim and Meghalaya states of India. In the present study, we investigated the bacterial and fungal community composition of marcha and thiat using high-throughput sequencing. Characterization of the bacterial community showed that the phylum Proteobacteria is the most dominant in both marcha (91.4%) and thiat (53.8%), followed by Firmicutes and Actinobacteria. Estimates of fungal community composition showed Ascomycota as the dominant phylum. The presence of Zygomycota in marcha distinguishes it from thiat. The NGS analysis revealed a dominance of yeasts in marcha, whereas molds outnumber yeasts in thiat. This is the first report on the microbial communities of traditionally prepared amylolytic starters of India using high-throughput sequencing.
Richens, Joanna L; Urbanowicz, Richard A; Lunt, Elizabeth AM; Metcalf, Rebecca; Corne, Jonathan; Fairclough, Lucy; O'Shea, Paul
2009-01-01
Chronic obstructive pulmonary disease (COPD) is a treatable and preventable disease state, characterised by progressive airflow limitation that is not fully reversible. Although COPD is primarily a disease of the lungs there is now an appreciation that many of the manifestations of disease are outside the lung, leading to the notion that COPD is a systemic disease. Currently, diagnosis of COPD relies on largely descriptive measures to enable classification, such as symptoms and lung function. Here the limitations of existing diagnostic strategies of COPD are discussed and systems biology approaches to diagnosis that build upon current molecular knowledge of the disease are described. These approaches rely on new 'label-free' sensing technologies, such as high-throughput surface plasmon resonance (SPR), that we also describe. PMID:19386108
Gloux, Karine; Leclerc, Marion; Iliozer, Harout; L'Haridon, René; Manichanh, Chaysavanh; Corthier, Gérard; Nalin, Renaud; Blottière, Hervé M; Doré, Joël
2007-06-01
Metagenomic libraries derived from human intestinal microbiota (20,725 clones) were screened for epithelial cell growth modulation. Modulatory clones belonging to the four phyla represented among the metagenomic libraries were identified (hit rate, 0.04 to 8.7% depending on the screening cutoff). Several candidate loci were identified by transposon mutagenesis and subcloning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly, Ryan T.; Wang, Chenchen; Rausch, Sarah J.
2014-07-01
A hybrid microchip/capillary CE system was developed to allow unbiased and lossless sample loading and high throughput repeated injections. This new hybrid CE system consists of a polydimethylsiloxane (PDMS) microchip sample injector featuring a pneumatic microvalve that separates a sample introduction channel from a short sample loading channel and a fused silica capillary separation column that connects seamlessly to the sample loading channel. The sample introduction channel is pressurized such that when the pneumatic microvalve opens briefly, a variable-volume sample plug is introduced into the loading channel. A high voltage for CE separation is continuously applied across the loading channel and the fused silica capillary separation column. Analytes are rapidly separated in the fused silica capillary with high resolution. High sensitivity MS detection after CE separation is accomplished via a sheathless CE/ESI-MS interface. The performance evaluation of the complete CE/ESI-MS platform demonstrated that reproducible sample injection with well controlled sample plug volumes could be achieved by using the PDMS microchip injector. The absence of band broadening from microchip to capillary indicated a minimum dead volume at the junction. The capabilities of the new CE/ESI-MS platform in performing high throughput and quantitative sample analyses were demonstrated by the repeated sample injection without interrupting an ongoing separation and a good linear dependence of the total analyte ion abundance on the sample plug volume using a mixture of peptide standards. The separation efficiency of the new platform was also evaluated systematically at different sample injection times, flow rates and CE separation voltages.
Logares, Ramiro; Haverkamp, Thomas H A; Kumar, Surendra; Lanzén, Anders; Nederbragt, Alexander J; Quince, Christopher; Kauserud, Håvard
2012-10-01
The incursion of High-Throughput Sequencing (HTS) in environmental microbiology brings unique opportunities and challenges. HTS now allows a high-resolution exploration of the vast taxonomic and metabolic diversity present in the microbial world, which can provide exceptional insight into global ecosystem functioning, ecological processes and evolution. This exploration also has economic potential, as we will have access to the evolutionary innovation present in microbial metabolisms, which could be used for biotechnological development. HTS is also challenging the research community, and the current bottleneck lies on the data analysis side. At the moment, researchers face a sequence data deluge, with sequencing throughput advancing faster than the computing power needed for data analysis. However, new tools and approaches are being developed constantly, and the whole process can be depicted as a fast co-evolution between sequencing technology, informatics and microbiologists. In this work, we examine the most popular and recently commercialized HTS platforms as well as bioinformatics methods for data handling and analysis used in microbial metagenomics. This non-exhaustive review is intended to serve as a broad state-of-the-art guide to researchers expanding into this rapidly evolving field. Copyright © 2012 Elsevier B.V. All rights reserved.
High-Throughput Models for Exposure-Based Chemical ...
The United States Environmental Protection Agency (U.S. EPA) must characterize potential risks to human health and the environment associated with manufacture and use of thousands of chemicals. High-throughput screening (HTS) for biological activity allows the ToxCast research program to prioritize chemical inventories for potential hazard. Similar capabilities for estimating exposure potential would support rapid risk-based prioritization for chemicals with limited information; here, we propose a framework for high-throughput exposure assessment. To demonstrate application, an analysis was conducted that predicts human exposure potential for chemicals and estimates uncertainty in these predictions by comparison to biomonitoring data. We evaluated 1936 chemicals using far-field mass balance human exposure models (USEtox and RAIDAR) and an indicator for indoor and/or consumer use. These predictions were compared to exposures inferred by Bayesian analysis from urine concentrations for 82 chemicals reported in the National Health and Nutrition Examination Survey (NHANES). Joint regression on all factors provided a calibrated consensus prediction, the variance of which serves as an empirical determination of uncertainty for prioritization on absolute exposure potential. Information on use was found to be most predictive; generally, chemicals above the limit of detection in NHANES had consumer/indoor use. Coupled with hazard HTS, exposure HTS can place risk earlie
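The calibration idea described above can be sketched, under loose assumptions, as an ordinary least-squares consensus regression of biomonitoring-inferred exposures on far-field model predictions plus a use indicator. The snippet below uses entirely synthetic placeholder data (not ExpoCast, USEtox, or RAIDAR outputs) and a simple regression in place of the study's Bayesian analysis.

```python
# Hedged sketch of a consensus calibration: regress inferred exposures on model
# predictions plus an indoor/consumer-use indicator, then use the residual
# variance as an empirical uncertainty for prioritization. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
n = 82                                            # chemicals with inferred exposures
log_usetox   = rng.normal(-6, 1.5, n)             # log10 model prediction (mg/kg/day)
log_raidar   = log_usetox + rng.normal(0, 0.5, n) # second far-field model
indoor_use   = rng.integers(0, 2, n)              # indoor/consumer-use indicator
log_inferred = 0.3 * log_usetox + 1.2 * indoor_use - 4 + rng.normal(0, 0.8, n)

X = np.column_stack([np.ones(n), log_usetox, log_raidar, indoor_use])
beta, *_ = np.linalg.lstsq(X, log_inferred, rcond=None)
resid_var = np.sum((log_inferred - X @ beta) ** 2) / (n - X.shape[1])

print("coefficients:", beta)
print("residual variance (empirical uncertainty):", resid_var)
# The fitted coefficients and residual variance would then be applied to the
# larger chemical inventory to rank chemicals by predicted exposure.
```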
Nobrega, R Paul; Brown, Michael; Williams, Cody; Sumner, Chris; Estep, Patricia; Caffry, Isabelle; Yu, Yao; Lynaugh, Heather; Burnina, Irina; Lilov, Asparouh; Desroches, Jordan; Bukowski, John; Sun, Tingwan; Belk, Jonathan P; Johnson, Kirt; Xu, Yingda
2017-10-01
The state-of-the-art industrial drug discovery approach is the empirical interrogation of a library of drug candidates against a target molecule. The advantage of high-throughput kinetic measurements over equilibrium assessments is the ability to measure each of the kinetic components of binding affinity. Although high-throughput capabilities have improved with advances in instrument hardware, three bottlenecks in data processing remain: (1) intrinsic molecular properties that lead to poor biophysical quality in vitro are not accounted for in commercially available analysis models, (2) processing data through a user interface is time-consuming and not amenable to parallelized data collection, and (3) a commercial solution that includes historical kinetic data in the analysis of kinetic competition data does not exist. Herein, we describe a generally applicable method for the automated analysis, storage, and retrieval of kinetic binding data. This analysis can deconvolve poor quality data on-the-fly and store and organize historical data in a queryable format for use in future analyses. Such database-centric strategies afford greater insight into the molecular mechanisms of kinetic competition, allowing for the rapid identification of allosteric effectors and the presentation of kinetic competition data in absolute terms of percent bound to antigen on the biosensor.
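For readers unfamiliar with the kinetic components referred to above, the sketch below fits a textbook 1:1 Langmuir binding model to a simulated association sensorgram to recover k_on, k_off, and K_D = k_off/k_on with scipy. It is a generic illustration under stated assumptions, not the automated deconvolution and database workflow the authors describe.

```python
# Minimal 1:1 binding-kinetics fit; a textbook model, not the authors' pipeline.
import numpy as np
from scipy.optimize import curve_fit

def association(t, k_on, k_off, r_max, conc):
    """Pseudo-first-order association phase of a 1:1 interaction."""
    k_obs = k_on * conc + k_off
    return r_max * (k_on * conc / k_obs) * (1.0 - np.exp(-k_obs * t))

# Hypothetical sensorgram: 100 nM analyte, simulated with noise.
conc = 100e-9
t = np.linspace(0, 300, 301)
rng = np.random.default_rng(2)
signal = association(t, 1e5, 1e-3, 120.0, conc) + rng.normal(0, 1.0, t.size)

popt, _ = curve_fit(lambda tt, kon, koff, rmax: association(tt, kon, koff, rmax, conc),
                    t, signal, p0=(1e4, 1e-2, 100.0))
k_on, k_off, r_max = popt
print(f"k_on={k_on:.3g} 1/(M*s)  k_off={k_off:.3g} 1/s  K_D={k_off/k_on:.3g} M")
```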
Arbelle, Assaf; Reyes, Jose; Chen, Jia-Yun; Lahav, Galit; Riklin Raviv, Tammy
2018-04-22
We present a novel computational framework for the analysis of high-throughput microscopy videos of living cells. The proposed framework is generally useful and can be applied to different datasets acquired in a variety of laboratory settings. This is accomplished by tying together two fundamental aspects of cell lineage construction, namely cell segmentation and tracking, via a Bayesian inference of dynamic models. In contrast to most existing approaches, which aim to be general, no assumption of cell shape is made. Spatial, temporal, and cross-sectional variation of the analysed data are accommodated by two key contributions. First, time series analysis is exploited to estimate the temporal cell shape uncertainty in addition to cell trajectory. Second, a fast marching (FM) algorithm is used to integrate the inferred cell properties with the observed image measurements in order to obtain the image likelihood for cell segmentation and association. The proposed approach has been tested on eight different time-lapse microscopy data sets, some of which are high-throughput, demonstrating promising results for the detection, segmentation and association of planar cells. Our results surpass the state of the art for the Fluo-C2DL-MSC data set of the Cell Tracking Challenge (Maška et al., 2014). Copyright © 2018 Elsevier B.V. All rights reserved.
Hu, Ning; Fang, Jiaru; Zou, Ling; Wan, Hao; Pan, Yuxiang; Su, Kaiqi; Zhang, Xi; Wang, Ping
2016-10-01
Cell-based bioassays are an effective way to assess compound toxicity via cell viability, but traditional label-based methods miss much information about cell growth because of their endpoint detection, and higher throughput is demanded to obtain dynamic information. Cell-based biosensor methods can dynamically and continuously monitor cell viability; however, this dynamic information is often ignored or seldom utilized in toxin and drug assessment. Here, we report a highly efficient, high-content cytotoxicity recording method based on dynamic and continuous cell-based impedance biosensor technology. Dynamic cell viability, inhibition ratio and growth rate were derived from the dynamic response curves of the cell-based impedance biosensor. The results showed that the biosensor responds in a dose-dependent manner to the diarrhetic shellfish toxin okadaic acid, based on analysis of dynamic cell viability and cell growth status. Moreover, the throughput of dynamic cytotoxicity measurement was compared between cell-based biosensor methods and label-based endpoint methods. This cell-based impedance biosensor can provide a flexible, cost- and label-efficient platform for cell viability assessment in shellfish toxin screening.
A Triple-Fluorophore-Labeled Nucleic Acid pH Nanosensor to Investigate Non-viral Gene Delivery.
Wilson, David R; Routkevitch, Denis; Rui, Yuan; Mosenia, Arman; Wahlin, Karl J; Quinones-Hinojosa, Alfredo; Zack, Donald J; Green, Jordan J
2017-07-05
There is a need for new tools to better quantify intracellular delivery barriers in high-throughput and high-content ways. Here, we synthesized a triple-fluorophore-labeled nucleic acid pH nanosensor for measuring intracellular pH of exogenous DNA at specific time points in a high-throughput manner by flow cytometry following non-viral transfection. By including two pH-sensitive fluorophores and one pH-insensitive fluorophore in the nanosensor, detection of pH was possible over the full physiological range. We further assessed possible correlation between intracellular pH of delivered DNA, cellular uptake of DNA, and DNA reporter gene expression at 24 hr post-transfection for poly-L-lysine and branched polyethylenimine polyplex nanoparticles. While successful transfection was shown to clearly depend on median cellular pH of delivered DNA at the cell population level, surprisingly, on an individual cell basis, there was no significant correlation between intracellular pH and transfection efficacy. To our knowledge, this is the first reported instance of high-throughput single-cell analysis between cellular uptake of DNA, intracellular pH of delivered DNA, and gene expression of the delivered DNA. Using the nanosensor, we demonstrate that the ability of polymeric nanoparticles to avoid an acidic environment is necessary, but not sufficient, for successful transfection. Copyright © 2017 The American Society of Gene and Cell Therapy. Published by Elsevier Inc. All rights reserved.
A high-throughput approach to profile RNA structure.
Delli Ponti, Riccardo; Marti, Stefanie; Armaos, Alexandros; Tartaglia, Gian Gaetano
2017-03-17
Here we introduce the Computational Recognition of Secondary Structure (CROSS) method to calculate the structural profile of an RNA sequence (single- or double-stranded state) at single-nucleotide resolution and without sequence length restrictions. We trained CROSS using data from high-throughput experiments such as Selective 2΄-Hydroxyl Acylation analyzed by Primer Extension (SHAPE; Mouse and HIV transcriptomes) and Parallel Analysis of RNA Structure (PARS; Human and Yeast transcriptomes) as well as high-quality NMR/X-ray structures (PDB database). The algorithm uses primary structure information alone to predict experimental structural profiles with >80% accuracy, showing high performances on large RNAs such as Xist (17 900 nucleotides; Area Under the ROC Curve AUC of 0.75 on dimethyl sulfate (DMS) experiments). We integrated CROSS in thermodynamics-based methods to predict secondary structure and observed an increase in their predictive power by up to 30%. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Systems cell biology of the mitotic spindle.
Saleem, Ramsey A; Aitchison, John D
2010-01-11
Cell division depends critically on the temporally controlled assembly of mitotic spindles, which are responsible for the distribution of duplicated chromosomes to each of the two daughter cells. To gain insight into the process, Vizeacoumar et al., in this issue (Vizeacoumar et al. 2010. J. Cell Biol. doi:10.1083/jcb.200909013), have combined systems genetics with high-throughput and high-content imaging to comprehensively identify and classify novel components that contribute to the morphology and function of the mitotic spindle.
Electric Propulsion of a Different Class: The Challenges of Testing for MegaWatt Missions
2012-08-01
mode akin to steady state. Realizing that the pumping capacity of the Large Vacuum Test Facility (LVTF) at PEPL... High T/P thruster testing requires high propellant throughput. This reality necessitates the careful survey and selection of appropriate... test facilities to ensure that they have 1) sufficient pumping speed to maintain desired operating pressures and 2) adequate size to mitigate facility...
A Combination Therapy of JO-I and Chemotherapy in Ovarian Cancer Models
2013-10-01
which consists of a 3PAR storage backend and shares data via a highly available NetApp storage gateway and 2 high-throughput commodity storage... The environment is configured as a self-service Enterprise cloud and currently hosts more than 700 virtual machines. The network infrastructure consists of... technology infrastructure and information system applications designed to integrate, automate, and standardize operations. These systems fuse state of...
High throughput toxicology programs, such as ToxCast and Tox21, have provided biological effects data for thousands of chemicals at multiple concentrations. Compared to traditional, whole-organism approaches, high throughput assays are rapid and cost-effective, yet they generall...
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
NASA Astrophysics Data System (ADS)
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.
Souders, Christopher L; Liang, Xuefang; Wang, Xiaohong; Ector, Naomi; Zhao, Yuan H; Martyniuk, Christopher J
2018-06-01
Mitochondrial dysfunction is a prevalent molecular event that can result in multiple adverse outcomes. Recently, a novel high throughput method to assess metabolic capacity in fish embryos following exposure to chemicals has been adapted for environmental toxicology. Assessments of oxygen consumption rates using the Seahorse XF(e) 24/96 Extracellular Flux Analyzer (Agilent Technologies) can be used to garner insight into toxicant effects at early stages of development. Here we synthesize the current state of the science using high throughput metabolic profiling in zebrafish embryos, and present considerations for those wishing to adopt high throughput methods for mitochondrial bioenergetics into their research. Chemicals that have been investigated in zebrafish using this metabolic platform include herbicides (e.g. paraquat, diquat), industrial compounds (e.g. benzo-[a]-pyrene, tributyltin), natural products (e.g. quercetin), and anti-bacterial chemicals (i.e. triclosan). Some of these chemicals inhibit mitochondrial endpoints in the μM-mM range, and reduce basal respiration, maximum respiration, and spare capacity. We present a theoretical framework for how one can use mitochondrial performance data in zebrafish to categorize chemicals of concern and prioritize mitochondrial toxicants. Noteworthy is that our studies demonstrate that there can be considerable variation in basal respiration of untreated zebrafish embryos due to clutch-specific effects as well as individual variability, and basal oxygen consumption rates (OCR) can vary on average between 100 and 300 pmol/min/embryo. We also compare OCR between chorionated and dechorionated embryos, as both models are employed to test chemicals. After 24 h, dechorionated embryos remain responsive to mitochondrial toxicants, although they show a blunted response to the uncoupling agent carbonylcyanide-4-trifluoromethoxyphenylhydrazone (FCCP); dechorionated embryos are therefore a viable option for investigations into mitochondrial bioenergetics. We present an adverse outcome pathway framework that incorporates endpoints related to mitochondrial bioenergetics. High throughput bioenergetics assays conducted using whole embryos are expected to support adverse outcome pathways for mitochondrial dysfunction. Copyright © 2018 Elsevier B.V. All rights reserved.
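A minimal sketch of how the bioenergetic endpoints named above (basal respiration, FCCP-stimulated maximal respiration, and spare capacity) can be derived from a per-embryo OCR trace is shown below; the measurement windows and OCR values are hypothetical and only loosely follow the 100-300 pmol/min/embryo basal range cited.

```python
# Hedged sketch of deriving basal, maximal and spare respiratory capacity from
# an OCR trace; the injection scheme and measurement windows are assumptions.
import numpy as np

def bioenergetic_endpoints(ocr, basal_idx, fccp_idx):
    """ocr: 1-D array of OCR measurements (pmol O2/min/embryo) over time."""
    basal = np.mean(ocr[basal_idx])               # pre-injection plateau
    maximal = np.max(ocr[fccp_idx])               # FCCP-uncoupled respiration
    spare = maximal - basal                       # spare respiratory capacity
    return basal, maximal, spare

# Hypothetical trace: 6 basal measurements, then 6 after FCCP injection.
ocr = np.array([210, 205, 200, 198, 202, 199,
                310, 330, 325, 318, 305, 300.0])
basal, maximal, spare = bioenergetic_endpoints(ocr, slice(0, 6), slice(6, 12))
print(f"basal={basal:.0f}  maximal={maximal:.0f}  spare={spare:.0f} pmol/min/embryo")
```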
[Current applications of high-throughput DNA sequencing technology in antibody drug research].
Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong
2012-03-01
Since the publication in 2005 of a high-throughput DNA sequencing technology based on PCR carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.
O'Donnell, Michael
2015-01-01
State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximately 96.6% decrease in computing time. With a single multicore compute node (bottom result), the computing time showed an 81.8% decrease relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
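The embarrassingly parallel pattern described above can be illustrated with a toy example: independent Monte Carlo replicates of a simple three-state transition model distributed over local worker processes. This is a generic sketch of the computational pattern, not SyncroSim or the study's workflow, and the transition probabilities are invented.

```python
# Toy illustration of embarrassingly parallel Monte Carlo replicates of a
# simple state-and-transition model; not SyncroSim, probabilities invented.
import numpy as np
from multiprocessing import Pool

P = np.array([[0.90, 0.08, 0.02],                 # annual transition probabilities
              [0.05, 0.85, 0.10],                 # between three vegetation states
              [0.02, 0.08, 0.90]])

def run_replicate(seed, n_cells=10_000, n_years=50):
    rng = np.random.default_rng(seed)
    state = np.zeros(n_cells, dtype=int)          # all cells start in state 0
    for _ in range(n_years):
        u = rng.random(n_cells)
        cdf = np.cumsum(P[state], axis=1)
        state = (u[:, None] > cdf).sum(axis=1)    # sample next state per cell
    return np.bincount(state, minlength=3) / n_cells

if __name__ == "__main__":
    with Pool() as pool:                          # one replicate per worker task
        results = pool.map(run_replicate, range(100))
    print("mean final state proportions:", np.mean(results, axis=0))
```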
Link and Network Layers Design for Ultra-High-Speed Terahertz-Band Communications Networks
2017-01-01
throughput, and identify the optimal parameter values for their design (Sec. 6.2.3). Moreover, we validate and test the scheme with experimental data obtained... [Final technical report, State University of New York (SUNY) at Buffalo, covering Feb 2015 - Sep 2016.]
Kwak, Jihoon; Genovesio, Auguste; Kang, Myungjoo; Hansen, Michael Adsett Edberg; Han, Sung-Jun
2015-01-01
Genotoxicity testing is an important component of toxicity assessment. As illustrated by the European registration, evaluation, authorization, and restriction of chemicals (REACH) directive, it concerns all the chemicals used in industry. The commonly used in vivo mammalian tests appear to be ill adapted to tackle the large compound sets involved, due to throughput, cost, and ethical issues. The somatic mutation and recombination test (SMART) represents a more scalable alternative, since it uses Drosophila, which develops faster and requires less infrastructure. Despite these advantages, the manual scoring of the hairs on Drosophila wings required for the SMART limits its usage. To overcome this limitation, we have developed an automated SMART readout. It consists of automated imaging, followed by an image analysis pipeline that measures individual wing genotoxicity scores. Finally, we have developed a wing score-based dose-dependency approach that can provide genotoxicity profiles. We have validated our method using 6 compounds, obtaining profiles almost identical to those obtained from manual measures, even for low-genotoxicity compounds such as urethane. The automated SMART, with its faster and more reliable readout, fulfills the need for a high-throughput in vivo test. The flexible imaging strategy we describe and the analysis tools we provide should facilitate the optimization and dissemination of our methods. PMID:25830368
Saccharomyces cerevisiae as a platform for assessing sphingolipid lipid kinase inhibitors
Agah, Sayeh; Mendelson, Anna J.; Eletu, Oluwafunmilayo T.; Barkey-Bircann, Peter; Gesualdi, James
2018-01-01
Successful medicinal chemistry campaigns to discover and optimize sphingosine kinase inhibitors require a robust assay for screening chemical libraries and for determining rank order potencies. Existing assays for these enzymes are laborious, expensive and/or low throughput. The toxicity of excessive levels of phosphorylated sphingoid bases for the budding yeast, Saccharomyces cerevisiae, affords an assay wherein inhibitors added to the culture media rescue growth in a dose-dependent fashion. Herein, we describe our adaptation of a simple, inexpensive, and high throughput assay for assessing inhibitors of sphingosine kinase types 1 and 2 as well as ceramide kinase and for testing enzymatic activity of sphingosine kinase type 2 mutants. The assay was validated using recombinant enzymes and generally agrees with the rank order of potencies of existing inhibitors. PMID:29672528
NASA Astrophysics Data System (ADS)
Zhang, Xuanni; Zhang, Chunmin
2013-01-01
A polarization interference imaging spectrometer based on a Savart polariscope is presented. Its optical throughput was analyzed by Jones calculus. The throughput expression is given and clearly shows that the optical throughput depends mainly on the intensity of the incident light, the transmissivity, the refractive index and the layout of the optical system. The simulation and analysis give the optimum layout in view of both optical throughput and interference fringe visibility, and verify that the layout of our former design is optimum. The simulation shows that a small deviation from the optimum layout has little influence on interference fringe visibility, whereas the same deviation from other layouts degrades it severely; a small deviation is therefore admissible around the optimum, which can ease manufacturing. These results pave the way for further research and engineering design.
A study on the achievable data rate in massive MIMO system
NASA Astrophysics Data System (ADS)
Salh, Adeeb; Audah, Lukman; Shah, Nor Shahida M.; Hamzah, Shipun A.
2017-09-01
Achieving high data rates depends on the capabilities of massive multiple-input multiple-output (MIMO) systems in fifth-generation (5G) cellular networks, where massive MIMO can support very high energy and spectral efficiencies. A major challenge in mobile broadband networks is how to support the throughput required in future 5G systems, which are expected to provide high-speed internet for every user. The performance of a massive MIMO system with linear minimum mean square error (MMSE), zero forcing (ZF) and maximum ratio transmission (MRT) processing improves as the number of antennas grows large, which we analyze by deriving closed-form approximations for the achievable data rate. The inter-cell interference between neighboring cells can be suppressed at high signal-to-noise ratio (SNR) by using MMSE, ZF and MRT. The achievable sum rate for MMSE improves with the distribution of users inside the cell, mitigating the inter-cell interference caused when the same signal is sent by other cells. By contrast, under perfect channel state information (CSI), MMSE outperforms ZF by approximately 20% in achievable sum rate.
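As a rough numerical companion to the closed-form analysis summarized above, the sketch below estimates uplink sum rates for ZF and maximum-ratio (MRC) receivers with perfect CSI as the antenna count grows. It uses standard textbook SINR expressions and simulated Rayleigh channels, not the paper's exact derivations or system parameters.

```python
# Hedged sketch: uplink sum rates for ZF and MRC with perfect CSI, standard
# textbook SINR formulas, simulated Rayleigh fading; not the paper's analysis.
import numpy as np

def sum_rates(M, K=10, snr=10.0, trials=200, rng=np.random.default_rng(3)):
    zf_rate, mrc_rate = 0.0, 0.0
    for _ in range(trials):
        H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
        G = np.linalg.inv(H.conj().T @ H)            # Gram inverse for ZF
        sinr_zf = snr / np.real(np.diag(G))          # per-user ZF SINR
        zf_rate += np.sum(np.log2(1 + sinr_zf))
        for k in range(K):                           # matched-filter (MRC) per user
            hk = H[:, k]
            sig = snr * np.abs(hk.conj() @ hk) ** 2
            interf = snr * sum(np.abs(hk.conj() @ H[:, j]) ** 2 for j in range(K) if j != k)
            mrc_rate += np.log2(1 + sig / (interf + np.linalg.norm(hk) ** 2))
    return zf_rate / trials, mrc_rate / trials

for M in (16, 64, 256):
    zf, mrc = sum_rates(M)
    print(f"M={M:4d}  ZF sum rate={zf:6.1f}  MRC sum rate={mrc:6.1f} bits/s/Hz")
```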
Optimisation of wavelength modulated Raman spectroscopy: towards high throughput cell screening.
Praveen, Bavishna B; Mazilu, Michael; Marchington, Robert F; Herrington, C Simon; Riches, Andrew; Dholakia, Kishan
2013-01-01
In the field of biomedicine, Raman spectroscopy is a powerful technique to discriminate between normal and cancerous cells. However the strong background signal from the sample and the instrumentation affects the efficiency of this discrimination technique. Wavelength Modulated Raman spectroscopy (WMRS) may suppress the background from the Raman spectra. In this study we demonstrate a systematic approach for optimizing the various parameters of WMRS to achieve a reduction in the acquisition time for potential applications such as higher throughput cell screening. The Signal to Noise Ratio (SNR) of the Raman bands depends on the modulation amplitude, time constant and total acquisition time. It was observed that the sampling rate does not influence the signal to noise ratio of the Raman bands if three or more wavelengths are sampled. With these optimised WMRS parameters, we increased the throughput in the binary classification of normal human urothelial cells and bladder cancer cells by reducing the total acquisition time to 6 s which is significantly lower in comparison to previous acquisition times required for the discrimination between similar cell types.
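The background-suppression principle behind WMRS can be illustrated with a toy simulation: Raman peaks track the modulated excitation wavelength while the broad fluorescence background stays put, so removing the mean across modulation frames cancels the background. The spectra, shift amounts and noise levels below are invented, and the processing is deliberately simpler than the authors' analysis.

```python
# Toy illustration of the WMRS principle: the static fluorescence background
# cancels when the per-pixel mean over modulation frames is subtracted.
import numpy as np

pixels = np.arange(1024)
background = 500 * np.exp(-((pixels - 400) / 600.0) ** 2)   # broad, static fluorescence

def raman_peak(center):
    return 80 * np.exp(-((pixels - center) / 4.0) ** 2)

shifts = [-2, -1, 0, 1, 2]                                  # modulation steps (pixels)
rng = np.random.default_rng(4)
frames = np.array([background + raman_peak(512 + s) + raman_peak(700 + s)
                   + rng.normal(0, 2, pixels.size) for s in shifts])

wmrs = frames - frames.mean(axis=0)                         # background cancels here
print("residual away from peaks:", np.abs(wmrs[:, :300]).mean().round(2))
print("retained Raman feature near pixel 512:", np.abs(wmrs[:, 500:525]).max().round(1))
```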
Lessons from high-throughput protein crystallization screening: 10 years of practical experience
Luft, JR; Snell, EH; DeTitta, GT
2011-01-01
Introduction: X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered: The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-user, multiuser-laboratory or industrial scale. Expert opinion: High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073
High-throughput screening based on label-free detection of small molecule microarrays
NASA Astrophysics Data System (ADS)
Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong
2017-02-01
Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput platform for preliminary drug screening that relies on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this platform, the OI-RD scanner serves as a label-free detection instrument able to monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening, with tens of thousands of different compounds immobilized on a single phenyl-isocyanate-functionalized glass slide. Using this platform, we sequentially screened five target proteins (purified target proteins or cell lysate containing the target protein) in a high-throughput, label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by subsequent functional assays. Compared to traditional high-throughput screening assays, this platform has many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening analysis, and it has great potential to serve as a complementary screening platform in the field of drug discovery.
High-throughput analysis of yeast replicative aging using a microfluidic system
Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong
2015-01-01
Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cell, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2060 chemical samples on steroidogenesis via high-performance liquid chromatography followed by tandem mass spectrometry quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a 3-stage screening strategy. The first stage established the maximum tolerated concentration (MTC; ≥70% viability) per sample. The second stage quantified changes in hormone levels at the MTC whereas the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were prestimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2060 chemical samples evaluated, 524 samples were selected for 6-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into 5 distinct profiles generally representing putative mec...
Handfield, Louis-François; Chong, Yolanda T.; Simmons, Jibril; Andrews, Brenda J.; Moses, Alan M.
2013-01-01
Protein subcellular localization has been systematically characterized in budding yeast using fluorescently tagged proteins. Based on the fluorescence microscopy images, subcellular localization of many proteins can be classified automatically using supervised machine learning approaches that have been trained to recognize predefined image classes based on statistical features. Here, we present an unsupervised analysis of protein expression patterns in a set of high-resolution, high-throughput microscope images. Our analysis is based on 7 biologically interpretable features which are evaluated on automatically identified cells, and whose cell-stage dependency is captured by a continuous model for cell growth. We show that it is possible to identify most previously identified localization patterns in a cluster analysis based on these features and that similarities between the inferred expression patterns contain more information about protein function than can be explained by a previous manual categorization of subcellular localization. Furthermore, the inferred cell-stage associated to each fluorescence measurement allows us to visualize large groups of proteins entering the bud at specific stages of bud growth. These correspond to proteins localized to organelles, revealing that the organelles must be entering the bud in a stereotypical order. We also identify and organize a smaller group of proteins that show subtle differences in the way they move around the bud during growth. Our results suggest that biologically interpretable features based on explicit models of cell morphology will yield unprecedented power for pattern discovery in high-resolution, high-throughput microscopy images. PMID:23785265
GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping
Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan
2016-01-01
Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits such as length, width, two-dimensional area, volume, projected skin, surface area, and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. Five-fold cross-validation between manually generated and GiNA measurements of length and width in cranberry fruits yielded accuracies of 0.97 and 0.92, respectively. In addition, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables. PMID:27529547
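A minimal sketch of the kind of measurements GiNA reports (length, width, projected area, and mean RGB color per object) is given below using scikit-image on a photograph of fruits against a uniform light background. The image path, thresholding choice, and size filter are placeholder assumptions, and this is not GiNA's actual code.

```python
# Hedged sketch of GiNA-style trait extraction with scikit-image; the input
# image path and segmentation parameters are placeholders, not GiNA itself.
import numpy as np
from skimage import io, color, filters, measure, morphology

img = io.imread("fruits_on_white_background.jpg")     # hypothetical RGB photo
gray = color.rgb2gray(img)
mask = gray < filters.threshold_otsu(gray)            # fruits darker than background
mask = morphology.remove_small_objects(mask, min_size=500)
labels = measure.label(mask)

for region in measure.regionprops(labels):
    rows, cols = region.coords[:, 0], region.coords[:, 1]
    mean_rgb = img[rows, cols, :3].mean(axis=0)       # average color over the fruit
    print(f"fruit {region.label}: length={region.major_axis_length:.1f}px "
          f"width={region.minor_axis_length:.1f}px area={region.area}px^2 "
          f"mean RGB={np.round(mean_rgb).astype(int)}")
```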
Adapting the γ-H2AX assay for automated processing in human lymphocytes. 1. Technological aspects.
Turner, Helen C; Brenner, David J; Chen, Youhua; Bertucci, Antonella; Zhang, Jian; Wang, Hongliang; Lyulko, Oleksandra V; Xu, Yanping; Shuryak, Igor; Schaefer, Julia; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y Lawrence; Amundson, Sally A; Garty, Guy
2011-03-01
The immunofluorescence-based detection of γ-H2AX is a reliable and sensitive method for quantitatively measuring DNA double-strand breaks (DSBs) in irradiated samples. Since H2AX phosphorylation is highly linear with radiation dose, this well-established biomarker is in current use in radiation biodosimetry. At the Center for High-Throughput Minimally Invasive Radiation Biodosimetry, we have developed a fully automated high-throughput system, the RABIT (Rapid Automated Biodosimetry Tool), that can be used to measure γ-H2AX yields from fingerstick-derived samples of blood. The RABIT workstation has been designed to fully automate the γ-H2AX immunocytochemical protocol, from the isolation of human blood lymphocytes in heparin-coated PVC capillaries to the immunolabeling of γ-H2AX protein and image acquisition to determine fluorescence yield. High throughput is achieved through the use of purpose-built robotics, lymphocyte handling in 96-well filter-bottomed plates, and high-speed imaging. The goal of the present study was to optimize and validate the performance of the RABIT system for the reproducible and quantitative detection of γ-H2AX total fluorescence in lymphocytes in a multiwell format. Validation of our biodosimetry platform was achieved by the linear detection of a dose-dependent increase in γ-H2AX fluorescence in peripheral blood samples irradiated ex vivo with γ rays over the range 0 to 8 Gy. This study demonstrates for the first time the optimization and use of our robotically based biodosimetry workstation to successfully quantify γ-H2AX total fluorescence in irradiated peripheral lymphocytes.
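The linear dose response reported above is what makes γ-H2AX useful for biodosimetry. The sketch below fits a straight-line calibration of fluorescence yield versus dose and inverts it to estimate the dose received by an unknown sample; all numerical values are hypothetical.

```python
# Hedged sketch of a gamma-H2AX biodosimetry calibration: fit fluorescence vs
# dose (0-8 Gy) and invert the line to estimate an unknown dose. Values invented.
import numpy as np

doses = np.array([0, 1, 2, 4, 6, 8.0])                 # Gy, ex vivo calibration points
fluor = np.array([12, 30, 49, 88, 128, 165.0])         # mean fluorescence yield per cell

slope, intercept = np.polyfit(doses, fluor, 1)         # linear calibration curve

def estimate_dose(measured_fluorescence):
    return (measured_fluorescence - intercept) / slope

print(f"calibration: fluorescence = {slope:.1f} * dose + {intercept:.1f}")
print(f"unknown sample at fluorescence 100 -> {estimate_dose(100):.2f} Gy")
```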
Content Is King: Databases Preserve the Collective Information of Science.
Yates, John R
2018-04-01
Databases store sequence information experimentally gathered to create resources that further science. In the last 20 years databases have become critical components of fields like proteomics where they provide the basis for large-scale and high-throughput proteomic informatics. Amos Bairoch, winner of the Association of Biomolecular Resource Facilities Frederick Sanger Award, has created some of the important databases proteomic research depends upon for accurate interpretation of data.
High speed micromachining with high power UV laser
NASA Astrophysics Data System (ADS)
Patel, Rajesh S.; Bovatsek, James M.
2013-03-01
Increasing demand for creating fine features with high accuracy in the manufacturing of electronic mobile devices has fueled growth for lasers in manufacturing. High-power, high-repetition-rate ultraviolet (UV) lasers provide an opportunity to implement a cost-effective, high-quality, high-throughput micromachining process in a 24/7 manufacturing environment. The energy available per pulse and the pulse repetition frequency (PRF) of diode-pumped solid-state (DPSS) nanosecond UV lasers have increased steadily over the years. Efficient use of the available energy from a laser is important to generate accurate fine features at high speed with high quality. To achieve maximum material removal and minimal thermal damage for any laser micromachining application, use of the optimal process parameters, including energy density or fluence (J/cm2), pulse width, and repetition rate, is important. In this study we present a new high-power, high-PRF Quasar® 355-40 laser from Spectra-Physics with TimeShift™ technology for unique software-adjustable pulse width, pulse splitting, and pulse shaping capabilities. The benefits of these features for micromachining include improved throughput and quality. A specific example and results of silicon scribing are described to demonstrate the processing benefits of the Quasar's available power, PRF, and TimeShift technology.
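A short worked example of the process parameters mentioned above (fluence, pulse energy, repetition rate) follows; the power, PRF, and spot size are illustrative assumptions rather than Quasar 355-40 specifications.

```python
# Worked example of the fluence/pulse-energy/PRF relationship; numbers are
# illustrative assumptions, not laser specifications.
import math

avg_power_w = 40.0          # average UV power (W), hypothetical
prf_hz = 200e3              # pulse repetition frequency (Hz), hypothetical
spot_diameter_um = 20.0     # focused spot diameter (um), hypothetical

pulse_energy_j = avg_power_w / prf_hz                        # J per pulse
spot_area_cm2 = math.pi * (spot_diameter_um * 1e-4 / 2) ** 2
fluence_j_cm2 = pulse_energy_j / spot_area_cm2

print(f"pulse energy = {pulse_energy_j * 1e6:.0f} uJ")
print(f"fluence      = {fluence_j_cm2:.1f} J/cm^2")
# Halving the PRF at constant average power doubles the pulse energy and the
# fluence, which is the kind of trade-off tuned for clean, fast scribing.
```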
High-throughput Screening Identification of Poliovirus RNA-dependent RNA Polymerase Inhibitors
Campagnola, Grace; Gong, Peng; Peersen, Olve B.
2011-01-01
Viral RNA-dependent RNA polymerase (RdRP) enzymes are essential for the replication of positive-strand RNA viruses and established targets for the development of selective antiviral therapeutics. In this work we have carried out a high-throughput screen of 154,267 compounds to identify poliovirus polymerase inhibitors using a fluorescence based RNA elongation assay. Screening and subsequent validation experiments using kinetic methods and RNA product analysis resulted in the identification of seven inhibitors that affect the RNA binding, initiation, or elongation activity of the polymerase. X-ray crystallography data show clear density for five of the compounds in the active site of the poliovirus polymerase elongation complex. The inhibitors occupy the NTP binding site by stacking on the priming nucleotide and interacting with the templating base, yet competition studies show fairly weak IC50 values in the low μM range. A comparison with nucleotide bound structures suggests that weak binding is likely due to the lack of a triphosphate group on the inhibitors. Consequently, the inhibitors are primarily effective at blocking polymerase initiation and do not effectively compete with NTP binding during processive elongation. These findings are discussed in the context of the polymerase elongation complex structure and allosteric control of the viral RdRP catalytic cycle. PMID:21722674
Chen, Songchang; Liu, Deyuan; Zhang, Junyu; Li, Shuyuan; Zhang, Lanlan; Fan, Jianxia; Luo, Yuqin; Qian, Yeqing; Huang, Hefeng; Liu, Chao; Zhu, Huanhuan; Jiang, Zhengwen; Xu, Chenming
2017-02-01
Chromosomal abnormalities such as aneuploidy have been shown to be responsible for causing spontaneous abortion. Genetic evaluation of abortions is currently underperformed. Screening for aneuploidy in the products of conception can help determine the etiology. We designed a high-throughput ligation-dependent probe amplification (HLPA) assay to examine aneuploidy of 24 chromosomes in miscarriage tissues and aimed to validate the performance of this technique. We carried out aneuploidy screening in 98 fetal tissue samples collected from female subjects with singleton pregnancies who experienced spontaneous abortion. The mean maternal age was 31.6 years (range: 24-43), and the mean gestational age was 10.2 weeks (range: 4.6-14.1). HLPA was performed in parallel with array comparative genomic hybridization, which is the gold standard for aneuploidy detection in clinical practice. The results from the two platforms were compared. Forty-nine out of ninety-eight samples were found to be aneuploid. HLPA showed concordance with array comparative genomic hybridization in diagnosing aneuploidy. High-throughput ligation-dependent probe amplification is a rapid and accurate method for aneuploidy detection. It can be used as a cost-effective screening procedure for clinical spontaneous abortion samples. © 2016 John Wiley & Sons, Ltd.
Yajuan, Xiao; Xin, Liang; Zhiyuan, Li
2012-01-01
The patch clamp technique is commonly used in electrophysiological experiments and offers direct insight into ion channel properties through the characterization of ion channel activity. This technique can be used to elucidate the interaction between a drug and a specific ion channel at different conformational states to understand the mechanisms of ion channel modulators. The patch clamp technique is regarded as the gold standard for ion channel research; however, it suffers from low throughput and high personnel costs. In the last decade, the development of several automated electrophysiology platforms has greatly increased the screening throughput of whole-cell electrophysiological recordings. Recent advances in automated patch clamp systems aim to provide high data quality, high content, and high throughput. However, due to their current limitations, automated patch clamp systems are not capable of replacing manual patch clamp systems in ion channel research. While automated patch clamp systems are useful for screening large numbers of compounds in cell lines that stably express high levels of ion channels, the manual patch clamp technique is still necessary for studying ion channel properties in some research areas and for specific cell types, including primary cells that have mixed cell types and differentiated cells derived from induced pluripotent stem cells (iPSCs) or embryonic stem cells (ESCs). Therefore, further improvements in flexibility with regard to cell types and data quality will broaden the applications of automated patch clamp systems in both academia and industry. PMID:23346269
High-Throughput Incubation and Quantification of Agglutination Assays in a Microfluidic System.
Castro, David; Conchouso, David; Kodzius, Rimantas; Arevalo, Arpys; Foulds, Ian G
2018-06-04
In this paper, we present a two-phase microfluidic system capable of incubating and quantifying microbead-based agglutination assays. The microfluidic system is based on a simple fabrication solution, which requires only laboratory tubing filled with carrier oil, driven by negative pressure using a syringe pump. We provide a user-friendly interface, in which a pipette is used to insert single droplets of a 1.25-µL volume into a system that is continuously running and therefore works entirely on demand, without the need for stopping, resetting or washing the system. These assays are incubated by highly efficient passive mixing with a sample-to-answer time of 2.5 min, a 5-10-fold improvement over traditional agglutination assays. We study system parameters such as channel length, incubation time and flow speed to select optimal assay conditions, using the streptavidin-biotin interaction as a model analyte quantified using optical image processing. We then investigate the effect of changing both the analyte and microbead concentrations, with a minimum detection limit of 100 ng/mL. The system can be both low- and high-throughput, depending on the rate at which assays are inserted. In our experiments, we were able to easily produce throughputs of 360 assays per hour by simple manual pipetting, which could be increased even further by automation and parallelization. Agglutination assays are a versatile tool, capable of detecting an ever-growing catalog of infectious diseases, proteins and metabolites. A system such as this one is a step towards being able to produce high-throughput microfluidic diagnostic solutions with widespread adoption. The development of analytical techniques in the microfluidic format, such as the one presented in this work, is an important step towards continuously monitoring the performance and microfluidic outputs of organ-on-chip devices.
Quantifying Nanoparticle Internalization Using a High Throughput Internalization Assay.
Mann, Sarah K; Czuba, Ewa; Selby, Laura I; Such, Georgina K; Johnston, Angus P R
2016-10-01
The internalization of nanoparticles into cells is critical for effective nanoparticle-mediated drug delivery. To investigate the kinetics and mechanism of internalization of nanoparticles into cells, we have developed a DNA molecular sensor, termed the Specific Hybridization Internalization Probe (SHIP). Self-assembling polymeric 'pHlexi' nanoparticles were functionalized with a Fluorescent Internalization Probe (FIP), and the interactions with two different cell lines (3T3 and CEM cells) were studied. The kinetics of internalization were quantified, and chemical inhibitors of energy-dependent endocytosis (sodium azide), dynamin-dependent endocytosis (Dyngo-4a) and macropinocytosis (5-(N-ethyl-N-isopropyl) amiloride (EIPA)) were used to study the mechanism of internalization. Nanoparticle internalization kinetics were significantly faster in 3T3 cells than in CEM cells. We have shown that ~90% of the nanoparticles associated with 3T3 cells were internalized, compared to only 20% of the nanoparticles associated with CEM cells. Nanoparticle uptake was via a dynamin-dependent pathway, and the nanoparticles were trafficked to lysosomal compartments once internalized. SHIP is able to distinguish nanoparticles associated with the outer cell membrane from nanoparticles that are internalized. This study demonstrates that the assay can be used to probe the kinetics of nanoparticle internalization and the mechanisms by which the nanoparticles are taken up by cells. This information is fundamental for engineering more effective nanoparticle delivery systems. The SHIP assay is a simple, high-throughput technique that could have wide application in therapeutic delivery research.
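As a rough illustration of how an internalized fraction can be derived from such probe measurements, the sketch below assumes a quenching-style readout in which signal from surface-accessible probes is eliminated; the function name, background correction, and example values are hypothetical, not taken from the published SHIP protocol.

```python
# Hedged sketch: estimating the internalized fraction of nanoparticles from
# fluorescence measured before and after removing the surface-accessible signal.
# Assumes only the surface signal is eliminated; all names/values are illustrative.
def internalized_fraction(total_signal, internal_only_signal, background=0.0):
    """Fraction of cell-associated signal that is internalized."""
    total = total_signal - background
    internal = internal_only_signal - background
    if total <= 0:
        raise ValueError("total signal must exceed background")
    return max(0.0, min(1.0, internal / total))

# Example values chosen to mirror the reported trend (~90% vs ~20% internalized):
print(internalized_fraction(1000, 900, background=50))   # 3T3-like, ~0.89
print(internalized_fraction(1000, 240, background=50))   # CEM-like, ~0.20
```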
Pattison, Amanda M; Blomain, Erik S; Merlino, Dante J; Wang, Fang; Crissey, Mary Ann S; Kraft, Crystal L; Rappaport, Jeff A; Snook, Adam E; Lynch, John P; Waldman, Scott A
2016-10-01
Enterotoxigenic Escherichia coli (ETEC) causes ∼20% of the acute infectious diarrhea (AID) episodes worldwide, often by producing heat-stable enterotoxins (STs), which are peptides structurally homologous to paracrine hormones of the intestinal guanylate cyclase C (GUCY2C) receptor. While molecular mechanisms mediating ST-induced intestinal secretion have been defined, advancements in therapeutics have been hampered for decades by the paucity of disease models that integrate molecular and functional endpoints amenable to high-throughput screening. Here, we reveal that mouse and human intestinal enteroids in three-dimensional ex vivo cultures express the components of the GUCY2C secretory signaling axis. ST and its structural analog, linaclotide, an FDA-approved oral secretagog, induced fluid accumulation quantified simultaneously in scores of enteroid lumens, recapitulating ETEC-induced intestinal secretion. Enteroid secretion depended on canonical molecular signaling events responsible for ETEC-induced diarrhea, including cyclic GMP (cGMP) produced by GUCY2C, activation of cGMP-dependent protein kinase (PKG), and opening of the cystic fibrosis transmembrane conductance regulator (CFTR). Importantly, pharmacological inhibition of CFTR abrogated enteroid fluid secretion, providing proof of concept for the utility of this model to screen antidiarrheal agents. Intestinal enteroids offer a unique model, integrating the GUCY2C signaling axis and luminal fluid secretion, to explore the pathophysiology of, and develop platforms for, high-throughput drug screening to identify novel compounds to prevent and treat ETEC diarrheal disease. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Orgovan, Norbert; Peter, Beatrix; Bősze, Szilvia; Ramsden, Jeremy J; Szabó, Bálint; Horvath, Robert
2014-02-07
A novel high-throughput label-free resonant waveguide grating (RWG) imager biosensor, the Epic® BenchTop (BT), was utilized to determine the dependence of cell spreading kinetics on the average surface density (v_RGD) of integrin ligand RGD-motifs. v_RGD was tuned over four orders of magnitude by co-adsorbing the biologically inactive PLL-g-PEG and the RGD-functionalized PLL-g-PEG-RGD synthetic copolymers from their mixed solutions onto the sensor surface. Using highly adherent human cervical tumor (HeLa) cells as a model system, cell adhesion kinetic data of unprecedented quality were obtained. Spreading kinetics were fitted with the logistic equation to obtain the spreading rate constant (r) and the maximum biosensor response (Δλmax), which is assumed to be directly proportional to the maximum spread contact area (Amax). r was found to be independent of the surface density of integrin ligands. In contrast, Δλmax increased with increasing RGD surface density until saturation at high densities. Interpreting the latter behavior with a simple kinetic mass action model, a 2D dissociation constant of 1753 ± 243 μm⁻² (corresponding to a 3D dissociation constant of ~30 μM) was obtained for the binding between RGD-specific integrins embedded in the cell membrane and PLL-g-PEG-RGD. All of these results were obtained completely noninvasively without using any labels.
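A minimal sketch of the logistic fit described above is given here; the synthetic data, parameter names, and use of scipy.optimize.curve_fit are assumptions for illustration, not the authors' analysis code.

```python
# Hedged sketch: fitting a cell-spreading trace with a logistic curve to extract
# the spreading rate constant r and the plateau biosensor response (Delta-lambda_max).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, resp_max, r, t_half):
    """Logistic spreading curve: biosensor response versus time."""
    return resp_max / (1.0 + np.exp(-r * (t - t_half)))

# Synthetic biosensor trace (time in min, response in pm wavelength shift)
t = np.linspace(0, 120, 60)
y = logistic(t, 1800.0, 0.08, 45.0) + np.random.default_rng(0).normal(0, 30, t.size)

popt, _ = curve_fit(logistic, t, y, p0=[1500.0, 0.05, 40.0])
resp_max, r, t_half = popt
print(f"Delta-lambda_max ~ {resp_max:.0f} pm, r ~ {r:.3f} 1/min")
```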
Kang, Yang Jun; Ha, Young-Ran; Lee, Sang-Joon
2016-01-07
Red blood cell (RBC) deformability has been considered a potential biomarker for monitoring pathological disorders. High throughput and detection of subpopulations in RBCs are essential in the measurement of RBC deformability. In this paper, we propose a new method to measure RBC deformability by evaluating temporal variations in the average velocity of blood flow and the image intensity of successively clogged RBCs in a microfluidic channel array over specific time durations. In addition, to effectively detect differences in subpopulations of RBCs, an air compliance effect is employed by adding an air cavity into a disposable syringe. The syringe was equally filled with a blood sample (V_blood = 0.3 mL, hematocrit = 50%) and air (V_air = 0.3 mL). Owing to the air compliance effect, blood flow in the microfluidic device behaved transiently, depending on the fluidic resistance of the microfluidic device. Based on the transient behaviors of blood flow, the deformability of RBCs is quantified by evaluating three representative parameters, namely, the minimum value of the average velocity of blood flow, the clogging index, and the delivered blood volume. The proposed method was applied to measure the deformability of blood samples consisting of homogeneous RBCs fixed with four different concentrations of glutaraldehyde solution (0%-0.23%). The proposed method was also employed to evaluate the deformability of blood samples partially mixed with normal RBCs and hardened RBCs. Thereafter, the deformability of RBCs infected by the human malaria parasite Plasmodium falciparum was measured. As a result, the three parameters varied significantly, depending on the degree of deformability. In addition, the deformability measurement of blood samples was completed in a short time (∼10 min). Therefore, the proposed method has significant potential for high-throughput deformability measurement and precise detection of subpopulations in RBCs for blood samples associated with hematological diseases.
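The three representative parameters are derived from time series of blood velocity and image intensity; the sketch below shows one plausible way to compute them. The clogging-index definition and the delivered-volume integration are assumptions for illustration, not the paper's exact formulas.

```python
# Hedged sketch: deriving deformability indices from a velocity trace U(t) and a
# channel-array image-intensity trace I(t). Definitions are illustrative only.
import numpy as np

def deformability_indices(t_s, velocity_mm_s, intensity, channel_area_mm2):
    u_min = float(np.min(velocity_mm_s))                  # minimum average velocity
    clogging_index = 1.0 - intensity[-1] / intensity[0]   # assumed: relative intensity drop
    delivered_volume_uL = np.trapz(velocity_mm_s, t_s) * channel_area_mm2  # mm^3 == uL
    return u_min, clogging_index, delivered_volume_uL

t = np.linspace(0, 600, 601)                 # 10 min acquisition
u = 2.0 * np.exp(-t / 300) + 0.2             # decaying flow as channels clog
i = 1.0 - 0.4 * (1 - np.exp(-t / 300))       # intensity falls as RBCs accumulate
print(deformability_indices(t, u, i, channel_area_mm2=0.01))
```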
Emerging approaches in predictive toxicology.
Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan
2014-12-01
Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. © 2014 Wiley Periodicals, Inc.
McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam
2011-05-01
The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core applies pharmaceutical industry project-management principles in an academic setting by bringing together multidisciplinary teams to fill critical scientific and technology gaps, using an experienced team of industry-trained researchers and project managers. The KU HTS proactively engages in supporting grant applications for extramural funding, intellectual-property management and technology transfer. The KU HTS staff further provides educational opportunities for the KU faculty and students to learn cutting-edge technologies in drug-discovery platforms through seminars, workshops, internships and course teaching. This is the first instalment of a two-part contribution from the KU HTS laboratory.
High-Throughput Single-Cell RNA Sequencing and Data Analysis.
Sagar; Herman, Josip Stefan; Pospisilik, John Andrew; Grün, Dominic
2018-01-01
Understanding biological systems at single-cell resolution may reveal novel insights that remain masked by conventional population-based techniques, which provide only an average readout of cell behavior. Single-cell transcriptome sequencing holds the potential to identify novel cell types and characterize the cellular composition of any organ or tissue in health and disease. Here, we describe a customized high-throughput protocol for single-cell RNA-sequencing (scRNA-seq) combining flow cytometry and a nanoliter-scale robotic system. Since scRNA-seq requires amplification of a low amount of endogenous cellular RNA, leading to substantial technical noise in the dataset, downstream data filtering and analysis require special care. Therefore, we also briefly describe in-house state-of-the-art data analysis algorithms developed to identify cellular subpopulations, including rare cell types, as well as to derive lineage trees by ordering the identified subpopulations of cells along inferred differentiation trajectories.
Accelerating the design of solar thermal fuel materials through high throughput simulations.
Liu, Yun; Grossman, Jeffrey C
2014-12-10
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
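To make the screening criterion concrete, a hedged sketch is shown below: given ab initio total energies for the ground and metastable isomers, the isomerization enthalpy and gravimetric energy density follow directly. The candidate names, energies, molar masses, and cutoff are illustrative values, not results from the Letter.

```python
# Hedged sketch: ranking candidate solar thermal fuel (STF) molecules by
# isomerization enthalpy and gravimetric energy density. All inputs are illustrative.
HARTREE_TO_KJ_MOL = 2625.5

candidates = {
    # name: (E_ground [Ha], E_metastable [Ha], molar mass [g/mol])
    "cand_A": (-540.1020, -540.0650, 180.2),
    "cand_B": (-385.4410, -385.4155, 120.1),
}

def energy_density(e_ground, e_meta, molar_mass):
    dH = (e_meta - e_ground) * HARTREE_TO_KJ_MOL   # kJ/mol stored per molecule
    return dH, dH / molar_mass * 1000.0            # kJ/kg (gravimetric density)

for name, (eg, em, mw) in candidates.items():
    dH, rho = energy_density(eg, em, mw)
    keep = rho > 300.0   # assumed screening threshold in kJ/kg (placeholder)
    print(f"{name}: dH = {dH:.0f} kJ/mol, {rho:.0f} kJ/kg, pass = {keep}")
```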
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (Rdark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of Rdark is essential for agronomic and ecological studies. However, the methods currently used to measure Rdark in plant tissues are typically low-throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry to determine the accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of Rdark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of Rdark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant Rdark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute Rdark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided similar values of Rdark to the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of Rdark on multiple samples simultaneously, irrespective of plant or tissue type.
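As an illustration of how an O2 consumption rate is typically extracted from such fluorophore data, the sketch below fits a linear slope to a calibrated O2 trace and normalizes by tissue mass; the calibration, units, and example numbers are assumptions, not the protocol's specification.

```python
# Hedged sketch: estimating dark respiration (Rdark) from an O2-depletion trace
# recorded in a sealed well. Calibration, units and values are illustrative.
import numpy as np

def r_dark(time_min, o2_nmol_per_well, tissue_mg):
    """Return Rdark in nmol O2 min^-1 g^-1 (fresh mass), from a linear fit."""
    slope, _ = np.polyfit(time_min, o2_nmol_per_well, 1)   # nmol/min (negative slope)
    return -slope / (tissue_mg / 1000.0)

t = np.arange(0, 60, 2.0)                                  # 60 min measurement
o2 = 500.0 - 1.2 * t + np.random.default_rng(1).normal(0, 2, t.size)
print(f"Rdark ~ {r_dark(t, o2, tissue_mg=25.0):.0f} nmol O2 min^-1 g^-1")
```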
Cryo-planing of frozen-hydrated samples using cryo triple ion gun milling (CryoTIGM™).
Chang, Irene Y T; Joester, Derk
2015-12-01
Cryo-SEM is a high throughput technique for imaging biological ultrastructure in its most pristine state, i.e. without chemical fixation, embedding, or drying. Freeze fracture is routinely used to prepare internal surfaces for cryo-SEM imaging. However, the propagation of the fracture plane is highly dependent on sample properties, and the resulting surface frequently shows substantial topography, which can complicate image analysis and interpretation. We have developed a broad ion beam milling technique, called cryogenic triple ion gun milling (CryoTIGM™ ['krī-ə-,tīm]), for cryo-planing frozen-hydrated biological specimens. Comparing sample preparation by CryoTIGM™ and freeze fracture in three model systems, Baker's yeast, mouse liver tissue, and whole sea urchin embryos, we find that CryoTIGM™ yields very large (∼700,000 μm²) and smooth sections that present ultrastructural details at similar or better quality than freeze-fractured samples. A particular strength of CryoTIGM™ is the ability to section samples with hard-soft contrast, such as brittle calcite (CaCO3) spicules in the sea urchin embryo. Copyright © 2015 Elsevier Inc. All rights reserved.
The Current State of Drug Discovery and a Potential Role for NMR Metabolomics
2015-01-01
The pharmaceutical industry has contributed significantly to improving human health. Drugs have been credited with both increasing life expectancy and decreasing health care costs. Unfortunately, there has been a recent decline in the creativity and productivity of the pharmaceutical industry. This is a complex issue with many contributing factors resulting from numerous mergers, an increase in out-sourcing, and a heavy dependency on high-throughput screening (HTS). While a simple solution to such a complex problem is unrealistic and highly unlikely, the inclusion of metabolomics as a routine component of the drug discovery process may provide some solutions to these problems. Specifically, as the binding affinity of a chemical lead is evolved during the iterative structure-based drug design process, metabolomics can provide feedback on the selectivity and the in vivo mechanism of action. Similarly, metabolomics can be used to evaluate and validate HTS leads. In effect, metabolomics can be used to eliminate compounds with potential efficacy and side-effect problems while prioritizing well-behaved leads with druglike characteristics. PMID:24588729
Pilling, Michael J; Henderson, Alex; Bird, Benjamin; Brown, Mick D; Clarke, Noel W; Gardner, Peter
2016-06-23
Infrared microscopy has become one of the key techniques in the biomedical research field for interrogating tissue. In partnership with multivariate analysis and machine learning techniques, it has become widely accepted as a method that can distinguish between normal and cancerous tissue with both high sensitivity and high specificity. While spectral histopathology (SHP) is highly promising for improved clinical diagnosis, several practical barriers currently exist which need to be addressed before successful implementation in the clinic. Sample throughput and speed of acquisition are key barriers, given the high volume of samples awaiting histopathological examination. FTIR chemical imaging utilising FPA technology is currently state-of-the-art for infrared chemical imaging, and recent advances in the technology have dramatically reduced acquisition times. Despite this, infrared microscopy measurements on a tissue microarray (TMA), often encompassing several million spectra, take several hours to acquire. The problem lies with the vast quantities of data that FTIR collects; each pixel in a chemical image is derived from a full infrared spectrum, itself composed of thousands of individual data points. Furthermore, data management is quickly becoming a barrier to clinical translation and poses the question of how to store these incessantly growing data sets. Recently, doubts have been raised as to whether the full spectral range is actually required for accurate disease diagnosis using SHP. These studies suggest that once spectral biomarkers have been predetermined, it may be possible to diagnose disease based on a limited number of discrete spectral features. In this study, we explore the possibility of utilising discrete frequency chemical imaging for acquiring high-throughput, high-resolution chemical images. Utilising a quantum cascade laser imaging microscope with discrete frequency collection at key diagnostic wavelengths, we demonstrate that we can diagnose prostate cancer with high sensitivity and specificity. Finally, we extend the study to a large patient dataset utilising tissue microarrays and show that high sensitivity and specificity can be achieved using high-throughput, rapid data collection, thereby paving the way for practical implementation in the clinic.
Karas, Vlad O; Sinnott-Armstrong, Nicholas A; Varghese, Vici; Shafer, Robert W; Greenleaf, William J; Sherlock, Gavin
2018-01-01
Much of the within-species genetic variation is in the form of single nucleotide polymorphisms (SNPs), typically detected by whole genome sequencing (WGS) or microarray-based technologies. However, WGS produces mostly uninformative reads that perfectly match the reference, while microarrays require genome-specific reagents. We have developed Diff-seq, a sequencing-based mismatch detection assay for SNP discovery without the requirement for specialized nucleic-acid reagents. Diff-seq leverages the Surveyor endonuclease to cleave mismatched DNA molecules that are generated after cross-annealing of a complex pool of DNA fragments. Sequencing libraries enriched for Surveyor-cleaved molecules result in increased coverage at the variant sites. Diff-seq detected all mismatches present in an initial test substrate, with specific enrichment dependent on the identity and context of the variation. Application to viral sequences resulted in increased observation of variant alleles in a biologically relevant context. Diff-seq has the potential to increase the sensitivity and efficiency of high-throughput sequencing in the detection of variation. PMID:29361139
NASA Astrophysics Data System (ADS)
Xu, Shicai; Zhan, Jian; Man, Baoyuan; Jiang, Shouzhen; Yue, Weiwei; Gao, Shoubao; Guo, Chengang; Liu, Hanping; Li, Zhenhua; Wang, Jihua; Zhou, Yaoqi
2017-03-01
Reliable determination of the binding kinetics and affinity of DNA hybridization and single-base mismatches plays an essential role in systems biology and personalized and precision medicine. The standard tools are optical sensors that are difficult to operate at low cost and to miniaturize for high-throughput measurement. Biosensors based on nanowire field-effect transistors have been developed, but reliable and cost-effective fabrication remains a challenge. Here, we demonstrate that a graphene single-crystal domain patterned into multiple channels can measure time- and concentration-dependent DNA hybridization kinetics and affinity reliably and sensitively, with a detection limit of 10 pM for DNA. It can distinguish single-base mutations quantitatively in real time. An analytical model is developed to estimate probe density, efficiency of hybridization and the maximum sensor response. The results suggest a promising future for cost-effective, high-throughput screening of drug candidates, genetic variations and disease biomarkers by using an integrated, miniaturized, all-electrical multiplexed, graphene-based DNA array.
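The time- and concentration-dependent hybridization described above is commonly modeled with first-order Langmuir kinetics; the sketch below is such a generic model, with rate constants chosen for illustration rather than taken from the analytical model in the paper.

```python
# Hedged sketch: generic Langmuir hybridization kinetics for a surface DNA sensor.
# theta(t) is the fraction of probes hybridized at target concentration C.
import numpy as np

def hybridization_fraction(t_s, conc_M, k_on=1e5, k_off=1e-4):
    """First-order Langmuir model; k_on [1/(M*s)] and k_off [1/s] are illustrative."""
    k_obs = k_on * conc_M + k_off
    theta_eq = (k_on * conc_M) / k_obs
    return theta_eq * (1.0 - np.exp(-k_obs * t_s))

for c in (10e-12, 1e-9, 100e-9):   # 10 pM to 100 nM target DNA
    print(f"{c:.0e} M -> theta(1 h) = {hybridization_fraction(3600, c):.3f}")
```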
Quantifying domain-ligand affinities and specificities by high-throughput holdup assay
Vincentelli, Renaud; Luck, Katja; Poirson, Juline; Polanowska, Jolanta; Abdat, Julie; Blémont, Marilyne; Turchetto, Jeremy; Iv, François; Ricquier, Kevin; Straub, Marie-Laure; Forster, Anne; Cassonnet, Patricia; Borg, Jean-Paul; Jacob, Yves; Masson, Murielle; Nominé, Yves; Reboul, Jérôme; Wolff, Nicolas; Charbonnier, Sebastian; Travé, Gilles
2015-01-01
Many protein interactions are mediated by small linear motifs interacting specifically with defined families of globular domains. Quantifying the specificity of a motif requires measuring and comparing its binding affinities to all its putative target domains. To this aim, we developed the high-throughput holdup assay, a chromatographic approach that can measure up to a thousand domain-motif equilibrium binding affinities per day. Extracts of overexpressed domains are incubated with peptide-coated resins and subjected to filtration. Binding affinities are deduced from microfluidic capillary electrophoresis of flow-throughs. After benchmarking the approach on 210 PDZ-peptide pairs with known affinities, we determined the affinities of two viral PDZ-binding motifs derived from Human Papillomavirus E6 oncoproteins for 209 PDZ domains covering 79% of the human PDZome. We obtained exquisite sequence-dependent binding profiles, describing quantitatively the PDZome recognition specificity of each motif. This approach, applicable to many categories of domain-ligand interactions, has a wide potential for quantifying the specificities of interactomes. PMID:26053890
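To illustrate how an equilibrium affinity can be recovered from depletion of a domain in the flow-through, here is a hedged sketch of the underlying algebra; the "binding intensity" definition and the conversion to Kd assume the peptide is in large excess, and are a simplification rather than the published analysis.

```python
# Hedged sketch: holdup-style affinity estimate from flow-through depletion.
# BI = fraction of domain retained on the peptide resin; with peptide in large
# excess at concentration L, Kd ~ L * (1 - BI) / BI. Simplified illustration only.
def holdup_kd(intensity_reference, intensity_flowthrough, peptide_conc_uM):
    bi = 1.0 - intensity_flowthrough / intensity_reference   # binding intensity
    if bi <= 0:
        return float("inf")                                  # no detectable binding
    return peptide_conc_uM * (1.0 - bi) / bi                 # apparent Kd in uM

# Example: 60% depletion of the domain at 10 uM immobilized peptide
print(f"BI = {1 - 0.4:.2f}, Kd ~ {holdup_kd(100.0, 40.0, 10.0):.1f} uM")
```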
FLIC: High-Throughput, Continuous Analysis of Feeding Behaviors in Drosophila
Pletcher, Scott D.
2014-01-01
We present a complete hardware and software system for collecting and quantifying continuous measures of feeding behaviors in the fruit fly, Drosophila melanogaster. The FLIC (Fly Liquid-Food Interaction Counter) detects analog electronic signals as brief as 50 µs that occur when a fly makes physical contact with liquid food. Signal characteristics effectively distinguish between different types of behaviors, such as feeding and tasting events. The FLIC system performs as well or better than popular methods for simple assays, and it provides an unprecedented opportunity to study novel components of feeding behavior, such as time-dependent changes in food preference and individual levels of motivation and hunger. Furthermore, FLIC experiments can persist indefinitely without disturbance, and we highlight this ability by establishing a detailed picture of circadian feeding behaviors in the fly. We believe that the FLIC system will work hand-in-hand with modern molecular techniques to facilitate mechanistic studies of feeding behaviors in Drosophila using modern, high-throughput technologies. PMID:24978054
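A simplified sketch of how brief analog contact signals might be segmented into tasting versus feeding events is shown below; the threshold and duration cutoffs are hypothetical, not the published FLIC classification parameters.

```python
# Hedged sketch: classifying fly-food contact events from an analog signal trace.
# Threshold and duration values are illustrative, not the FLIC system's defaults.
def classify_events(signal, sample_dt_s, contact_threshold=100.0, feed_min_s=1.0):
    events, start = [], None
    for i, v in enumerate(signal + [0.0]):          # sentinel closes a trailing event
        if v >= contact_threshold and start is None:
            start = i
        elif v < contact_threshold and start is not None:
            duration = (i - start) * sample_dt_s
            events.append(("feeding" if duration >= feed_min_s else "tasting", duration))
            start = None
    return events

sig = [0, 0, 150, 160, 0, 0, 120, 130, 140, 150, 160, 170, 0]
print(classify_events(sig, sample_dt_s=0.2))   # brief tasting vs sustained feeding
```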
A real-time high-throughput fluorescence assay for sphingosine kinases
Lima, Santiago; Milstien, Sheldon; Spiegel, Sarah
2014-01-01
Sphingosine kinases (SphKs), of which there are two isoforms, SphK1 and SphK2, have been implicated in regulation of many important cellular processes. We have developed an assay for monitoring SphK1 and SphK2 activity in real time without the need for organic partitioning of products, radioactive materials, or specialized equipment. The assay conveniently follows SphK-dependent changes in 7-nitro-2-1,3-benzoxadiazol-4-yl (NBD)-labeled sphingosine (Sph) fluorescence and can be easily performed in 384-well plate format with small reaction volumes. We present data showing dose-proportional responses to enzyme, substrate, and inhibitor concentrations. The SphK1 and SphK2 binding affinities for NBD-Sph and the IC50 values of inhibitors determined were consistent with those reported with other methods. Because of the versatility and simplicity of the assay, it should facilitate the routine characterization of inhibitors and SphK mutants and can be readily used for compound library screening in high-throughput format. PMID:24792926
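For context on how IC50 values can be extracted from such dose-response data, the sketch below interpolates the half-maximal concentration on a log scale from a normalized inhibition series; the concentrations and activities are invented for illustration and do not come from the assay described above.

```python
# Hedged sketch: log-linear interpolation of an IC50 from a normalized
# dose-response series. All numbers are illustrative.
import numpy as np

def ic50(conc_uM, pct_activity):
    """Interpolate the concentration giving 50% activity on a log scale."""
    logc = np.log10(conc_uM)
    # np.interp needs ascending x-values, so reverse the descending activity series
    return 10 ** np.interp(50.0, pct_activity[::-1], logc[::-1])

conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])       # inhibitor concentration, uM
activity = np.array([98.0, 90.0, 62.0, 20.0, 5.0])   # % of uninhibited SphK rate
print(f"IC50 ~ {ic50(conc, activity):.2f} uM")
```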
Asati, Atul; Kachurina, Olga; Kachurin, Anatoly
2012-01-01
Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bio-Plex/Luminex platform. In this report, we demonstrate that the ganglioside high throughput multiplexing tool is robust, highly specific and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from the H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided the added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for assaying both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders. PMID:22952605
High-throughput sample adaptive offset hardware architecture for high-efficiency video coding
NASA Astrophysics Data System (ADS)
Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin
2018-03-01
A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for the rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
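For reference, the SAO mode decision weighs the distortion change against the bits needed to signal offsets via a rate-distortion cost; the sketch below uses the widely cited fast distortion-change estimate, with a simplified stand-in for the bitrate term rather than the estimator proposed in the paper.

```python
# Hedged sketch: rate-distortion (RD) cost for one SAO category/offset.
# Uses the common fast distortion-change estimate
#   delta_D(o) = N*o^2 - 2*o*E, where E = sum(orig - recon) and N = pixel count,
# plus a placeholder bitrate term R(o).
def sao_rd_cost(offset, n_pixels, sum_diff, lambda_rd):
    delta_distortion = n_pixels * offset * offset - 2 * offset * sum_diff
    rate_bits = abs(offset) + 1          # placeholder bitrate estimate, not the paper's
    return delta_distortion + lambda_rd * rate_bits

# Pick the best offset in [-7, 7] for a category with N = 64 pixels and E = 180:
best = min(range(-7, 8), key=lambda o: sao_rd_cost(o, 64, 180, lambda_rd=20.0))
print("best offset:", best)
```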
Kronewitter, Scott R; An, Hyun Joo; de Leoz, Maria Lorna; Lebrilla, Carlito B; Miyamoto, Suzanne; Leiserowitz, Gary S
2009-06-01
Annotation of the human serum N-linked glycome is a formidable challenge but is necessary for disease marker discovery. A new theoretical glycan library was constructed and proposed to provide all possible glycan compositions in serum. It was developed based on established glycobiology and retrosynthetic state-transition networks. We find that at least 331 compositions are possible in the serum N-linked glycome. By pairing the theoretical glycan mass library with a high mass accuracy and high-resolution MS, human serum glycans were effectively profiled. Correct isotopic envelope deconvolution to monoisotopic masses and the high mass accuracy instruments drastically reduced the amount of false composition assignments. The high throughput capacity enabled by this library permitted the rapid glycan profiling of large control populations. With the use of the library, a human serum glycan mass profile was developed from 46 healthy individuals. This paper presents a theoretical N-linked glycan mass library that was used for accurate high-throughput human serum glycan profiling. Rapid methods for evaluating a patient's glycome are instrumental for studying glycan-based markers.
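A hedged sketch of how such a composition library supports profiling is shown below: theoretical monoisotopic masses are computed from residue counts and matched to deconvoluted observed masses within a ppm tolerance. The residue masses are standard monoisotopic values; the library contents and tolerance are illustrative, not the published library.

```python
# Hedged sketch: matching deconvoluted monoisotopic glycan masses against a
# theoretical composition library within a ppm tolerance. Illustrative only.
RESIDUE_MASS = {"Hex": 162.05282, "HexNAc": 203.07937,
                "Fuc": 146.05791, "NeuAc": 291.09542}
WATER = 18.01056

def composition_mass(comp):
    """Neutral monoisotopic mass of a composition, e.g. {'Hex': 5, 'HexNAc': 4}."""
    return WATER + sum(RESIDUE_MASS[k] * n for k, n in comp.items())

def match(observed_mass, library, tol_ppm=10.0):
    hits = []
    for name, comp in library.items():
        theo = composition_mass(comp)
        if abs(observed_mass - theo) / theo * 1e6 <= tol_ppm:
            hits.append((name, round(theo, 4)))
    return hits

library = {"Hex5HexNAc4": {"Hex": 5, "HexNAc": 4},
           "Hex5HexNAc4NeuAc1": {"Hex": 5, "HexNAc": 4, "NeuAc": 1}}
print(match(1640.5869, library))   # matches the biantennary composition Hex5HexNAc4
```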
An overview of the Nuclear Electric Xenon Ion System (NEXIS) program
NASA Technical Reports Server (NTRS)
Polk, Jay E.; Goebel, Don; Brophy, John R.; Beatty, John; Monheiser, J.; Giles, D.; Hobson, D.; Wilson, F.; Christensen, J.; De Pano, M.;
2003-01-01
NASA is investigating high power, high specific impulse propulsion technologies that could enable ambitious flights such as multi-body rendezvous missions, outer planet orbiters and interstellar precursor missions. The requirements for these missions are much more demanding than those for state-of-the-art solar-powered ion propulsion applications. The purpose of the NEXIS program is to develop advanced ion thruster technologies that satisfy the requirements for high power, high specific impulse operation, high efficiency and long thruster life. The nominal design point for the NEXIS thruster is 20 kWe at a specific impulse of 7500 s with an efficiency over 78% and a xenon throughput capability of greater than 2000 kg. These performance and throughput goals will be achieved by applying a combination of advanced technologies including a large discharge chamber, erosion resistant carbon-carbon grids, an advanced reservoir hollow cathode and techniques for increasing propellant efficiency such as grid masking and accelerator grid aperture diameter tailoring. This paper provides an overview of the challenges associated with these requirements and how they are being addressed in the NEXIS program.
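For orientation, the nominal design point implies a thrust on the order of half a newton under the standard ideal relation between thrust, input power, efficiency, and specific impulse; this back-of-envelope figure is an inference from the quoted numbers (assuming the 78% figure is total thruster efficiency), not a value stated in the overview.

```latex
% Back-of-envelope thrust from the quoted NEXIS design point, assuming the
% standard relation and that 78% refers to total thruster efficiency.
\[
  T \;=\; \frac{2\,\eta\,P_{\mathrm{in}}}{g_0\, I_{\mathrm{sp}}}
    \;=\; \frac{2 \times 0.78 \times 20\,000\ \mathrm{W}}
               {9.81\ \mathrm{m\,s^{-2}} \times 7500\ \mathrm{s}}
    \;\approx\; 0.42\ \mathrm{N}.
\]
```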
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and the loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework towards developing an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.
Model Selection in Systems Biology Depends on Experimental Design
Silk, Daniel; Kirk, Paul D. W.; Barnes, Chris P.; Toni, Tina; Stumpf, Michael P. H.
2014-01-01
Experimental design attempts to maximise the information available for modelling tasks. An optimal experiment allows the inferred models or parameters to be chosen with the highest expected degree of confidence. If the true system is faithfully reproduced by one of the models, the merit of this approach is clear - we simply wish to identify it and the true parameters with the most certainty. However, in the more realistic situation where all models are incorrect or incomplete, the interpretation of model selection outcomes and the role of experimental design needs to be examined more carefully. Using a novel experimental design and model selection framework for stochastic state-space models, we perform high-throughput in-silico analyses on families of gene regulatory cascade models, to show that the selected model can depend on the experiment performed. We observe that experimental design thus makes confidence a criterion for model choice, but that this does not necessarily correlate with a model's predictive power or correctness. Finally, in the special case of linear ordinary differential equation (ODE) models, we explore how wrong a model has to be before it influences the conclusions of a model selection analysis. PMID:24922483
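As a minimal, generic illustration of criterion-based model choice (a simpler stand-in for the Bayesian design and selection framework used in the paper), the sketch below compares two fitted models by AIC; the residual sums of squares, parameter counts, and model names are invented.

```python
# Hedged sketch: comparing two candidate models by AIC from least-squares fits.
# This is a generic stand-in, not the Bayesian framework used in the study.
import math

def aic_from_rss(rss, n_points, n_params):
    """AIC for a Gaussian least-squares fit: n*ln(RSS/n) + 2k."""
    return n_points * math.log(rss / n_points) + 2 * n_params

n = 40                                  # invented number of observations
models = {"cascade_2step": (12.3, 4),   # (RSS, number of parameters)
          "cascade_3step": (10.9, 6)}
scores = {m: aic_from_rss(rss, n, k) for m, (rss, k) in models.items()}
print(min(scores, key=scores.get), scores)
```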
Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi
2016-09-01
Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.
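To give a flavor of cross-run retention-time alignment, the sketch below fits a linear mapping between two runs from shared anchor peptides and transfers a peak time from one run to the other; TRIC itself also corrects nonlinear chromatographic effects, so this linear mapping is a simplified, hypothetical illustration with invented retention times.

```python
# Hedged sketch: linear retention-time (RT) alignment between two LC-MS/MS runs
# using anchor peptides identified confidently in both. Illustration only;
# the actual tool additionally handles nonlinear effects.
import numpy as np

anchors_run_a = np.array([310.0, 845.0, 1502.0, 2210.0, 2975.0])   # RT in run A (s)
anchors_run_b = np.array([298.0, 861.0, 1534.0, 2261.0, 3052.0])   # same peptides, run B

slope, intercept = np.polyfit(anchors_run_a, anchors_run_b, 1)

def map_rt_a_to_b(rt_a):
    """Predict where a peak seen at rt_a in run A should elute in run B."""
    return slope * rt_a + intercept

print(f"RT 1800 s in run A -> ~{map_rt_a_to_b(1800.0):.0f} s in run B")
```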
NASA Astrophysics Data System (ADS)
Bae, Euiwon; Patsekin, Valery; Rajwa, Bartek; Bhunia, Arun K.; Holdman, Cheryl; Davisson, V. Jo; Hirleman, E. Daniel; Robinson, J. Paul
2012-04-01
A microbial high-throughput screening (HTS) system was developed that enables high-speed combinatorial studies directly on bacterial colonies. The system consists of a forward scatterometer for elastic light scatter (ELS) detection, a plate transporter for sample handling, and a robotic incubator for automatic incubation. To minimize the ELS pattern-capturing time, a new calibration plate and correction algorithms were designed, which dramatically reduced the correction steps during acquisition of the circularly symmetric ELS patterns. Integration of three different control software programs was implemented, and the performance of the system was demonstrated with single-species detection for library generation and with time-resolved measurements for understanding the correlation between ELS patterns and colony growth, using Escherichia coli and Listeria. An in-house colony-tracking module enabled researchers to easily follow the time-dependent variation of the ELS pattern from the same colony, enabling further analysis in other biochemical experiments. The microbial HTS system provided an average scan time of 4.9 s per colony and the capability of automatically collecting more than 4000 ELS patterns within a 7-h time span.
Czerniecki, Stefan M; Cruz, Nelly M; Harder, Jennifer L; Menon, Rajasree; Annis, James; Otto, Edgar A; Gulieva, Ramila E; Islas, Laura V; Kim, Yong Kyun; Tran, Linh M; Martins, Timothy J; Pippin, Jeffrey W; Fu, Hongxia; Kretzler, Matthias; Shankland, Stuart J; Himmelfarb, Jonathan; Moon, Randall T; Paragas, Neal; Freedman, Benjamin S
2018-05-15
Organoids derived from human pluripotent stem cells are a potentially powerful tool for high-throughput screening (HTS), but the complexity of organoid cultures poses a significant challenge for miniaturization and automation. Here, we present a fully automated, HTS-compatible platform for enhanced differentiation and phenotyping of human kidney organoids. The entire 21-day protocol, from plating to differentiation to analysis, can be performed automatically by liquid-handling robots, or alternatively by manual pipetting. High-content imaging analysis reveals both dose-dependent and threshold effects during organoid differentiation. Immunofluorescence and single-cell RNA sequencing identify previously undetected parietal, interstitial, and partially differentiated compartments within organoids and define conditions that greatly expand the vascular endothelium. Chemical modulation of toxicity and disease phenotypes can be quantified for safety and efficacy prediction. Screening in gene-edited organoids in this system reveals an unexpected role for myosin in polycystic kidney disease. Organoids in HTS formats thus establish an attractive platform for multidimensional phenotypic screening. Copyright © 2018 Elsevier Inc. All rights reserved.
The draft genome sequence of cork oak
Ramos, António Marcos; Usié, Ana; Barbosa, Pedro; Barros, Pedro M.; Capote, Tiago; Chaves, Inês; Simões, Fernanda; Abreu, Isabl; Carrasquinho, Isabel; Faro, Carlos; Guimarães, Joana B.; Mendonça, Diogo; Nóbrega, Filomena; Rodrigues, Leandra; Saibo, Nelson J. M.; Varela, Maria Carolina; Egas, Conceição; Matos, José; Miguel, Célia M.; Oliveira, M. Margarida; Ricardo, Cândido P.; Gonçalves, Sónia
2018-01-01
Cork oak (Quercus suber) is native to southwest Europe and northwest Africa where it plays a crucial environmental and economical role. To tackle the cork oak production and industrial challenges, advanced research is imperative but dependent on the availability of a sequenced genome. To address this, we produced the first draft version of the cork oak genome. We followed a de novo assembly strategy based on high-throughput sequence data, which generated a draft genome comprising 23,347 scaffolds and 953.3 Mb in size. A total of 79,752 genes and 83,814 transcripts were predicted, including 33,658 high-confidence genes. An InterPro signature assignment was detected for 69,218 transcripts, which represented 82.6% of the total. Validation studies demonstrated the genome assembly and annotation completeness and highlighted the usefulness of the draft genome for read mapping of high-throughput sequence data generated using different protocols. All data generated is available through the public databases where it was deposited, being therefore ready to use by the academic and industry communities working on cork oak and/or related species. PMID:29786699
Fluorescence High-Throughput Screening for Inhibitors of TonB Action.
Nairn, Brittany L; Eliasson, Olivia S; Hyder, Dallas R; Long, Noah J; Majumdar, Aritri; Chakravorty, Somnath; McDonald, Peter; Roy, Anuradha; Newton, Salete M; Klebba, Phillip E
2017-05-15
Gram-negative bacteria acquire ferric siderophores through TonB-dependent outer membrane transporters (TBDT). By fluorescence spectroscopic high-throughput screening (FLHTS), we identified inhibitors of TonB-dependent ferric enterobactin (FeEnt) uptake through Escherichia coli FepA (EcoFepA). Among 165 inhibitors found in a primary screen of 17,441 compounds, we evaluated 20 in secondary tests: TonB-dependent ferric siderophore uptake, colicin killing, and proton motive force-dependent lactose transport. Six of 20 primary hits inhibited TonB-dependent activity in all tests. Comparison of their effects on [59Fe]Ent and [14C]lactose accumulation suggested several as proton ionophores, but two chemicals, ebselen and ST0082990, are likely not proton ionophores and may inhibit TonB-ExbBD. The facility of FLHTS against E. coli led us to adapt it to Acinetobacter baumannii. We identified its FepA ortholog (AbaFepA), deleted and cloned its structural gene, genetically engineered 8 Cys substitutions in its surface loops, labeled them with fluorescein, and made fluorescence spectroscopic observations of FeEnt uptake in A. baumannii. Several Cys substitutions in AbaFepA (S279C, T562C, and S665C) were readily fluoresceinated and thus suitable as sensors of FeEnt transport. As in E. coli, the test monitored TonB-dependent FeEnt uptake by AbaFepA. In microtiter format with A. baumannii, FLHTS produced Z' factors of 0.6 to 0.8. These data validated the FLHTS strategy against even distantly related Gram-negative bacterial pathogens. Overall, it discovered agents that block TonB-dependent transport and showed the potential to find compounds that act against Gram-negative CRE (carbapenem-resistant Enterobacteriaceae)/ESKAPE (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species) pathogens. Our results suggest that hundreds of such chemicals may exist in larger compound libraries. IMPORTANCE: Antibiotic resistance in Gram-negative bacteria has spurred efforts to find novel compounds against new targets. The CRE/ESKAPE pathogens are resistant bacteria that include Acinetobacter baumannii, a common cause of ventilator-associated pneumonia and sepsis. We performed fluorescence high-throughput screening (FLHTS) against Escherichia coli to find inhibitors of TonB-dependent iron transport, tested them against A. baumannii, and then adapted the FLHTS technology to allow direct screening against A. baumannii. This methodology is expandable to other drug-resistant Gram-negative pathogens. Compounds that block TonB action may interfere with iron acquisition from eukaryotic hosts and thereby constitute bacteriostatic antibiotics that prevent microbial colonization of humans and animals. The FLHTS method may identify both species-specific and broad-spectrum agents against Gram-negative bacteria. Copyright © 2017 American Society for Microbiology.
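For reference, the Z' factor quoted above is the standard assay-quality statistic computed from positive- and negative-control distributions; the sketch below implements that textbook formula on invented control values.

```python
# Hedged sketch: Z'-factor for assay quality,
#   Z' = 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|.
# Control values are invented; 0.5 < Z' <= 1 is generally considered excellent.
import statistics as st

def z_prime(positive_controls, negative_controls):
    mu_p, mu_n = st.mean(positive_controls), st.mean(negative_controls)
    sd_p, sd_n = st.stdev(positive_controls), st.stdev(negative_controls)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)

pos = [980, 1010, 995, 1005, 990]   # e.g., uninhibited fluorescence signal
neg = [120, 130, 125, 118, 127]
print(f"Z' = {z_prime(pos, neg):.2f}")
```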
Altered pharmacology of native rodent spinal cord TRPV1 after phosphorylation
Mogg, AJ; Mill, CEJ; Folly, EA; Beattie, RE; Blanco, MJ; Beck, JP; Broad, LM
2013-01-01
Background and Purpose: Evidence suggests that phosphorylation of TRPV1 is an important component underlying its aberrant activation in pathological pain states. To date, the detailed pharmacology of diverse TRPV1 receptor agonists and antagonists has yet to be reported for native TRPV1 under phosphorylating conditions. Our goal was to optimize a relatively high-throughput methodology to allow pharmacological characterization of the native TRPV1 receptor using a spinal cord neuropeptide release assay under naive and phosphorylating states. Experimental Approach: Herein, we describe characterization of rodent TRPV1 by measurement of CGRP release from acutely isolated lumbar (L1-L6) spinal cord using a 96-well technique that combines the use of native, adult tissue with quantitation of CGRP release by ELISA. Key Results: We have studied a diverse panel of TRPV1 agonists and antagonists under basal and phosphorylating conditions. We show that TRPV1-mediated CGRP release is evoked, in a temperature-dependent manner, by a PKC activator, phorbol 12,13-dibutyrate (PDBu), and that treatment with PDBu increases the potency and efficacy of known TRPV1 chemical agonists in an agonist-specific manner. We also show that the pharmacological profile of diverse TRPV1 antagonists depends on whether the stimulus is PDBu or capsaicin. Of note, HPPB was identified as an antagonist of capsaicin-evoked, but a potentiator of PDBu-evoked, CGRP release. Conclusions and Implications: Our findings indicate that both TRPV1 agonist and antagonist profiles can be differentially altered by PKC activation. These findings may offer new insights for targeting TRPV1 in pain states. PMID:23062150
Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo
2008-01-23
To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.
NASA Astrophysics Data System (ADS)
Zhang, Xiao-Yong; Wang, Guang-Hua; Xu, Xin-Ya; Nong, Xu-Hua; Wang, Jie; Amin, Muhammad; Qi, Shu-Hua
2016-10-01
The present study investigated the fungal diversity in four different deep-sea sediments from the Okinawa Trough using high-throughput Illumina sequencing of the nuclear ribosomal internal transcribed spacer-1 (ITS1). A total of 40,297 fungal ITS1 sequences clustered into 420 operational taxonomic units (OTUs) at 97% sequence similarity, and 170 taxa were recovered from these sediments. Most ITS1 sequences (78%) belonged to the phylum Ascomycota, followed by Basidiomycota (17.3%), Zygomycota (1.5%) and Chytridiomycota (0.8%), and a small proportion (2.4%) belonged to unassigned fungal phyla. Compared with previous studies of fungal diversity in sediments from deep-sea environments by culture-dependent approaches and clone library analysis, the present results suggest that Illumina sequencing dramatically accelerates the discovery of the fungal communities of deep-sea sediments. Furthermore, our results revealed that Sordariomycetes was the most diverse and abundant fungal class in this study, challenging the traditional view that the diversity of Sordariomycetes phylotypes is low in deep-sea environments. In addition, more than 12 taxa, accounting for 21.5% of sequences, were found to be rarely reported as deep-sea fungi, suggesting that the deep-sea sediments from the Okinawa Trough harbor a plethora of fungal communities different from those of other deep-sea environments. To our knowledge, this study is the first exploration of the fungal diversity in deep-sea sediments from the Okinawa Trough using high-throughput Illumina sequencing.
Fu, Jiaqi; Fernandez, Daniel; Ferrer, Marc; Titus, Steven A; Buehler, Eugen; Lal-Nag, Madhu A
2017-06-01
The widespread use of two-dimensional (2D) monolayer cultures for high-throughput screening (HTS) to identify targets in drug discovery has led to attrition in the number of drug targets being validated. Solid tumors are complex, aberrantly growing microenvironments that harness structural components from stroma, nutrients fed through vasculature, and immunosuppressive factors. Increasing evidence of stromally derived signaling broadens the complexity of our understanding of the tumor microenvironment while stressing the importance of developing better models that reflect these interactions. Three-dimensional (3D) models may be more sensitive to certain gene-silencing events than 2D models because of their components of hypoxia, nutrient gradients, and increased dependence on cell-cell interactions, and are therefore more representative of in vivo interactions. Colorectal cancer (CRC) and breast cancer (BC) models, namely single-cell-type tumor spheroids (SCTS) composed of epithelial cells only and multi-cell-type tumor spheroids (MCTS) containing fibroblasts, were developed for RNAi HTS in 384-well microplates with flat-bottom wells for 2D screening and round-bottom, ultra-low-attachment wells for 3D screening. We describe the development of a high-throughput assay platform that can assess physiologically relevant phenotypic differences between screening in 2D versus 3D SCTS, and between 3D SCTS and MCTS, in the context of different cancer subtypes. This assay platform represents a paradigm shift in how we approach drug discovery and can reduce the attrition rate of drugs that enter the clinic.
Dawadi, Mahesh B.; Degliumberto, Lou; Perry, David S.; ...
2017-08-10
We used a high-throughput CW slit-jet apparatus coupled to a high-resolution FTIR to record the asymmetric NO stretch band of nitromethane. The b-type band, including torsionally excited states with m ≤ 3, has been assigned for Ka″ ≤ 10, J″ ≤ 20. The ground state combination differences derived from these assigned levels were fit with the RAM36 program to give an RMS deviation of 0.0006 cm-1. The band origin is 1583.0 (±0.1) cm-1 and the torsional level spacing is nearly identical to that in the ground state. The upper state levels are split into multiplets by perturbations. We also fit a subset of the available upper state combination differences for m = 0, Ka′ ≤ 7, J′ ≤ 10 with the same program, but with rather poorer precision (0.01 cm-1) than for the ground state.
NASA Astrophysics Data System (ADS)
Dawadi, Mahesh B.; Degliumberto, Lou; Perry, David S.; Mettee, Howard D.; Sams, Robert L.
2018-01-01
A high-throughput CW slit-jet apparatus coupled to a high-resolution FTIR was used to record the asymmetric NO stretch band of nitromethane. The b-type band, including torsionally excited states with m ≤ 3, has been assigned for Ka″ ≤ 10, J″ ≤ 20. The ground state combination differences derived from these assigned levels were fit with the RAM36 program to give an RMS deviation of 0.0006 cm-1. The band origin is 1583.0 (±0.1) cm-1 and the torsional level spacing is nearly identical to that in the ground state. The upper state levels are split into multiplets by perturbations. A subset of the available upper state combination differences for m = 0, Ka‧ ≤ 7, J‧ ≤ 10 were fit with the same program, but with rather poorer precision (0.01 cm-1) than for the ground state.
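For context, the ground-state combination differences used in such fits come from pairs of transitions that reach a common upper level, so their difference depends only on ground-state term values and is insensitive to the upper-state perturbations noted above. A generic rigid-rotor-style form, given here only as an illustration and not as the torsion-rotation expression actually handled by RAM36, is:

```latex
% Two transitions reaching the same upper level (J', K_a') from J'' = J-1 and J'' = J+1:
\Delta_2 F''(J) \;=\; \tilde{\nu}_{R}(J-1) - \tilde{\nu}_{P}(J+1) \;=\; F''(J+1) - F''(J-1)
% The difference constrains only the ground-state term values F''(J), which is why
% these combination differences can be fit tightly even when the upper state is perturbed.
```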
High throughput light absorber discovery, Part 1: An algorithm for automated Tauc analysis
Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.
2016-09-23
High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
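As a rough illustration of the Tauc construction that such an algorithm automates (not the published algorithm itself), the sketch below builds the Tauc quantity (alpha*h*nu)^r, fits the steepest rising linear segment, and extrapolates it to zero; the exponent choice, fixed fit window, and synthetic spectrum are all assumptions.

```python
import numpy as np

def tauc_band_gap(photon_ev, absorbance, r=2, window=15):
    """Simplified Tauc extrapolation for a band gap estimate (eV).

    photon_ev  : photon energies h*nu in eV (ascending, uniform grid assumed)
    absorbance : absorbance spectrum (taken as proportional to alpha)
    r          : 2 for direct-allowed, 0.5 for indirect-allowed transitions
    window     : number of consecutive points used for the linear fit
    """
    y = (absorbance * photon_ev) ** r              # Tauc quantity (alpha*h*nu)^r
    best = None
    for i in range(len(photon_ev) - window):
        slope, intercept = np.polyfit(photon_ev[i:i + window], y[i:i + window], 1)
        if slope > 0 and (best is None or slope > best[0]):
            best = (slope, intercept)              # keep the steepest rising segment
    slope, intercept = best
    return -intercept / slope                      # x-intercept of the linear segment

# Synthetic direct-gap edge at 2.1 eV: alpha*h*nu ~ sqrt(h*nu - Eg) above the gap
hv = np.linspace(1.5, 3.0, 200)
alpha = np.sqrt(np.clip(hv - 2.1, 0.0, None)) / hv
print(round(tauc_band_gap(hv, alpha), 2))          # ~2.1
```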
Schaufele, Fred
2013-01-01
Förster resonance energy transfer (FRET) between fluorescent proteins (FPs) provides insights into the proximities and orientations of FPs as surrogates of the biochemical interactions and structures of the factors to which the FPs are genetically fused. As powerful as FRET methods are, technical issues have impeded their broad adoption in the biological sciences. One hurdle to accurate and reproducible FRET microscopy measurement stems from variable fluorescence backgrounds both within a field and between different fields. Those variations introduce errors into the precise quantification of fluorescence levels on which the quantitative accuracy of FRET measurement is highly dependent. This measurement error is particularly problematic for screening campaigns since minimal well-to-well variation is necessary to faithfully identify wells with altered values. High-content screening also depends on maximizing the number of cells imaged, which is best achieved by low-magnification, high-throughput microscopy. However, low magnification introduces flat-field correction issues that degrade the accuracy of background correction and cause poor reproducibility in FRET measurement. For live cell imaging, fluorescence of cell culture media in the fluorescence collection channels for the FPs commonly used for FRET analysis is a major source of background error. These signal-to-noise problems are compounded by the desire to express proteins at biologically meaningful levels that may only be marginally above the strong fluorescence background. Here, techniques are presented that correct for background fluctuations. Accurate calculation of FRET is realized even from images in which a non-flat background is 10-fold higher than the signal. PMID:23927839
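The background problem described here is commonly handled by dark-frame subtraction and flat-field division; the sketch below is that generic correction applied to a synthetic tilted illumination field, not the specific background-correction scheme developed in the paper.

```python
import numpy as np

def flat_field_correct(raw, dark, flat):
    """Generic correction: subtract the additive background (dark frame) and divide
    by the illumination pattern (flat frame), expressing the image relative to the
    flat exposure level. Not the paper's specific background-correction scheme."""
    return (raw - dark) / (flat - dark)

# Toy 64x64 field: tilted illumination (up to 1.5x) plus a small fluorescent patch
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:64, 0:64]
illum = 1.0 + 0.5 * xx / 63.0
signal = np.zeros((64, 64)); signal[20:30, 20:30] = 5.0
dark = np.full((64, 64), 100.0)
raw = dark + illum * (50.0 + signal) + rng.normal(0.0, 0.2, (64, 64))
flat = dark + illum * 50.0
corrected = flat_field_correct(raw, dark, flat)
# the 5-count patch is recovered as ~0.1 of the 50-count flat level, independent of the tilt
print(round(corrected[25, 25] - corrected[5, 5], 2))
```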
Cole, Krystal; Roessler, Christian G; Mulé, Elizabeth A; Benson-Xu, Emma J; Mullen, Jeffrey D; Le, Benjamin A; Tieman, Alanna M; Birone, Claire; Brown, Maria; Hernandez, Jesus; Neff, Sherry; Williams, Daniel; Allaire, Marc; Orville, Allen M; Sweet, Robert M; Soares, Alexei S
2014-01-01
High throughput screening technologies such as acoustic droplet ejection (ADE) greatly increase the rate at which X-ray diffraction data can be acquired from crystals. One promising high throughput screening application of ADE is to rapidly combine protein crystals with fragment libraries. In this approach, each fragment soaks into a protein crystal either directly on data collection media or on a moving conveyor belt which then delivers the crystals to the X-ray beam. By simultaneously handling multiple crystals combined with fragment specimens, these techniques relax the automounter duty-cycle bottleneck that currently prevents optimal exploitation of third generation synchrotrons. Two factors limit the speed and scope of projects that are suitable for fragment screening using techniques such as ADE. Firstly, in applications where the high throughput screening apparatus is located inside the X-ray station (such as the conveyor belt system described above), the speed of data acquisition is limited by the time required for each fragment to soak into its protein crystal. Secondly, in applications where crystals are combined with fragments directly on data acquisition media (including both of the ADE methods described above), the maximum time that fragments have to soak into crystals is limited by evaporative dehydration of the protein crystals during the fragment soak. Here we demonstrate that both of these problems can be minimized by using small crystals, because the soak time required for a fragment hit to attain high occupancy depends approximately linearly on crystal size.
Li, Fumin; Wang, Jun; Jenkins, Rand
2016-05-01
There is an ever-increasing demand for high-throughput LC-MS/MS bioanalytical assays to support drug discovery and development. Matrix effects of sofosbuvir (protonated) and paclitaxel (sodiated) were thoroughly evaluated using high-throughput chromatography (defined as having a run time ≤1 min) under 14 elution conditions with extracts from protein precipitation, liquid-liquid extraction and solid-phase extraction. A slight separation, in terms of retention time, between underlying matrix components and sofosbuvir/paclitaxel can greatly alleviate matrix effects. High-throughput chromatography, with proper optimization, can provide rapid and effective chromatographic separation under 1 min to alleviate matrix effects and enhance assay ruggedness for regulated bioanalysis.
Mang, Samuel; Bucher, Hannes; Nickolaus, Peter
2016-01-01
Scintillation proximity assay (SPA) technology has been widely used to establish high-throughput screens (HTS) for a range of targets in the pharmaceutical industry. PDE12 (also known as 2'-phosphodiesterase) has been reported to participate in the degradation of oligoadenylates that are involved in establishing an antiviral state via the activation of ribonuclease L (RNase L). Degradation of oligoadenylates by PDE12 terminates these antiviral activities, leading to decreased resistance of cells to a variety of viral pathogens. Inhibitors of PDE12 are therefore discussed as potential antiviral therapeutics. Here we describe the use of yttrium silicate SPA bead technology to assess the inhibitory activity of compounds against PDE12 in a homogeneous, robust, HTS-compatible assay using tritiated adenosine-P-adenylate ([3H]ApA) as substrate. We found that the [3H]ApA substrate did not bind to SPA beads, whereas the product [3H]AMP, as previously known, did bind. This enables measurement of PDE12 activity on [3H]ApA as a substrate using a Wallac MicroBeta counter. The method provides a robust, high-throughput-capable format in terms of specificity, compatibility with commonly used compound solvents, ease of detection and assay matrices, and could facilitate the search for PDE12 inhibitors as antiviral compounds.
NASA Astrophysics Data System (ADS)
Hong, Hyundae; Benac, Jasenka; Riggsbee, Daniel; Koutsky, Keith
2014-03-01
High-throughput (HT) phenotyping of crops is essential to increase yield in environments deteriorated by climate change. The controlled environment of a greenhouse offers an ideal platform to study genotype-to-phenotype linkages for crop screening. Advanced imaging technologies are used to study plants' responses to resource limitations such as water and nutrient deficiency, and coupled with automation they make HT phenotyping in the greenhouse not only feasible but practical. Monsanto has a state-of-the-art automated greenhouse (AGH) facility in which handling of soil, pots, water, and nutrients is completely automated. Images of the plants are acquired by multiple hyperspectral and broadband cameras; the hyperspectral cameras cover wavelengths from visible light through the short-wave infrared (SWIR). In-house developed software analyzes the images to measure plant morphological and biochemical properties. We measure phenotypic metrics such as plant area, height, and width as well as biomass, and hyperspectral imaging allows us to measure biochemical metrics such as chlorophyll, anthocyanin, and foliar water content. The last four years of AGH operations on crops such as corn, soybean, and cotton have demonstrated successful application of imaging and analysis technologies for high-throughput plant phenotyping. Using HT phenotyping, scientists have shown strong correlations to environmental conditions, such as water and nutrient deficits, as well as the ability to tease apart distinct differences in the genetic backgrounds of crops.
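As a hedged illustration of how biochemical metrics can be derived from hyperspectral reflectance, the sketch below computes the widely used NDVI and a simple SWIR water-band ratio from narrow spectral bands; the band centers, widths, and synthetic spectrum are assumptions, and these are generic literature indices rather than the in-house metrics mentioned in the abstract.

```python
import numpy as np

def band_mean(wavelengths_nm, reflectance, center, width=10.0):
    """Average reflectance over a narrow band centered on `center` (nm)."""
    mask = np.abs(wavelengths_nm - center) <= width / 2
    return reflectance[mask].mean()

def ndvi(wavelengths_nm, reflectance):
    """Normalized Difference Vegetation Index from red (~670 nm) and NIR (~800 nm)."""
    red = band_mean(wavelengths_nm, reflectance, 670.0)
    nir = band_mean(wavelengths_nm, reflectance, 800.0)
    return (nir - red) / (nir + red)

def water_band_ratio(wavelengths_nm, reflectance):
    """Simple SWIR water-absorption ratio (~1450 nm vs ~1650 nm reference);
    lower values indicate stronger water absorption, i.e. higher foliar water."""
    return band_mean(wavelengths_nm, reflectance, 1450.0) / band_mean(wavelengths_nm, reflectance, 1650.0)

# Synthetic healthy-leaf-like spectrum: low red, high NIR, water dip near 1450 nm
wl = np.arange(400.0, 1701.0, 5.0)
refl = 0.05 + 0.45 / (1 + np.exp(-(wl - 720.0) / 15.0))   # red edge near 720 nm
refl -= 0.15 * np.exp(-((wl - 1450.0) / 40.0) ** 2)       # water absorption feature
print(round(ndvi(wl, refl), 2), round(water_band_ratio(wl, refl), 2))
```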
Performance Improvement in Geographic Routing for Vehicular Ad Hoc Networks
Kaiwartya, Omprakash; Kumar, Sushil; Lobiyal, D. K.; Abdullah, Abdul Hanan; Hassan, Ahmed Nazar
2014-01-01
Geographic routing is one of the most investigated themes by researchers for reliable and efficient dissemination of information in Vehicular Ad Hoc Networks (VANETs). Recently, different Geographic Distance Routing (GEDIR) protocols have been suggested in the literature. These protocols focus on reducing the forwarding region towards destination to select the Next Hop Vehicles (NHV). Most of these protocols suffer from the problem of elevated one-hop link disconnection, high end-to-end delay and low throughput even at normal vehicle speed in high vehicle density environment. This paper proposes a Geographic Distance Routing protocol based on Segment vehicle, Link quality and Degree of connectivity (SLD-GEDIR). The protocol selects a reliable NHV using the criteria segment vehicles, one-hop link quality and degree of connectivity. The proposed protocol has been simulated in NS-2 and its performance has been compared with the state-of-the-art protocols: P-GEDIR, J-GEDIR and V-GEDIR. The empirical results clearly reveal that SLD-GEDIR has lower link disconnection and end-to-end delay, and higher throughput as compared to the state-of-the-art protocols. It should be noted that the performance of the proposed protocol is preserved irrespective of vehicle density and speed. PMID:25429415
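The abstract does not give the exact selection rule, so the sketch below shows one hypothetical way a next-hop score could combine the three stated criteria (segment-vehicle status, one-hop link quality, and degree of connectivity, with geographic progress as a tie-breaker); the field names, weights, and normalization constants are all assumptions, not the published SLD-GEDIR formula.

```python
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: str
    is_segment_vehicle: bool   # lies in the road segment toward the destination
    link_quality: float        # estimated one-hop delivery probability, 0..1
    degree: int                # number of its own one-hop neighbors
    progress_m: float          # geographic progress toward the destination (m)

def next_hop(neighbors, w_link=0.4, w_degree=0.3, w_progress=0.3,
             max_degree=30, max_progress_m=300.0):
    """Pick a next-hop vehicle by a weighted score over the three criteria
    (hypothetical weighting; the paper's exact rule may differ)."""
    def score(n):
        if not n.is_segment_vehicle:
            return -1.0                            # restrict NHV choice to segment vehicles
        return (w_link * n.link_quality
                + w_degree * min(n.degree / max_degree, 1.0)
                + w_progress * min(n.progress_m / max_progress_m, 1.0))
    best = max(neighbors, key=score, default=None)
    return best if best is not None and score(best) >= 0 else None

candidates = [
    Neighbor("v1", True, 0.9, 12, 180.0),
    Neighbor("v2", True, 0.6, 25, 260.0),
    Neighbor("v3", False, 0.95, 30, 290.0),        # excluded: not a segment vehicle
]
print(next_hop(candidates).node_id)                # "v2" under these illustrative weights
```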
Microfluidic multiplexing of solid-state nanopores
NASA Astrophysics Data System (ADS)
Jain, Tarun; Rasera, Benjamin C.; Guerrero, Ricardo Jose S.; Lim, Jong-Min; Karnik, Rohit
2017-12-01
Although solid-state nanopores enable electronic analysis of many clinically and biologically relevant molecular structures, there are few existing device architectures that enable high-throughput measurement of solid-state nanopores. Herein, we report a method for microfluidic integration of multiple solid-state nanopores at a high density of one nanopore per 35 µm². By configuring microfluidic devices with microfluidic valves, the nanopores can be rinsed from a single fluid input while retaining compatibility for multichannel electrical measurements. The microfluidic valves serve the dual purpose of fluidic switching and electrical switching, enabling serial multiplexing of the eight nanopores with a single pair of electrodes. Furthermore, the device architecture exhibits low noise and is compatible with electroporation-based in situ nanopore fabrication, providing a scalable platform for automated electronic measurement of a large number of integrated solid-state nanopores.
Kračun, Stjepan Krešimir; Fangel, Jonatan Ulrik; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Vidal-Melgosa, Silvia; Willats, William George Tycho
2017-01-01
Cell walls are an important feature of plant cells and a major component of the plant glycome. They have both structural and physiological functions and are critical for plant growth and development. The diversity and complexity of these structures demand advanced high-throughput techniques to answer questions about their structure, functions and roles in both fundamental and applied scientific fields. Microarray technology provides both the high-throughput and the feasibility aspects required to meet that demand. In this chapter, some of the most recent microarray-based techniques relating to plant cell walls are described together with an overview of related contemporary techniques applied to carbohydrate microarrays and their general potential in glycoscience. A detailed experimental procedure for high-throughput mapping of plant cell wall glycans using the comprehensive microarray polymer profiling (CoMPP) technique is included in the chapter and provides a good example of both the robust and high-throughput nature of microarrays as well as their applicability to plant glycomics.
Identification of functional modules using network topology and high-throughput data.
Ulitsky, Igor; Shamir, Ron
2007-01-26
With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in the analysis of high-throughput data.
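A minimal greedy seed-and-extend sketch of the module-finding idea described above (grow a connected sub-network while the average pairwise similarity of its members stays high) is shown below; it is only an illustration of the problem setting, not the authors' algorithm, and the graph encoding, threshold, and similarity values are assumptions.

```python
from itertools import combinations

def avg_similarity(nodes, sim):
    """Mean pairwise similarity over a node set (pairs without a score count as 0)."""
    pairs = [sim.get(frozenset(p), 0.0) for p in combinations(nodes, 2)]
    return sum(pairs) / len(pairs) if pairs else 0.0

def grow_module(seed, graph, sim, min_avg=0.5):
    """Greedily add neighboring nodes while the module's average similarity
    stays above `min_avg`. Greedy illustration only; not the published method."""
    module = {seed}
    improved = True
    while improved:
        improved = False
        frontier = {nb for n in module for nb in graph[n]} - module
        # try the candidate that keeps the average similarity highest
        best = max(frontier, key=lambda c: avg_similarity(module | {c}, sim), default=None)
        if best is not None and avg_similarity(module | {best}, sim) >= min_avg:
            module.add(best)
            improved = True
    return module

# Toy interaction network with expression-similarity values on some pairs
graph = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c", "e"}, "e": {"d"}}
sim = {frozenset(p): s for p, s in [(("a", "b"), 0.9), (("a", "c"), 0.8),
                                    (("b", "c"), 0.85), (("c", "d"), 0.2), (("d", "e"), 0.9)]}
print(sorted(grow_module("a", graph, sim)))   # ['a', 'b', 'c']
```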
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.
Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan
2016-04-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking, as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our view that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high-throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.
Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean
2016-10-01
To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measuring of multiple samples and conditions, thus the current automated cell counter that uses single disposable counting slides is not practical for high-throughput screening assays. In the recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing directly to Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows for 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required for the single sample-based automated cell counter. In addition, this method can improve the efficiency for high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
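For orientation, the arithmetic behind an AO/PI concentration and viability readout is simple once per-well counts are available; the sketch below shows that calculation with assumed counts and an assumed imaged volume, and it omits any instrument-specific corrections.

```python
def concentration_and_viability(ao_count, pi_count, imaged_volume_ul, dilution_factor=1.0):
    """Cell concentration (cells/mL) and viability (%) from AO/PI counts.

    ao_count         : cells stained by acridine orange (all nucleated cells)
    pi_count         : cells stained by propidium iodide (membrane-compromised, dead)
    imaged_volume_ul : sample volume represented by the analyzed images, in microliters
    Illustrative arithmetic only; instrument-specific corrections are omitted.
    """
    live = ao_count - pi_count
    concentration = ao_count / (imaged_volume_ul * 1e-3) * dilution_factor  # cells per mL
    viability = 100.0 * live / ao_count if ao_count else 0.0
    return concentration, viability

conc, viab = concentration_and_viability(ao_count=5200, pi_count=480, imaged_volume_ul=2.0)
print(f"{conc:.2e} cells/mL, {viab:.1f}% viable")   # 2.60e+06 cells/mL, 90.8% viable
```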
Characterizing ncRNAs in Human Pathogenic Protists Using High-Throughput Sequencing Technology
Collins, Lesley Joan
2011-01-01
ncRNAs are key genes in many human diseases including cancer and viral infection, as well as providing critical functions in pathogenic organisms such as fungi, bacteria, viruses, and protists. Until now the identification and characterization of ncRNAs associated with disease has been slow or inaccurate requiring many years of testing to understand complicated RNA and protein gene relationships. High-throughput sequencing now offers the opportunity to characterize miRNAs, siRNAs, small nucleolar RNAs (snoRNAs), and long ncRNAs on a genomic scale, making it faster and easier to clarify how these ncRNAs contribute to the disease state. However, this technology is still relatively new, and ncRNA discovery is not an application of high priority for streamlined bioinformatics. Here we summarize background concepts and practical approaches for ncRNA analysis using high-throughput sequencing, and how it relates to understanding human disease. As a case study, we focus on the parasitic protists Giardia lamblia and Trichomonas vaginalis, where large evolutionary distance has meant difficulties in comparing ncRNAs with those from model eukaryotes. A combination of biological, computational, and sequencing approaches has enabled easier classification of ncRNA classes such as snoRNAs, but has also aided the identification of novel classes. It is hoped that a higher level of understanding of ncRNA expression and interaction may aid in the development of less harsh treatment for protist-based diseases. PMID:22303390
GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit
Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R.; Smith, Jeremy C.; Kasson, Peter M.; van der Spoel, David; Hess, Berk; Lindahl, Erik
2013-01-01
Motivation: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Results: Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. Availability: GROMACS is open source and free software, available from http://www.gromacs.org. Contact: erik.lindahl@scilifelab.se Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23407358
Reconstructing the regulatory circuit of cell fate determination in yeast mating response.
Shao, Bin; Yuan, Haiyu; Zhang, Rongfei; Wang, Xuan; Zhang, Shuwen; Ouyang, Qi; Hao, Nan; Luo, Chunxiong
2017-07-01
Massive technological advances enabled high-throughput measurements of proteomic changes in biological processes. However, retrieving biological insights from large-scale protein dynamics data remains a challenging task. Here we used the mating differentiation in yeast Saccharomyces cerevisiae as a model and developed integrated experimental and computational approaches to analyze the proteomic dynamics during the process of cell fate determination. When exposed to a high dose of mating pheromone, the yeast cell undergoes growth arrest and forms a shmoo-like morphology; however, at intermediate doses, chemotropic elongated growth is initialized. To understand the gene regulatory networks that control this differentiation switch, we employed a high-throughput microfluidic imaging system that allows real-time and simultaneous measurements of cell growth and protein expression. Using kinetic modeling of protein dynamics, we classified the stimulus-dependent changes in protein abundance into two sources: global changes due to physiological alterations and gene-specific changes. A quantitative framework was proposed to decouple gene-specific regulatory modes from the growth-dependent global modulation of protein abundance. Based on the temporal patterns of gene-specific regulation, we established the network architectures underlying distinct cell fates using a reverse engineering method and uncovered the dose-dependent rewiring of gene regulatory network during mating differentiation. Furthermore, our results suggested a potential crosstalk between the pheromone response pathway and the target of rapamycin (TOR)-regulated ribosomal biogenesis pathway, which might underlie a cell differentiation switch in yeast mating response. In summary, our modeling approach addresses the distinct impacts of the global and gene-specific regulation on the control of protein dynamics and provides new insights into the mechanisms of cell fate determination. We anticipate that our integrated experimental and modeling strategies could be widely applicable to other biological systems.
Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Y; Grossman, JC
2014-12-01
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
Anderson, Lissa C; DeHart, Caroline J; Kaiser, Nathan K; Fellers, Ryan T; Smith, Donald F; Greer, Joseph B; LeDuc, Richard D; Blakney, Greg T; Thomas, Paul M; Kelleher, Neil L; Hendrickson, Christopher L
2017-02-03
Successful high-throughput characterization of intact proteins from complex biological samples by mass spectrometry requires instrumentation capable of high mass resolving power, mass accuracy, sensitivity, and spectral acquisition rate. These limitations often necessitate the performance of hundreds of LC-MS/MS experiments to obtain reasonable coverage of the targeted proteome, which is still typically limited to molecular weights below 30 kDa. The National High Magnetic Field Laboratory (NHMFL) recently installed a 21 T FT-ICR mass spectrometer, which is part of the NHMFL FT-ICR User Facility and available to all qualified users. Here we demonstrate top-down LC-21 T FT-ICR MS/MS of intact proteins derived from human colorectal cancer cell lysate. We identified a combined total of 684 unique protein entries observed as 3238 unique proteoforms at a 1% false discovery rate, based on rapid, data-dependent acquisition of collision-induced and electron-transfer dissociation tandem mass spectra from just 40 LC-MS/MS experiments. Our identifications included 372 proteoforms with molecular weights over 30 kDa detected at isotopic resolution, which substantially extends the accessible mass range for high-throughput top-down LC-MS/MS.
Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren
2016-11-01
Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is usually limited by sampling size. Sequence conservation-based methods are further confounded by structural constraints and multifunctionality of proteins. Here we present a method that can systematically identify and annotate functional residues of a given protein. We used a high-throughput functional profiling platform to identify essential residues. Coupling it with homologous-structure comparison, we were able to annotate multiple functions of proteins. We demonstrated the method with the PB1 protein of influenza A virus and identified novel functional residues in addition to its canonical function as an RNA-dependent RNA polymerase. Not limited to virology, this method is generally applicable to other proteins that can be functionally selected and about which homologous-structure information is available. Copyright © 2016 Du et al.
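A common way replication capacity is scored in this kind of profiling, shown here only as a generic sketch with assumed pseudocounts rather than the authors' exact pipeline, is the log2 change in each variant's frequency from the input library to the post-selection population, normalized to wild type.

```python
import math

def relative_fitness(input_counts, selected_counts, wt="WT", pseudocount=0.5):
    """log2 enrichment of each variant relative to wild type between the input
    library and the post-selection (replicated) population."""
    def freq(counts, key):
        total = sum(counts.values())
        return (counts.get(key, 0) + pseudocount) / (total + pseudocount * len(counts))
    wt_ratio = freq(selected_counts, wt) / freq(input_counts, wt)
    scores = {}
    for variant in input_counts:
        ratio = freq(selected_counts, variant) / freq(input_counts, variant)
        scores[variant] = math.log2(ratio / wt_ratio)
    return scores

# Hypothetical read counts: D445N drops out strongly after selection
input_counts = {"WT": 10000, "K123A": 9500, "D445N": 9800}
selected_counts = {"WT": 12000, "K123A": 11800, "D445N": 300}
print({k: round(v, 2) for k, v in relative_fitness(input_counts, selected_counts).items()})
```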
High definition infrared chemical imaging of colorectal tissue using a Spero QCL microscope.
Bird, B; Rowlette, J
2017-04-10
Mid-infrared microscopy has become a key technique in the field of biomedical science and spectroscopy. This label-free, non-destructive technique permits the visualisation of a wide range of intrinsic biochemical markers in tissues, cells and biofluids by detection of the vibrational modes of the constituent molecules. Together, infrared microscopy and chemometrics is a widely accepted method that can distinguish healthy and diseased states with high accuracy. However, despite the exponential growth of the field and its research world-wide, several barriers currently exist for its full translation into the clinical sphere, namely sample throughput and data management. The advent and incorporation of quantum cascade lasers (QCLs) into infrared microscopes could help propel the field over these remaining hurdles. Such systems offer several advantages over their FT-IR counterparts, a simpler instrument architecture, improved photon flux, use of room temperature camera systems, and the flexibility of a tunable illumination source. In this current study we explore the use of a QCL infrared microscope to produce high definition, high throughput chemical images useful for the screening of biopsied colorectal tissue.
Improving Data Transfer Throughput with Direct Search Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaprakash, Prasanna; Morozov, Vitali; Kettimuthu, Rajkumar
2016-01-01
Improving data transfer throughput over high-speed long-distance networks has become increasingly difficult. Numerous factors such as nondeterministic congestion, dynamics of the transfer protocol, and multiuser and multitask source and destination endpoints, as well as interactions among these factors, contribute to this difficulty. A promising approach to improving throughput consists in using parallel streams at the application layer. We formulate and solve the problem of choosing the number of such streams from a mathematical optimization perspective. We propose the use of direct search methods, a class of easy-to-implement and light-weight mathematical optimization algorithms, to improve the performance of data transfers by dynamically adapting the number of parallel streams in a manner that does not require domain expertise, instrumentation, analytical models, or historic data. We apply our method to transfers performed with the GridFTP protocol, and illustrate the effectiveness of the proposed algorithm when used within Globus, a state-of-the-art data transfer tool, on production WAN links and servers. We show that when compared to user default settings our direct search methods can achieve up to 10x performance improvement under certain conditions. We also show that our method can overcome performance degradation due to external compute and network load on source endpoints, a common scenario at high performance computing facilities.
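A minimal sketch of the direct-search idea described above: treat the number of parallel streams as the decision variable, probe neighboring values with short trial transfers, and move toward higher measured throughput. The `measure_throughput` function is a synthetic stand-in for instrumenting a real GridFTP/Globus transfer, and the simple compass/hill-climb step is an illustration, not the paper's specific algorithm.

```python
import random

def measure_throughput(streams: int) -> float:
    """Stand-in for a short trial transfer at `streams` parallel streams.
    Synthetic concave response peaking near 8 streams, with measurement noise."""
    base = 10.0 * streams / (1.0 + (streams / 8.0) ** 2)   # MB/s, made up
    return base + random.gauss(0.0, 0.2)

def direct_search_streams(start=1, lo=1, hi=64, max_probes=20):
    """One-dimensional direct (compass) search: probe current +/- step, move to the
    best measured point, and shrink the step when no neighbor improves."""
    current, step = start, 4
    best_tp = measure_throughput(current)
    probes = 1
    while step >= 1 and probes < max_probes:
        candidates = [c for c in (current - step, current + step) if lo <= c <= hi]
        scored = [(measure_throughput(c), c) for c in candidates]
        probes += len(scored)
        top_tp, top_c = max(scored)
        if top_tp > best_tp:
            best_tp, current = top_tp, top_c       # move uphill
        else:
            step //= 2                             # contract around the current point
    return current, best_tp

random.seed(1)
streams, throughput = direct_search_streams()
print(streams, round(throughput, 1))               # settles near the synthetic optimum (~8)
```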
FMLRC: Hybrid long read error correction using an FM-index.
Wang, Jeremy R; Holt, James; McMillan, Leonard; Jones, Corbin D
2018-02-09
Long read sequencing is changing the landscape of genomic research, especially de novo assembly. Despite the high error rate inherent to long read technologies, increased read lengths dramatically improve the continuity and accuracy of genome assemblies. However, the cost and throughput of these technologies limits their application to complex genomes. One solution is to decrease the cost and time to assemble novel genomes by leveraging "hybrid" assemblies that use long reads for scaffolding and short reads for accuracy. We describe a novel method leveraging a multi-string Burrows-Wheeler Transform with auxiliary FM-index to correct errors in long read sequences using a set of complementary short reads. We demonstrate that our method efficiently produces significantly more high quality corrected sequence than existing hybrid error-correction methods. We also show that our method produces more contiguous assemblies, in many cases, than existing state-of-the-art hybrid and long-read only de novo assembly methods. Our method accurately corrects long read sequence data using complementary short reads. We demonstrate higher total throughput of corrected long reads and a corresponding increase in contiguity of the resulting de novo assemblies. Improved throughput and computational efficiency than existing methods will help better economically utilize emerging long read sequencing technologies.
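FMLRC itself queries k-mer frequencies through a multi-string BWT/FM-index and applies a multi-pass, variable-k strategy; the toy sketch below only illustrates the underlying idea of patching long-read bases that lack short-read k-mer support, using a plain Python dictionary in place of the FM-index and handling only single substitutions.

```python
from collections import Counter

def kmer_counts(short_reads, k):
    """Count k-mers in the short-read set (stand-in for FM-index rank queries)."""
    counts = Counter()
    for read in short_reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def correct_long_read(long_read, counts, k, min_support=2):
    """Patch a base inside each unsupported k-mer window with the substitution whose
    covering k-mer is best supported by the short reads (toy illustration only)."""
    read = list(long_read)
    for i in range(len(read) - k + 1):
        if counts[''.join(read[i:i + k])] >= min_support:
            continue                                   # k-mer is supported; leave it
        for j in range(i, i + k):                      # try to patch one base in the window
            best_base, best_support = read[j], 0
            for base in "ACGT":
                trial = read[:]
                trial[j] = base
                support = counts[''.join(trial[i:i + k])]
                if support > best_support:
                    best_base, best_support = base, support
            if best_support >= min_support:
                read[j] = best_base
                break
    return ''.join(read)

short_reads = ["ACGTACGTAC", "CGTACGTACG", "GTACGTACGT"] * 3   # error-free short reads
noisy_long = "ACGTACGTACGTACTTACGTACGT"                        # periodic sequence, one G->T error at position 14
print(correct_long_read(noisy_long, kmer_counts(short_reads, 5), k=5))  # ACGTACGTACGTACGTACGTACGT
```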
Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H
2016-02-01
Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.
NASA Astrophysics Data System (ADS)
Greene, M. I.; Ladelfa, C. J.; Bivacca, S. J.
1980-05-01
Flash hydropyrolysis (FHP) of coal is an emerging technology for the direct production of methane, ethane and BTX in a single-stage, high-throughput reactor. The FHP technique involves short-residence-time (1-2 seconds), rapid heat-up of coal in a dilute-phase transport reactor. When integrated into an overall grass-roots conversion complex, the FHP technique can be utilized to generate a product consisting of SNG, ethylene/propylene, benzene and Fischer-Tropsch-based alcohols. This paper summarizes the process engineering and economics of a conceptualized facility based on FHP reactor operation with a lignitic coal. The plant is hypothetically sited near the extensive lignite fields located in the Texas region of the United States. Utilizing utility-financing methods for the costing of SNG, and selling the cogenerated chemicals at petrochemical market prices, the 20-year average SNG cost has been computed to vary between $3 and $4/MM Btu, depending upon coal costs, interest rates, debt/equity ratio, coproduct chemical prices, etc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, T. S.
Meeting the science goals for many current and future ground-based optical large-area sky surveys requires that the calibrated broadband photometry is stable in time and uniform over the sky to 1% precision or better. Past surveys have achieved photometric precision of 1-2% by calibrating the survey's stellar photometry with repeated measurements of a large number of stars observed in multiple epochs. The calibration techniques employed by these surveys only consider the relative frame-by-frame photometric zeropoint offset and the focal plane position-dependent illumination corrections, which are independent of the source color. However, variations in the wavelength dependence of the atmospheric transmission and the instrumental throughput induce source color-dependent systematic errors. These systematic errors must also be considered to achieve the most precise photometric measurements. In this paper, we examine such systematic chromatic errors using photometry from the Dark Energy Survey (DES) as an example. We define a natural magnitude system for DES and calculate the systematic errors on stellar magnitudes, when the atmospheric transmission and instrumental throughput deviate from the natural system. We conclude that the systematic chromatic errors caused by the change of airmass in each exposure, the change of the precipitable water vapor and aerosol in the atmosphere over time, and the non-uniformity of instrumental throughput over the focal plane, can be up to 2% in some bandpasses. We compare the calculated systematic chromatic errors with the observed DES data. For the test sample data, we correct these errors using measurements of the atmospheric transmission and instrumental throughput. The residual after correction is less than 0.3%. We also find that the errors for non-stellar objects are redshift-dependent and can be larger than those for stars at certain redshifts.
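To make the chromatic effect concrete: a synthetic magnitude integrates the source spectrum against the system throughput, so a throughput curve that drifts (for example with airmass or water vapor) leaves a color-dependent residual even after the zeropoint is tied down with a fiducial source. The sketch below uses made-up Gaussian passbands and a toy red spectrum; it is not the DES throughput set or the paper's natural-system definition.

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integral, kept explicit to avoid depending on a specific NumPy API."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def synth_ab_mag(wl_nm, f_nu, throughput):
    """Synthetic AB magnitude for a photon-counting system:
    m = -2.5*log10( int f_nu S dlambda/lambda / int S dlambda/lambda ) - 48.6,
    with f_nu in erg s^-1 cm^-2 Hz^-1."""
    w = throughput / wl_nm
    fnu_mean = _trapz(f_nu * w, wl_nm) / _trapz(w, wl_nm)
    return -2.5 * np.log10(fnu_mean) - 48.6

wl = np.linspace(500.0, 600.0, 501)                       # nm
std_band = np.exp(-0.5 * ((wl - 550.0) / 20.0) ** 2)      # fiducial ("standard") throughput
obs_band = np.exp(-0.5 * ((wl - 552.0) / 21.0) ** 2)      # slightly shifted/broadened throughput

flat_src = np.full_like(wl, 3.631e-20)                    # flat-spectrum source (AB = 0)
red_src = flat_src * (wl / 550.0) ** 4                    # toy red source (f_nu rises with wavelength)

# Tie the observed band to the standard one using the flat-spectrum calibrator ...
zp_offset = synth_ab_mag(wl, flat_src, obs_band) - synth_ab_mag(wl, flat_src, std_band)
# ... the residual that remains for the red source is the systematic chromatic error (in mmag)
chromatic_mmag = 1000.0 * (synth_ab_mag(wl, red_src, obs_band) - zp_offset
                           - synth_ab_mag(wl, red_src, std_band))
print(round(chromatic_mmag, 1))
```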
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT).
Howard, E L; Whittock, S P; Jakše, J; Carling, J; Matthews, P D; Probasco, G; Henning, J A; Darby, P; Cerenak, A; Javornik, B; Kilian, A; Koutoulis, A
2011-05-01
Implementation of molecular methods in hop (Humulus lupulus L.) breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. However, use of molecular marker technology is limited due to expense, time inefficiency, laborious methodology and dependence on DNA sequence information. Diversity arrays technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of quality polymorphic markers without reliance on DNA sequence information. This study is the first to utilise DArT for hop genotyping, identifying 730 polymorphic markers from 92 hop accessions. The marker quality was high and similar to the quality of DArT markers previously generated for other species; although percentage polymorphism and polymorphism information content (PIC) were lower than in previous studies deploying other marker systems in hop. Genetic relationships in hop illustrated by DArT in this study coincide with knowledge generated using alternate methods. Several statistical analyses separated the hop accessions into genetically differentiated North American and European groupings, with hybrids between the two groups clearly distinguishable. Levels of genetic diversity were similar in the North American and European groups, but higher in the hybrid group. The markers produced from this time and cost-efficient genotyping tool will be a valuable resource for numerous applications in hop breeding and genetics studies, such as mapping, marker-assisted selection, genetic identity testing, guidance in the maintenance of genetic diversity and the directed breeding of superior cultivars.
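Polymorphism information content (PIC) is a standard summary statistic; for a biallelic presence/absence marker such as a DArT marker it is commonly reported as PIC = 1 - p^2 - q^2 = 2pq, where p is the band frequency. The sketch below applies that commonly used formula across a 0/1 score matrix and is illustrative only, not the paper's exact computation.

```python
import numpy as np

def dart_pic(scores):
    """PIC per biallelic dominant marker from a (markers x accessions) 0/1 matrix,
    using PIC = 1 - p^2 - q^2 = 2pq with p = frequency of the '1' (band present) score.
    Missing data encoded as np.nan is ignored."""
    scores = np.asarray(scores, dtype=float)
    p = np.nanmean(scores, axis=1)          # band frequency per marker
    return 2.0 * p * (1.0 - p)

markers = [[1, 1, 0, 0, 1, 0],              # p = 0.5 -> PIC = 0.50 (maximum for this formula)
           [1, 1, 1, 1, 1, 0],              # p ~ 0.83 -> PIC ~ 0.28
           [0, 0, 0, 0, 0, 0]]              # monomorphic -> PIC = 0
print(np.round(dart_pic(markers), 2))
```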
Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco
2016-02-09
Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher-throughput devices for small-scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid handling robot to provide reproducible high-throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96-well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8 to ±2% on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids, avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability as well as experimental throughput while simultaneously reducing the required hands-on time to a third. The presented protocol thus meets the demands for the analysis of samples generated by the upcoming generation of devices for higher-throughput phototrophic cultivation and thereby contributes to boosting the time efficiency of setting up algal lipid production processes.
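The gravimetric calibration described above amounts to a linear fit of biovolume-normalized Nile red fluorescence against lipid content measured by an extractive reference method; the sketch below fits and applies such a calibration with entirely made-up numbers, so the slope, intercept, and sample values are assumptions rather than the paper's data.

```python
import numpy as np

# Hypothetical calibration set: Nile red fluorescence normalized to biovolume
# (a.u. per uL biovolume) versus gravimetric lipid content (% of cell dry weight)
fluor_per_biovolume = np.array([120.0, 310.0, 540.0, 760.0, 990.0])
gravimetric_lipid_pct = np.array([8.0, 17.5, 29.0, 41.0, 52.5])

slope, intercept = np.polyfit(fluor_per_biovolume, gravimetric_lipid_pct, 1)

def lipid_content(fluorescence, biovolume_ul):
    """Absolute lipid content (% CDW) from raw fluorescence and the biovolume of the
    stained sample, via the linear gravimetric calibration above (illustrative only)."""
    return slope * (fluorescence / biovolume_ul) + intercept

# A screening-scale sample: 6.5e4 a.u. of Nile red signal from 130 uL of biovolume
print(round(lipid_content(6.5e4, biovolume_ul=130.0), 1))   # ~27 % CDW under this toy calibration
```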
Fixed Delay Interferometry for Doppler Extrasolar Planet Detection
NASA Astrophysics Data System (ADS)
Ge, Jian
2002-06-01
We present a new technique based on fixed delay interferometry for high-throughput, high-precision, and multiobject Doppler radial velocity (RV) surveys for extrasolar planets. The Doppler measurements are conducted by monitoring the stellar fringe phase shifts of the interferometer instead of absorption-line centroid shifts as in state-of-the-art echelle spectroscopy. High Doppler sensitivity is achieved through optimizing the optical delay in the interferometer and reducing photon noise by measuring multiple fringes over a broad band. This broadband operation is performed by coupling the interferometer with a low- to medium-resolution postdisperser. The resulting fringing spectra over the bandpass are recorded on a two-dimensional detector, with fringes sampled in the slit spatial direction and the spectrum sampled in the dispersion direction. The resulting total Doppler sensitivity is, in theory, independent of the dispersing power of the postdisperser, which allows for the development of new-generation RV machines with much reduced size, high stability, and low cost compared to echelles. This technique has the potential to improve RV survey efficiency by 2-3 orders of magnitude over the cross-dispersed echelle spectroscopy approach, which would allow a full-sky RV survey of hundreds of thousands of stars for planets, brown dwarfs, and stellar companions once the instrument is operated as a multiobject instrument and is optimized for high throughput. The simple interferometer response potentially allows this technique to be operated at other wavelengths independent of popular iodine reference sources, being actively used in most of the current echelles for Doppler planet searches, to search for planets around early-type stars, white dwarfs, and M, L, and T dwarfs for the first time. The high throughput of this instrument could also allow investigation of extragalactic objects for RV variations at high precision.
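The fringe-phase-to-velocity conversion underlying this technique can be written, for an interferometer with fixed optical path difference d, from the phase of a monochromatic fringe and the Doppler relation; the numerical example below uses assumed values for d and the wavelength, not the actual instrument parameters.

```latex
% Fringe phase for a fixed optical delay d at frequency \nu:  \phi = 2\pi d \nu / c.
% A Doppler shift \Delta\nu = \nu\, v/c therefore produces a phase shift
\Delta\phi \;=\; \frac{2\pi d}{c}\,\Delta\nu \;=\; \frac{2\pi d}{\lambda}\,\frac{v}{c},
\qquad\text{so}\qquad
v \;\approx\; \frac{\lambda c}{2\pi d}\,\Delta\phi .
% Example with assumed values d = 7\,\mathrm{mm}, \lambda = 550\,\mathrm{nm}:
% a measured phase shift of 1\,\mathrm{mrad} corresponds to v \approx 3.8\,\mathrm{m\,s^{-1}}.
```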
A review of snapshot multidimensional optical imaging: measuring photon tags in parallel
Gao, Liang; Wang, Lihong V.
2015-01-01
Multidimensional optical imaging has seen remarkable growth in the past decade. Rather than measuring only the two-dimensional spatial distribution of light, as in conventional photography, multidimensional optical imaging captures light in up to nine dimensions, providing unprecedented information about incident photons’ spatial coordinates, emittance angles, wavelength, time, and polarization. Multidimensional optical imaging can be accomplished either by scanning or parallel acquisition. Compared with scanning-based imagers, parallel acquisition—also dubbed snapshot imaging—has a prominent advantage in maximizing optical throughput, particularly when measuring a datacube of high dimensions. Here, we first categorize snapshot multidimensional imagers based on their acquisition and image reconstruction strategies, then highlight the snapshot advantage in the context of optical throughput, and finally we discuss their state-of-the-art implementations and applications. PMID:27134340
High-throughput measurements of the optical redox ratio using a commercial microplate reader.
Cannon, Taylor M; Shah, Amy T; Walsh, Alex J; Skala, Melissa C
2015-01-01
There is a need for accurate, high-throughput, functional measures to gauge the efficacy of potential drugs in living cells. As an early marker of drug response in cells, cellular metabolism provides an attractive platform for high-throughput drug testing. Optical techniques can noninvasively monitor NADH and FAD, two autofluorescent metabolic coenzymes. The autofluorescent redox ratio, defined as the autofluorescence intensity of NADH divided by that of FAD, quantifies relative rates of cellular glycolysis and oxidative phosphorylation. However, current microscopy methods for redox ratio quantification are time-intensive and low-throughput, limiting their practicality in drug screening. Alternatively, high-throughput commercial microplate readers quickly measure fluorescence intensities for hundreds of wells. This study found that a commercial microplate reader can differentiate the receptor status of breast cancer cell lines (p < 0.05) based on redox ratio measurements without extrinsic contrast agents. Furthermore, microplate reader redox ratio measurements resolve response (p < 0.05) and lack of response (p > 0.05) in cell lines that are responsive and nonresponsive, respectively, to the breast cancer drug trastuzumab. These studies indicate that the microplate readers can be used to measure the redox ratio in a high-throughput manner and are sensitive enough to detect differences in cellular metabolism that are consistent with microscopy results.
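Since the abstract defines the optical redox ratio as the NADH autofluorescence intensity divided by that of FAD, a minimal sketch of the per-well computation from plate-reader intensities could look like the following; the blank-subtraction step and all variable names are assumptions, not from the paper:

```python
import numpy as np

def redox_ratio(nadh_intensity, fad_intensity, blank_nadh=0.0, blank_fad=0.0):
    """Optical redox ratio = I(NADH) / I(FAD), after optional blank subtraction."""
    nadh = np.asarray(nadh_intensity, dtype=float) - blank_nadh
    fad = np.asarray(fad_intensity, dtype=float) - blank_fad
    return nadh / fad

# Example: autofluorescence intensities for three wells read on a microplate reader.
print(redox_ratio([1500, 1800, 900], [1000, 1000, 1200]))
```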
Auerbach, Scott; Filer, Dayne; Reif, David; Walker, Vickie; Holloway, Alison C.; Schlezinger, Jennifer; Srinivasan, Supriya; Svoboda, Daniel; Judson, Richard; Bucher, John R.; Thayer, Kristina A.
2016-01-01
Background: Diabetes and obesity are major threats to public health in the United States and abroad. Understanding the role that chemicals in our environment play in the development of these conditions is an emerging issue in environmental health, although identifying and prioritizing chemicals for testing beyond those already implicated in the literature is challenging. This review is intended to help researchers generate hypotheses about chemicals that may contribute to diabetes and to obesity-related health outcomes by summarizing relevant findings from the U.S. Environmental Protection Agency (EPA) ToxCast™ high-throughput screening (HTS) program. Objectives: Our aim was to develop new hypotheses around environmental chemicals of potential interest for diabetes- or obesity-related outcomes using high-throughput screening data. Methods: We identified ToxCast™ assay targets relevant to several biological processes related to diabetes and obesity (insulin sensitivity in peripheral tissue, pancreatic islet and β cell function, adipocyte differentiation, and feeding behavior) and presented chemical screening data against those assay targets to identify chemicals of potential interest. Discussion: The results of this screening-level analysis suggest that the spectrum of environmental chemicals to consider in research related to diabetes and obesity is much broader than indicated by research papers and reviews published in the peer-reviewed literature. Testing hypotheses based on ToxCast™ data will also help assess the predictive utility of this HTS platform. Conclusions: More research is required to put these screening-level analyses into context, but the information presented in this review should facilitate the development of new hypotheses. Citation: Auerbach S, Filer D, Reif D, Walker V, Holloway AC, Schlezinger J, Srinivasan S, Svoboda D, Judson R, Bucher JR, Thayer KA. 2016. Prioritizing environmental chemicals for obesity and diabetes outcomes research: a screening approach using ToxCast™ high-throughput data. Environ Health Perspect 124:1141–1154; http://dx.doi.org/10.1289/ehp.1510456 PMID:26978842
Auerbach, Scott; Filer, Dayne; Reif, David; Walker, Vickie; Holloway, Alison C; Schlezinger, Jennifer; Srinivasan, Supriya; Svoboda, Daniel; Judson, Richard; Bucher, John R; Thayer, Kristina A
2016-08-01
Diabetes and obesity are major threats to public health in the United States and abroad. Understanding the role that chemicals in our environment play in the development of these conditions is an emerging issue in environmental health, although identifying and prioritizing chemicals for testing beyond those already implicated in the literature is challenging. This review is intended to help researchers generate hypotheses about chemicals that may contribute to diabetes and to obesity-related health outcomes by summarizing relevant findings from the U.S. Environmental Protection Agency (EPA) ToxCast™ high-throughput screening (HTS) program. Our aim was to develop new hypotheses around environmental chemicals of potential interest for diabetes- or obesity-related outcomes using high-throughput screening data. We identified ToxCast™ assay targets relevant to several biological processes related to diabetes and obesity (insulin sensitivity in peripheral tissue, pancreatic islet and β cell function, adipocyte differentiation, and feeding behavior) and presented chemical screening data against those assay targets to identify chemicals of potential interest. The results of this screening-level analysis suggest that the spectrum of environmental chemicals to consider in research related to diabetes and obesity is much broader than indicated by research papers and reviews published in the peer-reviewed literature. Testing hypotheses based on ToxCast™ data will also help assess the predictive utility of this HTS platform. More research is required to put these screening-level analyses into context, but the information presented in this review should facilitate the development of new hypotheses. Auerbach S, Filer D, Reif D, Walker V, Holloway AC, Schlezinger J, Srinivasan S, Svoboda D, Judson R, Bucher JR, Thayer KA. 2016. Prioritizing environmental chemicals for obesity and diabetes outcomes research: a screening approach using ToxCast™ high-throughput data. Environ Health Perspect 124:1141-1154; http://dx.doi.org/10.1289/ehp.1510456.
NASA Technical Reports Server (NTRS)
Crozier, Stewart N.
1990-01-01
Random access signaling, which allows slotted packets to spill over into adjacent slots, is investigated. It is shown that sloppy-slotted ALOHA can always provide higher throughput than conventional slotted ALOHA; the degree of improvement depends on the timing-error distribution. Throughput performance is presented for Gaussian timing-error distributions, modified to include timing-error corrections. A general channel capacity lower bound, independent of the specific timing-error distribution, is also presented.
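For context, the conventional slotted-ALOHA baseline against which the sloppy-slotted scheme is compared has the standard textbook throughput (not stated in the abstract):

```latex
S = G\,e^{-G}
```

where G is the offered load in packets per slot; the claim above is that allowing controlled spill-over into adjacent slots never falls below this baseline, with the size of the gain governed by the timing-error distribution.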
Maier-Kiener, Verena; Schuh, Benjamin; George, Easo P.; ...
2016-11-19
The equiatomic high-entropy alloy (HEA) CrMnFeCoNi has recently been shown to be microstructurally unstable, resulting in a multi-phase microstructure after intermediate-temperature annealing treatments. The decomposition occurs rapidly in the nanocrystalline (NC) state and after longer annealing times in coarse-grained states. In this paper, nanoindentation was used to characterize the mechanical properties of differently annealed NC states containing multiple phases. In addition to drastic changes in hardness, the results revealed, for the first time, significant changes in the Young's modulus and strain rate sensitivity. Nanoindentation of NC HEAs is therefore a useful complementary screening tool with high potential as a high-throughput approach to detect phase decomposition, which can also be used to qualitatively predict the long-term stability of single-phase HEAs.
Haeili, Mehri; Moore, Casey; Davis, Christopher J. C.; Cochran, James B.; Shah, Santosh; Shrestha, Tej B.; Zhang, Yaofang; Bossmann, Stefan H.; Benjamin, William H.
2014-01-01
Macrophages take advantage of the antibacterial properties of copper ions in the killing of bacterial intruders. However, despite the importance of copper for innate immune functions, coordinated efforts to exploit copper ions for therapeutic interventions against bacterial infections are not yet in place. Here we report a novel high-throughput screening platform specifically developed for the discovery and characterization of compounds with copper-dependent antibacterial properties toward methicillin-resistant Staphylococcus aureus (MRSA). We detail how one of the identified compounds, glyoxal-bis(N4-methylthiosemicarbazone) (GTSM), exerts its potent strictly copper-dependent antibacterial properties on MRSA. Our data indicate that the activity of the GTSM-copper complex goes beyond the general antibacterial effects of accumulated copper ions and suggest that, in contrast to prevailing opinion, copper complexes can indeed exhibit species- and target-specific activities. Based on experimental evidence, we propose that copper ions impose structural changes upon binding to the otherwise inactive GTSM ligand and transfer antibacterial properties to the chelate. In turn, GTSM determines target specificity and utilizes a redox-sensitive release mechanism through which copper ions are deployed at or in close proximity to a putative target. According to our proof-of-concept screen, copper activation is not a rare event and even extends to already established drugs. Thus, copper-activated compounds could define a novel class of anti-MRSA agents that amplify copper-dependent innate immune functions of the host. To this end, we provide a blueprint for a high-throughput drug screening campaign which considers the antibacterial properties of copper ions at the host-pathogen interface. PMID:24752262
OLEDs for lighting: new approaches
NASA Astrophysics Data System (ADS)
Duggal, Anil R.; Foust, Donald F.; Nealon, William F.; Heller, Christian M.
2004-02-01
OLED technology has improved to the point where it is now possible to envision developing OLEDs as a low cost solid state light source. In order to realize this, significant advances have to be made in device efficiency, lifetime at high brightness, high throughput fabrication, and the generation of illumination quality white light. In this talk, the requirements for general lighting will be reviewed and various approaches to meeting them will be outlined. Emphasis will be placed on a new monolithic series-connected OLED design architecture that promises scalability without high fabrication cost or design complexity.
Asymmetric Multilevel Outphasing (AMO): A New Architecture for All-Silicon mm-Wave Transmitter ICs
2015-06-12
... power amplifiers for mobile basestation infrastructure and handsets. NanoSemi Inc. designs linearization solutions for analog front-ends ... toward flexible, multi-standard radio chips, increases the need for high-precision, high-throughput, and energy-efficient backend processing ... peak PAE is affected by less than 1% (46 mW/(46 mW + 1.8 W/0.4)) by this 64-QAM-capable AMO SCS backend. (IEEE Journal of Solid-State Circuits, Vol. 48)
What can one learn about material structure given a single first-principles calculation?
NASA Astrophysics Data System (ADS)
Rajen, Nicholas; Coh, Sinisa
2018-05-01
We extract a variable X from the electron orbitals Ψ_nk and energies E_nk in the parent high-symmetry structure of a wide range of complex oxides: perovskites, rutiles, pyrochlores, and cristobalites. Even though the calculation is done only in the parent structure, with no distortions, we show that X dictates the material's true ground-state structure. We propose using Wannier functions to extract concealed variables such as X both for material structure prediction and for high-throughput approaches.
A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting
Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.
2016-01-01
Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945
Yang, Jijin; Ferranti, David C; Stern, Lewis A; Sanford, Colin A; Huang, Jason; Ren, Zheng; Qin, Lu-Chang; Hall, Adam R
2011-07-15
We report the formation of solid-state nanopores using a scanning helium ion microscope. The fabrication process offers the advantage of high sample throughput along with fine control over nanopore dimensions, producing single pores with diameters below 4 nm. Electronic noise associated with ion transport through the resultant pores is found to be comparable with levels measured on devices made with the established technique of transmission electron microscope milling. We demonstrate the utility of our nanopores for biomolecular analysis by measuring the passage of double-strand DNA.
2016-07-29
Research Addressing Contaminants in Low Permeability Zones - A State of the Science Review. SERDP Project ER-1740, July 2016. Tom Sale, Saeed ... managing releases of chlorinated solvents and other persistent contaminants in groundwater in unconsolidated sediments.
An image analysis toolbox for high-throughput C. elegans assays
Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.
2012-01-01
We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
High-throughput, image-based screening of pooled genetic variant libraries
Emanuel, George; Moffitt, Jeffrey R.; Zhuang, Xiaowei
2018-01-01
Image-based, high-throughput screening of genetic perturbations will advance both biology and biotechnology. We report a high-throughput screening method that allows diverse genotypes and corresponding phenotypes to be imaged in numerous individual cells. We achieve genotyping by introducing barcoded genetic variants into cells and using massively multiplexed FISH to measure the barcodes. We demonstrated this method by screening mutants of the fluorescent protein YFAST, yielding brighter and more photostable YFAST variants. PMID:29083401
Experimental Design for Combinatorial and High Throughput Materials Development
NASA Astrophysics Data System (ADS)
Cawse, James N.
2002-12-01
In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include: High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space; Combinatorial Mapping of Polymer Blends Phase Behavior; Split-Plot Designs; Artificial Neural Networks in Catalyst Development; and The Monte Carlo Approach to Library Design and Redesign. This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.
Arabaci, Murat; Djordjevic, Ivan B; Saunders, Ross; Marcoccia, Roberto M
2010-02-01
In order to achieve high-speed transmission over optical transport networks (OTNs) and maximize their throughput, we propose using rate-adaptive polarization-multiplexed coded multilevel modulation with coherent detection based on component non-binary quasi-cyclic (QC) LDPC codes. Compared to the prior-art bit-interleaved LDPC-coded modulation (BI-LDPC-CM) scheme, the proposed non-binary LDPC-coded modulation (NB-LDPC-CM) scheme not only reduces latency, due to symbol- instead of bit-level processing, but also provides either an impressive reduction in computational complexity or striking improvements in coding gain, depending on the constellation size. As the paper shows, compared to its prior-art binary counterpart, the proposed NB-LDPC-CM scheme better addresses the needs of future OTNs: achieving the target BER performance and providing the maximum possible throughput over the entire lifetime of the OTN.
High-throughput screening of chemical effects on ...
Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples on steroidogenesis via HPLC-MS/MS quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a three stage screening strategy. The first stage established the maximum tolerated concentration (MTC; >70% viability) per sample. The second stage quantified changes in hormone levels at the MTC while the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were pre-stimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2,060 chemical samples evaluated, 524 samples were selected for six-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into five distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A d
NASA Technical Reports Server (NTRS)
Eckberg, Dwain L.
2003-01-01
Respiratory activity phasically alters membrane potentials of preganglionic vagal and sympathetic motoneurones and continuously modulates their responsiveness to stimulatory inputs. The most obvious manifestation of this 'respiratory gating' is respiratory sinus arrhythmia, the rhythmic fluctuations of electrocardiographic R-R intervals observed in healthy resting humans. Phasic autonomic motoneurone firing, reflecting the throughput of the system, depends importantly on the intensity of stimulatory inputs, such that when levels of stimulation are low (as with high arterial pressure and sympathetic activity, or low arterial pressure and vagal activity), respiratory fluctuations of sympathetic or vagal firing are also low. The respiratory gate has a finite capacity, and high levels of stimulation override the ability of respiration to gate autonomic responsiveness. Autonomic throughput also depends importantly on other factors, including especially, the frequency of breathing, the rate at which the gate opens and closes. Respiratory sinus arrhythmia is small at rapid, and large at slow breathing rates. The strong correlation between systolic pressure and R-R intervals at respiratory frequencies reflects the influence of respiration on these two measures, rather than arterial baroreflex physiology. A wide range of evidence suggests that respiratory activity gates the timing of autonomic motoneurone firing, but does not influence its tonic level. I propose that the most enduring significance of respiratory gating is its use as a precisely controlled experimental tool to tease out and better understand otherwise inaccessible human autonomic neurophysiological mechanisms.
Deciphering the genomic targets of alkylating polyamide conjugates using high-throughput sequencing
Chandran, Anandhakumar; Syed, Junetha; Taylor, Rhys D.; Kashiwazaki, Gengo; Sato, Shinsuke; Hashiya, Kaori; Bando, Toshikazu; Sugiyama, Hiroshi
2016-01-01
Chemically engineered small molecules targeting specific genomic sequences play an important role in drug development research. Pyrrole-imidazole polyamides (PIPs) are a group of molecules that can bind to the DNA minor-groove and can be engineered to target specific sequences. Their biological effects rely primarily on their selective DNA binding. However, the binding mechanism of PIPs at the chromatinized genome level is poorly understood. Herein, we report a method using high-throughput sequencing to identify the DNA-alkylating sites of PIP-indole-seco-CBI conjugates. High-throughput sequencing analysis of conjugate 2 showed highly similar DNA-alkylating sites on synthetic oligos (histone-free DNA) and on human genomes (chromatinized DNA context). To our knowledge, this is the first report identifying alkylation sites across genomic DNA by alkylating PIP conjugates using high-throughput sequencing. PMID:27098039
Charge-sensitive front-end electronics with operational amplifiers for CdZnTe detectors
NASA Astrophysics Data System (ADS)
Födisch, P.; Berthel, M.; Lange, B.; Kirschke, T.; Enghardt, W.; Kaever, P.
2016-09-01
Cadmium zinc telluride (CdZnTe, CZT) radiation detectors are suitable for a variety of applications due to their high spatial resolution and spectroscopic energy performance at room temperature. However, state-of-the-art detector systems require high-performance readout electronics. Although an application-specific integrated circuit (ASIC) is an adequate solution for the readout, no commercial circuit meets the requirements of high dynamic range and high throughput. Consequently, the present study develops the analog front-end electronics with operational amplifiers for an 8×8 pixelated CZT detector. For this purpose, we modeled an electrical equivalent circuit of the CZT detector with the associated charge-sensitive amplifier (CSA). Based on a detailed network analysis, the circuit design is completed with numerical values for various features such as ballistic deficit, charge-to-voltage gain, rise time, and noise level. The performance is verified with synthetic detector signals and a pixel detector. The experimental results with the pixel detector assembly and a 22Na radioactive source emphasize the depth dependence of the measured energy. After pulse processing with depth correction based on the fit of the weighting potential, the energy resolution is 2.2% (FWHM) for the 511 keV photopeak.
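For orientation, the charge-to-voltage gain referred to above follows, for an idealized charge-sensitive amplifier, the textbook relation (this is a generic illustration, not a value or formula taken from the paper):

```latex
V_{\mathrm{out}} \approx \frac{Q}{C_f}, \qquad \tau_{\mathrm{decay}} = R_f C_f
```

where Q is the collected charge and C_f and R_f are the feedback capacitance and resistance; in the ideal case the gain is set by C_f rather than by the detector capacitance, while R_f C_f determines the pulse decay and influences the noise level and throughput.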
High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast
High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...
Ultra-High Density Holographic Memory Module with Solid-State Architecture
NASA Technical Reports Server (NTRS)
Markov, Vladimir B.
2000-01-01
NASA's terrestrial, space, and deep-space missions require technology that allows storing, retrieving, and processing a large volume of information. Holographic memory offers high-density data storage with parallel access and high throughput. Several methods exist for data multiplexing based on the fundamental principles of volume hologram selectivity. We recently demonstrated that spatial (amplitude-phase) encoding of the reference wave (SERW) looks promising as a way to increase the storage density. The SERW hologram offers a selectivity mechanism other than the traditional methods, namely spatial de-correlation between the recorded and reconstruction fields. In this report we present the experimental results of the SERW-hologram memory module with solid-state architecture, which is of particular interest for space operations.
Moore, Priscilla A; Kery, Vladimir
2009-01-01
High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are the high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, the ammonium sulfate precipitation is much less labor intensive and time consuming than the ultrafiltration.
Argueta, Edwin; Shaji, Jeena; Gopalan, Arun; Liao, Peilin; Snurr, Randall Q; Gómez-Gualdrón, Diego A
2018-01-09
Metal-organic frameworks (MOFs) are porous crystalline materials with attractive properties for gas separation and storage. Their remarkable tunability makes it possible to create millions of MOF variations but creates the need for fast material screening to identify promising structures. Computational high-throughput screening (HTS) is a possible solution, but its usefulness is tied to accurate predictions of MOF adsorption properties. Accurate adsorption simulations often require an accurate description of electrostatic interactions, which depend on the electronic charges of the MOF atoms. HTS-compatible methods to assign charges to MOF atoms need to accurately reproduce electrostatic potentials (ESPs) and be computationally affordable, but current methods present an unsatisfactory trade-off between computational cost and accuracy. We illustrate a method to assign charges to MOF atoms based on ab initio calculations on MOF molecular building blocks. A library of building blocks with built-in charges is thus created and used by an automated MOF construction code to create hundreds of MOFs with charges "inherited" from the constituent building blocks. The molecular building block-based (MBBB) charges are similar to REPEAT charges (charges that reproduce ESPs obtained from ab initio calculations on crystallographic unit cells of nanoporous crystals), and thus similar predictions of adsorption loadings, heats of adsorption, and Henry's constants are obtained with either method. The presented results indicate that the MBBB method to assign charges to MOF atoms is suitable for use in computational high-throughput screening of MOFs for applications that involve adsorption of molecules such as carbon dioxide.
High-throughput analysis of peptide binding modules
Liu, Bernard A.; Engelmann, Brett; Nash, Piers D.
2014-01-01
Modular protein interaction domains that recognize linear peptide motifs are found in hundreds of proteins within the human genome. Some protein interaction domains such as SH2, 14-3-3, Chromo and Bromo domains serve to recognize post-translational modification of amino acids (such as phosphorylation, acetylation, methylation etc.) and translate these into discrete cellular responses. Other modules such as SH3 and PDZ domains recognize linear peptide epitopes and serve to organize protein complexes based on localization and regions of elevated concentration. In both cases, the ability to nucleate specific signaling complexes is in large part dependent on the selectivity of a given protein module for its cognate peptide ligand. High throughput analysis of peptide-binding domains by peptide or protein arrays, phage display, mass spectrometry or other HTP techniques provides new insight into the potential protein-protein interactions prescribed by individual or even whole families of modules. Systems level analyses have also promoted a deeper understanding of the underlying principles that govern selective protein-protein interactions and how selectivity evolves. Lastly, there is a growing appreciation for the limitations and potential pitfalls of high-throughput analysis of protein-peptide interactomes. This review will examine some of the common approaches utilized for large-scale studies of protein interaction domains and suggest a set of standards for the analysis and validation of datasets from large-scale studies of peptide-binding modules. We will also highlight how data from large-scale studies of modular interaction domain families can provide insight into systems level properties such as the linguistics of selective interactions. PMID:22610655
NASA Astrophysics Data System (ADS)
Hinuma, Yoyo; Kumagai, Yu; Tanaka, Isao; Oba, Fumiyasu
2017-02-01
The band alignment of prototypical semiconductors and insulators is investigated using first-principles calculations. A dielectric-dependent hybrid functional, where the nonlocal Fock exchange mixing is set at the reciprocal of the static electronic dielectric constant and the exchange correlation is otherwise treated as in the Perdew-Burke-Ernzerhof (PBE0) hybrid functional, is used as well as the Heyd-Scuseria-Ernzerhof (HSE06) hybrid and PBE semilocal functionals. In addition, these hybrid functionals are applied non-self-consistently to accelerate calculations. The systems considered include C and Si in the diamond structure, BN, AlP, AlAs, AlSb, GaP, GaAs, InP, ZnS, ZnSe, ZnTe, CdS, CdSe, and CdTe in the zinc-blende structure, MgO in the rocksalt structure, and GaN and ZnO in the wurtzite structure. Surface band positions with respect to the vacuum level, i.e., ionization potentials and electron affinities, and band offsets at selected zinc-blende heterointerfaces are evaluated as well as band gaps. The non-self-consistent approach speeds up hybrid functional calculations by an order of magnitude, while it is shown using HSE06 that the resultant band gaps and surface band positions are similar to the self-consistent results. The dielectric-dependent hybrid functional improves the band gaps and surface band positions of wide-gap systems over HSE06. The interfacial band offsets are predicted with a similar degree of precision. Overall, the performance of the dielectric-dependent hybrid functional is comparable to the GW0 approximation based on many-body perturbation theory in the prediction of band gaps and alignments for most systems. The present results demonstrate that the dielectric-dependent hybrid functional, particularly when applied non-self-consistently, is promising for applications to systematic calculations or high-throughput screening that demand both computational efficiency and sufficient accuracy.
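A hedged sketch of the mixing rule described above, as commonly written for dielectric-dependent hybrids (the abstract states the idea but not the expression, so the exact form used in the paper may differ): the Fock-exchange fraction α is set to the reciprocal of the static electronic dielectric constant,

```latex
\alpha = \frac{1}{\varepsilon_{\infty}}, \qquad
E_{xc} \approx \alpha\, E_x^{\mathrm{HF}} + (1-\alpha)\, E_x^{\mathrm{PBE}} + E_c^{\mathrm{PBE}}
```

with the remaining exchange and the correlation treated at the PBE level, as in PBE0-type functionals.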
Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.
Yang, Darren; Wong, Wesley P
2018-01-01
We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.
High throughput single cell counting in droplet-based microfluidics.
Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie
2017-05-02
Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at a high-throughput, used to characterize cell encapsulation and cell viability during incubation in droplets.
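For context, cell encapsulation in droplets is commonly characterized with a Poisson loading model (a standard assumption in the field, not a formula quoted from this paper): if λ is the mean number of cells per droplet, the probability of finding k cells in a given droplet is

```latex
P(k) = \frac{\lambda^{k} e^{-\lambda}}{k!}
```

so an accurate count of the input cell concentration directly controls the fractions of empty, single-cell, and multi-cell droplets, which is why precise counting matters for the robustness of droplet-based single-cell studies.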
2016-12-01
Award Number: W81XWH-13-1-0371. Title: High-Throughput Sequencing of Germline and Tumor From Men with Early-Onset Metastatic Prostate Cancer. Dates covered: 30 Sep 2013 - 29 Sep 2016. ... presenting with metastatic prostate cancer at a young age (before age 60 years). Whole exome sequencing identified a panel of germline variants that have ...
Che, James; Yu, Victor; Dhar, Manjima; Renier, Corinne; Matsumoto, Melissa; Heirich, Kyra; Garon, Edward B; Goldman, Jonathan; Rao, Jianyu; Sledge, George W; Pegram, Mark D; Sheth, Shruti; Jeffrey, Stefanie S; Kulkarni, Rajan P; Sollier, Elodie; Di Carlo, Dino
2016-03-15
Circulating tumor cells (CTCs) are emerging as rare but clinically significant non-invasive cellular biomarkers for cancer patient prognosis, treatment selection, and treatment monitoring. Current CTC isolation approaches, such as immunoaffinity, filtration, or size-based techniques, are often limited by throughput, purity, large output volumes, or inability to obtain viable cells for downstream analysis. For all technologies, traditional immunofluorescent staining alone has been employed to distinguish and confirm the presence of isolated CTCs among contaminating blood cells, although cells isolated by size may express vastly different phenotypes. Consequently, CTC definitions have been non-trivial, researcher-dependent, and evolving. Here we describe a complete set of objective criteria, leveraging well-established cytomorphological features of malignancy, by which we identify large CTCs. We apply the criteria to CTCs enriched from stage IV lung and breast cancer patient blood samples using the High Throughput Vortex Chip (Vortex HT), an improved microfluidic technology for the label-free, size-based enrichment and concentration of rare cells. We achieve improved capture efficiency (up to 83%), high speed of processing (8 mL/min of 10x diluted blood, or 800 μL/min of whole blood), and high purity (avg. background of 28.8±23.6 white blood cells per mL of whole blood). We show markedly improved performance of CTC capture (84% positive test rate) in comparison to previous Vortex designs and the current FDA-approved gold standard CellSearch assay. The results demonstrate the ability to quickly collect viable and pure populations of abnormal large circulating cells unbiased by molecular characteristics, which helps uncover further heterogeneity in these cells.
Infrastructure to Support Ultra High Throughput Biodosimetry Screening after a Radiological Event
Garty, G.; Karam, P.A.; Brenner, D. J.
2011-01-01
Purpose After a large-scale radiological event, there will be a pressing need to assess, within a few days, the radiation doses received by tens or hundreds of thousands of individuals. This is for triage, to prevent treatment locations from being overwhelmed, in what is sure to be a resource limited scenario, as well as to facilitate dose-dependent treatment decisions. In addition there are psychosocial considerations, in that active reassurance of minimal exposure is a potentially effective antidote to mass panic, as well as long-term considerations, to facilitate later studies of cancer and other long-term disease risks. Materials and Methods As described elsewhere in this issue, we are developing a Rapid Automated Biodosimetry Tool (RABiT). The RABiT allows high throughput analysis of thousands of blood samples per day, providing a dose estimate that can be used to support clinical triage and treatment decisions. Results Development of the RABiT has motivated us to consider the logistics of incorporating such a system into the existing emergency response scenarios of a large metropolitan area. We present here a view of how one or more centralized biodosimetry readout devices might be incorporated into an infrastructure in which fingerstick blood samples are taken at many distributed locations within an affected city or region and transported to centralized locations. Conclusions High throughput biodosimetry systems offer the opportunity to perform biodosimetric assessments on a large number of persons. As such systems reach a high level of maturity, emergency response scenarios will need to be tweaked to make use of these powerful tools. This can be done relatively easily within the framework of current scenarios. PMID:21675819
NASA Astrophysics Data System (ADS)
Buongiorno Nardelli, Marco
High-Throughput Quantum-Mechanics computation of materials properties by ab initio methods has become the foundation of an effective approach to materials design, discovery and characterization. This data-driven approach to materials science currently presents the most promising path to the development of advanced technological materials that could solve or mitigate important social and economic challenges of the 21st century. In particular, the rapid proliferation of computational data on materials properties presents the possibility to complement and extend materials property databases where the experimental data is lacking and difficult to obtain. Enhanced repositories such as AFLOWLIB open novel opportunities for structure discovery and optimization, including uncovering of unsuspected compounds, metastable structures and correlations between various properties. The practical realization of these opportunities depends almost exclusively on the design of efficient algorithms for electronic structure simulations of realistic material systems beyond the limitations of the current standard theories. In this talk, I will review recent progress in theoretical and computational tools, and in particular, discuss the development and validation of novel functionals within Density Functional Theory and of local basis representations for effective ab-initio tight-binding schemes. Marco Buongiorno Nardelli is a pioneer in the development of computational platforms for theory/data/applications integration rooted in his profound and extensive expertise in the design of electronic structure codes and in his vision for sustainable and innovative software development for high-performance materials simulations. His research activities range from the design and discovery of novel materials for 21st century applications in renewable energy, environment, nano-electronics and devices, the development of advanced electronic structure theories and high-throughput techniques in materials genomics and computational materials design, to an active role as community scientific software developer (QUANTUM ESPRESSO, WanT, AFLOWpi).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Allan Ray
1987-05-01
Increases in high-speed hardware have mandated studies in software techniques to exploit the parallel capabilities. This thesis examines the effects a run-time scheduler has on a multiprocessor. The model consists of directed acyclic graphs, generated from serial FORTRAN benchmark programs by the parallel compiler Parafrase. A multitasked, multiprogrammed environment is created. Dependencies are generated by the compiler. Tasks are bidimensional, i.e., they may specify both time and processor requests. Processor requests may be folded into execution time by the scheduler. The graphs may arrive at arbitrary time intervals. The general case is NP-hard; thus, a variety of heuristics are examined by a simulator. Multiprogramming demonstrates a greater need for a run-time scheduler than does monoprogramming for a variety of reasons, e.g., greater stress on the processors, a larger number of independent control paths, and more variety in the task parameters. The dynamic critical path series of algorithms perform well. Dynamic critical volume did not add much. Unfortunately, dynamic critical path maximizes turnaround time as well as throughput. Two schedulers are presented which balance throughput and turnaround time. The first requires classification of jobs by type; the second requires selection of a ratio value which is dependent upon system parameters. 45 refs., 19 figs., 20 tabs.
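A minimal sketch of the dynamic-critical-path idea discussed above (illustrative only; the thesis' task model, heuristics, and parameters are more elaborate): rank ready tasks by the length of their longest downstream path in the DAG and dispatch the most critical ones to free processors first.

```python
from collections import defaultdict

def critical_path_lengths(tasks, succ):
    """tasks: {task: duration}; succ: {task: [successor, ...]} describing a DAG.
    Returns the longest path length starting at each task (its 'criticality')."""
    memo = {}
    def rank(t):
        if t not in memo:
            memo[t] = tasks[t] + max((rank(s) for s in succ.get(t, [])), default=0)
        return memo[t]
    return {t: rank(t) for t in tasks}

def schedule(tasks, succ, n_proc):
    """Greedy list scheduler: at each step run the most critical ready tasks."""
    pred_count = defaultdict(int)
    for t, successors in succ.items():
        for s in successors:
            pred_count[s] += 1
    ranks = critical_path_lengths(tasks, succ)
    ready = sorted((t for t in tasks if pred_count[t] == 0),
                   key=ranks.get, reverse=True)
    time, order = 0, []
    # Simplification: tasks launched together are assumed to finish together
    # after the longest duration in the batch.
    while ready:
        batch, ready = ready[:n_proc], ready[n_proc:]
        order.append((time, batch))
        time += max(tasks[t] for t in batch)
        for t in batch:
            for s in succ.get(t, []):
                pred_count[s] -= 1
                if pred_count[s] == 0:
                    ready.append(s)
        ready.sort(key=ranks.get, reverse=True)
    return order

# Toy example: a diamond-shaped dependency graph on two processors.
tasks = {"a": 2, "b": 3, "c": 1, "d": 2}
succ = {"a": ["b", "c"], "b": ["d"], "c": ["d"]}
print(schedule(tasks, succ, n_proc=2))
```

Balancing throughput against turnaround time, as the thesis does, would add a second criterion (e.g., job class or a tunable ratio) on top of this purely criticality-driven ordering.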
Nicolas, Jonathan; Hendriksen, Peter J M; Gerssen, Arjen; Bovee, Toine F H; Rietjens, Ivonne M C M
2014-01-01
Marine biotoxins can accumulate in fish and shellfish, representing a possible threat for consumers. Many marine biotoxins affect neuronal function, essentially through their interaction with ion channels or receptors, leading to different symptoms including paralysis and even death. The detection of marine biotoxins in seafood products is therefore a priority. Official control methods often still rely on in vivo assays, such as the mouse bioassay. This test is considered unethical, and the development of alternative assays is urgently required. Chemical analyses as well as in vitro assays have been developed to detect marine biotoxins in seafood. However, most of the current in vitro alternatives to animal testing present disadvantages: low throughput and a lack of sensitivity, resulting in a high number of false-negative results. Thus, there is an urgent need for the development of new in vitro tests that would allow the detection of marine biotoxins in seafood products at a low cost, with high throughput combined with high sensitivity, reproducibility, and predictivity. Mode of action based in vitro bioassays may provide tools that fulfil these requirements. This review covers the current state of the art of such mode of action based alternative assays to detect neurotoxic marine biotoxins in seafood. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Influence relevance voting: an accurate and interpretable virtual high throughput screening method.
Swamidass, S Joshua; Azencott, Chloé-Agathe; Lin, Ting-Wan; Gramajo, Hugo; Tsai, Shiou-Chuan; Baldi, Pierre
2009-04-01
Given activity training data from high-throughput screening (HTS) experiments, virtual high-throughput screening (vHTS) methods aim to predict in silico the activity of untested chemicals. We present a novel method, the Influence Relevance Voter (IRV), specifically tailored for the vHTS task. The IRV is a low-parameter neural network which refines a k-nearest neighbor classifier by nonlinearly combining the influences of a chemical's neighbors in the training set. Influences are decomposed, also nonlinearly, into a relevance component and a vote component. The IRV is benchmarked using the data and rules of two large, open competitions, and its performance compared to the performance of other participating methods, as well as of an in-house support vector machine (SVM) method. On these benchmark data sets, IRV achieves state-of-the-art results, comparable to the SVM in one case, and significantly better than the SVM in the other, retrieving three times as many actives in the top 1% of its prediction-sorted list. The IRV presents several other important advantages over SVMs and other methods: (1) the output predictions have probabilistic semantics; (2) the underlying inferences are interpretable; (3) the training time is very short, on the order of minutes even for very large data sets; (4) the risk of overfitting is minimal, due to the small number of free parameters; and (5) additional information can easily be incorporated into the IRV architecture. Combined with its performance, these qualities make the IRV particularly well suited for vHTS.
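A hedged, highly simplified sketch of the scheme described above (an illustration of the general idea only, not the authors' learned parameterization): each training neighbor contributes an influence equal to a relevance term derived from its similarity times a vote term derived from its class label, and the summed influences are squashed into a probability.

```python
import math

def irv_score(similarities, labels, k=5, w_rel=1.0, w_vote=1.0, bias=0.0):
    """similarities: similarity of the query compound to each training compound (0..1).
    labels: 1 for active, 0 for inactive. Returns an estimated P(active).
    Illustrative only: the published IRV learns these weights as a small neural network."""
    neighbors = sorted(zip(similarities, labels), reverse=True)[:k]
    total = bias
    for sim, label in neighbors:
        relevance = w_rel * sim                    # how much this neighbor matters
        vote = w_vote * (1.0 if label else -1.0)   # which way it pushes the prediction
        total += relevance * vote
    return 1.0 / (1.0 + math.exp(-total))          # logistic squashing -> probability

# Toy query: five nearest training compounds, three of them active.
print(irv_score([0.9, 0.85, 0.8, 0.7, 0.6], [1, 1, 0, 1, 0]))
```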
High-throughput linear optical stretcher for mechanical characterization of blood cells.
Roth, Kevin B; Neeves, Keith B; Squier, Jeff; Marr, David W M
2016-04-01
This study describes a linear optical stretcher as a high-throughput mechanical property cytometer. Custom, inexpensive, and scalable optics image a linear diode bar source into a microfluidic channel, where cells are hydrodynamically focused into the optical stretcher. Upon entering the stretching region, antipodal optical forces generated by the refraction of tightly focused laser light at the cell membrane deform each cell in flow. Each cell relaxes as it flows out of the trap and is compared to the stretched state to determine deformation. The deformation response of untreated red blood cells and neutrophils were compared to chemically treated cells. Statistically significant differences were observed between normal, diamide-treated, and glutaraldehyde-treated red blood cells, as well as between normal and cytochalasin D-treated neutrophils. Based on the behavior of the pure, untreated populations of red cells and neutrophils, a mixed population of these cells was tested and the discrete populations were identified by deformability. © 2015 International Society for Advancement of Cytometry.
Physico-chemical foundations underpinning microarray and next-generation sequencing experiments
Harrison, Andrew; Binder, Hans; Buhot, Arnaud; Burden, Conrad J.; Carlon, Enrico; Gibas, Cynthia; Gamble, Lara J.; Halperin, Avraham; Hooyberghs, Jef; Kreil, David P.; Levicky, Rastislav; Noble, Peter A.; Ott, Albrecht; Pettitt, B. Montgomery; Tautz, Diethard; Pozhitkov, Alexander E.
2013-01-01
Hybridization of nucleic acids on solid surfaces is a key process involved in high-throughput technologies such as microarrays and, in some cases, next-generation sequencing (NGS). A physical understanding of the hybridization process helps to determine the accuracy of these technologies. The goal of a widespread research program is to develop reliable transformations between the raw signals reported by the technologies and individual molecular concentrations from an ensemble of nucleic acids. This research has inputs from many areas, from bioinformatics and biostatistics, to theoretical and experimental biochemistry and biophysics, to computer simulations. A group of leading researchers met in Ploen, Germany, in 2011 to discuss present knowledge and limitations of our physico-chemical understanding of high-throughput nucleic acid technologies. This meeting inspired us to write this summary, which provides an overview of the state-of-the-art approaches, based on physico-chemical foundations, to modeling the nucleic acid hybridization process on solid surfaces. In addition, practical application of current knowledge is emphasized. PMID:23307556
Integrated nanopore sensing platform with sub-microsecond temporal resolution
Rosenstein, Jacob K; Wanunu, Meni; Merchant, Christopher A; Drndic, Marija; Shepard, Kenneth L
2012-01-01
Nanopore sensors have attracted considerable interest for high-throughput sensing of individual nucleic acids and proteins without the need for chemical labels or complex optics. A prevailing problem in nanopore applications is that the transport kinetics of single biomolecules are often faster than the measurement time resolution. Methods to slow down biomolecular transport can be troublesome and are at odds with the natural goal of high-throughput sensing. Here we introduce a low-noise measurement platform that integrates a complementary metal-oxide semiconductor (CMOS) preamplifier with solid-state nanopores in thin silicon nitride membranes. With this platform we achieved a signal-to-noise ratio exceeding five at a bandwidth of 1 MHz, which to our knowledge is the highest bandwidth nanopore recording to date. We demonstrate transient signals as brief as 1 μs from short DNA molecules as well as current signatures during molecular passage events that shed light on submolecular DNA configurations in small nanopores. PMID:22426489
CSMA Versus Prioritized CSMA for Air-Traffic-Control Improvement
NASA Technical Reports Server (NTRS)
Robinson, Daryl C.
2001-01-01
OPNET version 7.0 simulations are presented involving an important application of the Aeronautical Telecommunications Network (ATN), Controller Pilot Data Link Communications (CPDLC) over the Very High Frequency Data Link, Mode 2 (VDL-2). Communication is modeled for essentially all incoming and outgoing nonstop air-traffic for just three United States cities: Cleveland, Cincinnati, and Detroit. There are 32 airports in the simulation, 29 of which are either sources or destinations for the air-traffic of the aforementioned three airports. The simulation involves 111 Air Traffic Control (ATC) ground stations and 1,235 equally equipped aircraft, taking off, flying realistic free-flight trajectories, and landing in a 24-hr period. Collisionless, Prioritized Carrier Sense Multiple Access (CSMA) is successfully tested and compared with the traditional CSMA typically associated with VDL-2. The performance measures include latency, throughput, and packet loss. As expected, Prioritized CSMA is much quicker and more efficient than traditional CSMA. These simulation results show the potency of Prioritized CSMA for implementing low latency, high throughput, and efficient connectivity.
Knight, Jean; Rovida, Costanca
2014-01-01
The proposed Safe Cosmetics and Personal Care Products Act of 2013 calls for a new evaluation program for cosmetic ingredients in the US, with the new assessments initially dependent on expanded animal testing. This paper considers possible testing scenarios under the proposed Act and estimates the number of test animals and cost under each scenario. It focuses on the impact for the first 10 years of testing, the period of greatest impact on animals and costs. The analysis suggests the first 10 years of testing under the Act could evaluate, at most, about 50% of ingredients used in cosmetics. Testing during this period would cost about $1.7-$9 billion and 1-11.5 million animals. By test year 10, alternative, high-throughput test methods under development are expected to be available, replacing animal testing and allowing rapid evaluation of all ingredients. Given the high cost in dollars and animal lives of the first 10 years for only about half of ingredients, a better choice may be to accelerate development of high-throughput methods. This would allow evaluation of 100% of cosmetic ingredients before year 10 at lower cost and without animal testing.
Mazoure, Bogdan; Caraus, Iurie; Nadon, Robert; Makarenkov, Vladimir
2018-06-01
Data generated by high-throughput screening (HTS) technologies are prone to spatial bias. Traditionally, bias correction methods used in HTS assume either a simple additive or, more recently, a simple multiplicative spatial bias model. These models do not, however, always provide an accurate correction of measurements in wells located at the intersection of rows and columns affected by spatial bias. The measurements in these wells depend on the nature of interaction between the involved biases. Here, we propose two novel additive and two novel multiplicative spatial bias models accounting for different types of bias interactions. We describe a statistical procedure that allows for detecting and removing different types of additive and multiplicative spatial biases from multiwell plates. We show how this procedure can be applied by analyzing data generated by the four HTS technologies (homogeneous, microorganism, cell-based, and gene expression HTS), the three high-content screening (HCS) technologies (area, intensity, and cell-count HCS), and the only small-molecule microarray technology available in the ChemBank small-molecule screening database. The proposed methods are included in the AssayCorrector program, implemented in R, and available on CRAN.
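A minimal sketch of the simplest case handled by such methods, a purely additive row-plus-column bias on a multiwell plate removed by a median-polish-style correction; the paper's four interaction-aware bias models and the AssayCorrector implementation in R are considerably more sophisticated than this illustration:

```python
import numpy as np

def remove_additive_bias(plate, n_iter=10):
    """plate: 2D array of raw well measurements (rows x columns).
    Iteratively subtracts row and column medians (median polish),
    which removes a simple additive row + column bias component."""
    corrected = plate.astype(float).copy()
    overall = np.median(corrected)
    for _ in range(n_iter):
        corrected -= np.median(corrected, axis=1, keepdims=True)  # row effects
        corrected -= np.median(corrected, axis=0, keepdims=True)  # column effects
    return corrected + overall  # restore the plate-wide level

# Toy 4x6 plate with an artificial additive bias on row 0 and column 2.
rng = np.random.default_rng(0)
plate = rng.normal(100, 1, size=(4, 6))
plate[0, :] += 10.0
plate[:, 2] += 5.0
print(np.round(remove_additive_bias(plate), 1))
```

Wells at the intersection of a biased row and a biased column are exactly where a purely additive (or purely multiplicative) model can fail, which is the motivation for the interaction-aware models proposed in the paper.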
Transfer, imaging, and analysis plate for facile handling of 384 hanging drop 3D tissue spheroids.
Cavnar, Stephen P; Salomonsson, Emma; Luker, Kathryn E; Luker, Gary D; Takayama, Shuichi
2014-04-01
Three-dimensional culture systems bridge the experimental gap between in vivo and in vitro physiology. However, nonstandardized formation and limited downstream adaptability of 3D cultures have hindered mainstream adoption of these systems for biological applications, especially for low- and moderate-throughput assays commonly used in biomedical research. Here we build on our recent development of a 384-well hanging drop plate for spheroid culture to design a complementary spheroid transfer and imaging (TRIM) plate. The low-aspect ratio wells of the TRIM plate facilitated high-fidelity, user-independent, contact-based collection of hanging drop spheroids. Using the TRIM plate, we demonstrated several downstream analyses, including bulk tissue collection for flow cytometry, high-resolution low working-distance immersion imaging, and timely reagent delivery for enzymatic studies. Low working-distance multiphoton imaging revealed a cell type-dependent, macroscopic spheroid structure. Unlike ovarian cancer spheroids, which formed loose, disk-shaped spheroids, human mammary fibroblasts formed tight, spherical, and nutrient-limited spheroids. Beyond the applications we describe here, we expect the hanging drop spheroid plate and complementary TRIM plate to facilitate analyses of spheroids across the spectrum of throughput, particularly for bulk collection of spheroids and high-content imaging.
Cancer biomarker discovery: the entropic hallmark.
Berretta, Regina; Moscato, Pablo
2010-08-18
It is a commonly accepted belief that cancer cells modify their transcriptional state during the progression of the disease. We propose that the progression of cancer cells towards malignant phenotypes can be efficiently tracked using high-throughput technologies that follow the gradual changes observed in the gene expression profiles by employing Shannon's mathematical theory of communication. Methods based on Information Theory can then quantify the divergence of cancer cells' transcriptional profiles from those of normally appearing cells of the originating tissues. The relevance of the proposed methods can be evaluated using microarray datasets available in the public domain, but the method is in principle applicable to other high-throughput methods. Using melanoma and prostate cancer datasets, we illustrate how Shannon Entropy and the Jensen-Shannon divergence can be employed to trace the progression of transcriptional changes during the disease. We establish how the variations of these two measures correlate with established biomarkers of cancer progression. The Information Theory measures allow us to identify novel biomarkers for both progressive and relatively more sudden transcriptional changes leading to malignant phenotypes. At the same time, the methodology was able to validate a large number of genes and processes that seem to be implicated in the progression of melanoma and prostate cancer. We thus present a quantitative guiding rule, a new unifying hallmark of cancer: the cancer cell's transcriptome changes lead to measurable observed transitions of Normalized Shannon Entropy values (as measured by high-throughput technologies). At the same time, tumor cells increase their divergence from the normal tissue profile, increasing their disorder via the creation of states that we might not directly measure. This unifying hallmark allows us, via the Jensen-Shannon divergence, to identify the arrow of time of the processes from the gene expression profiles, and helps to map the phenotypical and molecular hallmarks of specific cancer subtypes. The deep mathematical basis of the approach allows us to suggest that this principle is, hopefully, of general applicability for other diseases.
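For readers wanting to reproduce the two measures named above, a minimal sketch (not the authors' pipeline) computes the normalized Shannon entropy of an expression profile and the Jensen-Shannon divergence between tumor and normal profiles, treating each profile as a probability distribution over genes; the example values are hypothetical.

```python
import numpy as np

def normalized_shannon_entropy(expr):
    """Shannon entropy of an expression profile, normalized to [0, 1]."""
    p = np.asarray(expr, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                      # convention: 0 * log(0) = 0
    return -(p * np.log2(p)).sum() / np.log2(len(expr))

def jensen_shannon_divergence(expr_a, expr_b):
    """JS divergence (base 2, bounded by 1) between two expression profiles."""
    p = np.asarray(expr_a, dtype=float); p = p / p.sum()
    q = np.asarray(expr_b, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(x, y):
        mask = x > 0
        return (x[mask] * np.log2(x[mask] / y[mask])).sum()
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# hypothetical 4-gene profiles: a tumor sample versus matched normal tissue
tumor = np.array([5.0, 1.0, 0.5, 3.5])
normal = np.array([4.0, 2.0, 1.0, 3.0])
print(normalized_shannon_entropy(tumor), jensen_shannon_divergence(tumor, normal))
```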
High-throughput sequencing methods to study neuronal RNA-protein interactions.
Ule, Jernej
2009-12-01
UV-cross-linking and RNase protection, combined with high-throughput sequencing, have provided global maps of RNA sites bound by individual proteins or ribosomes. Using a stringent purification protocol, UV-CLIP (UV-cross-linking and immunoprecipitation) was able to identify intronic and exonic sites bound by splicing regulators in mouse brain tissue. Ribosome profiling has been used to quantify ribosome density on budding yeast mRNAs under different environmental conditions. Post-transcriptional regulation in neurons requires high spatial and temporal precision, as is evident from the role of localized translational control in synaptic plasticity. It remains to be seen if the high-throughput methods can be applied quantitatively to study the dynamics of RNP (ribonucleoprotein) remodelling in specific neuronal populations during the neurodegenerative process. It is certain, however, that applications of new biochemical techniques followed by high-throughput sequencing will continue to provide important insights into the mechanisms of neuronal post-transcriptional regulation.
Automation in biological crystallization.
Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen
2014-06-01
Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.
Automation in biological crystallization
Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen
2014-01-01
Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074
Comparative Transcriptomes and EVO-DEVO Studies Depending on Next Generation Sequencing.
Liu, Tiancheng; Yu, Lin; Liu, Lei; Li, Hong; Li, Yixue
2015-01-01
High-throughput technology has driven progress in omics studies, including genomics and transcriptomics. We review improvements in comparative omics studies that are attributable to high-throughput measurement by next-generation sequencing technology. Comparative genomics has been successfully applied to evolutionary analysis, while comparative transcriptomics is used to compare expression profiles between two subjects by differential expression or differential coexpression, which enables their application in evolutionary developmental biology (EVO-DEVO) studies. EVO-DEVO studies focus on the evolutionary pressures affecting the morphogenesis of development, and previous work has sought to identify the most conserved stages of embryonic development. Earlier studies relied on macroscopic morphological similarity, whereas new technology enables detection of similarity at the level of molecular mechanism. Evolutionary models of embryo development, including the "funnel-like" model and the "hourglass" model, have been evaluated by combining these new comparative transcriptomic methods with prior comparative genomic information. Although the technology has brought EVO-DEVO studies into a new era, technological and material limitations remain, and further investigation will require more careful study design and procedures.
Toh, Shigeo; Holbrook-Smith, Duncan; Stokes, Michael E; Tsuchiya, Yuichiro; McCourt, Peter
2014-08-14
Strigolactones are terpenoid-based plant hormones that act as communication signals within a plant, between plants and fungi, and between parasitic plants and their hosts. Here we show that an active enantiomer form of the strigolactone GR24, the germination stimulant karrikin, and a number of structurally related small molecules called cotylimides all bind the HTL/KAI2 α/β hydrolase in Arabidopsis. Strigolactones and cotylimides also promoted an interaction between HTL/KAI2 and the F-box protein MAX2 in yeast. Identification of this chemically dependent protein-protein interaction prompted the development of a yeast-based, high-throughput chemical screen for potential strigolactone mimics. Of the 40 lead compounds identified, three were found to have in planta strigolactone activity using Arabidopsis-based assays. More importantly, these three compounds were all found to stimulate suicide germination of the obligate parasitic plant Striga hermonthica. These results suggest that screening strategies involving yeast/Arabidopsis models may be useful in combating parasitic plant infestations. Copyright © 2014 Elsevier Ltd. All rights reserved.
Vonk, Freek J; Jackson, Kate; Doley, Robin; Madaras, Frank; Mirtschin, Peter J; Vidal, Nicolas
2011-04-01
Snake venoms are recognized here as a grossly under-explored resource in pharmacological prospecting. Discoveries in snake systematics demonstrate that former taxonomic bias in research has led to the neglect of thousands of species of potential medical use. Recent discoveries reveal an unexpectedly vast degree of variation in venom composition among snakes, from different species down to litter mates. The molecular mechanisms underlying this diversity are only beginning to be understood. However, the enormous potential that this resource represents for pharmacological prospecting is clear. New high-throughput screening systems offer greatly increased speed and efficiency in identifying and extracting therapeutically useful molecules. At the same time a global biodiversity crisis is threatening the very snake populations on which hopes for new venom-derived medications depend. Biomedical researchers, pharmacologists, clinicians, herpetologists, and conservation biologists must combine their efforts if the full potential of snake venom-derived medications is to be realized. Copyright © 2011 WILEY Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Schnable, J. C.; Pandey, P.; Ge, Y.; Xu, Y.; Qiu, Y.; Liang, Z.
2017-12-01
Maize (Zea mays ssp. mays) is one of three crops, along with rice and wheat, responsible for more than half of all calories consumed around the world. Increasing the yield and stress tolerance of these crops is essential to meet the growing need for food. The cost and speed of plant phenotyping are currently the largest constraints on plant breeding efforts. Datasets linking new types of high throughput phenotyping data collected from plants to the performance of the same genotypes under agronomic conditions across a wide range of environments are essential for developing new statistical approaches and computer vision based tools. A set of maize inbreds and hybrids - primarily recently off patent lines - were phenotyped using a high throughput platform at the University of Nebraska-Lincoln. These lines have been previously subjected to high density genotyping, and scored for a core set of 13 phenotypes in field trials across 13 North American states in 2014, 2015, 2016, and 2017. Correlations between image-based measurements and manual measurements demonstrated the feasibility of quantifying variation in plant architecture using image data. However, we demonstrate that naive approaches to measuring traits such as biomass that are developed without integrating genotypic information can introduce nonrandom measurement errors which are confounded with variation between plant accessions. Analysis of hyperspectral image data demonstrated unique signatures from stem tissue which were not identified using aerial imagery. Integrating heritable phenotypes from high-throughput phenotyping data with field data from different environments can reveal previously unknown factors influencing yield plasticity.
Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S.
2016-01-01
Motivation: Modern high-throughput biotechnologies such as microarrays are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples are assayed, leading to the classical ‘large p, small n’ problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Results: Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors to be used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods, including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the ‘large p, small n’ problem. Availability and implementation: Our method is implemented in the R package IPBT, which is freely available from https://github.com/benliemory/IPBT. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26519502
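The general idea of building informative priors from historical data can be sketched with a simple empirical-Bayes shrinkage of gene-wise variances (this is a generic moderated-t illustration under an assumed prior weight d0, not the IPBT implementation):

```python
import numpy as np
from scipy import stats

def moderated_t(new_a, new_b, hist_var, d0=10.0):
    """Two-group moderated t-statistics using gene-wise prior variances.

    new_a, new_b : (n_samples x n_genes) expression matrices for the new study.
    hist_var     : gene-wise variances estimated from historical data (the prior).
    d0           : prior degrees of freedom, i.e. how much weight the historical
                   prior receives (an illustrative default, not a published value).
    """
    n1, n2 = new_a.shape[0], new_b.shape[0]
    d = n1 + n2 - 2                                   # residual df in the new data
    s2 = (np.var(new_a, axis=0, ddof=1) * (n1 - 1) +
          np.var(new_b, axis=0, ddof=1) * (n2 - 1)) / d
    post_var = (d0 * hist_var + d * s2) / (d0 + d)    # shrink toward the prior
    se = np.sqrt(post_var * (1.0 / n1 + 1.0 / n2))
    t = (new_a.mean(axis=0) - new_b.mean(axis=0)) / se
    p = 2 * stats.t.sf(np.abs(t), df=d0 + d)          # df augmented by the prior
    return t, p
```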
Conventional and hyperspectral time-series imaging of maize lines widely used in field trials
Liang, Zhikai; Pandey, Piyush; Stoerger, Vincent; Xu, Yuhang; Qiu, Yumou; Ge, Yufeng
2018-01-01
Background: Maize (Zea mays ssp. mays) is 1 of 3 crops, along with rice and wheat, responsible for more than one-half of all calories consumed around the world. Increasing the yield and stress tolerance of these crops is essential to meet the growing need for food. The cost and speed of plant phenotyping are currently the largest constraints on plant breeding efforts. Datasets linking new types of high-throughput phenotyping data collected from plants to the performance of the same genotypes under agronomic conditions across a wide range of environments are essential for developing new statistical approaches and computer vision-based tools. Findings: A set of maize inbreds, primarily recently off patent lines, were phenotyped using a high-throughput platform at University of Nebraska-Lincoln. These lines have been previously subjected to high-density genotyping and scored for a core set of 13 phenotypes in field trials across 13 North American states in 2 years by the Genomes 2 Fields Consortium. A total of 485 GB of image data including RGB, hyperspectral, fluorescence, and thermal infrared photos has been released. Conclusions: Correlations between image-based measurements and manual measurements demonstrated the feasibility of quantifying variation in plant architecture using image data. However, naive approaches to measuring traits such as biomass can introduce nonrandom measurement errors confounded with genotype variation. Analysis of hyperspectral image data demonstrated unique signatures from stem tissue. Integrating heritable phenotypes from high-throughput phenotyping data with field data from different environments can reveal previously unknown factors that influence yield plasticity. PMID:29186425
Conventional and hyperspectral time-series imaging of maize lines widely used in field trials.
Liang, Zhikai; Pandey, Piyush; Stoerger, Vincent; Xu, Yuhang; Qiu, Yumou; Ge, Yufeng; Schnable, James C
2018-02-01
Maize (Zea mays ssp. mays) is 1 of 3 crops, along with rice and wheat, responsible for more than one-half of all calories consumed around the world. Increasing the yield and stress tolerance of these crops is essential to meet the growing need for food. The cost and speed of plant phenotyping are currently the largest constraints on plant breeding efforts. Datasets linking new types of high-throughput phenotyping data collected from plants to the performance of the same genotypes under agronomic conditions across a wide range of environments are essential for developing new statistical approaches and computer vision-based tools. A set of maize inbreds-primarily recently off patent lines-were phenotyped using a high-throughput platform at University of Nebraska-Lincoln. These lines have been previously subjected to high-density genotyping and scored for a core set of 13 phenotypes in field trials across 13 North American states in 2 years by the Genomes 2 Fields Consortium. A total of 485 GB of image data including RGB, hyperspectral, fluorescence, and thermal infrared photos has been released. Correlations between image-based measurements and manual measurements demonstrated the feasibility of quantifying variation in plant architecture using image data. However, naive approaches to measuring traits such as biomass can introduce nonrandom measurement errors confounded with genotype variation. Analysis of hyperspectral image data demonstrated unique signatures from stem tissue. Integrating heritable phenotypes from high-throughput phenotyping data with field data from different environments can reveal previously unknown factors that influence yield plasticity. © The Authors 2017. Published by Oxford University Press.
Sandham, David A; Arnold, Nicola; Aschauer, Heinrich; Bala, Kamlesh; Barker, Lucy; Brown, Lyndon; Brown, Zarin; Budd, David; Cox, Brian; Docx, Cerys; Dubois, Gerald; Duggan, Nicholas; England, Karen; Everatt, Brian; Furegati, Marcus; Hall, Edward; Kalthoff, Frank; King, Anna; Leblanc, Catherine J; Manini, Jodie; Meingassner, Josef; Profit, Rachael; Schmidt, Alfred; Simmons, Jennifer; Sohal, Bindi; Stringer, Rowan; Thomas, Matthew; Turner, Katharine L; Walker, Christoph; Watson, Simon J; Westwick, John; Willis, Jennifer; Williams, Gareth; Wilson, Caroline
2013-11-01
Optimization of a 7-azaindole-3-acetic acid CRTh2 receptor antagonist chemotype derived from high throughput screening furnished a highly selective compound NVP-QAV680 with low nM functional potency for inhibition of CRTh2 driven human eosinophil and Th2 lymphocyte activation in vitro. The molecule exhibited good oral bioavailability in the rat, combined with efficacy in rodent CRTh2-dependent mechanistic and allergic disease models and was suitable for clinical development. Copyright © 2013 Elsevier Ltd. All rights reserved.
High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...
Filošević, Ana; Al-Samarai, Sabina; Andretić Waldowski, Rozi
2018-01-01
Drosophila melanogaster can be used to identify genes with novel functional roles in neuronal plasticity induced by repeated consumption of addictive drugs. Behavioral sensitization is a relatively simple behavioral output of the plastic changes that occur in the brain after repeated exposures to drugs of abuse. The development of screening procedures for genes that control behavioral sensitization has stalled due to a lack of high-throughput behavioral tests that can be used in genetically tractable organisms, such as Drosophila. We have developed a new behavioral test, FlyBong, which combines delivery of volatilized cocaine (vCOC) to individually housed flies with objective quantification of their locomotor activity. There are two main advantages of FlyBong: it is high-throughput and it allows for comparisons of the locomotor activity of individual flies before and after single or multiple exposures. At the population level, exposure to vCOC leads to a transient and concentration-dependent increase in locomotor activity, representing sensitivity to an acute dose. A second exposure leads to a further increase in locomotion, representing locomotor sensitization. We validate FlyBong by showing that locomotor sensitization at either the population or individual level is absent in mutants for the circadian genes period (per), Clock (Clk), and cycle (cyc). The locomotor sensitization that is present in timeless (tim) and pigment dispersing factor (pdf) mutant flies is in large part not cocaine specific, but derives from increased sensitivity to warm air. Circadian genes are not only an integral part of the neural mechanism that is required for the development of locomotor sensitization, but in addition, they modulate the intensity of locomotor sensitization as a function of the time of day. The motor-activating effects of cocaine are sexually dimorphic and require a functional dopaminergic transporter. FlyBong is a new and improved method for inducing and measuring locomotor sensitization to cocaine in individual Drosophila. Because of its high-throughput nature, FlyBong can be used in genetic screens or in selection experiments aimed at the unbiased identification of functional genes involved in acute or chronic effects of volatilized psychoactive substances.
Filošević, Ana; Al-samarai, Sabina; Andretić Waldowski, Rozi
2018-01-01
Drosophila melanogaster can be used to identify genes with novel functional roles in neuronal plasticity induced by repeated consumption of addictive drugs. Behavioral sensitization is a relatively simple behavioral output of the plastic changes that occur in the brain after repeated exposures to drugs of abuse. The development of screening procedures for genes that control behavioral sensitization has stalled due to a lack of high-throughput behavioral tests that can be used in genetically tractable organisms, such as Drosophila. We have developed a new behavioral test, FlyBong, which combines delivery of volatilized cocaine (vCOC) to individually housed flies with objective quantification of their locomotor activity. There are two main advantages of FlyBong: it is high-throughput and it allows for comparisons of the locomotor activity of individual flies before and after single or multiple exposures. At the population level, exposure to vCOC leads to a transient and concentration-dependent increase in locomotor activity, representing sensitivity to an acute dose. A second exposure leads to a further increase in locomotion, representing locomotor sensitization. We validate FlyBong by showing that locomotor sensitization at either the population or individual level is absent in mutants for the circadian genes period (per), Clock (Clk), and cycle (cyc). The locomotor sensitization that is present in timeless (tim) and pigment dispersing factor (pdf) mutant flies is in large part not cocaine specific, but derives from increased sensitivity to warm air. Circadian genes are not only an integral part of the neural mechanism that is required for the development of locomotor sensitization, but in addition, they modulate the intensity of locomotor sensitization as a function of the time of day. The motor-activating effects of cocaine are sexually dimorphic and require a functional dopaminergic transporter. FlyBong is a new and improved method for inducing and measuring locomotor sensitization to cocaine in individual Drosophila. Because of its high-throughput nature, FlyBong can be used in genetic screens or in selection experiments aimed at the unbiased identification of functional genes involved in acute or chronic effects of volatilized psychoactive substances. PMID:29459820
From Lab to Fab: Developing a Nanoscale Delivery Tool for Scalable Nanomanufacturing
NASA Astrophysics Data System (ADS)
Safi, Asmahan A.
The emergence of nanomaterials with unique properties at the nanoscale over the past two decades carries the capacity to impact society and to transform or create new industries ranging from nanoelectronics to nanomedicine. However, a gap in nanomanufacturing technologies has prevented the translation of nanomaterials into real-world commercialized products. Bridging this gap requires a paradigm shift in methods for fabricating structured devices with nanoscale resolution in a repeatable fashion. This thesis explores new paradigms for fabricating nanoscale structures, devices, and systems for high-throughput, high-registration applications. We present a robust and scalable nanoscale delivery platform, the Nanofountain Probe (NFP), for parallel direct-write of functional materials. The design and microfabrication of the NFP are presented. The new generation addresses the challenges of throughput, resolution, and ink replenishment that characterize tip-based nanomanufacturing. To achieve these goals, an optimized probe geometry is integrated into the process along with channel sealing and cantilever bending. The capabilities of the newly fabricated probes are demonstrated through two types of delivery: protein nanopatterning and single-cell nanoinjection. The broad applications of the NFP for single-cell delivery are investigated. An external microfluidic packaging is developed to enable delivery in a liquid environment. The system is integrated with a combined atomic force microscope and inverted fluorescence microscope. Intracellular delivery is demonstrated by injecting a fluorescent dextran into HeLa cells in vitro while monitoring the injection forces. Such developments enable in vitro cellular delivery for single-cell studies and high-throughput gene expression. The nanomanufacturing capabilities of NFPs are explored. Nanofabrication of carbon nanotube-based electronics presents all the manufacturing challenges characteristic of assembling nanomaterials precisely onto devices. The presented study combines top-down and bottom-up approaches by integrating catalyst patterning and carbon nanotube growth directly on structures. Large arrays of iron-rich catalyst are patterned on a substrate for subsequent carbon nanotube synthesis. The dependence on probe geometry and substrate wetting is assessed by modeling and experimental studies. Finally, preliminary results on the synthesis of carbon nanotubes by catalyst-assisted chemical vapor deposition suggest that increasing the catalyst yield is critical. This work will enable high-throughput nanomanufacturing of carbon nanotube-based devices.
High-throughput methods for characterizing the mechanical properties of coatings
NASA Astrophysics Data System (ADS)
Siripirom, Chavanin
The characterization of mechanical properties in a combinatorial and high-throughput workflow has been a bottleneck that reduces the speed of the materials development process. High-throughput characterization of mechanical properties was applied in this research to reduce the amount of sample handling and to accelerate the output. A puncture tester was designed and built to evaluate the toughness of materials using an innovative template design coupled with automation. The test takes the form of a circular free-film indentation. A single template contains 12 samples, which are tested in a rapid serial approach. Next, the operational principles of a novel parallel dynamic mechanical-thermal analysis instrument were analyzed in detail for potential sources of error. The test uses a model of a circular bilayer fixed-edge plate deformation. A total of 96 samples can be analyzed simultaneously, which provides a tremendous increase in efficiency compared with a conventional dynamic test. The modulus values determined by the system showed considerable variation; the sources of error were examined and improvements to the system were made. A finite element analysis was used to assess the accuracy of the closed-form solution with respect to testing geometries, such as sample thickness. Good control of sample thickness proved crucial to the accuracy and precision of the output. An attempt was then made to correlate the high-throughput experiments with conventional coating testing methods. Automated nanoindentation in dynamic mode was found to provide information on the near-surface modulus and could potentially correlate with the pendulum hardness test through the loss tangent component. Lastly, surface characterization of stratified siloxane-polyurethane coatings was carried out with X-ray photoelectron spectroscopy, Rutherford backscattering spectroscopy, transmission electron microscopy, and nanoindentation. The siloxane component segregates to the surface during curing. The distribution of siloxane as a function of depth into the sample showed differences depending on the formulation parameters. The coatings with higher siloxane content near the surface were those found to perform well in field tests.
Wu, Jian; Dai, Wei; Wu, Lin; Wang, Jinke
2018-02-13
Next-generation sequencing (NGS) is fundamental to current biological and biomedical research. Construction of the sequencing library is a key step of NGS. Therefore, various library construction methods have been explored. However, the current methods are still limited by some shortcomings. This study developed a new NGS library construction method, Single strand Adaptor Library Preparation (SALP), by using a novel single strand adaptor (SSA). SSA is a double-stranded oligonucleotide with a 3' overhang of 3 random nucleotides, which can be efficiently ligated to the 3' end of single-stranded DNA by T4 DNA ligase. SALP can be started with any denatured DNA fragments such as those sheared by Tn5 tagmentation, enzyme digestion or sonication. When started with Tn5-tagmented chromatin, SALP can overcome a key limitation of ATAC-seq and become a high-throughput NGS library construction method, SALP-seq, which can be used to comparatively characterize the chromatin openness state of multiple cells in an unbiased manner. In this way, this study successfully characterized the comparative chromatin openness states of four different cell lines, including GM12878, HepG2, HeLa and 293T, with SALP-seq. Similarly, this study also successfully characterized the chromatin openness states of HepG2 cells with SALP-seq using 10^5 to 500 cells. This study developed a new NGS library construction method, SALP, based on a novel kind of single strand adaptor (SSA), which should have wide applications in the future due to its unique performance.
Draveling, C; Ren, L; Haney, P; Zeisse, D; Qoronfleh, M W
2001-07-01
The revolution in genomics and proteomics is having a profound impact on drug discovery. Today's protein scientist demands a faster, easier, more reliable way to purify proteins. A new high-capacity, high-throughput technology has been developed at Perbio Sciences for affinity protein purification. This technology utilizes selected chromatography media that are dehydrated to form uniform aggregates. The SwellGel aggregates instantly rehydrate upon addition of the protein sample, allowing purification and direct performance of multiple assays in a variety of formats. SwellGel technology has greater stability and is easier to handle than standard wet chromatography resins. The microplate format of this technology provides high-capacity, high-throughput features, recovering milligram quantities of protein suitable for high-throughput screening or biophysical/structural studies. Data will be presented applying SwellGel technology to recombinant 6x His-tagged protein and glutathione-S-transferase (GST) fusion protein purification. Copyright 2001 Academic Press.
NASA Astrophysics Data System (ADS)
Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela
2016-10-01
Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or with low resolution at high throughput. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip with a micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.
Research Of Airborne Precision Spacing to Improve Airport Arrival Operations
NASA Technical Reports Server (NTRS)
Barmore, Bryan E.; Baxley, Brian T.; Murdoch, Jennifer L.
2011-01-01
In September 2004, the European Organization for the Safety of Air Navigation (EUROCONTROL) and the United States Federal Aviation Administration (FAA) signed a Memorandum of Cooperation to mutually develop, modify, test, and evaluate systems, procedures, facilities, and devices to meet the need for safe and efficient air navigation and air traffic control in the future. In the United States and Europe, these efforts are defined within the architectures of the Next Generation Air Transportation System (NextGen) Program and Single European Sky Air Traffic Management Research (SESAR) Program respectively. Both programs have identified Airborne Spacing as a critical component, with Automatic Dependent Surveillance Broadcast (ADS-B) as a key enabler. Increased interest in reducing airport community noise and the escalating cost of aviation fuel has led to the use of Continuous Descent Arrival (CDA) procedures to reduce noise, emissions, and fuel usage compared to current procedures. To provide these operational enhancements, arrival flight paths into terminal areas are planned around continuous vertical descents that are closer to an optimum trajectory than those in use today. The profiles are designed to be near-idle descents from cruise altitude to the Final Approach Fix (FAF) and are typically without any level segments. By staying higher and faster than conventional arrivals, CDAs also save flight time for the aircraft operator. The drawback is that the variation of optimized trajectories for different types and weights of aircraft requires the Air Traffic Controller to provide more airspace around an aircraft on a CDA than on a conventional arrival procedure. This additional space decreases the throughput rate of the destination airport. Airborne self-spacing concepts have been developed to increase the throughput at high-demand airports by managing the inter-arrival spacing to be more precise and consistent using on-board guidance. It has been proposed that the additional space needed around an aircraft performing a CDA could be reduced or eliminated when using airborne spacing techniques.
The ToxCast Dashboard helps users examine high-throughput assay data to inform chemical safety decisions. To date, it has data on over 9,000 chemicals and information from more than 1,000 high-throughput assay endpoint components.
Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong
2014-01-01
Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980
Ionomics: The functional genomics of elements.
Baxter, Ivan
2010-03-01
Ionomics is the study of elemental accumulation in living systems using high-throughput elemental profiling. This approach has been applied extensively in plants for forward and reverse genetics, screening diversity panels, and modeling of physiological states. In this review, I will discuss some of the advantages and limitations of the ionomics approach as well as the important parameters to consider when designing ionomics experiments, and how to evaluate ionomics data.
A High-throughput Screening Assay for Determining Cellular Levels of Total Tau Protein
Dehdashti, Seameen J.; Zheng, Wei; Gever, Joel R.; Wilhelm, Robert; Nguyen, Dac-Trung; Sittampalam, Gurusingham; McKew, John C.; Austin, Christopher P.; Prusiner, Stanley B.
2014-01-01
The microtubule-associated protein (MAP) tau has been implicated in the pathology of numerous neurodegenerative diseases. In the past decade, the hyperphosphorylated and aggregated states of tau protein have been important targets in the drug discovery field for the potential treatment of Alzheimer’s disease. Although several compounds have been reported to reduce the hyperphosphorylated state of tau or impact the stabilization of tau, their therapeutic activities are still to be validated. Recently, reduction of total cellular tau protein has emerged as an alternate intervention point for drug development and a potential treatment of tauopathies. We have developed and optimized a homogenous assay, using the AlphaLISA and HTRF assay technologies, for the quantification of total cellular tau protein levels in the SH-SY5Y neuroblastoma cell line. The signal-to-basal ratios were 375 and 5.3, and the Z’ factors were 0.67 and 0.60 for the AlphaLISA and HTRF tau assays, respectively. The clear advantages of this homogeneous tau assay over conventional total tau assays, such as ELISA and Western blot, are the elimination of plate wash steps and miniaturization of the assay into 1536-well plate format for the ultra–high-throughput screening of large compound libraries. PMID:23905996
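The two quality metrics quoted above have standard definitions: the signal-to-basal ratio is the ratio of the positive- and negative-control means, and the Z' factor is 1 minus three times the summed control standard deviations divided by the separation of the control means. A small sketch with hypothetical control wells (not the paper's raw data):

```python
import numpy as np

def signal_to_basal(pos, neg):
    """Ratio of positive-control mean to negative-control (basal) mean."""
    return np.mean(pos) / np.mean(neg)

def z_prime(pos, neg):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|; > 0.5 indicates a robust assay."""
    return 1.0 - 3.0 * (np.std(pos, ddof=1) + np.std(neg, ddof=1)) / abs(np.mean(pos) - np.mean(neg))

# illustrative control wells from a 1536-well plate (hypothetical numbers)
pos = np.random.default_rng(1).normal(1000.0, 50.0, 32)   # e.g. full tau signal
neg = np.random.default_rng(2).normal(50.0, 10.0, 32)     # e.g. background control
print(signal_to_basal(pos, neg), z_prime(pos, neg))
```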
A high-quality annotated transcriptome of swine peripheral blood
USDA-ARS?s Scientific Manuscript database
Background: High throughput gene expression profiling assays of peripheral blood are widely used in biomedicine, as well as in animal genetics and physiology research. Accurate, comprehensive, and precise interpretation of such high throughput assays relies on well-characterized reference genomes an...
An innovative SNP genotyping method adapting to multiple platforms and throughputs.
Long, Y M; Chao, W S; Ma, G J; Xu, S S; Qi, L L
2017-03-01
An innovative genotyping method designated semi-thermal asymmetric reverse PCR (STARP) was developed for genotyping individual SNPs with improved accuracy, flexible throughput, low operational costs, and high platform compatibility. Multiplex chip-based technology for genome-scale genotyping of single nucleotide polymorphisms (SNPs) has made great progress in the past two decades. However, PCR-based genotyping of individual SNPs still remains problematic in accuracy, throughput, simplicity, and/or operational costs, as well as in compatibility with multiple platforms. Here, we report a novel SNP genotyping method designated semi-thermal asymmetric reverse PCR (STARP). In this method, the genotyping assay is performed under unique PCR conditions using two universal priming element-adjustable primers (PEA-primers) and one group of three locus-specific primers: two asymmetrically modified allele-specific primers (AMAS-primers) and their common reverse primer. The two AMAS-primers each carry a single-base substitution at a different position in their 3' regions to significantly increase the amplification specificity for the two alleles, and are tailed at their 5' ends to provide priming sites for the PEA-primers. The two PEA-primers were developed for common use in all genotyping assays to stringently target the PCR fragments generated by the two AMAS-primers with similar PCR efficiencies and to allow flexible detection using either gel-free fluorescence signals or gel-based size separation. The state-of-the-art primer design and unique PCR conditions endow STARP with all the major advantages of high accuracy, flexible throughput, simple assay design, low operational costs, and platform compatibility. In addition to SNPs, STARP can also be employed in genotyping of indels (insertion-deletion polymorphisms). As vast variation in DNA sequences is being unearthed by genome sequencing projects and genotyping by sequencing, STARP will have wide applications across all biological organisms in agriculture, medicine, and forensics.
Duo, Jia; Bruno, JoAnne; Kozhich, Alexander; David-Brown, Donata; Luo, Linlin; Kwok, Suk; Santockyte, Rasa; Haulenbeek, Jonathan; Liu, Rong; Hamuro, Lora; Peterson, Jon E; Piccoli, Steven; DeSilva, Binodh; Pillutla, Renuka; Zhang, Yan J
2018-04-01
Ligand-binding assay (LBA) performance depends on quality reagents. Strategic reagent screening and characterization are critical to LBA development, optimization and validation, and the application of advanced technologies expedites the reagent screening and assay development process. By evaluating surface plasmon resonance technology, which offers high-throughput kinetic information, this article aims to provide perspectives on applying the technology to strategic LBA critical reagent screening and characterization, supported by a number of case studies from multiple biotherapeutic programs.
The Dana Farber Cancer Institute CTD2 Center focuses on the use of high-throughput genetic and bioinformatic approaches to identify and credential oncogenes and co-dependencies in cancers. This Center aims to provide the cancer research community with information that will facilitate the prioritization of targets based on both genomic and functional evidence, inform the most appropriate genetic context for downstream mechanistic and validation studies, and enable the translation of this information into therapeutics and diagnostics.
Polonchuk, Liudmila
2014-01-01
Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using planar patch-clamp technology demonstrated a rather moderate throughput, a few second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch(®) 96 allows throughput to be substantially increased without compromising data quality. This chapter describes features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.
HIGH THROUGHPUT ASSESSMENTS OF CONVENTIONAL AND ALTERNATIVE COMPOUNDS
High throughput approaches for quantifying chemical hazard, exposure, and sustainability have the potential to dramatically impact the pace and nature of risk assessments. Integrated evaluation strategies developed at the US EPA incorporate inherency, bioactivity, bioavailability, ...
Leng, Yuankui
2017-01-01
Spectrometrically or optically encoded microsphere based suspension array technology (SAT) is applicable to the high-throughput, simultaneous detection of multiple analytes within a small, single sample volume. Thanks to the rapid development of nanotechnology, tremendous progress has been made in the multiplexed detecting capability, sensitivity, and photostability of suspension arrays. In this review, we first focus on the current stock of nanoparticle-based barcodes as well as the manufacturing technologies required for their production. We then move on to discuss all existing barcode-based bioanalysis patterns, including the various labels used in suspension arrays, label-free platforms, signal amplification methods, and fluorescence resonance energy transfer (FRET)-based platforms. We then introduce automatic platforms for suspension arrays that use superparamagnetic nanoparticle-based microspheres. Finally, we summarize the current challenges and their proposed solutions, which are centered on improving encoding capacities, alternative probe possibilities, nonspecificity suppression, directional immobilization, and “point of care” platforms. Throughout this review, we aim to provide a comprehensive guide for the design of suspension arrays, with the goal of improving their performance in areas such as multiplexing capacity, throughput, sensitivity, and cost effectiveness. We hope that our summary on the state-of-the-art development of these arrays, our commentary on future challenges, and some proposed avenues for further advances will help drive the development of suspension array technology and its related fields. PMID:26021602
Yong, Y K; Moheimani, S O R; Kenton, B J; Leang, K K
2012-12-01
Recent interest in high-speed scanning probe microscopy for high-throughput applications including video-rate atomic force microscopy and probe-based nanofabrication has sparked attention on the development of high-bandwidth flexure-guided nanopositioning systems (nanopositioners). Such nanopositioners are designed to move samples with sub-nanometer resolution with positioning bandwidth in the kilohertz range. State-of-the-art designs incorporate uniquely designed flexure mechanisms driven by compact and stiff piezoelectric actuators. This paper surveys key advances in mechanical design and control of dynamic effects and nonlinearities, in the context of high-speed nanopositioning. Future challenges and research topics are also discussed.
Kitsos, Christine M; Bhamidipati, Phani; Melnikova, Irena; Cash, Ethan P; McNulty, Chris; Furman, Julia; Cima, Michael J; Levinson, Douglas
2007-01-01
This study examined whether hierarchical clustering could be used to detect cell states induced by treatment combinations that were generated through automation and high-throughput (HT) technology. Data-mining techniques were used to analyze the large experimental data sets to determine whether nonlinear, non-obvious responses could be extracted from the data. Unary, binary, and ternary combinations of pharmacological factors (examples of stimuli) were used to induce differentiation of HL-60 cells using a HT automated approach. Cell profiles were analyzed by incorporating hierarchical clustering methods on data collected by flow cytometry. Data-mining techniques were used to explore the combinatorial space for nonlinear, unexpected events. Additional small-scale, follow-up experiments were performed on cellular profiles of interest. Multiple, distinct cellular profiles were detected using hierarchical clustering of expressed cell-surface antigens. Data-mining of this large, complex data set retrieved cases of both factor dominance and cooperativity, as well as atypical cellular profiles. Follow-up experiments found that treatment combinations producing "atypical cell types" made those cells more susceptible to apoptosis. Conclusions: Hierarchical clustering and other data-mining techniques were applied to analyze large data sets from HT flow cytometry. From each sample, the data set was filtered and used to define discrete, usable states that were then related back to their original formulations. Analysis of resultant cell populations induced by a multitude of treatments identified unexpected phenotypes and nonlinear response profiles.
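A minimal sketch of the clustering step described above (generic, not the authors' exact pipeline): each treatment combination is summarized by median surface-marker intensities, the profiles are clustered hierarchically, and the tree is cut into discrete cell states. The marker set, distance metric, and cluster count are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# rows = treatment combinations, columns = median surface-marker intensities
# (e.g. markers tracked during HL-60 differentiation; values here are synthetic)
rng = np.random.default_rng(0)
profiles = rng.normal(size=(24, 4))

# average-linkage hierarchical clustering on correlation distance
tree = linkage(pdist(profiles, metric='correlation'), method='average')

# cut the tree into a chosen number of discrete cell states
states = fcluster(tree, t=4, criterion='maxclust')
for label in np.unique(states):
    members = np.where(states == label)[0]
    print(f"state {label}: treatment combinations {members.tolist()}")
```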
GiNA, an efficient and high-throughput software for horticultural phenotyping
USDA-ARS?s Scientific Manuscript database
Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed,...
High-throughput quantification of hydroxyproline for determination of collagen.
Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan
2011-10-15
An accurate and high-throughput assay for collagen is essential for collagen research and development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The time required for sample preparation using acid hydrolysis and neutralization prior to assay is what limits the current method for determining hydroxyproline. This work describes the conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline. Copyright © 2011 Elsevier Inc. All rights reserved.
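Collagen is then typically back-calculated from the measured hydroxyproline with a mass-conversion factor; the factor is tissue-dependent, and the value used below is an illustrative assumption rather than one taken from the paper.

```python
def collagen_from_hydroxyproline(hyp_ug_per_ml, conversion_factor=7.5):
    """Estimate collagen concentration (ug/mL) from a hydroxyproline assay result.

    conversion_factor: mass of collagen per mass of hydroxyproline; tissue-dependent
    (values in the ~7-8 range are commonly used; confirm for the collagen source).
    """
    return hyp_ug_per_ml * conversion_factor

# e.g. a hydrolysate reading 12 ug/mL hydroxyproline implies roughly 90 ug/mL collagen
print(collagen_from_hydroxyproline(12.0))
```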
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.
2016-09-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with high-throughput rapid analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hakala, Jacqueline Alexandra
2016-11-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with high-throughput rapid analysis.
A Memory Efficient Network Encryption Scheme
NASA Astrophysics Data System (ADS)
El-Fotouh, Mohamed Abo; Diepold, Klaus
In this paper, we studied the two encryption schemes widely used in network applications. Shortcomings have been found in both schemes, as they either consume more memory to gain high throughput or use less memory at the cost of low throughput. The need has arisen for a scheme that has low memory requirements and at the same time possesses high speed, as the number of internet users increases each day. We used the SSM model [1] to construct an encryption scheme based on the AES. The proposed scheme possesses high throughput together with low memory requirements.
HTP-NLP: A New NLP System for High Throughput Phenotyping.
Schlegel, Daniel R; Crowner, Chris; Lehoullier, Frank; Elkin, Peter L
2017-01-01
Secondary use of clinical data for research requires a method to process the data quickly so that researchers can rapidly extract cohorts. We present two advances in the High Throughput Phenotyping NLP system which support the aim of truly high-throughput processing of clinical data, inspired by a characterization of the linguistic properties of such data. Semantic indexing to store and generalize partially processed results and the use of compositional expressions for ungrammatical text are discussed, along with a set of initial timing results for the system.
Mapping of MPEG-4 decoding on a flexible architecture platform
NASA Astrophysics Data System (ADS)
van der Tol, Erik B.; Jaspers, Egbert G.
2001-12-01
In the field of consumer electronics, the advent of new features such as Internet, games, video conferencing, and mobile communication has triggered the convergence of television and computers technologies. This requires a generic media-processing platform that enables simultaneous execution of very diverse tasks such as high-throughput stream-oriented data processing and highly data-dependent irregular processing with complex control flows. As a representative application, this paper presents the mapping of a Main Visual profile MPEG-4 for High-Definition (HD) video onto a flexible architecture platform. A stepwise approach is taken, going from the decoder application toward an implementation proposal. First, the application is decomposed into separate tasks with self-contained functionality, clear interfaces, and distinct characteristics. Next, a hardware-software partitioning is derived by analyzing the characteristics of each task such as the amount of inherent parallelism, the throughput requirements, the complexity of control processing, and the reuse potential over different applications and different systems. Finally, a feasible implementation is proposed that includes amongst others a very-long-instruction-word (VLIW) media processor, one or more RISC processors, and some dedicated processors. The mapping study of the MPEG-4 decoder proves the flexibility and extensibility of the media-processing platform. This platform enables an effective HW/SW co-design yielding a high performance density.
A low cost and high throughput magnetic bead-based immuno-agglutination assay in confined droplets.
Teste, Bruno; Ali-Cherif, Anaïs; Viovy, Jean Louis; Malaquin, Laurent
2013-06-21
Although passive immuno-agglutination assays consist of simple, one-step procedures, they are usually not suited to high-throughput analyses and they require expensive and bulky equipment for the quantitation steps. Here we demonstrate a low-cost, multimodal and high-throughput immuno-agglutination assay that relies on a combination of magnetic beads (MBs), droplet microfluidics and magnetic tweezers. Antibody-coated MBs were used as a capture support in the homogeneous phase. Following the immune interaction, water-in-oil droplets containing MBs and analytes were generated and transported in Teflon tubing. When passing between magnetic tweezers, the MBs contained in the droplets were magnetically confined in order to enhance the agglutination rate and kinetics. When the magnetic field is released, the internal recirculation flows in the droplet induce shear forces that favor MB redispersion. In the presence of the analyte, the system preserves specific interactions and the MBs stay in the aggregated state, while in the case of a non-specific analyte, redispersion of the particles occurs. The analyte quantitation procedure relies on the MB redispersion rate within the droplet. The influence of different parameters such as magnetic field intensity, flow rate and MB concentration on the agglutination performance has been investigated and optimized. Although the immuno-agglutination assay described in this work may not compete with enzyme-linked immunosorbent assay (ELISA) in terms of sensitivity, it offers major advantages regarding reagent consumption (analysis is performed in sub-microliter droplets) and platform cost, which yields very inexpensive analyses. Moreover, the fully automated analysis procedure provides reproducible analyses with a throughput well above that of existing technologies. We demonstrated the detection of biotinylated alkaline phosphatase in 100 nL sample volumes with an analysis rate of 300 assays per hour and a limit of detection of 100 pM.
NASA Astrophysics Data System (ADS)
Kudoh, Eisuke; Ito, Haruki; Wang, Zhisen; Adachi, Fumiyuki
In mobile communication systems, high-speed packet data services are in demand. In high-speed data transmission, throughput degrades severely due to inter-path interference (IPI). Recently, we proposed a random transmit power control (TPC) to increase the uplink throughput of DS-CDMA packet mobile communications. In this paper, we apply IPI cancellation in addition to the random TPC. We derive the numerical expression of the received signal-to-interference plus noise power ratio (SINR) and introduce an IPI cancellation factor. We also derive the numerical expression of the system throughput when IPI is cancelled ideally, to compare with the system throughput evaluated numerically by the Monte Carlo method. Then we evaluate, by Monte Carlo numerical computation, the combined effect of random TPC and IPI cancellation on the uplink throughput of DS-CDMA packet mobile communications.
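Without reproducing the paper's derivation, the role of an IPI cancellation factor can be sketched schematically: residual inter-path interference is scaled by a factor between 0 (ideal cancellation) and 1 (no cancellation) before computing the received SINR. The functional form and numbers below are illustrative assumptions.

```python
def received_sinr(signal_power, ipi_power, noise_power, beta=0.5):
    """Schematic received SINR with residual inter-path interference.

    beta is an illustrative IPI cancellation factor: beta = 1 means no
    cancellation, beta = 0 means ideal cancellation (assumed form, not
    the expression derived in the paper).
    """
    return signal_power / (beta * ipi_power + noise_power)

# ideal cancellation (beta = 0) versus no cancellation (beta = 1)
print(received_sinr(1.0, 0.5, 0.1, beta=0.0), received_sinr(1.0, 0.5, 0.1, beta=1.0))
```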
Sequence-Dependent Persistence Length of Long DNA
NASA Astrophysics Data System (ADS)
Chuang, Hui-Min; Reifenberger, Jeffrey G.; Cao, Han; Dorfman, Kevin D.
2017-12-01
Using a high-throughput genome-mapping approach, we obtained circa 50 million measurements of the extension of internal human DNA segments in a 41 nm × 41 nm nanochannel. The underlying DNA sequences, obtained by mapping to the reference human genome, are 2.5-393 kilobase pairs long and have GC contents between 32.5% and 60%. Using Odijk's theory for a channel-confined wormlike chain, these data reveal that the DNA persistence length increases by almost 20% as the GC content increases. The increased persistence length is rationalized by a model, containing no adjustable parameters, that treats the DNA as a statistical terpolymer with a sequence-dependent intrinsic persistence length and a sequence-independent electrostatic persistence length.
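For readers unfamiliar with the framework, the following standard literature forms sketch the analysis; the prefactor and the persistence-length decomposition below are generic Odijk-regime expressions and not necessarily the exact ones fitted in the paper. In a rectangular channel of widths $D_1$ and $D_2$, the fractional extension of a confined wormlike chain is approximately

    \frac{\langle X\rangle}{L} \simeq 1 - \alpha\left[\left(\frac{D_1}{l_p}\right)^{2/3} + \left(\frac{D_2}{l_p}\right)^{2/3}\right], \qquad \alpha \approx 0.09137,

with $D_1 = D_2 = 41\,\mathrm{nm}$ here, so each measured extension can be inverted for a persistence length $l_p$. The terpolymer picture then decomposes it as $l_p = l_{p,\mathrm{intrinsic}}(f_{\mathrm{GC}}) + l_{p,\mathrm{electrostatic}}$, with only the intrinsic contribution depending on the GC fraction $f_{\mathrm{GC}}$.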
Datta, Sandipan; Tomilov, Alexey; Cortopassi, Gino
2016-01-01
Inherited mitochondrial complex I mutations cause Leber's hereditary optic neuropathy (LHON), a blinding disease for which no curative therapy exists. A specific biochemical consequence of LHON mutations was observed in the presence of trace rotenone: deficient complex I-dependent ATP synthesis (CIDAS) and mitochondrial O2 consumption, proportional to the clinical severity of the three primary LHON mutations. We optimized a high-throughput assay of CIDAS and used it to screen 1600 drugs down to two, papaverine and zolpidem, which protected CIDAS in LHON cells in a concentration-dependent manner. TSPO and cAMP were investigated as protective mechanisms, but a conclusive mechanism remains to be elucidated; next steps include testing in animal models. PMID:27497748
Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)
High throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however, these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...
Krebs, Arnaud R; Dessus-Babus, Sophie; Burger, Lukas; Schübeler, Dirk
2014-09-26
The majority of mammalian promoters are CpG islands: regions of high CG density that require protection from DNA methylation to be functional. Importantly, how sequence architecture mediates this unmethylated state remains unclear. To address this question in a comprehensive manner, we developed a method to interrogate the methylation states of hundreds of sequence variants inserted at the same genomic site in mouse embryonic stem cells. Using this assay, we were able to quantify the contribution of various sequence motifs to the resulting DNA methylation state. Modeling of this comprehensive dataset revealed that CG density alone is a minor determinant of the unmethylated state. Instead, the data argue for a principal role of transcription factor binding sites, a prediction confirmed by testing synthetic mutant libraries. Taken together, these findings establish the hierarchy between the two cis-encoded mechanisms that define the DNA methylation state, and thus the transcriptional competence, of CpG islands.
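A minimal sketch of the kind of sequence-feature modeling described above may help; this is not the authors' actual model, and the motifs, feature set and toy data are hypothetical:

    # Hedged sketch: regress measured fractional methylation of inserted sequence
    # variants on simple sequence features (CpG density plus counts of a few
    # candidate transcription-factor motifs).
    import numpy as np

    MOTIFS = ["CGCG", "GGGCGG", "CACGTG"]   # hypothetical candidate motifs

    def features(seq):
        # CpG dinucleotide density plus raw counts of each candidate motif
        cpg_density = seq.count("CG") / max(len(seq) - 1, 1)
        return [cpg_density] + [float(seq.count(m)) for m in MOTIFS]

    def fit_methylation_model(seqs, meth_fractions):
        # Ordinary least squares: methylation fraction ~ intercept + sequence features
        X = np.array([[1.0] + features(s) for s in seqs])
        y = np.asarray(meth_fractions, dtype=float)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef  # [intercept, CpG-density weight, per-motif weights...]

    # Hypothetical usage with toy data (real input would be the measured library):
    seqs = ["ACGCGTCGCGGGGCGGAT", "ATATATCGATATATCGAT", "ACACGTGGCGCGCGCGAT"]
    meth = [0.05, 0.80, 0.10]
    print(fit_methylation_model(seqs, meth))

In such a model, a small weight on CpG density together with strongly negative weights for bound motifs would mirror the paper's conclusion that transcription factor binding sites, rather than CG density alone, keep islands unmethylated.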
Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment
Yan, Qimin; Yu, Jie; Suram, Santosh K.; ...
2017-03-06
The limited number of known low-band-gap photoelectrocatalytic materials poses a significant challenge for the generation of chemical fuels from sunlight. Here, using high-throughput ab initio theory with experiments in an integrated workflow, we find eight ternary vanadate oxide photoanodes in the target band-gap range (1.2-2.8 eV). Detailed analysis of these vanadate compounds reveals the key role of VO4 structural motifs and electronic band-edge character in efficient photoanodes, initiating a genome for such materials and paving the way for a broadly applicable high-throughput-discovery and materials-by-design feedback loop. Considerably expanding the number of known photoelectrocatalysts for water oxidation, our study establishes ternary metal vanadates as a prolific class of photoanode materials for generation of chemical fuels from sunlight and demonstrates our high-throughput theory-experiment pipeline as a prolific approach to materials discovery.
Microfluidics for cell-based high throughput screening platforms - A review.
Du, Guansheng; Fang, Qun; den Toonder, Jaap M J
2016-01-15
In recent decades, the basic microfluidic techniques for the study of cells, such as cell culture, cell separation, and cell lysis, have been well developed. Building on these cell-handling techniques, microfluidics has been widely applied to PCR (Polymerase Chain Reaction), immunoassays, organ-on-chip systems, stem cell research, and the analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows rapid analysis of thousands of chemical, biochemical, genetic or pharmacological tests in parallel. In this review, we summarize the application of microfluidics to cell-based high-throughput screening. The screening methods covered here include approaches using the perfusion-flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidics-based high-throughput screening platforms for drug discovery. Copyright © 2015 Elsevier B.V. All rights reserved.
Development and Validation of an Automated High-Throughput System for Zebrafish In Vivo Screenings
Virto, Juan M.; Holgado, Olaia; Diez, Maria; Izpisua Belmonte, Juan Carlos; Callol-Massot, Carles
2012-01-01
The zebrafish is a vertebrate model compatible with the paradigms of drug discovery. The small size and transparency of zebrafish embryos make them amenable to the automation necessary for high-throughput screening. We have developed an automated high-throughput platform for in vivo chemical screens on zebrafish embryos that includes automated methods for embryo dispensation, compound delivery, incubation, imaging and analysis of the results. At present, two different assays, one detecting cardiotoxic compounds and one detecting angiogenesis inhibitors, can be run automatically on the platform, showing the versatility of the system. These two assays were validated with known positive and negative compounds, and a screen for previously unknown anti-angiogenic compounds was successfully carried out on the system. We present a fully automated platform that allows high-throughput screening in a vertebrate organism. PMID:22615792
BiQ Analyzer HT: locus-specific analysis of DNA methylation by high-throughput bisulfite sequencing
Lutsik, Pavlo; Feuerbach, Lars; Arand, Julia; Lengauer, Thomas; Walter, Jörn; Bock, Christoph
2011-01-01
Bisulfite sequencing is a widely used method for measuring DNA methylation in eukaryotic genomes. The assay provides single-base pair resolution and, given sufficient sequencing depth, its quantitative accuracy is excellent. High-throughput sequencing of bisulfite-converted DNA can be applied either genome wide or targeted to a defined set of genomic loci (e.g. using locus-specific PCR primers or DNA capture probes). Here, we describe BiQ Analyzer HT (http://biq-analyzer-ht.bioinf.mpi-inf.mpg.de/), a user-friendly software tool that supports locus-specific analysis and visualization of high-throughput bisulfite sequencing data. The software facilitates the shift from time-consuming clonal bisulfite sequencing to the more quantitative and cost-efficient use of high-throughput sequencing for studying locus-specific DNA methylation patterns. In addition, it is useful for locus-specific visualization of genome-wide bisulfite sequencing data. PMID:21565797
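The core locus-specific calculation that such tools automate can be sketched as follows; this is a simplified illustration, not BiQ Analyzer HT code, and it ignores strand handling, alignment gaps and quality filtering:

    # Hedged sketch: per-CpG methylation levels from bisulfite reads already
    # aligned (and padded) to a reference locus. On the converted forward strand,
    # 'C' at a CpG position indicates methylation, 'T' indicates no methylation.
    def cpg_positions(reference: str):
        return [i for i in range(len(reference) - 1) if reference[i:i + 2] == "CG"]

    def methylation_levels(reference: str, aligned_reads):
        levels = {}
        for pos in cpg_positions(reference):
            meth = unmeth = 0
            for read in aligned_reads:
                base = read[pos] if pos < len(read) else "N"
                if base == "C":
                    meth += 1
                elif base == "T":
                    unmeth += 1
            total = meth + unmeth
            levels[pos] = meth / total if total else None
        return levels

    # Hypothetical usage:
    ref   = "ATCGGACGTT"
    reads = ["ATCGGATGTT", "ATTGGACGTT", "ATCGGACGTT"]
    print(methylation_levels(ref, reads))   # {2: 0.67, 6: 0.67} (approximately)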
Yeow, Jonathan; Joshi, Sanket; Chapman, Robert; Boyer, Cyrille Andre Jean Marie
2018-04-25
Translating controlled/living radical polymerization (CLRP) from batch to the high throughput production of polymer libraries presents several challenges in terms of both polymer synthesis and characterization. Although recently there have been significant advances in the field of low volume, high throughput CLRP, techniques able to simultaneously monitor multiple polymerizations in an "online" manner have not yet been developed. Here, we report our discovery that 5,10,15,20-tetraphenyl-21H,23H-porphine zinc (ZnTPP) is a self-reporting photocatalyst that can mediate PET-RAFT polymerization as well as report on monomer conversion via changes in its fluorescence properties. This enables the use of a microplate reader to conduct high throughput "online" monitoring of PET-RAFT polymerizations performed directly in 384-well, low volume microtiter plates. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
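One plausible way to turn the plate-reader signal into conversion estimates is via a calibration curve; the sketch below is an assumed workflow with hypothetical numbers, not the authors' procedure:

    # Hedged sketch: map microplate fluorescence readings of the self-reporting
    # photocatalyst to estimated monomer conversion via a calibration curve
    # measured offline (e.g. by NMR) on reference samples.
    import numpy as np

    calib_ratio      = np.array([1.00, 1.15, 1.35, 1.60, 1.90])   # F/F0, hypothetical
    calib_conversion = np.array([0.00, 0.20, 0.45, 0.70, 0.95])

    def conversion_from_fluorescence(f_t, f_0):
        """Interpolate conversion from the F/F0 ratio of one well at time t."""
        return float(np.interp(f_t / f_0, calib_ratio, calib_conversion))

    # One row of a 384-well plate read at t = 60 min (hypothetical numbers):
    f0_row = np.array([1020.0, 998.0, 1011.0])
    ft_row = np.array([1350.0, 1490.0, 1620.0])
    print([round(conversion_from_fluorescence(ft, f0), 2)
           for ft, f0 in zip(ft_row, f0_row)])

Applying such a mapping to every well at every read time is what makes "online" parallel monitoring in low-volume microtiter plates possible.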
Identification and correction of systematic error in high-throughput sequence data
2011-01-01
Background: A feature common to all DNA sequencing technologies is the presence of base-call errors in the sequenced reads. The implications of such errors are application specific, ranging from minor informatics nuisances to major problems affecting biological inferences. Recently developed "next-gen" sequencing technologies have greatly reduced the cost of sequencing, but have been shown to be more error prone than previous technologies. Both position specific (depending on the location in the read) and sequence specific (depending on the sequence in the read) errors have been identified in Illumina and Life Technology sequencing platforms. We describe a new type of systematic error that manifests as statistically unlikely accumulations of errors at specific genome (or transcriptome) locations. Results: We characterize and describe systematic errors using overlapping paired reads from high-coverage data. We show that such errors occur in approximately 1 in 1000 base pairs, and that they are highly replicable across experiments. We identify motifs that are frequent at systematic error sites, and describe a classifier that distinguishes heterozygous sites from systematic error. Our classifier is designed to accommodate data from experiments in which the allele frequencies at heterozygous sites are not necessarily 0.5 (such as in the case of RNA-Seq), and can be used with single-end datasets. Conclusions: Systematic errors can easily be mistaken for heterozygous sites in individuals, or for SNPs in population analyses. Systematic errors are particularly problematic in low coverage experiments, or in estimates of allele-specific expression from RNA-Seq data. Our characterization of systematic error has allowed us to develop a program, called SysCall, for identifying and correcting such errors. We conclude that correction of systematic errors is important to consider in the design and interpretation of high-throughput sequencing experiments. PMID:22099972
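The flagging idea can be sketched as a simple binomial test against the sequencer's baseline error rate; this illustrates the statistical intuition only and is not the SysCall implementation:

    # Hedged sketch: given per-position counts of reads and of mismatches to the
    # reference, mark sites where errors accumulate far beyond the expected
    # base-calling error rate.
    from scipy.stats import binom

    def candidate_systematic_errors(coverage, mismatches, base_error_rate=0.01,
                                    p_cutoff=1e-6):
        """coverage/mismatches: dicts position -> counts. Returns flagged positions."""
        flagged = []
        for pos, n in coverage.items():
            k = mismatches.get(pos, 0)
            # P(X >= k) under a binomial null with the sequencer's error rate
            p_value = binom.sf(k - 1, n, base_error_rate)
            if p_value < p_cutoff:
                flagged.append((pos, k, n, p_value))
        return flagged

    # Hypothetical usage: 15 mismatches out of 60 reads is flagged, 1 out of 60 is not.
    cov  = {101: 60, 202: 60}
    mism = {101: 15, 202: 1}
    print(candidate_systematic_errors(cov, mism))

A real classifier would additionally use read orientation, base qualities and local sequence context, as the paper describes.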
High Throughput Genotoxicity Profiling of the US EPA ToxCast Chemical Library
A key aim of the ToxCast project is to investigate modern molecular and genetic high content and high throughput screening (HTS) assays, along with various computational tools to supplement and perhaps replace traditional assays for evaluating chemical toxicity. Genotoxicity is a...
Autonomous control of production networks using a pheromone approach
NASA Astrophysics Data System (ADS)
Armbruster, D.; de Beer, C.; Freitag, M.; Jagalski, T.; Ringhofer, C.
2006-04-01
The flow of parts through a production network is usually pre-planned by a central control system. Such central control fails in the presence of highly fluctuating demand and/or unforeseen disturbances. To manage such dynamic networks with low work-in-progress and short throughput times, an autonomous control approach is proposed. Autonomous control here means decentralized routing by the autonomous parts themselves. The parts' decisions are based on backward-propagated information about the throughput times of finished parts on the different routes, so routes with shorter throughput times attract more parts. This process can be compared to ants leaving pheromones along their way to communicate with following ants. The paper focuses on a mathematical description of such autonomously controlled production networks. A fluid model with limited service rates in a general network topology is derived and compared to a discrete-event simulation model. Whereas the discrete-event simulation of production networks is straightforward, the formulation of the addressed scenario in terms of a fluid model is challenging. Here it is shown how several problems in the fluid model formulation (e.g. discontinuities) can be handled mathematically. Finally, some simulation results for the pheromone-based control, with both the discrete-event simulation model and the fluid model, are presented for a time-dependent influx.
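A minimal sketch of the routing rule just described may make the mechanism concrete; this is an illustrative toy (all class, parameter and variable names are ours, not the authors'), in which exponential smoothing of reported throughput times plays the role of pheromone evaporation:

    # Hedged sketch of pheromone-like routing: each node keeps a moving average of
    # reported throughput times per route; parts prefer routes with shorter
    # averages, and smoothing lets old information "evaporate".
    import random

    class PheromoneRouter:
        def __init__(self, routes, evaporation=0.2, greediness=1.0):
            self.avg_tpt = {r: 1.0 for r in routes}   # initial guess, hypothetical
            self.evaporation = evaporation            # smoothing factor in (0, 1]
            self.greediness = greediness              # how strongly short times attract

        def choose_route(self):
            # attraction ~ 1 / (average throughput time)^greediness
            weights = [1.0 / (self.avg_tpt[r] ** self.greediness) for r in self.avg_tpt]
            return random.choices(list(self.avg_tpt), weights=weights, k=1)[0]

        def report(self, route, throughput_time):
            # backward-propagated feedback from a finished part ("pheromone deposit")
            a = self.evaporation
            self.avg_tpt[route] = (1 - a) * self.avg_tpt[route] + a * throughput_time

    # Hypothetical usage: route "B" is faster, so it gradually attracts more parts.
    router = PheromoneRouter(["A", "B"])
    for _ in range(200):
        r = router.choose_route()
        router.report(r, throughput_time=2.0 if r == "A" else 1.0)
    print(router.avg_tpt)

In this toy run the faster route ends up with the shorter moving-average throughput time and therefore attracts most of the parts, mirroring the positive feedback that the pheromone analogy describes.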
Haeili, Mehri; Moore, Casey; Davis, Christopher J C; Cochran, James B; Shah, Santosh; Shrestha, Tej B; Zhang, Yaofang; Bossmann, Stefan H; Benjamin, William H; Kutsch, Olaf; Wolschendorf, Frank
2014-07-01
Macrophages take advantage of the antibacterial properties of copper ions in the killing of bacterial intruders. However, despite the importance of copper for innate immune functions, coordinated efforts to exploit copper ions for therapeutic interventions against bacterial infections are not yet in place. Here we report a novel high-throughput screening platform specifically developed for the discovery and characterization of compounds with copper-dependent antibacterial properties toward methicillin-resistant Staphylococcus aureus (MRSA). We detail how one of the identified compounds, glyoxal-bis(N4-methylthiosemicarbazone) (GTSM), exerts its potent, strictly copper-dependent antibacterial activity on MRSA. Our data indicate that the activity of the GTSM-copper complex goes beyond the general antibacterial effects of accumulated copper ions and suggest that, in contrast to prevailing opinion, copper complexes can indeed exhibit species- and target-specific activities. Based on experimental evidence, we propose that copper ions impose structural changes upon binding to the otherwise inactive GTSM ligand and transfer antibacterial properties to the chelate. In turn, GTSM determines target specificity and utilizes a redox-sensitive release mechanism through which copper ions are deployed at or in close proximity to a putative target. According to our proof-of-concept screen, copper activation is not a rare event and even extends to already established drugs. Thus, copper-activated compounds could define a novel class of anti-MRSA agents that amplify copper-dependent innate immune functions of the host. To this end, we provide a blueprint for a high-throughput drug screening campaign which considers the antibacterial properties of copper ions at the host-pathogen interface. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Medvetz, Doug; Sun, Yang; Li, Chenggang; Khabibullin, Damir; Balan, Murugabaskar; Parkhitko, Andrey; Priolo, Carmen; Asara, John M; Pal, Soumitro; Yu, Jane; Henske, Elizabeth P
2015-01-01
Tuberous sclerosis complex (TSC) is an autosomal dominant syndrome associated with tumors of the brain, heart, kidney, and lung. The TSC protein complex inhibits the mammalian or mechanistic target of rapamycin complex 1 (mTORC1). Inhibitors of mTORC1, including rapamycin, induce a cytostatic response in TSC tumors, resulting in temporary disease stabilization and prompt regrowth when treatment is stopped. The lack of TSC-specific cytotoxic therapies represents an important unmet clinical need. Using a high-throughput chemical screen in TSC2-deficient, patient-derived cells, we identified a series of molecules antagonized by rapamycin and therefore selective for cells with mTORC1 hyperactivity. In particular, the cell-permeable alkaloid chelerythrine induced reactive oxygen species (ROS) and depleted glutathione (GSH) selectively in TSC2-null cells based on metabolic profiling. N-acetylcysteine or GSH cotreatment protected TSC2-null cells from chelerythrine's effects, indicating that chelerythrine-induced cell death is ROS dependent. Induction of heme-oxygenase-1 (HMOX1/HO-1) with hemin also blocked chelerythrine-induced cell death. In vivo, chelerythrine inhibited the growth of TSC2-null xenograft tumors with no evidence of systemic toxicity with daily treatment over an extended period of time. This study reports the results of a bioactive compound screen and the identification of a potential lead candidate that acts via a novel oxidative stress-dependent mechanism to selectively induce necroptosis in TSC2-deficient tumors. This study demonstrates that TSC2-deficient tumor cells are hypersensitive to oxidative stress-dependent cell death, and provide critical proof of concept that TSC2-deficient cells can be therapeutically targeted without the use of a rapalog to induce a cell death response. ©2014 American Association for Cancer Research.
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering
Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan
2016-01-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875
Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella
2012-01-01
Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue® mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, namely sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke, which are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, a standard mutation detection assay followed by direct DNA sequencing on a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive in detecting the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents. PMID:22735701