Hankin, Abigail; Freiman, Heather; Copeland, Brittney; Travis, Natasha; Shah, Bijal
2016-01-01
This study compared two approaches for implementation of non-targeted HIV screening in the emergency department (ED): (1) designated HIV counselors screening in parallel with ED care and (2) nurse-based screening integrated into patient triage. A retrospective analysis was performed to compare parallel and integrated screening models using data from the first 12 months of each program. Data for the parallel screening model were extracted from information collected by HIV test counselors and the electronic medical record (EMR). Integrated screening model data were extracted from the EMR and supplemented by data collected by HIV social workers during patient interaction. For both programs, data included demographics, HIV test offer, test acceptance or declination, and test result. A Z-test between two proportions was performed to compare screening frequencies and results. During the first 12 months of parallel screening, approximately 120,000 visits were made to the ED, with 3,816 (3%) HIV tests administered and 65 (2%) new diagnoses of HIV infection. During the first 12 months of integrated screening, 111,738 patients were triaged in the ED, with 16,329 (15%) patients tested and 190 (1%) new diagnoses. Integrated screening resulted in an increased frequency of HIV screening compared with parallel screening (0.15 tests per ED patient visit vs. 0.03 tests per ED patient visit, p<0.001) and an increase in the absolute number of new diagnoses (190 vs. 65), representing a slight decrease in the proportion of new diagnoses (1% vs. 2%, p=0.007). Non-targeted, integrated HIV screening, with test offer and order by ED nurses during patient triage, is feasible and resulted in an increased frequency of HIV screening and a threefold increase in the absolute number of newly identified HIV-positive patients.
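The statistical comparison here is a standard two-proportion z-test. As a hedged illustration (not the authors' code; only the counts come from the abstract), the screening-frequency comparison could be reproduced with statsmodels:

```python
# Minimal sketch of the two-proportion z-test used to compare screening
# frequencies; the counts come from the abstract, the code is illustrative.
from statsmodels.stats.proportion import proportions_ztest

tested = [16329, 3816]     # integrated vs. parallel: HIV tests administered
visits = [111738, 120000]  # ED patient visits in each 12-month period

z, p = proportions_ztest(count=tested, nobs=visits)
print(f"z = {z:.1f}, p = {p:.3g}")  # p << 0.001, matching the reported result
```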
Cheng, Han; Koning, Katie; O'Hearn, Aileen; Wang, Minxiu; Rumschlag-Booms, Emily; Varhegyi, Elizabeth; Rong, Lijun
2015-11-24
Genome-wide RNAi screening has been widely used to identify host proteins involved in the replication and infection of different viruses, and numerous host factors have been implicated in the replication cycles of these viruses, demonstrating the power of this approach. However, discrepancies in target identification for the same viruses by different groups suggest that high-throughput RNAi screening strategies need to be carefully designed, developed and optimized prior to large-scale screening. Two genome-wide RNAi screens were performed in parallel against the entry of pseudotyped Marburg virus and avian influenza virus H5N1, utilizing an HIV-1-based surrogate system, to identify host factors that are important for virus entry. A comparative analysis approach was employed in data analysis, which alleviated systematic positional effects and reduced the number of false-positive virus-specific hits. The parallel nature of the strategy allows us to easily identify the host factors for a specific virus with a greatly reduced number of false positives in the initial screen, which is one of the major problems with high-throughput screening. The power of this strategy is illustrated by a genome-wide RNAi screen for identifying the host factors important for Marburg virus and/or avian influenza virus H5N1 as described in this study. This strategy is particularly useful for highly pathogenic viruses, since pseudotyping allows us to perform high-throughput screens under biosafety level 2 (BSL-2) containment instead of the BSL-3 or BSL-4 containment required for the infectious viruses, alleviating safety concerns. The screening strategy, together with the unique comparative analysis approach, makes the data more suitable for hit selection and enables us to identify virus-specific hits with a much lower false-positive rate.
Tile-based Level of Detail for the Parallel Age
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niski, K; Cohen, J D
Today's PCs incorporate multiple CPUs and GPUs and are easily arranged in clusters for high-performance, interactive graphics. We present an approach to parallelizing rendering with level of detail based on hierarchical, screen-space tiles. Adapt tiles, render tiles, and machine tiles are associated with CPUs, GPUs, and PCs, respectively, to efficiently parallelize the workload with good resource utilization. Adaptive tile sizes provide load balancing, while our level-of-detail system allows total and independent management of the load on CPUs and GPUs. We demonstrate our approach on parallel configurations consisting of both single PCs and a cluster of PCs.
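A minimal sketch of the tile idea, under assumed names and sizes (the paper's actual system uses adapt, render and machine tiles with adaptive sizing; this toy version just farms fixed-size screen tiles out to CPU workers):

```python
# Illustrative sketch only: split the screen into fixed-size tiles and
# process them in parallel, standing in for the paper's adapt/render tiles.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT, TILE = 1920, 1080, 256  # assumed screen and tile sizes

def tiles(width, height, size):
    """Enumerate screen-space tiles as (x0, y0, x1, y1) rectangles."""
    for y in range(0, height, size):
        for x in range(0, width, size):
            yield (x, y, min(x + size, width), min(y + size, height))

def render_tile(rect):
    # Placeholder for per-tile LOD adaptation and rendering work.
    x0, y0, x1, y1 = rect
    return rect, (x1 - x0) * (y1 - y0)  # pixel count as a fake workload

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # one worker per CPU core by default
        for rect, cost in pool.map(render_tile, tiles(WIDTH, HEIGHT, TILE)):
            pass  # composite finished tiles into the frame here
```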
Parallel shRNA and CRISPR-Cas9 screens enable antiviral drug target identification.
Deans, Richard M; Morgens, David W; Ökesli, Ayşe; Pillay, Sirika; Horlbeck, Max A; Kampmann, Martin; Gilbert, Luke A; Li, Amy; Mateo, Roberto; Smith, Mark; Glenn, Jeffrey S; Carette, Jan E; Khosla, Chaitan; Bassik, Michael C
2016-05-01
Broad-spectrum antiviral drugs targeting host processes could potentially treat a wide range of viruses while reducing the likelihood of emergent resistance. Despite great promise as therapeutics, such drugs remain largely elusive. Here we used parallel genome-wide high-coverage short hairpin RNA (shRNA) and clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 screens to identify the cellular target and mechanism of action of GSK983, a potent broad-spectrum antiviral with unexplained cytotoxicity. We found that GSK983 blocked cell proliferation and dengue virus replication by inhibiting the pyrimidine biosynthesis enzyme dihydroorotate dehydrogenase (DHODH). Guided by mechanistic insights from both genomic screens, we found that exogenous deoxycytidine markedly reduced GSK983 cytotoxicity but not antiviral activity, providing an attractive new approach to improve the therapeutic window of DHODH inhibitors against RNA viruses. Our results highlight the distinct advantages and limitations of each screening method for identifying drug targets, and demonstrate the utility of parallel knockdown and knockout screens for comprehensive probing of drug activity.
Hierarchical virtual screening approaches in small molecule drug discovery.
Kumar, Ashutosh; Zhang, Kam Y J
2015-01-01
Virtual screening has played a significant role in the discovery of small molecule inhibitors of therapeutic targets in the last two decades. Various ligand- and structure-based virtual screening approaches are employed to identify small molecule ligands for proteins of interest. These approaches are often combined in either a hierarchical or a parallel manner to take advantage of the strengths and avoid the limitations associated with individual methods. The hierarchical combination of ligand- and structure-based virtual screening approaches has achieved noteworthy success in numerous drug discovery campaigns. In hierarchical virtual screening, several filters using ligand- and structure-based approaches are applied sequentially to reduce a large screening library to a number small enough for experimental testing. In this review, we focus on different hierarchical virtual screening strategies and their application in the discovery of small molecule modulators of important drug targets. Several virtual screening studies are discussed to demonstrate the successful application of hierarchical virtual screening in small molecule drug discovery. Copyright © 2014 Elsevier Inc. All rights reserved.
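The hierarchical strategy amounts to running cheap filters first and expensive ones last over a shrinking candidate set. A schematic sketch, with all thresholds and property keys invented for illustration:

```python
# Schematic sketch of hierarchical virtual screening: cheap ligand-based
# filters run first, the expensive structure-based filter runs last.
# Molecules are dicts with precomputed properties; all keys and cutoffs
# are illustrative, not from the review.

def hierarchical_screen(library):
    # Stage 1: fast physicochemical property filter (ligand-based).
    stage1 = [m for m in library if m["mw"] <= 500 and m["clogp"] <= 5]
    # Stage 2: similarity to known actives (ligand-based).
    stage2 = [m for m in stage1 if m["similarity"] >= 0.6]
    # Stage 3: expensive structure-based docking-score filter.
    hits = [m for m in stage2 if m["dock_score"] <= -8.0]
    return hits  # small enough for experimental testing

library = [
    {"mw": 320, "clogp": 2.1, "similarity": 0.72, "dock_score": -9.3},
    {"mw": 610, "clogp": 6.0, "similarity": 0.80, "dock_score": -10.1},
]
print(hierarchical_screen(library))  # only the first compound survives
```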
Personalized drug discovery: HCA approach optimized for rare diseases at Tel Aviv University.
Solmesky, Leonardo J; Weil, Miguel
2014-03-01
The Cell Screening Facility for Personalized Medicine (CSFPM) at Tel Aviv University in Israel is devoted to screening small-molecule libraries to find new drugs for rare diseases using human cell-based models. The main strategy of the facility is based on smartly reducing the size of the compound collection by clustering similar structures while keeping high pharmacophore diversity. This strategy allows parallel screening of cells derived from several patients in a personalized screening approach. The tested compounds are repositioned drugs derived from collections of phase III and FDA-approved small molecules. In addition, the facility carries out screenings using other chemical libraries, as well as toxicological characterization of nanomaterials.
Large-scale virtual screening on public cloud resources with Apache Spark.
Capuccini, Marco; Ahmed, Laeeq; Schaal, Wesley; Laure, Erwin; Spjuth, Ola
2017-01-01
Structure-based virtual screening is an in-silico method to screen a target receptor against a virtual molecular library. Applying docking-based screening to large molecular libraries can be computationally expensive; however, it constitutes a trivially parallelizable task. Most of the available parallel implementations are based on the message passing interface (MPI), relying on low-failure-rate hardware and fast network connections. Google's MapReduce revolutionized large-scale analysis, enabling the processing of massive datasets on commodity hardware and cloud resources, providing transparent scalability and fault tolerance at the software level. Open-source implementations of MapReduce include Apache Hadoop and the more recent Apache Spark. We developed a method to run existing docking-based screening software on distributed cloud resources, utilizing the MapReduce approach. We benchmarked our method, which is implemented in Apache Spark, by docking a publicly available target receptor against approximately 2.2 M compounds. The performance experiments show a good parallel efficiency (87%) when running in a public cloud environment. Our method enables parallel structure-based virtual screening on public cloud resources or commodity computer clusters. The degree of scalability that we achieve allows for trying out our method on relatively small libraries first and then scaling to larger libraries. Our implementation is named Spark-VS and it is freely available as open source from GitHub (https://github.com/mcapuccini/spark-vs).
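Because each compound docks independently, the screen maps naturally onto Spark's programming model. A minimal PySpark sketch of the pattern follows; the `dock` stand-in and file name are hypothetical, and Spark-VS itself (see the GitHub link above) is the real implementation:

```python
# Minimal sketch of MapReduce-style docking on Spark; the `dock` function
# and file path are placeholders, not the Spark-VS API.
from pyspark.sql import SparkSession

def dock(smiles):
    # Placeholder: call out to a docking program and parse its score.
    score = float(len(smiles))  # stand-in so the sketch runs end to end
    return (smiles, score)

spark = SparkSession.builder.appName("docking-screen").getOrCreate()
library = spark.sparkContext.textFile("compounds.smi")  # one SMILES per line

top_hits = (library.map(dock)                     # embarrassingly parallel map
                   .sortBy(lambda pair: pair[1])  # reduce: rank by score
                   .take(1000))                   # keep the best candidates
spark.stop()
```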
Activity-Based Screening of Metagenomic Libraries for Hydrogenase Enzymes.
Adam, Nicole; Perner, Mirjam
2017-01-01
Here we outline how to identify hydrogenase enzymes from metagenomic libraries through an activity-based screening approach. A metagenomic fosmid library is constructed in E. coli and the fosmids are transferred into a hydrogenase deletion mutant of Shewanella oneidensis (ΔhyaB) via triparental mating. If a fosmid exhibits hydrogen uptake activity, S. oneidensis' phenotype is restored and hydrogenase activity is indicated by a color change of the medium from yellow to colorless. This new method enables screening of 48 metagenomic fosmid clones in parallel.
An Old Story in the Parallel Synthesis World: An Approach to Hydantoin Libraries.
Bogolubsky, Andrey V; Moroz, Yurii S; Savych, Olena; Pipko, Sergey; Konovets, Angelika; Platonov, Maxim O; Vasylchenko, Oleksandr V; Hurmach, Vasyl V; Grygorenko, Oleksandr O
2018-01-08
An approach to the parallel synthesis of hydantoin libraries by reaction of in situ generated 2,2,2-trifluoroethylcarbamates and α-amino esters was developed. To demonstrate the utility of the method, a library of 1158 hydantoins designed according to lead-likeness criteria (MW 200-350, cLogP 1-3) was prepared. The success rate of the method was analyzed as a function of the physicochemical parameters of the products, and it was found that the method can be considered a tool for lead-oriented synthesis. A hydantoin-bearing submicromolar primary hit acting as an Aurora kinase A inhibitor was discovered through a combination of rational design, parallel synthesis using the procedures developed, and in silico and in vitro screening.
Ebalunode, Jerry O; Zheng, Weifan; Tropsha, Alexander
2011-01-01
Optimization of chemical library composition affords more efficient identification of hits from biological screening experiments. The optimization could be achieved through rational selection of reagents used in combinatorial library synthesis. However, with the rapid advent of parallel synthesis methods and the availability of millions of compounds synthesized by many vendors, it may be more efficient to design targeted libraries by means of virtual screening of commercial compound collections. This chapter reviews the application of advanced cheminformatics approaches such as quantitative structure-activity relationships (QSAR) and pharmacophore modeling (both ligand- and structure-based) for virtual screening. Both approaches rely on empirical SAR data to build models; thus, the emphasis is placed on achieving models of the highest rigor and external predictive power. We present several examples of successful applications of both approaches for virtual screening to illustrate their utility. We suggest that the expert use of both QSAR and pharmacophore models, either independently or in combination, enables users to achieve targeted libraries enriched with experimentally confirmed hit compounds.
Zhang, Litao; Cvijic, Mary Ellen; Lippy, Jonathan; Myslik, James; Brenner, Stephen L; Binnie, Alastair; Houston, John G
2012-07-01
In this paper, we review the key solutions that enabled the evolution of the lead optimization screening support process at Bristol-Myers Squibb (BMS) between 2004 and 2009. During this time, technology infrastructure investment and the integration of scientific expertise laid the foundations to build and tailor lead optimization screening support models across all therapeutic groups at BMS. Together, harnessing advanced screening technology platforms and expanding the panel screening strategy led to a paradigm shift at BMS in supporting lead optimization screening capability. Parallel SAR and structure-liability relationship (SLR) screening approaches were introduced, for the first time and broadly, to empower more rapid and better-informed decisions about chemical synthesis strategy and to broaden options for identifying high-quality drug candidates during lead optimization. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Nelson, William; Davis, Christopher C.
2014-10-01
Plenoptic functions are functions that preserve all the necessary light-field information of optical events. Theoretical work has demonstrated that geometry-based plenoptic functions can serve equally well in the traditional wave propagation equation known as the "scalar stochastic Helmholtz equation". However, in addressing problems of 3D turbulence simulation, the dominant methods using phase screen models have limitations both in explaining the choice of parameters (on the transverse plane) in real-world measurements and in finding proper correlations between neighboring phase screens (the Markov assumption breaks down). Though possible corrections to phase screen models are still promising, the equivalent geometric approach based on plenoptic functions begins to show some advantages. In fact, in these geometric approaches, a continuous wave problem is reduced to discrete trajectories of rays. This allows for convenience in parallel computing and guarantees conservation of energy. Besides the pairwise independence of simulated rays, the assigned refractive-index grids can be directly tested by temperature measurements with tiny thermoprobes combined with other parameters such as humidity level and wind speed. Furthermore, without loss of generality, one can break the causal chain in phase screen models by defining regional refractive centers to allow rays that are less affected to propagate through directly. As a result, our work shows that the 3D geometric approach serves as an efficient and accurate method for assessing relevant turbulence problems with inputs of several environmental measurements and reasonable guesses (such as Cn^2 levels). This approach will facilitate analysis and possible corrections in lateral wave propagation problems, such as image de-blurring, prediction of laser propagation over long ranges, and improvement of free-space optical communication systems. In this paper, the plenoptic function model and the relevant parallel computing algorithm are presented, and their primary results and applications are demonstrated.
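Since simulated rays are pairwise independent, the geometric approach is embarrassingly parallel. The following toy sketch, only loosely inspired by the description above and not the authors' algorithm, Euler-integrates the ray equation d/ds(n dr/ds) = grad n through a random refractive-index grid and distributes rays across processes:

```python
# Toy sketch: trace independent rays through a 2D refractive-index field
# n(x, y) by crude Euler integration of the ray equation. Everything here
# (field, step size, ray count) is illustrative, not the paper's method.
import numpy as np
from multiprocessing import Pool

def trace_ray(args):
    n, start, direction, ds, steps = args
    pos, d = np.array(start, float), np.array(direction, float)
    for _ in range(steps):
        i, j = int(pos[0]), int(pos[1])
        # Finite-difference gradient of the index field at the ray position.
        gx = n[i + 1, j] - n[i - 1, j]
        gy = n[i, j + 1] - n[i, j - 1]
        d += ds * np.array([gx, gy]) / (2.0 * n[i, j])  # bend toward higher n
        d /= np.linalg.norm(d)
        pos += ds * d
    return pos

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1.0 + 1e-4 * rng.standard_normal((512, 512))  # stand-in index field
    rays = [(n, (256.0, 1.0), (0.0, 1.0), 1.0, 400) for _ in range(64)]
    with Pool() as pool:
        endpoints = pool.map(trace_ray, rays)  # rays are pairwise independent
```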
2014-01-01
Background: The risks associated with gestational diabetes mellitus (GDM) are well recognized, and there is increasing evidence to support treatment of the condition. However, clear guidance on the ideal approach to screening for GDM is lacking. Professional groups continue to debate whether selective screening (based on risk factors) or universal screening is the most appropriate approach. Additionally, there is ongoing debate about what levels of glucose abnormalities during pregnancy respond best to treatment and which maternal and neonatal outcomes benefit most from treatment. Furthermore, the implications of possible screening options on health care costs are not well established. In response to this uncertainty there have been repeated calls for well-designed, randomised trials to determine the efficacy of screening, diagnosis, and management plans for GDM. We describe a randomised controlled trial to investigate screening uptake rates and the clinical and cost effectiveness of screening in primary versus secondary care settings. Methods/Design: This will be an unblinded, two-group, parallel randomised controlled trial (RCT). The target population includes 784 women presenting for their first antenatal visit at 12 to 18 weeks gestation at two hospitals in the west of Ireland: Galway University Hospital and Mayo General Hospital. Participants will be offered universal screening for GDM at 24 to 28 weeks gestation in either primary care (n = 392) or secondary care (n = 392) locations. The primary outcome variable is the uptake rate of screening. Secondary outcomes include indicators of clinical effectiveness of screening at each screening site (primary and secondary), including gestational week at time of screening, time to access antenatal diabetes services for women diagnosed with GDM, and pregnancy and neonatal outcomes for women with GDM. In addition, parallel economic and qualitative evaluations will be conducted. The trial will cover the period from the woman's first hospital antenatal visit at 12 to 18 weeks gestation until the completion of the pregnancy. Trial registration: Current Controlled Trials ISRCTN02232125. PMID:24438478
Mladic, Marija; Zietek, Barbara M; Iyer, Janaki Krishnamoorthy; Hermarij, Philip; Niessen, Wilfried M A; Somsen, Govert W; Kini, R Manjunatha; Kool, Jeroen
2016-02-01
Snake venoms comprise complex mixtures of peptides and proteins causing modulation of diverse physiological functions upon envenomation of the prey organism. The components of snake venoms are studied as research tools and as potential drug candidates. However, bioactivity determination with subsequent identification and purification of the bioactive compounds is a demanding and often laborious effort involving different analytical and pharmacological techniques. This study describes the development and optimization of an integrated analytical approach for activity profiling and identification of venom constituents targeting the cardiovascular system, the thrombin and factor Xa enzymes in particular. The approach developed encompasses reversed-phase liquid chromatography (RPLC) analysis of a crude snake venom with parallel mass spectrometry (MS) and bioactivity analysis. The analytical and pharmacological parts of this approach are linked by at-line nanofractionation. This implies that the bioactivity is assessed after high-resolution nanofractionation (6 s/well) onto high-density 384-well microtiter plates and subsequent freeze-drying of the plates. The nanofractionation and bioassay conditions were optimized to maintain LC resolution while achieving good bioassay sensitivity. The developed integrated analytical approach was successfully applied to the fast screening of snake venoms for compounds affecting thrombin and factor Xa activity. Parallel accurate MS measurements allowed the observed bioactivity to be correlated to peptide/protein masses. This resulted in the identification of a few interesting peptides with activity towards the drug target factor Xa from a screening campaign involving venoms of 39 snake species. Besides this, many positive protease activity peaks were observed in most venoms analysed. These protease fingerprint chromatograms were found to be similar for evolutionarily closely related species and as such might serve as generic snake protease bioactivity fingerprints in biological studies on venoms. Copyright © 2015 Elsevier Ltd. All rights reserved.
Discovery of Cationic Polymers for Non-viral Gene Delivery using Combinatorial Approaches
Barua, Sutapa; Ramos, James; Potta, Thrimoorthy; Taylor, David; Huang, Huang-Chiao; Montanez, Gabriela; Rege, Kaushal
2015-01-01
Gene therapy is an attractive treatment option for diseases of genetic origin, including several cancers and cardiovascular diseases. While viruses are effective vectors for delivering exogenous genes to cells, concerns related to insertional mutagenesis, immunogenicity, lack of tropism, decay and high production costs necessitate the discovery of non-viral methods. Significant efforts have been focused on cationic polymers as non-viral alternatives for gene delivery. Recent studies have employed combinatorial syntheses and parallel screening methods for enhancing the efficacy of gene delivery, biocompatibility of the delivery vehicle, and overcoming cellular level barriers as they relate to polymer-mediated transgene uptake, transport, transcription, and expression. This review summarizes and discusses recent advances in combinatorial syntheses and parallel screening of cationic polymer libraries for the discovery of efficient and safe gene delivery systems. PMID:21843141
Zhang, Hui; Luo, Li-Ping; Song, Hui-Peng; Hao, Hai-Ping; Zhou, Ping; Qi, Lian-Wen; Li, Ping; Chen, Jun
2014-01-24
Generation of a high-purity fraction library for efficiently screening active compounds from natural products is challenging because of their chemical diversity and complex matrices. In this work, a strategy combining high-resolution peak fractionation (HRPF) with a cell-based assay was proposed for targeted screening of bioactive constituents from natural products. In this approach, peak fractionation was conducted under chromatographic conditions optimized for high-resolution separation of the natural product extract. The HRPF was performed automatically, according to peaks predefined by their retention times in a reference chromatographic profile. The corresponding HRPF database was collected with a parallel mass spectrometer to ensure purity and characterize the structures of compounds in the various fractions. Using this approach, a set of 75 peak fractions on the microgram scale was generated from 4 mg of the extract of Salvia miltiorrhiza. After screening by an ARE-luciferase reporter gene assay, 20 diterpene quinones were selected and identified, and 16 of these compounds were found to possess previously unreported Nrf2 activation activity. Compared with conventional fixed-time-interval fractionation, the HRPF approach can significantly improve the efficiency of bioactive compound discovery and facilitate the discovery of minor active components. Copyright © 2013 Elsevier B.V. All rights reserved.
A strategy for clone selection under different production conditions.
Legmann, Rachel; Benoit, Brian; Fedechko, Ronald W; Deppeler, Cynthia L; Srinivasan, Sriram; Robins, Russell H; McCormick, Ellen L; Ferrick, David A; Rodgers, Seth T; Russo, A Peter
2011-01-01
Top-performing clones have failed at the manufacturing scale, while the true best performer may have been rejected early in the screening process. Therefore, the ability to screen multiple clones in complex fed-batch processes using multiple process variations can be used to assess robustness and to identify critical factors. This dynamic clone-ranking strategy requires the execution of many more parallel experiments than traditional approaches. It is therefore best suited to micro-bioreactor models, which can perform hundreds of experiments quickly and efficiently. In this study, a fully monitored and controlled small-scale platform was used to screen eight CHO clones producing a recombinant monoclonal antibody across several process variations, including different feeding strategies, temperature shifts and pH control profiles. The first screen, in which 240 micro-bioreactors were run for two weeks, assessed the scale-down model as a high-throughput tool for clone evaluation. The richness of the outcome data made it possible to clearly identify the best and worst clones, as well as the best and worst processes, in terms of maximum monoclonal antibody titer. The follow-up comparison study utilized 180 micro-bioreactors in a full factorial design, and a subset of 12 clone/process combinations was selected to be run in parallel in duplicate shake flasks. Good correlation was observed between the micro-bioreactor predictions and those made in shake flasks, with a Pearson correlation value of 0.94. The results also demonstrate that this micro-scale system can perform clone screening and process optimization simultaneously, yielding significant titer improvements. This dynamic ranking strategy can support better choices of production clones. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
Rapid determination of enantiomeric excess: a focus on optical approaches.
Leung, Diana; Kang, Sung Ok; Anslyn, Eric V
2012-01-07
High-throughput screening (HTS) methods are becoming increasingly essential in discovering chiral catalysts or auxiliaries for asymmetric transformations due to the advent of parallel synthesis and combinatorial chemistry. Both parallel synthesis and combinatorial chemistry can lead to the exploration of a range of structural candidates and reaction conditions as a means to obtain the highest enantiomeric excess (ee) of a desired transformation. One current bottleneck in these approaches to asymmetric reactions is the determination of ee, which has led researchers to explore a wide range of HTS techniques. To be truly high-throughput, it has been proposed that a technique that can analyse a thousand or more samples per day is needed. Many of the current approaches to this goal are based on optical methods because they allow for a rapid determination of ee due to quick data collection and their parallel analysis capabilities. In this critical review these techniques are reviewed with a discussion of their respective advantages and drawbacks, and with a contrast to chromatographic methods (180 references). This journal is © The Royal Society of Chemistry 2012
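For reference, the quantity being assayed is ee = ([R] - [S]) / ([R] + [S]). A trivial helper makes the arithmetic concrete (illustrative only; real optical methods first calibrate the measured signal to enantiomer concentrations):

```python
def enantiomeric_excess(conc_r, conc_s):
    """ee = ([R] - [S]) / ([R] + [S]), expressed as a percentage."""
    return 100.0 * (conc_r - conc_s) / (conc_r + conc_s)

# Example: a 9:1 mixture of R and S corresponds to 80% ee.
print(enantiomeric_excess(9.0, 1.0))  # 80.0
```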
2013-01-01
Background: Despite progress in the development of combined antiretroviral therapies (cART), HIV infection remains a significant challenge for human health. Current problems of cART include multi-drug-resistant virus variants, long-term toxicity and enormous treatment costs. Therefore, the identification of novel effective drugs is urgently needed. Methods: We developed a straightforward screening approach for simultaneously evaluating the sensitivity of multiple HIV gag-pol mutants to antiviral drugs in one assay. Our technique is based on multi-colour lentiviral self-inactivating (SIN) LeGO vector technology. Results: We demonstrated the successful use of this approach for screening compounds against up to four HIV gag-pol variants (wild-type and three mutants) simultaneously. Importantly, the technique was adapted to Biosafety Level 1 conditions by utilising ecotropic pseudotypes. This allowed upscaling to a large-scale screening protocol exploited by pharmaceutical companies in a successful proof-of-concept experiment. Conclusions: The technology developed here facilitates fast screening for anti-HIV activity of individual agents from large compound libraries. Although drugs targeting gag-pol variants were used here, our approach permits screening compounds that target several different, key cellular and viral functions of the HIV life-cycle. The modular principle of the method also allows the easy exchange of various mutations in HIV sequences. In conclusion, the methodology presented here provides a valuable new approach for the identification of novel anti-HIV drugs. PMID:23286882
Dalecki, Alex G; Wolschendorf, Frank
2016-07-01
In the face of totally resistant bacteria, traditional drug discovery efforts have proven to be of limited use in replenishing our depleted arsenal of therapeutic antibiotics. Recently, the natural anti-bacterial properties of metal ions in synergy with metal-coordinating ligands have shown potential for generating new candidate molecules with potential therapeutic downstream applications. We recently developed a novel combinatorial screening approach to identify compounds with copper-dependent anti-bacterial properties. Through a parallel screening technique, the assay distinguishes between copper-dependent and copper-independent activities against Mycobacterium tuberculosis, with hits defined as compounds with copper-dependent activities. These activities must then be linked back to a compound master list to process and analyze the data and to identify the hit molecules, a labor-intensive and mistake-prone analysis. Here, we describe a software program built to automate this analysis in order to streamline our workflow significantly. We conducted a small, 1440-compound screen against M. tuberculosis and used it as an example framework to build and optimize the software. Though specifically adapted to our own needs, it can be readily expanded for any small- to medium-throughput screening effort, parallel or conventional. Further, by virtue of the underlying Linux server, it can be easily adapted for chemoinformatic analysis of screens through packages such as OpenBabel. Overall, this setup represents an easy-to-use solution for streamlining the processing and analysis of biological screening data, as well as offering a scaffold for ready functionality expansion. Copyright © 2016 Elsevier B.V. All rights reserved.
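The automated analysis essentially joins the two parallel plate reads back to the compound master list and keeps compounds active only in the presence of copper. A pandas sketch under assumed file and column names (the published tool is a Linux server application, not this script):

```python
# Sketch of the hit-calling logic: a compound is a hit if it inhibits growth
# only when copper is present. File names, columns and threshold are assumed.
import pandas as pd

master = pd.read_csv("master_list.csv")        # plate, well, compound_id
plus_cu = pd.read_csv("growth_plus_cu.csv")    # plate, well, growth
minus_cu = pd.read_csv("growth_minus_cu.csv")  # plate, well, growth

df = (master.merge(plus_cu, on=["plate", "well"])
            .merge(minus_cu, on=["plate", "well"], suffixes=("_cu", "_nocu")))

THRESHOLD = 0.3  # assumed: normalized growth below this counts as inhibition
hits = df[(df.growth_cu < THRESHOLD) & (df.growth_nocu >= THRESHOLD)]
print(hits.compound_id.tolist())  # copper-dependent hits by compound ID
```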
Microfluidics for cell-based high throughput screening platforms - A review.
Du, Guansheng; Fang, Qun; den Toonder, Jaap M J
2016-01-15
In recent decades, the basic microfluidic techniques for the study of cells, such as cell culture, cell separation, and cell lysis, have been well developed. Building on these cell-handling techniques, microfluidics has been widely applied in the fields of PCR (polymerase chain reaction), immunoassays, organs-on-chips, stem cell research, and the analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows rapid analysis of thousands of chemical, biochemical, genetic or pharmacological tests in parallel. In this review, we summarize the application of microfluidics in cell-based high-throughput screening. The screening methods discussed here include approaches using the perfusion flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidic-based high-throughput screening platforms for drug discovery. Copyright © 2015 Elsevier B.V. All rights reserved.
Development and application of a DNA microarray-based yeast two-hybrid system
Suter, Bernhard; Fontaine, Jean-Fred; Yildirimman, Reha; Raskó, Tamás; Schaefer, Martin H.; Rasche, Axel; Porras, Pablo; Vázquez-Álvarez, Blanca M.; Russ, Jenny; Rau, Kirstin; Foulle, Raphaele; Zenkner, Martina; Saar, Kathrin; Herwig, Ralf; Andrade-Navarro, Miguel A.; Wanker, Erich E.
2013-01-01
The yeast two-hybrid (Y2H) system is the most widely applied methodology for systematic protein–protein interaction (PPI) screening and the generation of comprehensive interaction networks. We developed a novel Y2H interaction screening procedure using DNA microarrays for high-throughput quantitative PPI detection. Applying a global pooling and selection scheme to a large collection of human open reading frames, proof-of-principle Y2H interaction screens were performed for the human neurodegenerative disease proteins huntingtin and ataxin-1. Using systematic controls for unspecific Y2H results and quantitative benchmarking, we identified and scored a large number of known and novel partner proteins for both huntingtin and ataxin-1. Moreover, we show that this parallelized screening procedure and the global inspection of Y2H interaction data are uniquely suited to define specific PPI patterns and their alteration by disease-causing mutations in huntingtin and ataxin-1. This approach takes advantage of the specificity and flexibility of DNA microarrays and of the existence of well-established statistical methods for the analysis of DNA microarray data, and allows a quantitative approach toward interaction screens in humans and in model organisms. PMID:23275563
GPURFSCREEN: a GPU based virtual screening tool using random forest classifier.
Jayaraj, P B; Ajay, Mathias K; Nufail, M; Gopakumar, G; Jaleel, U C A
2016-01-01
In-silico methods are an integral part of the modern drug discovery paradigm. Virtual screening, an in-silico method, is used to refine data models and reduce the chemical space on which wet-lab experiments need to be performed. Virtual screening of a ligand data model requires large-scale computations, making it a highly time-consuming task. This process can be sped up by implementing parallelized algorithms on a Graphics Processing Unit (GPU). Random forest is a robust classification algorithm that can be employed in virtual screening. A ligand-based virtual screening tool (GPURFSCREEN) that uses random forests on GPU systems is proposed and evaluated in this paper. This tool produces optimized results at a lower execution time for large bioassay datasets. The quality of the results produced by our tool on the GPU is the same as that produced in a regular serial environment. Considering the magnitude of data to be screened, the parallelized virtual screening has a significantly lower running time at high throughput. The proposed parallel tool outperforms its serial counterpart by successfully screening billions of molecules in the training and prediction phases.
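The classification task itself can be sketched on a CPU with scikit-learn; GPURFSCREEN implements the same random-forest idea on the GPU, and the fingerprint files named here are assumptions:

```python
# CPU sketch of ligand-based virtual screening with a random forest;
# GPURFSCREEN parallelizes the same idea on a GPU. Data files are assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

X = np.load("fingerprints.npy")   # one binary fingerprint row per molecule
y = np.load("labels.npy")         # 1 = active in bioassay, 0 = inactive

clf = RandomForestClassifier(n_estimators=200, n_jobs=-1)  # CPU parallelism
clf.fit(X, y)

X_screen = np.load("library_fingerprints.npy")
scores = clf.predict_proba(X_screen)[:, 1]  # predicted probability of activity
ranked = np.argsort(scores)[::-1]           # screen the library in this order
```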
Tako, Elad; Bar, Haim; Glahn, Raymond P.
2016-01-01
Research methods that predict Fe bioavailability for humans can be extremely useful in evaluating food fortification strategies, developing Fe-biofortified staple food crops and assessing the Fe bioavailability of meal plans that include such crops. In this review, research from four recent poultry (Gallus gallus) feeding trials coupled with in vitro analyses of Fe-biofortified crops is compared to the parallel human efficacy studies which used the same varieties and harvests of the Fe-biofortified crops. Like the human studies, these trials aimed to assess the potential effects of regular consumption of these enhanced staple crops on maintenance or improvement of iron status. The results demonstrate strong agreement between the in vitro/in vivo screening approach and the parallel human studies. These observations therefore indicate that the in vitro/Caco-2 cell and Gallus gallus models can be integral tools in developing varieties of staple food crops and predicting their effect on iron status in humans. The cost-effectiveness of this approach also means that it can be used to monitor the nutritional stability of an Fe-biofortified crop once a variety has been released and integrated into the food system. These screening tools therefore represent a significant advancement to the field of crop development and can be applied to ensure the sustainability of the biofortification approach. PMID:27869705
The In Situ Enzymatic Screening (ISES) Approach to Reaction Discovery and Catalyst Identification.
Swyka, Robert A; Berkowitz, David B
2017-12-14
The importance of discovering new chemical transformations and/or optimizing catalytic combinations has led to a flurry of activity in reaction screening. The in situ enzymatic screening (ISES) approach described here utilizes biological tools (enzymes/cofactors) to advance chemistry. The protocol interfaces an organic reaction layer with an adjacent aqueous layer containing reporting enzymes that act upon the organic reaction product, giving rise to a spectroscopic signal. ISES allows the experimentalist to rapidly glean information on the relative rates of a set of parallel organic/organometallic reactions under investigation, without the need to quench the reactions or draw aliquots. In certain cases, the real-time enzymatic readout also provides information on the sense and magnitude of enantioselectivity and on substrate specificity. This article contains protocols for single-well (relative rate) and double-well (relative rate/enantiomeric excess) ISES, in addition to a colorimetric ISES protocol and a miniaturized double-well procedure. © 2017 John Wiley & Sons, Inc.
NASA Astrophysics Data System (ADS)
Giacalone, Philip L.
1993-06-01
The design of the Intelsat VII surface tension propellant management device (PMD) (an all-welded assembly consisting of about 100 individual components) was developed using a modular design approach that allowed the complex PMD assembly to be divided into smaller modules. The modular approach reduces manufacturing-related technical and schedule risks and allows many components and assemblies to be processed in parallel, while also facilitating the incorporation of quality assurance tests at all critical PMD subassembly levels. The baseline PMD assembly is made from titanium and stainless steel materials. In order to obtain a 100 percent titanium PMD, a new, state-of-the-art fine-mesh titanium screen material was developed, tested, and qualified for use as an alternative to the stainless steel screen material. The Ti-based screen material demonstrated a high level of bubble-point performance. It was integrated into a PMD assembly and successfully qualification tested at the tank assembly level.
Massively parallel de novo protein design for targeted therapeutics.
Chevalier, Aaron; Silva, Daniel-Adriano; Rocklin, Gabriel J; Hicks, Derrick R; Vergara, Renan; Murapa, Patience; Bernard, Steffen M; Zhang, Lu; Lam, Kwok-Ho; Yao, Guorui; Bahl, Christopher D; Miyashita, Shin-Ichiro; Goreshnik, Inna; Fuller, James T; Koday, Merika T; Jenkins, Cody M; Colvin, Tom; Carter, Lauren; Bohn, Alan; Bryan, Cassie M; Fernández-Velasco, D Alejandro; Stewart, Lance; Dong, Min; Huang, Xuhui; Jin, Rongsheng; Wilson, Ian A; Fuller, Deborah H; Baker, David
2017-10-05
De novo protein design holds promise for creating small stable proteins with shapes customized to bind therapeutic targets. We describe a massively parallel approach for designing, manufacturing and screening mini-protein binders, integrating large-scale computational design, oligonucleotide synthesis, yeast display screening and next-generation sequencing. We designed and tested 22,660 mini-proteins of 37-43 residues that target influenza haemagglutinin and botulinum neurotoxin B, along with 6,286 control sequences to probe contributions to folding and binding, and identified 2,618 high-affinity binders. Comparison of the binding and non-binding design sets, which are two orders of magnitude larger than any previously investigated, enabled the evaluation and improvement of the computational model. Biophysical characterization of a subset of the binder designs showed that they are extremely stable and, unlike antibodies, do not lose activity after exposure to high temperatures. The designs elicit little or no immune response and provide potent prophylactic and therapeutic protection against influenza, even after extensive repeated dosing.
Still, Kristina B. M.; Nandlal, Randjana S. S.; Slagboom, Julien; Somsen, Govert W.; Kool, Jeroen
2017-01-01
Coagulation assays currently employed are often low-throughput, require specialized equipment and/or require large blood/plasma samples. This study describes the development, optimization and early application of a generic low-volume, high-throughput screening (HTS) assay for coagulation activity. The assay is a time-course spectrophotometric measurement that kinetically monitors the clotting profile of bovine or human plasma incubated with Ca2+ and a test compound. The HTS assay can be a valuable new tool for coagulation diagnostics in hospitals, for research on coagulation disorders, for drug discovery and for venom research. A major effect following envenomation by many venomous snakes is perturbation of blood coagulation caused by haemotoxic compounds present in the venom. These compounds, such as anticoagulants, are potential leads in drug discovery for cardiovascular diseases. The assay was implemented in an integrated analytical approach consisting of reversed-phase liquid chromatography (LC) for separation of crude venom components in combination with parallel post-column coagulation screening and mass spectrometry (MS). The approach was applied for the rapid assessment and identification of profiles of haemotoxic compounds in snake venoms. Procoagulant and anticoagulant activities were correlated with accurate masses from the parallel MS measurements, facilitating the detection of peptides showing strong anticoagulant activity. PMID:29186818
Cloud computing approaches to accelerate drug discovery value chain.
Garg, Vibhav; Arora, Suchir; Gupta, Chitra
2011-12-01
Continued advancements in technology have helped high-throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn challenges computer scientists to offer matching hardware and software infrastructure while managing the varying degrees of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SaaS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as cloud computing, is now transforming drug discovery research. Integration of cloud computing with parallel computing is also expanding its footprint in the life sciences community. Speed, efficiency and cost-effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best suited to managing drug discovery and clinical development data generated using advanced HTS techniques, supporting the vision of personalized medicine.
Three-Component Reaction Discovery Enabled by Mass Spectrometry of Self-Assembled Monolayers
Montavon, Timothy J.; Li, Jing; Cabrera-Pardo, Jaime R.; Mrksich, Milan; Kozmin, Sergey A.
2011-01-01
Multi-component reactions have been extensively employed in many areas of organic chemistry. Despite significant progress, the discovery of such enabling transformations remains challenging. Here, we present the development of a parallel, label-free reaction-discovery platform, which can be used for identification of new multi-component transformations. Our approach is based on the parallel mass spectrometric screening of interfacial chemical reactions on arrays of self-assembled monolayers. This strategy enabled the identification of a simple organic phosphine that can catalyze a previously unknown condensation of siloxy alkynes, aldehydes and amines to produce 3-hydroxy amides with high efficiency and diastereoselectivity. The reaction was further optimized using solution phase methods. PMID:22169871
[Analysis of risk factors for dry eye syndrome in visual display terminal workers].
Zhu, Yong; Yu, Wen-lan; Xu, Ming; Han, Lei; Cao, Wen-dong; Zhang, Hong-bing; Zhang, Heng-dong
2013-08-01
To analyze the risk factors for dry eye syndrome in visual display terminal (VDT) workers and to provide a scientific basis for protecting the eye health of VDT workers. Questionnaire survey, Schirmer I test, tear break-up time test, and workshop microenvironment evaluation were performed in 185 VDT workers. Multivariate logistic regression analysis was performed to determine the risk factors for dry eye syndrome in VDT workers after adjustment for confounding factors. In the logistic regression model, the regression coefficients of daily mean time of exposure to screen, daily mean time of watching TV, parallel screen-eye angle, upward screen-eye angle, eye-screen distance of less than 20 cm, irregular breaks during screen-exposed work, age, and female gender on the results of Schirmer I test were 0.153, 0.548, 0.400, 0.796, 0.234, 0.516, 0.559, and -0.685, respectively; the regression coefficients of daily mean time of exposure to screen, parallel screen-eye angle, upward screen-eye angle, age, working years, and female gender on tear break-up time were 0.021, 0.625, 2.652, 0.749, 0.403, and 1.481, respectively. Daily mean time of exposure to screen, daily mean time of watching TV, parallel screen-eye angle, upward screen-eye angle, eye-screen distance of less than 20 cm, irregular breaks during screen-exposed work, age, and working years are risk factors for dry eye syndrome in VDT workers.
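The model described is an ordinary multivariate logistic regression of a dichotomized dry-eye outcome on the exposure variables. A hedged statsmodels sketch with assumed variable and file names:

```python
# Illustrative sketch of the multivariate logistic regression; the survey
# data and variable names are assumed, not the study's actual dataset.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vdt_survey.csv")  # one row per worker

model = smf.logit(
    "dry_eye ~ screen_hours + tv_hours + parallel_angle + upward_angle"
    " + close_distance + irregular_breaks + age + female",
    data=df,
).fit()
print(model.summary())  # coefficients analogous to those reported above
```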
Competitive Genomic Screens of Barcoded Yeast Libraries
Urbanus, Malene; Proctor, Michael; Heisler, Lawrence E.; Giaever, Guri; Nislow, Corey
2011-01-01
By virtue of advances in next-generation sequencing technologies, we have access to new genome sequences almost daily. The tempo of these advances is accelerating, promising greater depth and breadth. In light of these extraordinary advances, the need for fast, parallel methods to define gene function becomes ever more important. Collections of genome-wide deletion mutants in yeasts and E. coli have served as workhorses for functional characterization of gene function, but this approach is not scalable: current gene-deletion approaches require each of the thousands of genes that comprise a genome to be deleted and verified. Only after this work is complete can we pursue high-throughput phenotyping. Over the past decade, our laboratory has refined a portfolio of competitive, miniaturized, high-throughput genome-wide assays that can be performed in parallel. This parallelization is possible because of the inclusion of DNA 'tags', or 'barcodes', into each mutant, with the barcode serving as a proxy for the mutation; one can measure barcode abundance to assess mutant fitness. In this study, we seek to fill the gap between DNA sequence and barcoded mutant collections. To accomplish this we introduce a combined transposon disruption-barcoding approach that opens up parallel barcode assays to newly sequenced, but poorly characterized, microbes. To illustrate this approach we present a new Candida albicans barcoded disruption collection and describe how both microarray-based and next-generation sequencing-based platforms can be used to collect 10,000 - 1,000,000 gene-gene and drug-gene interactions in a single experiment. PMID:21860376
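The parallel readout ultimately reduces to counting barcode abundances in pooled sequencing reads. A minimal sketch, with the barcode's position in the read and the file name assumed:

```python
# Minimal sketch of barcode counting from sequencing reads as a proxy for
# mutant fitness; barcode position within the read and file name are assumed.
from collections import Counter

BARCODE_START, BARCODE_LEN = 0, 20  # assumed coordinates within each read

counts = Counter()
with open("reads.fastq") as fh:
    for i, line in enumerate(fh):
        if i % 4 == 1:  # sequence lines in a FASTQ file
            counts[line[BARCODE_START:BARCODE_START + BARCODE_LEN]] += 1

# Comparing abundance before vs. after selection estimates mutant fitness.
for barcode, n in counts.most_common(10):
    print(barcode, n)
```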
NASA Astrophysics Data System (ADS)
Loring, B.; Karimabadi, H.; Rortershteyn, V.
2015-10-01
The surface line integral convolution (LIC) visualization technique produces dense visualizations of vector fields on arbitrary surfaces. We present a screen-space surface LIC algorithm for use in distributed-memory, data-parallel, sort-last rendering infrastructures. The motivations for our work are to support analysis of datasets that are too large to fit in the main memory of a single computer and compatibility with prevalent parallel scientific visualization tools such as ParaView and VisIt. By working in screen space using OpenGL we can leverage the computational power of GPUs when they are available and run without them when they are not. We address efficiency and performance issues that arise from the transformation of data from physical to screen space by selecting an alternate screen-space domain decomposition. We analyze the algorithm's scaling behavior with and without GPUs on two high-performance computing systems using data from turbulent plasma simulations.
Functional annotation of chemical libraries across diverse biological processes.
Piotrowski, Jeff S; Li, Sheena C; Deshpande, Raamesh; Simpkins, Scott W; Nelson, Justin; Yashiroda, Yoko; Barber, Jacqueline M; Safizadeh, Hamid; Wilson, Erin; Okada, Hiroki; Gebre, Abraham A; Kubo, Karen; Torres, Nikko P; LeBlanc, Marissa A; Andrusiak, Kerry; Okamoto, Reika; Yoshimura, Mami; DeRango-Adem, Eva; van Leeuwen, Jolanda; Shirahige, Katsuhiko; Baryshnikova, Anastasia; Brown, Grant W; Hirano, Hiroyuki; Costanzo, Michael; Andrews, Brenda; Ohya, Yoshikazu; Osada, Hiroyuki; Yoshida, Minoru; Myers, Chad L; Boone, Charles
2017-09-01
Chemical-genetic approaches offer the potential for unbiased functional annotation of chemical libraries. Mutations can alter the response of cells in the presence of a compound, revealing chemical-genetic interactions that can elucidate a compound's mode of action. We developed a highly parallel, unbiased yeast chemical-genetic screening system involving three key components. First, in a drug-sensitive genetic background, we constructed an optimized diagnostic mutant collection that is predictive for all major yeast biological processes. Second, we implemented a multiplexed (768-plex) barcode-sequencing protocol, enabling the assembly of thousands of chemical-genetic profiles. Finally, based on comparison of the chemical-genetic profiles with a compendium of genome-wide genetic interaction profiles, we predicted compound functionality. Applying this high-throughput approach, we screened seven different compound libraries and annotated their functional diversity. We further validated biological process predictions, prioritized a diverse set of compounds, and identified compounds that appear to have dual modes of action.
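Functional prediction in this scheme comes down to correlating a compound's chemical-genetic profile against a compendium of genome-wide genetic interaction profiles. A schematic sketch with assumed input arrays:

```python
# Schematic sketch: predict a compound's target process by correlating its
# chemical-genetic profile with genome-wide genetic interaction profiles.
# The profile matrices and gene-name array are assumed inputs.
import numpy as np
from scipy.stats import pearsonr

compound_profile = np.load("compound_profile.npy")  # one value per mutant
compendium = np.load("gi_compendium.npy")           # genes x mutants
gene_names = np.load("gene_names.npy", allow_pickle=True)

scores = [pearsonr(compound_profile, row)[0] for row in compendium]
best = np.argsort(scores)[::-1][:10]
print(gene_names[best])  # genes whose interaction profiles match best
```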
Dockres: a computer program that analyzes the output of virtual screening of small molecules
2010-01-01
Background: This paper describes a computer program named Dockres that is designed to analyze and summarize the results of virtual screening of small molecules. The program is supplemented with utilities that support the screening process. Foremost among these utilities are scripts that run the virtual screening of a chemical library on a large number of processors in parallel. Methods: Dockres and some of its supporting utilities are written in Fortran-77; other utilities are written as C-shell scripts. They support the parallel execution of the screening. The current implementation of the program handles virtual screening with Autodock-3 and Autodock-4, but can be extended to work with the output of other programs. Results: Analysis of virtual screening by Dockres led to both active and selective lead compounds. Conclusions: Analysis of virtual screening was facilitated and enhanced by Dockres in both the authors' laboratories as well as laboratories elsewhere. PMID:20205801
Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos
2014-06-01
Expression in Escherichia coli represents the simplest and most cost-effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry, where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel and subsequently screen in small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top-ranking targets for large-scale purification. In establishing our pipeline, emphasis was put on streamlining the processes so that they can be easily, though not necessarily, automated. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs, followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows it to be easily adopted, either partially or entirely, by any average hypothesis-driven laboratory in a manual or robot-assisted manner.
Zhang, Wei Yun; Zhang, Wenhua; Liu, Zhiyuan; Li, Cong; Zhu, Zhi; Yang, Chaoyong James
2012-01-03
We have developed a novel method for efficiently screening affinity ligands (aptamers) from a complex single-stranded DNA (ssDNA) library by employing single-molecule emulsion polymerase chain reaction (PCR) based on the agarose droplet microfluidic technology. In a typical systematic evolution of ligands by exponential enrichment (SELEX) process, the enriched library is sequenced first, and tens to hundreds of aptamer candidates are analyzed via a bioinformatic approach. Possible candidates are then chemically synthesized, and their binding affinities are measured individually. Such a process is time-consuming, labor-intensive, inefficient, and expensive. To address these problems, we have developed a highly efficient single-molecule approach for aptamer screening using our agarose droplet microfluidic technology. Statistically diluted ssDNA of the pre-enriched library evolved through conventional SELEX against cancer biomarker Shp2 protein was encapsulated into individual uniform agarose droplets for droplet PCR to generate clonal agarose beads. The binding capacity of amplified ssDNA from each clonal bead was then screened via high-throughput fluorescence cytometry. DNA clones with high binding capacity and low Kd were chosen as the aptamer and can be directly used for downstream biomedical applications. We have identified an ssDNA aptamer that selectively recognizes Shp2 with a Kd of 24.9 nM. Compared to a conventional sequencing-chemical synthesis-screening work flow, our approach avoids large-scale DNA sequencing and expensive, time-consuming DNA synthesis of large populations of DNA candidates. The agarose droplet microfluidic approach is thus highly efficient and cost-effective for molecular evolution approaches and will find wide application in molecular evolution technologies, including mRNA display, phage display, and so on.
Parallel flow diffusion battery
Yeh, Hsu-Chi; Cheng, Yung-Sung
1984-08-07
A parallel flow diffusion battery for determining the mass distribution of an aerosol has a plurality of diffusion cells mounted in parallel to an aerosol stream, each diffusion cell including a stack of mesh wire screens of different density.
Visual analysis of inter-process communication for large-scale parallel computing.
Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu
2009-01-01
In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also communication between the different processes which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.
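One standard way to sidestep the one-row-per-process scaling problem is to aggregate the trace before drawing it. The Python sketch below, using a hypothetical (sender, receiver, bytes) trace format and synthetic data, bins process pairs into a fixed-size sender/receiver matrix that could be rendered as a heatmap regardless of process count; it illustrates the aggregation idea, not the authors' visualization.

```python
import random

NPROCS = 16384
BINS = 256                  # heatmap resolution, independent of NPROCS
random.seed(0)
# Hypothetical trace: (sender, receiver, bytes) records.
trace = [(random.randrange(NPROCS), random.randrange(NPROCS),
          random.choice((64, 4096))) for _ in range(100000)]

heat = [[0] * BINS for _ in range(BINS)]
scale = NPROCS // BINS
for src, dst, nbytes in trace:
    heat[src // scale][dst // scale] += nbytes  # bin process pairs

row = max(range(BINS), key=lambda r: sum(heat[r]))
print("busiest sender bin:", row, "with", sum(heat[row]), "bytes sent")
```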
Watt, Eric D.; Hornung, Michael W.; Hedge, Joan M.; Judson, Richard S.; Crofton, Kevin M.; Houck, Keith A.; Simmons, Steven O.
2016-01-01
High-throughput screening for potential thyroid-disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the U.S. Environmental Protection Agency ToxCast screening assay portfolio. To fill 1 critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast phase I and II chemical libraries, comprising 1074 unique chemicals, were initially screened using a single, high concentration to identify potential TPO inhibitors. Chemicals positive in the single-concentration screen were retested in concentration-response. Due to high false-positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed 2 additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using guaiacol as a substrate to confirm the activity profiles of putative TPO inhibitors. This effort represents the most extensive TPO inhibition screening campaign to date and illustrates a tiered screening approach that focuses resources, maximizes assay throughput, and reduces animal use. PMID:26884060
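The stratification logic of the tiered design can be illustrated with a short sketch. In the Python fragment below, a chemical counts as a selective TPO inhibitor only if its AUR-TPO activity occurs at concentrations well below those producing luciferase inhibition (nonspecific signal loss) or cytotoxicity. The AC50 values and the 3-fold margin are invented for illustration, not the published analysis.

```python
records = [
    # name,    AC50 in AUR-TPO, AC50 luciferase, cytotoxicity limit (all uM)
    ("chem_A", 3.0,             80.0,            100.0),
    ("chem_B", 40.0,            45.0,            50.0),
    ("chem_C", 10.0,            None,            9.0),   # cytotoxic first
]
MARGIN = 3.0  # require 3-fold separation from any confounding activity

def selective(tpo, luc, tox):
    """True if TPO inhibition precedes both confounding responses."""
    for confound in (luc, tox):
        if confound is not None and confound < tpo * MARGIN:
            return False
    return True

for name, tpo, luc, tox in records:
    label = ("selective TPO inhibitor" if selective(tpo, luc, tox)
             else "flagged: nonspecific or cytotoxic")
    print(name, "->", label)
```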
ERIC Educational Resources Information Center
Bullock, Emma P.; Shumway, Jessica F.; Watts, Christina M.; Moyer-Packenham, Patricia S.
2017-01-01
The purpose of this study was to contribute to the research on mathematics app use by very young children, and specifically mathematics apps for touch-screen mobile devices that contain virtual manipulatives. The study used a convergent parallel mixed methods design, in which quantitative and qualitative data were collected in parallel, analyzed…
NASA Astrophysics Data System (ADS)
Patel, Rikin D.; Kumar, Sivakumar Prasanth; Patel, Chirag N.; Shankar, Shetty Shilpa; Pandya, Himanshu A.; Solanki, Hitesh A.
2017-10-01
The traditional drug design strategy centrally focuses on optimizing binding affinity with the receptor target and evaluates pharmacokinetic properties at a later stage which causes high rate of attrition in clinical trials. Alternatively, parallel screening allows evaluation of these properties and affinity simultaneously. In a case study to identify leads from natural compounds with experimental HIV-1 reverse transcriptase (RT) inhibition, we integrated various computational approaches including Caco-2 cell permeability QSAR model with applicability domain (AD) to recognize drug-like natural compounds, molecular docking to study HIV-1 RT interactions and shape similarity analysis with known crystal inhibitors having characteristic butterfly-like model. Further, the lipophilic properties of the compounds refined from the process with best scores were examined using lipophilic ligand efficiency (LLE) index. Seven natural compound hits, viz. baicalein, (+)-calanolide A, mniopetal F, fagaronine chloride, 3,5,8-trihydroxy-4-quinolone methyl ether derivative, nitidine chloride and palmatine, were prioritized based on LLE score which demonstrated Caco-2 well absorption labeling, encompassment in AD structural coverage, better receptor affinity, shape adaptation and permissible AlogP value. We showed that this integrative approach is successful in lead exploration of natural compounds targeted against HIV-1 RT enzyme.
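The LLE index used for prioritization has a simple arithmetic core: it is commonly computed as LLE = pIC50 - logP, rewarding potency that is not merely driven by lipophilicity. The Python sketch below works through that formula on invented placeholder values (not the study's measurements) for three of the named hits.

```python
import math

compounds = [  # name, assumed IC50 (uM) vs HIV-1 RT, assumed AlogP
    ("(+)-calanolide A",  0.1, 3.9),
    ("nitidine chloride", 2.0, 2.1),
    ("palmatine",         5.0, 1.3),
]
for name, ic50_um, alogp in compounds:
    pic50 = -math.log10(ic50_um * 1e-6)  # convert uM to M, then -log10
    lle = pic50 - alogp                  # lipophilic ligand efficiency
    print(f"{name}: pIC50={pic50:.2f}  LLE={lle:.2f}")
```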
Wang, Jun; Hallinger, Daniel R; Murr, Ashley S; Buckalew, Angela R; Simmons, Steven O; Laws, Susan C; Stoker, Tammy E
2018-05-01
Thyroid uptake of iodide via the sodium-iodide symporter (NIS) is the first step in the biosynthesis of thyroid hormones that are critical for health and development in humans and wildlife. Despite having long been a known target of endocrine disrupting chemicals such as perchlorate, information regarding NIS inhibition activity is still unavailable for the vast majority of environmental chemicals. This study applied a previously validated high-throughput approach to screen for NIS inhibitors in the ToxCast phase I library, representing 293 important environmental chemicals. Here, 310 blinded samples were screened in a tiered approach using an initial single-concentration (100 μM) radioactive-iodide uptake (RAIU) assay, followed by 169 samples further evaluated in multi-concentration (0.001 μM-100 μM) testing in parallel RAIU and cell viability assays. A novel chemical ranking system that incorporates multi-concentration RAIU and cytotoxicity responses was also developed as a standardized method for chemical prioritization in current and future screenings. Representative chemical responses and thyroid effects of high-ranking chemicals are further discussed. This study significantly expands current knowledge of NIS inhibition potential in environmental chemicals and provides critical support to U.S. EPA's Endocrine Disruptor Screening Program (EDSP) initiative to expand coverage of thyroid molecular targets, as well as the development of thyroid adverse outcome pathways (AOPs).
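A ranking of the kind described, combining uptake-inhibition potency with separation from cytotoxicity, can be sketched in a few lines. The scoring formula and all values below are assumptions for illustration; the published ranking system is more elaborate.

```python
import math

chems = [
    # name, assumed RAIU IC50 (uM), assumed cell-viability LC50 (uM)
    ("perchlorate-like", 0.05, None),  # no cytotoxicity observed up to 100 uM
    ("chem_X",           5.0,  60.0),
    ("chem_Y",           20.0, 25.0),  # inhibition overlaps toxicity
]

def score(ic50, lc50, top=100.0):
    potency = math.log10(top / ic50)           # higher = more potent
    margin = math.log10((lc50 or top) / ic50)  # higher = more selective
    return potency + margin

for name, ic50, lc50 in sorted(chems, key=lambda c: -score(c[1], c[2])):
    print(f"{name}: priority score = {score(ic50, lc50):.2f}")
```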
Screening mosaic F1 females for mutations affecting zebrafish heart induction and patterning.
Alexander, J; Stainier, D Y; Yelon, D
1998-01-01
The genetic pathways underlying the induction and anterior-posterior patterning of the heart are poorly understood. The recent emergence of the zebrafish model system now allows a classical genetic approach to such challenging problems in vertebrate development. Two large-scale screens for mutations affecting zebrafish embryonic development have recently been completed; among the hundreds of mutations identified were several that affect specific aspects of cardiac morphogenesis, differentiation, and function. However, very few mutations affecting induction and/or anterior-posterior patterning of the heart were identified. We hypothesize that a directed approach utilizing molecular markers to examine these particular steps of heart development will uncover additional such mutations. To test this hypothesis, we are conducting two parallel screens for mutations that affect either the induction or the anterior-posterior patterning of the zebrafish heart. As an indicator of cardiac induction, we examine expression of nkx2.5, the earliest known marker of precardiac mesoderm; to assess anterior-posterior patterning, we distinguish ventricle from atrium with antibodies that recognize different myosin heavy chain isoforms. In order to expedite the examination of a large number of mutations, we are screening the haploid progeny of mosaic F1 females. In these ongoing screens, we have identified four mutations that affect nkx2.5 expression as well as 21 that disrupt either ventricular or atrial development and thus far have recovered several of these mutations, demonstrating the value of our approach. Future analysis of these and other cardiac mutations will provide further insight into the processes of induction and anterior-posterior patterning of the heart.
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization.
High-throughput strategies for the discovery and engineering of enzymes for biocatalysis.
Jacques, Philippe; Béchet, Max; Bigan, Muriel; Caly, Delphine; Chataigné, Gabrielle; Coutte, François; Flahaut, Christophe; Heuson, Egon; Leclère, Valérie; Lecouturier, Didier; Phalip, Vincent; Ravallec, Rozenn; Dhulster, Pascal; Froidevaux, Rénato
2017-02-01
Innovations in novel enzyme discoveries impact a wide range of industries for which biocatalysis and biotransformations represent a great challenge, e.g., the food, polymer, and chemical industries. Key tools and technologies, such as bioinformatics tools to guide mutant library design, molecular biology tools to create mutant libraries, microfluidics/microplates, parallel miniscale bioreactors and mass spectrometry technologies to create high-throughput screening methods, and experimental design tools for screening and optimization, make it possible to advance the discovery, development and implementation of enzymes and whole cells in (bio)processes. These technological innovations are accompanied by the development and implementation of clean and sustainable integrated processes to meet the growing needs of the chemical, pharmaceutical, environmental and biorefinery industries. This review gives an overview of the benefits of the high-throughput screening approach, from the discovery and engineering of biocatalysts to cell culture for optimizing their production in integrated processes and their extraction/purification.
Zhang, Yufeng; Xiao, Shun; Sun, Lijuan; Ge, Zhiwei; Fang, Fengkai; Zhang, Wen; Wang, Yi; Cheng, Yiyu
2013-05-13
A high throughput method was developed for rapid screening and identification of bioactive compounds from traditional Chinese medicine, marine products and other natural products. The system, integrated with five-channel chromatographic separation and dual UV-MS detection, is compatible with in vitro 96-well microplate-based bioassays. The stability and applicability of the proposed method were validated by testing the radical scavenging capability of a mixture of seven known compounds (rutin, dihydroquercetin, salvianolic acid A, salvianolic acid B, glycyrrhizic acid, rubescensin A and tangeretin). Moreover, the proposed method was successfully applied to the crude extracts of traditional Chinese medicine and a marine sponge, from which 12 bioactive compounds were screened and characterized based on their anti-oxidative or anti-tumor activities. In particular, two diterpenoid derivatives, agelasine B and (-)-agelasine D, were identified for the first time as anti-tumor compounds from the sponge Agelas mauritiana, showing a considerable activity toward MCF-7 cells (IC50 values of 7.84±0.65 and 10.48±0.84 μM, respectively). Our findings suggest that the integrated system of 5-channel parallel chromatography coupled with on-line mass spectrometry and microplate-based assays can be a versatile and highly efficient approach for the discovery of active compounds from natural products.
Cloning of Trametes versicolor genes induced by nitrogen starvation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trudel, P.; Courchesne, D.; Roy, C.
1988-06-01
We have screened a genomic library of Trametes versicolor for genes whose expression is associated with nitrogen starvation, which has been shown to induce ligninolytic activity. Using two different approaches based on differential expression, we isolated 29 clones. These were shown by restriction mapping and cross-hybridization to code for 11 distinct differentially expressed genes. Northern analysis of the kinetics of expression of these genes revealed that at least four of them have kinetics of induction that parallel the kinetics of induction of ligninolytic activity.
Multilevel Parallelization of AutoDock 4.2.
Norgan, Andrew P; Coffman, Paul K; Kocher, Jean-Pierre A; Katzmann, David J; Sosa, Carlos P
2011-04-28
Virtual (computational) screening is an increasingly important tool for drug discovery. AutoDock is a popular open-source application for performing molecular docking, the prediction of ligand-receptor interactions. AutoDock is a serial application, though several previous efforts have parallelized various aspects of the program. In this paper, we report on a multi-level parallelization of AutoDock 4.2 (mpAD4). Using MPI and OpenMP, AutoDock 4.2 was parallelized for use on MPI-enabled systems and to multithread the execution of individual docking jobs. In addition, code was implemented to reduce input/output (I/O) traffic by reusing grid maps at each node from docking to docking. Performance of mpAD4 was examined on two multiprocessor computers. Using MPI with OpenMP multithreading, mpAD4 scales with near linearity on the multiprocessor systems tested. In situations where I/O is limiting, reuse of grid maps reduces both system I/O and overall screening time. Multithreading of AutoDock's Lamarckian Genetic Algorithm with OpenMP increases the speed of execution of individual docking jobs, and when combined with MPI parallelization can significantly reduce the execution time of virtual screens. This work is significant in that mpAD4 speeds the execution of certain molecular docking workloads and allows the user to optimize the degree of system-level (MPI) and node-level (OpenMP) parallelization to best fit both workloads and computational resources.
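The grid-map reuse idea is worth a small illustration. The Python sketch below is a stand-in for the MPI/OpenMP implementation: a pool of worker processes supplies the system-level parallelism, and each worker keeps a per-process cache of receptor grid maps so consecutive dockings against the same receptor skip the expensive read. Job contents are simulated.

```python
from concurrent.futures import ProcessPoolExecutor

JOBS = [("receptor1", f"lig{i}") for i in range(6)] + \
       [("receptor2", f"lig{i}") for i in range(6)]

_grid_cache = {}  # per-worker-process cache, persists between jobs

def load_grid(receptor):
    """Expensive map read; done at most once per receptor per worker."""
    if receptor not in _grid_cache:
        _grid_cache[receptor] = f"<grid maps for {receptor}>"  # stand-in
    return _grid_cache[receptor]

def dock(job):
    receptor, ligand = job
    grid = load_grid(receptor)  # hits the cache on repeat receptors
    return f"{ligand} docked against {receptor} using {grid!r}"

if __name__ == "__main__":
    # Sorting groups jobs by receptor, so each worker mostly re-docks into
    # maps it has already loaded, reducing I/O as described above.
    with ProcessPoolExecutor(max_workers=2) as pool:
        for line in pool.map(dock, sorted(JOBS)):
            print(line)
```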
Song, Cen; Zhuang, Jun
2018-01-01
In security check systems, tighter screening processes increase the security level but also cause more congestion, which leads to longer wait times; heavier congestion in the lines also creates problems for the screeners. The Transportation Security Administration (TSA) Precheck Program was introduced to create fast lanes in airports with the goal of expediting passengers who the TSA does not deem to be threats. In this lane, the TSA allows passengers to enjoy fewer restrictions in order to speed up the screening time. Motivated by the TSA Precheck Program, we study parallel queueing imperfect screening systems, where the potential normal and adversary participants/applicants decide whether or not to apply to the Precheck Program. The approved participants are assigned to a faster screening channel based on a screening policy determined by an approver, who balances the safety of the passengers against the congestion of the lines. There are three types of optimal application strategies for normal applicants, depending on whether the marginal payoff is negative or positive, or whether the marginal benefit equals the marginal cost. An adversary applicant would not apply when the screening policy is sufficiently large or the number of utilized benefits is sufficiently small. The basic model is extended by considering (1) applicants' parameters that follow different distributions and (2) applicants with risk levels, where the approver determines the threshold value needed to qualify for Precheck. This article integrates game theory and queueing theory to study the optimal screening policy and provides insights into imperfect parallel queueing screening systems.
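The marginal-payoff comparison at the heart of the applicant's decision can be made concrete with a toy calculation. The Python sketch below uses M/M/1 mean waiting times as a stand-in for screening congestion and invented parameters throughout; it illustrates the trade-off, not the paper's model.

```python
def mm1_wait(arrival_rate, service_rate):
    """Mean time in system for an M/M/1 queue (must be stable)."""
    assert arrival_rate < service_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

COST_APPLY = 2.0        # assumed one-time cost of applying, in utility units
VALUE_PER_HOUR = 10.0   # assumed value of an hour saved

w_regular  = mm1_wait(arrival_rate=8.0, service_rate=10.0)  # slower lane
w_precheck = mm1_wait(arrival_rate=5.0, service_rate=12.0)  # faster lane

marginal_payoff = VALUE_PER_HOUR * (w_regular - w_precheck) - COST_APPLY
print(f"apply to the fast lane? {'yes' if marginal_payoff > 0 else 'no'} "
      f"(marginal payoff {marginal_payoff:+.2f})")
```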
Effectiveness and Cost-Effectiveness of Blood Pressure Screening in Adolescents in the United States
Wang, Y. Claire; Cheung, Angela M.; Bibbins-Domingo, Kirsten; Prosser, Lisa A.; Cook, Nancy R.; Goldman, Lee; Gillman, Matthew W.
2014-01-01
Objective To compare the long-term effectiveness and cost-effectiveness of 3 approaches to managing elevated blood pressure (BP) in adolescents in the United States: no intervention, “screen-and-treat,” and population-wide strategies to lower the entire BP distribution. Study design We used a simulation model to combine several data sources to project the lifetime costs and cardiovascular outcomes for a cohort of 15-year-old U.S. adolescents under different BP approaches and conducted cost-effectiveness analysis. We obtained BP distributions from the National Health and Nutrition Examination Survey 1999–2004 and used childhood-to-adult longitudinal correlation analyses to simulate the tracking of BP. We then used the coronary heart disease policy model to estimate lifetime coronary heart disease events, costs, and quality-adjusted life years (QALY). Results Among screen-and-treat strategies, finding and treating the adolescents at highest risk (eg, left ventricular hypertrophy) was most cost-effective ($18,000/QALY [boys] and $47,000/QALY [girls]). However, all screen-and-treat strategies were dominated by population-wide strategies such as salt reduction (cost-saving [boys] and $650/QALY [girls]) and increasing physical education ($11,000/QALY [boys] and $35,000/QALY [girls]). Conclusions Routine adolescent BP screening is moderately effective, but population-based BP interventions with broader reach could potentially be less costly and more effective for early cardiovascular disease prevention and should be implemented in parallel. PMID:20850759
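The dominance and cost-per-QALY logic used in such comparisons reduces to simple arithmetic: a strategy is dominated when an alternative costs less and yields more QALYs; otherwise its incremental cost-effectiveness ratio (ICER) is incremental cost divided by incremental QALYs. The figures in the Python sketch below are invented, not the study's estimates.

```python
strategies = [  # name, lifetime cost ($ per person), QALYs per person
    ("no intervention",  1000.0, 20.00),
    ("screen-and-treat", 1900.0, 20.02),
    ("salt reduction",    950.0, 20.03),  # cheaper and better: dominates
]
base = strategies[0]
for name, cost, qaly in strategies[1:]:
    d_cost, d_qaly = cost - base[1], qaly - base[2]
    if d_cost < 0 and d_qaly > 0:
        print(f"{name}: cost-saving (dominates {base[0]})")
    else:
        print(f"{name}: ICER = ${d_cost / d_qaly:,.0f}/QALY")
```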
Novel approach for image skeleton and distance transformation parallel algorithms
NASA Astrophysics Data System (ADS)
Qing, Kent P.; Means, Robert W.
1994-05-01
Image Understanding is more important in medical imaging than ever, particularly where real-time automatic inspection, screening and classification systems are installed. Skeleton and distance transformations are among the common operations that extract useful information from binary images and aid in Image Understanding. The distance transformation describes the objects in an image by labeling every pixel in each object with the distance to its nearest boundary. The skeleton algorithm starts from the distance transformation and finds the set of pixels that have a locally maximum label. The distance algorithm has to scan the entire image several times depending on the object width. For each pixel, the algorithm must access the neighboring pixels and find the maximum distance from the nearest boundary. It is a computational and memory access intensive procedure. In this paper, we propose a novel parallel approach to the distance transform and skeleton algorithms using the latest VLSI high-speed convolutional chips such as HNC's ViP. The algorithm speed is dependent on the object's width and takes (k + [(k-1)/3]) * 7 milliseconds for a 512 × 512 image with k being the maximum distance of the largest object. All objects in the image will be skeletonized at the same time in parallel.
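The two operations described have a well-known serial formulation that makes the parallel version easier to picture. The runnable Python sketch below labels each object pixel with its city-block distance to the background using a forward and a backward raster pass, then keeps pixels whose label is a local maximum as the skeleton. It illustrates the algorithms themselves, not the ViP chip implementation.

```python
INF = 10**9
img = [
    [0,0,0,0,0,0,0],
    [0,1,1,1,1,1,0],
    [0,1,1,1,1,1,0],
    [0,1,1,1,1,1,0],
    [0,0,0,0,0,0,0],
]
h, w = len(img), len(img[0])
d = [[0 if not v else INF for v in row] for row in img]

# Forward pass (top-left to bottom-right), then backward pass.
for y in range(h):
    for x in range(w):
        if y: d[y][x] = min(d[y][x], d[y-1][x] + 1)
        if x: d[y][x] = min(d[y][x], d[y][x-1] + 1)
for y in range(h - 1, -1, -1):
    for x in range(w - 1, -1, -1):
        if y < h - 1: d[y][x] = min(d[y][x], d[y+1][x] + 1)
        if x < w - 1: d[y][x] = min(d[y][x], d[y][x+1] + 1)

# Skeleton: object pixels whose distance label is a local maximum.
skel = [[1 if d[y][x] and all(d[y][x] >= d[y+dy][x+dx]
        for dy, dx in ((-1,0),(1,0),(0,-1),(0,1))
        if 0 <= y+dy < h and 0 <= x+dx < w) else 0
        for x in range(w)] for y in range(h)]
for row in skel:
    print("".join(".#"[v] for v in row))
```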
Gozalbes, Rafael; Carbajo, Rodrigo J; Pineda-Lucena, Antonio
2010-01-01
In the last decade, fragment-based drug discovery (FBDD) has evolved from a novel approach in the search of new hits to a valuable alternative to the high-throughput screening (HTS) campaigns of many pharmaceutical companies. The increasing relevance of FBDD in the drug discovery universe has been concomitant with an implementation of the biophysical techniques used for the detection of weak inhibitors, e.g. NMR, X-ray crystallography or surface plasmon resonance (SPR). At the same time, computational approaches have also been progressively incorporated into the FBDD process and nowadays several computational tools are available. These stretch from the filtering of huge chemical databases in order to build fragment-focused libraries comprising compounds with adequate physicochemical properties, to more evolved models based on different in silico methods such as docking, pharmacophore modelling, QSAR and virtual screening. In this paper we will review the parallel evolution and complementarities of biophysical techniques and computational methods, providing some representative examples of drug discovery success stories by using FBDD.
Prioritizing multiple therapeutic targets in parallel using automated DNA-encoded library screening
NASA Astrophysics Data System (ADS)
Machutta, Carl A.; Kollmann, Christopher S.; Lind, Kenneth E.; Bai, Xiaopeng; Chan, Pan F.; Huang, Jianzhong; Ballell, Lluis; Belyanskaya, Svetlana; Besra, Gurdyal S.; Barros-Aguirre, David; Bates, Robert H.; Centrella, Paolo A.; Chang, Sandy S.; Chai, Jing; Choudhry, Anthony E.; Coffin, Aaron; Davie, Christopher P.; Deng, Hongfeng; Deng, Jianghe; Ding, Yun; Dodson, Jason W.; Fosbenner, David T.; Gao, Enoch N.; Graham, Taylor L.; Graybill, Todd L.; Ingraham, Karen; Johnson, Walter P.; King, Bryan W.; Kwiatkowski, Christopher R.; Lelièvre, Joël; Li, Yue; Liu, Xiaorong; Lu, Quinn; Lehr, Ruth; Mendoza-Losana, Alfonso; Martin, John; McCloskey, Lynn; McCormick, Patti; O'Keefe, Heather P.; O'Keeffe, Thomas; Pao, Christina; Phelps, Christopher B.; Qi, Hongwei; Rafferty, Keith; Scavello, Genaro S.; Steiginga, Matt S.; Sundersingh, Flora S.; Sweitzer, Sharon M.; Szewczuk, Lawrence M.; Taylor, Amy; Toh, May Fern; Wang, Juan; Wang, Minghui; Wilkins, Devan J.; Xia, Bing; Yao, Gang; Zhang, Jean; Zhou, Jingye; Donahue, Christine P.; Messer, Jeffrey A.; Holmes, David; Arico-Muendel, Christopher C.; Pope, Andrew J.; Gross, Jeffrey W.; Evindar, Ghotas
2017-07-01
The identification and prioritization of chemically tractable therapeutic targets is a significant challenge in the discovery of new medicines. We have developed a novel method that rapidly screens multiple proteins in parallel using DNA-encoded library technology (ELT). Initial efforts were focused on the efficient discovery of antibacterial leads against 119 targets from Acinetobacter baumannii and Staphylococcus aureus. The success of this effort led to the hypothesis that the relative number of ELT binders alone could be used to assess the ligandability of large sets of proteins. This concept was further explored by screening 42 targets from Mycobacterium tuberculosis. Active chemical series for six targets from our initial effort as well as three chemotypes for DHFR from M. tuberculosis are reported. The findings demonstrate that parallel ELT selections can be used to assess ligandability and highlight opportunities for successful lead and tool discovery.
Developments in SPR Fragment Screening.
Chavanieu, Alain; Pugnière, Martine
2016-01-01
Fragment-based approaches have played an increasing role alongside high-throughput screening in drug discovery for the past 15 years. The label-free biosensor technology based on surface plasmon resonance (SPR) is now sensitive and informative enough to serve during primary screens and validation steps. In this review, the authors discuss the role of SPR in fragment screening. After a brief description of the underlying principles of the technique and main device developments, they evaluate the advantages and adaptations of SPR for fragment-based drug discovery. SPR can also be applied to challenging targets such as membrane receptors and enzymes. The high level of immobilization of the protein target and its stability are key points for a relevant screening that can be optimized using oriented immobilized proteins and regenerable sensors. Furthermore, to decrease the rate of false negatives, a selectivity test may be performed in parallel on the main target with its binding site mutated or blocked by a low-off-rate ligand. Fragment-based drug design, integrated in a rational workflow led by SPR, will thus have a predominant role in the next wave of drug discovery, which could be greatly enhanced by new improvements in SPR devices.
Pollier, Jacob; González-Guzmán, Miguel; Ardiles-Diaz, Wilson; Geelen, Danny; Goossens, Alain
2011-01-01
cDNA-Amplified Fragment Length Polymorphism (cDNA-AFLP) is a commonly used technique for genome-wide expression analysis that does not require prior sequence knowledge. Typically, quantitative expression data and sequence information are obtained for a large number of differentially expressed gene tags. However, most of the gene tags do not correspond to full-length (FL) coding sequences, which is a prerequisite for subsequent functional analysis. A medium-throughput screening strategy, based on the integration of polymerase chain reaction (PCR) and colony hybridization, was developed that allows the parallel screening of a cDNA library for FL clones corresponding to incomplete cDNAs. The method was applied to screen for the FL open reading frames of a selection of 163 cDNA-AFLP tags from three different medicinal plants, leading to the identification of 109 (67%) FL clones. Furthermore, the protocol allows for the use of multiple probes in a single hybridization event, thus significantly increasing the throughput when screening for rare transcripts. The presented strategy offers an efficient method for the conversion of incomplete expressed sequence tags (ESTs), such as cDNA-AFLP tags, to FL-coding sequences.
Cacace, Angela; Banks, Martyn; Spicer, Timothy; Civoli, Francesca; Watson, John
2003-09-01
G-protein-coupled receptors (GPCRs) are the most successful target proteins for drug discovery research to date. More than 150 orphan GPCRs of potential therapeutic interest have been identified for which no activating ligands or biological functions are known. One of the greatest challenges in the pharmaceutical industry is to link these orphan GPCRs with human diseases. Highly automated parallel approaches that integrate ultra-high throughput and focused screening can be used to identify small molecule modulators of orphan GPCRs. These small molecules can then be employed as pharmacological tools to explore the function of orphan receptors in models of human disease. In this review, we describe methods that utilize powerful ultra-high-throughput screening technologies to identify surrogate ligands of orphan GPCRs.
Transcription elongation factors represent in vivo cancer dependencies in glioblastoma
Miller, Tyler E.; Liau, Brian B.; Wallace, Lisa C.; Morton, Andrew R.; Xie, Qi; Dixit, Deobrat; Factor, Daniel C.; Kim, Leo J. Y.; Morrow, James J.; Wu, Qiulian; Mack, Stephen C.; Hubert, Christopher G.; Gillespie, Shawn M.; Flavahan, William A.; Hoffmann, Thomas; Thummalapalli, Rohit; Hemann, Michael T.; Paddison, Patrick J.; Horbinski, Craig M.; Zuber, Johannes; Scacheri, Peter C.; Bernstein, Bradley E.; Tesar, Paul J.; Rich, Jeremy N.
2017-01-01
Glioblastoma is a universally lethal cancer with a median survival of approximately 15 months [1]. Despite substantial efforts to define druggable targets, there are no therapeutic options that meaningfully extend glioblastoma patient lifespan. While previous work has largely focused on in vitro cellular models, here we demonstrate a more physiologically relevant approach to target discovery in glioblastoma. We adapted pooled RNA interference (RNAi) screening technology [2–4] for use in orthotopic patient-derived xenograft (PDX) models, creating a high-throughput negative selection screening platform in a functional in vivo tumour microenvironment. Using this approach, we performed parallel in vivo and in vitro screens and discovered that the chromatin and transcriptional regulators necessary for cell survival in vivo are non-overlapping with those required in vitro. We identified transcription pause-release and elongation factors as one set of in vivo-specific cancer dependencies and determined that these factors are necessary for enhancer-mediated transcriptional adaptations that enable cells to survive the tumour microenvironment. Our lead hit, JMJD6, mediates the upregulation of in vivo stress and stimulus response pathways through enhancer-mediated transcriptional pause-release, promoting cell survival specifically in vivo. Targeting JMJD6 or other identified elongation factors extends survival in orthotopic xenograft mouse models, supporting targeting the transcription elongation machinery as a therapeutic strategy for glioblastoma. More broadly, this study demonstrates the power of in vivo phenotypic screening to identify new classes of ‘cancer dependencies’ not identified by previous in vitro approaches, which could supply untapped opportunities for therapeutic intervention. PMID:28678782
Potter, B K; Avard, D; Entwistle, V; Kennedy, C; Chakraborty, P; McGuire, M; Wilson, B J
2009-01-01
Prenatal/preconceptional and newborn screening programs have been a focus of recent policy debates that have included attention to ethical, legal, and social issues (ELSIs). In parallel, there has been an ongoing discussion about whether and how ELSIs may be addressed in health technology assessment (HTA). We conducted a knowledge synthesis study to explore both guidance and current practice regarding the consideration of ELSIs in HTA for prenatal/preconceptional and newborn screening. As the concluding activity for this project, we held a Canadian workshop to discuss the issues with a diverse group of stakeholders. Based on key workshop themes integrated with our study results, we suggest that population-based genetic screening programs may present particular types of ELSIs and that a public health ethics perspective is potentially highly relevant when considering them. We also suggest that approaches to addressing ELSIs in HTA for prenatal/preconceptional and newborn screening may need to be flexible enough to respond to diversity in HTA organizations, cultural values, stakeholder communities, and contextual factors. Finally, we highlight a need for transparency in the way that HTA producers move from evidence to conclusions and the ways in which screening policy decisions are made.
Genetic screens in human cells using the CRISPR-Cas9 system.
Wang, Tim; Wei, Jenny J; Sabatini, David M; Lander, Eric S
2014-01-03
The bacterial clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 system for genome editing has greatly expanded the toolbox for mammalian genetics, enabling the rapid generation of isogenic cell lines and mice with modified alleles. Here, we describe a pooled, loss-of-function genetic screening approach suitable for both positive and negative selection that uses a genome-scale lentiviral single-guide RNA (sgRNA) library. sgRNA expression cassettes were stably integrated into the genome, which enabled a complex mutant pool to be tracked by massively parallel sequencing. We used a library containing 73,000 sgRNAs to generate knockout collections and performed screens in two human cell lines. A screen for resistance to the nucleotide analog 6-thioguanine identified all expected members of the DNA mismatch repair pathway, whereas another for the DNA topoisomerase II (TOP2A) poison etoposide identified TOP2A, as expected, and also cyclin-dependent kinase 6, CDK6. A negative selection screen for essential genes identified numerous gene sets corresponding to fundamental processes. Last, we show that sgRNA efficiency is associated with specific sequence motifs, enabling the prediction of more effective sgRNAs. Collectively, these results establish Cas9/sgRNA screens as a powerful tool for systematic genetic analysis in mammalian cells.
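The readout step described, tracking a mutant pool by massively parallel sequencing, boils down to counting sgRNA barcodes and comparing abundances between conditions. The Python sketch below works through that arithmetic on toy reads; sequences, counts, and the pseudocount choice are invented for illustration.

```python
import math
from collections import Counter

sgrna_to_gene = {"ACGTAC": "TOP2A", "GGATCC": "CDK6", "TTGACA": "CTRL"}
initial_reads = ["ACGTAC"] * 100 + ["GGATCC"] * 100 + ["TTGACA"] * 100
final_reads   = ["ACGTAC"] * 400 + ["GGATCC"] * 150 + ["TTGACA"] * 100

def counts(reads):
    """Count each sgRNA barcode, with a +1 pseudocount to avoid log(0)."""
    c = Counter(reads)
    return {sg: c.get(sg, 0) + 1 for sg in sgrna_to_gene}

c0, c1 = counts(initial_reads), counts(final_reads)
n0, n1 = sum(c0.values()), sum(c1.values())
for sg, gene in sgrna_to_gene.items():
    # Library-size-normalized log2 fold-change: >0 enriched, <0 depleted.
    lfc = math.log2((c1[sg] / n1) / (c0[sg] / n0))
    print(f"{gene}\t{sg}\tlog2FC={lfc:+.2f}")
```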
Flow Cytometry: Impact on Early Drug Discovery.
Edwards, Bruce S; Sklar, Larry A
2015-07-01
Modern flow cytometers can make optical measurements of 10 or more parameters per cell at tens of thousands of cells per second and more than five orders of magnitude dynamic range. Although flow cytometry is used in most drug discovery stages, "sip-and-spit" sampling technology has restricted it to low-sample-throughput applications. The advent of HyperCyt sampling technology has recently made possible primary screening applications in which tens of thousands of compounds are analyzed per day. Target-multiplexing methodologies in combination with extended multiparameter analyses enable profiling of lead candidates early in the discovery process, when the greatest numbers of candidates are available for evaluation. The ability to sample small volumes with negligible waste reduces reagent costs, compound usage, and consumption of cells. Improved compound library formatting strategies can further extend primary screening opportunities when samples are scarce. Dozens of targets have been screened in 384- and 1536-well assay formats, predominantly in academic screening lab settings. In concert with commercial platform evolution and trending drug discovery strategies, HyperCyt-based systems are now finding their way into mainstream screening labs. Recent advances in flow-based imaging, mass spectrometry, and parallel sample processing promise dramatically expanded single-cell profiling capabilities to bolster systems-level approaches to drug discovery.
Islam, Md Tarikul; Sarkar, Suprovath Kumar; Sultana, Nusrat; Begum, Mst Noorjahan; Bhuyan, Golam Sarower; Talukder, Shezote; Muraduzzaman, A K M; Alauddin, Md; Islam, Mohammad Sazzadul; Biswas, Pritha Promita; Biswas, Aparna; Qadri, Syeda Kashfi; Shirin, Tahmina; Banu, Bilquis; Sadya, Salma; Hussain, Manzoor; Sarwardi, Golam; Khan, Waqar Ahmed; Mannan, Mohammad Abdul; Shekhar, Hossain Uddin; Chowdhury, Emran Kabir; Sajib, Abu Ashfaqur; Akhteruzzaman, Sharif; Qadri, Syed Saleheen; Qadri, Firdausi; Mannoor, Kaiissar
2018-01-02
Bangladesh lies in the global thalassemia belt, which has a defined mutational hot-spot in the beta-globin gene. The high carrier frequencies of beta-thalassemia trait and hemoglobin E-trait in Bangladesh necessitate a reliable DNA-based carrier screening approach that could supplement the use of hematological and electrophoretic indices to overcome the barriers of carrier screening. With this view in mind, the study aimed to establish a high resolution melting (HRM) curve-based rapid and reliable mutation screening method targeting the mutational hot-spot of South Asian and Southeast Asian countries that encompasses exon-1 (c.1 - c.92), intron-1 (c.92 + 1 - c.92 + 130) and a portion of exon-2 (c.93 - c.217) of the HBB gene which harbors more than 95% of mutant alleles responsible for beta-thalassemia in Bangladesh. Our HRM approach could successfully differentiate ten beta-globin gene mutations, namely c.79G > A, c.92 + 5G > C, c.126_129delCTTT, c.27_28insG, c.46delT, c.47G > A, c.92G > C, c.92 + 130G > C, c.126delC and c.135delC in heterozygous states from the wild type alleles, implying the significance of the approach for carrier screening as the first three of these mutations account for ~85% of total mutant alleles in Bangladesh. Moreover, different combinations of compound heterozygous mutations were found to generate melt curves that were distinct from the wild type alleles and from one another. Based on the findings, sixteen reference samples were run in parallel to 41 unknown specimens to perform direct genotyping of the beta-thalassemia specimens using HRM. The HRM-based genotyping of the unknown specimens showed 100% consistency with the sequencing result. Targeting the mutational hot-spot, the HRM approach could be successfully applied for screening of beta-thalassemia carriers in Bangladesh as well as in other countries of South Asia and Southeast Asia. The approach could be a useful supplement to hematological and electrophoretic indices in order to avoid false positive and false negative results.
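Genotype calling from melt curves against reference samples, as described, can be illustrated with a nearest-curve classifier. In the Python sketch below the fluorescence values are synthetic stand-ins for instrument exports, and the simple Euclidean distance is an assumption; HRM software typically applies more elaborate normalization.

```python
def normalize(curve):
    """Scale a melt curve's fluorescence values to the [0, 1] range."""
    lo, hi = min(curve), max(curve)
    return [(v - lo) / (hi - lo) for v in curve]

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

references = {  # genotype -> melt curve (fluorescence vs. temperature)
    "wild-type":       [100, 95, 80, 40, 10, 2],
    "c.79G>A carrier": [100, 90, 65, 30, 8, 2],
}
unknown = [98, 89, 66, 31, 9, 2]

u = normalize(unknown)
call = min(references, key=lambda g: dist(u, normalize(references[g])))
print("genotype call:", call)
```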
Identification of Multiple Druggable Secondary Sites by Fragment Screening against DC-SIGN.
Aretz, Jonas; Baukmann, Hannes; Shanina, Elena; Hanske, Jonas; Wawrzinek, Robert; Zapol'skii, Viktor A; Seeberger, Peter H; Kaufmann, Dieter E; Rademacher, Christoph
2017-06-12
DC-SIGN is a cell-surface receptor for several pathogenic threats, such as HIV, Ebola virus, or Mycobacterium tuberculosis. Multiple attempts to develop inhibitors of the underlying carbohydrate-protein interactions have been undertaken in the past fifteen years. Still, drug-like DC-SIGN ligands are sparse, which is most likely due to its hydrophilic, solvent-exposed carbohydrate-binding site. Herein, we report on a parallel fragment screening against DC-SIGN applying SPR and a reporter displacement assay, which complements previous screenings using 19F NMR spectroscopy and chemical fragment microarrays. Hit validation by SPR and 1H-15N HSQC NMR spectroscopy revealed that although no fragment bound in the primary carbohydrate site, five secondary sites are available to harbor drug-like molecules. Building on key interactions of the reported fragment hits, these pockets will be targeted in future approaches to accelerate the development of DC-SIGN inhibitors.
Yoneda, Arata; Higaki, Takumi; Kutsuna, Natsumaro; Kondo, Yoichi; Osada, Hiroyuki; Hasezawa, Seiichiro; Matsui, Minami
2007-10-01
It is a well-known hypothesis that cortical microtubules control the direction of cellulose microfibril deposition, and that the parallel cellulose microfibrils determine anisotropic cell expansion and plant cell morphogenesis. However, the molecular mechanism by which cortical microtubules regulate the orientation of cellulose microfibrils is still unclear. To investigate this mechanism, chemical genetic screening was performed. From this screening, 'SS compounds' were identified that induced a spherical swelling phenotype in tobacco BY-2 cells. The SS compounds could be categorized into three classes: those that disrupted the cortical microtubules; those that reduced cellulose microfibril content; and thirdly those that had neither of these effects. In the last class, a chemical designated 'cobtorin' was found to induce the spherical swelling phenotype at the lowest concentration, suggesting strong binding activity to the putative target. Examining cellulose microfibril regeneration using taxol-treated protoplasts revealed that the cobtorin compound perturbed the parallel alignment of pre-existing cortical microtubules and nascent cellulose microfibrils. Thus, cobtorin could be a novel inhibitor and an attractive tool for further investigation of the mechanism that enables cortical microtubules to guide the parallel deposition of cellulose microfibrils.
NASA Technical Reports Server (NTRS)
Alario, J. P.; Haslett, R. A.
1986-01-01
Parallel pipes provide high heat flow from small heat exchanger. Six parallel heat pipes extract heat from overlying heat exchanger, forming evaporator. Vapor channel in pipe contains wick that extends into screen tube in liquid channel. Rods in each channel hold wick and screen tube in place. Evaporator compact rather than extended and more compatible with existing heat-exchanger geometries. Prototype six-pipe evaporator only 0.3 m wide and 0.71 m long. With ammonia as working fluid, transports heat to finned condenser at rate of 1,200 W.
Procacci, Piero
2016-06-27
We present a new release (6.0β) of the ORAC program [Marsili et al. J. Comput. Chem. 2010, 31, 1106-1116] with a hybrid OpenMP/MPI (open multiprocessing message passing interface) multilevel parallelism tailored for generalized ensemble (GE) and fast switching double annihilation (FS-DAM) nonequilibrium technology, aimed at evaluating the binding free energy in drug-receptor systems on high performance computing platforms. The production of the GE or FS-DAM trajectories is handled using a weak scaling parallel approach on the MPI level only, while a strong scaling force decomposition scheme is implemented for intranode computations with shared memory access at the OpenMP level. The efficiency, simplicity, and inherent parallel nature of the ORAC implementation of the FS-DAM algorithm position the code as a potentially effective tool for second-generation high-throughput virtual screening in drug discovery and design. The code, along with documentation, testing, and ancillary tools, is distributed under the provisions of the General Public License and can be freely downloaded at www.chim.unifi.it/orac.
Charles, Isabel; Sinclair, Ian; Addison, Daniel H
2014-04-01
A new approach to the storage, processing, and interrogation of the quality data for screening samples has improved analytical throughput and confidence and enhanced the opportunities for learning from the accumulating records. The approach has entailed the design, development, and implementation of a database-oriented system, capturing information from the liquid chromatography-mass spectrometry capabilities used for assessing the integrity of samples in AstraZeneca's screening collection. A Web application has been developed to enable the visualization and interactive annotation of the analytical data, monitor the current sample queue, and report the throughput rate. Sample purity and identity are certified automatically on the chromatographic peaks of interest if predetermined thresholds are reached on key parameters. Using information extracted in parallel from the compound registration and container inventory databases, the chromatographic and spectroscopic profiles for each vessel are linked to the sample structures and storage histories. A search engine facilitates the direct comparison of results for multiple vessels of the same or similar compounds, for single vessels analyzed at different time points, or for vessels related by their origin or process flow. Access to this network of information has provided a deeper understanding of the multiple factors contributing to sample quality assurance.
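The automatic certification rule described, pass a sample when its target peak clears preset quality thresholds, reduces to a small predicate. The Python sketch below is illustrative only: the thresholds, record fields, and certification wording are assumptions, not AstraZeneca's actual criteria.

```python
PURITY_MIN = 85.0    # % UV peak area, assumed threshold
MASS_TOL_PPM = 10.0  # assumed mass-match tolerance for identity

def certify(peak, expected_mass):
    """Auto-certify a sample if its target peak meets purity and mass checks."""
    ppm = abs(peak["mass"] - expected_mass) / expected_mass * 1e6
    ok = peak["purity"] >= PURITY_MIN and ppm <= MASS_TOL_PPM
    return "auto-certified" if ok else "queued for manual annotation"

sample = {"purity": 92.3, "mass": 415.213}  # hypothetical LC-MS result
print(certify(sample, expected_mass=415.212))
```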
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reindl, W.; Deng, K.; Gladden, J.M.
2011-05-01
The enzymatic hydrolysis of long-chain polysaccharides is a crucial step in the conversion of biomass to lignocellulosic biofuels. The identification and characterization of optimal glycoside hydrolases depends on enzyme activity assays; however, existing methods are limited in terms of compatibility with a broad range of reaction conditions, sample complexity, and especially multiplexity. The method we present is a multiplexed approach based on Nanostructure-Initiator Mass Spectrometry (NIMS) that allowed studying several glycolytic activities in parallel under diverse assay conditions. Although the substrate analogs carried a highly hydrophobic perfluorinated tag, assays could be performed in aqueous solutions due to colloid formation of the substrate molecules. We first validated our method by analyzing known β-glucosidase and β-xylosidase activities in single and parallel assay setups, followed by the identification and characterization of previously unknown glycoside hydrolase activities in microbial communities.
New technologies in cervical cancer precursor detection.
Soler, M E; Blumenthal, P D
2000-09-01
The current literature reflects three routes toward improving cervical cancer screening. The first is to improve the test qualities of cytology-based screening. The use of liquid-based cytology and computerized analysis of Papanicolaou tests are examples of attempts at this approach. Secondly, through various combinations of parallel or sequential tests, either the sensitivity or the specificity of a given test could be improved depending on the tests chosen and the order in which they were performed (eg, Papanicolaou test followed by human papillomavirus [HPV] or vice versa). Several excellent studies have been published this year on the use of HPV DNA testing as a primary screening modality and as an adjunct to the triage of mildly abnormal cytologic findings. The recent literature also reflects increasing interest in visual inspection of the cervix and self-collected samples for HPV testing as an equally effective and viable alternative to cytology in low-resource settings. A third possibility is to make use of advances in digital and spectroscopic techniques. In these cost-conscious times, a significant number of articles address the cost-effectiveness of these technologies and the real value of cervical cancer screening. This article reviews the current literature concerning both the advent of new cervical cancer screening technologies and the rediscovery of old ones.
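The second route mentioned, combining tests in parallel or in sequence, follows simple probability rules when the tests are assumed conditionally independent: parallel testing (positive if either test is positive) raises sensitivity at the cost of specificity, while sequential confirmation does the reverse. The Python sketch below works through that arithmetic; the sensitivity and specificity inputs are illustrative, not measured values for Pap or HPV testing.

```python
# Illustrative test characteristics (sensitivity, specificity), assumed.
se_pap, sp_pap = 0.70, 0.95
se_hpv, sp_hpv = 0.85, 0.90

# Parallel: a case is missed only if both tests miss it; a healthy person
# is cleared only if both tests are negative (independence assumed).
se_par = 1 - (1 - se_pap) * (1 - se_hpv)
sp_par = sp_pap * sp_hpv
# Sequential (HPV only after a positive Pap): positivity requires both.
se_seq = se_pap * se_hpv
sp_seq = 1 - (1 - sp_pap) * (1 - sp_hpv)

print(f"parallel:   Se={se_par:.3f}  Sp={sp_par:.3f}")
print(f"sequential: Se={se_seq:.3f}  Sp={sp_seq:.3f}")
```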
NASA Technical Reports Server (NTRS)
Parrish, Russell V.; Busquets, Anthony M.; Williams, Steven P.; Nold, Dean E.
1994-01-01
An extensive simulation study was performed to determine and compare the spatial awareness of commercial airline pilots on simulated landing approaches using conventional flight displays with their awareness using advanced pictorial 'pathway in the sky' displays. Sixteen commercial airline pilots repeatedly made simulated complex microwave landing system approaches to closely spaced parallel runways with an extremely short final segment. Scenarios involving conflicting traffic situation assessments and recoveries from flight path offset conditions were used to assess spatial awareness (own ship position relative to the desired flight route, the runway, and other traffic) with the various display formats. The situation assessment tools are presented, as well as the experimental designs and the results. The results demonstrate that the integrated pictorial displays substantially increase spatial awareness over conventional electronic flight information systems display formats.
Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter
2014-12-01
The enormous number of possible bioprocess variants makes it challenging for process development to fix a commercial process within cost and time constraints. Although some cultivation systems and some devices for unit operations combine the latest technology on miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to face this challenge by an interdisciplinary approach to significantly shorten development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and monitoring the optical density (OD), pH, concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and concentrations of acetate and glucose were combined into one method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we managed to increase the information content of cultivations in 96-microwell plates, thus turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach. © 2014 Society for Laboratory Automation and Screening.
Mladic, Marija; de Waal, Tessa; Burggraaff, Lindsey; Slagboom, Julien; Somsen, Govert W; Niessen, Wilfried M A; Manjunatha Kini, R; Kool, Jeroen
2017-10-01
This study presents an analytical method for the screening of snake venoms for inhibitors of the angiotensin-converting enzyme (ACE) and a strategy for their rapid identification. The method is based on an at-line nanofractionation approach, which combines liquid chromatography (LC), mass spectrometry (MS), and pharmacology in one platform. After initial LC separation of a crude venom, a post-column flow split is introduced enabling parallel MS identification and high-resolution fractionation onto 384-well plates. The plates are subsequently freeze-dried and used in a fluorescence-based ACE activity assay to determine the ability of the nanofractions to inhibit ACE activity. Once the bioactive wells are identified, the parallel MS data reveals the masses corresponding to the activities found. Narrowing down of possible bioactive candidates is provided by comparison of bioactivity profiles after reversed-phase liquid chromatography (RPLC) and after hydrophilic interaction chromatography (HILIC) of a crude venom. Additional nanoLC-MS/MS analysis is performed on the content of the bioactive nanofractions to determine peptide sequences. The method described was optimized, evaluated, and successfully applied for screening of 30 snake venoms for the presence of ACE inhibitors. As a result, two new bioactive peptides were identified: pELWPRPHVPP in Crotalus viridis viridis venom with IC50 = 1.1 μM and pEWPPWPPRPPIPP in Cerastes cerastes cerastes venom with IC50 = 3.5 μM. The identified peptides possess a high sequence similarity to other bradykinin-potentiating peptides (BPPs), which are known ACE inhibitors found in snake venoms.
Brennan; Biddison; Frauendorf; Schwarcz; Keen; Ecker; Davis; Tinder; Swayze
1998-01-01
An automated, 96-well parallel array synthesizer for solid-phase organic synthesis has been designed and constructed. The instrument employs a unique reagent array delivery format, in which each reagent utilized has a dedicated plumbing system. An inert atmosphere is maintained during all phases of a synthesis, and temperature can be controlled via a thermal transfer plate which holds the injection molded reaction block. The reaction plate assembly slides in the X-axis direction, while eight nozzle blocks holding the reagent lines slide in the Y-axis direction, allowing for the extremely rapid delivery of any of 64 reagents to 96 wells. In addition, there are six banks of fixed nozzle blocks, which deliver the same reagent or solvent to eight wells at once, for a total of 72 possible reagents. The instrument is controlled by software which allows the straightforward programming of the synthesis of a large number of compounds. This is accomplished by supplying a general synthetic procedure in the form of a command file, which calls upon certain reagents to be added to specific wells via lookup in a sequence file. The bottle position, flow rate, and concentration of each reagent are stored in a separate reagent table file. To demonstrate the utility of the parallel array synthesizer, a small combinatorial library of hydroxamic acids was prepared in high-throughput mode for biological screening. Approximately 1300 compounds were prepared on a 10 μmole scale (3-5 mg) in a few weeks. The resulting crude compounds were generally >80% pure and were utilized directly for high-throughput screening in antibacterial assays. Several active wells were found, and the activity was verified by solution-phase synthesis of analytically pure material, indicating that the system described herein is an efficient means for the parallel synthesis of compounds for lead discovery. Copyright 1998 John Wiley & Sons, Inc.
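The command-file/sequence-file/reagent-table organization described above is, at its core, a pair of nested lookups. Below is a minimal Python sketch of that control flow; the file contents, reagent names, and field layouts are invented for illustration and are not the instrument's actual formats.

```python
# Hypothetical sketch of the command/sequence/reagent-table lookup scheme
# described above. All names and values are illustrative assumptions.

reagent_table = {  # reagent -> (bottle position, flow rate uL/s, conc mM)
    "Fmoc-Gly": (3, 12.0, 100.0),
    "HATU":     (7, 10.0, 95.0),
}

sequence = {  # well -> reagents needed for that well's target compound
    "A1": ["Fmoc-Gly", "HATU"],
    "A2": ["HATU"],
}

command_file = [("add", 50.0), ("wait", 300.0)]  # (step, uL or seconds)

def run(commands, sequence, table):
    """Replay a generic synthetic procedure, resolving per-well reagents."""
    for step, value in commands:
        if step == "add":
            for well, reagents in sequence.items():
                for r in reagents:
                    bottle, flow, conc = table[r]
                    secs = value / flow  # delivery time for this volume
                    print(f"{well}: {value} uL of {r} (bottle {bottle}, "
                          f"{conc} mM) in {secs:.1f} s")
        elif step == "wait":
            print(f"incubate {value} s")

run(command_file, sequence, reagent_table)
```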
Rapid Parallel Screening for Strain Optimization
2013-08-16
fermentation yields of industrially relevant biological compounds. Screening of the desired chemicals was completed previously. Microbes that can […] reporter, and 2) a yeast TAR cloning shuttle vector for transferring catabolic clusters to E. coli.
Gupte, Ankita; Baker, Emma K.; Wan, Soo-San; Stewart, Elizabeth; Loh, Amos; Shelat, Anang A.; Gould, Cathryn M.; Chalk, Alistair M.; Taylor, Scott; Lackovic, Kurt; Karlström, Åsa; Mutsaers, Anthony J.; Desai, Jayesh; Madhamshettiwar, Piyush B.; Zannettino, Andrew CW.; Burns, Chris; Huang, David CS.; Dyer, Michael A.; Simpson, Kaylene J.; Walkley, Carl R.
2015-01-01
Purpose Osteosarcoma (OS) is the most common cancer of bone, occurring mostly in teenagers. Despite rapid advances in our knowledge of the genetics and cell biology of OS, significant improvements in patient survival have not been observed. The identification of effective therapeutics has been largely empirically based. The identification of new therapies and therapeutic targets is urgently needed to enable improved outcomes for OS patients. Experimental Design We have used genetically engineered murine models of human OS in a systematic, genome-wide screen to identify new candidate therapeutic targets. We performed a genome-wide siRNA screen, with or without doxorubicin. In parallel, a screen of therapeutically relevant small molecules was conducted on primary murine and primary human OS-derived cell cultures. All results were validated across independent cell cultures and across human and mouse OS. Results The results from the genetic and chemical screens significantly overlapped, with a profound enrichment of pathways regulated by PI3K and mTOR. Drugs that concurrently target both PI3K and mTOR were effective at inducing apoptosis in primary OS cell cultures in vitro in both human and mouse OS, while specific PI3K or mTOR inhibitors were not effective. The results were confirmed with siRNA and small molecule approaches. Rational combinations of specific PI3K and mTOR inhibitors could recapitulate the effect on OS cell cultures. Conclusions The approaches described here have identified dual inhibition of the PI3K/mTOR pathway as a sensitive, druggable target in OS and provide rationale for translational studies with these agents. PMID:25862761
Varnes, Jeffrey G; Geschwindner, Stefan; Holmquist, Christopher R; Forst, Janet; Wang, Xia; Dekker, Niek; Scott, Clay W; Tian, Gaochao; Wood, Michael W; Albert, Jeffrey S
2016-01-01
Fragment-based drug design (FBDD) relies on direct elaboration of fragment hits and typically requires high resolution structural information to guide optimization. In fragment-assisted drug discovery (FADD), fragments provide information to guide selection and design but do not serve as starting points for elaboration. We describe FADD and high-throughput screening (HTS) campaign strategies conducted in parallel against PDE10A where fragment hit co-crystallography was not available. The fragment screen led to prioritized fragment hits (IC50 ∼500 μM), which were used to generate a hypothetical core scaffold. Application of this scaffold as a filter to HTS output afforded a 4 μM hit, which, after preparation of a small number of analogs, was elaborated into a 16 nM lead. This approach highlights the strength of FADD, as fragment methods were applied despite the absence of co-crystallographic information to efficiently identify a lead compound for further optimization. Copyright © 2015 Elsevier Ltd. All rights reserved.
Quantitative screening of yeast surface-displayed polypeptide libraries by magnetic bead capture.
Yeung, Yik A; Wittrup, K Dane
2002-01-01
Magnetic bead capture is demonstrated here to be a feasible alternative for quantitative screening of favorable mutants from a cell-displayed polypeptide library. Flow cytometric sorting with fluorescent probes has been employed previously for high throughput screening for either novel binders or improved mutants. However, many laboratories do not have ready access to this technology as a result of the limited availability and high cost of cytometers, restricting the use of cell-displayed libraries. Using streptavidin-coated magnetic beads and biotinylated ligands, an alternative approach to cell-based library screening for improved mutants was developed. Magnetic bead capture probability of labeled cells is shown to be closely correlated with the surface ligand density. A single-pass enrichment ratio of 9400 +/- 1800-fold, at the expense of 85 +/- 6% binder losses, is achieved from screening a library that contains one antibody-displaying cell (binder) in 1.1 × 10^5 nondisplaying cells. Additionally, in a kinetic screen starting from a high-affinity to low-affinity (7.7-fold lower) mutant ratio of 1:95,000, the magnetic bead capture method attains a single-pass enrichment ratio of 600 +/- 200-fold with a 75 +/- 24% probability of loss for the higher affinity mutant. The observed high loss probabilities can be straightforwardly compensated for by library oversampling, given the inherently parallel nature of the screen. Overall, these results demonstrate that magnetic beads are capable of quantitatively screening for novel binders and improved mutants. The described methods are directly analogous to procedures in common use for phage display and should lower the barriers to entry for use of cell surface display libraries.
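The oversampling argument above is simple arithmetic: if a single pass loses each binder-displaying cell independently with probability close to the reported 85%, carrying enough copies of every clone makes losing all of them unlikely. A small sketch, with the 99% retention target chosen purely for illustration:

```python
import math

# If each copy of a binder is lost with probability `loss` per pass,
# n independent copies survive (at least one) with probability
# 1 - loss**n. Numbers taken from the abstract; the 99% target is an
# illustrative choice, not from the paper.

loss = 0.85                      # per-copy loss probability (85 +/- 6%)
enrichment = 9400                # reported single-pass enrichment ratio

def copies_needed(loss, p_keep):
    return math.ceil(math.log(1 - p_keep) / math.log(loss))

n = copies_needed(loss, 0.99)
print(f"{n} copies per clone give >=99% retention")   # -> 29
print(f"per-pass enrichment is unaffected: ~{enrichment}x")
```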
An Economic Evaluation of Colorectal Cancer Screening in Primary Care Practice
Meenan, Richard T.; Anderson, Melissa L.; Chubak, Jessica; Vernon, Sally W.; Fuller, Sharon; Wang, Ching-Yun; Green, Beverly B.
2015-01-01
Introduction Recent colorectal cancer screening studies focus on optimizing adherence. This study evaluated the cost effectiveness of interventions using electronic health records (EHRs), automated mailings, and stepped support increases to improve 2-year colorectal cancer screening adherence. Methods Analyses were based on a parallel-design, randomized trial in which three stepped interventions (EHR-linked mailings [“automated”], automated plus telephone assistance [“assisted”], or automated and assisted plus nurse navigation to testing completion or refusal [“navigated”]) were compared to usual care. Data were from August 2008–November 2011, with analyses performed during 2012–2013. Implementation resources were micro-costed; research and registry development costs were excluded. Incremental cost-effectiveness ratios (ICERs) were based on number of participants current for screening per guidelines over 2 years. Bootstrapping examined robustness of results. Results Intervention delivery cost per participant current for screening ranged from $21 (automated) to $27 (navigated). Inclusion of induced testing costs (e.g., screening colonoscopy) lowered expenditures for automated (ICER=−$159) and assisted (ICER=−$36) relative to usual care over 2 years. Savings arose from increased fecal occult blood testing, substituting for more expensive colonoscopies in usual care. Results were broadly consistent across demographic subgroups. More intensive interventions were consistently likely to be cost effective relative to less intensive interventions, with willingness to pay values of $600–$1,200 for an additional person current for screening yielding ≥80% probability of cost effectiveness. Conclusions Two-year cost effectiveness of a stepped approach to colorectal cancer screening promotion based on EHR data is indicated, but longer-term cost effectiveness requires further study. PMID:25998922
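As a reading aid for the economics above: an ICER is the difference in mean cost divided by the difference in effectiveness (here, persons current for screening), and the robustness check is a bootstrap over participants. A toy sketch with made-up numbers, not trial data:

```python
import numpy as np

# Minimal sketch of the ICER logic with a naive participant-level
# bootstrap. The cost and screening arrays are invented toy data.

rng = np.random.default_rng(0)
usual = {"cost": rng.gamma(2, 40, 500), "scr": rng.random(500) < 0.30}
autom = {"cost": rng.gamma(2, 35, 500), "scr": rng.random(500) < 0.45}

def icer(a, b):
    d_cost = a["cost"].mean() - b["cost"].mean()
    d_eff = a["scr"].mean() - b["scr"].mean()   # extra persons current
    return d_cost / d_eff

boots = []
for _ in range(2000):                            # resample participants
    i = rng.integers(0, 500, 500)
    j = rng.integers(0, 500, 500)
    a = {k: v[i] for k, v in autom.items()}
    b = {k: v[j] for k, v in usual.items()}
    boots.append(icer(a, b))

print(f"ICER: {icer(autom, usual):.0f} $/person current for screening")
print(f"95% CI: {np.percentile(boots, [2.5, 97.5])}")
```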
Song, Lifu; Zeng, An-Ping
2017-11-09
Cells are capable of rapid replication, perform tasks adaptively and ultra-sensitively, and can be considered cheap "biological robots". Here we propose to engineer cells for screening biomolecules in parallel and with high sensitivity. Specifically, we place the biomolecule variants (library) on the bacterial phage M13. We then design cells to screen the library based on cell-phage interactions mediated by a specific intracellular signal change caused by the biomolecule of interest. For proof of concept, we used intracellular lysine concentration in E. coli as a signal to successfully screen, under in vivo conditions, variants of functional aspartate kinase III (AK-III), a key enzyme in L-lysine biosynthesis that is strictly inhibited by L-lysine. Comparative studies with a flow cytometry method failed to distinguish the wild type from lysine-resistant variants of AK-III, confirming the higher sensitivity of our method. It opens up a new and effective way of in vivo high-throughput screening for functional molecules and can be implemented easily and at low cost.
Hagedorn, Martin; Bögershausen, Ansgar; Rischer, Matthias; Schubert, Rolf; Massing, Ulrich
2017-09-15
The development of nanosuspensions of poorly soluble APIs is time-consuming and requires large amounts of active material. In this publication, the use of dual centrifugation (DC) for effective and rapid API nanomilling is described for the first time. DC differs from normal centrifugation by an additional rotation of the samples during centrifugation, resulting in a very fast and powerful movement of the samples inside the vials, which - in combination with milling beads - results in effective milling. DC-nanomilling was compared to conventional wet ball milling and results in the same or even smaller particle sizes. Drug concentrations of up to 40% can also be processed. The process is fast (typically 90 min) and the temperature can be controlled. DC-nanomilling appears to be very gentle; experiments showed no change of the crystal structure during milling. Since batch sizes are very small (100-1000 mg) and since 40 sample vials can be processed in parallel, DC is ideal for the screening of suitable polymer/surfactant combinations. Fenofibrate was used to investigate DC-nanomilling for formulation screening by applying a DoE approach. The presented data also show that the results of DC-nanomilling experiments are highly comparable to the results obtained with common agitator mills. Copyright © 2017 Elsevier B.V. All rights reserved.
In situ click chemistry: a powerful means for lead discovery.
Sharpless, K Barry; Manetsch, Roman
2006-11-01
Combinatorial chemistry and parallel synthesis are important and regularly applied tools for lead identification and optimisation, although they are often accompanied by challenges related to the efficiency of library synthesis and the purity of the compound library. In the last decade, novel means of lead discovery approaches have been investigated where the biological target is actively involved in the synthesis of its own inhibitory compound. These fragment-based approaches, also termed target-guided synthesis (TGS), show great promise in lead discovery applications by combining the synthesis and screening of libraries of low molecular weight compounds in a single step. Of all the TGS methods, the kinetically controlled variant is the least well known, but it has the potential to emerge as a reliable lead discovery method. The kinetically controlled TGS approach, termed in situ click chemistry, is discussed in this article.
NASA Astrophysics Data System (ADS)
Sumriddetchkajorn, Sarun; Chaitavon, Kosom
2009-07-01
This paper introduces a parallel measurement approach for fast infrared-based human temperature screening suitable for use in a large public area. Our key idea is based on the combination of simple image processing algorithms, infrared technology, and human flow management. With this multidisciplinary concept, we arrange as many people as possible in a two-dimensional space in front of a thermal imaging camera and then highlight all human facial areas through simple image filtering, image morphology, and particle analysis processes. In this way, an individual's face in a live thermal image can be located and the maximum facial skin temperature can be monitored and displayed. Our experiment shows a measured 1 ms processing time in highlighting all human face areas. With a thermal imaging camera having an FOV lens of 24° × 18° and 320 × 240 active pixels, the maximum facial skin temperatures from three people's faces located at 1.3 m from the camera can also be simultaneously monitored and displayed at a measured rate of 31 fps, limited by the looping process that determines the coordinates of all faces. For our 3-day test at an ambient temperature of 24-30 °C, 57-72% relative humidity, and weak wind from outside the hospital building, hyperthermic patients could be identified with 100% sensitivity and 36.4% specificity when the temperature threshold level and the offset temperature value were appropriately chosen. Locating our system away from building doors, air conditioners, and electric fans, so as to eliminate wind blowing toward the camera lens, can significantly improve the system's specificity.
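A rough sketch of the filter/morphology/particle-analysis chain the record describes, using OpenCV on a synthetic temperature array; the threshold band, kernel size, and minimum blob area are illustrative assumptions, not the authors' settings.

```python
import numpy as np
import cv2

# Synthetic "thermal frame": a float array of temperatures in deg C,
# with one artificially warm face-sized region.
frame = (27 + 2 * np.random.rand(240, 320)).astype(np.float32)
frame[60:100, 80:110] = 36.5

T_LO, T_HI = 33.0, 41.0          # plausible facial-skin band (assumed)
mask = ((frame >= T_LO) & (frame <= T_HI)).astype(np.uint8) * 255

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop speckle

n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
for i in range(1, n):                                   # 0 = background
    if stats[i, cv2.CC_STAT_AREA] < 50:                 # too small: noise
        continue
    t_max = frame[labels == i].max()
    print(f"face at {centroids[i]}: max skin temp {t_max:.1f} C")
```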
Silicon-fiber blanket solar-cell array concept
NASA Technical Reports Server (NTRS)
Eliason, J. T.
1973-01-01
Proposed economical manufacture of solar-cell arrays involves parallel, planar weaving of filaments made of doped silicon fibers with diffused radial junction. Each filament is a solar cell connected either in series or parallel with others to form a blanket of deposited grids or attached electrode wire mesh screens.
ERIC Educational Resources Information Center
Raman, Madhavi Gayathri; Vijaya
2016-01-01
This paper captures the design of a comprehensive curriculum incorporating the four skills based exclusively on the use of parallel audio-visual and written texts. We discuss the use of authentic materials to teach English to Indian undergraduates aged 18 to 20 years. Specifically, we talk about the use of parallel reading (screen-play) and…
Kampmann, Martin; Bassik, Michael C.; Weissman, Jonathan S.
2013-01-01
A major challenge of the postgenomic era is to understand how human genes function together in normal and disease states. In microorganisms, high-density genetic interaction (GI) maps are a powerful tool to elucidate gene functions and pathways. We have developed an integrated methodology based on pooled shRNA screening in mammalian cells for genome-wide identification of genes with relevant phenotypes and systematic mapping of all GIs among them. We recently demonstrated the potential of this approach in an application to pathways controlling the susceptibility of human cells to the toxin ricin. Here we present the complete quantitative framework underlying our strategy, including experimental design, derivation of quantitative phenotypes from pooled screens, robust identification of hit genes using ultra-complex shRNA libraries, parallel measurement of tens of thousands of GIs from a single double-shRNA experiment, and construction of GI maps. We describe the general applicability of our strategy. Our pooled approach enables rapid screening of the same shRNA library in different cell lines and under different conditions to determine a range of different phenotypes. We illustrate this strategy here for single- and double-shRNA libraries. We compare the roles of genes for susceptibility to ricin and Shiga toxin in different human cell lines and reveal both toxin-specific and cell line-specific pathways. We also present GI maps based on growth and ricin-resistance phenotypes, and we demonstrate how such a comparative GI mapping strategy enables functional dissection of physical complexes and context-dependent pathways. PMID:23739767
Abreu, Rui Mv; Froufe, Hugo Jc; Queiroz, Maria João Rp; Ferreira, Isabel Cfr
2010-10-28
Virtual screening of small molecules using molecular docking has become an important tool in drug discovery. However, large scale virtual screening is time-demanding and usually requires dedicated computer clusters. There are a number of software tools that perform virtual screening using AutoDock4, but they require access to dedicated Linux computer clusters. Also, no software is available for performing virtual screening with Vina using computer clusters. In this paper we present MOLA, an easy-to-use graphical user interface tool that automates parallel virtual screening using AutoDock4 and/or Vina in bootable non-dedicated computer clusters. MOLA automates several tasks including ligand preparation, parallel AutoDock4/Vina job distribution, and result analysis. When the virtual screening project finishes, an OpenOffice spreadsheet file opens with the ligands ranked by binding energy and distance to the active site. All results files can automatically be recorded on a USB flash drive or on the hard-disk drive using VirtualBox. MOLA works inside a customized Live CD GNU/Linux operating system, developed by us, that bypasses the original operating system installed on the computers used in the cluster. This operating system boots from a CD on the master node and then clusters other computers as slave nodes via Ethernet connections. MOLA is an ideal virtual screening tool for non-experienced users with a limited number of multi-platform heterogeneous computers available and no access to dedicated Linux computer clusters. When a virtual screening project finishes, the computers can just be restarted to their original operating system. The originality of MOLA lies in the fact that any available computer, regardless of platform, can be added to the cluster, without ever using the computer's hard-disk drive and without interfering with the installed operating system. With a cluster of 10 processors, and a potential maximum speed-up of 10×, the parallel algorithm of MOLA performed with a speed-up of 8.64× using AutoDock4 and 8.60× using Vina.
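For context, the reported cluster numbers translate directly into parallel efficiency, and Amdahl's law gives the serial fraction they imply; a small sketch:

```python
# Parallel efficiency and the serial fraction implied by Amdahl's law,
# S = 1 / (s + (1 - s)/p), using the figures quoted in the abstract.

p = 10                                   # processors in the MOLA cluster
for engine, S in [("AutoDock4", 8.64), ("Vina", 8.60)]:
    eff = S / p                          # fraction of the ideal 10x
    s = (p / S - 1) / (p - 1)            # implied serial fraction
    print(f"{engine}: {eff:.0%} efficiency, ~{s:.1%} serial work")
```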
Tegel, Hanna; Yderland, Louise; Boström, Tove; Eriksson, Cecilia; Ukkonen, Kaisa; Vasala, Antti; Neubauer, Peter; Ottosson, Jenny; Hober, Sophia
2011-08-01
Protein production and analysis in a parallel fashion is today applied in laboratories worldwide and there is a great need to improve the techniques and systems used for this purpose. In order to save time and money, a fast and reliable screening method for analysis of protein production and also verification of the protein product is desired. Here, a micro-scale protocol for the parallel production and screening of 96 proteins in plate format is described. Protein capture was achieved using immobilized metal affinity chromatography and the product was verified using matrix-assisted laser desorption ionization time-of-flight MS. In order to obtain sufficiently high cell densities and product yield in the small-volume cultivations, the EnBase® cultivation technology was applied, which enables cultivation in as small volumes as 150 μL. Here, the efficiency of the method is demonstrated by producing 96 human, recombinant proteins, both in micro-scale and using a standard full-scale protocol and comparing the results in regard to both protein identity and sample purity. The results obtained are highly comparable to those acquired through employing standard full-scale purification protocols, thus validating this method as a successful initial screening step before protein production at a larger scale. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
GPU-based simulation of optical propagation through turbulence for active and passive imaging
NASA Astrophysics Data System (ADS)
Monnier, Goulven; Duval, François-Régis; Amram, Solène
2014-10-01
IMOTEP is GPU-based (Graphics Processing Unit) software relying on a fast parallel implementation of Fresnel diffraction through successive phase screens. Its applications include active imaging, laser telemetry and passive imaging through turbulence with anisoplanatic spatial and temporal fluctuations. Thanks to parallel implementation on GPU, speedups ranging from 40X to 70X are achieved. The present paper gives a brief overview of IMOTEP models, algorithms, implementation and user interface. It then focuses on major improvements recently brought to the anisoplanatic imaging simulation method. Previously, we took advantage of the computational power offered by the GPU to develop a simulation method based on large series of deterministic realisations of the PSF distorted by turbulence. The phase screen propagation algorithm, by reproducing higher moments of the incident wavefront distortion, provides realistic PSFs. We first used a coarse Gaussian model to fit the numerical PSFs and characterise their spatial statistics through only 3 parameters (two-dimensional displacement of the centroid, and width). However, this approach was unable to reproduce the effects related to the details of the PSF structure, especially the "speckles" leading to prominent high-frequency content in short-exposure images. To overcome this limitation, we recently implemented a new empirical model of the PSF, based on Principal Component Analysis (PCA) and intended to capture most of the PSF complexity. The GPU implementation allows the numerous (up to several hundred) principal components typically required under the strong-turbulence regime to be estimated and handled efficiently. A first demanding computational step involves PCA, phase screen propagation and covariance estimates. In a second step, realistic instantaneous images, fully accounting for anisoplanatic effects, are quickly generated. Preliminary results are presented.
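A minimal sketch of the PCA step outlined above, using plain NumPy on a stack of stand-in PSF realisations; array sizes and the number of retained modes are illustrative.

```python
import numpy as np

# Decompose a stack of (simulated) turbulence-distorted PSFs into
# principal components, then reconstruct instantaneous PSFs from the
# leading modes. Random data stands in for propagated phase screens.

n_psf, h, w = 500, 32, 32
psfs = np.random.rand(n_psf, h, w)

X = psfs.reshape(n_psf, -1)
mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)

k = 100                                 # "several hundreds" of modes
coeffs = U[:, :k] * s[:k]               # per-realisation coefficients
recon = coeffs @ Vt[:k] + mean          # low-rank PSF reconstruction

err = np.linalg.norm(recon - X) / np.linalg.norm(X)
print(f"relative reconstruction error with {k} modes: {err:.3f}")
```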
Zhang, Baofeng; D'Erasmo, Michael P; Murelli, Ryan P; Gallicchio, Emilio
2016-09-30
We report the results of a binding free energy-based virtual screening campaign of a library of 77 α-hydroxytropolone derivatives against the challenging RNase H active site of the reverse transcriptase (RT) enzyme of human immunodeficiency virus-1. Multiple protonation states, rotamer states, and binding modalities of each compound were individually evaluated. The work involved more than 300 individual absolute alchemical binding free energy parallel molecular dynamics calculations and over 1 million CPU hours on national computing clusters and a local campus computational grid. The thermodynamic and structural measures obtained in this work rationalize a series of characteristics of this system useful for guiding future synthetic and biochemical efforts. The free energy model identified key ligand-dependent entropic and conformational reorganization processes difficult to capture using standard docking and scoring approaches. Binding free energy-based optimization of the lead compounds emerging from the virtual screen has yielded four compounds with very favorable binding properties, which will be the subject of further experimental investigations. This work is one of the few reported applications of advanced-binding free energy models to large-scale virtual screening and optimization projects. It further demonstrates that, with suitable algorithms and automation, advanced-binding free energy models can have a useful role in early-stage drug-discovery programs.
Quartic scaling MP2 for solids: A highly parallelized algorithm in the plane wave basis
NASA Astrophysics Data System (ADS)
Schäfer, Tobias; Ramberger, Benjamin; Kresse, Georg
2017-03-01
We present a low-complexity algorithm to calculate the correlation energy of periodic systems in second-order Møller-Plesset (MP2) perturbation theory. In contrast to previous approximation-free MP2 codes, our implementation possesses a quartic scaling, O(N^4), with respect to the system size N and offers an almost ideal parallelization efficiency. The general issue that the correlation energy converges slowly with the number of basis functions is eased by an internal basis set extrapolation. The key concept to reduce the scaling is to eliminate all summations over virtual orbitals which can be elegantly achieved in the Laplace transformed MP2 formulation using plane wave basis sets and fast Fourier transforms. Analogously, this approach could allow us to calculate second-order screened exchange as well as particle-hole ladder diagrams with a similar low complexity. Hence, the presented method can be considered as a step towards systematically improved correlation energies.
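The standard identity underlying Laplace-transformed MP2 (due to Almlöf) makes the scaling reduction concrete: the orbital-energy denominator becomes a separable sum of exponential factors, so the summations over occupied and virtual indices factorize and, in a plane-wave basis, can be carried out with FFTs. Schematically, for a gapped system with quadrature points τ_q and weights w_q:

```latex
\frac{1}{\varepsilon_a+\varepsilon_b-\varepsilon_i-\varepsilon_j}
  = \int_0^{\infty} e^{-(\varepsilon_a+\varepsilon_b-\varepsilon_i-\varepsilon_j)\,\tau}\,\mathrm{d}\tau
  \approx \sum_{q=1}^{N_\tau} w_q\,
    e^{-\varepsilon_a\tau_q}\, e^{-\varepsilon_b\tau_q}\,
    e^{+\varepsilon_i\tau_q}\, e^{+\varepsilon_j\tau_q}
```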
A Stochastic Spiking Neural Network for Virtual Screening.
Morro, A; Canals, V; Oliver, A; Alomar, M L; Galan-Prado, F; Ballester, P J; Rossello, J L
2018-04-01
Virtual screening (VS) has become a key computational tool in early drug design, and screening performance is of high relevance due to the large volume of data that must be processed to identify molecules with the sought activity-related pattern. At the same time, hardware implementations of spiking neural networks (SNNs) have arisen as an emerging computing technique that can be applied to parallelize processes that normally present a high cost in terms of computing time and power. Consequently, SNNs represent an attractive alternative for performing time-consuming processing tasks, such as VS. In this brief, we present a smart stochastic spiking neural architecture that implements the ultrafast shape recognition (USR) algorithm, achieving a speed improvement of two orders of magnitude with respect to USR software implementations. The neural system is implemented in hardware using field-programmable gate arrays, allowing a highly parallelized USR implementation. The results show that, due to the high parallelization of the system, millions of compounds can be checked in reasonable times. From these results, we can state that the proposed architecture is a feasible methodology for efficiently enhancing time-consuming data-mining processes such as 3-D molecular similarity search.
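For reference, the USR algorithm that the hardware parallelizes reduces each molecule to twelve numbers: the first three moments of the atomic distance distribution measured from four reference points, compared through an inverse-Manhattan similarity. A plain NumPy/SciPy sketch on toy coordinates:

```python
import numpy as np
from scipy.stats import skew

# USR descriptors: distance-distribution moments from four reference
# points (centroid; closest to centroid; farthest from centroid;
# farthest from the latter), and the standard similarity score.

def usr_descriptor(coords):
    ctd = coords.mean(axis=0)                      # centroid
    d_ctd = np.linalg.norm(coords - ctd, axis=1)
    cst = coords[d_ctd.argmin()]                   # closest to ctd
    fct = coords[d_ctd.argmax()]                   # farthest from ctd
    d_fct = np.linalg.norm(coords - fct, axis=1)
    ftf = coords[d_fct.argmax()]                   # farthest from fct
    desc = []
    for ref in (ctd, cst, fct, ftf):
        d = np.linalg.norm(coords - ref, axis=1)
        desc += [d.mean(), d.std(), skew(d)]
    return np.array(desc)                          # 12 numbers

def usr_similarity(a, b):
    return 1.0 / (1.0 + np.abs(a - b).mean())      # in (0, 1]

mol1 = np.random.rand(30, 3) * 10                  # toy conformer
mol2 = mol1 + np.random.normal(0, 0.1, mol1.shape)
print(usr_similarity(usr_descriptor(mol1), usr_descriptor(mol2)))
```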
2010-01-01
Background Earlier diagnosis followed by multi-factorial cardiovascular risk intervention may improve outcomes in Type 2 Diabetes Mellitus (T2DM). Latent phase identification through screening requires structured, appropriately targeted population-based approaches. Providers responsible for implementing screening policy await evidence of clinical and cost effectiveness from randomised intervention trials in screen-detected T2DM cases. UK South Asians are at particularly high risk of abnormal glucose tolerance and T2DM. To be effective, national screening programmes must achieve good coverage across the population by identifying barriers to the detection of disease and adapting to the delivery of earlier care. Here we describe the rationale and methods of a systematic community screening programme and randomised controlled trial of cardiovascular risk management within a UK multiethnic setting (ADDITION-Leicester). Design A single-blind, cluster randomised, parallel group trial among people with screen-detected T2DM comparing a protocol-driven intensive multi-factorial treatment with conventional care. Methods ADDITION-Leicester consists of community-based screening and intervention phases within 20 general practices coordinated from a single academic research centre. Screening adopts a universal diagnostic approach via repeated 75 g Oral Glucose Tolerance Tests within an eligible non-diabetic population of 66,320 individuals aged 40-75 years (25-75 years South Asian). Volunteers also provide detailed medical and family histories, complete health questionnaires, and undergo anthropometric measures, lipid profiling, and a proteinuria assessment. The primary outcome is reduction in modelled Coronary Heart Disease (UKPDS CHD) risk at five years. Seven thousand volunteers (30% of South Asian ethnic origin) will be recruited over three years to identify a screen-detected T2DM cohort (n = 285) powered to detect a 6% relative difference (80% power, alpha 0.05) between treatment groups at one year. Randomisation will occur at practice level, with newly diagnosed T2DM cases receiving either conventional (according to current national guidelines) or intensive (algorithmic target-driven multi-factorial cardiovascular risk intervention) treatments. Discussion ADDITION-Leicester is the largest multiethnic (targeting >30% South Asian recruitment) community T2DM and vascular risk screening programme in the UK. By assessing feasibility and efficacy of T2DM screening, it will inform national disease prevention policy and contribute significantly to our understanding of the health care needs of UK South Asians. Trial registration ClinicalTrials.gov (NCT00318032). PMID:20170482
Bilingual parallel programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, I.; Overbeek, R.
1990-01-01
Numerous experiments have demonstrated that computationally intensive algorithms support adequate parallelism to exploit the potential of large parallel machines. Yet successful parallel implementations of serious applications are rare. The limiting factor is clearly programming technology. None of the approaches to parallel programming that have been proposed to date -- whether parallelizing compilers, language extensions, or new concurrent languages -- seem to adequately address the central problems of portability, expressiveness, efficiency, and compatibility with existing software. In this paper, we advocate an alternative approach to parallel programming based on what we call bilingual programming. We present evidence that this approach provides an effective solution to parallel programming problems. The key idea in bilingual programming is to construct the upper levels of applications in a high-level language while coding selected low-level components in low-level languages. This approach permits the advantages of a high-level notation (expressiveness, elegance, conciseness) to be obtained without the cost in performance normally associated with high-level approaches. In addition, it provides a natural framework for reusing existing code.
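A toy illustration of the bilingual pattern: orchestration stays in a high-level language while the numeric kernel lives in C. Here libm's erf() stands in for a hand-written C component; in a real bilingual design one would compile and load one's own shared library the same way. The snippet assumes a Unix-like system where find_library can locate libm.

```python
import ctypes
import ctypes.util

# Load the C math library and declare the kernel's signature. In a real
# bilingual application this would be your own compiled .so/.dylib.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.erf.restype = ctypes.c_double
libm.erf.argtypes = [ctypes.c_double]

def smooth_step(xs):
    """High-level logic (validation, mapping) stays in Python;
    the per-element numeric work runs in the C component."""
    return [0.5 * (1.0 + libm.erf(x)) for x in xs]

print(smooth_step([-2.0, 0.0, 2.0]))
```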
A redox proteomics approach to investigate the mode of action of nanomaterials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riebeling, Christian; Wiemann, Martin; Schnekenburger, Jürgen
2016-05-15
The number of engineered nanomaterials (ENMs) is steadily increasing. Therefore, alternative testing approaches with reduced costs and high predictivity suitable for high throughput screening and prioritization are urgently needed to ensure a fast and effective development of safe products. In parallel, extensive research efforts are targeted to understanding modes of action of ENMs, which may also support the development of new predictive assays. Oxidative stress is a widely accepted paradigm associated with different adverse outcomes of ENMs. It has frequently been identified in in vitro and in vivo studies and different assays have been developed for this purpose. Fluorescent dye-based read-outs are most frequently used for cell testing in vitro but may be limited due to possible interference by the ENMs. Recently, other assays have been put forward such as acellular determination of ROS production potential using methods like electron spin resonance, antioxidant quantification or the use of specific sensors. In addition, Omics-based approaches have gained increasing attention. In particular, redox proteomics can combine the assessment of oxidative stress with the advantage of getting more detailed mechanistic information. Here we propose a comprehensive testing strategy for assessing the oxidative stress potential of ENMs, which combines acellular methods and fast in vitro screening approaches, as well as a more involved detailed redox proteomics approach. This allows for screening and prioritization in a first tier and, if required, also for unraveling mechanistic details down to compromised signaling pathways. - Highlights: • Oxidative stress is a general paradigm for nanomaterial hazard mechanism of action. • Reactive oxygen species generation can be predicted using acellular assays. • Cellular assays based on fluorescence suffer from interference by nanomaterials. • Protein carbonylation is an irreversible and predictive mark of oxidative stress. • Proteomics of carbonylation indicates affected pathways and mechanism of action.
Ratni, Hasane; Rogers-Evans, Mark; Bissantz, Caterina; Grundschober, Christophe; Moreau, Jean-Luc; Schuler, Franz; Fischer, Holger; Alvarez Sanchez, Ruben; Schnider, Patrick
2015-03-12
From a micromolar high throughput screening hit 7, the successful complementary application of a chemogenomic approach and of a scaffold hopping exercise rapidly led to a low single digit nanomolar human vasopressin 1a (hV1a) receptor antagonist 38. Initial optimization of the mouse V1a activities delivered suitable tool compounds which demonstrated a V1a mediated central in vivo effect. This novel series was further optimized through parallel synthesis with a focus on balancing lipophilicity to achieve robust aqueous solubility while avoiding P-gp mediated efflux. These efforts led to the discovery of the highly potent and selective brain-penetrant hV1a antagonist RO5028442 (8) suitable for human clinical studies in people with autism.
Hossack, John A; Sumanaweera, Thilaka S; Napel, Sandy; Ha, Jun S
2002-08-01
An approach for acquiring dimensionally accurate three-dimensional (3-D) ultrasound data from multiple 2-D image planes is presented. This is based on the use of a modified linear phased array comprising a central imaging array that acquires multiple, essentially parallel, 2-D slices as the transducer is translated over the tissue of interest. Small, perpendicularly oriented tracking arrays are integrally mounted on each end of the imaging transducer. As the transducer is translated in an elevational direction with respect to the central imaging array, the images obtained by the tracking arrays remain largely coplanar. The motion between successive tracking images is determined using a minimum sum of absolute difference (MSAD) image matching technique with subpixel matching resolution. An initial phantom scanning-based test of a prototype 8 MHz array indicates that linear dimensional accuracy of 4.6% (2 sigma) is achievable. This result compares favorably with those obtained using an assumed average velocity [31.5% (2 sigma) accuracy] and using an approach based on measuring image-to-image decorrelation [8.4% (2 sigma) accuracy]. The prototype array and imaging system were also tested in a clinical environment, and early results suggest that the approach has the potential to enable a low-cost, rapid screening method for detecting carotid artery stenosis. The time for performing a screening test for carotid stenosis was reduced from an average of 45 minutes using 2-D duplex Doppler to 12 minutes using the new 3-D scanning approach.
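A one-dimensional sketch of MSAD matching with three-point parabolic sub-pixel refinement, the kind of tracking step described above (the paper's matching is two-dimensional; the signals here are synthetic):

```python
import numpy as np

# Minimum-sum-of-absolute-difference matching over integer shifts,
# refined to sub-sample precision by fitting a parabola through the
# SAD minimum and its two neighbours.

def msad_shift(ref, cur, max_shift=8):
    shifts = range(-max_shift, max_shift + 1)
    sad = np.array([np.abs(ref[max_shift:-max_shift] -
                           np.roll(cur, s)[max_shift:-max_shift]).sum()
                    for s in shifts])
    i = sad.argmin()
    if 0 < i < len(sad) - 1:            # 3-point parabolic refinement
        a, b, c = sad[i - 1], sad[i], sad[i + 1]
        frac = 0.5 * (a - c) / (a - 2 * b + c)
    else:
        frac = 0.0
    return (i - max_shift) + frac

ref = np.sin(np.linspace(0, 20, 400))
cur = np.roll(ref, 3) + np.random.normal(0, 0.01, 400)
# cur is ref delayed by 3 samples, so the estimate is close to -3
print(f"estimated shift: {msad_shift(ref, cur):.2f} samples")
```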
From the ORFeome concept to highly comprehensive, full-genome screening libraries.
Rid, Raphaela; Abdel-Hadi, Omar; Maier, Richard; Wagner, Martin; Hundsberger, Harald; Hintner, Helmut; Bauer, Johann; Onder, Kamil
2013-02-01
Recombination-based cloning techniques have in recent times facilitated the establishment of genome-scale single-gene ORFeome repositories. Their further handling and downstream application in a systematic fashion is, however, practically impeded by logistical and economic challenges. At this juncture, simultaneously transferring entire gene collections in compiled pool format could represent an advanced compromise between systematic ORFeome (an organism's entire set of protein-encoding open reading frames) projects and traditional random library approaches, but has not yet been considered in great detail. In our endeavor to merge the comprehensiveness of ORFeomes with a basically simple, streamlined, and easily executable single-tube design, we have here produced five different pooled screening-ready libraries for both Staphylococcus aureus and Homo sapiens. By evaluating the parallel transfer efficiencies of differentially sized genes from initial polymerase chain reaction (PCR) product amplification to entry and final destination library construction via quantitative real-time PCR, we found that the complexity of the gene population is fairly stably maintained once an entry resource has been successfully established, and that no apparent size-selection bias (loss of large inserts) takes place. Recombinational transfer processes are hence robust enough for straightforwardly achieving such pooled screening libraries.
Iskit, Sedef; Lieftink, Cor; Halonen, Pasi; Shahrabi, Aida; Possik, Patricia A; Beijersbergen, Roderick L; Peeper, Daniel S
2016-07-12
Breast cancer is the second most common cause of cancer-related deaths worldwide among women. Despite several therapeutic options, 15% of breast cancer patients succumb to the disease owing to tumor relapse and acquired therapy resistance. Particularly in triple-negative breast cancer (TNBC), developing effective treatments remains challenging owing to the lack of a common vulnerability that can be exploited by targeted approaches. We have previously shown that tumor cells have different requirements for growth in vivo than in vitro. Therefore, to discover novel drug targets for TNBC, we performed parallel in vivo and in vitro genetic shRNA dropout screens. We identified several potential drug targets that were required for tumor growth in vivo to a greater extent than in vitro. By combining pharmacologic inhibitors acting on a subset of these candidates, we identified a synergistic interaction between EGFR and ROCK inhibitors. This combination effectively reduced TNBC cell growth by inducing cell cycle arrest. These results illustrate the power of in vivo genetic screens and warrant further validation of EGFR and ROCK as combined pharmacologic targets for breast cancer.
MEGADOCK: An All-to-All Protein-Protein Interaction Prediction System Using Tertiary Structure Data
Ohue, Masahito; Matsuzaki, Yuri; Uchikoga, Nobuyuki; Ishida, Takashi; Akiyama, Yutaka
2014-01-01
The elucidation of protein-protein interaction (PPI) networks is important for understanding cellular structure and function and structure-based drug design. However, the development of an effective method to conduct exhaustive PPI screening represents a computational challenge. We have been investigating a protein docking approach based on shape complementarity and physicochemical properties. We describe here the development of the protein-protein docking software package “MEGADOCK” that samples an extremely large number of protein dockings at high speed. MEGADOCK reduces the calculation time required for docking by using several techniques such as a novel scoring function called the real Pairwise Shape Complementarity (rPSC) score. We showed that MEGADOCK is capable of exhaustive PPI screening by completing docking calculations 7.5 times faster than the conventional docking software, ZDOCK, while maintaining an acceptable level of accuracy. When MEGADOCK was applied to a subset of a general benchmark dataset to predict 120 relevant interacting pairs from 120 x 120 = 14,400 combinations of proteins, an F-measure value of 0.231 was obtained. Further, we showed that MEGADOCK can be applied to a large-scale protein-protein interaction-screening problem with accuracy better than random. When our approach is combined with parallel high-performance computing systems, it is now feasible to search and analyze protein-protein interactions while taking into account three-dimensional structures at the interactome scale. MEGADOCK is freely available at http://www.bi.cs.titech.ac.jp/megadock. PMID:23855673
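To unpack the benchmark bookkeeping above: with 120 relevant pairs among 14,400 candidates, the F-measure combines precision and recall as shown below. The confusion counts in the sketch are invented to illustrate the calculation; only the 0.231 figure comes from the abstract.

```python
# F-measure arithmetic for the 120-of-14,400 benchmark setting.
relevant = 120                  # true interacting pairs
candidates = 120 * 120          # all tested combinations

def f_measure(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# hypothetical predictor flagging 240 pairs, 42 of them correctly:
tp, flagged = 42, 240
print(f"{flagged} of {candidates} pairs flagged; "
      f"F = {f_measure(tp, flagged - tp, relevant - tp):.3f}")  # ~0.233
```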
DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.
Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques
2008-09-08
Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.
High performance in silico virtual drug screening on many-core processors.
McIntosh-Smith, Simon; Price, James; Sessions, Richard B; Ibarra, Amaurys A
2015-05-01
Drug screening is an important part of the drug development pipeline for the pharmaceutical industry. Traditional, lab-based methods are increasingly being augmented with computational methods, ranging from simple molecular similarity searches through more complex pharmacophore matching to more computationally intensive approaches, such as molecular docking. The latter simulates the binding of drug molecules to their targets, typically protein molecules. In this work, we describe BUDE, the Bristol University Docking Engine, which has been ported to the OpenCL industry standard parallel programming language in order to exploit the performance of modern many-core processors. Our highly optimized OpenCL implementation of BUDE sustains 1.43 TFLOP/s on a single Nvidia GTX 680 GPU, or 46% of peak performance. BUDE also exploits OpenCL to deliver effective performance portability across a broad spectrum of different computer architectures from different vendors, including GPUs from Nvidia and AMD, Intel's Xeon Phi and multi-core CPUs with SIMD instruction sets.
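The sustained-versus-peak figure can be sanity-checked from public GTX 680 specifications (1536 CUDA cores at roughly 1.006 GHz, two single-precision FLOPs per cycle per core); these spec numbers are not from the paper itself:

```python
# Peak vs. sustained throughput check using public GTX 680 specs.
cores, clock_ghz, flops_per_cycle = 1536, 1.006, 2
peak_tflops = cores * clock_ghz * flops_per_cycle / 1000.0  # ~3.09
sustained_tflops = 1.43                                     # reported
print(f"peak ~{peak_tflops:.2f} TFLOP/s, sustained fraction "
      f"{sustained_tflops / peak_tflops:.0%}")              # ~46%
```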
Localized transfection on arrays of magnetic beads coated with PCR products.
Isalan, Mark; Santori, Maria Isabel; Gonzalez, Cayetano; Serrano, Luis
2005-02-01
High-throughput gene analysis would benefit from new approaches for delivering DNA or RNA into cells. Here we describe a simple system that allows any molecular biology laboratory to carry out multiple, parallel cell transfections on microscope coverslip arrays. By using magnetically defined positions and PCR product-coated paramagnetic beads, we achieved transfection in a variety of cell lines. Beads may be added to the cells at any time, allowing both spatial and temporal control of transfection. Because the beads may be coated with more than one gene construct, the method can be used to achieve cotransfection within single cells. Furthermore, PCR-generated mutants may be conveniently screened, bypassing cloning and plasmid purification steps. We illustrated the applicability of the method by screening combinatorial peptide libraries, fused to GFP, to identify previously unknown cellular localization motifs. In this way, we identified several localizing peptides, including structured localization signals based around the scaffold of a single C2H2 zinc finger.
2013-01-01
Background Efficient screening of bacterial artificial chromosome (BAC) libraries with polymerase chain reaction (PCR)-based markers is feasible provided that a multidimensional pooling strategy is implemented. Single nucleotide polymorphisms (SNPs) can be screened in multiplexed format, therefore this marker type lends itself particularly well for medium- to high-throughput applications. Combining the power of multiplex-PCR assays with a multidimensional pooling system may prove to be especially challenging in a polyploid genome. In polyploid genomes two classes of SNPs need to be distinguished, polymorphisms between accessions (intragenomic SNPs) and those differentiating between homoeologous genomes (intergenomic SNPs). We have assessed whether the highly parallel Illumina GoldenGate® Genotyping Assay is suitable for the screening of a BAC library of the polyploid Brassica napus genome. Results A multidimensional screening platform was developed for a Brassica napus BAC library which is composed of almost 83,000 clones. Intragenomic and intergenomic SNPs were included in Illumina’s GoldenGate® Genotyping Assay and both SNP classes were used successfully for screening of the multidimensional BAC pools of the Brassica napus library. An optimized scoring method is proposed which is especially valuable for SNP calling of intergenomic SNPs. Validation of the genotyping results by independent methods revealed a success of approximately 80% for the multiplex PCR-based screening regardless of whether intra- or intergenomic SNPs were evaluated. Conclusions Illumina’s GoldenGate® Genotyping Assay can be efficiently used for screening of multidimensional Brassica napus BAC pools. SNP calling was specifically tailored for the evaluation of BAC pool screening data. The developed scoring method can be implemented independently of plant reference samples. It is demonstrated that intergenomic SNPs represent a powerful tool for BAC library screening of a polyploid genome. PMID:24010766
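The multidimensional pooling logic is easy to state in code: each clone belongs to one pool per dimension, so a marker positive in one pool of each dimension resolves to a unique clone, while multiple positives in a dimension create ambiguity that needs follow-up. A toy sketch with invented pool sizes (216 plates × 16 rows × 24 columns ≈ 83,000 clones):

```python
import itertools

# Each clone has a unique (plate, row, column) address; pools are the
# slices of the library sharing one coordinate. Sizes are illustrative.
plates, rows, cols = 216, 16, 24
clones = {(p, r, c): f"clone_{i}"
          for i, (p, r, c) in enumerate(
              itertools.product(range(plates), range(rows), range(cols)))}

def deconvolve(pos_plates, pos_rows, pos_cols):
    """Intersect positive pools from each dimension."""
    return [clones[(p, r, c)]
            for p in pos_plates for r in pos_rows for c in pos_cols]

# one positive pool per dimension -> unambiguous clone address
print(deconvolve({12}, {3}, {17}))
# two positives in one dimension -> candidates needing a follow-up PCR
print(len(deconvolve({12, 13}, {3}, {17})))
```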
Functional Profiling Using the Saccharomyces Genome Deletion Project Collections.
Nislow, Corey; Wong, Lai Hong; Lee, Amy Huei-Yi; Giaever, Guri
2016-09-01
The ability to measure and quantify the fitness of an entire organism requires considerably more complex approaches than simply using traditional "omic" methods that examine, for example, the abundance of RNA transcripts, proteins, or metabolites. The yeast deletion collections represent the only systematic, comprehensive set of null alleles for any organism in which such fitness measurements can be assayed. Generated by the Saccharomyces Genome Deletion Project, these collections allow the systematic and parallel analysis of gene functions using any measurable phenotype. The unique 20-bp molecular barcodes engineered into the genome of each deletion strain facilitate the massively parallel analysis of individual fitness. Here, we present functional genomic protocols for use with the yeast deletion collections. We describe how to maintain, propagate, and store the deletion collections and how to perform growth fitness assays on single and parallel screening platforms. Phenotypic fitness analyses of the yeast mutants, described in brief here, provide important insights into biological functions, mechanisms of drug action, and response to environmental stresses. It is important to bear in mind that the specific assays described in this protocol represent some of the many ways in which these collections can be assayed, and in this description particular attention is paid to maximizing throughput using growth as the phenotypic measure. © 2016 Cold Spring Harbor Laboratory Press.
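A minimal sketch of how barcode counts become fitness estimates in such pooled assays: count each strain's barcode before and after competitive growth, take the log2 change in relative abundance, and center on the pool median. The counts below are simulated stand-ins, and this is a generic normalization scheme rather than the protocol's exact procedure.

```python
import numpy as np

# Simulated barcode counts for 1000 deletion strains, before (t0) and
# after (tN) pooled growth; the first 10 strains are made "sick".
rng = np.random.default_rng(1)
t0 = rng.poisson(500, size=1000).astype(float) + 1
tN = t0 * 2 ** rng.normal(0, 0.3, size=1000)
tN[:10] *= 0.25

# log2 change in relative abundance, centered on the pool median
logfc = np.log2(tN / tN.sum()) - np.log2(t0 / t0.sum())
fitness = logfc - np.median(logfc)

hits = np.argsort(fitness)[:10]        # lowest-fitness strains
print("lowest-fitness strains:", hits)
```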
Falck, David; de Vlieger, Jon S B; Giera, Martin; Honing, Maarten; Irth, Hubertus; Niessen, Wilfried M A; Kool, Jeroen
2012-04-01
In this study, an integrated approach is developed for the formation, identification and biological characterization of electrochemical conversion products of p38α mitogen-activated protein kinase inhibitors. This work demonstrates the hyphenation of an electrochemical reaction cell with a continuous-flow bioaffinity assay and parallel LC-HR-MS. Competition of the formed products with a tracer (SKF-86002) that shows fluorescence enhancement in the orthosteric binding site of the p38α kinase is the readout for bioaffinity. Parallel HR-MS(n) experiments provided information on the identity of binders and non-binders. Finally, the data produced with this on-line system were compared to electrochemical conversion products generated off-line. The electrochemical conversion of 1-{6-chloro-5-[(2R,5S)-4-(4-fluorobenzyl)-2,5-dimethylpiperazine-1-carbonyl]-3aH-indol-3-yl}-2-morpholinoethane-1,2-dione resulted in eight products, three of which showed bioaffinity in the continuous-flow p38α bioaffinity assay used. Electrochemical conversion of BIRB796 resulted, amongst others, in the formation of the reactive quinoneimine structure and its corresponding hydroquinone. Both products were detected in the p38α bioaffinity assay, which indicates binding to the p38α kinase.
High-Throughput Screening of Na(V)1.7 Modulators Using a Giga-Seal Automated Patch Clamp Instrument.
Chambers, Chris; Witton, Ian; Adams, Cathryn; Marrington, Luke; Kammonen, Juha
2016-03-01
Voltage-gated sodium (Na(V)) channels have an essential role in the initiation and propagation of action potentials in excitable cells, such as neurons. Of these channels, Na(V)1.7 has been identified as a key channel for pain sensation. While extensive efforts have gone into discovering novel Na(V)1.7-modulating compounds for the treatment of pain, none has reached the market yet. In the last two years, new compound screening technologies have been introduced, which may speed up the discovery of such compounds. The Sophion Qube(®) is a next-generation 384-well giga-seal automated patch clamp (APC) screening instrument, capable of testing thousands of compounds per day. By combining high-throughput screening and follow-up compound testing on the same APC platform, it should be possible to accelerate the hit-to-lead stage of ion channel drug discovery and help identify the most interesting compounds faster. Following a period of instrument beta-testing, a Na(V)1.7 high-throughput screen was run with two Pfizer plate-based compound subsets. In total, data were generated for 158,000 compounds at a median success rate of 83%, which can be considered high in APC screening. In parallel, IC50 assay validation and protocol optimization were completed with a set of reference compounds to understand how the IC50 potencies generated on the Qube correlate with data generated on the more established Sophion QPatch(®) APC platform. In summary, the results presented here demonstrate that the Qube provides a comparable but much faster approach to study Na(V)1.7 in a robust and reliable APC assay for compound screening.
Identifying Interacting Genetic Variations by Fish-Swarm Logic Regression
Yang, Aiyuan; Yan, Chunxia; Zhu, Feng; Zhao, Zhongmeng; Cao, Zhi
2013-01-01
Understanding associations between genotypes and complex traits is a fundamental problem in human genetics. A major open problem in mapping phenotypes is that of identifying a set of interacting genetic variants which might contribute to complex traits. Logic regression (LR) is a powerful multivariant association tool. Several LR-based approaches have been successfully applied to different datasets. However, these approaches are not adequate with regard to accuracy and efficiency. In this paper, we propose a new LR-based approach, called fish-swarm logic regression (FSLR), which improves the logic regression process by incorporating swarm optimization. In our approach, a school of fish agents is run in parallel. Each fish agent holds a regression model, while the school searches for better models through various preset behaviors. The swarm algorithm improves accuracy and efficiency by speeding up convergence and preventing the search from becoming trapped in local optima. We apply our approach to a real screening dataset and a series of simulation scenarios. Compared to three existing LR-based approaches, our approach outperforms them by having lower type I and type II error rates, identifying more preset causal sites, and running at faster speeds. PMID:23984382
NASA Astrophysics Data System (ADS)
Matsuda, Y.; Kakutani, K.; Nonomura, T.; Kimbara, J.; Osamura, K.; Kusakari, S.; Toyoda, H.
2015-10-01
An electric field screen can be used to keep mosquitoes out of houses with open windows. In this study, doubly charged dipolar electric field screens (DD-screens) were used to capture mosquitoes entering through a window. The screen had two components: three layers of insulated conductor iron wires (ICWs) in parallel arrays, and two electrostatic direct current (DC) voltage generators that supplied negative or positive voltages to the ICWs. Within each layer, the ICWs were parallel at 5-mm intervals and connected to each other and to a negative or positive voltage generator. The negatively and positively charged ICWs are represented as ICW(-) and ICW(+), respectively. The screen consisted of one ICW(+) layer with an ICW(-) layer on either side. The Asian tiger mosquito (Aedes albopictus) and house mosquito (Culex pipiens) were used as models of vectors carrying viral pathogens. Adult mosquitoes were blown into the space between the ICWs by sending compressed air through the tip of an insect aspirator, to determine the voltage range that captured all of the test insects. Wind speed was measured at the surface of the ICWs using a sensitive anemometer. The results showed that at ≥1.2 kV the force was strong enough for the ICWs to capture all of the mosquitoes, even at a wind speed of 7 m/s. Therefore, the DD-screen could serve as a physical barrier that keeps noxious mosquitoes out while still allowing good airflow into the house.
AA9int: SNP Interaction Pattern Search Using Non-Hierarchical Additive Model Set.
Lin, Hui-Yi; Huang, Po-Yu; Chen, Dung-Tsa; Tung, Heng-Yuan; Sellers, Thomas A; Pow-Sang, Julio; Eeles, Rosalind; Easton, Doug; Kote-Jarai, Zsofia; Amin Al Olama, Ali; Benlloch, Sara; Muir, Kenneth; Giles, Graham G; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher A; Schleutker, Johanna; Nordestgaard, Børge G; Travis, Ruth C; Hamdy, Freddie; Neal, David E; Pashayan, Nora; Khaw, Kay-Tee; Stanford, Janet L; Blot, William J; Thibodeau, Stephen N; Maier, Christiane; Kibel, Adam S; Cybulski, Cezary; Cannon-Albright, Lisa; Brenner, Hermann; Kaneva, Radka; Batra, Jyotsna; Teixeira, Manuel R; Pandha, Hardev; Lu, Yong-Jie; Park, Jong Y
2018-06-07
The use of single nucleotide polymorphism (SNP) interactions to predict complex diseases has attracted increasing attention over the past decade, but related statistical methods are still immature. We previously proposed the SNP Interaction Pattern Identifier (SIPI) approach to evaluate 45 SNP interaction patterns. SIPI is statistically powerful but carries a large computational burden. For large-scale studies, it is necessary to use a powerful and computationally efficient method. The objective of this study is to develop an evidence-based mini-version of SIPI for use as a screening tool or on its own, and to evaluate the impact of inheritance mode and model structure on detecting SNP-SNP interactions. We tested two candidate approaches: the 'Five-Full' and 'AA9int' methods. The Five-Full approach is composed of the five full interaction models considering three inheritance modes (additive, dominant and recessive). The AA9int approach is composed of nine interaction models obtained by considering non-hierarchical model structures and the additive mode. Our simulation results show that AA9int has statistical power similar to SIPI and is superior to the Five-Full approach, and that the impact of the non-hierarchical model structure is greater than that of the inheritance mode in detecting SNP-SNP interactions. In summary, AA9int is recommended as a powerful tool to be used either alone or as the screening stage of a two-stage approach (AA9int+SIPI) for detecting SNP-SNP interactions in large-scale studies. The 'AA9int' and 'parAA9int' functions (standard and parallel computing versions) are included in the SIPI R package, which is freely available at https://linhuiyi.github.io/LinHY_Software/. hlin1@lsuhsc.edu. Supplementary data are available at Bioinformatics online.
Williams, Jane H; Carter, Stacy M
2016-10-06
Cervical cancer disproportionately burdens disadvantaged women. Organised cervical screening aims to make cancer prevention available to all women in a population, yet screening uptake and cancer incidence and mortality are strongly correlated with socioeconomic status (SES). Reaching underscreened populations is a stated priority in many screening programs, usually with an emphasis on something like 'equity'. Equity is a poorly defined and understood concept. We aimed to explain experts' perspectives on how cervical screening programs might justifiably respond to 'the underscreened'. This paper reports on a grounded theory study of cervical screening experts involved in program organisation. Participants were 23 experts from several countries and a range of backgrounds: gynecology; epidemiology; public health; pathology; general practice; policy making. Data were gathered via semi-structured interview and concepts developed through transcript coding and memo writing. Most experts expressed an intuitive commitment to reducing systematic differences in screening participation or cancer outcomes. They took three different implicit positions, however, on what made organised programs justifiable with respect to underscreened populations. These were: 1) accepting that population screening is likely to miss certain disenfranchised groups for practical and cultural reasons, and focusing on maximising mainstream reach; 2) identifying and removing barriers to screening; and 3) providing parallel tailored screening services that attended to different cultural needs. Positions tended to fall along country of practice lines. Experts emphasised the provision of opportunity for underscreened populations to take up screening. A focus on opportunity appeared to rely on tacit premises not supported by evidence: that provision of meaningful opportunity leads to increased uptake, and that increased uptake of an initial screening test by disadvantaged populations would decrease cervical cancer incidence and mortality. There was little attention to anything other than the point of testing, or the difficulties disadvantaged women can have in accessing follow up care. The different approaches to 'improving equity' taken by participants are differently justified, and differently justifiable, but none attend directly to the broader conditions of disadvantage.
DOVIS: an implementation for high-throughput virtual screening using AutoDock.
Zhang, Shuxing; Kumar, Kamal; Jiang, Xiaohui; Wallqvist, Anders; Reifman, Jaques
2008-02-27
Molecular-docking-based virtual screening is an important tool in drug discovery that is used to significantly reduce the number of possible chemical compounds to be investigated. In addition to the selection of a sound docking strategy with appropriate scoring functions, another technical challenge is to screen millions of compounds in silico in a reasonable time. To meet this challenge, it is necessary to use high-performance computing (HPC) platforms and techniques. However, the development of an integrated HPC system that makes efficient use of its elements is not trivial. We have developed an application termed DOVIS that uses AutoDock (version 3) as the docking engine and runs in parallel on a Linux cluster. DOVIS can efficiently dock large numbers (millions) of small molecules (ligands) to a receptor, screening 500 to 1,000 compounds per processor per day. Furthermore, in DOVIS, the docking session is fully integrated and automated: the inputs are specified via a graphical user interface, the calculations are fully integrated with a Linux cluster queuing system for parallel processing, and the results can be visualized and queried. DOVIS removes most of the complexities and organizational problems associated with large-scale high-throughput virtual screening, and provides a convenient and efficient solution for AutoDock users to run this software on a Linux cluster platform.
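DOVIS couples AutoDock 3 to a cluster queuing system; the sketch below shows only the generic job-farming pattern on a single node, not DOVIS's actual code. The `autodock3` command line, the per-ligand `.dpf`/`.dlg` file naming, and the directory layout are placeholder assumptions for illustration.

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

LIGAND_DIR = Path("ligands")  # placeholder: one .pdbq file per ligand

def dock_one(ligand: Path) -> tuple:
    """Run one docking job. Each ligand is independent of the others,
    so jobs parallelize trivially across cores or cluster nodes."""
    # Placeholder invocation: a real pipeline would first generate a
    # per-ligand docking parameter file (.dpf) referencing the receptor.
    result = subprocess.run(
        ["autodock3", "-p", f"{ligand.stem}.dpf", "-l", f"{ligand.stem}.dlg"],
        capture_output=True,
    )
    return ligand.name, result.returncode

if __name__ == "__main__":
    ligands = sorted(LIGAND_DIR.glob("*.pdbq"))
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        for name, rc in pool.map(dock_one, ligands):
            print(name, "ok" if rc == 0 else f"failed ({rc})")
```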
Chitty, Lyn S.; Lo, Y. M. Dennis
2015-01-01
The identification of cell-free fetal DNA (cffDNA) in maternal plasma in 1997 heralded the most significant change in obstetric care for decades, with the advent of safer screening and diagnosis based on analysis of maternal blood. Here, we describe how the technological advances offered by next-generation sequencing have allowed for the development of a highly sensitive screening test for aneuploidies as well as definitive prenatal molecular diagnosis for some monogenic disorders. PMID:26187875
Directions in parallel programming: HPF, shared virtual memory and object parallelism in pC++
NASA Technical Reports Server (NTRS)
Bodin, Francois; Priol, Thierry; Mehrotra, Piyush; Gannon, Dennis
1994-01-01
Fortran and C++ are the dominant programming languages used in scientific computation. Consequently, extensions to these languages are the most popular for programming massively parallel computers. We discuss two such approaches to parallel Fortran and one approach to C++. The High Performance Fortran Forum has designed HPF with the intent of supporting data parallelism on Fortran 90 applications. HPF works by asking the user to help the compiler distribute and align the data structures with the distributed memory modules in the system. Fortran-S takes a different approach in which the data distribution is managed by the operating system and the user provides annotations to indicate parallel control regions. In the case of C++, we look at pC++ which is based on a concurrent aggregate parallel model.
An oppositely charged insect exclusion screen with gap-free multiple electric fields
NASA Astrophysics Data System (ADS)
Matsuda, Yoshinori; Kakutani, Koji; Nonomura, Teruo; Kimbara, Junji; Kusakari, Shin-ichi; Osamura, Kazumi; Toyoda, Hideyoshi
2012-12-01
An electric field screen was constructed to examine insect attraction mechanisms in multiple electric fields generated inside the screen. The screen consisted of two parallel insulated conductor wires (ICWs) charged with equal but opposite voltages and two separate grounded nets connected to each other and placed on each side of the ICW layer. Insects released inside the fields were charged either positively or negatively as a result of electricity flow from or to the insect, respectively. The force generated between the charged insects and opposite ICW charges was sufficient to capture all insects.
Students' Adoption of Course-Specific Approaches to Learning in Two Parallel Courses
ERIC Educational Resources Information Center
Öhrstedt, Maria; Lindfors, Petra
2016-01-01
Research on students' adoption of course-specific approaches to learning in parallel courses is limited and inconsistent. This study investigated second-semester psychology students' levels of deep, surface and strategic approaches in two courses running in parallel within a real-life university setting. The results showed significant differences…
Insufficient evidence for the role of school dental screening in improving oral health.
Holmes, Richard D
2018-03-23
Data sources: The Cochrane Oral Health's Trials Register, the Cochrane Central Register of Controlled Trials (CENTRAL), Medline, Embase, the US National Institutes of Health Trials Registry (ClinicalTrials.gov) and the World Health Organization International Clinical Trials Registry Platform databases. Study selection: Randomised controlled trials (cluster or parallel) evaluating school dental screening compared with no intervention, or comparing one type of screening with another, were included. Data extraction and synthesis: Two reviewers independently abstracted data and assessed risk of bias. Risk ratios were calculated for dichotomous outcomes, with data pooled where appropriate. The GRADE approach was used to interpret findings. Results: Six trials involving 19,498 children were included. Two were considered to be at low risk of bias, three at unclear risk and one at high risk. No conclusions could be drawn from four studies comparing traditional screening versus no screening because the evidence was inconsistent. Two trials evaluating criteria-based screening versus no screening suggested a possible benefit; RR = 1.07 (95% CI 0.99-1.16). No difference was found when comparing criteria-based screening with traditional screening, RR = 1.01 (95% CI 0.94-1.08). No trials reported on long-term follow-up, cost-effectiveness or adverse events. Conclusions: The trials included in this review evaluated short-term effects of screening, assessing follow-up periods of three to eight months. We found very low-certainty evidence that was insufficient to allow us to draw conclusions about whether there is a role for traditional school dental screening in improving dental attendance. For criteria-based screening, we found low-certainty evidence that it may improve dental attendance when compared to no screening. However, when compared to traditional screening, there was no evidence of a difference in dental attendance (very low-certainty evidence). We found low-certainty evidence to conclude that personalised or specific referral letters improve dental attendance when compared to non-specific counterparts. We also found low-certainty evidence that screening supplemented with motivation (oral health education and offer of free treatment) improves dental attendance in comparison to screening alone. We did not find any trials addressing the cost-effectiveness and adverse effects of school dental screening.
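For readers unfamiliar with the effect measures quoted above, the sketch below shows how a risk ratio and its 95% confidence interval are obtained under the standard log-normal approximation; the counts are hypothetical and are not data from the included trials.

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs. group B with a 95% CI computed on the
    log scale (the usual large-sample approximation)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts: attendance after screening vs. no screening.
print(risk_ratio_ci(430, 1000, 400, 1000))  # RR slightly above 1
```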
Lü, Qiang; Xia, Xiao-Yan; Chen, Rong; Miao, Da-Jun; Chen, Sha-Sha; Quan, Li-Jun; Li, Hai-Ou
2012-01-01
Background Protein structure prediction (PSP), which is usually modeled as a computational optimization problem, remains one of the biggest challenges in computational biology. PSP encounters two difficult obstacles: the inaccurate energy function problem and the searching problem. Even if the searching procedure happens to find the lowest energy, the correct protein structure is not guaranteed to be obtained. Results A general parallel metaheuristic approach is presented to tackle the above two problems. Multiple energy functions are employed to simultaneously guide the parallel searching threads. Searching trajectories are in fact controlled by the parameters of the heuristic algorithms. The parallel approach allows the parameters to be perturbed while the searching threads are running in parallel, with each thread searching for the lowest energy value determined by an individual energy function. By hybridizing the intelligences of parallel ant colonies and Monte Carlo Metropolis search, this paper demonstrates an implementation of our parallel approach for PSP. Sixteen classical instances were tested to show that the parallel approach is competitive for solving the PSP problem. Conclusions This parallel approach combines various sources of both searching intelligence and energy functions, and thus predicts protein conformations with good quality jointly determined by all the parallel searching threads and energy functions. It provides a framework to combine different searching intelligences embedded in heuristic algorithms. It also constructs a container to hybridize different not-so-accurate objective functions, which are usually derived from domain expertise. PMID:23028708
Nonlinear Real-Time Optical Signal Processing
1990-09-01
This report covers nonlinear real-time optical signal processing for pattern recognition, the relationship of parallel computation paradigms to optical computing, and halftone screen techniques for implementing general nonlinear functions, including degradation and compensation models for nonlinear optical processing with halftones.
Iversen, Carol; Druggan, Patrick; Schumacher, Sandra; Lehner, Angelika; Feer, Claudia; Gschwend, Karl; Joosten, Han; Stephan, Roger
2008-01-01
A differential medium, “Cronobacter” screening broth, has been designed to complement agars based on hydrolysis of chromogenic α-glucopyranoside substrates. The broth was evaluated using 329 Enterobacteriaceae strains (229 target isolates), spiked/naturally contaminated samples, and a parallel comparison with current methods for raw materials, line/end products, and factory environment samples. PMID:18310415
Moberg, Andreas; Hansson, Eva; Boyd, Helen
2014-01-01
With the public availability of biochemical assays and screening data constantly increasing, new applications for data mining and method analysis are evolving in parallel. One example is the BioAssay Ontology (BAO) for systematic classification of assays based on screening setup and metadata annotations. In this article we report a high-throughput screen (HTS) against phospho-N-acetylmuramoyl-pentapeptide translocase (MraY), an attractive antibacterial drug target involved in peptidoglycan synthesis. The screen resulted in novel chemistry identification using a fluorescence resonance energy transfer assay. To address a subset of the false positive hits, a frequent hitter analysis was performed using an approach in which MraY hits were compared with hits from similar assays previously used for HTS. The MraY assay was annotated according to BAO, and three internal reference assays using a similar assay design and detection technology were identified. Analyzing the assays retrospectively, it was clear that the MraY assay and the three reference assays all showed high false positive rates in the primary HTS. In the case of MraY, false positives were efficiently identified by applying a method to correct for compound interference at the hit-confirmation stage. Frequent hitter analysis based on the three reference assays, which shared a similar assay method, identified additional false actives in the primary MraY assay as frequent hitters. This article demonstrates how assays annotated using BAO terms can be used to identify closely related reference assays, and that analysis based on these assays can clearly provide useful data to influence assay design, technology, and screening strategy. PMID:25415593
Concurrency-based approaches to parallel programming
NASA Technical Reports Server (NTRS)
Kale, L.V.; Chrisochoides, N.; Kohl, J.; Yelick, K.
1995-01-01
The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at the development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.
Shim, Youngseon; Kim, Hyung J; Jung, Younjoon
2012-01-01
Supercapacitors with two single-sheet graphene electrodes in the parallel-plate geometry are studied via molecular dynamics (MD) computer simulations. Pure 1-ethyl-3-methylimidazolium tetrafluoroborate (EMI+BF4-) and a 1.1 M solution of EMI+BF4- in acetonitrile are considered as prototypes of room-temperature ionic liquids (RTILs) and organic electrolytes. Electrolyte structure, charge density and the associated electric potential are investigated by varying the charges and separation of the two electrodes. Multiple charge layers formed in the electrolytes in the vicinity of the electrodes are found to screen the electrode surface charge almost completely. As a result, the supercapacitors show nearly ideal electric double-layer behavior, i.e., the electric potential is essentially flat across the entire electrolyte region, except for sharp changes in the screening zones very close to the electrodes. Due to its small size and large charge separation, BF4- is considerably more efficient in shielding electrode charges than EMI+. In the case of the acetonitrile solution, acetonitrile also plays an important role by aligning its dipoles near the electrodes; however, the overall screening mainly arises from the ions. Because of this disparity in shielding efficiency between cations and anions, the capacitance of the positively charged anode is significantly larger than that of the negatively charged cathode. Therefore, the total cell capacitance in the parallel-plate configuration is primarily governed by the cathode. Ion conductivity obtained via the Green-Kubo (GK) method is found to be largely independent of the electrode surface charge. Interestingly, EMI+BF4- shows higher GK ion conductivity than the 1.1 M acetonitrile solution between two parallel-plate electrodes.
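The Green-Kubo route mentioned above integrates the autocorrelation of the collective charge current, sigma = 1/(3 V kB T) * ∫⟨J(0)·J(t)⟩ dt. A minimal sketch of that post-processing step, assuming SI units and a brute-force average over time origins (production analyses would use FFT-based correlation and careful truncation of the integral):

```python
import numpy as np

def green_kubo_conductivity(J, dt, volume, temperature, kB=1.380649e-23):
    """Ionic conductivity from a collective charge-current time series.
    J: (n_steps, 3) array of the charge current; dt: timestep (s)."""
    n = len(J)
    max_lag = n // 2
    # Autocorrelation <J(0).J(t)>, averaged over all time origins.
    acf = np.array([
        np.mean(np.sum(J[:n - lag] * J[lag:], axis=1))
        for lag in range(max_lag)
    ])
    integral = np.trapz(acf, dx=dt)
    return integral / (3.0 * volume * kB * temperature)

# Toy usage with random noise standing in for simulation output.
rng = np.random.default_rng(0)
print(green_kubo_conductivity(rng.normal(size=(2000, 3)), dt=1e-15,
                              volume=1e-26, temperature=300.0))
```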
Manner, Suvi; Fallarero, Adyary
2018-05-03
Owing to the failure of conventional antibiotics in biofilm control, alternative approaches are urgently needed. Inhibition of quorum sensing (QS) represents an attractive target since it is involved in several processes essential for biofilm formation. In this study, a compound library of natural product derivatives (n = 3040) was screened for anti-quorum sensing activity using Chromobacterium violaceum as reporter bacteria. Screening assays, based on QS-mediated violacein production and viability, were performed in parallel to identify non-bactericidal QS inhibitors (QSIs). Nine highly active QSIs were identified, while 328 compounds were classified as moderately active and 2062 compounds as inactive. Re-testing of the highly active compounds at a lower concentration against C. violaceum, complemented by a literature search, led to the identification of two flavonoid derivatives as the most potent QSIs, and their impact on biofilm maturation in Escherichia coli and Pseudomonas aeruginosa was further investigated. Finally, the effects of these leads on the swimming and swarming motility of P. aeruginosa were quantified. The identified flavonoids affected all the studied QS-related functions at micromolar concentrations. These compounds can serve as starting points for further optimization and development of more potent QSIs as adjunctive agents used with antibiotics in the treatment of biofilms.
An integrated microfludic device for culturing and screening of Giardia lamblia.
Zheng, Guo-Xia; Zhang, Xue-Mei; Yang, Yu-Suo; Zeng, Shu-Rui; Wei, Jun-Feng; Wang, Yun-Hua; Li, Ya-Jie
2014-02-01
In vitro culturing of trophozoites is important for research on Giardia lamblia (G. lamblia), especially in the discovery of anti-Giardia agents. Current culture methods are labor-intensive or face obstacles in standardizing the gas conditions, and could therefore benefit from a more streamlined and integrated approach. Microfluidics offers a way to accomplish this goal. Here we present an integrated microfluidic device for culturing and screening of G. lamblia. The device consisted of a polydimethylsiloxane (PDMS) microchip with an aerobic culture system. In the microchip, the integration of a concentration gradient generator (CGG) with micro-scale cell culture enables dose-response experiments to be performed in a simple and reagent-saving way. The diffusion-based culture chambers allowed G. lamblia to be grown in an in vivo-like environment. Notably, the highly air-permeable material of the parallel chambers easily maintains a uniform anaerobic environment across the different chambers. Using this device, G. lamblia were successfully cultured and stressed on-chip. In all cases, a dose-related inhibitory response was detected. The application of this device for these purposes represents the first step in developing a completely integrated microfluidic platform for high-throughput screening, and might be extended to other assays based on in vitro culture of G. lamblia with further testing. Copyright © 2013 Elsevier Inc. All rights reserved.
Parallel processing considerations for image recognition tasks
NASA Astrophysics Data System (ADS)
Simske, Steven J.
2011-01-01
Many image recognition tasks are well-suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images. From this standpoint, then, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing that will be considered in this paper: parallel processing (1) by task; (2) by image region; and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows, as diverse as optical character recognition (OCR), document classification and barcode reading, to parallel pipelines. This can substantially decrease the time to completion for the document tasks. For this approach, each parallel pipeline is generally performing a different task. Parallel processing by image region allows a larger imaging task to be sub-divided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach. Examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
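As a concrete illustration of the by-region category, the sketch below splits an image into strips, runs the same analysis on every strip in parallel (the map step), and combines the partial results (the reduce step). The dark-pixel count is an arbitrary stand-in for a real task such as skew detection.

```python
from multiprocessing import Pool
import numpy as np

def count_dark_pixels(tile):
    """Map step: the same analysis runs independently on each region."""
    return int((tile < 64).sum())

def analyze_by_region(image, n_strips=8):
    tiles = np.array_split(image, n_strips, axis=0)  # one item per strip
    with Pool() as pool:
        partials = pool.map(count_dark_pixels, tiles)  # map
    return sum(partials)                               # reduce

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(4096, 4096), dtype=np.uint8)
    print(analyze_by_region(img))
```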
NASA Astrophysics Data System (ADS)
Rizki, Permata Nur Miftahur; Lee, Heezin; Lee, Minsu; Oh, Sangyoon
2017-01-01
With the rapid advance of remote sensing technology, the amount of three-dimensional point-cloud data has increased extraordinarily, requiring faster processing in the construction of digital elevation models. There have been several attempts to accelerate the computation using parallel methods; however, little attention has been given to investigating different approaches for selecting the parallel programming model best suited to a given computing environment. We present our findings and insights identified by implementing three popular high-performance parallel approaches (message passing interface, MapReduce, and GPGPU) on the time-demanding but accurate kriging interpolation. The performances of the approaches are compared by varying the size of the grid and input data. In our empirical experiment, we demonstrate significant acceleration by all three approaches compared to a C-implemented sequential-processing method. In addition, we also discuss the pros and cons of each method in terms of usability, complexity, infrastructure, and platform limitations, to give readers a better understanding of utilizing those parallel approaches for gridding purposes.
Fast parallel approach for 2-D DHT-based real-valued discrete Gabor transform.
Tao, Liang; Kwan, Hon Keung
2009-12-01
Two-dimensional fast Gabor transform algorithms are useful for real-time applications due to the high computational complexity of the traditional 2-D complex-valued discrete Gabor transform (CDGT). This paper presents two block time-recursive algorithms for the 2-D DHT-based real-valued discrete Gabor transform (RDGT) and its inverse transform, and develops a fast parallel approach for the implementation of the two algorithms. The computational complexity of the proposed parallel approach is analyzed and compared with that of the existing 2-D CDGT algorithms. The results indicate that the proposed parallel approach is attractive for real-time image processing.
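The RDGT's efficiency rests on the discrete Hartley transform (DHT), which is real-valued and obtainable directly from an FFT via DHT(x) = Re(FFT(x)) - Im(FFT(x)). The sketch below shows that identity, using the separable row-column convention for the 2-D case; the paper's block time-recursive RDGT algorithms themselves are not reproduced here.

```python
import numpy as np

def dht(x, axis=-1):
    """1-D discrete Hartley transform (cas kernel) via the FFT identity
    DHT(x) = Re(FFT(x)) - Im(FFT(x))."""
    X = np.fft.fft(x, axis=axis)
    return X.real - X.imag

def dht2(img):
    """Separable row-column 2-D DHT (one common 2-D convention)."""
    return dht(dht(img, axis=0), axis=1)

# The DHT is its own inverse up to a scale factor of N per axis.
img = np.random.rand(8, 8)
print(np.allclose(dht2(dht2(img)) / img.size, img))  # True
```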
2010-01-01
Background Australia has a comparatively high incidence of colorectal (bowel) cancer; however, population screening uptake using the faecal occult blood test (FOBT) remains low. This study will determine the impact on screening participation of a novel, Internet-based Personalised Decision Support (PDS) package. The PDS is designed to measure attitudes and cognitive concerns and provide people with individually tailored information, in real time, that will assist them with making a decision to screen. The hypothesis is that exposure to (tailored) PDS will result in greater participation in screening than participation following exposure to non-tailored PDS or resulting from the current non-tailored, paper-based approach. Methods/design A randomised parallel trial comprising three arms will be conducted. Men and women aged 50-74 years (N = 3240) will be recruited. They must have access to the Internet; have not had an FOBT within the previous 12 months, or sigmoidoscopy or colonoscopy within the previous 5 years; and have had no clinical diagnosis of bowel cancer. Groups 1 and 2 (PDS arms) will access a website and complete a baseline survey measuring decision-to-screen stage, attitudes and cognitive concerns, and will receive immediate feedback; Group 1 will receive information 'tailored' to their responses in the baseline survey, and Group 2 will receive 'non-tailored' bowel cancer information. Respondents in both groups will subsequently receive an FOBT kit. Group 3 (usual practice arm) will complete a paper-based version of the baseline survey, and respondents will subsequently receive 'non-tailored' paper-based bowel cancer information with an accompanying FOBT kit. Following despatch of FOBTs, all respondents will be requested to complete an endpoint survey. Main outcome measures are (1) completion of FOBT and (2) change in decision-to-screen stage. Secondary outcomes include satisfaction with decision and change in attitudinal scores from baseline to endpoint. Analyses will be performed using Chi-square tests, analysis of variance and log-binomial generalized linear models as appropriate. Discussion It is necessary to restrict participants to Internet users to provide an appropriately controlled evaluation of PDS. Once the efficacy of the approach has been established, it will be important to evaluate effectiveness in the wider at-risk population, and to identify barriers to its implementation in those settings. Trial registration Australian New Zealand Clinical Trials Registry ACTRN12610000095066 PMID:20843369
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jun; Liu, Guodong; Wu, Hong
2008-01-01
In this paper, we demonstrate an electrochemical high-throughput sensing platform for simple, sensitive detection of PSA based on QD labels. This sensing platform uses a microplate for immunoreactions and disposable screen-printed electrodes (SPE) for electrochemical stripping analysis of metal ions released from QD labels. With the 96-well microplate, capture antibodies are conveniently immobilized on the well surface, and the process of immunoreaction is easily controlled. The sandwich complexes formed on the well surface are also easily isolated from reaction solutions. In particular, a microplate-based electrochemical assay makes it feasible to conduct a parallel analysis of several samples or multiple protein markers. This assay offers a number of advantages, including (1) simplicity and cost-effectiveness, (2) high sensitivity, (3) the capability to sense multiple samples or targets in parallel, and (4) a potentially portable device with an SPE array implanted in the microplate. The PSA assay is sensitive because it uses two amplification processes: (1) QDs as a label for enhancing the electrical signal, since the secondary antibodies are linked to QDs that contain a large number of metal atoms, and (2) the inherent signal amplification of electrochemical stripping analysis, in which metal ions are preconcentrated onto the electrode surface. Therefore, the high sensitivity of this method, stemming from dual signal amplification via QD labels and pre-concentration, allows low concentration levels to be detected while using small sample volumes. Thus, this QD-based electrochemical detection approach offers a simple, rapid, cost-effective, and high-throughput assay of PSA.
Compaction die for forming a solid annulus on a right circular cylinder. [Patent application
Harlow, J.L.
1981-09-14
A compacting die is disclosed wherein the improvement comprises providing a screen in the die cavity, the screen being positioned parallel to the side walls of said die and dividing the die cavity into center and annular compartments. In addition, the use of this die in a method for producing an annular clad ceramic fuel material is disclosed.
Page, Tessa; Nguyen, Huong Thi Huynh; Hilts, Lindsey; Ramos, Lorena; Hanrahan, Grady
2012-06-01
This work reveals a computational framework for parallel electrophoretic separation of complex biological macromolecules and model urinary metabolites. More specifically, the implementation of a particle swarm optimization (PSO) algorithm on a neural network platform for multiparameter optimization of multiplexed 24-capillary electrophoresis technology with UV detection is highlighted. Two experimental systems were examined: (1) separation of purified rabbit metallothioneins and (2) separation of model toluene urinary metabolites and selected organic acids. The results proved superior to those obtained using neural networks employing standard back propagation when examining training error, fitting response, and predictive ability. Simulation runs were obtained as a result of metaheuristic examination of the global search space, with experimental responses in good agreement with predicted values. Full separation of selected analytes was realized after employing optimal model conditions. This framework provides guidance for the application of metaheuristic computational tools to aid in future studies involving parallel chemical separation and screening. Adaptable pseudo-code is provided to enable users of varied software packages and modeling frameworks to implement the PSO algorithm for their desired use.
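The PSO core driving such a framework follows the standard velocity and position updates; in the sketch below, the inertia and acceleration coefficients are conventional illustrative values, and a toy quadratic objective stands in for the trained neural-network response surface.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = map(np.asarray, bounds)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Velocity update: inertia + pull toward personal and global bests.
        v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy objective standing in for the NN-predicted separation quality.
print(pso(lambda p: np.sum((p - 3.0)**2), ([0.0, 0.0], [10.0, 10.0])))
```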
Transputer parallel processing at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1989-01-01
The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.
Reiman, Anne; Pandey, Sarojini; Lloyd, Kate L; Dyer, Nigel; Khan, Mike; Crockard, Martin; Latten, Mark J; Watson, Tracey L; Cree, Ian A; Grammatopoulos, Dimitris K
2016-11-01
Background Detection of disease-associated mutations in patients with familial hypercholesterolaemia is crucial for early interventions to reduce the risk of cardiovascular disease. Screening for these mutations represents a methodological challenge, since more than 1200 different causal mutations in the low-density lipoprotein receptor have been identified. A number of methodological approaches have been developed for screening by clinical diagnostic laboratories. Methods Using primers targeting the low-density lipoprotein receptor, apolipoprotein B, and proprotein convertase subtilisin/kexin type 9, we developed a novel Ion Torrent-based targeted re-sequencing method. We validated this in a small West Midlands (UK) cohort of 58 patients screened in parallel with other mutation-targeting methods, such as multiplex polymerase chain reaction (Elucigene FH20), oligonucleotide arrays (Randox familial hypercholesterolaemia array) or the Illumina next-generation sequencing platform. Results In this small cohort, the next-generation sequencing method achieved excellent analytical performance characteristics and showed 100% and 89% concordance with the Randox array and the Elucigene FH20 assay, respectively. Investigation of the discrepant results identified two cases of mutation misclassification by the Elucigene FH20 multiplex polymerase chain reaction assay. A number of novel mutations not previously reported were also identified by the next-generation sequencing method. Conclusions Ion Torrent-based next-generation sequencing can deliver a suitable alternative for the molecular investigation of familial hypercholesterolaemia patients, especially when comprehensive mutation screening for rare or unknown mutations is required.
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.
Dissecting Immune Circuits by Linking CRISPR-Pooled Screens with Single-Cell RNA-Seq.
Jaitin, Diego Adhemar; Weiner, Assaf; Yofe, Ido; Lara-Astiaso, David; Keren-Shaul, Hadas; David, Eyal; Salame, Tomer Meir; Tanay, Amos; van Oudenaarden, Alexander; Amit, Ido
2016-12-15
In multicellular organisms, dedicated regulatory circuits control cell type diversity and responses. The crosstalk and redundancies within these circuits and substantial cellular heterogeneity pose a major research challenge. Here, we present CRISP-seq, an integrated method for massively parallel single-cell RNA sequencing (RNA-seq) and clustered regularly interspaced short palindromic repeats (CRISPR)-pooled screens. We show that profiling the genomic perturbation and transcriptome in the same cell enables us to simultaneously elucidate the function of multiple factors and their interactions. We applied CRISP-seq to probe regulatory circuits of innate immunity. By sampling tens of thousands of perturbed cells in vitro and in mice, we identified interactions and redundancies between developmental and signaling-dependent factors. These include opposing effects of Cebpb and Irf8 in regulating the monocyte/macrophage versus dendritic cell lineages and differential functions for Rela and Stat1/2 in monocyte versus dendritic cell responses to pathogens. This study establishes CRISP-seq as a broadly applicable, comprehensive, and unbiased approach for elucidating mammalian regulatory circuits. Copyright © 2016 Elsevier Inc. All rights reserved.
High-Throughput, Motility-Based Sorter for Microswimmers such as C. elegans
Yuan, Jinzhou; Zhou, Jessie; Raizen, David M.; Bau, Haim H.
2015-01-01
Animal motility varies with genotype, disease, aging, and environmental conditions. In many studies, it is desirable to carry out high throughput motility-based sorting to isolate rare animals for, among other things, forward genetic screens to identify genetic pathways that regulate phenotypes of interest. Many commonly used screening processes are labor-intensive, lack sensitivity, and require extensive investigator training. Here, we describe a sensitive, high throughput, automated, motility-based method for sorting nematodes. Our method is implemented in a simple microfluidic device capable of sorting thousands of animals per hour per module, and is amenable to parallelism. The device successfully enriches for known C. elegans motility mutants. Furthermore, using this device, we isolate low-abundance mutants capable of suppressing the somnogenic effects of the flp-13 gene, which regulates C. elegans sleep. By performing genetic complementation tests, we demonstrate that our motility-based sorting device efficiently isolates mutants for the same gene identified by tedious visual inspection of behavior on an agar surface. Therefore, our motility-based sorter is capable of performing high throughput gene discovery approaches to investigate fundamental biological processes. PMID:26008643
Peptide mediators of cholesterol efflux
Bielicki, John K.; Johansson, Jan
2013-04-09
The present invention provides a family of non-naturally occurring polypeptides having cholesterol efflux activity that parallels that of full-length apolipoproteins (e.g., Apo AI and Apo E), and having high selectivity for ABCA1 that parallels that of full-length apolipoproteins. The invention also provides compositions comprising such polypeptides, methods of identifying, screening and synthesizing such polypeptides, and methods of treating, preventing or diagnosing diseases and disorders associated with dyslipidemia, hypercholesterolemia and inflammation.
Potent and selective mediators of cholesterol efflux
Bielicki, John K; Johansson, Jan
2015-03-24
The present invention provides a family of non-naturally occurring polypeptides having cholesterol efflux activity that parallels that of full-length apolipoproteins (e.g., Apo AI and Apo E), and having high selectivity for ABCA1 that parallels that of full-length apolipoproteins. The invention also provides compositions comprising such polypeptides, methods of identifying, screening and synthesizing such polypeptides, and methods of treating, preventing or diagnosing diseases and disorders associated with dyslipidemia, hypercholesterolemia and inflammation.
THE EFFECT OF TWO-MAGNON SCATTERING ON PARALLEL-PUMP INSTABILITY THRESHOLDS.
Following a general description of the important properties and symmetries of the parallel-pump coupling and of two-magnon scattering, several theoretical approaches to the effect of two-magnon scattering on the parallel-pump instability threshold are explored.
Studies in optical parallel processing. [All optical and electro-optic approaches
NASA Technical Reports Server (NTRS)
Lee, S. H.
1978-01-01
Threshold and A/D devices for converting a gray-scale image into a binary one were investigated for all-optical and opto-electronic approaches to parallel processing. Integrated optical logic circuits (IOC) and optical parallel logic devices (OPAL) were studied as an approach to processing optical binary signals. In the IOC logic scheme, a single row of an optical image is coupled into the IOC substrate at a time through an array of optical fibers. Parallel processing is carried out on each image element of these rows in the IOC substrate, and the resulting output exits via a second array of optical fibers. The OPAL system for parallel processing, which uses a Fabry-Perot interferometer for image thresholding and analog-to-digital conversion, achieves a higher degree of parallel processing than is possible with IOC.
Sentence alignment using feed forward neural network.
Fattah, Mohamed Abdel; Ren, Fuji; Kuroiwa, Shingo
2006-12-01
Parallel corpora have become an essential resource for work in multilingual natural language processing. However, sentence-aligned parallel corpora are more efficient than non-aligned parallel corpora for cross-language information retrieval and machine translation applications. In this paper, we present a new approach to aligning sentences in bilingual parallel corpora based on a feed-forward neural network classifier. A feature parameter vector is extracted from the text pair under consideration. This vector contains text features such as length, punctuation score, and cognate score values. A set of manually prepared training data was used to train the feed-forward neural network, and another set of data was used for testing. Using this new approach, we achieved an error reduction of 60% over the length-based approach when applied to English-Arabic parallel documents. Moreover, this new approach is valid for any language pair, and it is quite flexible, since the feature parameter vector may contain more, fewer, or different features than those we used in our system.
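A minimal sketch of this pipeline: compute a small feature vector for each candidate sentence pair, then train a feed-forward classifier on labeled pairs. The feature definitions below are simplified stand-ins for the paper's length, punctuation, and cognate scores, and scikit-learn's MLPClassifier stands in for the authors' network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

PUNCT = set(".,;:!?")

def features(src, tgt):
    length_ratio = len(src) / max(len(tgt), 1)
    punct_ratio = (sum(c in PUNCT for c in src) + 1) / \
                  (sum(c in PUNCT for c in tgt) + 1)
    # Crude cognate stand-in: Jaccard overlap of word sets.
    s, t = set(src.lower().split()), set(tgt.lower().split())
    cognate = len(s & t) / max(len(s | t), 1)
    return [length_ratio, punct_ratio, cognate]

# Tiny made-up training set: 1 = aligned pair, 0 = misaligned pair.
pairs = [("The cat sleeps.", "The cat sleeps here.", 1),
         ("Hello, world!", "Hello, world!", 1),
         ("A short line.", "This sentence is unrelated and far longer.", 0),
         ("Numbers: 1, 2, 3.", "No overlap whatsoever", 0)]
X = np.array([features(s, t) for s, t, _ in pairs])
y = np.array([label for _, _, label in pairs])

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, y)
print(clf.predict([features("Good morning.", "Good morning, friend.")]))
```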
Evolving binary classifiers through parallel computation of multiple fitness cases.
Cagnoni, Stefano; Bergenti, Federico; Mordonini, Monica; Adorni, Giovanni
2005-06-01
This paper describes two versions of a novel approach to developing binary classifiers, based on two evolutionary computation paradigms: cellular programming and genetic programming. Such an approach achieves high computational efficiency both during evolution and at runtime. Evolution speed is optimized by allowing multiple solutions to be computed in parallel. Runtime performance is optimized explicitly through parallel computation in the case of cellular programming, or implicitly by taking advantage of the intrinsic parallelism of bitwise operators on standard sequential architectures in the case of genetic programming. The approach was tested on a digit recognition problem and compared with a reference classifier.
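The "intrinsic parallelism of bitwise operators" refers to packing one fitness case per bit of a machine word, so that a single AND, OR, or XOR evaluates all cases at once (the sub-machine-code idea). A toy sketch of that evaluation trick under those assumptions:

```python
import random

N_CASES = 64          # one fitness case per bit of a 64-bit word
random.seed(0)

def pack(bits):
    """Pack a list of 0/1 values into one integer, bit i = case i."""
    word = 0
    for i, b in enumerate(bits):
        word |= b << i
    return word

a = pack([random.randint(0, 1) for _ in range(N_CASES)])
b = pack([random.randint(0, 1) for _ in range(N_CASES)])
target = pack([random.randint(0, 1) for _ in range(N_CASES)])

# A candidate classifier; in GP this would be an evolved expression tree
# of bitwise operators, evaluated on all 64 cases in a few instructions.
prediction = (a & b) | (a ^ b)

# Fitness = number of correctly classified cases = number of equal bits.
matches = ~(prediction ^ target) & ((1 << N_CASES) - 1)
print(bin(matches).count("1"), "of", N_CASES, "cases correct")
```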
A conservative approach to parallelizing the Sharks World simulation
NASA Technical Reports Server (NTRS)
Nicol, David M.; Riffe, Scott E.
1990-01-01
The parallelization of a benchmark problem for parallel simulation, the Sharks World, is described. The solution described is conservative, in the sense that no state information is saved and no 'rollbacks' occur. The approach illustrates both the principal advantage and the principal disadvantage of conservative parallel simulation. The advantage is that, by exploiting lookahead, an approach was found that dramatically improves on the serial execution time and also achieves excellent speedups. The disadvantage is that if the model rules are changed in such a way that the lookahead is destroyed, it is difficult to modify the solution to accommodate the changes.
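The conservative rule at work here: a logical process may consume an event only when no neighbor can still send one with an earlier timestamp, i.e. up to the minimum neighbor clock plus the model's lookahead, so rollback is never needed. A toy, sequential emulation of that rule (the event times and lookahead value are illustrative):

```python
import heapq

LOOKAHEAD = 5.0  # model guarantee: no process affects a neighbor sooner

class LogicalProcess:
    def __init__(self, name):
        self.name, self.clock, self.queue = name, 0.0, []

    def step(self, neighbors):
        # Safe horizon: earliest time a neighbor could still send to us.
        bound = min(n.clock for n in neighbors) + LOOKAHEAD
        while self.queue and self.queue[0] <= bound:
            self.clock = heapq.heappop(self.queue)
            print(f"{self.name}: processed event at t={self.clock}")

p, q = LogicalProcess("P"), LogicalProcess("Q")
for t in (1.0, 3.0, 8.0):
    heapq.heappush(p.queue, t)
heapq.heappush(q.queue, 2.0)
for _ in range(3):
    p.step([q]); q.step([p])
# P's event at t=8.0 stays blocked until Q's clock advances far enough:
# waiting, rather than rollback, is the price of conservatism.
```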
Modelling parallel programs and multiprocessor architectures with AXE
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.
1991-01-01
AXE, An Experimental Environment for Parallel Systems, was designed to model and simulate parallel systems at the process level. It provides an integrated environment for specifying computation models, multiprocessor architectures, data collection, and performance visualization. AXE is being used at NASA-Ames for developing resource management strategies and parallel problem formulations, and for studying multiprocessor architectures and operating system issues related to the High Performance Computing and Communications Program. AXE's simple, structured user interface enables the user to model parallel programs and machines precisely and efficiently. Its quick turn-around time keeps the user interested and productive. AXE models multicomputers. The user may easily modify various architectural parameters, including the number of sites, connection topologies, and overhead for operating system activities. Parallel computations in AXE are represented as collections of autonomous computing objects known as players; their use and behavior are described. Performance data for the multiprocessor model can be observed on a color screen, including CPU and message-routing bottlenecks and the dynamic status of the software.
A Parallel Adaboost-Backpropagation Neural Network for Massive Image Dataset Classification
NASA Astrophysics Data System (ADS)
Cao, Jianfang; Chen, Lichao; Wang, Min; Shi, Hao; Tian, Yun
2016-12-01
Image classification uses computers to simulate human understanding and cognition of images by automatically categorizing images. This study proposes a faster image classification approach that parallelizes the traditional Adaboost-Backpropagation (BP) neural network using the MapReduce parallel programming model. First, we construct a strong classifier by assembling the outputs of 15 BP neural networks (which are individually regarded as weak classifiers) based on the Adaboost algorithm. Second, we design Map and Reduce tasks for both the parallel Adaboost-BP neural network and the feature extraction algorithm. Finally, we establish an automated classification model by building a Hadoop cluster. We use the Pascal VOC2007 and Caltech256 datasets to train and test the classification model. The results are superior to those obtained using traditional Adaboost-BP neural network or parallel BP neural network approaches. Our approach increased the average classification accuracy rate by approximately 14.5% and 26.0% compared to the traditional Adaboost-BP neural network and parallel BP neural network, respectively. Furthermore, the proposed approach requires less computation time and scales very well as evaluated by speedup, sizeup and scaleup. The proposed approach may provide a foundation for automated large-scale image classification and demonstrates practical value.
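The strong classifier in the first step is Adaboost's weighted vote over the 15 weak learners. A minimal sketch of that combination step, with random stand-ins for the trained BP networks' outputs (the weighting formula is standard discrete Adaboost; nothing here reproduces the paper's MapReduce code):

```python
import numpy as np

def adaboost_combine(weak_outputs, errors):
    """Weighted-vote combination of weak classifiers.
    weak_outputs: (n_weak, n_samples) array of +/-1 predictions, each
    row standing in for one trained BP network; errors: each learner's
    weighted training error (in (0, 0.5), so every weight is positive)."""
    errors = np.asarray(errors, dtype=float)
    alphas = 0.5 * np.log((1 - errors) / errors)  # standard Adaboost weight
    scores = alphas @ np.asarray(weak_outputs)    # weighted vote per sample
    return np.sign(scores)

rng = np.random.default_rng(1)
outputs = rng.choice([-1, 1], size=(15, 6))   # 15 weak learners, 6 samples
errs = rng.uniform(0.1, 0.45, size=15)        # more accurate => larger alpha
print(adaboost_combine(outputs, errs))
```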
A multi-model approach to nucleic acid-based drug development.
Gautherot, Isabelle; Sodoyer, Régis
2004-01-01
With the advent of functional genomics and the shift of interest towards sequence-based therapeutics, the past decades have witnessed intense research efforts on nucleic acid-mediated gene regulation technologies. Today, RNA interference is emerging as a groundbreaking discovery, holding promise for development of genetic modulators of unprecedented potency. Twenty-five years after the discovery of antisense RNA and ribozymes, gene control therapeutics are still facing developmental difficulties, with only one US FDA-approved antisense drug currently available in the clinic. Limited predictability of target site selection models is recognized as one major stumbling block that is shared by all of the so-called complementary technologies, slowing the progress towards a commercial product. Currently employed in vitro systems for target site selection include RNase H-based mapping, antisense oligonucleotide microarrays, and functional screening approaches using libraries of catalysts with randomized target-binding arms to identify optimal ribozyme/DNAzyme cleavage sites. Individually, each strategy has its drawbacks from a drug development perspective. Utilization of message-modulating sequences as therapeutic agents requires that their action on a given target transcript meets criteria of potency and selectivity in the natural physiological environment. In addition to sequence-dependent characteristics, other factors will influence annealing reactions and duplex stability, as well as nucleic acid-mediated catalysis. Parallel consideration of physiological selection systems thus appears essential for screening for nucleic acid compounds proposed for therapeutic applications. Cellular message-targeting studies face issues relating to efficient nucleic acid delivery and appropriate analysis of response. For reliability and simplicity, prokaryotic systems can provide a rapid and cost-effective means of studying message targeting under pseudo-cellular conditions, but such approaches also have limitations. To streamline nucleic acid drug discovery, we propose a multi-model strategy integrating high-throughput-adapted bacterial screening, followed by reporter-based and/or natural cellular models and potentially also in vitro assays for characterization of the most promising candidate sequences, before final in vivo testing.
Combinatorial Drug Screening Identifies Ewing Sarcoma-specific Sensitivities.
Radic-Sarikas, Branka; Tsafou, Kalliopi P; Emdal, Kristina B; Papamarkou, Theodore; Huber, Kilian V M; Mutz, Cornelia; Toretsky, Jeffrey A; Bennett, Keiryn L; Olsen, Jesper V; Brunak, Søren; Kovar, Heinrich; Superti-Furga, Giulio
2017-01-01
Improvements in survival for Ewing sarcoma pediatric and adolescent patients have been modest over the past 20 years. Combinations of anticancer agents endure as an option to overcome resistance to single treatments caused by compensatory pathways. Moreover, combinations are thought to lessen any associated adverse side effects through reduced dosing, which is particularly important in childhood tumors. Using a parallel phenotypic combinatorial screening approach of cells derived from three pediatric tumor types, we identified Ewing sarcoma-specific interactions of a diverse set of targeted agents including approved drugs. We were able to retrieve highly synergistic drug combinations specific for Ewing sarcoma and identified signaling processes important for Ewing sarcoma cell proliferation determined by EWS-FLI1. We generated a molecular target profile of PKC412, a multikinase inhibitor with strong synergistic propensity in Ewing sarcoma, revealing its targets in critical Ewing sarcoma signaling routes. Using a multilevel experimental approach including quantitative phosphoproteomics, we analyzed the molecular rationale behind the disease-specific synergistic effect of simultaneous application of PKC412 and IGF1R inhibitors. The mechanism of the drug synergy between these inhibitors is different from the sum of the mechanisms of the single agents. The combination effectively inhibited pathway crosstalk and averted feedback loop repression, in an EWS-FLI1-dependent manner. Mol Cancer Ther; 16(1); 88-101. ©2016 AACR.
Bengali, Aditya N; Tessier, Peter M
2009-10-01
"Reversible" protein interactions govern diverse biological behavior ranging from intracellular transport and toxic protein aggregation to protein crystallization and inactivation of protein therapeutics. Much less is known about weak protein interactions than their stronger counterparts since they are difficult to characterize, especially in a parallel format (in contrast to a sequential format) necessary for high-throughput screening. We have recently introduced a highly efficient approach of characterizing protein self-association, namely self-interaction nanoparticle spectroscopy (SINS; Tessier et al., 2008; J Am Chem Soc 130:3106-3112). This approach exploits the separation-dependent optical properties of gold nanoparticles to detect weak self-interactions between proteins immobilized on nanoparticles. A limitation of our previous work is that differences in the sequence and structure of proteins can lead to significant differences in their affinity to adsorb to nanoparticle surfaces, which complicates analysis of the corresponding protein self-association behavior. In this work we demonstrate a highly specific approach for coating nanoparticles with proteins using biotin-avidin interactions to generate protein-nanoparticle conjugates that report protein self-interactions through changes in their optical properties. Using lysozyme as a model protein that is refractory to characterization by conventional SINS, we demonstrate that surface Plasmon wavelengths for gold-avidin-lysozyme conjugates over a range of solution conditions (i.e., pH and ionic strength) are well correlated with lysozyme osmotic second virial coefficient measurements. Since SINS requires orders of magnitude less protein and time than conventional methods (e.g., static light scattering), we envision this approach will find application in large screens of protein self-association aimed at either preventing (e.g., protein aggregation) or promoting (e.g., protein crystallization) these interactions. (c) 2009 Wiley Periodicals, Inc.
Parallel synthesis of a series of potentially brain penetrant aminoalkyl benzoimidazoles.
Micco, Iolanda; Nencini, Arianna; Quinn, Joanna; Bothmann, Hendrick; Ghiron, Chiara; Padova, Alessandro; Papini, Silvia
2008-03-01
Alpha7 agonists were identified via GOLD (CCDC) docking in the putative agonist binding site of an alpha7 homology model and a series of aminoalkyl benzoimidazoles was synthesised to obtain potentially brain penetrant drugs. The array was prepared starting from the reaction of ortho-fluoronitrobenzenes with a selection of diamines, followed by reduction of the nitro group to obtain a series of monoalkylated phenylene diamines. N,N'-Carbonyldiimidazole (CDI) mediated acylation, followed by a parallel automated work-up procedure, afforded the monoacylated phenylenediamines which were cyclised under acidic conditions. Parallel work-up and purification afforded the array products in good yields and purities with a robust parallel methodology which will be useful for other libraries. Screening for alpha7 activity revealed compounds with agonist activity for the receptor.
Discovery of novel human acrosin inhibitors by virtual screening
NASA Astrophysics Data System (ADS)
Liu, Xuefei; Dong, Guoqiang; Zhang, Jue; Qi, Jingjing; Zheng, Canhui; Zhou, Youjun; Zhu, Ju; Sheng, Chunquan; Lü, Jiaguo
2011-10-01
Human acrosin is an attractive target for the discovery of male contraceptive drugs. For the first time, structure-based drug design was applied to discover structurally diverse human acrosin inhibitors. A parallel virtual screening strategy in combination with pharmacophore-based and docking-based techniques was used to screen the SPECS database. From 16 compounds selected by virtual screening, a total of 10 compounds were found to be human acrosin inhibitors. Compound 2 was found to be the most potent hit (IC50 = 14 μM) and its binding mode was investigated by molecular dynamics simulations. The hit interacted with human acrosin mainly through hydrophobic and hydrogen-bonding interactions, which provided a good starting structure for further optimization studies.
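As a rough illustration of the parallel pharmacophore-plus-docking funnel described above, the sketch below filters a compound library by both criteria independently and intersects the survivors. The scores, thresholds, and the Compound record are invented placeholders, not the SPECS workflow itself.

```python
# Hypothetical sketch of a parallel virtual-screening funnel: pharmacophore
# fit and docking score are evaluated independently, then hits are merged.
from dataclasses import dataclass

@dataclass
class Compound:
    name: str
    pharmacophore_fit: float   # higher = better match to the pharmacophore
    docking_score: float       # lower = better predicted binding

library = [
    Compound("cpd-001", 0.91, -9.2),
    Compound("cpd-002", 0.45, -10.1),
    Compound("cpd-003", 0.88, -6.0),
    Compound("cpd-004", 0.80, -8.7),
]

def pharmacophore_hits(lib, min_fit=0.75):
    return {c.name for c in lib if c.pharmacophore_fit >= min_fit}

def docking_hits(lib, max_score=-8.0):
    return {c.name for c in lib if c.docking_score <= max_score}

# "Parallel" strategy: run both filters on the full library, then intersect.
selected = pharmacophore_hits(library) & docking_hits(library)
print(sorted(selected))   # -> ['cpd-001', 'cpd-004']
```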
Pengchit, Watcharaporn; Walters, Scott T.; Simmons, Rebecca G.; Kohlmann, Wendy; Burt, Randall W.; Schwartz, Marc D.; Kinney, Anita Y.
2011-01-01
Colorectal cancer (CRC) screening rates have been low despite effectiveness of screening in reducing CRC mortality. This article outlines the theoretical background and development of an innovative, telephone-based risk communication designed to promote screening among individuals at increased risk for familial CRC. This ongoing intervention integrates the Extended Parallel Process Model of fear management and the motivational interviewing counselling style. Tailoring and implementation intentions are incorporated. The primary outcome is self-reported colonoscopy within nine months following intervention. If proven effective, the remote intervention could be broadly disseminated to individuals at increased familial CRC risk, especially those in geographically underserved areas. PMID:21464114
Dalecki, Alex G; Malalasekera, Aruni P; Schaaf, Kaitlyn; Kutsch, Olaf; Bossmann, Stefan H; Wolschendorf, Frank
2016-04-01
The continuous rise of multi-drug resistant pathogenic bacteria has become a significant challenge for the health care system. In particular, novel drugs to treat infections of methicillin-resistant Staphylococcus aureus strains (MRSA) are needed, but traditional drug discovery campaigns have largely failed to deliver clinically suitable antibiotics. More than simply new drugs, new drug discovery approaches are needed to combat bacterial resistance. The recently described phenomenon of copper-dependent inhibitors has galvanized research exploring the use of metal-coordinating molecules to harness copper's natural antibacterial properties for therapeutic purposes. Here, we describe the results of the first concerted screening effort to identify copper-dependent inhibitors of Staphylococcus aureus. A standard library of 10 000 compounds was assayed for anti-staphylococcal activity, with hits defined as those compounds with a strict copper-dependent inhibitory activity. A total of 53 copper-dependent hit molecules were uncovered, similar to the copper independent hit rate of a traditionally executed campaign conducted in parallel on the same library. Most prominent was a hit family with an extended thiourea core structure, termed the NNSN motif. This motif resulted in copper-dependent and copper-specific S. aureus inhibition, while simultaneously being well tolerated by eukaryotic cells. Importantly, we could demonstrate that copper binding by the NNSN motif is highly unusual and likely responsible for the promising biological qualities of these compounds. A subsequent chemoinformatic meta-analysis of the ChEMBL chemical database confirmed the NNSNs as an unrecognized staphylococcal inhibitor, despite the family's presence in many chemical screening libraries. Thus, our copper-biased screen has proven able to discover inhibitors within previously screened libraries, offering a mechanism to reinvigorate exhausted molecular collections.
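The hit definition implied by the abstract (inhibition only in the presence of copper) is easy to state programmatically. Below is a minimal sketch; the growth-inhibition cutoff and the example assay values are assumptions for illustration only.

```python
# Sketch of the copper-dependent hit rule: a compound counts as a hit if it
# inhibits growth with copper present but not without it.
INHIBITION_CUTOFF = 0.5   # fractional growth below this = inhibited

assay = {
    # compound: (relative growth with Cu, relative growth without Cu)
    "cmp-A": (0.12, 0.95),   # inhibits only with copper -> copper-dependent hit
    "cmp-B": (0.10, 0.08),   # inhibits regardless -> classic (Cu-independent) hit
    "cmp-C": (0.90, 0.92),   # inactive
}

def copper_dependent_hits(results, cutoff=INHIBITION_CUTOFF):
    return [name for name, (with_cu, without_cu) in results.items()
            if with_cu < cutoff <= without_cu]

print(copper_dependent_hits(assay))   # -> ['cmp-A']
```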
Pereira, Gilberto de Araujo; Louzada-Neto, Francisco; Barbosa, Valdirene de Fátima; Ferreira-Silva, Márcia Maria; de Moraes-Souza, Helio
2012-01-01
The frequent occurrence of inconclusive serology in blood banks and the absence of a gold standard test for Chagas disease led us to examine the efficacy of the blood culture test and five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, as well as to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive serology screening in respect to some epidemiological variables. To obtain estimates of interest we considered a Bayesian latent class model with inclusion of covariates from the logit link. A better performance was observed with some categories of epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) presented as good alternatives for both screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology means that most of these donors probably have non-reactive serology. In addition, depending on the level of specific epidemiological variables, the absence of infection can be predicted with a probability of 100% in this group from the pairs of tests using parallel testing. The epidemiological variables can lead to improved test results and thus assist in the clarification of inconclusive serology screening results. Moreover, all combinations of pairs using the five commercial tests are good alternatives to confirm results.
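The "parallel" and "serial" pairing of tests quoted above follows standard test-combination algebra. As a hedged sketch (assuming conditional independence between tests, which the study's Bayesian latent class model does not require), the formulas look like this:

```python
# Standard two-test combination formulas (illustrative inputs, not the
# study's Bayesian latent-class estimates).

def parallel_combo(sens1, spec1, sens2, spec2):
    """Parallel testing: positive if EITHER test is positive."""
    sens = 1 - (1 - sens1) * (1 - sens2)   # misses only if both tests miss
    spec = spec1 * spec2                   # must be negative on both
    return sens, spec

def serial_combo(sens1, spec1, sens2, spec2):
    """Serial testing: positive only if BOTH tests are positive."""
    sens = sens1 * sens2
    spec = 1 - (1 - spec1) * (1 - spec2)   # false positive needs both to err
    return sens, spec

print(parallel_combo(0.98, 0.97, 0.99, 0.96))  # high sensitivity: screening
print(serial_combo(0.98, 0.97, 0.99, 0.96))    # high specificity: confirmation
```

This is why the abstract reports parallel testing for screening (sensitivity rises) and serial testing for confirmation (specificity rises).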
gWEGA: GPU-accelerated WEGA for molecular superposition and shape comparison.
Yan, Xin; Li, Jiabo; Gu, Qiong; Xu, Jun
2014-06-05
Virtual screening of a large chemical library for drug lead identification requires searching/superimposing a large number of three-dimensional (3D) chemical structures. This article reports a graphics processing unit (GPU)-accelerated weighted Gaussian algorithm (gWEGA) that expedites shape or shape-feature similarity score-based virtual screening. With 86 GPU nodes (each node has one GPU card), gWEGA can screen 110 million conformations derived from an entire ZINC drug-like database with diverse antidiabetic agents as query structures within 2 s (i.e., screening more than 55 million conformations per second). The rapid screening speed was accomplished through the massive parallelization on multiple GPU nodes and rapid prescreening of 3D structures (based on their shape descriptors and pharmacophore feature compositions). Copyright © 2014 Wiley Periodicals, Inc.
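For orientation, the quantity such algorithms accelerate is the Gaussian shape overlap between molecules (after Grant and Pickup's Gaussian model of molecular shape). The sketch below is a back-of-the-envelope version; the atom coordinates and the single shared width parameter are toy assumptions, and gWEGA's actual weighting scheme and GPU kernels are not reproduced here.

```python
# Toy Gaussian shape-overlap and shape-Tanimoto computation.
import numpy as np

def gaussian_overlap(xyz_a, xyz_b, alpha=0.836, p=2.7):
    """Pairwise Gaussian volume overlap between two sets of atom centers."""
    d2 = ((xyz_a[:, None, :] - xyz_b[None, :, :]) ** 2).sum(-1)  # squared distances
    k = alpha * alpha / (alpha + alpha)                          # = alpha / 2
    return float((p * p * np.exp(-k * d2) * (np.pi / (2 * alpha)) ** 1.5).sum())

def shape_tanimoto(xyz_a, xyz_b):
    """Shape similarity in [0, 1] from self- and cross-overlap volumes."""
    vaa = gaussian_overlap(xyz_a, xyz_a)
    vbb = gaussian_overlap(xyz_b, xyz_b)
    vab = gaussian_overlap(xyz_a, xyz_b)
    return vab / (vaa + vbb - vab)

mol_a = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
mol_b = np.array([[0.1, 0.1, 0.0], [1.4, 0.2, 0.0]])
print(round(shape_tanimoto(mol_a, mol_b), 3))
```

In a screening setting this score is evaluated for millions of query/library conformer pairs, which is what motivates the GPU parallelization described in the abstract.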
Developing science gateways for drug discovery in a grid environment.
Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra
2016-01-01
Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly we are faced with an increasing problem to provide the research community of life sciences with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service, applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.
Telemedicine optoelectronic biomedical data processing system
NASA Astrophysics Data System (ADS)
Prosolovska, Vita V.
2010-08-01
The telemedicine optoelectronic biomedical data processing system was created to share medical information in support of health care oversight and of timely, rapid responses to crises. The system includes the following main blocks: a bioprocessor, an analog-digital converter for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix-screen display of biomedical images. The rated temporal characteristics of the blocks are defined by the particular triggering optoelectronic couple in the analog-digital converters and by the imaging time of the matrix screen. The hardware element base for the developed matrix screen consists of integrated optoelectronic couples produced by selective epitaxy.
Isolated colorectal cancer screening or integrated cancer prevention? A provocative suggestion!
Stockbrugger, Reinhold
2012-01-01
Colorectal cancer (CRC) screening is still not fully established in the European Union. Recently, the United European Gastroenterology Federation (UEGF) supported CRC screening with the publication of quality guidelines and a written declaration in the European Parliament in favor of European-wide monitored CRC screening and primary prevention of CRC, the latter particularly in young citizens. In this article, the need for population-based CRC screening is once again stressed. In addition, the value of opportunistic CRC screening is pointed out, either as a regional or nation-wide alternative (such as in the USA and Germany) or as a 'forerunner' activity in view of subsequent population-based CRC screening. With regard to other parallel organ-related screening activities in Europe (breast, uterus) and the increasing need for primary prevention of malignant and benign diseases, the question is raised as to whether preventive activities should not be recognized as an integrated and logical part of a 'healthcare chain' offered to all European citizens. Copyright © 2012 S. Karger AG, Basel.
Maier, Andrew; Vincent, Melissa J; Parker, Ann; Gadagbui, Bernard K; Jayjock, Michael
2015-12-01
Asthma is a complex syndrome with significant consequences for those affected. The number of individuals affected is growing, although the reasons for the increase are uncertain. Ensuring the effective management of potential exposures follows from substantial evidence that exposure to some chemicals can increase the likelihood of asthma responses. We have developed a safety assessment approach tailored to the screening of asthma risks from residential consumer product ingredients as a proactive risk management tool. Several key features of the proposed approach advance the assessment resources often used for asthma issues. First, a quantitative health benchmark for asthma or related endpoints (irritation and sensitization) is provided that extends qualitative hazard classification methods. Second, a parallel structure is employed to include dose-response methods for asthma endpoints and methods for scenario specific exposure estimation. The two parallel tracks are integrated in a risk characterization step. Third, a tiered assessment structure is provided to accommodate different amounts of data for both the dose-response assessment (i.e., use of existing benchmarks, hazard banding, or the threshold of toxicological concern) and exposure estimation (i.e., use of empirical data, model estimates, or exposure categories). Tools building from traditional methods and resources have been adapted to address specific issues pertinent to asthma toxicology (e.g., mode-of-action and dose-response features) and the nature of residential consumer product use scenarios (e.g., product use patterns and exposure durations). A case study for acetic acid as used in various sentinel products and residential cleaning scenarios was developed to test the safety assessment methodology. In particular, the results were used to refine and verify relationships among tiered approaches such that each lower data tier in the approach provides a similar or greater margin of safety for a given scenario. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
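The risk characterization step described above boils down to comparing a hazard benchmark (whose source depends on the data tier) against a scenario exposure estimate. The sketch below shows the margin-of-safety arithmetic; all benchmark values, tier names, and the exposure number are hypothetical, not values from the cited assessment.

```python
# Illustrative margin-of-safety (MoS) calculation across assumed data tiers.
def margin_of_safety(benchmark_mg_m3, exposure_mg_m3):
    return benchmark_mg_m3 / exposure_mg_m3

tiers = {
    "tier1_existing_benchmark": 25.0,   # e.g., an established guidance value
    "tier2_hazard_band":        10.0,   # broader band, more conservative
    "tier3_ttc":                 2.5,   # least data, most conservative
}
exposure = 0.5  # modeled airborne concentration for a cleaning scenario (mg/m3)

for tier, benchmark in tiers.items():
    print(tier, "MoS =", margin_of_safety(benchmark, exposure))
```

The decreasing benchmarks illustrate the verification goal stated in the abstract: each lower data tier should yield a similar or greater margin of safety for the same scenario.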
Training the gastrointestinal endoscopy trainer.
Waschke, Kevin A; Anderson, John; Macintosh, Donald; Valori, Roland M
2016-06-01
Endoscopy training has traditionally been accomplished by an informal process in the endoscopy unit that parallels apprenticeship training seen in other areas of professional education. Subsequent to an audit, a series of interventions were implemented in the English National Health Service to support both service delivery and to improve endoscopy training. The resulting training centers deliver a variety of hands-on endoscopy courses, established in parallel with the roll out of a colon cancer screening program that monitors and documents quality outcomes among endoscopists. The program developed a 'training the trainer' module that subsequently became known as the Training the Colonoscopy Trainer course (TCT). Several years after its implementation, colonoscopy quality outcomes in the UK have improved substantially. The core TCT program has spread to other countries with demonstration of a marked impact on endoscopy training and performance. The aim of this chapter is to describe the principles that underlie effective endoscopy training in this program using the TCT as an example. While the review focuses on the specific example of colonoscopy training, the approach is generic to the teaching of any technical skill; it has been successfully transferred to the teaching of laparoscopic surgery as well as other endoscopic techniques. Copyright © 2016 Elsevier Ltd. All rights reserved.
Design Patterns to Achieve 300x Speedup for Oceanographic Analytics in the Cloud
NASA Astrophysics Data System (ADS)
Jacob, J. C.; Greguska, F. R., III; Huang, T.; Quach, N.; Wilson, B. D.
2017-12-01
We describe how we achieve super-linear speedup over standard approaches for oceanographic analytics on a cluster computer and the Amazon Web Services (AWS) cloud. NEXUS is an open source platform for big data analytics in the cloud that enables this performance through a combination of horizontally scalable data parallelism with Apache Spark and rapid data search, subset, and retrieval with tiled array storage in cloud-aware NoSQL databases like Solr and Cassandra. NEXUS is the engine behind several public portals at NASA, and OceanWorks is a newly funded project for the ocean community that will mature and extend this capability for improved data discovery, subset, quality screening, analysis, matchup of satellite and in situ measurements, and visualization. We review the Python language API for Spark and how to use it to quickly convert existing programs to use Spark to run with cloud-scale parallelism, and discuss strategies to improve performance. We explain how partitioning the data over space, time, or both leads to algorithmic design patterns for Spark analytics that can be applied to many different algorithms. We use NEXUS analytics as examples, including area-averaged time series, time-averaged map, and correlation map.
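A minimal PySpark sketch (not NEXUS code) of the partitioning pattern behind the area-averaged time series example: tiles keyed by time step are reduced in parallel to one mean per step. The tile contents here are random stand-ins for sea-surface data, and the (time, tile) layout is an illustrative assumption.

```python
# PySpark sketch: area-averaged time series over tiled data.
import numpy as np
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("area-averaged-time-series").getOrCreate()
sc = spark.sparkContext

rng = np.random.default_rng(42)
tiles = [(t, rng.normal(loc=20.0, scale=2.0, size=(16, 16)))   # (time, 2-D tile)
         for t in range(10) for _ in range(8)]                  # 8 tiles per step

series = (sc.parallelize(tiles)
            .mapValues(lambda tile: (float(tile.sum()), tile.size))  # partial sums
            .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))    # merge per time
            .mapValues(lambda s: s[0] / s[1])                        # mean per step
            .sortByKey()
            .collect())

for t, mean in series:
    print(t, round(mean, 3))
spark.stop()
```

Partitioning by time makes each step's reduction independent, which is the source of the horizontal scalability the abstract describes; partitioning by space instead would suit the time-averaged map pattern.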
Gamma-Ray Light Curves from Pulsar Magnetospheres with Finite Conductivity
NASA Technical Reports Server (NTRS)
Harding, A. K.; Kalapotharakos, C.; Kazanas, D.; Contopoulos, I.
2012-01-01
The Fermi Large Area Telescope has provided an unprecedented database for pulsar emission studies that includes gamma-ray light curves for over 100 pulsars. Modeling these light curves can reveal and constrain the geometry of the particle accelerator, as well as the pulsar magnetic field structure. We have constructed 3D magnetosphere models with finite conductivity that bridge the extreme vacuum and force-free solutions used in previous light-curve modeling. We are investigating the shapes of pulsar gamma-ray light curves using these dissipative solutions with two different approaches: (1) assuming geometric emission patterns of the slot gap and outer gap, and (2) using the parallel electric field provided by the resistive models to compute the trajectories and emission of the radiating particles. The light curves using geometric emission patterns show a systematic increase in gamma-ray peak phase with increasing conductivity, introducing a new diagnostic of these solutions. The light curves using the model electric fields are very sensitive to the conductivity but do not resemble the observed Fermi light curves, suggesting that some screening of the parallel electric field, by pair cascades not included in the models, is necessary.
Discovery of DNA repair inhibitors by combinatorial library profiling
Moeller, Benjamin J.; Sidman, Richard L.; Pasqualini, Renata; Arap, Wadih
2011-01-01
Small molecule inhibitors of DNA repair are emerging as potent and selective anti-cancer therapies, but the sheer magnitude of the protein networks involved in DNA repair processes poses obstacles to discovery of effective candidate drugs. To address this challenge, we used a subtractive combinatorial selection approach to identify a panel of peptide ligands that bind DNA repair complexes. Supporting the concept that these ligands have therapeutic potential, we show that one selected peptide specifically binds and non-competitively inactivates DNA-PKcs, a protein kinase critical in double-strand DNA break repair. In doing so, this ligand sensitizes BRCA-deficient tumor cells to genotoxic therapy. Our findings establish a platform for large-scale parallel screening for ligand-directed DNA repair inhibitors, with immediate applicability to cancer therapy. PMID:21343400
Establishing MALDI-TOF as Versatile Drug Discovery Readout to Dissect the PTP1B Enzymatic Reaction.
Winter, Martin; Bretschneider, Tom; Kleiner, Carola; Ries, Robert; Hehn, Jörg P; Redemann, Norbert; Luippold, Andreas H; Bischoff, Daniel; Büttner, Frank H
2018-07-01
Label-free, mass spectrometric (MS) detection is an emerging technology in the field of drug discovery. Unbiased deciphering of enzymatic reactions is a proficient advantage over conventional label-based readouts suffering from compound interference and intricate generation of tailored signal mediators. Significant evolvements of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS, as well as associated liquid handling instrumentation, triggered extensive efforts in the drug discovery community to integrate the comprehensive MS readout into the high-throughput screening (HTS) portfolio. Providing speed, sensitivity, and accuracy comparable to those of conventional, label-based readouts, combined with merits of MS-based technologies, such as label-free parallelized measurement of multiple physiological components, emphasizes the advantages of MALDI-TOF for HTS approaches. Here we describe the assay development for the identification of protein tyrosine phosphatase 1B (PTP1B) inhibitors. In the context of this precious drug target, MALDI-TOF was integrated into the HTS environment and cross-compared with the well-established AlphaScreen technology. We demonstrate robust and accurate IC50 determination with high accordance to data generated by AlphaScreen. Additionally, a tailored MALDI-TOF assay was developed to monitor compound-dependent, irreversible modification of the active cysteine of PTP1B. Overall, the presented data demonstrate the promise of integrating MALDI-TOF into drug discovery campaigns.
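The IC50 determination mentioned above is typically a four-parameter logistic fit to dose-response data. Below is a hedged sketch; the concentrations and percent-activity values are synthetic stand-ins for a MALDI-TOF readout, not data from the study.

```python
# Four-parameter logistic (4PL) fit for IC50 estimation on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    return bottom + (top - bottom) / (1 + (conc / ic50) ** hill)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # uM
activity = np.array([98, 95, 88, 70, 45, 22, 9, 4])             # % enzyme activity

popt, _ = curve_fit(four_pl, conc, activity, p0=[0, 100, 1.0, 1.0])
print("fitted IC50 ~ %.2f uM" % popt[2])
```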
Fitting new technologies into the safety paradigm: use of microarrays in transfusion.
Fournier-Wirth, C; Coste, J
2007-01-01
Until the late 1990s, mandatory blood screening for transmissible infectious agents depended entirely on antigen/antibody-based detection assays. The recent emergence of Nucleic acid Amplification Technologies (NAT) has revolutionised viral diagnosis, not only by increasing the level of sensitivity but also by facilitating the detection of several viruses in parallel by multiplexing specific primers. In more complex biological situations, when a broad spectrum of pathogens must be screened, the limitations of these first generation technologies became apparent. High throughput systems, such as DNA Arrays, permit a conceptually new approach. These miniaturised micro systems allow the detection of hundreds of different targets simultaneously, inducing a dramatic decrease in reagent consumption, a reduction in the number of confirmation tests and a simplification of data interpretation. However, the systems currently available require additional instrumentation and reagents for sample preparation and target amplification prior to detection on the DNA array. A major challenge in the area of DNA detection is the development of methods that do not rely on target amplification systems. Likewise, the advances of protein microarrays have lagged because of poor stability of proteins, complex coupling chemistry and weak detection signals. Emerging technologies like Biosensors and nano-particle based DNA or Protein Bio-Barcode Amplification Assays are promising diagnostic tools for a wide range of clinical applications, including blood donation screening.
Advances in Predictive Toxicology for Discovery Safety through High Content Screening.
Persson, Mikael; Hornberg, Jorrit J
2016-12-19
High content screening enables parallel acquisition of multiple molecular and cellular readouts. In particular, the predictive toxicology field has benefited from advances in high content screening, as more refined end points that report on cellular health can be studied in combination, at the single cell level, and in relatively high throughput. Here, we discuss how high content screening has become an essential tool for Discovery Safety, the discipline that integrates safety and toxicology in the drug discovery process to identify and mitigate safety concerns with the aim to design drug candidates with a superior safety profile. In addition to customized mechanistic assays to evaluate target safety, routine screening assays can be applied to identify risk factors for frequently occurring organ toxicities. We discuss the current state of high content screening assays for hepatotoxicity, cardiotoxicity, neurotoxicity, nephrotoxicity, and genotoxicity, including recent developments and current advances.
Theory and procedures for finding a correct kinetic model for the bacteriorhodopsin photocycle.
Hendler, R W; Shrager, R; Bose, S
2001-04-26
In this paper, we present the implementation and results of new methodology based on linear algebra. The theory behind these methods is covered in detail in the Supporting Information, available electronically (Shrager and Hendler). In brief, the methods presented search through all possible forward sequential submodels in order to find candidates that can be used to construct a complete model for the BR photocycle. The methodology is limited to forward sequential models; if no such models are compatible with the experimental data, none will be found. The procedures apply objective tests and filters to eliminate possibilities that cannot be correct, thus cutting the total number of candidate sequences to be considered. In the current application, which uses six exponentials, the total number of sequences was cut from 1950 to 49. The remaining sequences were further screened using known experimental criteria. The approach led to a solution which consists of a pair of sequences, one with five exponentials showing BR* → L(f) → M(f) → N → O → BR and the other with three exponentials showing BR* → L(s) → M(s) → BR. The deduced complete kinetic model for the BR photocycle is thus either a single photocycle branched at the L intermediate or a pair of two parallel photocycles. Reasons for preferring the parallel photocycles are presented. Synthetic data constructed on the basis of the parallel photocycles were indistinguishable from the experimental data in a number of analytical tests that were applied.
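A toy illustration in the spirit of the linear-algebra machinery (not the authors' actual procedures): a photocycle trace is a sum of exponential decays, one per kinetic step, and the singular values of a Hankel (delayed-embedding) matrix built from a uniformly sampled trace count the exponential components present. The rates and amplitudes below are invented.

```python
# Counting exponential components in a synthetic multi-exponential trace.
import numpy as np

t = np.linspace(0, 50e-3, 500)                 # 50 ms time base, seconds
rates = np.array([2000.0, 400.0, 90.0])        # 1/s, three hypothetical phases
amps = np.array([0.5, 0.3, 0.2])
signal = (amps * np.exp(-np.outer(t, rates))).sum(axis=1)

# Hankel-matrix SVD: the number of significant singular values estimates the
# number of exponentials in the trace (exactly 3 here, up to round-off).
N = 100
hankel = np.array([signal[i:i + N] for i in range(N)])
sv = np.linalg.svd(hankel, compute_uv=False)
print((sv / sv[0] > 1e-6).sum())   # expected: 3
```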
Parallel architectures for iterative methods on adaptive, block structured grids
NASA Technical Reports Server (NTRS)
Gannon, D.; Vanrosendale, J.
1983-01-01
A parallel computer architecture well suited to the solution of partial differential equations in complicated geometries is proposed. Algorithms for partial differential equations contain a great deal of parallelism. But this parallelism can be difficult to exploit, particularly on complex problems. One approach to extraction of this parallelism is the use of special purpose architectures tuned to a given problem class. The architecture proposed here is tuned to boundary value problems on complex domains. An adaptive elliptic algorithm which maps effectively onto the proposed architecture is considered in detail. Two levels of parallelism are exploited by the proposed architecture. First, by making use of the freedom one has in grid generation, one can construct grids which are locally regular, permitting a one-to-one mapping of grids to systolic style processor arrays, at least over small regions. All local parallelism can be extracted by this approach. Second, though there may be no regular global structure to the grids constructed, there will be parallelism at this level. One approach to finding and exploiting this parallelism is to use an architecture having a number of processor clusters connected by a switching network. The use of such a network creates a highly flexible architecture which automatically configures to the problem being solved.
Liu, Tao; Sims, David; Baum, Buzz
2009-01-01
In recent years RNAi screening has proven a powerful tool for dissecting gene functions in animal cells in culture. However, to date, most RNAi screens have been performed in a single cell line, and results then extrapolated across cell types and systems. Here, to dissect generic and cell type-specific mechanisms underlying cell morphology, we have performed identical kinome RNAi screens in six different Drosophila cell lines, derived from two distinct tissues of origin. This analysis identified a core set of kinases required for normal cell morphology in all lines tested, together with a number of kinases with cell type-specific functions. Most significantly, the screen identified a role for minibrain (mnb/DYRK1A), a kinase associated with Down's syndrome, in the regulation of actin-based protrusions in CNS-derived cell lines. This cell type-specific requirement was not due to the peculiarities in the morphology of CNS-derived cells and could not be attributed to differences in mnb expression. Instead, it likely reflects differences in gene expression that constitute the cell type-specific functional context in which mnb/DYRK1A acts. Using parallel RNAi screens and gene expression analyses across cell types we have identified generic and cell type-specific regulators of cell morphology, which include mnb/DYRK1A in the regulation of protrusion morphology in CNS-derived cell lines. This analysis reveals the importance of using different cell types to gain a thorough understanding of gene function across the genome and, in the case of kinases, the difficulties of using the differential gene expression to predict function.
Potta, Thrimoorthy; Zhen, Zhuo; Grandhi, Taraka Sai Pavan; Christensen, Matthew D.; Ramos, James; Breneman, Curt M.; Rege, Kaushal
2014-01-01
We describe the combinatorial synthesis and cheminformatics modeling of aminoglycoside antibiotics-derived polymers for transgene delivery and expression. Fifty-six polymers were synthesized by polymerizing aminoglycosides with diglycidyl ether cross-linkers. Parallel screening resulted in identification of several lead polymers that resulted in high transgene expression levels in cells. The role of polymer physicochemical properties in determining efficacy of transgene expression was investigated using Quantitative Structure-Activity Relationship (QSAR) cheminformatics models based on Support Vector Regression (SVR) and ‘building block’ polymer structures. The QSAR model exhibited high predictive ability, and investigation of descriptors in the model, using molecular visualization and correlation plots, indicated that physicochemical attributes related to both, aminoglycosides and diglycidyl ethers facilitated transgene expression. This work synergistically combines combinatorial synthesis and parallel screening with cheminformatics-based QSAR models for discovery and physicochemical elucidation of effective antibiotics-derived polymers for transgene delivery in medicine and biotechnology. PMID:24331709
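The modeling step above is, in essence, support vector regression from numeric polymer descriptors to a transgene-expression readout. The sketch below shows that mapping in miniature; the descriptor matrix and response values are random placeholders, not the study's 56-polymer dataset or its descriptor set.

```python
# Minimal SVR-based QSAR sketch on synthetic descriptor data.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(56, 8))           # 56 polymers x 8 physicochemical descriptors
y = X[:, 0] * 2 - X[:, 3] + rng.normal(scale=0.3, size=56)   # synthetic activity

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, y)
print("training R^2 = %.2f" % model.score(X, y))
```

In practice such a model would be validated on held-out polymers before being used, as in the abstract, to relate descriptor classes back to transgene expression efficacy.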
A dynamic bead-based microarray for parallel DNA detection
NASA Astrophysics Data System (ADS)
Sochol, R. D.; Casavant, B. P.; Dueck, M. E.; Lee, L. P.; Lin, L.
2011-05-01
A microfluidic system has been designed and constructed by means of micromachining processes to integrate both microfluidic mixing of mobile microbeads and hydrodynamic microbead arraying capabilities on a single chip to simultaneously detect multiple bio-molecules. The prototype system has four parallel reaction chambers, which include microchannels of 18 × 50 µm2 cross-sectional area and a microfluidic mixing section of 22 cm length. Parallel detection of multiple DNA oligonucleotide sequences was achieved via molecular beacon probes immobilized on polystyrene microbeads of 16 µm diameter. Experimental results show quantitative detection of three distinct DNA oligonucleotide sequences from the Hepatitis C viral (HCV) genome with single base-pair mismatch specificity. Our dynamic bead-based microarray offers an effective microfluidic platform to increase parallelization of reactions and improve microbead handling for various biological applications, including bio-molecule detection, medical diagnostics and drug screening.
A time-parallel approach to strong-constraint four-dimensional variational data assimilation
NASA Astrophysics Data System (ADS)
Rao, Vishwas; Sandu, Adrian
2016-05-01
A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple sub-intervals, which allows parallelization of the cost function and gradient computations. The solutions to the continuity equations across interval boundaries are added as constraints. The augmented Lagrangian approach leads to a different formulation of the variational data assimilation problem than the weakly constrained 4D-Var. A combination of serial and parallel 4D-Vars to increase performance is also explored. The methodology is illustrated on data assimilation problems involving the Lorenz-96 and the shallow water models.
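As a hedged sketch in standard strong-constraint 4D-Var notation (not copied from the paper), the single-window cost function and the time-parallel augmented Lagrangian can be written as follows, where x_b is the background state, B and R_k are background and observation error covariances, M the model, H_k the observation operator, lambda_i the multipliers, and rho the penalty weight:

```latex
% Strong-constraint 4D-Var cost over the full window:
\[
  J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{T} B^{-1} (x_0 - x_b)
         + \tfrac{1}{2}\sum_{k=0}^{N} \big(\mathcal{H}_k(x_k) - y_k\big)^{T}
           R_k^{-1} \big(\mathcal{H}_k(x_k) - y_k\big),
  \qquad x_{k+1} = \mathcal{M}_k(x_k).
\]
% Splitting the window into sub-intervals $i = 1,\dots,P$ with independent
% initial states $x^{(i)}$ and continuity constraints
% $c_i = x^{(i+1)} - \mathcal{M}^{(i)}\!\big(x^{(i)}\big) = 0$ gives the
% augmented Lagrangian that couples the sub-intervals:
\[
  L_{\rho}(x, \lambda) = J(x) + \sum_{i=1}^{P-1} \lambda_i^{T} c_i(x)
                        + \tfrac{\rho}{2} \sum_{i=1}^{P-1} \lVert c_i(x) \rVert^{2}.
\]
```

With the constraints relegated to the penalty and multiplier terms, the per-sub-interval cost and gradient evaluations decouple, which is what permits the parallel-in-time execution the abstract describes.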
Parallel approach in RDF query processing
NASA Astrophysics Data System (ADS)
Vajgl, Marek; Parenica, Jan
2017-07-01
The parallel approach is nowadays a very cheap way to increase computational power, thanks to the availability of multithreaded computational units. Such hardware has become a typical part of today's personal computers and notebooks and is widely available. This contribution describes experiments on how the evaluation of the computationally complex inference algorithm over RDF data can be parallelized on graphics cards to decrease computation time.
Otvos, Reka A; Mladic, Marija; Arias-Alpizar, Gabriela; Niessen, Wilfried M A; Somsen, Govert W; Smit, August B; Kool, Jeroen
2016-06-01
The α7-nicotinic acetylcholine receptor (α7-nAChR) is a ligand-gated ion channel expressed in different regions of the central nervous system (CNS). The α7-nAChR has been associated with Alzheimer's disease, epilepsy, and schizophrenia, and therefore is extensively studied as a drug target for the treatment of these diseases. Important sources for new compounds in drug discovery are natural extracts. Since natural extracts are complex mixtures, identification of the bioactives demands the use of analytical techniques to separate a bioactive from inactive compounds. This study describes screening methodology for identifying bioactive compounds in mixtures acting on the α7-nAChR. The methodology developed combines liquid chromatography (LC) coupled via a split with both an at-line calcium (Ca(2+))-flux assay and high-resolution mass spectrometry (MS). This allows evaluation of α7-nAChR responses after LC separation, while parallel MS enables compound identification. The methodology was optimized for analysis of agonists and positive allosteric modulators, and was successfully applied to screening of the hallucinogenic mushroom Psilocybe mckennaii. The crude mushroom extract was analyzed using both reversed-phase and hydrophilic interaction liquid chromatography. Matching retention times and peak shapes of bioactives found with data from the parallel MS measurements allowed rapid pinpointing of accurate masses corresponding to the bioactives. © 2016 Society for Laboratory Automation and Screening.
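The retention-time matching step described above can be sketched as a simple tolerance search: bioactive peaks from the at-line assay are paired with MS features eluting at nearly the same time. All retention times, the m/z values, and the tolerance below are invented for illustration.

```python
# Hypothetical retention-time matching of bioactivity peaks to MS features.
bioactive_peaks = [4.31, 7.85]                 # min, from the Ca2+-flux trace

ms_features = [
    (4.29, 205.0972),   # (retention time in min, accurate mass m/z)
    (5.10, 318.1440),
    (7.88, 221.0921),
]

TOL = 0.05   # min; allowed retention-time mismatch between the two traces

matches = [(rt_bio, mz) for rt_bio in bioactive_peaks
           for rt_ms, mz in ms_features if abs(rt_bio - rt_ms) <= TOL]
print(matches)   # -> [(4.31, 205.0972), (7.85, 221.0921)]
```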
Low cost automated whole smear microscopy screening system for detection of acid fast bacilli.
Law, Yan Nei; Jian, Hanbin; Lo, Norman W S; Ip, Margaret; Chan, Mia Mei Yuk; Kam, Kai Man; Wu, Xiaohua
2018-01-01
In countries with a high tuberculosis (TB) burden, there is an urgent need for rapid, large-scale screening to detect smear-positive patients. We developed a computer-aided whole smear screening system that focuses in real time, captures images and provides diagnostic grading, for both bright-field and fluorescence microscopy, for detection of acid-fast bacilli (AFB) from respiratory specimens. Our aim was to evaluate the performance of the dual-mode screening system in AFB diagnostic algorithms on concentrated smears with auramine O (AO) staining, as well as direct smears with AO and Ziehl-Neelsen (ZN) staining, using mycobacterial culture results as the gold standard. Adult patient sputum samples submitted for M. tuberculosis culture were divided into three batches for staining: direct AO-stained, direct ZN-stained and concentrated smears AO-stained. All slides were graded by an experienced microscopist, in parallel with the automated whole smear screening system. Sensitivity and specificity of a TB diagnostic algorithm using the screening system alone, and in combination with a microscopist, were evaluated. Of 488 direct AO-stained smears, 228 were culture positive. These yielded a sensitivity of 81.6% and specificity of 74.2%. Of 334 direct smears with ZN staining, 142 were culture positive, which gave a sensitivity of 70.4% and specificity of 76.6%. Of 505 concentrated smears with AO staining, 250 were culture positive, giving a sensitivity of 86.4% and specificity of 71.0%. To further improve performance, machine grading was confirmed by manual smear grading when the number of AFBs detected fell within an uncertainty range. These combined results gave a significant improvement in specificity (AO-direct: 85.4%; ZN-direct: 85.4%; AO-concentrated: 92.5%) and a slight improvement in sensitivity while requiring only a limited manual workload. Our system achieved high sensitivity without substantially compromising specificity when compared to culture results. A significant improvement in specificity was obtained when uncertain results were confirmed by manual smear grading. This approach has the potential to substantially reduce the workload of microscopists in high-burden countries.
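The combined reading rule described above is a simple referral policy: accept the machine's grading outright unless its AFB count falls inside an uncertainty band, in which case the smear goes to a human reader. The band limits and grading cutoffs below are illustrative assumptions, not the system's actual settings.

```python
# Sketch of machine grading with manual confirmation in an uncertainty band.
def final_grade(machine_afb_count, manual_reader, low=2, high=20):
    if low <= machine_afb_count <= high:        # uncertain zone -> human review
        return manual_reader(machine_afb_count)
    return "positive" if machine_afb_count > high else "negative"

# toy manual reader: confirms positivity above a small threshold
manual = lambda n: "positive" if n >= 5 else "negative"

for count in (0, 3, 12, 40):
    print(count, "->", final_grade(count, manual))
```

Restricting human review to the uncertain band is what lets the reported specificity gains come at only a limited manual workload.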
Jones, Ryan J. R.; Shinde, Aniketa; Guevarra, Dan; ...
2015-01-05
Many energy technologies require electrochemical stability or preactivation of functional materials. Due to the long experiment duration required for either electrochemical preactivation or evaluation of operational stability, parallel screening is required to enable high throughput experimentation. We found that imposing operational electrochemical conditions on a library of materials in parallel creates several opportunities for experimental artifacts. We discuss the electrochemical engineering principles and operational parameters that mitigate artifacts in the parallel electrochemical treatment system. We also demonstrate the effects of resistive losses within the planar working electrode through a combination of finite element modeling and illustrative experiments. Operation of the parallel-plate, membrane-separated electrochemical treatment system is demonstrated by exposing a composition library of mixed metal oxides to oxygen evolution conditions in 1 M sulfuric acid for 2 h. This application is particularly important because the electrolysis and photoelectrolysis of water are promising future energy technologies inhibited by the lack of highly active, acid-stable catalysts containing only earth-abundant elements.
2014-01-01
Background Colorectal cancer is an important public health problem in Spain. Over the last decade, several regions have carried out screening programmes, but population participation rates remain below recommended European goals. Reminders on electronic medical records have been identified as a low-cost and high-reach strategy to increase participation. Further knowledge is needed about their effect in a population-based screening programme. The main aim of this study is to evaluate the effectiveness of an electronic reminder to promote the participation in a population-based colorectal cancer screening programme. Secondary aims are to learn population’s reasons for refusing to take part in the screening programme and to find out the health professionals’ opinion about the official programme implementation and on the new computerised tool. Methods/Design This is a parallel randomised trial with a cross-sectional second stage. Participants: all the invited subjects to participate in the public colorectal cancer screening programme that includes men and women aged between 50–69, allocated to the eleven primary care centres of the study and all their health professionals. The randomisation unit will be the primary care physician. The intervention will consist of activating an electronic reminder, in the patient’s electronic medical record, in order to promote colorectal cancer screening, during a synchronous medical appointment, throughout the year that the intervention takes place. A comparison of the screening rates will then take place, using the faecal occult blood test of the patients from the control and the intervention groups. We will also take a questionnaire to know the opinions of the health professionals. The main outcome is the screening status at the end of the study. Data will be analysed with an intention-to-treat approach. Discussion We expect that the introduction of specific reminders in electronic medical records, as a tool to facilitate and encourage direct referral by physicians and nurse practitioners to perform colorectal cancer screening will mean an increase in participation of the target population. The introduction of this new software tool will have good acceptance and increase compliance with recommendations from health professionals. Trial registration Clinical Trials.gov identifier NCT01877018 PMID:24685117
Parallel fabrication of macroporous scaffolds.
Dobos, Andrew; Grandhi, Taraka Sai Pavan; Godeshala, Sudhakar; Meldrum, Deirdre R; Rege, Kaushal
2018-07-01
Scaffolds generated from naturally occurring and synthetic polymers have been investigated in several applications because of their biocompatibility and tunable chemo-mechanical properties. Existing methods for generation of 3D polymeric scaffolds typically cannot be parallelized, suffer from low throughputs, and do not allow for quick and easy removal of the fragile structures that are formed. Current molds used in hydrogel and scaffold fabrication using solvent casting and porogen leaching are often single-use and do not facilitate 3D scaffold formation in parallel. Here, we describe a simple device and related approaches for the parallel fabrication of macroporous scaffolds. This approach was employed for the generation of macroporous and non-macroporous materials in parallel, in higher throughput and allowed for easy retrieval of these 3D scaffolds once formed. In addition, macroporous scaffolds with interconnected as well as non-interconnected pores were generated, and the versatility of this approach was employed for the generation of 3D scaffolds from diverse materials including an aminoglycoside-derived cationic hydrogel ("Amikagel"), poly(lactic-co-glycolic acid) or PLGA, and collagen. Macroporous scaffolds generated using the device were investigated for plasmid DNA binding and cell loading, indicating the use of this approach for developing materials for different applications in biotechnology. Our results demonstrate that the device-based approach is a simple technology for generating scaffolds in parallel, which can enhance the toolbox of current fabrication techniques. © 2018 Wiley Periodicals, Inc.
Housley, Daniel; Caine, Abby; Cherubini, Giunio; Taeymans, Olivier
2017-07-01
Sagittal T2-weighted sequences (T2-SAG) are the foundation of spinal protocols when screening for the presence of intervertebral disc extrusion. We often utilize sagittal short-tau inversion recovery sequences (STIR-SAG) as an adjunctive screening series, and experience suggests that this combined approach provides superior detection rates. We hypothesized that STIR-SAG would provide higher sensitivity than T2-SAG in the identification and localization of intervertebral disc extrusion. We further hypothesized that the parallel evaluation of paired T2-SAG and STIR-SAG series would provide a higher sensitivity than could be achieved with either sagittal series viewed in isolation. This retrospective diagnostic accuracy study blindly reviewed T2-SAG and STIR-SAG sequences from dogs (n = 110) with surgically confirmed intervertebral disc extrusion. A consensus between two radiologists found no significant difference in sensitivity between T2-SAG and STIR-SAG during the identification of intervertebral disc extrusion (T2-SAG: 92.7%, STIR-SAG: 94.5%, P = 0.752). Nevertheless, STIR-SAG accurately identified intervertebral disc extrusion in 66.7% of cases where the evaluation of T2-SAG in isolation had provided a false negative diagnosis. Additionally, one radiologist found that the parallel evaluation of paired T2-SAG and STIR-SAG series provided a significantly higher sensitivity than T2-SAG in isolation during the identification of intervertebral disc extrusion (T2-SAG: 78.2%, paired T2-SAG and STIR-SAG: 90.9%, P = 0.017). A similar nonsignificant trend was observed when the consensus of both radiologists was taken into consideration (T2-SAG: 92.7%, paired T2-SAG and STIR-SAG: 97.3%, P = 0.392). We therefore conclude that STIR-SAG is capable of identifying intervertebral disc extrusion that is inconspicuous in T2-SAG, and that STIR-SAG should be considered a useful adjunctive sequence during preliminary sagittal screening for intervertebral disc extrusion in low-field magnetic resonance. © 2017 American College of Veterinary Radiology.
NASA Astrophysics Data System (ADS)
Hofierka, Jaroslav; Lacko, Michal; Zubal, Stanislav
2017-10-01
In this paper, we describe the parallelization of three complex and computationally intensive modules of GRASS GIS using the OpenMP application programming interface for multi-core computers. These include the v.surf.rst module for spatial interpolation, the r.sun module for solar radiation modeling and the r.sim.water module for water flow simulation. We briefly describe the functionality of the modules and parallelization approaches used in the modules. Our approach includes the analysis of the module's functionality, identification of source code segments suitable for parallelization and proper application of OpenMP parallelization code to create efficient threads processing the subtasks. We document the efficiency of the solutions using the airborne laser scanning data representing land surface in the test area and derived high-resolution digital terrain model grids. We discuss the performance speed-up and parallelization efficiency depending on the number of processor threads. The study showed a substantial increase in computation speeds on a standard multi-core computer while maintaining the accuracy of results in comparison to the output from original modules. The presented parallelization approach showed the simplicity and efficiency of the parallelization of open-source GRASS GIS modules using OpenMP, leading to an increased performance of this geospatial software on standard multi-core computers.
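OpenMP itself is a C/C++/Fortran API, so the following is only a language-neutral analogy rather than the modules' actual code: a Python multiprocessing sketch of the row-wise domain decomposition that an `#pragma omp parallel for` over grid rows expresses in the C modules. The "terrain" grid and the per-row kernel are invented placeholders.

```python
# Data-parallel processing of grid rows, analogous to an OpenMP parallel loop
# (worker processes here stand in for OpenMP threads).
import numpy as np
from multiprocessing import Pool

grid = np.random.default_rng(7).normal(size=(512, 512))   # stand-in DTM grid

def process_row(i):
    """Per-row kernel, analogous to one iteration of the parallel loop."""
    row = grid[i]
    return (row - row.mean()) / (row.std() + 1e-12)        # toy normalization

if __name__ == "__main__":
    with Pool() as pool:
        rows = pool.map(process_row, range(grid.shape[0]))
    print(np.vstack(rows).shape)            # -> (512, 512)
```

As in the paper's OpenMP case, the speed-up depends on the rows being independent; kernels with cross-row dependencies (e.g., flow routing) need a different decomposition.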
Comparison of a rational vs. high throughput approach for rapid salt screening and selection.
Collman, Benjamin M; Miller, Jonathan M; Seadeek, Christopher; Stambek, Julie A; Blackburn, Anthony C
2013-01-01
In recent years, high throughput (HT) screening has become the most widely used approach for early phase salt screening and selection in a drug discovery/development setting. The purpose of this study was to compare a rational approach for salt screening and selection to those results previously generated using a HT approach. The rational approach involved a much smaller number of initial trials (one salt synthesis attempt per counterion) that were selected based on a few strategic solubility determinations of the free form combined with a theoretical analysis of the ideal solvent solubility conditions for salt formation. Salt screening results for sertraline, tamoxifen, and trazodone using the rational approach were compared to those previously generated by HT screening. The rational approach produced similar results to HT screening, including identification of the commercially chosen salt forms, but with a fraction of the crystallization attempts. Moreover, the rational approach provided enough solid from the very initial crystallization of a salt for more thorough and reliable solid-state characterization and thus rapid decision-making. The crystallization techniques used in the rational approach mimic larger-scale process crystallization, allowing smoother technical transfer of the selected salt to the process chemist.
NASA Technical Reports Server (NTRS)
Tilton, James C.
1988-01-01
Image segmentation can be a key step in data compression and image analysis. However, the segmentation results produced by most previous approaches to region growing are suspect because they depend on the order in which portions of the image are processed. An iterative parallel segmentation algorithm avoids this problem by performing globally best merges first. Such a segmentation approach, and two implementations of it on NASA's Massively Parallel Processor (MPP), are described. Application of the segmentation approach to data compression and image analysis is then described, and results of such application are given for a LANDSAT Thematic Mapper image.
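A toy sketch of the "globally best merge first" idea: among all adjacent region pairs, always merge the most similar pair next, stopping when the best remaining pair differs by more than a threshold. Real implementations work on 2-D region adjacency graphs; the 1-D strip, pixel values, and threshold here are invented.

```python
# Order-independent region growing: merge the globally most similar
# adjacent pair at each step.
regions = [[10.0], [10.2], [50.0], [50.5], [9.8]]   # 1-D strip of regions

def mean(r):
    return sum(r) / len(r)

THRESHOLD = 1.0
while len(regions) > 1:
    i = min(range(len(regions) - 1),
            key=lambda j: abs(mean(regions[j]) - mean(regions[j + 1])))
    if abs(mean(regions[i]) - mean(regions[i + 1])) > THRESHOLD:
        break                                        # no similar pair remains
    regions[i:i + 2] = [regions[i] + regions[i + 1]] # merge the best pair

print([round(mean(r), 2) for r in regions])          # -> [10.1, 50.25, 9.8]
```

Because the merge order is determined by similarity rather than scan order, the result does not depend on where in the image processing begins, which is the order-dependence problem the abstract raises.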
Automatic Management of Parallel and Distributed System Resources
NASA Technical Reports Server (NTRS)
Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.
1990-01-01
Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.
NASA Technical Reports Server (NTRS)
Pritchett, Amy R.; Hansman, R. John
1997-01-01
Efforts to increase airport capacity include studies of aircraft systems that would enable simultaneous approaches to closely spaced parallel runways in Instrument Meteorological Conditions (IMC). The time-critical nature of a parallel approach results in key design issues for current and future collision avoidance systems. Two part-task flight simulator studies have examined the procedural and display issues inherent in such a time-critical task, the interaction of the pilot with a collision avoidance system, and the alerting criteria and avoidance maneuvers preferred by subjects.
Approximation algorithms for scheduling unrelated parallel machines with release dates
NASA Astrophysics Data System (ADS)
Avdeenko, T. V.; Mesentsev, Y. A.; Estraykh, I. V.
2017-01-01
In this paper we propose approaches to the optimal scheduling of unrelated parallel machines with release dates. One approach is based on a dynamic programming scheme modified with adaptive narrowing of the search domain to ensure computational effectiveness. We discuss the complexity of exact schedule synthesis and compare it with approximate, close-to-optimal solutions. We also explain how the algorithm works on an example of two unrelated parallel machines and five jobs with release dates. Performance results demonstrating the efficiency of the proposed approach are given.
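For an instance of the size mentioned (two unrelated machines, five jobs), exhaustive enumeration of machine assignments is already exact and makes a useful baseline for a dynamic programming scheme. The sketch below assumes a makespan objective and uses the fact that earliest-release-date order minimizes makespan on a single machine; all data values are invented.

```cpp
// Exhaustive baseline: two unrelated machines (job j takes p[0][j] or
// p[1][j]), five jobs with release dates r[j], objective makespan.
// Enumerating the 2^n machine assignments is exact at this size.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int n = 5;
    int r[n]    = {0, 1, 2, 4, 6};        // release dates (illustrative)
    int p[2][n] = {{3, 4, 2, 5, 3},       // processing times on machine 0
                   {2, 6, 3, 2, 4}};      // processing times on machine 1

    int best = 1 << 30, bestMask = 0;
    for (int mask = 0; mask < (1 << n); ++mask) {
        int makespan = 0;
        for (int m = 0; m < 2; ++m) {
            std::vector<int> jobs;
            for (int j = 0; j < n; ++j)
                if (((mask >> j) & 1) == m) jobs.push_back(j);
            // earliest release date first is optimal for makespan
            std::sort(jobs.begin(), jobs.end(),
                      [&](int a, int b) { return r[a] < r[b]; });
            int t = 0;                    // simulate machine m
            for (int j : jobs)
                t = std::max(t, r[j]) + p[m][j];
            makespan = std::max(makespan, t);
        }
        if (makespan < best) { best = makespan; bestMask = mask; }
    }
    std::printf("optimal makespan = %d\n", best);
    for (int j = 0; j < n; ++j)
        std::printf("job %d -> machine %d\n", j, (bestMask >> j) & 1);
    return 0;
}
```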
An object-oriented approach to nested data parallelism
NASA Technical Reports Server (NTRS)
Sheffler, Thomas J.; Chatterjee, Siddhartha
1994-01-01
This paper describes an implementation technique for integrating nested data parallelism into an object-oriented language. Data-parallel programming employs sets of data called 'collections' and expresses parallelism as operations performed over the elements of a collection. When the elements of a collection are also collections, then there is the possibility for 'nested data parallelism.' Few current programming languages support nested data parallelism however. In an object-oriented framework, a collection is a single object. Its type defines the parallel operations that may be applied to it. Our goal is to design and build an object-oriented data-parallel programming environment supporting nested data parallelism. Our initial approach is built upon three fundamental additions to C++. We add new parallel base types by implementing them as classes, and add a new parallel collection type called a 'vector' that is implemented as a template. Only one new language feature is introduced: the 'foreach' construct, which is the basis for exploiting elementwise parallelism over collections. The strength of the method lies in the compilation strategy, which translates nested data-parallel C++ into ordinary C++. Extracting the potential parallelism in nested 'foreach' constructs is called 'flattening' nested parallelism. We show how to flatten 'foreach' constructs using a simple program transformation. Our prototype system produces vector code which has been successfully run on workstations, a CM-2, and a CM-5.
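A minimal sketch of the flattening idea follows, assuming a segmented representation (one flat data array plus per-segment lengths); the `SegmentedVector` and `foreach_flat` names are invented for illustration and are not the paper's classes.

```cpp
// Sketch of 'flattening': a nested collection (vector of vectors) is
// stored as one flat data array plus segment lengths, so an elementwise
// 'foreach' over the nested structure becomes a single flat loop that
// is trivial to run in parallel.
#include <cstdio>
#include <vector>

template <typename T>
struct SegmentedVector {
    std::vector<T>      data;     // all elements, concatenated
    std::vector<size_t> seglen;   // length of each inner collection
};

// 'foreach' over the nested structure: one flat loop, no nesting left.
template <typename T, typename F>
void foreach_flat(SegmentedVector<T>& v, F op) {
    #pragma omp parallel for
    for (long i = 0; i < (long)v.data.size(); ++i)
        op(v.data[i]);
}

int main() {
    // Logical value {{1,2,3},{4},{5,6}}, flattened into data + seglen.
    SegmentedVector<int> v{{1, 2, 3, 4, 5, 6}, {3, 1, 2}};
    foreach_flat(v, [](int& x) { x *= 10; });   // elementwise parallelism

    size_t pos = 0;                    // print, reconstructing segments
    for (size_t s = 0; s < v.seglen.size(); ++s) {
        std::printf("segment %zu:", s);
        for (size_t k = 0; k < v.seglen[s]; ++k)
            std::printf(" %d", v.data[pos++]);
        std::printf("\n");
    }
    return 0;
}
```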
Parallel computing for probabilistic fatigue analysis
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.
1993-01-01
This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep large numbers of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
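As a hedged illustration of the shared-memory side of such a computation, the sketch below Monte Carlo samples a placeholder lognormal strength model across threads; the model, parameters, and thread count are invented and stand in for the actual fatigue code.

```cpp
// Illustrative shared-memory Monte Carlo: threads draw from a toy
// lognormal strength model and estimate a failure probability.
#include <cmath>
#include <cstdio>
#include <random>
#include <thread>
#include <vector>

int main() {
    const int nthreads = 4;
    const long samplesPerThread = 250000;
    const double stress = 120.0;            // applied stress (made up)
    std::vector<double> failProb(nthreads, 0.0);

    auto worker = [&](int t) {
        std::mt19937_64 rng(12345 + t);     // independent per-thread stream
        std::lognormal_distribution<double> strength(std::log(150.0), 0.15);
        long failures = 0;
        for (long i = 0; i < samplesPerThread; ++i)
            if (strength(rng) < stress) ++failures;   // sampled failure
        failProb[t] = (double)failures / samplesPerThread;
    };

    std::vector<std::thread> pool;
    for (int t = 0; t < nthreads; ++t) pool.emplace_back(worker, t);
    for (auto& th : pool) th.join();

    double p = 0.0;
    for (double x : failProb) p += x / nthreads;      // combine estimates
    std::printf("estimated failure probability: %.5f\n", p);
    return 0;
}
```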
Distributed computing feasibility in a non-dedicated homogeneous distributed system
NASA Technical Reports Server (NTRS)
Leutenegger, Scott T.; Sun, Xian-He
1993-01-01
The low cost and availability of clusters of workstations have led researchers to re-explore distributed computing using independent workstations. This approach may provide better cost/performance than tightly coupled multiprocessors. In practice, this approach often utilizes wasted cycles to run parallel jobs. The feasibility of such a non-dedicated parallel processing environment, assuming workstation processes have preemptive priority over parallel tasks, is addressed. An analytical model is developed to predict parallel job response times. Our model provides insight into how significantly workstation owner interference degrades parallel program performance. A new term, the task ratio, which relates the parallel task demand to the mean service demand of nonparallel workstation processes, is introduced. We propose that the task ratio is a useful metric for determining how large the demand of a parallel application must be in order to make efficient use of a non-dedicated distributed system.
Dong, Suwei; Cahill, Katharine J; Kang, Moon-Il; Colburn, Nancy H; Henrich, Curtis J; Wilson, Jennifer A; Beutler, John A; Johnson, Richard P; Porco, John A
2011-11-04
We have accomplished a parallel screen of cycloaddition partners for o-quinols utilizing a plate-based microwave system. Microwave irradiation improves the efficiency of retro-Diels-Alder/Diels-Alder cascades of o-quinol dimers which generally proceed in a diastereoselective fashion. Computational studies indicate that asynchronous transition states are favored in Diels-Alder cycloadditions of o-quinols. Subsequent biological evaluation of a collection of cycloadducts has identified an inhibitor of activator protein-1 (AP-1), an oncogenic transcription factor.
Crooks, Richard O; Baxter, Daniel; Panek, Anna S; Lubben, Anneke T; Mason, Jody M
2016-01-29
Interactions between naturally occurring proteins are highly specific, with protein-network imbalances associated with numerous diseases. For designed protein-protein interactions (PPIs), required specificity can be notoriously difficult to engineer. To accelerate this process, we have derived peptides that form heterospecific PPIs when combined. This is achieved using software that generates large virtual libraries of peptide sequences and searches within the resulting interactome for preferentially interacting peptides. To demonstrate feasibility, we have (i) generated 1536 peptide sequences based on the parallel dimeric coiled-coil motif and varied residues known to be important for stability and specificity, (ii) screened the 1,180,416 member interactome for predicted Tm values and (iii) used predicted Tm cutoff points to isolate eight peptides that form four heterospecific PPIs when combined. This required that all 32 hypothetical off-target interactions within the eight-peptide interactome be disfavoured and that the four desired interactions pair correctly. Lastly, we have verified the approach by characterising all 36 pairs within the interactome. In analysing the output, we hypothesised that several sequences are capable of adopting antiparallel orientations. We subsequently improved the software by removing sequences where doing so led to fully complementary electrostatic pairings. Our approach can be used to derive increasingly large and therefore complex sets of heterospecific PPIs with a wide range of potential downstream applications from disease modulation to the design of biomaterials and peptides in synthetic biology. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
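A toy version of the virtual-interactome step is sketched below, assuming a fabricated stand-in for the Tm predictor (the real work used coiled-coil stability prediction), simply to show the enumerate-score-filter structure over all pairings in a small library.

```cpp
// Enumerate a full peptide interactome (homo- and heterodimers), score
// each pair with a placeholder predictor, keep pairs above a cutoff.
#include <cstdio>
#include <string>
#include <vector>

// Placeholder Tm predictor; NOT a real coiled-coil stability model.
double predictTm(const std::string& a, const std::string& b) {
    double s = 0;
    for (size_t i = 0; i < a.size() && i < b.size(); ++i) {
        // reward complementary charge at matching positions (toy rule)
        if ((a[i] == 'E' && b[i] == 'K') || (a[i] == 'K' && b[i] == 'E')) s += 6;
        else if (a[i] == b[i]) s -= 2;       // penalize like-with-like
    }
    return 20.0 + s;
}

int main() {
    std::vector<std::string> lib = {"EIEKK", "KIKEE", "EIEEK",
                                    "KIKKE", "EIKEK", "KEIKE"};
    const double cutoff = 40.0;
    // Full interactome: homodimers (i == j) and heterodimers (i < j).
    for (size_t i = 0; i < lib.size(); ++i)
        for (size_t j = i; j < lib.size(); ++j) {
            double tm = predictTm(lib[i], lib[j]);
            if (tm >= cutoff)
                std::printf("keep %s:%s  predicted Tm %.1f\n",
                            lib[i].c_str(), lib[j].c_str(), tm);
        }
    return 0;
}
```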
Abdullah, Fauziah; Su, Tin Tin
2013-01-01
The objective of this study was to evaluate the effect of a call-recall approach in enhancing Pap smear practice through changes in motivation stage among non-compliant women. A cluster randomized controlled trial with a parallel, un-blinded design was conducted between January and November 2010 in 40 public secondary schools in Malaysia among 403 female teachers who had never or infrequently attended for a Pap test. Cluster randomization was applied in assigning schools to the two groups. An intervention group received an invitation and reminder (call-recall program) for a Pap test (20 schools with 201 participants), while the control group received usual care from the existing cervical screening program (20 schools with 202 participants). Multivariate logistic regression was performed to determine the effect of the intervention program on the action stage (Pap smear uptake) at 24 weeks. In both groups, the pre-contemplation stage accounted for the highest proportion of participants. At 24 weeks, the intervention group was twice as likely as the control group to have reached the action stage (adjusted odds ratio 2.44, 95% CI 1.29-4.62). The positive effect of a call-recall approach in motivating women to change their screening behavior should be appreciated by policy makers and health care providers in developing countries as an intervention to enhance Pap smear uptake. Copyright © 2013 Elsevier Inc. All rights reserved.
3D Data Denoising via Nonlocal Means Filter by Using Parallel GPU Strategies
Cuomo, Salvatore; De Michele, Pasquale; Piccialli, Francesco
2014-01-01
The Nonlocal Means (NLM) algorithm is widely considered a state-of-the-art denoising filter in many research fields. Its high computational complexity has led researchers to develop parallel programming approaches and to use massively parallel architectures such as GPUs. In recent years, GPU devices have enabled reasonable running times by filtering 3D datasets slice-by-slice with a 2D NLM algorithm. In our approach, we design and implement a fully 3D Nonlocal Means parallel algorithm, adopting different algorithm mapping strategies on GPU architectures and a multi-GPU framework, in order to demonstrate its high applicability and scalability. The experimental results we obtained support the usability of our approach in a broad spectrum of application scenarios such as magnetic resonance imaging (MRI) or video sequence denoising. PMID:25045397
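The following single-threaded reference sketch shows the fully 3D NLM computation itself (the paper's contribution, the GPU/multi-GPU mapping, is not reproduced here); the volume size, patch and search radii, and the filtering parameter are illustrative.

```cpp
// Reference 3-D NLM: each voxel becomes a weighted average of voxels
// in a 3-D search window, weights decaying with the squared distance
// between the two voxels' 3-D patches.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const int N = 16, P = 1, S = 2;       // volume edge, patch, search radii
    const double h2 = 0.05;               // filtering parameter h^2
    auto idx = [&](int x, int y, int z) { return (z * N + y) * N + x; };
    auto clampi = [&](int v) { return v < 0 ? 0 : (v >= N ? N - 1 : v); };

    std::vector<double> in(N * N * N), out(in.size());
    for (int z = 0; z < N; ++z)           // synthetic volume: edge + ripple
        for (int y = 0; y < N; ++y)
            for (int x = 0; x < N; ++x)
                in[idx(x, y, z)] = (x < N / 2 ? 0.0 : 1.0)
                                 + 0.1 * std::sin(13.0 * (x + 3 * y + 7 * z));

    // squared distance between the 3-D patches around voxels a and b
    auto patchDist = [&](int ax, int ay, int az, int bx, int by, int bz) {
        double d = 0;
        for (int dz = -P; dz <= P; ++dz)
            for (int dy = -P; dy <= P; ++dy)
                for (int dx = -P; dx <= P; ++dx) {
                    double va = in[idx(clampi(ax+dx), clampi(ay+dy), clampi(az+dz))];
                    double vb = in[idx(clampi(bx+dx), clampi(by+dy), clampi(bz+dz))];
                    d += (va - vb) * (va - vb);
                }
        return d;
    };

    for (int z = 0; z < N; ++z)
        for (int y = 0; y < N; ++y)
            for (int x = 0; x < N; ++x) {
                double wsum = 0, acc = 0;
                for (int dz = -S; dz <= S; ++dz)      // 3-D search window
                    for (int dy = -S; dy <= S; ++dy)
                        for (int dx = -S; dx <= S; ++dx) {
                            int bx = clampi(x+dx), by = clampi(y+dy), bz = clampi(z+dz);
                            double w = std::exp(-patchDist(x,y,z,bx,by,bz) / h2);
                            wsum += w;
                            acc  += w * in[idx(bx,by,bz)];
                        }
                out[idx(x,y,z)] = acc / wsum;
            }
    std::printf("filtered center voxel: %.4f\n", out[idx(N/2, N/2, N/2)]);
    return 0;
}
```

The triple nesting over voxel, search window, and patch is exactly the structure that maps well to GPU threads, which is why the slice-by-slice 2D shortcut was long preferred.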
Parallel Implementation of the Discontinuous Galerkin Method
NASA Technical Reports Server (NTRS)
Baggag, Abdalkader; Atkins, Harold; Keyes, David
1999-01-01
This paper describes a parallel implementation of the discontinuous Galerkin method. Discontinuous Galerkin is a spatially compact method that retains its accuracy and robustness on non-smooth unstructured grids and is well suited for time-dependent simulations. Several parallelization approaches are studied and evaluated. The most natural and symmetric of the approaches has been implemented in an object-oriented code used to simulate aeroacoustic scattering. The parallel implementation is MPI-based and has been tested on various parallel platforms such as the SGI Origin, IBM SP2, and clusters of SGI and Sun workstations. The scalability results presented for the SGI Origin show slightly superlinear speedup on a fixed-size problem due to cache effects.
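Below is a minimal sketch of the communication pattern such a domain-partitioned solver needs, assuming a 1-D partition and one halo value per side; it illustrates the MPI exchange only, not the discontinuous Galerkin discretization itself.

```cpp
// Each rank owns a block of elements and swaps boundary ("halo")
// values with its neighbors each step.
#include <mpi.h>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int nLocal = 8;                         // elements per rank
    std::vector<double> u(nLocal, (double)rank);  // local solution values
    double leftHalo = 0.0, rightHalo = 0.0;
    int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    // MPI_Sendrecv avoids deadlock; MPI_PROC_NULL makes the domain
    // ends harmless no-ops.
    MPI_Sendrecv(&u[nLocal - 1], 1, MPI_DOUBLE, right, 0,
                 &leftHalo,      1, MPI_DOUBLE, left,  0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Sendrecv(&u[0],          1, MPI_DOUBLE, left,  1,
                 &rightHalo,     1, MPI_DOUBLE, right, 1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    std::printf("rank %d: halos %.1f | %.1f\n", rank, leftHalo, rightHalo);
    MPI_Finalize();
    return 0;
}
```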
Stewart, Eugene L; Brown, Peter J; Bentley, James A; Willson, Timothy M
2004-08-01
A methodology for the selection and validation of nuclear receptor ligand chemical descriptors is described. After descriptors for a targeted chemical space were selected, a virtual screening methodology utilizing this space was formulated for the identification of potential NR ligands from our corporate collection. Using simple descriptors and our virtual screening method, we are able to quickly identify potential NR ligands from a large collection of compounds. As validation of the virtual screening procedure, an 8,000-membered NR targeted set and a 24,000-membered diverse control set of compounds were selected from our in-house general screening collection and screened in parallel across a number of orphan NR FRET assays. For the two assays that provided at least one hit per set by the established minimum pEC(50) for activity, the results showed a 2-fold increase in the hit-rate of the targeted compound set over the diverse set.
Wrighton-Smith, Peter; Sneed, Laurie; Humphrey, Frances; Tao, Xuguang; Bernacki, Edward
2012-07-01
To determine the price point at which an interferon-γ release assay (IGRA) is less costly than a tuberculin skin test (TST) for health care employee tuberculosis screening. A multidecision tree-based cost model incorporating inputs gathered from time-motion studies and parallel testing by IGRA and TST was conducted in a subset of our employees. Administering a TST program costs $73.20 per person screened, $90.80 per new hire, and $63.42 per annual screen. Use of an IGRA for employee health testing is cost saving at an IGRA cost of $54.83 or less per test and results in higher completion rates because it eliminates the need for a second visit to interpret the TST. Using an IGRA for employee health screening can be an institutional cost saving and results in higher compliance rates.
An integrated miRNA functional screening and target validation method for organ morphogenesis.
Rebustini, Ivan T; Vlahos, Maryann; Packer, Trevor; Kukuruzinska, Maria A; Maas, Richard L
2016-03-16
The relative ease of identifying microRNAs and their increasing recognition as important regulators of organogenesis motivate the development of methods to efficiently assess microRNA function during organ morphogenesis. In this context, embryonic organ explants provide a reliable and reproducible system that recapitulates some of the important early morphogenetic processes during organ development. Here we present a method to target microRNA function in explanted mouse embryonic organs. Our method combines the use of peptide-based nanoparticles to transfect specific microRNA inhibitors or activators into embryonic organ explants, with a microRNA pulldown assay that allows direct identification of microRNA targets. This method provides effective assessment of microRNA function during organ morphogenesis, allows prioritization of multiple microRNAs in parallel for subsequent genetic approaches, and can be applied to a variety of embryonic organs.
Study of Lamb Waves for Non-Destructive Testing Behind Screens
NASA Astrophysics Data System (ADS)
Kauffmann, P.; Ploix, M.-A.; Chaix, J.-F.; Gueudré, C.; Corneloup, G.; Baqué, F.
2018-01-01
The inspection and control of sodium-cooled fast reactors (SFR) is a major issue for the nuclear industry. Ultrasonic solutions are under study because of the opacity of liquid sodium. In this paper, the use of leaky Lamb waves is considered for non-destructive testing (NDT) of parallel, immersed structures treated as plates. The first phase of our approach involved studying the propagation properties of leaky Lamb waves. Equations that model the propagation of Lamb waves in an immersed plate were solved numerically. The phase velocity can be measured experimentally using a two-dimensional Fourier transform, and the group velocity using a short-time Fourier transform technique. Attenuation of leaky Lamb waves is mostly due to the re-emission of energy into the surrounding fluid, and it can be measured by these two techniques.
Detecting Spatial Patterns in Biological Array Experiments
ROOT, DAVID E.; KELLEY, BRIAN P.; STOCKWELL, BRENT R.
2005-01-01
Chemical genetic screening and DNA and protein microarrays are among a number of increasingly important and widely used biological research tools that involve large numbers of parallel experiments arranged in a spatial array. It is often difficult to ensure that uniform experimental conditions are present throughout the entire array, and as a result, one often observes systematic spatially correlated errors, especially when array experiments are performed using robots. Here, the authors apply techniques based on the discrete Fourier transform to identify and quantify spatially correlated errors superimposed on a spatially random background. They demonstrate that these techniques are effective in identifying common spatially systematic errors in high-throughput 384-well microplate assay data. In addition, the authors employ a statistical test to allow for automatic detection of such errors. Software tools for using this approach are provided. PMID:14567791
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sofronov, I.D.; Voronin, B.L.; Butnev, O.I.
1997-12-31
The aim of this work is to develop a 3D parallel program for the numerical calculation of gas dynamics problems with heat conductivity on distributed-memory computational systems (CS), satisfying the condition that numerical results be independent of the number of processors involved. Two fundamentally different approaches to structuring massively parallel computations have been developed. The first approach uses a 3D data matrix decomposition reconstructed at each temporal cycle and builds on parallelization algorithms for multiprocessor CS with shared memory. The second approach is based on a 3D data matrix decomposition that is not reconstructed during a temporal cycle. The program was developed on the 8-processor CS MP-3 made at VNIIEF and was adapted to the massively parallel CS Meiko-2 at LLNL by joint efforts of the VNIIEF and LLNL staffs. A large number of numerical experiments has been carried out with different numbers of processors (up to 256), and parallelization efficiency has been evaluated as a function of processor number and parameters.
NASA Technical Reports Server (NTRS)
Agrawal, Gagan; Sussman, Alan; Saltz, Joel
1993-01-01
Scientific and engineering applications often involve structured meshes. These meshes may be nested (for multigrid codes) and/or irregularly coupled (called multiblock or irregularly coupled regular mesh problems). A combined runtime and compile-time approach for parallelizing these applications on distributed-memory parallel machines in an efficient and machine-independent fashion is described. A runtime library which can be used to port these applications to distributed-memory machines was designed and implemented. The library is currently implemented on several different systems. To further ease the task of application programmers, methods were developed for integrating this runtime library with compilers for HPF-like parallel programming languages. How this runtime library was integrated with the Fortran 90D compiler being developed at Syracuse University is discussed. Experimental results demonstrating the efficacy of our approach are presented for a multiblock Navier-Stokes solver template and a multigrid code. Our experimental results show that our primitives have low runtime communication overheads. Further, the compiler-parallelized codes perform within 20 percent of the code parallelized by manually inserting calls to the runtime library.
Peptides having reduced toxicity that stimulate cholesterol efflux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bielicki, John K.; Johansson, Jan; Danho, Waleed
The present invention provides a family of non-naturally occurring polypeptides having cholesterol efflux activity that parallels that of full-length apolipoproteins (e.g., Apo AI and Apo E), and having high selectivity for ABCA1 that parallels that of full-length apolipoproteins. Further, the peptides of the invention have little or no toxicity when administered at therapeutic and higher doses. The invention also provides compositions comprising such polypeptides, methods of identifying, screening and synthesizing such polypeptides, and methods of treating, preventing or diagnosing diseases and disorders associated with dyslipidemia, hypercholesterolemia and inflammation.
Shitanda, Isao; Momiyama, Misaki; Watanabe, Naoto; Tanaka, Tomohiro; Tsujimura, Seiya; Hoshi, Yoshinao; Itagaki, Masayuki
2017-10-01
A novel paper-based biofuel cell with a series/parallel array structure has been fabricated, in which the cell voltage and output power can easily be adjusted as required by printing. The output of the fabricated 4-series/4-parallel biofuel cell reached 0.97±0.02 mW at 1.4 V, which is the highest output power reported to date for a paper-based biofuel cell. This work contributes to the development of flexible, wearable energy storage devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Dan; Wang, Jun; Wang, Limin
An integrated lateral flow test strip with electrochemical sensor (LFTSES) device offering a rapid, selective and sensitive response for quantification of exposure to organophosphorus (OP) pesticides and nerve agents has been developed. The principle of this approach is based on parallel measurements of post-exposure and baseline acetylcholinesterase (AChE) enzyme activity, where reactivation of the phosphorylated AChE is exploited to enable measurement of the total amount of AChE (including inhibited and active), which is used as a baseline for calculation of AChE inhibition. Quantitative measurement of the phosphorylated adduct (OP-AChE) was realized by subtracting the active AChE from the total amount of AChE. The proposed LFTSES device integrates immunochromatographic test strip technology with electrochemical measurement using a disposable screen printed electrode located under the test zone. It shows a linear response between AChE enzyme activity and enzyme concentration from 0.05 to 10 nM, with a detection limit of 0.02 nM. Based on this reactivation approach, the LFTSES device has been successfully applied to in vitro red blood cell inhibition studies using chlorpyrifos oxon as a model OP agent. This approach not only eliminates the difficulty in screening low-dose OP exposure arising from individual variation in normal AChE values, but also avoids the problem of overlapping substrate specificity among cholinesterases and potential interference from other electroactive species in biological samples. It is baseline free and thus provides a rapid, sensitive, selective and inexpensive tool for in-field and point-of-care assessment of exposure to OP pesticides and nerve agents.
Engineered plant biomass feedstock particles
Dooley, James H [Federal Way, WA; Lanning, David N [Federal Way, WA; Broderick, Thomas F [Lake Forest Park, WA
2011-10-18
A novel class of flowable biomass feedstock particles with unusually large surface areas that can be manufactured in remarkably uniform sizes using low-energy comminution techniques. The feedstock particles are roughly parallelepiped in shape and characterized by a length dimension (L) aligned substantially with the grain direction and defining a substantially uniform distance along the grain, a width dimension (W) normal to L and aligned cross grain, and a height dimension (H) normal to W and L. The particles exhibit a disrupted grain structure with prominent end and surface checks that greatly enhances their skeletal surface area as compared to their envelope surface area. The L×H dimensions define a pair of substantially parallel side surfaces characterized by substantially intact longitudinally arrayed fibers. The W×H dimensions define a pair of substantially parallel end surfaces characterized by crosscut fibers and end checking between fibers. The L×W dimensions define a pair of substantially parallel top surfaces characterized by some surface checking between longitudinally arrayed fibers. At least 80% of the particles pass through a 1/4 inch screen having a 6.3 mm nominal sieve opening but are retained by a No. 10 screen having a 2 mm nominal sieve opening. The feedstock particles are manufactured from a variety of plant biomass materials including wood, crop residues, plantation grasses, hemp, bagasse, and bamboo.
Linker, Kevin L.; Conrad, Frank J.; Custer, Chad A.; Rhykerd, Jr., Charles L.
1998-01-01
An apparatus and method for preconcentrating particles and vapors. The preconcentrator apparatus permits detection of highly diluted amounts of particles in a main gas stream, such as a stream of ambient air. A main gas stream having airborne particles entrained therein is passed through a pervious screen. The particles accumulate upon the screen, as the screen acts as a sort of selective particle filter. The flow of the main gas stream is then interrupted by diaphragm shutter valves, whereupon a cross-flow of carrier gas stream is blown parallel past the faces of the screen to dislodge the accumulated particles and carry them to a particle or vapor detector, such as an ion mobility spectrometer. The screen may be heated, such as by passing an electrical current there through, to promote desorption of particles therefrom during the flow of the carrier gas. Various types of screens are disclosed. The apparatus and method of the invention may find particular utility in the fields of narcotics, explosives detection and chemical agents.
Linker, Kevin L.; Conrad, Frank J.; Custer, Chad A.; Rhykerd, Jr., Charles L.
2005-09-20
An apparatus and method for preconcentrating particles and vapors. The preconcentrator apparatus permits detection of highly diluted amounts of particles in a main gas stream, such as a stream of ambient air. A main gas stream having airborne particles entrained therein is passed through a pervious screen. The particles accumulate upon the screen, as the screen acts as a sort of selective particle filter. The flow of the main gas stream is then interrupted by diaphragm shutter valves, whereupon a cross-flow of carrier gas stream is blown parallel past the faces of the screen to dislodge the accumulated particles and carry them to a particle or vapor detector, such as an ion mobility spectrometer. The screen may be heated, such as by passing an electrical current there through, to promote desorption of particles therefrom during the flow of the carrier gas. Various types of screens are disclosed. The apparatus and method of the invention may find particular utility in the fields of narcotics, explosives detection and chemical agents.
Linker, Kevin L.; Conrad, Frank J.; Custer, Chad A.; Rhykerd, Jr., Charles L.
2000-01-01
An apparatus and method for preconcentrating particles and vapors. The preconcentrator apparatus permits detection of highly diluted amounts of particles in a main gas stream, such as a stream of ambient air. A main gas stream having airborne particles entrained therein is passed through a pervious screen. The particles accumulate upon the screen, as the screen acts as a sort of selective particle filter. The flow of the main gas stream is then interrupted by diaphragm shutter valves, whereupon a cross-flow of carrier gas stream is blown parallel past the faces of the screen to dislodge the accumulated particles and carry them to a particle or vapor detector, such as an ion mobility spectrometer. The screen may be heated, such as by passing an electrical current there through, to promote desorption of particles therefrom during the flow of the carrier gas. Various types of screens are disclosed. The apparatus and method of the invention may find particular utility in the fields of narcotics, explosives detection and chemical agents.
Linker, K.L.; Conrad, F.J.; Custer, C.A.; Rhykerd, C.L. Jr.
1998-12-29
An apparatus and method are disclosed for preconcentrating particles and vapors. The preconcentrator apparatus permits detection of highly diluted amounts of particles in a main gas stream, such as a stream of ambient air. A main gas stream having airborne particles entrained therein is passed through a pervious screen. The particles accumulate upon the screen, as the screen acts as a sort of selective particle filter. The flow of the main gas stream is then interrupted by diaphragm shutter valves, whereupon a cross-flow of carrier gas stream is blown parallel past the faces of the screen to dislodge the accumulated particles and carry them to a particle or vapor detector, such as an ion mobility spectrometer. The screen may be heated, such as by passing an electrical current there through, to promote desorption of particles therefrom during the flow of the carrier gas. Various types of screens are disclosed. The apparatus and method of the invention may find particular utility in the fields of narcotics, explosives detection and chemical agents. 3 figs.
Programming parallel architectures: The BLAZE family of languages
NASA Technical Reports Server (NTRS)
Mehrotra, Piyush
1988-01-01
Programming multiprocessor architectures is a critical research issue. An overview is given of the various approaches to programming these architectures that are currently being explored. It is argued that two of these approaches, interactive programming environments and functional parallel languages, are particularly attractive since they remove much of the burden of exploiting parallel architectures from the user. Also described is recent work by the author in the design of parallel languages. Research on languages for both shared and nonshared memory multiprocessors is described, as well as the relations of this work to other current language research projects.
Vascular system modeling in parallel environment - distributed and shared memory approaches
Jurczuk, Krzysztof; Kretowski, Marek; Bezy-Wendling, Johanne
2011-01-01
The paper presents two approaches to parallel modeling of vascular system development in internal organs. In the first approach, new parts of tissue are distributed among processors, and each processor is responsible for perfusing its assigned parts of tissue to all vascular trees. Communication between processors is accomplished by message passing, and therefore this algorithm is perfectly suited for distributed-memory architectures. The second approach is designed for shared-memory machines. It parallelizes the perfusion process, during which individual processing units perform calculations concerning different vascular trees. Experiments performed on a computing cluster and on multi-core machines show that both algorithms provide a significant speedup. PMID:21550891
NASA Astrophysics Data System (ADS)
Zimovets, Artem; Matviychuk, Alexander; Ushakov, Vladimir
2016-12-01
The paper presents two different approaches to reducing the computation time of reachability sets. The first approach uses different data structures for storing the reachability sets in computer memory, for calculation in single-threaded mode. The second approach is based on parallel algorithms operating on the data structures from the first approach. Within the framework of this paper, a parallel algorithm for approximate reachability set calculation on a computer with SMP architecture is proposed. The results of numerical modelling are presented in the form of tables which demonstrate the high efficiency of parallel computing technology and show how computing time depends on the data structure used.
Hierarchical screening for multiple mental disorders.
Batterham, Philip J; Calear, Alison L; Sunderland, Matthew; Carragher, Natacha; Christensen, Helen; Mackinnon, Andrew J
2013-10-01
There is a need for brief, accurate screening when assessing multiple mental disorders. Two-stage hierarchical screening, consisting of brief pre-screening followed by a battery of disorder-specific scales for those who meet diagnostic criteria, may increase the efficiency of screening without sacrificing precision. This study tested whether more efficient screening could be achieved using two-stage hierarchical screening than by administering multiple separate tests. Two Australian adult samples (N=1990) with high rates of psychopathology were recruited using Facebook advertising to examine four methods of hierarchical screening for four mental disorders: major depressive disorder, generalised anxiety disorder, panic disorder and social phobia. Using K6 scores to determine whether full screening was required did not increase screening efficiency. However, pre-screening based on two decision tree approaches or item gating led to considerable reductions in the mean number of items presented per disorder screened, with estimated item reductions of up to 54%. The sensitivity of these hierarchical methods approached 100% relative to the full screening battery. Further testing of the hierarchical screening approach based on clinical criteria and in other samples is warranted. The results demonstrate that a two-phase hierarchical approach to screening multiple mental disorders leads to considerable efficiency gains without reducing accuracy. Screening programs should take advantage of pre-screeners based on gating items or decision trees to reduce the burden on respondents. © 2013 Elsevier B.V. All rights reserved.
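A schematic of the item-gating logic follows, with invented items, cutoffs, and battery lengths standing in for the study's instruments: the full disorder-specific battery is presented only when the corresponding gate item exceeds its cutoff.

```cpp
// Two-stage gating: always administer the short gate items, then add
// each full battery only when its gate opens.
#include <cstdio>

struct Respondent { int gateItemDepression; int gateItemAnxiety; };

int fullDepressionBattery() { /* e.g. 9 items */ return 9; }
int fullAnxietyBattery()    { /* e.g. 7 items */ return 7; }

int itemsAdministered(const Respondent& r, int gateCutoff) {
    int items = 2;                               // the two gate items
    if (r.gateItemDepression >= gateCutoff) items += fullDepressionBattery();
    if (r.gateItemAnxiety    >= gateCutoff) items += fullAnxietyBattery();
    return items;
}

int main() {
    Respondent lowRisk{0, 1}, highRisk{3, 3};
    std::printf("low-risk respondent answers %d items\n",
                itemsAdministered(lowRisk, 2));   // gates closed: 2 items
    std::printf("high-risk respondent answers %d items\n",
                itemsAdministered(highRisk, 2));  // both gates open: 18
    return 0;
}
```

The efficiency gain reported above comes precisely from the low-risk path: most respondents answer only the gate items.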
Ascent control studies of the 049 and ATP parallel burn solid rocket motor shuttle configurations
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Mowery, D. K.; Hammer, M.; Weisler, A. C.
1972-01-01
Control authority is discussed as a major problem of the parallel burn solid shuttle configuration because of the many system impacts that result regardless of the approach chosen. The major trade studies and their results, which led to the recommendation of an SRB TVC control authority approach, are presented.
Dahl, Marie; Søgaard, Rikke; Frost, Lars; Høgh, Annette; Lindholt, Jes
2018-05-01
To investigate the effectiveness of systematic screening for multifaceted cardiovascular disease (CVD) in postmenopausal women on all cause mortality and, secondarily, on CVD morbidity. Effectiveness was also evaluated across age strata. This was a population based, prospective, parallel cohort study. In total, 107,491 women born in 1936-1951 living in the Central Denmark region were identified in the Danish Civil Registration System. From this population, all women born in 1936, 1941, 1946, and 1951 (n = 1984) living in the Viborg municipality were invited to attend screening. Of those invited to the screening, 1474 (74.3%) attended. The control group included all women from the general population born in 1936-1951 and living in the Central Denmark Region, excluding those invited for the screening. Information on medication and comorbidities prior to inclusion and on study outcomes was retrieved from national registries for both groups. The screening included examination for abdominal aortic aneurysm (AAA), peripheral arterial disease (PAD), carotid plaque (CP), potential hypertension (HT), atrial fibrillation (AF), diabetes mellitus (DM), and dyslipidaemia. An adjusted Cox proportional hazards model, following the intention to screen principle, was used to assess effectiveness for the total population and across age groups. During follow up (median 3.3 years, IQR 2.9-3.9), the adjusted hazard ratios (HRs) for invited versus controls were the following: all cause mortality, 0.89 (95% CI 0.71-1.12); myocardial infarction (MI), 1.26 (95% CI 0.52-3.07); ischaemic heart disease (IHD), 0.72 (95% CI 0.49-1.05); PAD, 1.07 (95% CI 0.49-2.31); and ischaemic stroke, 1.20 (95% CI 0.78-1.85). A substantial number of women with AAA, PAD, and/or CP declined prophylactic therapy: 45% for antiplatelet and 35% for cholesterol lowering agents. This multifaceted screening offer to a general population sample of postmenopausal women had no effect on all cause mortality or hospital admission for MI, IHD, PAD, and stroke within a short-term follow up period. Copyright © 2018 European Society for Vascular Surgery. Published by Elsevier B.V. All rights reserved.
Baumann, Pascal; Baumgartner, Kai; Hubbuch, Jürgen
2015-05-29
Hydrophobic interaction chromatography (HIC) is one of the most frequently used purification methods in the biopharmaceutical industry. A major drawback of HIC, however, is the rather low dynamic binding capacity (DBC) obtained compared to, e.g., ion exchange chromatography (IEX). The typical purification procedure for HIC includes binding at neutral pH, independently of the protein's nature and isoelectric point. Most approaches to process intensification are based on resin and salt screenings. In this paper, a combination of protein solubility data and varying binding pH leads to a clear enhancement of dynamic binding capacity. This is shown for three proteins of acidic, neutral, and alkaline isoelectric points. High-throughput solubility screenings as well as miniaturized and parallelized breakthrough curves on Media Scout RoboColumns (Atoll, Germany) were conducted at pH 3-10 on a fully automated robotic workstation. The screening results show a correlation between the DBC and the operational pH, the protein's isoelectric point, and the overall solubility. An inverse relationship between DBC in HIC and the binding kinetics was also observed. By changing the operational pH, the DBC could be increased up to 30% compared with the standard purification procedure performed at neutral pH. Because structural changes of proteins have been reported during HIC processes, the applied samples and the elution fractions were verified not to be irreversibly unfolded. Copyright © 2015 Elsevier B.V. All rights reserved.
Galan, Maxime; Pons, Jean-Baptiste; Tournayre, Orianne; Pierre, Éric; Leuchtmann, Maxime; Pontier, Dominique; Charbonnel, Nathalie
2018-05-01
Assessing diet variability is of major importance for better understanding the biology of bats and designing conservation strategies. Although the advent of metabarcoding has facilitated such analyses, this approach does not come without challenges. Biases may occur throughout the whole experiment, from fieldwork to biostatistics, resulting in the detection of false negatives, false positives or low taxonomic resolution. We detail a rigorous metabarcoding approach based on a short COI minibarcode and a two-step PCR protocol enabling the "all at once" taxonomic identification of bats and their arthropod prey for several hundred samples. Our study includes faecal pellets collected in France from 357 bats representing 16 species, as well as insect mock communities that mimic bat meals of known composition, and negative and positive controls. All samples were analysed using three replicates. We compare the efficiency of DNA extraction methods, and we evaluate the effectiveness of our protocol using identification success, taxonomic resolution, sensitivity and amplification biases. Our parallel identification strategy for predators and prey reduces the risk of mis-assigning prey to the wrong predators and decreases the number of molecular steps. Controls and replicates enable filtering of the data and limit the risk of false positives, hence guaranteeing high-confidence results for both prey occurrence and bat species identification. We validate 551 COI variants from arthropods spanning 18 orders, 117 families, 282 genera and 290 species. Our method therefore provides a rapid, resolutive and cost-effective screening tool for addressing evolutionary ecological issues or developing "chirosurveillance" and conservation strategies. © 2017 John Wiley & Sons Ltd.
COLA with scale-dependent growth: applications to screened modified gravity models
NASA Astrophysics Data System (ADS)
Winther, Hans A.; Koyama, Kazuya; Manera, Marc; Wright, Bill S.; Zhao, Gong-Bo
2017-08-01
We present a general parallelized and easy-to-use code to perform numerical simulations of structure formation using the COLA (COmoving Lagrangian Acceleration) method for cosmological models that exhibit scale-dependent growth at the level of first and second order Lagrangian perturbation theory. For modified gravity theories we also include screening using a fast approximate method that covers all the main examples of screening mechanisms in the literature. We test the code by comparing it to full simulations of two popular modified gravity models, namely f(R) gravity and nDGP, and find good agreement in the modified gravity boost-factors relative to ΛCDM even when using a fairly small number of COLA time steps.
Kasam, Vinod; Salzemann, Jean; Botha, Marli; Dacosta, Ana; Degliesposti, Gianluca; Isea, Raul; Kim, Doman; Maass, Astrid; Kenyon, Colin; Rastelli, Giulio; Hofmann-Apitius, Martin; Breton, Vincent
2009-05-01
Despite continuous efforts of the international community to reduce the impact of malaria on developing countries, no significant progress has been made in recent years and the discovery of new drugs is more than ever needed. Of the many proteins involved in the metabolic activities of the Plasmodium parasite, some are promising targets for rational drug discovery. Recent years have witnessed the emergence of grids, which are highly distributed computing infrastructures particularly well fitted for embarrassingly parallel computations like docking. In 2005, a first attempt at using grids for large-scale virtual screening focused on plasmepsins and ended in the identification of previously unknown scaffolds, which were confirmed in vitro to be active plasmepsin inhibitors. Following this success, a second deployment took place in the fall of 2006, focusing on one well known target, dihydrofolate reductase (DHFR), and on a new promising one, glutathione-S-transferase. In silico drug design, especially vHTS, is a widely accepted technology in lead identification and lead optimization. This approach therefore builds upon progress made in computational chemistry, to achieve more accurate in silico docking, and in information technology, to design and operate large-scale grid infrastructures. On the computational side, a sustained infrastructure has been developed: docking at large scale, different strategies for result analysis, on-the-fly storage of results in MySQL databases, application of molecular dynamics refinement, and MM-PBSA and MM-GBSA rescoring. The modeling results obtained are very promising, and in vitro validation is underway for all targets against which screening was performed. The current paper describes this rational drug discovery activity at large scale, in particular molecular docking using the FlexX software on computational grids, to find hits against three different targets (PfGST, PfDHFR, and PvDHFR, wild type and mutant forms) implicated in malaria. The grid-enabled virtual screening approach is proposed to produce focused compound libraries for other biological targets relevant to fighting the infectious diseases of the developing world.
Parallel approach for bioinspired algorithms
NASA Astrophysics Data System (ADS)
Zaporozhets, Dmitry; Zaruba, Daria; Kulieva, Nina
2018-05-01
In the paper, a probabilistic parallel approach based on a population heuristic, the genetic algorithm, is suggested. The authors propose using a multithreading approach at the micro level, at which new alternative solutions are generated. On each iteration, several threads can be started that independently use the same population to generate new solutions. After all threads finish, a selection operator combines the obtained results into a new population. To confirm the effectiveness of the suggested approach, the authors have developed software with which experimental computations can be carried out. The authors consider a classic optimization problem – finding a Hamiltonian cycle in a graph. Experiments show that, thanks to the parallel approach at the micro level, an increase in running speed can be obtained on graphs with 250 and more vertices.
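A minimal sketch of the described micro-level threading follows, shown on a toy real-valued fitness rather than the Hamiltonian-cycle problem; population sizes, operators, and coefficients are invented.

```cpp
// Micro-level parallel GA: threads independently generate offspring
// from the same (read-only) parent population; a selection step then
// merges all results into the next population.
#include <algorithm>
#include <cstdio>
#include <random>
#include <thread>
#include <vector>

double fitness(double x) { return -(x - 3.0) * (x - 3.0); }  // max at x = 3

int main() {
    const int popSize = 8, nThreads = 4, perThread = 8, generations = 30;
    std::vector<double> pop(popSize);
    std::mt19937 seeder(42);
    std::uniform_real_distribution<double> init(-10.0, 10.0);
    for (double& x : pop) x = init(seeder);

    for (int g = 0; g < generations; ++g) {
        std::vector<std::vector<double>> offspring(nThreads);
        auto worker = [&](int t) {             // reads shared population,
            std::mt19937 rng(1000 * g + t);    // writes only its own vector
            std::uniform_int_distribution<int> pick(0, popSize - 1);
            std::normal_distribution<double> mut(0.0, 0.5);
            for (int i = 0; i < perThread; ++i) {
                double a = pop[pick(rng)], b = pop[pick(rng)];
                offspring[t].push_back(0.5 * (a + b) + mut(rng)); // crossover + mutation
            }
        };
        std::vector<std::thread> pool;
        for (int t = 0; t < nThreads; ++t) pool.emplace_back(worker, t);
        for (auto& th : pool) th.join();

        std::vector<double> all(pop);          // selection merges all results
        for (auto& v : offspring) all.insert(all.end(), v.begin(), v.end());
        std::sort(all.begin(), all.end(),
                  [](double a, double b) { return fitness(a) > fitness(b); });
        all.resize(popSize);
        pop = all;
    }
    std::printf("best individual: %.4f (fitness %.4f)\n", pop[0], fitness(pop[0]));
    return 0;
}
```

Because threads only read the shared parent population and write disjoint offspring vectors, no locking is needed inside the generation step.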
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses.
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C; Altman, Sidney; Schwarz, Udo D; Kyriakides, Themis R; Schroers, Jan
2016-05-27
Metallic alloys are normally composed of multiple constituent elements in order to achieve the integration of a plurality of properties required in technological applications. However, the conventional alloy development paradigm, a sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Using a typical metallic glass forming alloy system, Zr-Cu-Al-Ag, as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system a fully glassy phase can be obtained in a wide compositional range by co-sputtering, and that antibacterial activity is strongly dependent on alloy composition. Our results indicate that antibacterial activity is sensitive to Cu and Ag while remaining essentially unchanged within a wide range of Zr and Al. The proposed strategy not only facilitates the development of high-performing alloys, but also provides a tool to unveil the composition dependence of properties in a highly parallel fashion, which helps the development of new materials by design.
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses
NASA Astrophysics Data System (ADS)
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B. Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C.; Altman, Sidney; Schwarz, Udo D.; Kyriakides, Themis R.; Schroers, Jan
2016-05-01
Metallic alloys are normally composed of multiple constituent elements in order to achieve the integration of a plurality of properties required in technological applications. However, the conventional alloy development paradigm, a sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Using a typical metallic glass forming alloy system, Zr-Cu-Al-Ag, as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system a fully glassy phase can be obtained in a wide compositional range by co-sputtering, and that antibacterial activity is strongly dependent on alloy composition. Our results indicate that antibacterial activity is sensitive to Cu and Ag while remaining essentially unchanged within a wide range of Zr and Al. The proposed strategy not only facilitates the development of high-performing alloys, but also provides a tool to unveil the composition dependence of properties in a highly parallel fashion, which helps the development of new materials by design.
Combinatorial development of antibacterial Zr-Cu-Al-Ag thin film metallic glasses
Liu, Yanhui; Padmanabhan, Jagannath; Cheung, Bettina; Liu, Jingbei; Chen, Zheng; Scanley, B. Ellen; Wesolowski, Donna; Pressley, Mariyah; Broadbridge, Christine C.; Altman, Sidney; Schwarz, Udo D.; Kyriakides, Themis R.; Schroers, Jan
2016-01-01
Metallic alloys are normally composed of multiple constituent elements in order to achieve the integration of a plurality of properties required in technological applications. However, the conventional alloy development paradigm, a sequential trial-and-error approach, requires completely unrelated strategies to optimize compositions out of a vast phase space, making alloy development time consuming and labor intensive. Here, we challenge the conventional paradigm by proposing a combinatorial strategy that enables parallel screening of a multitude of alloys. Using a typical metallic glass forming alloy system, Zr-Cu-Al-Ag, as an example, we demonstrate how glass formation and antibacterial activity, two unrelated properties, can be simultaneously characterized and the optimal composition efficiently identified. We found that in the Zr-Cu-Al-Ag alloy system a fully glassy phase can be obtained in a wide compositional range by co-sputtering, and that antibacterial activity is strongly dependent on alloy composition. Our results indicate that antibacterial activity is sensitive to Cu and Ag while remaining essentially unchanged within a wide range of Zr and Al. The proposed strategy not only facilitates the development of high-performing alloys, but also provides a tool to unveil the composition dependence of properties in a highly parallel fashion, which helps the development of new materials by design. PMID:27230692
Multiplexed microsatellite recovery using massively parallel sequencing
Jennings, T.N.; Knaus, B.J.; Mullins, T.D.; Haig, S.M.; Cronn, R.C.
2011-01-01
Conservation and management of natural populations requires accurate and inexpensive genotyping methods. Traditional microsatellite, or simple sequence repeat (SSR), marker analysis remains a popular genotyping method because of the comparatively low cost of marker development, ease of analysis and high power of genotype discrimination. With the availability of massively parallel sequencing (MPS), it is now possible to sequence microsatellite-enriched genomic libraries in multiplex pools. To test this approach, we prepared seven microsatellite-enriched, barcoded genomic libraries from diverse taxa (two conifer trees, five birds) and sequenced these on one lane of the Illumina Genome Analyzer using paired-end 80-bp reads. In this experiment, we screened 6.1 million sequences and identified 356,958 unique microreads that contained di- or trinucleotide microsatellites. Examination of four species shows that our conversion rate from raw sequences to polymorphic markers compares favourably to Sanger- and 454-based methods. The advantage of multiplexed MPS is that the staggering capacity of modern microread sequencing is spread across many libraries; this reduces sample preparation and sequencing costs to less than $400 (USD) per species. This price is sufficiently low that microsatellite libraries could be prepared and sequenced for all 1373 organisms listed as 'threatened' and 'endangered' in the United States for under $0.5M (USD).
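A minimal sketch of the core repeat-detection step is shown below, assuming reads are already quality-trimmed and demultiplexed; the motif lengths and minimum repeat counts are illustrative.

```cpp
// Scan a read for di- or trinucleotide repeats above a minimum count.
#include <cstdio>
#include <string>

// report maximal runs of a k-mer motif repeated >= minReps times
void findSSRs(const std::string& seq, int k, int minReps) {
    for (size_t i = 0; i + k <= seq.size(); ) {
        std::string motif = seq.substr(i, k);
        size_t j = i + k;
        while (j + k <= seq.size() && seq.compare(j, k, motif) == 0) j += k;
        int reps = (int)((j - i) / k);
        if (reps >= minReps) {
            std::printf("pos %zu: (%s)x%d\n", i, motif.c_str(), reps);
            i = j;                      // skip past this repeat run
        } else {
            ++i;
        }
    }
}

int main() {
    std::string read = "TTACACACACACACGGATGATGATGCCAT";
    findSSRs(read, 2, 5);   // dinucleotide repeats, >= 5 units
    findSSRs(read, 3, 3);   // trinucleotide repeats, >= 3 units
    return 0;
}
```

On the example read this reports (AC)x6 and (ATG)x3, the kind of loci that would then be screened for flanking-primer design.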
Everett, Jeremy R
2013-03-01
The 10th Anniversary International Drug Discovery Science and Technology (IDDST) Conference was held in Nanjing, China from 8 to 10 November 2012. The conference ran in parallel with the 2nd Annual Symposium of Drug Delivery Systems. Over 400 delegates from both conferences came together for the Opening Ceremony and Keynote Addresses but otherwise pursued separate paths in the huge facilities of the Nanjing International Expo Centre. The IDDST was arranged into 19 separate Chapters covering drug discovery biology, target validation, chemistry, rational drug design, pharmacology and toxicology, drug screening technology, 'omics' technologies, analytical, automation and enabling technologies, informatics, stem cells and regenerative medicine, bioprocessing, generics, biosimilars and biologicals and seven disease areas: cancer, CNS, respiratory and inflammation, autoimmune, emerging infectious, bone and orphan diseases. There were also two sessions of a 'Bench to Bedside to Business' Program and a Chinese Scientist programme. In each period of the IDDST conference, up to seven sessions were running in parallel. This Meeting Highlight samples just a fraction of the content of this large meeting. The talks included are linked by their use of new approaches to drug discovery. Many other excellent talks could have been highlighted, and the author has necessarily had to be selective.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Lawrence R.; Hoyt, David W.; Walker, S. Michael
We present a novel approach to improve the accuracy of metabolite identification by combining direct infusion ESI MS1 with 1D 1H NMR spectroscopy. The new approach first applies a standard 1D 1H NMR metabolite identification protocol, matching the chemical shift, J-coupling and intensity information of experimental NMR signals against the NMR signals of standard metabolites in a metabolomics library. This generates a list of candidate metabolites that contains false positive and ambiguous identifications. Next, we constrain the list with the chemical formulas derived from a high-resolution direct infusion ESI MS1 spectrum of the same sample. Detection of the signals of a metabolite in both NMR and MS significantly improves the confidence of identification and eliminates false positive identifications. The 1D 1H NMR and direct infusion ESI MS1 spectra of a sample can be acquired in parallel in several minutes. This is highly beneficial for rapid and accurate screening of hundreds of samples in high-throughput metabolomics studies. To make this approach practical, we developed a software tool, which is integrated into the Chenomx NMR Suite. The approach is demonstrated on a model mixture, tomato and Arabidopsis thaliana metabolite extracts, and human urine.
Integrated bioassays in microfluidic devices: botulinum toxin assays.
Mangru, Shakuntala; Bentz, Bryan L; Davis, Timothy J; Desai, Nitin; Stabile, Paul J; Schmidt, James J; Millard, Charles B; Bavari, Sina; Kodukula, Krishna
2005-12-01
A microfluidic assay was developed for screening botulinum neurotoxin serotype A (BoNT-A) by using a fluorescent resonance energy transfer (FRET) assay. Molded silicone microdevices with integral valves, pumps, and reagent reservoirs were designed and fabricated. Electrical and pneumatic control hardware were constructed, and software was written to automate the assay protocol and data acquisition. Detection was accomplished by fluorescence microscopy. The system was validated with a peptide inhibitor, running 2 parallel assays, as a feasibility demonstration. The small footprint of each bioreactor cell (0.5 cm2) and scalable fluidic architecture enabled many parallel assays on a single chip. The chip is programmable to run a dilution series in each lane, generating concentration-response data for multiple inhibitors. The assay results showed good agreement with the corresponding experiments done at a macroscale level. Although the system has been developed for BoNT-A screening, a wide variety of assays can be performed on the microfluidic chip with little or no modification.
Suprun, Elena V; Saveliev, Anatoly A; Evtugyn, Gennady A; Lisitsa, Alexander V; Bulko, Tatiana V; Shumyantseva, Victoria V; Archakov, Alexander I
2012-03-15
A novel direct antibody-free electrochemical approach for acute myocardial infarction (AMI) diagnosis has been developed. For this purpose, a combination of electrochemical assay of plasma samples with chemometrics was proposed. Screen printed carbon electrodes modified with didodecyldimethylammonium bromide were used for plasma characterization by cyclic voltammetry (CV) and square wave voltammetry (SWV). It was shown that the cathodic peak in voltammograms at about -250 mV vs. Ag/AgCl can be associated with AMI. In parallel tests, cardiac myoglobin and troponin I, the AMI biomarkers, were determined in each sample by RAMP immunoassay. The applicability of the electrochemical testing for AMI diagnostics was confirmed by statistical methods: a generalized linear model (GLM), linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), an artificial neural net (multi-layer perceptron, MLP), and a support vector machine (SVM), all of which were created to obtain a "True-False" prediction, where "True" and "False" are, respectively, positive and negative decisions about an illness event. Copyright © 2011 Elsevier B.V. All rights reserved.
A Parallel Particle Swarm Optimization Algorithm Accelerated by Asynchronous Evaluations
NASA Technical Reports Server (NTRS)
Venter, Gerhard; Sobieszczanski-Sobieski, Jaroslaw
2005-01-01
A parallel Particle Swarm Optimization (PSO) algorithm is presented. Particle swarm optimization is a fairly recent addition to the family of non-gradient based, probabilistic search algorithms that is based on a simplified social model and is closely tied to swarming theory. Although PSO algorithms present several attractive properties to the designer, they are plagued by high computational cost as measured by elapsed time. One approach to reduce the elapsed time is to make use of coarse-grained parallelization to evaluate the design points. Previous parallel PSO algorithms were mostly implemented in a synchronous manner, where all design points within a design iteration are evaluated before the next iteration is started. This approach leads to poor parallel speedup in cases where a heterogeneous parallel environment is used and/or where the analysis time depends on the design point being analyzed. This paper introduces an asynchronous parallel PSO algorithm that greatly improves the parallel efficiency. The asynchronous algorithm is benchmarked on a cluster assembled of Apple Macintosh G5 desktop computers, using the multi-disciplinary optimization of a typical transport aircraft wing as an example.
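A compact sketch of the asynchronous idea follows, assuming each worker thread owns a disjoint subset of the swarm and updates it continuously against whatever global best is current, with no iteration-wide barrier; the objective and coefficients are textbook placeholders, not the paper's setup.

```cpp
// Asynchronous PSO: no per-iteration barrier; each update reads the
// latest global best, which may change mid-"iteration".
#include <cstdio>
#include <mutex>
#include <random>
#include <thread>
#include <vector>

double objective(const std::vector<double>& x) {    // sphere, min at 0
    double s = 0; for (double v : x) s += v * v; return s;
}

int main() {
    const int dim = 4, nParticles = 16, nThreads = 4, rounds = 200;
    const double w = 0.72, c1 = 1.49, c2 = 1.49;

    std::vector<std::vector<double>> pos(nParticles, std::vector<double>(dim)),
        vel(nParticles, std::vector<double>(dim)), pbest(nParticles);
    std::vector<double> pbestVal(nParticles);
    std::vector<double> gbest(dim);
    double gbestVal = 1e300;
    std::mutex gmutex;                  // guards gbest / gbestVal only

    std::mt19937 seeder(7);
    std::uniform_real_distribution<double> u(-5.0, 5.0);
    for (int i = 0; i < nParticles; ++i) {
        for (int d = 0; d < dim; ++d) { pos[i][d] = u(seeder); vel[i][d] = 0; }
        pbest[i] = pos[i]; pbestVal[i] = objective(pos[i]);
        if (pbestVal[i] < gbestVal) { gbestVal = pbestVal[i]; gbest = pbest[i]; }
    }

    auto worker = [&](int t) {          // owns particles with i%nThreads==t
        std::mt19937 rng(100 + t);
        std::uniform_real_distribution<double> r01(0.0, 1.0);
        for (int it = 0; it < rounds; ++it)
            for (int i = t; i < nParticles; i += nThreads) {
                std::vector<double> g(dim);
                { std::lock_guard<std::mutex> lk(gmutex); g = gbest; }
                for (int d = 0; d < dim; ++d) {
                    vel[i][d] = w * vel[i][d]
                              + c1 * r01(rng) * (pbest[i][d] - pos[i][d])
                              + c2 * r01(rng) * (g[d] - pos[i][d]);
                    pos[i][d] += vel[i][d];
                }
                double f = objective(pos[i]);
                if (f < pbestVal[i]) {
                    pbestVal[i] = f; pbest[i] = pos[i];
                    std::lock_guard<std::mutex> lk(gmutex);
                    if (f < gbestVal) { gbestVal = f; gbest = pos[i]; }
                }
            }
    };
    std::vector<std::thread> pool;
    for (int t = 0; t < nThreads; ++t) pool.emplace_back(worker, t);
    for (auto& th : pool) th.join();
    std::printf("best objective found: %.6f\n", gbestVal);
    return 0;
}
```

Because no thread ever waits for the slowest evaluation of a round, heterogeneous evaluation times no longer serialize the swarm, which is the source of the speedup the abstract describes.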
Using CLIPS in the domain of knowledge-based massively parallel programming
NASA Technical Reports Server (NTRS)
Dvorak, Jiri J.
1994-01-01
The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with the C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.
Hybrid Parallelism for Volume Rendering on Large-, Multi-, and Many-Core Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howison, Mark; Bethel, E. Wes; Childs, Hank
2012-01-01
With the computing industry trending towards multi- and many-core processors, we study how a standard visualization algorithm, ray-casting volume rendering, can benefit from a hybrid parallelism approach. Hybrid parallelism provides the best of both worlds: using distributed-memory parallelism across a large number of nodes increases available FLOPs and memory, while exploiting shared-memory parallelism among the cores within each node ensures that each node performs its portion of the larger calculation as efficiently as possible. We demonstrate results from weak and strong scaling studies, at levels of concurrency ranging up to 216,000, and with datasets as large as 12.2 trillion cells. The greatest benefit from hybrid parallelism lies in the communication portion of the algorithm, the dominant cost at higher levels of concurrency. We show that reducing the number of participants with a hybrid approach significantly improves performance.
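A hedged sketch of the hybrid pattern, assuming mpi4py and an MPI launcher (e.g., mpirun -n 4 python script.py): distributed-memory ranks each own a block of the volume, threads render bricks within the block, and a reduction composites the partial images. The maximum-intensity projection stands in for full ray casting.

# Hybrid parallelism sketch: MPI across nodes, threads within a node.
import numpy as np
from mpi4py import MPI
from concurrent.futures import ThreadPoolExecutor

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

volume = np.random.rand(64, 256, 256)          # this rank's block of the data
bricks = np.array_split(volume, 4, axis=0)     # shared-memory work units

def render_brick(brick):
    # stand-in for per-thread ray casting over one brick
    return brick.max(axis=0)                   # maximum-intensity projection

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(render_brick, bricks))
local_image = np.maximum.reduce(partials)

# communication phase: composite partial images across ranks
recv = np.empty_like(local_image) if rank == 0 else None
comm.Reduce(local_image, recv, op=MPI.MAX, root=0)
if rank == 0:
    print(f"rank 0 of {size}: composited image shape {recv.shape}")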
Increasing airport capacity with modified IFR approach procedures for close-spaced parallel runways
DOT National Transportation Integrated Search
2001-01-01
Because of wake turbulence considerations, current instrument approach procedures treat close-spaced (i.e., less than 2,500 feet apart) parallel runways as a single runway. This restriction is designed to assure safety for all aircraft types u...
NASA Astrophysics Data System (ADS)
Chaves-González, José M.; Vega-Rodríguez, Miguel A.; Gómez-Pulido, Juan A.; Sánchez-Pérez, Juan M.
2011-08-01
This article analyses the use of a novel parallel evolutionary strategy to solve complex optimization problems. The work developed here focuses on a relevant real-world problem from the telecommunications domain to verify the effectiveness of the approach. The problem, known as the frequency assignment problem (FAP), basically consists of assigning a very small number of frequencies to a very large set of transceivers used in a cellular phone network. Real-data FAP instances are very difficult to solve due to the NP-hard nature of the problem; therefore, an efficient parallel approach that makes the most of different evolutionary strategies is a good way to obtain high-quality solutions in short periods of time. Specifically, a parallel hyper-heuristic based on several meta-heuristics has been developed. After a complete experimental evaluation, the results show that the proposed approach obtains very high-quality solutions for the FAP and beats any other published result.
NASA Technical Reports Server (NTRS)
Waller, Marvin C. (Editor); Scanlon, Charles H. (Editor)
1996-01-01
A Government and Industry workshop on Flight-Deck-Centered Parallel Runway Approaches in Instrument Meteorological Conditions (IMC) was conducted on October 29, 1996, at the NASA Langley Research Center. This document contains the slides and records of the proceedings of the workshop. The purpose of the workshop was to disclose to the national airspace community the status of ongoing NASA R&D addressing the closely spaced parallel runway problem in IMC and to seek advice and input on the direction of future work to assure an optimized research approach. The workshop also included a description of a Paired Approach Concept being studied at United Airlines for application at San Francisco International Airport.
An Analysis of the Role of ATC in the AILS Concept
NASA Technical Reports Server (NTRS)
Waller, Marvin C.; Doyle, Thomas M.; McGee, Frank G.
2000-01-01
Airborne information for lateral spacing (AILS) is a concept for making approaches to closely spaced parallel runways in instrument meteorological conditions (IMC). Under the concept, each equipped aircraft will assume responsibility for accurately managing its flight path along the approach course and maintaining separation from aircraft on the parallel approach. This document presents the results of an analysis of the AILS concept from an Air Traffic Control (ATC) perspective. The process was examined in a step-by-step manner to determine the ATC system support necessary to safely conduct closely spaced parallel approaches using the AILS concept. The analysis identified a number of issues related to integrating the process into the airspace system and proposes operating procedures.
Colorectal Cancer Deaths Attributable to Nonuse of Screening in the United States
Meester, Reinier G.S.; Doubeni, Chyke A.; Lansdorp-Vogelaar, Iris; Goede, S.L.; Levin, Theodore R.; Quinn, Virginia P.; van Ballegooijen, Marjolein; Corley, Douglas A.; Zauber, Ann G.
2015-01-01
Purpose: Screening is a major contributor to colorectal cancer (CRC) mortality reductions in the U.S., but is underutilized. We estimated the fraction of CRC deaths attributable to nonuse of screening to demonstrate the potential benefits from targeted interventions. Methods: The established MISCAN-colon microsimulation model was used to estimate the population attributable fraction (PAF) in people aged ≥50 years. The model incorporates long-term patterns and effects of screening by age and type of screening test. PAF for 2010 was estimated using currently available data on screening uptake; PAF was also projected assuming constant future screening rates to incorporate lagged effects from past increases in screening uptake. We also computed PAF using Levin's formula to gauge how this simpler approach differs from the model-based approach. Results: There were an estimated 51,500 CRC deaths in 2010, about 63% (N∼32,200) of which were attributable to non-screening. The PAF decreases slightly to 58% in 2020. Levin's approach yielded a considerably more conservative PAF of 46% (N∼23,600) for 2010. Conclusions: The majority of current U.S. CRC deaths are attributable to non-screening. This underscores the potential benefits of increasing screening uptake in the population. Traditional methods of estimating PAF underestimated screening effects compared with model-based approaches. PMID:25721748
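For reference, Levin's formula mentioned above can be computed directly. In the sketch below the prevalence and relative risk are hypothetical placeholders, not the paper's inputs, although the 51,500 deaths figure is taken from the abstract.

# Levin's formula for the population attributable fraction:
#   PAF = p * (RR - 1) / (1 + p * (RR - 1))
def levin_paf(p_exposed: float, rr: float) -> float:
    return p_exposed * (rr - 1.0) / (1.0 + p_exposed * (rr - 1.0))

p = 0.40      # hypothetical prevalence of non-screening
rr = 3.0      # hypothetical relative risk of CRC death if not screened
paf = levin_paf(p, rr)
deaths = 51_500
print(f"PAF = {paf:.0%}, attributable deaths ~ {paf * deaths:,.0f}")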
Interaction of Vortex Rings and Steady Jets with Permeable Screens of Varied Porosity
NASA Astrophysics Data System (ADS)
Musta, Mustafa
2013-11-01
Vortex ring and steady jet interaction with a porous matrix formed from several parallel, transparent permeable screens with the same grid geometry, for open area ratios (φ) of 49.5%-83.8%, was studied previously using digital particle image velocimetry (DPIV) at jet Reynolds numbers (Re) of 1000-3000. The vortex ring results showed that, unlike experiments with thin screens, a transmitted vortex ring with a diameter similar to the primary one was not formed. Instead, a centerline vortex-ring-like structure formed, and its diameter, circulation, and dissipation time decreased as φ decreased. However, for screens of φ = 55.7% with large screen spacing, reformation of large-scale weak vortex rings was observed downstream of the first screen. The present work experimentally investigates the interaction of vortex rings and steady jets with screens of decreasing φ (83.8%-49.5%) in the flow direction. A piston-type vortex ring generator was used and measurements were made using DPIV. The vortex ring results show that the size and circulation of the vortex-ring-like flow structure changed with the screen φ within the permeable screen matrix. Similarly, the steady jet flow structure and the local turbulent kinetic energy changed with the local screen φ.
Real-Time Monitoring of Scada Based Control System for Filling Process
NASA Astrophysics Data System (ADS)
Soe, Aung Kyaw; Myint, Aung Naing; Latt, Maung Maung; Theingi
2008-10-01
This paper presents a design for real-time monitoring of a filling system using Supervisory Control and Data Acquisition (SCADA). The monitoring of the production process is described in real time using Visual Basic.NET programming under Visual Studio 2005, without dedicated SCADA software. The software integrators are programmed to get the required information for the configuration screens. Simulation of the components is displayed on the computer screen, with a parallel port linking the computer and the filling devices. Programs for real-time simulation of the filling process from the pure drinking water industry are provided.
Dong, Suwei; Cahill, Katharine J.; Kang, Moon-Il; Colburn, Nancy H.; Henrich, Curtis J.; Wilson, Jennifer A.; Beutler, John A.; Johnson, Richard P.; Porco, John A.
2011-01-01
We have accomplished a parallel screen of cycloaddition partners for ortho-quinols utilizing a plate-based microwave system. Microwave irradiation improves the efficiency of retro-Diels-Alder/Diels-Alder cascades of ortho-quinol dimers which generally proceed in a diastereoselective fashion. Computational studies indicate that asynchronous transition states are favored in Diels-Alder cycloadditions of ortho-quinols. Subsequent biological evaluation of a collection of cycloadducts has identified an inhibitor of activator protein-1 (AP-1), an oncogenic transcription factor. PMID:21942286
Two-Piece Screens for Decontaminating Granular Material
NASA Technical Reports Server (NTRS)
Backes, Douglas; Poulter, Clay; Godfrey, Max; Dutton, Melinda; Tolman, Dennis
2009-01-01
Two-piece screens have been designed specifically for use in filtering a granular material to remove contaminant particles that are significantly wider or longer than the desired granules. In the original application for which the two-piece screens were conceived, the granular material is ammonium perchlorate and the contaminant particles tend to be wires and other relatively long, rigid strands. The basic design of the two-piece screens can be adapted to other granular materials and contaminants by modifying critical dimensions to accommodate different grain and contaminant-particle sizes. A two-piece screen of this type consists mainly of (1) a top flat plate perforated with circular holes arranged in a hexagonal pattern and (2) a bottom plate that is also perforated with circular holes (but not in a pure hexagonal pattern) and is folded into an accordion structure. Fabrication of the bottom plate begins with drilling circular holes into a flat plate in a hexagonal pattern that is interrupted, at regular intervals, by parallel gaps. The plate is then folded into the accordion structure along the gaps. Because the folds are along the gaps, there are no holes at the peaks and valleys of the accordion screen. The top flat plate and the bottom accordion plate are secured within a metal frame. The resulting two-piece screen is placed at the bottom opening of a feed hopper containing the granular material to be filtered. Tests have shown that long, rigid contaminant strands such as wires can readily pass through a filter consisting of the flat screen alone, and that the addition of the accordion screen below the flat screen greatly increases the effectiveness of removal of wires and other contaminant strands. Part of the reason for the increased effectiveness lies in the presentation of the contaminant to the filter surface. Testing has shown that wire-type contamination readily aligns itself parallel to the direction of material flow. Since this direction of flow is nearly always perpendicular to the filter surface holes, the contamination is automatically aligned to pass through. The two-filter configuration reduces the likelihood that a given contaminant strand will be aligned with the flow of material by eliminating the perpendicular presentation angle. Thus, for wires of a certain diameter, a two-piece screen is 20 percent more effective than the corresponding flat perforated plate alone, even if the holes in the flat plate are narrower. An accordion screen alone is similarly effective in catching contaminants, but lumps of agglomerated granules of the desired material often collect in the valleys and clog the screen. The addition of a flat screen above the accordion screen prevents clogging of the accordion screen. Flat wire screens have often been used to remove contaminants from granular materials and are about as effective as the corresponding perforated flat plates used alone.
Petit, Charlotte; Bujard, Alban; Skalicka-Woźniak, Krystyna; Cretton, Sylvian; Houriet, Joëlle; Christen, Philippe; Carrupt, Pierre-Alain; Wolfender, Jean-Luc
2016-03-01
At the early drug discovery stage, the high-throughput parallel artificial membrane permeability assay is one of the most frequently used in vitro models to predict transcellular passive absorption. While thousands of new chemical entities have been screened with the parallel artificial membrane permeability assay, in general, the permeation properties of natural products have been scarcely evaluated. In this study, the parallel artificial membrane permeability assay through a hexadecane membrane was used to predict the passive intestinal absorption of a representative set of frequently occurring natural products. Since natural products are usually ingested for medicinal use as components of complex extracts in traditional herbal preparations or as phytopharmaceuticals, the applicability of such an assay to study the constituents directly in medicinal crude plant extracts was further investigated. Three representative crude plant extracts with different natural product compositions were chosen for this study. The first extract was composed of furanocoumarins (Angelica archangelica), the second extract included alkaloids (Waltheria indica), and the third extract contained flavonoid glycosides (Pueraria montana var. lobata). For each medicinal plant, the effective passive permeability values Pe (cm/s) of the main natural products of interest were rapidly calculated thanks to a generic ultrahigh-pressure liquid chromatography-UV detection method, since Pe calculations do not require precise knowledge of the concentration of each natural product within the extracts. The original parallel artificial membrane permeability assay through a hexadecane membrane was found to keep its predictive power when applied to constituents directly in crude plant extracts, provided that higher quantities of the extract were initially loaded in the assay in order to ensure suitable detection of the individual constituents of the extracts. Such an approach is thus valuable for the high-throughput, cost-effective, and early evaluation of passive intestinal absorption of active principles in medicinal plants. In phytochemical studies, obtaining effective passive permeability values of pharmacologically active natural products is important to predict whether natural products showing interesting activities in vitro may have a chance to reach their target in vivo.
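A sketch of the permeability calculation under one commonly used PAMPA expression (an assumption; the abstract does not give the paper's exact equation). Only the ratio C_A(t)/C_eq enters, which is why relative UV peak areas suffice; the volumes, filter area, and incubation time below are illustrative.

# One common PAMPA permeability expression (sketch):
#   Pe = -(V_D*V_A) / ((V_D+V_A)*A*t) * ln(1 - C_A(t)/C_eq)
import math

def pampa_pe(area_acceptor, area_equilibrium, v_d=0.3, v_a=0.3, a=0.3, t=4*3600):
    """v_d, v_a in cm^3; a (filter area) in cm^2; t in s; returns Pe in cm/s."""
    ratio = area_acceptor / area_equilibrium   # C_A(t)/C_eq from UV peak areas
    return -(v_d * v_a) / ((v_d + v_a) * a * t) * math.log(1.0 - ratio)

print(f"Pe = {pampa_pe(120.0, 400.0):.2e} cm/s")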
Binh, Chu Thi Thanh; Tong, Tiezheng; Gaillard, Jean-François; Gray, Kimberly A; Kelly, John J
2014-01-01
The nanotechnology industry is growing rapidly, leading to concerns about the potential ecological consequences of the release of engineered nanomaterials (ENMs) to the environment. One challenge of assessing the ecological risks of ENMs is the incredible diversity of ENMs currently available and the rapid pace at which new ENMs are being developed. High-throughput screening (HTS) is a popular approach to assessing ENM cytotoxicity that offers the opportunity to rapidly test in parallel a wide range of ENMs at multiple concentrations. However, current HTS approaches generally test one cell type at a time, which limits their ability to predict responses of complex microbial communities. In this study toxicity screening via a HTS platform was used in combination with next generation sequencing (NGS) to assess responses of bacterial communities from two aquatic habitats, Lake Michigan (LM) and the Chicago River (CR), to short-term exposure in their native waters to several commercial TiO2 nanomaterials under simulated solar irradiation. Results demonstrate that bacterial communities from LM and CR differed in their sensitivity to nano-TiO2, with the community from CR being more resistant. NGS analysis revealed that the composition of the bacterial communities from LM and CR were significantly altered by exposure to nano-TiO2, including decreases in overall bacterial diversity, decreases in the relative abundance of Actinomycetales, Sphingobacteriales, Limnohabitans, and Flavobacterium, and a significant increase in Limnobacter. These results suggest that the release of nano-TiO2 to the environment has the potential to alter the composition of aquatic bacterial communities, which could have implications for the stability and function of aquatic ecosystems. The novel combination of HTS and NGS described in this study represents a major advance over current methods for assessing ENM ecotoxicity because the relative toxicities of multiple ENMs to thousands of naturally occurring bacterial species can be assessed simultaneously under environmentally relevant conditions.
Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)
2002-01-01
A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The super scalability and portability of the approach are demonstrated on several parallel computers.
Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Byun, Chansup; Kwak, Dochan (Technical Monitor)
2001-01-01
A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The super scalability and portability of the approach are demonstrated on several parallel computers.
Operation of high power converters in parallel
NASA Technical Reports Server (NTRS)
Decker, D. K.; Inouye, L. Y.
1993-01-01
High power converters that are used in space power subsystems are limited in power handling capability due to component and thermal limitations. For applications such as Space Station Freedom, where multiple kilowatts of power must be delivered to user loads, parallel operation of converters becomes an attractive option when considering overall power subsystem topologies. TRW developed three different unequal power sharing approaches for parallel operation of converters. These approaches, known as droop, master-slave, and proportional adjustment, are discussed and test results are presented.
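As an illustration of the droop approach named above, the sketch below solves for the common bus voltage at which the unit currents of two droop-controlled converters sum to the load current; the set-points and droop resistances are illustrative values, not TRW's design parameters.

# Droop load sharing: each unit follows V = Vset - Rd*I, and the bus voltage
# settles where the unit currents sum to the load current.
def droop_share(vsets, rds, i_load):
    g = [1.0 / r for r in rds]                     # droop conductances
    v_bus = (sum(v * gi for v, gi in zip(vsets, g)) - i_load) / sum(g)
    return v_bus, [(v - v_bus) / r for v, r in zip(vsets, rds)]

v_bus, currents = droop_share(vsets=[28.0, 28.2], rds=[0.05, 0.10], i_load=40.0)
print(f"bus = {v_bus:.2f} V, unit currents = {currents}")   # currents sum to 40 A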
Parallelized seeded region growing using CUDA.
Park, Seongjin; Lee, Jeongjin; Lee, Hyunna; Shin, Juneseuk; Seo, Jinwook; Lee, Kyoung Ho; Shin, Yeong-Gil; Kim, Bohyoung
2014-01-01
This paper presents a novel method for parallelizing the seeded region growing (SRG) algorithm using Compute Unified Device Architecture (CUDA) technology, with the intention of overcoming the theoretical weakness of the SRG algorithm, namely that its computation time is directly proportional to the size of the segmented region. The segmentation performance of the proposed CUDA-based SRG is compared with SRG implementations on single-core CPUs, quad-core CPUs, and shader language programming, using synthetic datasets and 20 body CT scans. Based on the experimental results, the CUDA-based SRG outperforms the other three implementations, suggesting that it can substantially assist segmentation during massive CT screening tests.
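A minimal sketch of seeded region growing written in the data-parallel style a CUDA implementation exploits: every frontier pixel is tested in the same pass rather than one pixel at a time. This NumPy stand-in is illustrative only; the paper's kernel-level details are not reproduced.

# Frontier-based seeded region growing (data-parallel formulation).
import numpy as np

def srg(image, seed, tol=10.0):
    h, w = image.shape
    region = np.zeros((h, w), dtype=bool)
    region[seed] = True
    seed_val = image[seed]
    while True:
        # dilate the current region by one pixel in the four axial directions
        frontier = np.zeros_like(region)
        frontier[1:, :] |= region[:-1, :]
        frontier[:-1, :] |= region[1:, :]
        frontier[:, 1:] |= region[:, :-1]
        frontier[:, :-1] |= region[:, 1:]
        frontier &= ~region
        grow = frontier & (np.abs(image - seed_val) <= tol)
        if not grow.any():          # no frontier pixel passed the criterion
            return region
        region |= grow              # all accepted pixels join in one pass

img = np.random.default_rng(2).normal(100, 3, (64, 64))
print(srg(img, (32, 32)).sum(), "pixels in region")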
Debugging Fortran on a shared memory machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, T.R.; Padua, D.A.
1987-01-01
Debugging on a parallel processor is more difficult than debugging on a serial machine because errors in a parallel program may introduce nondeterminism. The approach to parallel debugging presented here attempts to reduce the problem of debugging on a parallel machine to that of debugging on a serial machine by automatically detecting nondeterminism.
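As a toy illustration of automatic nondeterminism detection, the sketch below applies the standard conflict rule to a hypothetical access trace: two accesses to the same location from different threads conflict if at least one is a write. The trace format is invented for illustration.

# Flag potentially nondeterministic (racy) access pairs in a trace.
trace = [
    ("t1", "x", "write"), ("t2", "x", "read"),
    ("t1", "y", "read"),  ("t2", "y", "read"),
]
conflicts = [
    (a, b) for i, a in enumerate(trace) for b in trace[i+1:]
    if a[1] == b[1] and a[0] != b[0] and "write" in (a[2], b[2])
]
print(conflicts)   # one conflicting pair on "x" -> potential nondeterminism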
NASA Astrophysics Data System (ADS)
Akil, Mohamed
2017-05-01
Real-time processing is becoming more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis, and many different approaches for image segmentation have been proposed. The watershed transform is a well-known image segmentation tool and a very data-intensive task. To accelerate watershed algorithms toward real-time processing, parallel architectures and programming models for multicore computing have been developed. This paper surveys approaches for the parallel implementation of sequential watershed algorithms on multicore general-purpose CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, it is necessary to explore different strategies (parallelization/distribution/distributed scheduling) combined with different acceleration and optimization techniques to enhance parallelism. We compare various parallelizations of sequential watershed algorithms on shared-memory multicore architectures, analyze the performance measurements of each parallel implementation, and examine the impact of the different sources of overhead on performance. In this comparison study, we also discuss the advantages and disadvantages of the parallel programming models, comparing OpenMP (an application programming interface for multiprocessing) with Pthreads (POSIX threads) to illustrate the impact of each programming model on the performance of the parallel implementations.
High-throughput Titration of Luciferase-expressing Recombinant Viruses
Garcia, Vanessa; Krishnan, Ramya; Davis, Colin; Batenchuk, Cory; Le Boeuf, Fabrice; Abdelbary, Hesham; Diallo, Jean-Simon
2014-01-01
Standard plaque assays to determine infectious viral titers can be time consuming, are not amenable to a high volume of samples, and cannot be done with viruses that do not form plaques. As an alternative to plaque assays, we have developed a high-throughput titration method that allows for the simultaneous titration of a high volume of samples in a single day. This approach involves infection of the samples with a Firefly luciferase tagged virus, transfer of the infected samples onto an appropriate permissive cell line, subsequent addition of luciferin, reading of plates in order to obtain luminescence readings, and finally the conversion from luminescence to viral titers. The assessment of cytotoxicity using a metabolic viability dye can be easily incorporated in the workflow in parallel and provide valuable information in the context of a drug screen. This technique provides a reliable, high-throughput method to determine viral titers as an alternative to a standard plaque assay. PMID:25285536
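A sketch of the final conversion step, assuming a log-log linear standard curve relating luminescence to titer; the dilution-series numbers are invented for illustration.

# Fit a standard curve from known standards, then convert sample readings.
import numpy as np

std_titer = np.array([1e3, 1e4, 1e5, 1e6, 1e7])            # PFU/mL, standards
std_lum   = np.array([2.1e2, 1.9e3, 2.2e4, 2.0e5, 1.8e6])  # RLU readings

slope, intercept = np.polyfit(np.log10(std_lum), np.log10(std_titer), 1)

def lum_to_titer(rlu):
    return 10 ** (slope * np.log10(rlu) + intercept)

print(f"{lum_to_titer(5.0e4):.2e} PFU/mL")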
Identifying gnostic predictors of the vaccine response.
Haining, W Nicholas; Pulendran, Bali
2012-06-01
Molecular predictors of the response to vaccination could transform vaccine development. They would allow larger numbers of vaccine candidates to be rapidly screened, shortening the development time for new vaccines. Gene-expression based predictors of vaccine response have shown early promise. However, a limitation of gene-expression based predictors is that they often fail to reveal the mechanistic basis of their ability to classify response. Linking predictive signatures to the function of their component genes would advance basic understanding of vaccine immunity and also improve the robustness of vaccine prediction. New analytic tools now allow more biological meaning to be extracted from predictive signatures. Functional genomic approaches to perturb gene expression in mammalian cells permit the function of predictive genes to be surveyed in highly parallel experiments. The challenge for vaccinologists is therefore to use these tools to embed mechanistic insights into predictors of vaccine response.
Refolding strategies from inclusion bodies in a structural genomics project.
Trésaugues, Lionel; Collinet, Bruno; Minard, Philippe; Henckes, Gilles; Aufrère, Robert; Blondeau, Karine; Liger, Dominique; Zhou, Cong-Zhao; Janin, Joël; Van Tilbeurgh, Herman; Quevillon-Cheruel, Sophie
2004-01-01
The South-Paris Yeast Structural Genomics Project aims at systematically expressing, purifying and determining the structure of S. cerevisiae proteins with no detectable homology to proteins of known structure. We brought 250 yeast ORFs to expression in E. coli, but 37% of them form inclusion bodies. This important fraction of proteins that are well expressed but lost for structural studies prompted us to test methodologies to recover these proteins. Three different strategies were explored in parallel on a set of 20 proteins: (1) refolding from solubilized inclusion bodies using an original and fast 96-well plate screening test, (2) co-expression of the targets in E. coli with the DnaK-DnaJ-GrpE and GroEL-GroES chaperones, and (3) use of a cell-free expression system. Most of the tested proteins (17/20) could be resolubilized by at least one approach, but the subsequent purification proved difficult for most of them.
Microfluidic droplet trapping array as nanoliter reactors for gas-liquid chemical reaction.
Zhang, Qingquan; Zeng, Shaojiang; Qin, Jianhua; Lin, Bingcheng
2009-09-01
This article presents a simple method for trapping arrays of droplets that relies on designed microstructures in a microfluidic device, and this method has been successfully used for parallel gas-liquid chemical reactions. In this approach, the trapping structure is composed of a main channel, a lateral channel, and a trapping region. Under negative pressure, droplet arrays can be generated and trapped in the microstructure simultaneously, without the use of surfactant or precise control of the flow velocity. By using a multi-layer microdevice containing the microstructures, single (pH gradient) and multiple gas-liquid reactions (metal ion-NH3 complex reactions) can be performed in the droplet arrays through transmembrane diffusion of the gas. Droplets with a quantitative concentration gradient can be formed simply by replacing the specific membrane. The established method is simple, robust, and easy to operate, demonstrating the potential of this device for droplet-based high-throughput screening.
Crescentini, Marco; Thei, Frederico; Bennati, Marco; Saha, Shimul; de Planque, Maurits R R; Morgan, Hywel; Tartagni, Marco
2015-06-01
Lipid bilayer membrane (BLM) arrays are required for high-throughput analysis, for example, drug screening or advanced DNA sequencing. Complex microfluidic devices are being developed, but these are restricted in terms of array size and structure or have integrated electronic sensing with limited noise performance. We present a compact and scalable multichannel electrophysiology platform based on a hybrid approach that combines integrated state-of-the-art microelectronics with low-cost disposable fluidics, providing a platform for high-quality parallel single-ion-channel recording. Specifically, we have developed a new integrated circuit amplifier based on a novel noise cancellation scheme that eliminates flicker noise derived from the devices under test and the amplifiers. The system is demonstrated through the simultaneous recording of ion channel activity from eight bilayer membranes. The platform is scalable and could be extended to much larger array sizes, limited only by electronic data decimation and communication capabilities.
RATIONALIZED AND COMPLEMENTARY FINDINGS OF SILYMARIN (MILK THISTLE) IN PAKISTANI HEALTHY VOLUNTEERS.
Ashraf, Muhammad; Abid, Farah; Riffat, Sualeha; Bashir, Sajid; Iqbal, Javed; Sarfraz, Muhammad; Afzal, Attia; Zaheer, Muhammad
2015-01-01
The aim of the work was to examine the influence of gender on the pharmacokinetics of silymarin, a basic constituent of the medicinal herb "milk thistle" (Silybum marianum). The presented work is an extension of the published work of Usman et al. (16). A comparative parallel-design pharmacokinetic study was conducted in Pakistani healthy volunteers (male and female) receiving a single 200 mg oral dose of silymarin. Sixteen subjects (8 males and 8 females) were enrolled and completed the 12 h study. Blood screening was done by HPLC, and the pharmacokinetic parameters were calculated with APO software (ver. 3.2) using non-compartmental and two-compartment model approaches. A significant difference (p < 0.05) was observed in almost all calculated pharmacokinetic parameters of silymarin between males and females. Clinically, silymarin has been underestimated in the previous study. Gender-based clinical investigations should be directed in the future at other flavonolignans of milk thistle as well.
GPU Accelerated Chemical Similarity Calculation for Compound Library Comparison
Ma, Chao; Wang, Lirong; Xie, Xiang-Qun
2012-01-01
Chemical similarity calculation plays an important role in compound library design, virtual screening, and “lead” optimization. In this manuscript, we present a novel GPU-accelerated algorithm for all-vs-all Tanimoto matrix calculation and nearest neighbor search. By taking advantage of multi-core GPU architecture and CUDA parallel programming technology, the algorithm is up to 39 times superior to the existing commercial software that runs on CPUs. Because of the utilization of intrinsic GPU instructions, this approach is nearly 10 times faster than existing GPU-accelerated sparse vector algorithm, when Unity fingerprints are used for Tanimoto calculation. The GPU program that implements this new method takes about 20 minutes to complete the calculation of Tanimoto coefficients between 32M PubChem compounds and 10K Active Probes compounds, i.e., 324G Tanimoto coefficients, on a 128-CUDA-core GPU. PMID:21692447
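A small NumPy sketch of the underlying arithmetic: on 0/1 fingerprints, the all-vs-all intersection counts come from one matrix product, and the Tanimoto matrix follows elementwise; a CUDA version maps the same operations onto GPU threads. Sizes here are toys, not the 32M-compound PubChem run described above.

# All-vs-all Tanimoto coefficients via a matrix product on 0/1 fingerprints.
import numpy as np

rng = np.random.default_rng(3)
A = rng.integers(0, 2, (1000, 1024)).astype(np.float32)  # library fingerprints
B = rng.integers(0, 2, (100, 1024)).astype(np.float32)   # probe fingerprints

common = A @ B.T                        # intersection counts, shape (1000, 100)
a_bits = A.sum(axis=1, keepdims=True)   # on-bits per library compound
b_bits = B.sum(axis=1, keepdims=True).T
tanimoto = common / (a_bits + b_bits - common)
print("best match per probe:", tanimoto.max(axis=0)[:5])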
Krawitz, Peter M; Schiska, Daniela; Krüger, Ulrike; Appelt, Sandra; Heinrich, Verena; Parkhomchuk, Dmitri; Timmermann, Bernd; Millan, Jose M; Robinson, Peter N; Mundlos, Stefan; Hecht, Jochen; Gross, Manfred
2014-01-01
Usher syndrome is an autosomal recessive disorder characterized both by deafness and blindness. For the three clinical subtypes of Usher syndrome, causal mutations in altogether 12 genes and a modifier gene have been identified. Due to the genetic heterogeneity of Usher syndrome, the molecular analysis is well suited to a comprehensive and parallelized analysis of all known genes by next-generation sequencing (NGS) approaches. We describe here the targeted enrichment and deep sequencing of exons of Usher genes and compare the costs and workload of this approach with those of Sanger sequencing. We also present a bioinformatics analysis pipeline that allows us to detect single-nucleotide variants, short insertions and deletions, as well as copy number variations of one or more exons on the same sequence data. Additionally, we present a flexible in silico gene panel for the analysis of sequence variants, in which newly identified genes can easily be included. We applied this approach to a cohort of 44 Usher patients and detected biallelic pathogenic mutations in 35 individuals and monoallelic mutations in eight individuals of our cohort. Thirty-nine of the sequence variants, including two heterozygous deletions comprising several exons of USH2A, have not been reported so far. Our NGS-based approach allowed us to assess single-nucleotide variants, small indels, and whole exon deletions in a single test. The described diagnostic approach is fast and cost-effective with a high molecular diagnostic yield. PMID:25333064
Massively parallel multicanonical simulations
NASA Astrophysics Data System (ADS)
Gross, Jonathan; Zierenberg, Johannes; Weigel, Martin; Janke, Wolfhard
2018-03-01
Generalized-ensemble Monte Carlo simulations such as the multicanonical method and similar techniques are among the most efficient approaches for simulations of systems undergoing discontinuous phase transitions or with rugged free-energy landscapes. As Markov chain methods, they are inherently serial computationally. It was demonstrated recently, however, that a combination of independent simulations that communicate weight updates at variable intervals allows for the efficient utilization of parallel computational resources for multicanonical simulations. Implementing this approach for the many-thread architecture provided by current generations of graphics processing units (GPUs), we show how it can be efficiently employed with on the order of 10^4 parallel walkers and beyond, thus constituting a versatile tool for Monte Carlo simulations in the era of massively parallel computing. We provide the fully documented source code for the approach applied to the paradigmatic example of the two-dimensional Ising model as starting point and reference for practitioners in the field.
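A toy sketch of the parallel multicanonical scheme on a 1D state space (an illustrative stand-in for the 2D Ising model): independent walkers sample with the current weights, their merged histogram drives the weight update W(E) -> W(E)/H(E), and the cycle repeats.

# Parallel multicanonical sketch: shared weights, merged histograms.
import numpy as np

rng = np.random.default_rng(4)
n_bins, n_walkers, sweeps = 50, 16, 200
log_w = np.zeros(n_bins)                       # multicanonical log-weights
x = rng.integers(0, n_bins, n_walkers)         # walker states (energy bins)

for _ in range(30):                            # weight-update iterations
    hist = np.zeros(n_bins)
    for _ in range(sweeps):
        prop = np.clip(x + rng.choice([-1, 1], n_walkers), 0, n_bins - 1)
        accept = np.log(rng.random(n_walkers)) < log_w[prop] - log_w[x]
        x = np.where(accept, prop, x)
        np.add.at(hist, x, 1.0)                # merge all walkers' visits
    log_w -= np.log(np.maximum(hist, 1.0))     # flatten the visited histogram
    log_w -= log_w.max()                       # normalize for stability
print("histogram flatness:", hist.min() / hist.max())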
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, D.A.; Grunwald, D.C.
The spectrum of parallel processor designs can be divided into three sections according to the number and complexity of the processors. At one end there are simple, bit-serial processors. Any one of these processors is of little value, but when it is coupled with many others, the aggregate computing power can be large. This approach to parallel processing can be likened to a colony of termites devouring a log. The most notable examples of this approach are the NASA/Goodyear Massively Parallel Processor, which has 16K one-bit processors, and the Thinking Machines Connection Machine, which has 64K one-bit processors. At the other end of the spectrum, a small number of processors, each built using the fastest available technology and the most sophisticated architecture, are combined. An example of this approach is the Cray X-MP. This type of parallel processing is akin to four woodmen attacking the log with chainsaws.
Single-agent parallel window search
NASA Technical Reports Server (NTRS)
Powley, Curt; Korf, Richard E.
1991-01-01
Parallel window search is applied to single-agent problems by having different processes simultaneously perform iterations of Iterative-Deepening-A* (IDA*) on the same problem but with different cost thresholds. This approach is limited by the time to perform the goal iteration. To overcome this disadvantage, the authors consider node ordering. They discuss how global node ordering by minimum h among nodes with equal f = g + h values can reduce the time complexity of serial IDA* by reducing the time to perform the iterations prior to the goal iteration. Finally, the two ideas of parallel window search and node ordering are combined to eliminate the weaknesses of each approach while retaining the strengths. The resulting approach, called simply parallel window search, can be used to find a near-optimal solution quickly, improve the solution until it is optimal, and then finally guarantee optimality, depending on the amount of time available.
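A minimal sketch of parallel window search on a toy graph: several workers run the same cost-bounded depth-first search (the core of IDA*) with different thresholds simultaneously. The graph and heuristic are invented for illustration; a real solver would use a puzzle domain.

# Workers explore different cost thresholds of the same search in parallel.
from concurrent.futures import ProcessPoolExecutor

GRAPH = {            # node -> [(neighbor, edge_cost)]
    "S": [("A", 1), ("B", 4)],
    "A": [("G", 5), ("B", 2)],
    "B": [("G", 1)],
    "G": [],
}
H = {"S": 3, "A": 2, "B": 1, "G": 0}   # admissible heuristic

def bounded_dfs(node, g, threshold, path):
    f = g + H[node]
    if f > threshold:                  # prune: f = g + h exceeds the window
        return None
    if node == "G":
        return path, g
    for nbr, cost in GRAPH[node]:
        found = bounded_dfs(nbr, g + cost, threshold, path + [nbr])
        if found:
            return found
    return None

def search(threshold):
    return threshold, bounded_dfs("S", 0, threshold, ["S"])

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        for threshold, result in pool.map(search, [3, 4, 5, 6, 7]):
            print(f"threshold {threshold}: {result}")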
Huang, Kuo-Sen; Mark, David; Gandenberger, Frank Ulrich
2006-01-01
The plate::vision is a high-throughput multimode reader capable of reading absorbance, fluorescence, fluorescence polarization, time-resolved fluorescence, and luminescence. Its performance has been shown to be quite comparable with that of other readers. When the reader is integrated into the plate::explorer, an ultrahigh-throughput screening system with event-driven software and parallel plate-handling devices, it becomes possible to run complicated assays with kinetic readouts in high-density microtiter plate formats for high-throughput screening. For the past 5 years, we have used the plate::vision and the plate::explorer to run screens and have generated more than 30 million data points. Their throughput, performance, and robustness have greatly sped up our drug discovery process.
Nonlinear AC susceptibility, surface and bulk shielding
NASA Astrophysics Data System (ADS)
van der Beek, C. J.; Indenbom, M. V.; D'Anna, G.; Benoit, W.
1996-02-01
We calculate the nonlinear AC response of a thin superconducting strip in perpendicular field, shielded by an edge current due to the geometrical barrier. A comparison with the results for infinite samples in parallel field, screened by a surface barrier, and with those for screening by a bulk current in the critical state, shows that the AC response due to a barrier has general features that are independent of geometry and significantly different from those for screening by a bulk current in the critical state. Consequently, the nonlinear (global) AC susceptibility can be used to determine the origin of magnetic irreversibility. A comparison with experiments on a Bi2Sr2CaCu2O8+δ crystal shows that in this material, the low-frequency AC screening at high temperature is mainly due to screening by an edge current, and that this is the unique source of the nonlinear magnetic response at temperatures above 40 K.
Reducing the health consequences of opioid addiction in primary care.
Bowman, Sarah; Eiserman, Julie; Beletsky, Leo; Stancliff, Sharon; Bruce, R Douglas
2013-07-01
Addiction to prescription opioids is prevalent in primary care settings. Increasing prescription opioid use is largely responsible for a parallel increase in overdose nationally. Many patients most at risk for addiction and overdose come into regular contact with primary care providers. Lack of routine addiction screening results in missed treatment opportunities in this setting. We reviewed the literature on screening and brief interventions for addictive disorders in primary care settings, focusing on opioid addiction. Screening and brief interventions can improve health outcomes for chronic illnesses including diabetes, hypertension, and asthma. Similarly, through the use of screening and brief interventions, patients with addiction can achieve improved health outcomes. A spectrum of low-threshold care options can reduce the negative health consequences among individuals with opioid addiction. Screening in primary care coupled with short interventions, including motivational interviewing, syringe distribution, naloxone prescription for overdose prevention, and buprenorphine treatment, offers effective ways to manage addiction and its associated risks and improve health outcomes for individuals with opioid addiction.
Quark structure of static correlators in high temperature QCD
NASA Astrophysics Data System (ADS)
Bernard, Claude; DeGrand, Thomas A.; DeTar, Carleton; Gottlieb, Steven; Krasnitz, A.; Ogilvie, Michael C.; Sugar, R. L.; Toussaint, D.
1992-07-01
We present results of numerical simulations of quantum chromodynamics at finite temperature with two flavors of Kogut-Susskind quarks on the Intel iPSC/860 parallel processor. We investigate the properties of the objects whose exchange gives static screening lengths by reconstructing their correlated quark-antiquark structure.
3D hyperpolarized C-13 EPI with calibrationless parallel imaging
NASA Astrophysics Data System (ADS)
Gordon, Jeremy W.; Hansen, Rie B.; Shin, Peter J.; Feng, Yesu; Vigneron, Daniel B.; Larson, Peder E. Z.
2018-04-01
With the translation of metabolic MRI with hyperpolarized 13C agents into the clinic, imaging approaches will require large volumetric FOVs to support clinical applications. Parallel imaging techniques will be crucial to increasing volumetric scan coverage while minimizing RF requirements and temporal resolution. Calibrationless parallel imaging approaches are well-suited for this application because they eliminate the need to acquire coil profile maps or auto-calibration data. In this work, we explored the utility of a calibrationless parallel imaging method (SAKE) and corresponding sampling strategies to accelerate and undersample hyperpolarized 13C data using 3D blipped EPI acquisitions and multichannel receive coils, and demonstrated its application in a human study of [1-13C]pyruvate metabolism.
Covassin, L D; Siekmann, A F; Kacergis, M C; Laver, E; Moore, J C; Villefranc, J A; Weinstein, B M; Lawson, N D
2009-05-15
In this work we describe a forward genetic approach to identify mutations that affect blood vessel development in the zebrafish. By applying a haploid screening strategy in a transgenic background that allows direct visualization of blood vessels, it was possible to identify several classes of mutant vascular phenotypes. Subsequent characterization of mutant lines revealed that defects in Vascular endothelial growth factor (Vegf) signaling specifically affected artery development. Comparison of phenotypes associated with different mutations within a functional zebrafish Vegf receptor-2 ortholog (referred to as kdr-like, kdrl) revealed surprisingly varied effects on vascular development. In parallel, we identified an allelic series of mutations in phospholipase c gamma 1 (plcg1). Together with in vivo structure-function analysis, our results suggest a requirement for Plcg1 catalytic activity downstream of receptor tyrosine kinases. We further find that embryos lacking both maternal and zygotic plcg1 display more severe defects in artery differentiation but are otherwise similar to zygotic mutants. Finally, we demonstrate through mosaic analysis that plcg1 functions autonomously in endothelial cells. Together our genetic analyses suggest that Vegf/Plcg1 signaling acts at multiple time points and in different signaling contexts to mediate distinct aspects of artery development.
Covassin, L. D.; Siekmann, A. F.; Kacergis, M. C.; Laver, E.; Moore, J. C.; Villefranc, J. A.; Weinstein, B. M.; Lawson, N. D.
2009-01-01
In this work we describe a forward genetic approach to identify mutations that affect blood vessel development in the zebrafish. By applying a haploid screening strategy in a transgenic background that allows direct visualization of blood vessels, it was possible to identify several classes of mutant vascular phenotypes. Subsequent characterization of mutant lines revealed that defects in Vascular endothelial growth factor (Vegf) signaling specifically affected artery development. Comparison of phenotypes associated with different mutations within a functional zebrafish Vegf receptor-2 ortholog (referred to as kdr-like, kdrl) revealed surprisingly varied effects on vascular development. In parallel, we identified an allelic series of mutations in phospholipase c gamma 1 (plcg1). Together with in vivo structure-function analysis, our results suggest a requirement for Plcg1 catalytic activity downstream of receptor tyrosine kinases. We further find that embryos lacking both maternal and zygotic plcg1 display more severe defects in artery differentiation but are otherwise similar to zygotic mutants. Finally, we demonstrate through mosaic analysis that plcg1 functions autonomously in endothelial cells. Together our genetic analyses suggest that Vegf/Plcg1 signaling acts at multiple time points and in different signaling contexts to mediate distinct aspects of artery development. PMID:19269286
Discovery of host-targeted covalent inhibitors of dengue virus
de Wispelaere, Mélissanne; Carocci, Margot; Liang, Yanke; Liu, Qingsong; Sun, Eileen; Vetter, Michael L.; Wang, Jinhua; Gray, Nathanael S.; Yang, Priscilla L.
2017-01-01
We report here on an approach targeting the host reactive cysteinome to identify inhibitors of host factors required for the infectious cycle of Flaviviruses and other viruses. We used two parallel cellular phenotypic screens to identify a series of covalent inhibitors, exemplified by QL-XII-47, that are active against dengue virus. We show that the compounds effectively block viral protein expression and that this inhibition is associated with repression of downstream processes of the infectious cycle, and thus significantly contributes to the potent antiviral activity of these compounds. We demonstrate that QL-XII-47’s antiviral activity requires selective, covalent modification of a host target by showing that the compound's antiviral activity is recapitulated when cells are preincubated with QL-XII-47 and then washed prior to viral infection and by showing that QL-XII-47R, a non-reactive analog, lacks antiviral activity at concentrations more than 20-fold higher than QL-XII-47's IC90. QL-XII-47’s inhibition of Zika virus, West Nile virus, hepatitis C virus, and poliovirus further suggests that it acts via a target mediating inhibition of these other medically relevant viruses. These results demonstrate the utility of screens targeting the host reactive cysteinome for rapid identification of compounds with potent antiviral activity. PMID:28034743
ChemHTPS - A virtual high-throughput screening program suite for the chemical and materials sciences
NASA Astrophysics Data System (ADS)
Afzal, Mohammad Atif Faiz; Evangelista, William; Hachmann, Johannes
The discovery of new compounds, materials, and chemical reactions with exceptional properties is key to the grand challenges in innovation, energy, and sustainability. This process can be dramatically accelerated by means of the virtual high-throughput screening (HTPS) of large-scale candidate libraries. The resulting data can further be used to study the underlying structure-property relationships and thus facilitate rational design capability. This approach has been used extensively for many years in the drug discovery community. However, the lack of openly available virtual HTPS tools is limiting the use of these techniques in various other applications such as photovoltaics, optoelectronics, and catalysis. We have therefore developed ChemHTPS, a general-purpose, comprehensive, and user-friendly suite that allows users to efficiently perform large in silico modeling studies and high-throughput analyses in these applications. ChemHTPS also includes a massively parallel molecular library generator, which offers a multitude of options to customize and restrict the scope of the enumerated chemical space and thus tailor it to the demands of specific applications. To streamline the non-combinatorial exploration of chemical space, we incorporate genetic algorithms into the framework. In addition to implementing smarter algorithms, we also focus on ease of use, workflow, and code integration to make this technology more accessible to the community.
Ragnaill, Michelle Nic; Brown, Meredith; Ye, Dong; Bramini, Mattia; Callanan, Sean; Lynch, Iseult; Dawson, Kenneth A
2011-04-01
Transport of drugs across the blood-brain barrier, which protects the brain from harmful agents, is considered the holy grail of targeted delivery, due to the extreme effectiveness of this barrier at preventing passage of non-essential molecules through to the brain. This has caused severe limitations for therapeutics for many brain-associated diseases, such as HIV and neurodegenerative diseases. Nanomaterials, as a result of their small size (on the order of the many protein-lipid clusters routinely transported by cells) and their large surface area (which acts as a scaffold for proteins, thereby rendering nanoparticles biological entities), offer great promise for neuro-therapeutics. However, in parallel with developing neuro-therapeutic applications based on nanotechnology, it is essential to ensure their safety and long-term consequences upon reaching the brain. One approach to determining the safe application of nanomaterials in biology is to obtain a deep mechanistic understanding of the interactions between nanomaterials and living systems (bionanointeractions). To this end, we report here on the establishment and internal round-robin validation of a human cell model of the blood-brain barrier for use as a tool for screening nanoparticle interactions and assessing the critical nanoscale parameters that determine transcytosis.
Geiger, Simon; Kasian, Olga; Mingers, Andrea M; Nicley, Shannon S; Haenen, Ken; Mayrhofer, Karl J J; Cherevko, Serhiy
2017-09-18
In searching for alternative oxygen evolution reaction (OER) catalysts for acidic water splitting, fast screening of a material's intrinsic activity and stability in half-cell tests is of vital importance. The screening process significantly accelerates the discovery of new promising materials without the need for time-consuming real-cell analysis. In commonly employed tests, a conclusion on catalyst stability is drawn solely on the basis of electrochemical data, for example, by evaluating potential-versus-time profiles. Herein, important limitations of such approaches, which are related to the degradation of the backing electrode material, are demonstrated. State-of-the-art Ir-black powder is investigated for OER activity and for dissolution as a function of the backing electrode material. Even at very short time intervals, materials like glassy carbon passivate, increasing the contact resistance and concealing the degradation phenomena of the electrocatalyst itself. Alternative backing electrodes like gold and boron-doped diamond show better stability and are thus recommended for short accelerated aging investigations. Moreover, parallel quantification of dissolution products in the electrolyte is shown to be of great importance for comparing OER catalyst feasibility.
Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K
2015-06-01
Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
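To make the screening design concrete: a Plackett-Burman design for 11 factors in 12 runs can be generated from cyclic shifts of a standard generator row. The sketch below is generic NumPy, not the authors' worklist; the factor names are placeholders.

import numpy as np

# Classic 12-run Plackett-Burman generator row (+1 = high level, -1 = low level).
g = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

# Rows 1-11 are cyclic shifts of the generator; row 12 is all low levels.
design = np.vstack([np.roll(g, i) for i in range(11)] + [-np.ones(11, dtype=int)])

factors = [f"X{j + 1}" for j in range(11)]   # e.g. temperature shift, amino acids
print(design.shape)                          # (12, 11): 12 cultures, 11 factors
print(dict(zip(factors, design[0])))         # levels for the first parallel culture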
Carr, Ian M; Morgan, Joanne; Watson, Christopher; Melnik, Svitlana; Diggle, Christine P; Logan, Clare V; Harrison, Sally M; Taylor, Graham R; Pena, Sergio D J; Markham, Alexander F; Alkuraya, Fowzan S; Black, Graeme C M; Ali, Manir; Bonthron, David T
2013-07-01
Massively parallel ("next generation") DNA sequencing (NGS) has quickly become the method of choice for seeking pathogenic mutations in rare uncharacterized monogenic diseases. Typically, before DNA sequencing, protein-coding regions are enriched from patient genomic DNA, representing either the entire genome ("exome sequencing") or selected mapped candidate loci. Sequence variants, identified as differences between the patient's and the human genome reference sequences, are then filtered according to various quality parameters. Changes are screened against datasets of known polymorphisms, such as dbSNP and the 1000 Genomes Project, in the effort to narrow the list of candidate causative variants. An increasing number of commercial services now offer to both generate and align NGS data to a reference genome. This potentially allows small groups with limited computing infrastructure and informatics skills to utilize this technology. However, the capability to effectively filter and assess sequence variants is still an important bottleneck in the identification of deleterious sequence variants in both research and diagnostic settings. We have developed an approach to this problem comprising a user-friendly suite of programs that can interactively analyze, filter and screen data from enrichment-capture NGS data. These programs ("Agile Suite") are particularly suitable for small-scale gene discovery or for diagnostic analysis. © 2013 WILEY PERIODICALS, INC.
A comparative study of serial and parallel aeroelastic computations of wings
NASA Technical Reports Server (NTRS)
Byun, Chansup; Guruswamy, Guru P.
1994-01-01
A procedure for computing the aeroelasticity of wings on parallel multiple-instruction, multiple-data (MIMD) computers is presented. In this procedure, fluids are modeled using Euler equations, and structures are modeled using modal or finite element equations. The procedure is designed in such a way that each discipline can be developed and maintained independently by using a domain decomposition approach. In the present parallel procedure, each computational domain is scalable. A parallel integration scheme is used to compute aeroelastic responses by solving fluid and structural equations concurrently. The computational efficiency issues of parallel integration of both fluid and structural equations are investigated in detail. This approach, which reduces the total computational time by a factor of almost 2, is demonstrated for a typical aeroelastic wing by using various numbers of processors on the Intel iPSC/860.
NASA Astrophysics Data System (ADS)
Kim, Jae Wook
2013-05-01
This paper proposes a novel systematic approach for the parallelization of pentadiagonal compact finite-difference schemes and filters based on domain decomposition. The proposed approach allows a pentadiagonal banded matrix system to be split into quasi-disjoint subsystems by using a linear-algebraic transformation technique. As a result the inversion of pentadiagonal matrices can be implemented within each subdomain in an independent manner subject to a conventional halo-exchange process. The proposed matrix transformation leads to new subdomain boundary (SB) compact schemes and filters that require three halo terms to exchange with neighboring subdomains. The internode communication overhead in the present approach is equivalent to that of standard explicit schemes and filters based on seven-point discretization stencils. The new SB compact schemes and filters demand additional arithmetic operations compared to the original serial ones. However, it is shown that the additional cost becomes sufficiently low by choosing optimal sizes of their discretization stencils. Compared to earlier published results, the proposed SB compact schemes and filters successfully reduce parallelization artifacts arising from subdomain boundaries to a level sufficiently negligible for sophisticated aeroacoustic simulations without degrading parallel efficiency. The overall performance and parallel efficiency of the proposed approach are demonstrated by stringent benchmark tests.
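The communication pattern underlying the approach is a standard halo exchange. The mpi4py sketch below shows only that pattern for a 1-D decomposition with three halo terms per subdomain edge; the matrix transformation and the SB schemes themselves are beyond a short example.

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
HALO = 3                                     # three halo terms, as in the SB schemes

n_local = 64
u = np.zeros(n_local + 2 * HALO)
u[HALO:-HALO] = np.sin(np.linspace(rank, rank + 1, n_local))   # interior data

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Swap three boundary values with the right neighbour, then with the left one.
comm.Sendrecv(sendbuf=u[-2 * HALO:-HALO].copy(), dest=right,
              recvbuf=u[-HALO:], source=right)
comm.Sendrecv(sendbuf=u[HALO:2 * HALO].copy(), dest=left,
              recvbuf=u[:HALO], source=left)
# Run with, e.g., mpiexec -n 4 python halo.py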
Li, Haiou; Lu, Liyao; Chen, Rong; Quan, Lijun; Xia, Xiaoyan; Lü, Qiang
2014-01-01
Structural information on protein-peptide complexes can be very useful for novel drug discovery and design. Computational docking of proteins and peptides can supplement the structural information on protein-peptide interactions obtained experimentally. The protein-peptide docking described in this paper comprises three processes that occur in parallel: ab initio peptide folding, docking of the peptide with its receptor, and refinement of flexible areas of the receptor as the peptide approaches. Several existing methods have been used to sample the degrees of freedom in the three processes, which are usually triggered in an organized sequential scheme. In this paper, we propose a parallel approach that combines all three processes during the docking of a folding peptide with a flexible receptor. This approach mimics the actual protein-peptide docking process in a parallel way and is expected to deliver better performance than sequential approaches. We used 22 unbound protein-peptide docking examples to evaluate our method. Our analysis of the results showed that the explicit refinement of the flexible areas of the receptor facilitated more accurate modeling of the interfaces of the complexes, while combining all of the moves in parallel helped construct energy funnels for the predictions.
2013-01-01
Background Black men have the greatest burden of premature death and disability from hypertension (HTN) in the United States, and the highest incidence and mortality from colorectal cancer (CRC). While several clinical trials have reported beneficial effects of lifestyle changes on blood pressure (BP) reduction, and improved CRC screening with patient navigation (PN), the effectiveness of these approaches in community-based settings remains understudied, particularly among Black men. Methods/design MISTER B is a two-parallel-arm randomized controlled trial that will compare the effect of a motivational interviewing tailored lifestyle intervention (MINT) versus a culturally targeted PN intervention on improvement of BP and CRC screening among Black men aged ≥50 with uncontrolled HTN who are eligible for CRC screening. Approximately 480 self-identified Black men will be randomly assigned to one of the two study conditions. This innovative research design allows each intervention to serve as the control for the other. Specifically, the MINT arm is the control condition for the PN arm, and vice versa. This novel, simultaneous testing of two community-based interventions in a randomized fashion is an economical and yet rigorous strategy that also enhances the acceptability of the project. Participants will be recruited during scheduled screening events at barbershops in New York City. Trained research assistants will conduct the lifestyle intervention, while trained community health workers will deliver the PN intervention. The primary outcomes will be 1) within-patient change in systolic and diastolic BP from baseline to six months and 2) CRC screening rates at six months. Discussion This innovative study will provide a unique opportunity to test two interventions for two health disparities simultaneously in community-based settings. Our study is one of the first to test culturally targeted patient navigation for CRC screening among Black men in barbershops. Thus, our study has the potential to improve the reach of hypertension control and cancer prevention efforts within a high-risk population that is under-represented in primary care settings. Trial registration ClinicalTrials.gov, NCT01092078 PMID:24011142
INVITED TOPICAL REVIEW: Parallel magnetic resonance imaging
NASA Astrophysics Data System (ADS)
Larkman, David J.; Nunes, Rita G.
2007-04-01
Parallel imaging has been the single biggest innovation in magnetic resonance imaging in the last decade. The use of multiple receiver coils to augment the time-consuming Fourier encoding has reduced acquisition times significantly. This increase in speed comes at a time when other approaches to acquisition-time reduction were reaching engineering and human limits. A brief summary of spatial encoding in MRI is followed by an introduction to the problem parallel imaging is designed to solve. There are a large number of parallel reconstruction algorithms; this article reviews a cross-section, SENSE, SMASH, g-SMASH and GRAPPA, selected to demonstrate the different approaches. Theoretical (the g-factor) and practical (coil design) limits to acquisition speed are reviewed. The practical implementation of parallel imaging is also discussed, in particular coil calibration. How to recognize potential failure modes and their associated artefacts is shown. Well-established applications including angiography, cardiac imaging and applications using echo planar imaging are reviewed, and we discuss what makes a good application for parallel imaging. Finally, active research areas where parallel imaging is being used to improve data quality by repairing artefacted images are also reviewed.
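As a toy illustration of the SENSE flavour of these algorithms: at acceleration factor R = 2, each coil measures a superposition of two pixel locations, and the true values are recovered by least squares from known coil sensitivities. The numbers below are invented; this is not a reconstruction pipeline.

import numpy as np

# Coil sensitivities at the two superimposed locations y1, y2 (one row per coil).
S = np.array([[0.9, 0.2],
              [0.3, 0.8],
              [0.5, 0.5]])
x_true = np.array([1.7, 0.4])     # true pixel intensities at y1 and y2
a = S @ x_true                    # aliased measurement in each coil

x_hat, *_ = np.linalg.lstsq(S, a, rcond=None)
print(x_hat)                      # recovers [1.7, 0.4]

# The g-factor quantifying spatially varying noise amplification (unit noise
# covariance assumed): g_i = sqrt([(S^T S)^-1]_ii * [S^T S]_ii).
G = S.T @ S
print(np.sqrt(np.diag(np.linalg.inv(G)) * np.diag(G)))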
Parallel adaptive wavelet collocation method for PDEs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nejadmalayeri, Alireza, E-mail: Alireza.Nejadmalayeri@gmail.com; Vezolainen, Alexei, E-mail: Alexei.Vezolainen@Colorado.edu; Brown-Dymkoski, Eric, E-mail: Eric.Browndymkoski@Colorado.edu
2015-10-01
A parallel adaptive wavelet collocation method for solving a large class of partial differential equations is presented. The parallelization is achieved by developing an asynchronous parallel wavelet transform, which allows one to perform parallel wavelet transform and derivative calculations with only one data synchronization at the highest level of resolution. The data are stored using a tree-like structure with tree roots starting at an a priori defined level of resolution. Both static and dynamic domain partitioning approaches are developed. For the dynamic domain partitioning, trees are considered to be the minimum quanta of data to be migrated between the processes. This allows fully automated and efficient handling of non-simply connected partitioning of a computational domain. Dynamic load balancing is achieved via domain repartitioning during the grid adaptation step and reassigning trees to the appropriate processes to ensure approximately the same number of grid points on each process. The parallel efficiency of the approach is discussed based on parallel adaptive wavelet-based Coherent Vortex Simulations of homogeneous turbulence with linear forcing at effective non-adaptive resolutions up to 2048^3 using as many as 2048 CPU cores.
GPU accelerated dynamic functional connectivity analysis for functional MRI data.
Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu
2015-07-01
Recent advances in multi-core processors and graphics-card-based computational technologies have paved the way for improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented to accelerate computationally intensive problems in various computational science fields, including bioinformatics, in which big-data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both the Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. The block-based approach, also developed in this study to better utilize the CUDA architecture, parallelizes over smaller parts of the fMRI time-courses obtained by sliding windows. Experimental results showed that the proposed parallel design solutions enabled by GPUs significantly reduce the computation time for DFC analysis. Multicore implementation using OpenMP on an 8-core processor provides up to 7.7× speed-up. GPU implementation using CUDA yielded substantial accelerations, ranging from 18.5× to 157× speed-up once the thread-based and block-based approaches were combined. The proposed parallel programming solutions showed that multi-core processor and CUDA-supported GPU implementations accelerate DFC analyses significantly, making them more practical for multi-subject studies and more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
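The core computation being parallelized is simple to state: correlate every pair of region time-courses within each sliding window. A plain-NumPy reference version is sketched below (illustrative only; the study's GPU versions distribute these independent windows across threads and blocks).

import numpy as np

def dfc(timecourses, window, step=1):
    # timecourses: (n_regions, n_timepoints); returns (n_windows, R, R).
    n_regions, n_t = timecourses.shape
    starts = range(0, n_t - window + 1, step)
    return np.stack([np.corrcoef(timecourses[:, s:s + window]) for s in starts])

rng = np.random.default_rng(1)
tc = rng.standard_normal((10, 300))   # hypothetical fMRI region time-courses
windows = dfc(tc, window=30)
print(windows.shape)                  # (271, 10, 10) connectivity matrices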
Gagnon, J.; Lévesque, E.; Borduas, F.; Chiquette, J.; Diorio, C.; Duchesne, N.; Dumais, M.; Eloy, L.; Foulkes, W.; Gervais, N.; Lalonde, L.; L’Espérance, B.; Meterissian, S.; Provencher, L.; Richard, J.; Savard, C.; Trop, I.; Wong, N.; Knoppers, B.M.; Simard, J.
2016-01-01
In recent years, risk stratification has sparked interest as an innovative approach to disease screening and prevention. The approach effectively personalizes individual risk, opening the way to screening and prevention interventions that are adapted to subpopulations. The international PERSPECTIVE project, which is developing risk stratification for breast cancer, aims to support the integration of its screening approach into clinical practice through comprehensive tool-building. Policies and guidelines for risk stratification, unlike those for population screening programs (which are currently well regulated), are still under development. Indeed, the development of guidelines for risk stratification reflects the translational aspects of PERSPECTIVE. Here, we describe the risk stratification process that was devised in the context of PERSPECTIVE, and we then explain the consensus-based method used to develop recommendations for breast cancer screening and prevention in a risk-stratification approach. Lastly, we discuss how the recommendations might affect current screening policies. PMID:28050152
Parallel algorithms for boundary value problems
NASA Technical Reports Server (NTRS)
Lin, Avi
1990-01-01
A general approach to solving boundary value problems numerically in a parallel environment is discussed. The basic algorithm consists of two steps: the local step, where all P available processors work in parallel, and the global step, where one processor solves a tridiagonal linear system of order P. The main advantages of this approach are twofold. First, the suggested approach is very flexible, especially in the local step, so the algorithm can be used with any number of processors and with any SIMD or MIMD machine. Second, the communication complexity is very small, so the algorithm can be used just as easily with shared-memory machines. Several examples of using this strategy are discussed.
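For a concrete flavour of the two-step structure, the NumPy sketch below applies it to a tridiagonal system: interior unknowns of each block are eliminated independently (the local step, one block per processor), leaving a small reduced system in the interface unknowns (the global step; of order P-1 in this variant). Dense linear algebra is used for clarity; a real implementation would distribute the block solves.

import numpy as np

def two_step_solve(A, d, P):
    N = len(d)
    I = np.linspace(0, N, P + 1, dtype=int)[1:-1]      # interface unknowns
    B = np.setdiff1d(np.arange(N), I)                  # interior block unknowns
    A_BB, A_BI = A[np.ix_(B, B)], A[np.ix_(B, I)]
    A_IB, A_II = A[np.ix_(I, B)], A[np.ix_(I, I)]
    # Local step: A_BB is block diagonal, so these solves split across processors.
    xB_part = np.linalg.solve(A_BB, d[B])
    G = np.linalg.solve(A_BB, A_BI)
    # Global step: reduced tridiagonal system in the interface unknowns.
    xI = np.linalg.solve(A_II - A_IB @ G, d[I] - A_IB @ xB_part)
    x = np.empty(N)
    x[I], x[B] = xI, xB_part - G @ xI                  # parallel back-substitution
    return x

N, P = 12, 4
A = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)   # 1-D Laplacian
d = np.ones(N)
print(np.allclose(two_step_solve(A, d, P), np.linalg.solve(A, d)))   # True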
A template-based approach for parallel hexahedral two-refinement
Owen, Steven J.; Shih, Ryan M.; Ernst, Corey D.
2016-10-17
Here, we provide a template-based approach for generating locally refined all-hex meshes. We focus specifically on refinement of initially structured grids utilizing a 2-refinement approach where uniformly refined hexes are subdivided into eight child elements. The refinement algorithm consists of identifying marked nodes that are used as the basis for a set of four simple refinement templates. The target application for 2-refinement is a parallel grid-based all-hex meshing tool for high performance computing in a distributed environment. The result is a parallel consistent locally refined mesh requiring minimal communication and where minimum mesh quality is greater than scaled Jacobian 0.3 prior to smoothing.
YAPPA: a Compiler-Based Parallelization Framework for Irregular Applications on MPSoCs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lovergine, Silvia; Tumeo, Antonino; Villa, Oreste
Modern embedded systems include hundreds of cores. Because of the difficulty in providing a fast, coherent memory architecture, these systems usually rely on non-coherent, non-uniform memory architectures with private memories for each core. However, programming these systems poses significant challenges. The developer must extract large amounts of parallelism, while orchestrating communication among cores to optimize application performance. These issues become even more significant with irregular applications, which present data sets difficult to partition, unpredictable memory accesses, unbalanced control flow and fine grained communication. Hand-optimizing every single aspect is hard and time-consuming, and it often does not lead to the expected performance. There is a growing gap between such complex and highly-parallel architectures and the high level languages used to describe the specification, which were designed for simpler systems and do not consider these new issues. In this paper we introduce YAPPA (Yet Another Parallel Programming Approach), a compilation framework for the automatic parallelization of irregular applications on modern MPSoCs based on LLVM. We start by considering an efficient parallel programming approach for irregular applications on distributed memory systems. We then propose a set of transformations that can reduce the development and optimization effort. The results of our initial prototype confirm the correctness of the proposed approach.
Framework for Parallel Preprocessing of Microarray Data Using Hadoop
2018-01-01
Nowadays, microarray technology has become one of the popular ways to study gene expression and the diagnosis of disease. The National Center for Biotechnology Information (NCBI) hosts public databases containing large volumes of biological data that must be preprocessed, since they carry high levels of noise and bias. Robust Multiarray Average (RMA) is one of the standard and popular methods utilized to preprocess the data and remove the noise. Most preprocessing algorithms are time-consuming and unable to handle a large number of datasets with thousands of experiments. Parallel processing can be used to address these issues. Hadoop is a well-known distributed file system framework that provides a parallel environment in which to run the experiment. In this research, for the first time, the capability of Hadoop and the statistical power of R have been leveraged to parallelize the available preprocessing algorithm, RMA, to efficiently process microarray data. The experiment was run on a cluster containing 5 nodes, where each node has 16 cores and 16 GB of memory. It compares the efficiency and performance of parallelized RMA using Hadoop with parallelized RMA using the affyPara package, as well as with sequential RMA. The results show that the speed-up of the proposed approach outperforms both the sequential approach and the affyPara approach. PMID:29796018
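A Hadoop Streaming mapper gives a feel for how such a parallelization can be wired together. The sketch below is an assumption-laden illustration, not the paper's code: each input line names a directory holding a batch of CEL files, and a hypothetical helper script rma_batch.R is assumed to run affy::rma() on one batch and print the normalized matrix.

#!/usr/bin/env python3
import subprocess
import sys

for line in sys.stdin:
    batch_dir = line.strip()
    if not batch_dir:
        continue
    # Delegate the statistics to R; rma_batch.R is a hypothetical helper.
    result = subprocess.run(["Rscript", "rma_batch.R", batch_dir],
                            capture_output=True, text=True, check=True)
    # Emit key<TAB>value records so a reducer can merge per-batch results.
    sys.stdout.write(f"{batch_dir}\t{result.stdout.strip()}\n")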
Massively Parallel DNA Sequencing Facilitates Diagnosis of Patients with Usher Syndrome Type 1
Yoshimura, Hidekane; Iwasaki, Satoshi; Nishio, Shin-ya; Kumakawa, Kozo; Tono, Tetsuya; Kobayashi, Yumiko; Sato, Hiroaki; Nagai, Kyoko; Ishikawa, Kotaro; Ikezono, Tetsuo; Naito, Yasushi; Fukushima, Kunihiro; Oshikawa, Chie; Kimitsuki, Takashi; Nakanishi, Hiroshi; Usami, Shin-ichi
2014-01-01
Usher syndrome is an autosomal recessive disorder manifesting hearing loss, retinitis pigmentosa and vestibular dysfunction, and having three clinical subtypes. Usher syndrome type 1 is the most severe subtype due to its profound hearing loss, lack of vestibular responses, and retinitis pigmentosa that appears in prepuberty. Six of the corresponding genes have been identified, making early diagnosis through DNA testing possible, with many immediate and several long-term advantages for patients and their families. However, conventional genetic techniques, such as direct sequence analysis, are both time-consuming and expensive. Targeted exon sequencing of selected genes using massively parallel DNA sequencing technology will potentially enable us to systematically tackle previously intractable monogenic disorders and improve molecular diagnosis. Using this technique combined with direct sequence analysis, we screened 17 unrelated Usher syndrome type 1 patients and detected probable pathogenic variants in 16 of them (94.1%), each of whom carried at least one mutation. Seven patients had MYO7A mutations (41.2%), the most common type in Japanese patients. Most of the mutations were detected only by massively parallel DNA sequencing. We report here four patients who had probable pathogenic mutations in two different Usher syndrome type 1 genes, and one case of MYO7A/PCDH15 digenic inheritance. This is the first report of Usher syndrome mutation analysis using massively parallel DNA sequencing and of the frequency of Usher syndrome type 1 genes in Japanese patients. Mutation screening using this technique has the power to quickly identify mutations in many causative genes while maintaining cost-benefit performance. In addition, the simultaneous mutation analysis of large numbers of genes is useful for detecting mutations in different genes that are possibly disease modifiers or of digenic inheritance. PMID:24618850
NASA Astrophysics Data System (ADS)
Lin, Mingpei; Xu, Ming; Fu, Xiaoyu
2017-05-01
Currently, a tremendous amount of space debris in Earth's orbit imperils operational spacecraft. It is essential to undertake risk assessments of collisions and predict dangerous encounters in space. However, collision predictions for an enormous amount of space debris give rise to large-scale computations. In this paper, a parallel algorithm is established on the Compute Unified Device Architecture (CUDA) platform of NVIDIA Corporation for collision prediction. According to the parallel structure of NVIDIA graphics processors, a block decomposition strategy is adopted in the algorithm. Space debris is divided into batches, and the computation and data transfer operations of adjacent batches overlap. As a consequence, the latency to access shared memory during the entire computing process is significantly reduced, and a higher computing speed is reached. Theoretically, a simulation of collision prediction for space debris of any amount and for any time span can be executed. To verify this algorithm, a simulation example including 1382 pieces of debris, whose operational time scales vary from 1 min to 3 days, is conducted on Tesla C2075 of NVIDIA. The simulation results demonstrate that with the same computational accuracy as that of a CPU, the computing speed of the parallel algorithm on a GPU is 30 times that on a CPU. Based on this algorithm, collision prediction of over 150 Chinese spacecraft for a time span of 3 days can be completed in less than 3 h on a single computer, which meets the timeliness requirement of the initial screening task. Furthermore, the algorithm can be adapted for multiple tasks, including particle filtration, constellation design, and Monte-Carlo simulation of an orbital computation.
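The overlap of transfer and computation across batches can be imitated in a few lines with CuPy streams and events (an assumption for illustration; the paper used raw CUDA, and true asynchronous host-to-device copies additionally require pinned host memory).

import cupy as cp
import numpy as np

def propagate(states):                       # toy stand-in for orbit propagation
    return states + 0.1 * cp.sin(states)

batches = [np.random.rand(4096, 6) for _ in range(8)]   # hypothetical debris batches
copy_stream, compute_stream = cp.cuda.Stream(), cp.cuda.Stream()
device, results = [], []

for i, batch in enumerate(batches):
    with copy_stream:
        device.append(cp.asarray(batch))     # upload batch i on the copy stream
    ev = cp.cuda.Event()
    ev.record(copy_stream)
    compute_stream.wait_event(ev)            # compute waits only for its own batch
    with compute_stream:
        results.append(propagate(device[i])) # overlaps with the next batch's copy
compute_stream.synchronize()
print(len(results))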
A parallel orbital-updating based plane-wave basis method for electronic structure calculations
NASA Astrophysics Data System (ADS)
Pan, Yan; Dai, Xiaoying; de Gironcoli, Stefano; Gong, Xin-Gao; Rignanese, Gian-Marco; Zhou, Aihui
2017-11-01
Motivated by the recently proposed parallel orbital-updating approach in the real-space method [1], we propose a parallel orbital-updating plane-wave basis method for electronic structure calculations, in which the corresponding eigenvalue problems are solved. In addition, we propose two new modified parallel orbital-updating methods. Compared to traditional plane-wave methods, our methods allow for two-level parallelization, which is particularly interesting for large-scale parallelization. Numerical experiments show that these new methods are more reliable and efficient for large-scale calculations on modern supercomputers.
Speculation and replication in temperature accelerated dynamics
Zamora, Richard J.; Perez, Danny; Voter, Arthur F.
2018-02-12
Accelerated Molecular Dynamics (AMD) is a class of MD-based algorithms for the long-time scale simulation of atomistic systems that are characterized by rare-event transitions. Temperature-Accelerated Dynamics (TAD), a traditional AMD approach, hastens state-to-state transitions by performing MD at an elevated temperature. Recently, Speculatively-Parallel TAD (SpecTAD) was introduced, allowing the TAD procedure to exploit parallel computing systems by concurrently executing in a dynamically generated list of speculative future states. Although speculation can be very powerful, it is not always the most efficient use of parallel resources. In this paper, we compare the performance of speculative parallelism with a replica-based technique, similar to the Parallel Replica Dynamics method. A hybrid SpecTAD approach is also presented, in which each speculation process is further accelerated by a local set of replicas. Finally and overall, this work motivates the use of hybrid parallelism whenever possible, as some combination of speculation and replication is typically most efficient.
Elkin, L L; Harden, D G; Saldanha, S; Ferguson, H; Cheney, D L; Pieniazek, S N; Maloney, D P; Zewinski, J; O'Connell, J; Banks, M
2015-06-01
Compound pooling, or multiplexing more than one compound per well during primary high-throughput screening (HTS), is a controversial approach with a long history of limited success. Many issues with this approach likely arise from long-term storage of library plates containing complex mixtures of compounds at high concentrations. Due to the historical difficulties with using multiplexed library plates, primary HTS often uses a one-compound-one-well approach. However, as compound collections grow, innovative strategies are required to increase the capacity of primary screening campaigns. Toward this goal, we have developed a novel compound pooling method that increases screening capacity without compromising data quality. This method circumvents issues related to the long-term storage of complex compound mixtures by using acoustic dispensing to enable "just-in-time" compound pooling directly in the assay well immediately prior to assay. Using this method, we can pool two compounds per well, effectively doubling the capacity of a primary screen. Here, we present data from pilot studies using just-in-time pooling, as well as data from a large >2-million-compound screen using this approach. These data suggest that, for many targets, this method can be used to vastly increase screening capacity without significant reduction in the ability to detect screening hits. © 2015 Society for Laboratory Automation and Screening.
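The pooling step itself reduces to building an acoustic-dispenser worklist that pairs library compounds into assay wells at run time. The toy sketch below (not the authors' code; compound IDs and plate layout are invented) shows one way to emit two dispense events per well for a 384-well plate.

from itertools import zip_longest

def pooling_worklist(compound_ids, plate_wells):
    # Yield (well, compound_a, compound_b) transfer instructions.
    pairs = zip_longest(compound_ids[0::2], compound_ids[1::2])
    for well, (a, b) in zip(plate_wells, pairs):
        yield well, a, b

wells = [f"{row}{col:02d}" for row in "ABCDEFGHIJKLMNOP" for col in range(1, 25)]
library = [f"CMPD-{i:06d}" for i in range(1, 769)]    # hypothetical compound IDs
for well, a, b in list(pooling_worklist(library, wells))[:3]:
    print(well, a, b)    # A01 CMPD-000001 CMPD-000002, ...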
Application of ToxCast High-Throughput Screening and ...
Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.
Pesticide Cumulative Risk Assessment: Framework for Screening Analysis
This document provides guidance on how to screen groups of pesticides for cumulative evaluation using a two-step approach: begin with evaluation of available toxicological information and, if necessary, follow up with a risk-based screening approach.
Numerical Prediction of CCV in a PFI Engine using a Parallel LES Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ameen, Muhsin M; Mirzaeian, Mohsen; Millo, Federico
Cycle-to-cycle variability (CCV) is detrimental to IC engine operation and can lead to partial burn, misfire, and knock. Predicting CCV numerically is extremely challenging for two key reasons. First, high-fidelity methods such as large eddy simulation (LES) are required to accurately resolve the in-cylinder turbulent flowfield both spatially and temporally. Second, CCV is experienced over long timescales, so simulations must be performed for hundreds of consecutive cycles. Ameen et al. (Int. J. Eng. Res., 2017) developed a parallel perturbation model (PPM) approach to dissociate this long time-scale problem into several shorter time-scale problems. The strategy is to perform multiple single-cycle simulations in parallel by effectively perturbing the initial velocity field based on the intensity of the in-cylinder turbulence. This strategy was demonstrated for a motored engine, and it was shown that the mean and variance of the in-cylinder flowfield were captured reasonably well by this approach. In the present study, this PPM approach is extended to simulate the CCV in a fired port-fuel-injected (PFI) SI engine. Two operating conditions are considered: a medium-CCV case corresponding to 2500 rpm and 16 bar BMEP, and a low-CCV case corresponding to 4000 rpm and 12 bar BMEP. The predictions from this approach are shown to be similar to those from consecutive LES cycles. Both the consecutive and PPM LES cycles under-predict the variability in the early stage of combustion. The parallel approach slightly underpredicts the cyclic variability at all stages of combustion compared to the consecutive LES cycles. However, it is shown that the parallel approach is able to predict the coefficient of variation (COV) of the in-cylinder pressure and burn-rate-related parameters with sufficient accuracy, and is also able to predict the qualitative trends in CCV with changing operating conditions. The convergence of the statistics predicted by the PPM approach with respect to the number of consecutive cycles required for each parallel simulation is also investigated. It is shown that this new approach is able to give accurate predictions of the CCV in fired engines in less than one-tenth of the time required for the conventional approach of simulating consecutive engine cycles.
Nationally Consistent Environmental Justice Screening Approaches
This report discusses screening approaches through the lens of the Agency's Environmental Justice Strategic Enforcement Tool (EJSEAT), in particular, and how such approaches might better identify areas of concern.
Thomas, Duncan C
2017-07-01
Screening behavior depends on previous screening history and family members' behaviors, which can act as both confounders and intermediate variables on a causal pathway from screening to disease risk. Conventional analyses that adjust for these variables can lead to incorrect inferences about the causal effect of screening if high-risk individuals are more likely to be screened. Analyzing the data in a manner that treats screening as randomized conditional on covariates allows causal parameters to be estimated; inverse probability weighting based on propensity of exposure scores is one such method considered here. I simulated family data under plausible models for the underlying disease process and for screening behavior to assess the performance of alternative methods of analysis and whether a targeted screening approach based on individuals' risk factors would lead to a greater reduction in cancer incidence in the population than a uniform screening policy. Simulation results indicate that there can be a substantial underestimation of the effect of screening on subsequent cancer risk when using conventional analysis approaches, which is avoided by using inverse probability weighting. A large case-control study of colonoscopy and colorectal cancer from Germany shows a strong protective effect of screening, but inverse probability weighting makes this effect even stronger. Targeted screening approaches based on either fixed risk factors or family history yield somewhat greater reductions in cancer incidence, with fewer screens needed to prevent one cancer, than population-wide approaches, but the differences may not be large enough to justify the additional effort required. See video abstract at http://links.lww.com/EDE/B207.
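A minimal simulation makes the point about inverse probability weighting concrete. This is an illustrative sketch under invented assumptions (high-risk people both screen more and get disease more; screening is truly protective), not the paper's analysis code.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
risk = rng.normal(size=n)                                  # confounder
screened = rng.random(n) < 1 / (1 + np.exp(-risk))         # high risk -> more screening
p_disease = 1 / (1 + np.exp(-(risk - 1.0 * screened)))     # screening is protective
disease = rng.random(n) < p_disease

# Propensity of exposure scores, then inverse probability weights.
ps = LogisticRegression().fit(risk.reshape(-1, 1), screened)
ps = ps.predict_proba(risk.reshape(-1, 1))[:, 1]
w = np.where(screened, 1 / ps, 1 / (1 - ps))

rate = lambda m: np.average(disease[m], weights=w[m])
print("naive RR:", disease[screened].mean() / disease[~screened].mean())
print("IPW RR:  ", rate(screened) / rate(~screened))       # closer to the true effect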
Schmoll, Hans-Joachim; Arnold, Dirk; de Gramont, Aimery; Ducreux, Michel; Grothey, Axel; O'Dwyer, Peter J; Van Cutsem, Eric; Hermann, Frank; Bosanac, Ivan; Bendahmane, Belguendouz; Mancao, Christoph; Tabernero, Josep
2018-06-01
The old approach of one therapeutic for all patients with mCRC is evolving with a need to target specific molecular aberrations or cell-signalling pathways. Molecular screening approaches and new biomarkers are required to fully characterize tumours, identify patients most likely to benefit, and predict treatment response. MODUL is a signal-seeking trial with a design that is highly adaptable, permitting modification of different treatment cohorts and inclusion of further additional cohorts based on novel evidence on new compounds/combinations that emerge during the study. MODUL is ongoing and its adaptable nature permits timely and efficient recruitment of patients into the most appropriate cohort. Recruitment will take place over approximately 5 years in Europe, Asia, Africa, and South America. The design of MODUL with ongoing parallel/sequential treatment cohorts means that the overall size and duration of the trial can be modified/prolonged based on accumulation of new data. The early success of the current trial suggests that the design may provide definitive leads in a patient-friendly and relatively economical trial structure. Along with other biomarker-driven trials that are currently underway, it is hoped that MODUL will contribute to the continuing evolution of clinical trial design and permit a more 'tailored' approach to the treatment of patients with mCRC.
Late-onset Bartter syndrome type II.
Gollasch, Benjamin; Anistan, Yoland-Marie; Canaan-Kühl, Sima; Gollasch, Maik
2017-10-01
Mutations in the ROMK1 potassium channel gene (KCNJ1) cause antenatal/neonatal Bartter syndrome type II (aBS II), a renal disorder that begins in utero, accounting for the polyhydramnios and premature delivery typical of affected infants, who develop massive renal salt wasting, hypokalaemic metabolic alkalosis, secondary hyperreninaemic hyperaldosteronism, hypercalciuria and nephrocalcinosis. This BS type is believed to be a disorder of infancy, not of adulthood. We herein describe a female patient with a remarkably late-onset and mild clinical manifestation of BS II with compound heterozygous KCNJ1 missense mutations, consisting of a novel c.197T>A (p.I66N) mutation and a previously reported c.875G>A (p.R292Q) KCNJ1 mutation. We implemented and evaluated the performance of two different bioinformatics-based approaches of targeted massively parallel sequencing [next-generation sequencing (NGS)] in defining the molecular diagnosis. Our results demonstrate that aBS II may be suspected in patients with a late-onset phenotype. Our experimental approach of NGS-based mutation screening combined with Sanger sequencing proved to be a reliable molecular approach for establishing the clinical diagnosis in our patient, and has important differential diagnostic and therapeutic implications for patients with BS. Our results could have a significant impact on the diagnosis and methodological approaches of genetic testing in other patients with clinically unclassified phenotypes of nephrocalcinosis and congenital renal electrolyte abnormalities.
Sanders, David M.; Decker, Derek E.
1999-01-01
Optical patterns and lithographic techniques are used as part of a process to embed parallel and evenly spaced conductors in the non-planar surfaces of an insulator to produce high gradient insulators. The approach extends the size at which high gradient insulating structures can be fabricated, and improves the performance of those insulators by reducing the scale of the alternating parallel lines of insulator and conductor along the surface. This fabrication approach also substantially decreases the cost of producing high gradient insulators.
2018-01-01
Effect-directed analysis (EDA) is a commonly used approach for effect-based identification of endocrine disruptive chemicals in complex (environmental) mixtures. However, for routine toxicity assessment of, for example, water samples, current EDA approaches are considered time-consuming and laborious. We achieved faster EDA and identification by downscaling of sensitive cell-based hormone reporter gene assays and increasing fractionation resolution to allow testing of smaller fractions with reduced complexity. The high-resolution EDA approach is demonstrated by analysis of four environmental passive sampler extracts. Downscaling of the assays to a 384-well format allowed analysis of 64 fractions in triplicate (or 192 fractions without technical replicates) without affecting sensitivity compared to the standard 96-well format. Through a parallel exposure method, agonistic and antagonistic androgen and estrogen receptor activity could be measured in a single experiment following a single fractionation. From 16 selected candidate compounds, identified through nontargeted analysis, 13 could be confirmed chemically and 10 were found to be biologically active, of which the most potent nonsteroidal estrogens were identified as oxybenzone and piperine. The increased fractionation resolution and the higher throughput that downscaling provides allow for future application in routine high-resolution screening of large numbers of samples in order to accelerate identification of (emerging) endocrine disruptors. PMID:29547277
First-trimester screening for chromosomal abnormalities: advantages of an instant results approach.
Norton, Mary E
2010-09-01
Protocols that include first-trimester screening for fetal chromosome abnormalities have become the standard of care throughout the United States. Earlier screening allows for first-trimester diagnostic testing in cases found to be at increased risk. However, first-trimester screening requires coordination of the nuchal translucency (NT) ultrasound screening and biochemical screening during early, specific, narrow, but slightly different gestational age ranges. Instant results can often be provided at the time of the NT ultrasound if the biochemical analyses are performed beforehand; this optimizes the benefits of the first-trimester approach while improving efficiency and communication with the patient. This article discusses the benefits and logistics of such an approach. Copyright 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Baregheh, Mandana; Mezentsev, Vladimir; Schmitz, Holger
2011-06-01
We describe a parallel multi-threaded approach for high-performance modelling of a wide class of phenomena in ultrafast nonlinear optics. A specific implementation has been performed using the highly parallel capabilities of a programmable graphics processor.
Wernike, Kerstin; Hoffmann, Bernd
2013-01-01
Detection of several pathogens with multiplexed real-time quantitative PCR (qPCR) assays in a one-step setup allows the simultaneous detection of two endemic porcine viruses and four selected transboundary viruses. Reverse transcription (RT)-qPCR systems for the detection of porcine reproductive and respiratory syndrome virus (PRRSV) and porcine circovirus type 2 (PCV2), two of the most economically important pathogens of swine worldwide, were combined with a screening system for diseases notifiable to the World Organisation for Animal Health, namely classical and African swine fever, foot-and-mouth disease, and Aujeszky's disease. Background screening was implemented using the identical fluorophore for all four of these RT-qPCR assays. The novel multiplex RT-qPCR system was validated with a large panel of different body fluids and tissues from pigs and other animal species. Both reference samples and clinical specimens were used for a complete evaluation. It could be demonstrated that highly sensitive and specific parallel detection of the different viruses was possible. The assays for the notifiable diseases were not affected even by the simultaneous amplification of very high loads of PRRSV- and PCV2-specific sequences. The novel broad-spectrum multiplex assay allows, in a unique form, routine investigation for endemic porcine pathogens with exclusion diagnostics of the most important transboundary diseases in samples from pigs with unspecific clinical signs, such as fever or hemorrhages. The new system could significantly improve early detection of the most important notifiable diseases of swine and could lead to a new approach in syndromic surveillance. PMID:23303496
Knowledge-driven lead discovery.
Pirard, Bernard
2005-11-01
Virtual screening encompasses several computational approaches which have proven valuable for identifying novel leads. These approaches rely on available information. Herein, we review recent successful applications of virtual screening. The extension of virtual screening methodologies to target families is also briefly discussed.
Brühlmann, David; Sokolov, Michael; Butté, Alessandro; Sauer, Markus; Hemberger, Jürgen; Souquet, Jonathan; Broly, Hervé; Jordan, Martin
2017-07-01
Rational and high-throughput optimization of mammalian cell culture media has great potential to modulate recombinant protein product quality. We present a process design method based on parallel design-of-experiments (DoE) of CHO fed-batch cultures in 96-deepwell plates to modulate monoclonal antibody (mAb) glycosylation using medium supplements. To reduce the risk of losing valuable information in an intricate joint screening, 17 compounds were separated into five different groups, considering their mode of biological action. The concentration ranges of the medium supplements were defined according to information encountered in the literature and in-house experience. The screening experiments produced wide glycosylation pattern ranges. Multivariate analysis, including principal component analysis and decision trees, was used to select the best performing glycosylation modulators. A subsequent D-optimal quadratic design with four factors (three promising compounds and temperature shift) in shake tubes confirmed the outcome of the selection process and provided a solid basis for sequential process development at larger scale. The glycosylation profile with respect to the specifications for biosimilarity was greatly improved in shake tube experiments: 75% of the conditions were equally close or closer to the specifications for biosimilarity than the best 25% in 96-deepwell plates. Biotechnol. Bioeng. 2017;114: 1448-1458. © 2017 Wiley Periodicals, Inc.
Quantifying Overdiagnosis in Cancer Screening: A Systematic Review to Evaluate the Methodology.
Ripping, Theodora M; Ten Haaf, Kevin; Verbeek, André L M; van Ravesteyn, Nicolien T; Broeders, Mireille J M
2017-10-01
Overdiagnosis is the main harm of cancer screening programs but is difficult to quantify. This review aims to evaluate existing approaches to estimate the magnitude of overdiagnosis in cancer screening in order to gain insight into the strengths and limitations of these approaches and to provide researchers with guidance to obtain reliable estimates of overdiagnosis in cancer screening. A systematic review was done of primary research studies in PubMed that were published before January 1, 2016, and quantified overdiagnosis in breast cancer screening. The studies meeting inclusion criteria were then categorized by their methods to adjust for lead time and to obtain an unscreened reference population. For each approach, we provide an overview of the data required, assumptions made, limitations, and strengths. A total of 442 studies were identified in the initial search. Forty studies met the inclusion criteria for the qualitative review. We grouped the approaches to adjust for lead time in two main categories: the lead time approach and the excess incidence approach. The lead time approach was further subdivided into the mean lead time approach, lead time distribution approach, and natural history modeling. The excess incidence approach was subdivided into the cumulative incidence approach and early vs late-stage cancer approach. The approaches used to obtain an unscreened reference population were grouped into the following categories: control group of a randomized controlled trial, nonattenders, control region, extrapolation of a prescreening trend, uninvited groups, adjustment for the effect of screening, and natural history modeling. Each approach to adjust for lead time and obtain an unscreened reference population has its own strengths and limitations, which should be taken into consideration when estimating overdiagnosis. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
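As a toy numeric example of the excess incidence approach (all numbers invented): with follow-up long enough to outlast lead time, overdiagnosis is estimated from the cumulative incidence surplus in the invited group relative to the unscreened reference, and the choice of denominator changes the reported percentage.

invited = {"n": 50_000, "cancers": 1_650}      # invited-to-screening arm
reference = {"n": 50_000, "cancers": 1_500}    # unscreened reference population

excess = invited["cancers"] - reference["cancers"]
print("excess cancers:", excess)                                        # 150
print("as % of cancers in invited arm:", 100 * excess / invited["cancers"])
print("as % of expected cancers:      ", 100 * excess / reference["cancers"])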
Managing Behavioral Symptoms in Dementia Using Nonpharmacologic Approaches: An Overview
Gitlin, Laura N.; Kales, Helen C.; Lyketsos, Constantine G.
2013-01-01
Behavioral symptoms such as repetitive statements and questions, wandering, and sleep disturbances are a core clinical feature of Alzheimer disease and related dementias, affecting patients and their families. These behaviors have devastating effects. If untreated, they can contribute to more rapid disease progression, earlier nursing home placement, worse quality of life, accelerated functional decline, greater caregiver distress, and higher health care utilization and costs. Patients with dementia are typically not screened for behavioral symptoms in primary care and, even when symptoms are clinically reported, tend to receive ineffective, inappropriate, and fragmented care. Yet clinicians are often called upon to address behaviors that place the patient or others at risk or that families find problematic. It is important to include ongoing systematic screening for behavioral symptoms, to facilitate prevention and early treatment, as part of standard comprehensive dementia care. When identified, behaviors should be characterized and underlying causes sought in order to derive a treatment plan. Because the available pharmacologic treatments have modest efficacy at best, are associated with notable risks, and do not address the behaviors most distressing for families, nonpharmacologic options are recommended as first-line treatments or, if necessary, in parallel with pharmacologic or other treatment options. Nonpharmacologic treatments may include a general approach (caregiver education and training in problem solving, communication, and task simplification skills; patient exercise; and/or activity programs) or a targeted approach in which precipitating conditions of a specific behavior are identified and modified (e.g., implementing nighttime routines to address sleep disturbances). Using the case of Mr A, we characterize common behavioral symptoms of dementia and describe an assessment strategy for selecting evidence-based nonpharmacologic treatments. We highlight the clinician's important role in facilitating collaboration with specialists and other health care professionals to implement nonpharmacologic treatment plans. Substantial evidence shows that nonpharmacologic approaches can yield high levels of patient and caregiver satisfaction, quality-of-life improvements, and reductions in behavioral symptoms. Although access to nonpharmacologic approaches is currently limited, they should be part of standard dementia care. PMID:23168825
Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che
2014-01-16
To avoid the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt automated reverse-engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, the mechanism of cloud computing is a promising solution: most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can successfully infer networks with desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel-model population-based optimization method and the parallel computational framework, high-quality solutions can be obtained within a relatively short time. This integrated approach is a promising way to infer large networks.
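The parallel skeleton can be conveyed in a few dozen lines. In the sketch below, Python's multiprocessing pool stands in for the Hadoop MapReduce map phase, and the toy fitness function stands in for simulating a candidate gene network; none of the names come from the authors' code.

import random
from multiprocessing import Pool

DIM = 8

def fitness(position):
    # Stand-in for scoring a candidate network parameter set against profiles.
    return -sum((p - 0.5) ** 2 for p in position)

def pso_step(particle, best, w=0.7, c=1.4):
    pos, vel = particle
    vel = [w * v + c * random.random() * (b - x) for x, v, b in zip(pos, vel, best)]
    return [x + v for x, v in zip(pos, vel)], vel

def crossover(a, b):
    cut = random.randrange(1, DIM)
    return a[:cut] + b[cut:]

if __name__ == "__main__":
    swarm = [([random.random() for _ in range(DIM)], [0.0] * DIM)
             for _ in range(40)]
    with Pool() as pool:
        for _ in range(30):
            scores = pool.map(fitness, [pos for pos, _ in swarm])  # parallel map
            order = sorted(range(len(swarm)), key=scores.__getitem__)
            best = swarm[order[-1]][0]
            for i in order[:5]:                  # GA: recombine elites over worst
                a, b = random.sample(order[-10:], 2)
                swarm[i] = (crossover(swarm[a][0], swarm[b][0]), [0.0] * DIM)
            swarm = [pso_step(p, best) for p in swarm]             # PSO move
    print(round(fitness(best), 4))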
Advanced Lung Cancer Screening: An Individualized Molecular Nanotechnology Approach
2016-03-01
Award Number: W81XWH-12-1-0323. Title: Advanced Lung Cancer Screening: An Individualized Molecular Nanotechnology Approach. [Fragmented report documentation page; recoverable text:] "...increasing its sensitivity and specificity through nanotechnology. Hypothesis: Detection of DNA methylation from individuals with cancer can be used to..."
A Primer for DoD Reliability, Maintainability and Safety Standards
1988-03-02
[Fragmented excerpt from guidance on DOD-HDBK-344 (USAF)] ...the project engineer and the concurrence of their respective managers. The primary consideration in such cases is the thoroughness of the... There are two basic approaches to the application of environmental stress screening: in one approach, the government explicitly specifies the screens and screening...
Cervical screening in HPV-vaccinated populations.
Canfell, K
2018-06-01
Cervical screening with cytology has been the basis for substantial reductions in cervical cancer incidence and mortality in most high-income countries over the last few decades. More recently, there have been two key, parallel developments which have prompted a major reconsideration of cervical screening. The first is the emergence of evidence on the improved sensitivity of human papillomavirus (HPV) DNA testing compared to cytology, and the second is the large-scale deployment of prophylactic vaccination against HPV. A key challenge to be overcome before HPV screening could be introduced into national cervical screening programs was its lower specificity: a positive HPV test indicates an infection, not necessarily a precancerous lesion. This has been addressed in three ways: (1) by considering the appropriate age for starting HPV screening (30 years in unvaccinated populations and 25 years in populations with mature vaccination programs and high vaccine uptake) and the appropriate screening interval; (2) via development of clinical HPV tests, which are (by design) not as sensitive to low viral loads; and (3) by introducing effective triaging for HPV-positive women, which further risk-stratifies women before referral for diagnostic evaluation. This review discusses these major developments and describes how the benefits of HPV screening are being optimized in both unvaccinated and vaccinated populations.
Metabolomics Approach for Toxicity Screening of Volatile Substances
In 2007 the National Research Council envisioned the need for inexpensive, high-throughput, cell-based toxicity testing methods relevant to human health. High-throughput screening (HTS) in vitro approaches have addressed these problems by using robotics. However, the ch...
Parallelized Seeded Region Growing Using CUDA
Park, Seongjin; Lee, Hyunna; Seo, Jinwook; Lee, Kyoung Ho; Shin, Yeong-Gil; Kim, Bohyoung
2014-01-01
This paper presents a novel method for parallelizing the seeded region growing (SRG) algorithm using Compute Unified Device Architecture (CUDA) technology, with the intention of overcoming the theoretical weakness of the SRG algorithm: its computation time is directly proportional to the size of the segmented region. The segmentation performance of the proposed CUDA-based SRG is compared with SRG implementations on single-core CPUs, on quad-core CPUs, and in shader language programming, using synthetic datasets and 20 body CT scans. Based on the experimental results, the CUDA-based SRG outperforms the other three implementations, suggesting that it can substantially assist segmentation during massive CT screening tests. PMID:25309619
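For intuition, the sketch below restates SRG in a data-parallel form: each iteration tests the entire region frontier at once, mirroring the role of one CUDA kernel launch, rather than growing a single pixel at a time. It is a simplified 2D, 4-connectivity illustration in NumPy, not the paper's CUDA implementation.

```python
import numpy as np

def srg_parallel(image, seed, tol):
    """Frontier-parallel seeded region growing (2D, 4-connectivity).
    Each while-iteration tests every frontier pixel simultaneously."""
    region = np.zeros(image.shape, dtype=bool)
    region[seed] = True
    mean = image[seed]
    while True:
        grown = region.copy()               # dilate the region by one pixel
        grown[1:, :] |= region[:-1, :]
        grown[:-1, :] |= region[1:, :]
        grown[:, 1:] |= region[:, :-1]
        grown[:, :-1] |= region[:, 1:]
        frontier = grown & ~region
        accepted = frontier & (np.abs(image - mean) <= tol)
        if not accepted.any():
            return region
        region |= accepted
        mean = image[region].mean()         # update the region statistic

img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0
mask = srg_parallel(img, seed=(30, 30), tol=0.5)
print(mask.sum())  # 400: the bright square is segmented
```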
Transitioning NWChem to the Next Generation of Manycore Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bylaska, Eric J.; Apra, E; Kowalski, Karol
The NorthWest chemistry (NWChem) modeling software is a popular molecular chemistry simulation software that was designed from the start to work on massively parallel processing supercomputers [1-3]. It contains an umbrella of modules that today includes self-consistent field (SCF), second-order Møller-Plesset perturbation theory (MP2), coupled cluster (CC), multiconfiguration self-consistent field (MCSCF), selected configuration interaction (CI), tensor contraction engine (TCE) many-body methods, density functional theory (DFT), time-dependent density functional theory (TDDFT), real-time time-dependent density functional theory, pseudopotential plane-wave density functional theory (PSPW), band structure (BAND), ab initio molecular dynamics (AIMD), Car-Parrinello molecular dynamics (MD), classical MD, hybrid quantum mechanics/molecular mechanics (QM/MM), hybrid ab initio molecular dynamics/molecular mechanics (AIMD/MM), gauge-independent atomic orbital nuclear magnetic resonance (GIAO NMR), conductor-like screening solvation model (COSMO), conductor-like screening solvation model based on density (COSMO-SMD), and reference interaction site model (RISM) solvation models, as well as free energy simulations, reaction path optimization, and parallel-in-time methods, among other capabilities [4]. Moreover, new capabilities continue to be added with each new release.
36 CFR Appendix D to Part 1191 - Technical
Code of Federal Regulations, 2014 CFR
2014-07-01
[Fragmented excerpt] ...inch (13 mm) high shall be ramped, and shall comply with 405 or 406. 304 Turning Space; 304.1 General... Where a clear floor or ground space allows a parallel approach to an element and the side reach is unobstructed, the high side... .2 Obstructed High Reach: where a clear floor or ground space allows a parallel approach to an element and the...
Tranberg, Mette; Bech, Bodil Hammer; Blaakær, Jan; Jensen, Jørgen Skov; Svanholm, Hans; Andersen, Berit
2016-11-03
The effectiveness of cervical cancer screening programs is challenged by suboptimal participation and coverage. Offering cervico-vaginal self-sampling for human papillomavirus testing (HPV self-sampling) to non-participants can increase screening participation. However, the effect varies substantially among studies, especially depending on the approach used to offer HPV self-sampling. The present trial evaluates the effect on participation in an organized screening program of an HPV self-sampling kit mailed directly to the woman's home, or mailed to her home on demand only, compared with the standard second reminder for regular screening. The CHOiCE trial is a parallel, randomized, controlled, open-label trial. It will include 9327 women aged 30-64 years who are living in the Central Denmark Region and who have not participated in cervical cancer screening after an invitation and one reminder. The women will be equally randomized into three arms: 1) directly mailed a second reminder including an HPV self-sampling kit; 2) mailed a second reminder offering an HPV self-sampling kit, to be ordered by e-mail, text message, phone, or through a webpage; and 3) mailed a second reminder for a practitioner-collected sample (control group). The primary outcome will be the proportion of women in the intervention groups who participate by returning their HPV self-sampling kit or have a practitioner-collected sample, compared with the proportion of women who have a practitioner-collected sample in the control group, at 90 and 180 days after mail-out of the second reminders. Per-protocol and intention-to-treat analyses will be performed. The secondary outcome will be the proportion of women with a positive HPV self-collected sample who attend follow-up testing at 30, 60, or 90 days after mail-out of the results. The CHOiCE trial will provide strong and important evidence allowing us to determine if and how HPV self-sampling can be used to increase participation in cervical cancer screening. This trial therefore has the potential to improve prevention and reduce the number of deaths caused by cervical cancer. Current Controlled Trials NCT02680262. Registered 10 February 2016.
On extending parallelism to serial simulators
NASA Technical Reports Server (NTRS)
Nicol, David; Heidelberger, Philip
1994-01-01
This paper describes an approach to discrete event simulation modeling that appears to be effective for developing portable and efficient parallel execution of models of large distributed systems and communication networks. In this approach, the modeler develops submodels using an existing sequential simulation modeling tool, using the full expressive power of the tool. A set of modeling language extensions permit automatically synchronized communication between submodels; however, the automation requires that any such communication take a nonzero amount of simulation time. Within this modeling paradigm, a variety of conservative synchronization protocols can transparently support conservative execution of submodels on potentially different processors. A specific implementation of this approach, U.P.S. (Utilitarian Parallel Simulator), is described, along with performance results on the Intel Paragon.
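The key constraint, that cross-submodel messages consume nonzero simulation time, is what makes conservative synchronization possible; the toy sketch below illustrates the idea with two logical processes and a fixed lookahead. All names are illustrative assumptions; this is not U.P.S. code.

```python
import heapq

LOOKAHEAD = 1.0   # minimum simulation-time delay of any cross-submodel message
END = 5.0

class Submodel:
    def __init__(self, name):
        self.name, self.clock, self.queue = name, 0.0, []

    def post(self, t, msg):
        heapq.heappush(self.queue, (t, msg))

    def run_until(self, safe_time, peer):
        # Safe to process every event up to the peer's clock + LOOKAHEAD:
        # nothing from the peer can arrive earlier than that.
        while self.queue and self.queue[0][0] <= safe_time:
            t, msg = heapq.heappop(self.queue)
            self.clock = t
            print(f"t={t:.1f} {self.name}: {msg}")
            if t + LOOKAHEAD <= END:          # reacting sends a delayed message
                peer.post(t + LOOKAHEAD, f"reply to '{msg}'")
        self.clock = max(self.clock, safe_time)  # advance even when idle

a, b = Submodel("A"), Submodel("B")
a.post(0.5, "start")
while min(a.clock, b.clock) < END:
    a.run_until(min(END, b.clock + LOOKAHEAD), b)
    b.run_until(min(END, a.clock + LOOKAHEAD), a)
```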
Stage-by-Stage and Parallel Flow Path Compressor Modeling for a Variable Cycle Engine
NASA Technical Reports Server (NTRS)
Kopasakis, George; Connolly, Joseph W.; Cheng, Larry
2015-01-01
This paper covers the development of stage-by-stage and parallel flow path compressor modeling approaches for a Variable Cycle Engine. The stage-by-stage compressor modeling approach is an extension of a technique for lumped volume dynamics and performance characteristic modeling. It was developed to improve the accuracy of axial compressor dynamics over lumped volume dynamics modeling. The stage-by-stage compressor model presented here is formulated into a parallel flow path model that includes both axial and rotational dynamics. This is done to enable the study of compressor and propulsion system dynamic performance under flow distortion conditions. The approaches utilized here are generic and should be applicable for the modeling of any axial flow compressor design.
Wassif, Christopher A; Cross, Joanna L; Iben, James; Sanchez-Pulido, Luis; Cougnoux, Antony; Platt, Frances M; Ory, Daniel S; Ponting, Chris P; Bailey-Wilson, Joan E; Biesecker, Leslie G; Porter, Forbes D
2016-01-01
Niemann-Pick disease type C (NPC) is a recessive, neurodegenerative, lysosomal storage disease caused by mutations in either NPC1 or NPC2. The diagnosis is difficult and frequently delayed. Ascertainment is likely incomplete because of both these factors and because the full phenotypic spectrum may not have been fully delineated. Given the recent development of a blood-based diagnostic test and the development of potential therapies, understanding the incidence of NPC and defining at-risk patient populations are important. We evaluated data from four large, massively parallel exome sequencing data sets. Variant sequences were identified and classified as pathogenic or nonpathogenic based on a combination of literature review and bioinformatic analysis. This methodology provided an unbiased approach to determining the allele frequency. Our data suggest an incidence rate for NPC1 and NPC2 of 1/92,104 and 1/2,858,998, respectively. Evaluation of common NPC1 variants, however, suggests that there may be a late-onset NPC1 phenotype with a markedly higher incidence, on the order of 1/19,000-1/36,000. We determined a combined incidence of classical NPC of 1/89,229, or 1.12 affected patients per 100,000 conceptions, but predict incomplete ascertainment of a late-onset phenotype of NPC1. This finding strongly supports the need for increased screening of potential patients.
Engineering shadows to fabricate optical metasurfaces.
Nemiroski, Alex; Gonidec, Mathieu; Fox, Jerome M; Jean-Remy, Philip; Turnage, Evan; Whitesides, George M
2014-11-25
Optical metasurfaces, patterned arrays of plasmonic nanoantennas that enable the precise manipulation of light-matter interactions, are emerging as critical components in many nanophotonic materials, including planar metamaterials, chemical and biological sensors, and photovoltaics. The development of these materials has been slowed by the difficulty of efficiently fabricating patterns with the required combinations of intricate nanoscale structure, high areal density, and/or heterogeneous composition. One convenient strategy that enables parallel fabrication of periodic nanopatterns uses self-assembled colloidal monolayers as shadow masks; this method has, however, not been extended beyond a small set of simple patterns and, thus, has remained incompatible with the broad design requirements of metasurfaces. This paper demonstrates a technique, shadow-sphere lithography (SSL), that uses sequential deposition from multiple angles through plasma-etched microspheres to expand the variety and complexity of structures accessible by colloidal masks. SSL harnesses the entire, relatively unexplored space of shadow-derived shapes and, with custom software to guide multiangled deposition, contains sufficient degrees of freedom to (i) design and fabricate a wide variety of metasurfaces that incorporate complex structures with small feature sizes and multiple materials and (ii) generate, in parallel, thousands of variations of structures for high-throughput screening of new patterns that may yield unexpected optical spectra. This generalized approach to engineering shadows of spheres provides a new strategy for efficient prototyping and discovery of periodic metasurfaces.
Vogel, Jörg; Bartels, Verena; Tang, Thean Hock; Churakov, Gennady; Slagter-Jäger, Jacoba G.; Hüttenhofer, Alexander; Wagner, E. Gerhart H.
2003-01-01
Recent bioinformatics-aided searches have identified many new small RNAs (sRNAs) in the intergenic regions of the bacterium Escherichia coli. Here, a shot-gun cloning approach (RNomics) was used to generate cDNA libraries of small sized RNAs. Besides many of the known sRNAs, we found new species that were not predicted previously. The present work brings the number of sRNAs in E.coli to 62. Experimental transcription start site mapping showed that some sRNAs were encoded from independent genes, while others were processed from mRNA leaders or trailers, indicative of a parallel transcriptional output generating sRNAs co-expressed with mRNAs. Two of these RNAs (SroA and SroG) consist of known (THI and RFN) riboswitch elements. We also show that two recently identified sRNAs (RyeB and SraC/RyeA) interact, resulting in RNase III-dependent cleavage. To the best of our knowledge, this represents the first case of two non-coding RNAs interacting by a putative antisense mechanism. In addition, intracellular metabolic stabilities of sRNAs were determined, including ones from previous screens. The wide range of half-lives (<2 to >32 min) indicates that sRNAs cannot generally be assumed to be metabolically stable. The experimental characterization of sRNAs analyzed here suggests that the definition of an sRNA is more complex than previously assumed. PMID:14602901
Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C
2011-01-01
Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian error distributions, this approach is not optimal. Therefore, rather than using probabilistic modeling, we propose an alternative non-parametric approach. To extract more structure from the input signal (neuronal spike counts), we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters necessary in motor decoding is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA), and it scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
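To make the cost structure concrete, here is a minimal NumPy sketch of one batch MEE update for a linear decoder, assuming a Gaussian kernel and batch adaptation; names and constants are illustrative, not the paper's FPGA design. The O(N²) pairwise kernel terms are the independent computations that map naturally onto pipelined hardware blocks.

```python
import numpy as np

def mee_update(W, X, D, lr=0.1, sigma=1.0):
    """One batch MEE step for a linear decoder Y = X @ W, ascending the
    Parzen-estimated information potential
    V(e) = mean_{i,l} exp(-(e_i - e_l)^2 / (4 sigma^2)).
    X: (N, inputs) binned spike counts; D: (N, outputs) desired kinematics."""
    E = D - X @ W                              # errors, shape (N, outputs)
    N = X.shape[0]
    for j in range(W.shape[1]):                # output channels are independent
        e = E[:, j]
        diff = e[:, None] - e[None, :]         # all pairwise error differences
        k = np.exp(-diff ** 2 / (4 * sigma ** 2)) * diff
        # Every (i, l) pair contributes independently to the gradient, which
        # is what makes the algorithm so amenable to parallelization.
        grad = (k[:, :, None] * (X[:, None, :] - X[None, :, :])).sum(axis=(0, 1))
        W[:, j] += lr * grad / (2 * sigma ** 2 * N ** 2)
    return W

# Example: decode 2-D kinematics from 10 neurons over a 200-sample window.
rng = np.random.default_rng(0)
X = rng.poisson(3.0, size=(200, 10)).astype(float)
D = X @ rng.normal(size=(10, 2)) + rng.standard_t(df=3, size=(200, 2))
W = np.zeros((10, 2))
for _ in range(50):
    W = mee_update(W, X, D)
```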
Shieh, Yiwey; Eklund, Martin; Madlensky, Lisa; Sawyer, Sarah D; Thompson, Carlie K; Stover Fiscalini, Allison; Ziv, Elad; Van't Veer, Laura J; Esserman, Laura J; Tice, Jeffrey A
2017-01-01
Ongoing controversy over the optimal approach to breast cancer screening has led to discordant professional society recommendations, particularly in women age 40 to 49 years. One potential solution is risk-based screening, where decisions around the starting age, stopping age, frequency, and modality of screening are based on individual risk to maximize the early detection of aggressive cancers and minimize the harms of screening through optimal resource utilization. We present a novel approach to risk-based screening that integrates clinical risk factors, breast density, a polygenic risk score representing the cumulative effects of genetic variants, and sequencing for moderate- and high-penetrance germline mutations. We demonstrate how thresholds of absolute risk estimates generated by our prediction tools can be used to stratify women into different screening strategies (biennial mammography, annual mammography, annual mammography with adjunctive magnetic resonance imaging, defer screening at this time) while informing the starting age of screening for women age 40 to 49 years. Our risk thresholds and corresponding screening strategies are based on current evidence but need to be tested in clinical trials. The Women Informed to Screen Depending On Measures of risk (WISDOM) Study, a pragmatic, preference-tolerant randomized controlled trial of annual vs personalized screening, will study our proposed approach. WISDOM will evaluate the efficacy, safety, and acceptability of risk-based screening beginning in the fall of 2016. The adaptive design of this trial allows continued refinement of our risk thresholds as the trial progresses, and we discuss areas where we anticipate emerging evidence will impact our approach. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Parallel computation with molecular-motor-propelled agents in nanofabricated networks.
Nicolau, Dan V; Lard, Mercy; Korten, Till; van Delft, Falco C M J M; Persson, Malin; Bengtsson, Elina; Månsson, Alf; Diez, Stefan; Linke, Heiner; Nicolau, Dan V
2016-03-08
The combinatorial nature of many important mathematical problems, including nondeterministic-polynomial-time (NP)-complete problems, places a severe limitation on the problem size that can be solved with conventional, sequentially operating electronic computers. There have been significant efforts in conceiving parallel-computation approaches in the past, for example: DNA computation, quantum computation, and microfluidics-based computation. However, these approaches have not proven, so far, to be scalable and practical from a fabrication and operational perspective. Here, we report the foundations of an alternative parallel-computation system in which a given combinatorial problem is encoded into a graphical, modular network that is embedded in a nanofabricated planar device. Exploring the network in a parallel fashion using a large number of independent, molecular-motor-propelled agents then solves the mathematical problem. This approach uses orders of magnitude less energy than conventional computers, thus addressing issues related to power consumption and heat dissipation. We provide a proof-of-concept demonstration of such a device by solving, in a parallel fashion, the small instance {2, 5, 9} of the subset sum problem, which is a benchmark NP-complete problem. Finally, we discuss the technical advances necessary to make our system scalable with presently available technology.
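For scale, the benchmark instance cited above can be enumerated directly on a conventional computer; the device's contribution is performing this same exploration physically, in parallel, with molecular-motor-propelled agents. A brute-force sketch of the instance {2, 5, 9}:

```python
from itertools import combinations

def subset_sums(values):
    # Enumerate all 2^n subsets; each corresponds to one agent path through
    # the network's split ("include") and pass ("exclude") junctions.
    sums = {}
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            sums.setdefault(sum(combo), []).append(combo)
    return sums

print(sorted(subset_sums((2, 5, 9))))
# -> [0, 2, 5, 7, 9, 11, 14, 16]: the exits where agents accumulate.
```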
ERIC Educational Resources Information Center
Green, Samuel B.; Levy, Roy; Thompson, Marilyn S.; Lu, Min; Lo, Wen-Juo
2012-01-01
A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to…
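For readers unfamiliar with the procedure, a minimal NumPy sketch of conventional parallel analysis (Horn's PCA variant, with the common 95th-percentile rule) follows; the revision proposed by the authors is not reproduced here, and the simulation count is an illustrative choice.

```python
import numpy as np

def parallel_analysis(data, n_sims=500, percentile=95, seed=0):
    """Retain components whose observed correlation-matrix eigenvalues exceed
    the given percentile of eigenvalues from random normal data of equal size."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_sims, p))
    for s in range(n_sims):
        r = rng.standard_normal((n, p))
        rand[s] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresh = np.percentile(rand, percentile, axis=0)
    keep = obs > thresh
    return p if keep.all() else int(np.argmin(keep))  # stop at first failure

# Two correlated blocks of variables -> typically 2 retained factors.
rng = np.random.default_rng(1)
f = rng.standard_normal((300, 2))
block1 = f[:, [0]] + 0.5 * rng.standard_normal((300, 3))
block2 = f[:, [1]] + 0.5 * rng.standard_normal((300, 3))
print(parallel_analysis(np.hstack([block1, block2])))
```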
NASA Astrophysics Data System (ADS)
Cai, Yong; Cui, Xiangyang; Li, Guangyao; Liu, Wenyang
2018-04-01
The edge-smooth finite element method (ES-FEM) can improve the computational accuracy of triangular shell elements and the mesh partition efficiency of complex models. In this paper, an approach is developed to perform explicit finite element simulations of contact-impact problems with a graphical processing unit (GPU) using a special edge-smooth triangular shell element based on ES-FEM. Of critical importance for this problem is achieving finer-grained parallelism to enable efficient data loading and to minimize communication between the device and host. Four kinds of parallel strategies are then developed to efficiently solve these ES-FEM based shell element formulas, and various optimization methods are adopted to ensure aligned memory access. Special focus is dedicated to developing an approach for the parallel construction of edge systems. A parallel hierarchy-territory contact-searching algorithm (HITA) and a parallel penalty function calculation method are embedded in this parallel explicit algorithm. Finally, the program flow is well designed, and a GPU-based simulation system is developed, using Nvidia's CUDA. Several numerical examples are presented to illustrate the high quality of the results obtained with the proposed methods. In addition, the GPU-based parallel computation is shown to significantly reduce the computing time.
Murphy, Mark; Alley, Marcus; Demmel, James; Keutzer, Kurt; Vasanawala, Shreyas; Lustig, Michael
2012-01-01
We present ℓ1-SPIRiT, a simple algorithm for auto-calibrating parallel imaging (acPI) and compressed sensing (CS) that permits an efficient implementation with clinically feasible runtimes. We propose a CS objective function that minimizes cross-channel joint sparsity in the wavelet domain. Our reconstruction minimizes this objective via iterative soft-thresholding, and integrates naturally with iterative Self-Consistent Parallel Imaging (SPIRiT). Like many iterative MRI reconstructions, ℓ1-SPIRiT's image quality comes at a high computational cost. Excessively long runtimes are a barrier to the clinical use of any reconstruction approach, and thus we discuss our approach to efficiently parallelizing ℓ1-SPIRiT and to achieving clinically feasible runtimes. We present parallelizations of ℓ1-SPIRiT for both multi-GPU systems and multi-core CPUs, and discuss the software optimization and parallelization decisions made in our implementation. The performance of these alternatives depends on the processor architecture, the size of the image matrix, and the number of parallel imaging channels. Fundamentally, achieving fast runtime requires the correct trade-off between cache usage and parallelization overheads. We demonstrate image quality via a case from our clinical experimentation, using a custom 3DFT Spoiled Gradient Echo (SPGR) sequence with up to 8× acceleration via Poisson-disc undersampling in the two phase-encoded directions. PMID:22345529
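The sparsity step at the heart of the iteration is compact enough to sketch. A minimal NumPy illustration of cross-channel joint soft-thresholding follows; the full ℓ1-SPIRiT iteration also interleaves wavelet transforms, SPIRiT calibration consistency, and k-space data consistency, all omitted here.

```python
import numpy as np

def joint_soft_threshold(coeffs, lam):
    """Joint (group) soft-thresholding across receive channels.
    coeffs: complex wavelet coefficients, shape (channels, ...).
    Shrinks the root-sum-of-squares magnitude over channels by lam while
    preserving each channel's phase -- the cross-channel joint sparsity
    penalty minimized at every iteration."""
    mag = np.sqrt((np.abs(coeffs) ** 2).sum(axis=0, keepdims=True))
    shrink = np.maximum(1.0 - lam / np.maximum(mag, 1e-12), 0.0)
    return coeffs * shrink

# Example: 8 channels of 256x256 coefficients.
c = np.random.randn(8, 256, 256) + 1j * np.random.randn(8, 256, 256)
c_thresh = joint_soft_threshold(c, lam=2.0)
```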
Effect of Vibration on Retention Characteristics of Screen Acquisition Systems
NASA Technical Reports Server (NTRS)
Tegart, J. R.; Park, A. C.
1977-01-01
An analytical and experimental investigation of the effect of vibration on the retention characteristics of screen acquisition systems was performed. The functioning of surface tension devices using fine-mesh screens requires that the pressure differential acting on the screen be less than its pressure retention capability. When this capability is exceeded, screen breakdown occurs and gas-free expulsion of propellant is no longer possible. An analytical approach to predicting the effect of vibration was developed. This approach considers the transmission of the vibration to the screens of the device and the coupling of the liquid and the screen in establishing the screen response. A method of evaluating the transient response of the gas/liquid interface within the screen was also developed.
3D mosquito screens to create window double screen traps for mosquito control.
Khattab, Ayman; Jylhä, Kaisa; Hakala, Tomi; Aalto, Mikko; Malima, Robert; Kisinza, William; Honkala, Markku; Nousiainen, Pertti; Meri, Seppo
2017-08-29
Mosquitoes are vectors for many diseases such as malaria. Insecticide-treated bed nets and indoor residual spraying of insecticides are the principal malaria vector control tools used to prevent malaria in the tropics. Other interventions aim at reducing man-vector contact. For example, house screening provides additive or synergistic effects to other implemented measures. We used commercial screen materials made of polyester, polyethylene or polypropylene to design novel mosquito screens that provide remarkable additional benefits to those commonly used in house screening. The novel design is based on a double screen setup made of a screen with 3D geometric structures parallel to a commercial mosquito screen creating a trap between the two screens. Owing to the design of the 3D screen, mosquitoes can penetrate the 3D screen from one side but cannot return through the other side, making it a unidirectional mosquito screen. Therefore, the mosquitoes are trapped inside the double screen system. The permissiveness of both sides of the 3D screens for mosquitoes to pass through was tested in a wind tunnel using the insectary strain of Anopheles stephensi. Among twenty-five tested 3D screen designs, three designs from the cone, prism, or cylinder design groups were the most efficient in acting as unidirectional mosquito screens. The three cone-, prism-, and cylinder-based screens allowed, on average, 92, 75 and 64% of Anopheles stephensi mosquitoes released into the wind tunnel to penetrate the permissive side and 0, 0 and 6% of mosquitoes to escape through the non-permissive side, respectively. A cone-based 3D screen fulfilled the study objective. It allowed capturing 92% of mosquitoes within the double screen setup inside the wind tunnel and blocked 100% from escaping. Thus, the cone-based screen effectively acted as a unidirectional mosquito screen. This 3D screen-based trap design could therefore be used in house screening as a means of avoiding infective bites and reducing mosquito population size.
Gaudin, Valérie
2017-09-01
Screening methods are used as a first-line approach to detect the presence of antibiotic residues in food of animal origin. The validation process guarantees that the method is fit-for-purpose, suited to regulatory requirements, and provides evidence of its performance. This article is focused on intra-laboratory validation. The first step in validation is characterisation of performance, and the second step is the validation itself with regard to pre-established criteria. The validation approaches can be absolute (a single method) or relative (comparison of methods), overall (combination of several characteristics in one) or criterion-by-criterion. Various approaches to validation, in the form of regulations, guidelines or standards, are presented and discussed to draw conclusions on their potential application for different residue screening methods, and to determine whether or not they reach the same conclusions. The approach by comparison of methods is not suitable for screening methods for antibiotic residues. The overall approaches, such as probability of detection (POD) and accuracy profile, are increasingly used in other fields of application. They may be of interest for screening methods for antibiotic residues. Finally, the criterion-by-criterion approach (Decision 2002/657/EC and of European guideline for the validation of screening methods), usually applied to the screening methods for antibiotic residues, introduced a major characteristic and an improvement in the validation, i.e. the detection capability (CCβ). In conclusion, screening methods are constantly evolving, thanks to the development of new biosensors or liquid chromatography coupled to tandem-mass spectrometry (LC-MS/MS) methods. There have been clear changes in validation approaches these last 20 years. Continued progress is required and perspectives for future development of guidelines, regulations and standards for validation are presented here.
Preparation of Protein Samples for NMR Structure, Function, and Small Molecule Screening Studies
Acton, Thomas B.; Xiao, Rong; Anderson, Stephen; Aramini, James; Buchwald, William A.; Ciccosanti, Colleen; Conover, Ken; Everett, John; Hamilton, Keith; Huang, Yuanpeng Janet; Janjua, Haleema; Kornhaber, Gregory; Lau, Jessica; Lee, Dong Yup; Liu, Gaohua; Maglaqui, Melissa; Ma, Lichung; Mao, Lei; Patel, Dayaban; Rossi, Paolo; Sahdev, Seema; Shastry, Ritu; Swapna, G.V.T.; Tang, Yeufeng; Tong, Saichiu; Wang, Dongyan; Wang, Huang; Zhao, Li; Montelione, Gaetano T.
2014-01-01
In this chapter, we concentrate on the production of high quality protein samples for NMR studies. In particular, we provide an in-depth description of recent advances in the production of NMR samples and their synergistic use with recent advancements in NMR hardware. We describe the protein production platform of the Northeast Structural Genomics Consortium, and outline our high-throughput strategies for producing high quality protein samples for nuclear magnetic resonance (NMR) studies. Our strategy is based on the cloning, expression and purification of 6X-His-tagged proteins using T7-based Escherichia coli systems and isotope enrichment in minimal media. We describe 96-well ligation-independent cloning and analytical expression systems, parallel preparative scale fermentation, and high-throughput purification protocols. The 6X-His affinity tag allows for a similar two-step purification procedure implemented in a parallel high-throughput fashion that routinely results in purity levels sufficient for NMR studies (> 97% homogeneity). Using this platform, the protein open reading frames of over 17,500 different targeted proteins (or domains) have been cloned as over 28,000 constructs. Nearly 5,000 of these proteins have been purified to homogeneity in tens of milligram quantities (see Summary Statistics, http://nesg.org/statistics.html), resulting in more than 950 new protein structures, including more than 400 NMR structures, deposited in the Protein Data Bank. The Northeast Structural Genomics Consortium pipeline has been effective in producing protein samples of both prokaryotic and eukaryotic origin. Although this paper describes our entire pipeline for producing isotope-enriched protein samples, it focuses on the major updates introduced during the last 5 years (Phase 2 of the National Institute of General Medical Sciences Protein Structure Initiative). Our advanced automated and/or parallel cloning, expression, purification, and biophysical screening technologies are suitable for implementation in a large individual laboratory or by a small group of collaborating investigators for structural biology, functional proteomics, ligand screening and structural genomics research. PMID:21371586
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chacon, Luis; del-Castillo-Negrete, Diego; Hauck, Cory D.
2014-09-01
We propose a Lagrangian numerical algorithm for a time-dependent, anisotropic temperature transport equation in magnetized plasmas in the large guide field regime. The approach is based on an analytical integral formal solution of the parallel (i.e., along the magnetic field) transport equation with sources, and it is able to accommodate both local and non-local parallel heat flux closures. The numerical implementation is based on an operator-split formulation, with two straightforward steps: a perpendicular transport step (including sources), and a Lagrangian (field-line integral) parallel transport step. Algorithmically, the first step is amenable to the use of modern iterative methods, while the second step has a fixed cost per degree of freedom (and is therefore scalable). Accuracy-wise, the approach is free from the numerical pollution introduced by the discrete parallel transport term when the perpendicular-to-parallel transport coefficient ratio χ⊥/χ∥ becomes arbitrarily small, and is shown to capture the correct limiting solution when ε = χ⊥L∥²/(χ∥L⊥²) → 0 (with L∥ and L⊥ the parallel and perpendicular diffusion length scales, respectively). Therefore, the approach is asymptotic-preserving. We demonstrate the capabilities of the scheme with several numerical experiments with varying magnetic field complexity in two dimensions, including the case of transport across a magnetic island.
Parallel line analysis: multifunctional software for the biomedical sciences
NASA Technical Reports Server (NTRS)
Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.
1990-01-01
An easy-to-use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays, using analysis of covariance techniques, and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
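As an illustration of the statistical core (a sketch, not the FORTRAN program itself): under the parallelism assumption both preparations share a common slope on log dose, and the horizontal offset between the fitted lines gives the log relative potency. Function and variable names below are placeholders.

```python
import numpy as np

def relative_potency(log_dose_s, y_s, log_dose_t, y_t):
    """Common-slope (parallel-line) fit: response = intercept_group + b*log10(dose).
    Returns the relative potency 10**((a_t - a_s) / b). In practice the
    equal-slope assumption is first tested by analysis of covariance, as the
    program's manual-analysis component does."""
    n_s = len(y_s)
    X = np.zeros((n_s + len(y_t), 3))
    X[:n_s, 0] = 1.0                                    # standard intercept
    X[n_s:, 1] = 1.0                                    # test intercept
    X[:, 2] = np.concatenate([log_dose_s, log_dose_t])  # shared slope term
    y = np.concatenate([y_s, y_t])
    (a_s, a_t, b), *_ = np.linalg.lstsq(X, y, rcond=None)
    return 10 ** ((a_t - a_s) / b)

log_dose = np.log10([1.0, 2.0, 4.0, 8.0])
y_standard = np.array([10.1, 13.0, 16.2, 18.9])
y_test = np.array([12.0, 15.1, 18.0, 21.2])
print(relative_potency(log_dose, y_standard, log_dose, y_test))
```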
Kingston, Dawn; McDonald, Sheila; Biringer, Anne; Austin, Marie-Paule; Hegadoren, Kathy; McDonald, Sarah; Giallo, Rebecca; Ohinmaa, Arto; Lasiuk, Gerri; MacQueen, Glenda; Sword, Wendy; Lane-Smith, Marie; van Zanten, Sander Veldhuyzen
2014-01-02
Stress, depression, and anxiety affect 15% to 25% of pregnant women. However, substantial barriers to psychosocial assessment exist, resulting in less than 20% of prenatal care providers assessing and treating mental health problems. Moreover, pregnant women are often reluctant to disclose their mental health concerns to a healthcare provider. Identifying screening and assessment tools and procedures that are acceptable to both women and service providers, cost-effective, and clinically useful is needed. The primary objective of this randomized, parallel-group, superiority trial is to evaluate the feasibility and acceptability of a computer tablet-based prenatal psychosocial assessment (e-screening) compared to paper-based screening. Secondary objectives are to compare the two modes of screening on: (1) the level of detection of prenatal depression and anxiety symptoms and psychosocial risk; (2) the level of disclosure of symptoms; (3) the factors associated with feasibility, acceptability, and disclosure; (4) the psychometric properties of the e-version of the assessment tools; and (5) cost-effectiveness. A sample of 542 women will be recruited from large, primary care maternity clinics and a high-risk antenatal unit in an urban Canadian city. Pregnant women are eligible to participate if they: (1) receive care at one of the recruitment sites; (2) are able to speak/read English; (3) are willing to be randomized to e-screening; and (4) are willing to participate in a follow-up diagnostic interview within 1 week of recruitment. Allocation is by computer-generated randomization. Women in the intervention group will complete an online psychosocial assessment on a computer tablet, while those in the control group will complete the same assessment in paper-based form. All women will complete baseline questionnaires at the time of recruitment and will participate in a diagnostic interview within 1 week of recruitment. Research assistants conducting diagnostic interviews and physicians will be blinded. A qualitative descriptive study involving healthcare providers from the recruitment sites and women will provide data on feasibility and acceptability of the intervention. We hypothesize that mental health e-screening in primary care maternity settings and high-risk antenatal units will be as or more feasible, acceptable, and capable of detecting depression, anxiety, and psychosocial risk compared to paper-based screening. ClinicalTrials.gov Identifier: NCT01899534.
A Concept for Airborne Precision Spacing for Dependent Parallel Approaches
NASA Technical Reports Server (NTRS)
Barmore, Bryan E.; Baxley, Brian T.; Abbott, Terence S.; Capron, William R.; Smith, Colin L.; Shay, Richard F.; Hubbs, Clay
2012-01-01
The Airborne Precision Spacing concept of operations has been previously developed to support the precise delivery of aircraft landing successively on the same runway. The high-precision and consistent delivery of inter-aircraft spacing allows for increased runway throughput and the use of energy-efficient arrivals routes such as Continuous Descent Arrivals and Optimized Profile Descents. This paper describes an extension to the Airborne Precision Spacing concept to enable dependent parallel approach operations where the spacing aircraft must manage their in-trail spacing from a leading aircraft on approach to the same runway and spacing from an aircraft on approach to a parallel runway. Functionality for supporting automation is discussed as well as procedures for pilots and controllers. An analysis is performed to identify the required information and a new ADS-B report is proposed to support these information needs. Finally, several scenarios are described in detail.
Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D
2014-03-01
Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
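The bias and its correction can be illustrated with inverse probability weighting, the idea underlying the weighted estimating equations; this hedged sketch handles a single test, whereas the paper's method jointly models several assays in one model.

```python
import numpy as np

def corrected_accuracy(test, disease, verified, p_verify):
    """Verification-bias correction by inverse probability weighting.
    test: 1/0 screening result; verified: 1 if the gold standard was obtained;
    p_verify: Pr(verified | test result) for each subject; disease: 1/0
    gold-standard result, set to 0 where unverified (those rows carry zero
    weight, so the placeholder value never contributes)."""
    w = verified / p_verify
    sens = (w * test * disease).sum() / (w * disease).sum()
    spec = (w * (1 - test) * (1 - disease)).sum() / (w * (1 - disease)).sum()
    return sens, spec

# E.g., if all screen positives but only 10% of screen negatives are verified:
# p_verify = np.where(test == 1, 1.0, 0.1)
```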
The Simplified Aircraft-Based Paired Approach With the ALAS Alerting Algorithm
NASA Technical Reports Server (NTRS)
Perry, Raleigh B.; Madden, Michael M.; Torres-Pomales, Wilfredo; Butler, Ricky W.
2013-01-01
This paper presents the results of an investigation of a proposed concept for closely spaced parallel runways called the Simplified Aircraft-based Paired Approach (SAPA). This procedure depends upon a new alerting algorithm called the Adjacent Landing Alerting System (ALAS). This study used both low fidelity and high fidelity simulations to validate the SAPA procedure and test the performance of the new alerting algorithm. The low fidelity simulation enabled a determination of minimum approach distance for the worst case over millions of scenarios. The high fidelity simulation enabled an accurate determination of timings and minimum approach distance in the presence of realistic trajectories, communication latencies, and total system error for 108 test cases. The SAPA procedure and the ALAS alerting algorithm were applied to the 750-ft parallel spacing (e.g., SFO 28L/28R) approach problem. With the SAPA procedure as defined in this paper, this study concludes that a 750-ft application does not appear to be feasible, but preliminary results for 1000-ft parallel runways look promising.
High-throughput cultivation and screening platform for unicellular phototrophs.
Tillich, Ulrich M; Wolter, Nick; Schulze, Katja; Kramer, Dan; Brödel, Oliver; Frohme, Marcus
2014-09-16
High-throughput cultivation and screening methods allow parallel, miniaturized, and cost-efficient processing of many samples. These methods, however, have not been generally established for phototrophic organisms such as microalgae or cyanobacteria. In this work we describe and test high-throughput methods with the model organism Synechocystis sp. PCC6803. The required technical automation for these processes was achieved with a Tecan Freedom Evo 200 pipetting robot. The cultivation was performed in 2.2 ml deepwell microtiter plates within a cultivation chamber outfitted with programmable shaking conditions, variable illumination, variable temperature, and an adjustable CO2 atmosphere. Each microtiter well within the chamber functions as a separate cultivation vessel with reproducible conditions. The automated measurement of various parameters such as growth, full absorption spectrum, chlorophyll concentration, and MALDI-TOF-MS, as well as a novel vitality measurement protocol, has already been established and can be monitored during cultivation. Measurements of growth parameters can be used as inputs to the system, allowing periodic automatic dilutions and therefore semi-continuous cultivation of hundreds of cultures in parallel. The system also allows the automatic generation of mid- and long-term backups of cultures to repeat experiments or to retrieve strains of interest. The presented platform allows high-throughput cultivation and screening of Synechocystis sp. PCC6803. The platform should be usable for many phototrophic microorganisms as is, and be adaptable for even more. A variety of analyses are already established, and the platform is easily expandable both in quality, i.e., with further parameters to screen for additional targets, and in quantity, i.e., in the size or number of processed samples.
Examining Parallelism of Sets of Psychometric Measures Using Latent Variable Modeling
ERIC Educational Resources Information Center
Raykov, Tenko; Patelis, Thanos; Marcoulides, George A.
2011-01-01
A latent variable modeling approach that can be used to examine whether several psychometric tests are parallel is discussed. The method consists of sequentially testing the properties of parallel measures via a corresponding relaxation of parameter constraints in a saturated model or an appropriately constructed latent variable model. The…
Evaluation of Parallel Analysis Methods for Determining the Number of Factors
ERIC Educational Resources Information Center
Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.
2010-01-01
Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation
NASA Technical Reports Server (NTRS)
Steinman, Jeff S.
1992-01-01
Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Chin; Corttrell, R. A.
This Technical Note provides an overview of high-performance parallel Big Data transfers, with and without encryption, for data in transit over multiple network channels. It shows that with the parallel approach it is feasible to carry out high-performance parallel "encrypted" Big Data transfers without serious impact on throughput, although other impacts, e.g., energy consumption, should be investigated. It also explains our rationale for using a statistics-based approach to gain understanding from test results and to improve the system. The presentation is high-level in nature. Nevertheless, at the end we pose some questions and identify potentially fruitful directions for future work.
An Approach Using Parallel Architecture to Storage DICOM Images in Distributed File System
NASA Astrophysics Data System (ADS)
Soares, Tiago S.; Prado, Thiago C.; Dantas, M. A. R.; de Macedo, Douglas D. J.; Bauer, Michael A.
2012-02-01
Telemedicine is an important and rapidly expanding area of the medical field, and many researchers are interested in improving its applications. In Brazil, development of a server called the CyclopsDCMServer began in 2005 in the State of Santa Catarina; its purpose is to employ HDF for the manipulation of medical images (DICOM) over a distributed file system. Since then, several research efforts have sought better performance. Our approach adds a parallel implementation of the server's I/O operations, exploiting an essential feature of HDF version 5: support for parallel I/O based on the MPI paradigm. Early experiments using four parallel nodes show good performance when compared to the serial HDF implementation in the CyclopsDCMServer.
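A minimal sketch of the mechanism follows, assuming h5py built against a parallel HDF5 library and an MPI launch (e.g., mpirun -n 4 python write_series.py); the file name, dataset name, and shapes are illustrative, not the CyclopsDCMServer schema.

```python
from mpi4py import MPI
import numpy as np
import h5py

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_slices, rows, cols = 64, 512, 512   # one DICOM series as a 3-D volume
with h5py.File("series.h5", "w", driver="mpio", comm=comm) as f:
    # Dataset creation is collective: every rank makes the same call.
    dset = f.create_dataset("pixel_data", (n_slices, rows, cols), dtype="uint16")
    per_rank = n_slices // size       # assumes size divides n_slices evenly
    start = rank * per_rank
    # Each rank writes its own disjoint block of slices concurrently.
    dset[start:start + per_rank] = np.zeros((per_rank, rows, cols), dtype="uint16")
```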
Distributed and parallel approach for handle and perform huge datasets
NASA Astrophysics Data System (ADS)
Konopko, Joanna
2015-12-01
Big Data refers to dynamic, large, and disparate volumes of data created by many different sources (tools, machines, sensors, mobile devices) that are uncorrelated with each other. It requires new, innovative, and scalable technology to collect, host, and analytically process such vast amounts of data, and a proper architecture for systems that handle huge datasets is needed. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture approach is also proposed that could be used to solve the analyzed problem of storing and processing Big Data.
Pilot Non-Conformance to Alerting System Commands During Closely Spaced Parallel Approaches
NASA Technical Reports Server (NTRS)
Pritchett, Amy R.; Hansman, R. John
1997-01-01
Pilot non-conformance to alerting system commands has been noted in general and to a TCAS-like collision avoidance system in a previous experiment. This paper details two experiments studying collision avoidance during closely-spaced parallel approaches in instrument meteorological conditions (IMC), and specifically examining possible causal factors of, and design solutions to, pilot non-conformance.
Steinmetz, Eric J; Auldridge, Michele E
2017-11-01
The simplicity, speed, and low cost of bacterial culture make E. coli the system of choice for most initial trials of recombinant protein expression. However, many heterologous proteins are either poorly expressed in bacteria, or are produced as incorrectly folded, insoluble aggregates that lack the activity of the native protein. In many cases, fusion to a partner protein can allow for improved expression and/or solubility of a difficult target protein. Although several different fusion partners have gained favor, none are universally effective, and identifying the one that best improves soluble expression of a given target protein is an empirical process. This unit presents a strategy for parallel screening of fusion partners for enhanced expression or solubility. The Expresso® Solubility and Expression Screening System includes a panel of seven distinct fusion partners and utilizes an extremely simple cloning strategy to enable rapid screening and identification of the most effective fusion partner. © 2017 by John Wiley & Sons, Inc.
Concurrent Collections (CnC): A new approach to parallel programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knobe, Kathleen
2010-05-07
A common approach in designing parallel languages is to provide some high level handles to manipulate the use of the parallel platform. This exposes some aspects of the target platform, for example, shared vs. distributed memory. It may expose some but not all types of parallelism, for example, data parallelism but not task parallelism. This approach must find a balance between the desire to provide a simple view for the domain expert and provide sufficient power for tuning. This is hard for any given architecture and harder if the language is to apply to a range of architectures. Either simplicity or power is lost. Instead of viewing the language design problem as one of providing the programmer with high level handles, we view the problem as one of designing an interface. On one side of this interface is the programmer (domain expert) who knows the application but needs no knowledge of any aspects of the platform. On the other side of the interface is the performance expert (programmer or program) who demands maximal flexibility for optimizing the mapping to a wide range of target platforms (parallel / serial, shared / distributed, homogeneous / heterogeneous, etc.) but needs no knowledge of the domain. Concurrent Collections (CnC) is based on this separation of concerns. The talk will present CnC and its benefits. About the speaker: Kathleen Knobe has focused throughout her career on parallelism, especially compiler technology, runtime system design and language design. She worked at Compass (aka Massachusetts Computer Associates) from 1980 to 1991 designing compilers for a wide range of parallel platforms for Thinking Machines, MasPar, Alliant, Numerix, and several government projects. In 1991 she decided to finish her education. After graduating from MIT in 1997, she joined Digital Equipment's Cambridge Research Lab (CRL). She stayed through the DEC/Compaq/HP mergers and when CRL was acquired and absorbed by Intel. She currently works in the Software and Services Group / Technology Pathfinding and Innovation.
2014-09-01
[Fragmented excerpt] ...simulation time frame from 30 days to one year. This was enabled by porting the simulation to the Pleiades supercomputer at NASA Ames Research Center... including the motivation for changes to our past approach. We then present the software implementation (3) on the NASA Ames Pleiades supercomputer... significantly updated since last year's paper [25]. The main incentive for that was the shift to a highly parallel approach in order to utilize the Pleiades...
A kinase-focused compound collection: compilation and screening strategy.
Sun, Dongyu; Chuaqui, Claudio; Deng, Zhan; Bowes, Scott; Chin, Donovan; Singh, Juswinder; Cullen, Patrick; Hankins, Gretchen; Lee, Wen-Cherng; Donnelly, Jason; Friedman, Jessica; Josiah, Serene
2006-06-01
Lead identification by high-throughput screening of large compound libraries has been supplemented with virtual screening and focused compound libraries. To complement existing approaches for lead identification at Biogen Idec, a kinase-focused compound collection was designed, developed and validated. Two strategies were adopted to populate the compound collection: a ligand shape-based virtual screening and a receptor-based approach (structural interaction fingerprint). Compounds selected with the two approaches were cherry-picked from an existing high-throughput screening compound library, ordered from suppliers and supplemented with specific medicinal compounds from internal programs. Promising hits and leads have been generated from the kinase-focused compound collection against multiple kinase targets. The principle of the collection design and screening strategy was validated and the use of the kinase-focused compound collection for lead identification has been added to existing strategies.
Segre, Lisa S; Brock, Rebecca L; O'Hara, Michael W; Gorman, Laura L; Engeldinger, Jane
2011-08-01
This case report describes the development and implementation of the Train-the-Trainer: Maternal Depression Screening Program (TTT), a novel approach to disseminating perinatal depression screening. We trained screeners according to a standard pyramid scheme of train-the-trainer programs: three experts trained representatives from health care agencies (the TTT trainers), who in turn trained their staff and implemented depression screening at their home agencies. The TTT trainers had little or no prior mental health experience, so "enhanced" components were added to ensure thorough instruction. Although TTT was implemented primarily as a services project, we evaluated both the statewide dissemination and the screening rates achieved by TTT programs. Thirty-two social service or health agencies implemented maternal depression screening in 20 counties throughout Iowa; this reached 58.2% of the Iowa population. For the 16 agencies that provided screening data, the average screening rates (number of women screened/number eligible to be screened) for the first 3 months of screening were 73.2%, 80.5%, and 79.0%, respectively. We compared screening rates of our TTT programs with those of Healthy Start, a program in which screening was established via an intensive consultation model. We found the screening rates in 62.5% of TTT agencies were comparable to those in Healthy Start. Our "enhanced" train-the-trainer method is a promising approach for broadly implementing depression-screening programs in agencies serving pregnant and postpartum women.
Gray, Andrea; Maguire, Timothy; Schloss, Rene; Yarmush, Martin L
2015-01-01
Induction of therapeutic mesenchymal stromal cell (MSC) function is dependent upon activating factors present in diseased or injured tissue microenvironments. These functions include modulation of macrophage phenotype via secreted molecules including prostaglandin E2 (PGE2). Many approaches aim to optimize MSC-based therapies, including preconditioning using soluble factors and cell immobilization in biomaterials. However, optimization of MSC function is usually inefficient as only a few factors are manipulated in parallel. We utilized fractional factorial design of experiments to screen a panel of 6 molecules (lipopolysaccharide [LPS], polyinosinic-polycytidylic acid [poly(I:C)], interleukin [IL]-6, IL-1β, interferon [IFN]-β, and IFN-γ), individually and in combinations, for the upregulation of MSC PGE2 secretion and attenuation of macrophage secretion of tumor necrosis factor (TNF)-α, a pro-inflammatory molecule, by activated-MSC conditioned medium (CM). We used multivariable linear regression (MLR) and analysis of covariance to determine differences in functions of optimal factors on monolayer MSCs and alginate-encapsulated MSCs (eMSCs). The screen revealed that LPS and IL-1β potently activated monolayer MSCs to enhance PGE2 production and attenuate macrophage TNF-α. Activation by LPS and IL-1β together synergistically increased MSC PGE2, but did not synergistically reduce macrophage TNF-α. MLR and covariate analysis revealed that macrophage TNF-α was strongly dependent on the MSC activation factor, PGE2 level, and macrophage donor but not MSC culture format (monolayer versus encapsulated). The results demonstrate the feasibility and utility of using statistical approaches for higher throughput cell analysis. This approach can be extended to develop activation schemes to maximize MSC and MSC-biomaterial functions prior to transplantation to improve MSC therapies. © 2015 American Institute of Chemical Engineers.
The Masterson Approach with play therapy: a parallel process between mother and child.
Mulherin, M A
2001-01-01
This paper discusses a case in which the Masterson Approach was used with play therapy to treat a child with a developing personality disorder. It describes the parallel progression of the child and mother in adjunct therapy throughout a six-year period. The unique value of the Masterson Approach is that it provides the therapist with a framework and tool to diagnose and treat a child during the dynamic process of play. The case describes the mother-child dyad throughout therapy. It traces their parallel processes that involve separation, individuation, rapprochement, and the recovery of real self-capacities. Each stage of treatment is described, including verbal interventions. The child's internal affective state and intrapsychic structure during the various stages of treatment are illustrated by representative pictures.
Leung, Doris Y P; Wong, Eliza M L; Chan, Carmen W H
2016-04-01
The prevalence of colorectal cancer (CRC) among older people is high. Screening for CRC presents a cost-effective secondary prevention and control strategy which results in a significant reduction in mortality. This study aims to describe the prevalence of CRC screening and examine its risk factors among Chinese community-dwelling older people, guided by a comprehensive model combining the Health Belief Model and the Extended Parallel Processing Model. A descriptive correlational study was conducted. A convenience sample of 240 community-dwelling adults aged ≥60 was recruited in May-July 2012 in Hong Kong. Participants were asked to complete a questionnaire which collected information on demographic variables, CRC-related psychosocial variables, and whether they had had a CRC screening in the past 10 years. Among the participants, 25.4% reported having a CRC screening test. Results of logistic regression analyses indicated that a higher level of cue to action and lower perceived knowledge barriers and severity-fear were significantly associated with participation in CRC screening. There were, however, no significant associations of fatalism or cancer fear with screening. The prevalence of CRC screening was low in Hong Kong Chinese community-dwelling elders. A number of modifiable factors associated with CRC screening were identified, which provides specific targets for interventions. This study also adds to the knowledge regarding the associations of fatalism and fear with CRC screening behaviors among Chinese older people. Copyright © 2015 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Guoping; D'Azevedo, Ed F; Zhang, Fan
2010-01-01
Calibration of groundwater models involves hundreds to thousands of forward solutions, each of which may solve many transient coupled nonlinear partial differential equations, resulting in a computationally intensive problem. We describe a hybrid MPI/OpenMP approach to exploit two levels of parallelism in software and hardware to reduce calibration time on multi-core computers. HydroGeoChem 5.0 (HGC5) is parallelized using OpenMP for direct solutions for a reactive transport model application and a field-scale coupled flow and transport model application. In the reactive transport model, a single parallelizable loop is identified, using GPROF, to account for over 97% of the total computational time. Addition of a few lines of OpenMP compiler directives to the loop yields a speedup of about 10 on a 16-core compute node. For the field-scale model, parallelizable loops in 14 of 174 HGC5 subroutines that require 99% of the execution time are identified. As these loops are parallelized incrementally, the scalability is found to be limited by a loop where Cray PAT detects cache miss rates of over 90%. With this loop rewritten, a speedup similar to that of the first application is achieved. The OpenMP-parallelized code can be run efficiently on multiple workstations in a network or multiple compute nodes on a cluster as slaves using parallel PEST to speed up model calibration. To run calibration on clusters as a single task, the Levenberg-Marquardt algorithm is added to HGC5 with the Jacobian calculation and lambda search parallelized using MPI. With this hybrid approach, 100-200 compute cores are used to reduce the calibration time from weeks to a few hours for these two applications. This approach is applicable to most of the existing groundwater model codes for many applications.
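To make the directive-insertion step above concrete, here is a minimal, self-contained C sketch (not HGC5 code; the node count, species count, and kinetics are hypothetical placeholders) of parallelizing a single dominant loop with one OpenMP directive, the same pattern the abstract describes:

#include <omp.h>
#include <stdio.h>

#define NNODES 10000
#define NSPECIES 100

static double conc[NNODES][NSPECIES];   /* per-node species concentrations */

/* hypothetical per-node reaction update; nodes are mutually independent */
static void react_node(double *c) {
    for (int s = 0; s < NSPECIES; ++s)
        c[s] *= 0.999;                  /* placeholder kinetics */
}

int main(void) {
    /* the one loop that dominates runtime: iterations are independent,
       so a single compiler directive parallelizes it */
    #pragma omp parallel for schedule(static)
    for (int i = 0; i < NNODES; ++i)
        react_node(conc[i]);

    printf("updated %d nodes using up to %d threads\n",
           NNODES, omp_get_max_threads());
    return 0;
}

Compiled with an OpenMP flag (e.g. -fopenmp), a loop of this shape scales with core count as long as iterations stay independent, which is why a roughly tenfold speedup on a 16-core node is plausible for such a loop.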
A Hybrid MPI/OpenMP Approach for Parallel Groundwater Model Calibration on Multicore Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Guoping; D'Azevedo, Ed F; Zhang, Fan
2010-01-01
Groundwater model calibration is becoming increasingly computationally intensive. We describe a hybrid MPI/OpenMP approach to exploit two levels of parallelism in software and hardware to reduce calibration time on multicore computers with minimal parallelization effort. First, HydroGeoChem 5.0 (HGC5) is parallelized using OpenMP for a uranium transport model with over a hundred species involving nearly a hundred reactions, and a field-scale coupled flow and transport model. In the first application, a single parallelizable loop is identified to consume over 97% of the total computational time. With a few lines of OpenMP compiler directives inserted into the code, the computational time is reduced by about a factor of ten on a compute node with 16 cores. The performance is further improved by selectively parallelizing a few more loops. For the field-scale application, parallelizable loops in 15 of the 174 subroutines in HGC5 are identified to take more than 99% of the execution time. By adding the preconditioned conjugate gradient solver and BICGSTAB, and using a coloring scheme to separate the elements, nodes, and boundary sides, the subroutines for finite element assembly, soil property update, and boundary condition application are parallelized, resulting in a speedup of about 10 on a 16-core compute node. The Levenberg-Marquardt (LM) algorithm is added into HGC5 with the Jacobian calculation and lambda search parallelized using MPI. With this hybrid approach, as many compute nodes as there are adjustable parameters (when the forward difference is used for the Jacobian approximation), or twice that number (if the central difference is used), are used to reduce the calibration time from days and weeks to a few hours for the two applications. This approach can be extended to global optimization schemes and Monte Carlo analysis, where thousands of compute nodes can be efficiently utilized.
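As a sketch of the MPI-parallel Jacobian step (assumptions: a toy forward model and a simple cyclic distribution of parameters; this is not the HGC5/LM implementation), each rank below computes the forward-difference columns for its share of the adjustable parameters, one extra forward run per column, and the columns are then combined:

#include <mpi.h>
#include <math.h>
#include <stdio.h>

#define NPAR 8   /* adjustable parameters */
#define NOBS 4   /* observations */

/* hypothetical forward model y = f(p) standing in for a full simulation */
static void forward(const double *p, double *y) {
    for (int i = 0; i < NOBS; ++i) {
        y[i] = 0.0;
        for (int j = 0; j < NPAR; ++j)
            y[i] += sin(p[j] * (i + 1));
    }
}

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double p[NPAR], y0[NOBS], yp[NOBS], h = 1e-6;
    double Jloc[NPAR][NOBS] = {{0.0}}, J[NPAR][NOBS];
    for (int j = 0; j < NPAR; ++j) p[j] = 1.0 + j;
    forward(p, y0);                       /* baseline run, done by all ranks */

    for (int j = rank; j < NPAR; j += size) {  /* cyclic column ownership */
        double save = p[j];
        p[j] += h;
        forward(p, yp);                   /* one forward run per owned column */
        p[j] = save;
        for (int i = 0; i < NOBS; ++i)
            Jloc[j][i] = (yp[i] - y0[i]) / h;
    }
    /* sum the disjoint column blocks so every rank holds the full Jacobian */
    MPI_Allreduce(Jloc, J, NPAR * NOBS, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0) printf("J[0][0] = %g\n", J[0][0]);
    MPI_Finalize();
    return 0;
}

With the forward difference, one extra forward run per parameter is needed; the central difference needs two, which matches the node counts quoted in the abstract.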
Scheikl, Ute; Tsao, Han-Fei; Horn, Matthias; Indra, Alexander; Walochnik, Julia
2016-09-01
Free-living amoebae (FLA) are widely spread in the environment and known to cause rare but often serious infections. Besides this, FLA may serve as vehicles for bacterial pathogens. In particular, Legionella pneumophila is known to replicate within FLA, thereby also gaining enhanced infectivity. Cooling towers have been the source of outbreaks of Legionnaires' disease in the past and are thus usually screened for legionellae on a routine basis, not considering, however, FLA and their vehicle function. The aim of this study was to incorporate a screening system for host amoebae into a routine Legionella screening. A new real-time PCR-based screening system for various groups of FLA was established. Three cooling towers were screened every 2 weeks over the period of 1 year for FLA and Legionella spp., by culture and molecular methods in parallel. Altogether, 83.3% of the cooling tower samples were positive for FLA, Acanthamoeba being the dominating genus. Interestingly, 69.7% of the cooling tower samples were not suitable for the standard Legionella screening due to their high organic burden. In the remaining samples, positivity for Legionella spp. was 25% by culture, but overall positivity was 50% by molecular methods. Several amoebal isolates revealed intracellular bacteria.
NASA Astrophysics Data System (ADS)
Matsuda, Y.; Nonomura, T.; Kakutani, K.; Kimbara, J.; Osamura, K.; Kusakari, S.; Toyoda, H.
2015-10-01
An electric field screen is a physical device used to exclude pest insects from greenhouses and warehouses to protect crop production and storage. The screen consists of iron insulated conductor wires (ICWs) arrayed in parallel and linked to each other, an electrostatic DC voltage generator used to supply a negative charge to the ICWs, and an earthed stainless net placed on one side of the ICW layer. The ICW was negatively charged to polarize the earthed net to create a positive charge on the ICW side surface, and an electric field formed between the opposite charges of the ICW and earthed net. The current study focused on the ability of the screen to repel insects reaching the screen net. This repulsion was a result of the insect's behaviour, i.e., the insects were deterred from entering the electric field of the screen. In fact, when the screen was negatively charged with the appropriate voltages, the insects placed their antennae inside the screen and then flew away without entering. Obviously, the insects recognized the electric field using their antennae and thereby avoided entering. Using a wide range of insects and spiders belonging to different taxonomic groups, we confirmed that the avoidance response to the electric field was common in these animals.
Cross-species extrapolation of toxicity information using the ...
In the United States, the Endocrine Disruptor Screening Program (EDSP) was established to identify chemicals that may lead to adverse effects via perturbation of the endocrine system (i.e., estrogen, androgen, and thyroid hormone systems). In the mid-1990s the EDSP adopted a two-tiered approach for screening chemicals that applied standardized in vitro and in vivo toxicity tests. The Tier 1 screening assays were designed to identify substances that have the potential to interact with the endocrine system, and Tier 2 testing was developed to identify adverse effects caused by the chemical, with documentation of dose-response relationships. While this tiered approach was effective in identifying possible endocrine disrupting chemicals, the cost and time to screen a single chemical were significant. Therefore, in 2012 the EDSP proposed a transition to make greater use of computational approaches (in silico) and high-throughput screening (HTS; in vitro) assays to more rapidly and cost-efficiently screen chemicals for endocrine activity. This transition from resource intensive, primarily in vivo, screening methods to more pathway-based approaches aligns with the simultaneously occurring transformation in toxicity testing termed “Toxicity Testing in the 21st Century,” which shifts the focus to disturbances of the biological pathways predictive of the observable toxic effects. Examples of such screening tools include the US Environmental Protection Agency’s
Minucci, Angelo; De Paolis, Elisa; Concolino, Paola; De Bonis, Maria; Rizza, Roberta; Canu, Giulia; Scaglione, Giovanni Luca; Mignone, Flavio; Scambia, Giovanni; Zuppi, Cecilia; Capoluongo, Ettore
2017-07-01
Evaluation of copy number variation (CNV) in the BRCA1/2 genes, due to large genomic rearrangements (LGRs), is a mandatory analysis in hereditary breast and ovarian cancer families if no pathogenic variants are found by sequencing. LGRs cannot be detected by conventional methods, and several alternative methods have been developed. Since these approaches are expensive and time consuming, identification of alternative screening methods for LGR detection is needed in order to reduce and optimize the diagnostic procedure. The aim of this study was to investigate Competitive PCR-High Resolution Melting Analysis (C-PCR-HRMA) as a molecular tool to detect recurrent BRCA1 LGRs. C-PCR-HRMA was performed on exons 3, 14, 18, 19, 20 and 21 of the BRCA1 gene; exons 4, 6 and 7 of the ALB gene were used as reference fragments. This study showed that it is possible to identify recurrent BRCA1 LGRs by the melting peak height ratio between target (BRCA1) and reference (ALB) fragments. Furthermore, we underline that a peculiar amplicon-melting profile is associated with a specific BRCA1 LGR. All C-PCR-HRMA results were confirmed by multiplex ligation-dependent probe amplification. C-PCR-HRMA proved to be an innovative, efficient and fast method for BRCA1 LGR detection. Given its sensitivity, specificity and ease of use, C-PCR-HRMA can be considered an attractive and powerful alternative to other methods for BRCA1 CNV screening, improving molecular strategies for BRCA testing in the context of Massive Parallel Sequencing. Copyright © 2017 Elsevier B.V. All rights reserved.
Manzanares-Palenzuela, C Lorena; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz
2015-06-15
Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
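The quantification target stated above can be written out explicitly: GMO content reported against the 0.9% labeling threshold is the ratio of event-specific to taxon-specific copy numbers, which is exactly what the two parallel hybridization channels measure,

\mathrm{GMO}\,(\%) = 100 \times \frac{N_{\text{event}}}{N_{\text{taxon}}},

where N_event and N_taxon are the quantities estimated from the RR and lectin targets, respectively.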
Comfortably engaging: which approach to alcohol screening should we use?
Vinson, Daniel C; Galliher, James M; Reidinger, Carol; Kappus, Jennifer A
2004-01-01
We wanted to compare 2 screening instruments for problem drinking, the CAGE and a single question, assessing frequency of use, patient and clinician comfort, and patient engagement in change. The study was a crossover, cluster-randomized clinical trial with 31 clinicians in Missouri and 13 in the American Academy of Family Physicians (AAFP) National Network for Family Practice and Primary Care Research; 2,800 patients provided data. The clinician was the unit of randomization. Clinicians decided whether to screen each patient; if they chose to screen, they used the screening approach assigned for that block of patients. The clinician and patient separately completed questionnaires immediately after the office visit to assess each one's comfort with screening (and any ensuing discussion) and the patient's engagement in change. Missouri clinicians screened more patients when assigned the single question (81%) than the CAGE (69%, P = .001 in weighted analysis). There was no difference among AAFP network clinicians (96% of patients screened with the CAGE, 97% with the single question). Eighty percent to 90% of clinicians and 70% of patients reported being comfortable with screening and the ensuing discussion, with no difference between approaches in either network. About one third of patients who were identified as problem drinkers reported thinking about or planning to change their drinking behavior, with no difference in engagement between screening approaches. Clinicians and patients reported similar comfort with the CAGE questions and the single-question screening tools for problem drinking, and the 2 instruments were equal in their ability to engage the patient. In Missouri, the single question was more likely to be used.
Gross, Cary P; Fried, Terri R; Tinetti, Mary E; Ross, Joseph S; Genao, Inginia; Hossain, Sabina; Wolf, Elizabeth; Lewis, Carmen L
2015-03-01
To understand how older persons with multiple chronic conditions (MCC) approach decisions about cancer screening. We conducted interviews with adults >65 years old with at least two chronic conditions who were taking five or more medications daily. Patients were first asked how age and multimorbidity influence their cancer screening decisions. After showing them an educational prompt that explained the relationship between life expectancy and the benefits of cancer screening, respondents were then asked about screening in the context of specific health scenarios. Using grounded theory, three independent readers coded responses for salient themes. Sample size was determined by thematic saturation. Most respondents (26 of 28) initially indicated that their overall health or medical conditions do not influence their cancer screening decisions. After viewing the educational prompt, respondents described two broad approaches to cancer screening in the setting of increasing age or multimorbidity. The first was a "benefits versus harms" approach in which participants weighed direct health benefits (e.g. reducing cancer incidence or mortality) and harms (e.g. complications or inconvenience). The second was a heuristic approach. Some heuristics favored screening, such as a persistent belief in unspecified benefits from screening, the value of knowledge about cancer status, and not wanting to "give up", whereas other heuristics discouraged screening, such as fatalism or a reluctance to learn about their cancer status. When considering cancer screening, some older persons with MCC employ heuristics which circumvent the traditional quantitative comparison of risks and benefits, providing an important challenge to informed decision making. Copyright © 2014 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
McPherson, Lyn; Ware, Robert S.; Carrington, Suzanne; Lennox, Nicholas
2017-01-01
Background: Adolescents with intellectual disability have high levels of unrecognized disease and inadequate health screening/promotion which might be addressed by improving health advocacy skills. Methods: A parallel-group cluster randomized controlled trial was conducted to investigate whether a health intervention package, consisting of…
USDA-ARS?s Scientific Manuscript database
The Comprehensive Assessment of the Long-term Effects of Reducing Intake of Energy Phase 2 (CALERIE) study is a systematic investigation of sustained 25% calorie restriction (CR) in non-obese humans. CALERIE is a multicenter (3 clinical sites, one coordinating center), parallel group, randomized con...
Larkey, Linda K; McClain, Darya; Roe, Denise J; Hector, Richard D; Lopez, Ana Maria; Sillanpaa, Brian; Gonzalez, Julie
2015-01-01
Screening rates for colorectal cancer (CRC) lag for low-income, minority populations, contributing to poorer survival rates. A model of storytelling as culture-centric health promotion was tested for promoting CRC screening. A two-group parallel randomized controlled trial. Primary care, safety-net clinics. Low-income patients due for CRC screening, ages 50 to 75 years, speaking English or Spanish. Patients were exposed to either a video created from personal stories composited into a drama about "Papa" receiving CRC screening, or an instrument estimating level of personal cancer risk. Patients received a health care provider referral for CRC screening and were followed up for 3 months to document adherence. Behavioral factors related to the narrative model (identification and engagement) and theory of planned behavior. Main effects of the interventions on screening were tested, controlling for attrition factors, and demographic factor associations were assessed. Path analysis with model variables was used to test the direct effects and multiple mediator models. Main effects on CRC screening (roughly half stool-based tests, half colonoscopy) did not indicate significant differences (37% and 42% screened for storytelling and risk-based messages, respectively; n = 539; 33.6% male; 62% Hispanic). Factors positively associated with CRC screening included being female, Hispanic, married or living with a partner, speaking Spanish, having a primary care provider, lower income, and no health insurance. Engagement, working through positive attitudes toward the behavior, predicted CRC screening. A storytelling and a personalized risk-tool intervention achieved similar levels of screening among unscreened/underscreened, low-income patients. Factors usually associated with lower rates of screening (e.g., no insurance, being Hispanic) were related to more adherence. Both interventions' engagement factor facilitated positive attitudes about CRC screening associated with behavior change.
The paradigm compiler: Mapping a functional language for the connection machine
NASA Technical Reports Server (NTRS)
Dennis, Jack B.
1989-01-01
The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.
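As a toy illustration (not taken from the Paradigm Compiler itself) of the kind of compile-time mapping decision described above, the C sketch below assigns each element of an array whose size is known at compile time to a processing element using a simple block distribution; the owner function and sizes are hypothetical:

#include <stdio.h>

/* block mapping: ceil(n/p) consecutive elements per processing element */
static int owner(int i, int n, int p) {
    int block = (n + p - 1) / p;
    return i / block;
}

int main(void) {
    const int n = 10, p = 4;   /* array size and PE count, known statically */
    for (int i = 0; i < n; ++i)
        printf("element %d -> PE %d\n", i, owner(i, n, p));
    return 0;
}

A real mapping compiler would weigh such distributions against the communication pattern revealed by global analysis, but the principle of fixing data placement at compile time is the same.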
A path-level exact parallelization strategy for sequential simulation
NASA Astrophysics Data System (ADS)
Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.
2018-01-01
Sequential Simulation is a well-known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, the Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, the Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize the SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is that it generates realizations identical to those of the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and large maximum numbers of kriging neighbours in each case, achieving high speedups in the best scenarios using 16 threads of execution on a single machine.
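The following toy C program sketches the path-level idea under simplifying assumptions (a 1-D grid and a fixed conflict radius standing in for the kriging neighbourhood; this is not the SISIM/SGSIM code): each node on the shuffled path is placed in the earliest batch that contains no conflicting earlier node, so the nodes within one batch see exactly the conditioning data they would see in the sequential run and can be simulated concurrently:

#include <stdio.h>
#include <stdlib.h>

#define N 20
#define R 3   /* illustrative neighbourhood radius */

int main(void) {
    int path[N], batch[N];
    for (int i = 0; i < N; ++i) path[i] = i;
    srand(42);                            /* fixed seed: reproducible path */
    for (int i = N - 1; i > 0; --i) {     /* Fisher-Yates shuffle */
        int j = rand() % (i + 1), t = path[i];
        path[i] = path[j]; path[j] = t;
    }
    for (int k = 0; k < N; ++k) {
        int b = 0;
        /* earliest batch after every conflicting predecessor on the path */
        for (int m = 0; m < k; ++m)
            if (abs(path[k] - path[m]) <= R && batch[m] >= b)
                b = batch[m] + 1;
        batch[k] = b;
        printf("node %2d -> batch %d\n", path[k], b);
    }
    return 0;
}

Because batch membership is derived only from the path order and the neighbourhood, running the nodes of one batch in parallel mirrors the exactness property claimed in the title.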
Guo, L-X; Li, J; Zeng, H
2009-11-01
We present an investigation of the electromagnetic scattering from a three-dimensional (3-D) object above a two-dimensional (2-D) randomly rough surface. A Message Passing Interface-based parallel finite-difference time-domain (FDTD) approach is used, and the uniaxial perfectly matched layer (UPML) medium is adopted for truncation of the FDTD lattices, in which the finite-difference equations can be used for the total computation domain by properly choosing the uniaxial parameters. This makes the parallel FDTD algorithm easier to implement. The parallel performance with different numbers of processors is illustrated for one rough surface realization and shows that the computation time of our parallel FDTD algorithm is dramatically reduced relative to a single-processor implementation. Finally, the composite scattering coefficients versus the scattering and azimuthal angles are presented and analyzed for different conditions, including the surface roughness, the dielectric constants, the polarization, and the size of the 3-D object.
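For readers unfamiliar with FDTD, here is a deliberately minimal 1-D sketch in C (free space, normalized units with Courant number 0.5, a soft Gaussian source, no UPML and no MPI; all sizes are illustrative) of the leapfrog E/H update that the paper's 3-D parallel code decomposes across processors:

#include <stdio.h>
#include <math.h>

#define NX 200   /* grid cells */
#define NT 300   /* time steps */

int main(void) {
    static double ez[NX], hy[NX];        /* zero-initialized fields */
    for (int n = 0; n < NT; ++n) {
        for (int k = 1; k < NX; ++k)     /* E update from the curl of H */
            ez[k] += 0.5 * (hy[k - 1] - hy[k]);
        /* soft Gaussian source injected at one cell */
        ez[NX / 4] += exp(-0.5 * pow((n - 40.0) / 12.0, 2.0));
        for (int k = 0; k < NX - 1; ++k) /* H update from the curl of E */
            hy[k] += 0.5 * (ez[k] - ez[k + 1]);
    }
    printf("ez at probe cell: %g\n", ez[3 * NX / 4]);
    return 0;
}

In an MPI version such as the one described above, the grid is split into subdomains and each processor exchanges a layer of boundary fields with its neighbours every time step, while absorbing layers (here, UPML) terminate the outer boundaries.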
NASA Astrophysics Data System (ADS)
Rastogi, Richa; Srivastava, Abhishek; Khonde, Kiran; Sirasala, Kirannmayi M.; Londhe, Ashutosh; Chavhan, Hitesh
2015-07-01
This paper presents an efficient parallel 3D Kirchhoff depth migration algorithm suitable for the current class of multicore architectures. The fundamental Kirchhoff depth migration algorithm exhibits inherent parallelism; however, for 3D data migration, the resource requirements of the algorithm grow as the data size increases. This challenges its practical implementation even on current-generation high performance computing systems, so a smart parallelization approach is essential to handle 3D data for migration. The most compute-intensive part of the Kirchhoff depth migration algorithm is the calculation of traveltime tables, owing to its memory/storage and I/O requirements. In the current research work, we target this area and develop a competent parallel algorithm for poststack and prestack 3D Kirchhoff depth migration, using hybrid MPI+OpenMP programming techniques. We introduce a concept of flexi-depth iterations while depth migrating data in parallel imaging space, using optimized traveltime table computations. This concept provides flexibility to the algorithm by migrating data in a number of depth iterations, which depends upon the available node memory and the size of the data to be migrated at runtime. Furthermore, it minimizes the requirements of storage, I/O and inter-node communication, making it advantageous over conventional parallelization approaches. The developed parallel algorithm is demonstrated and analysed on Yuva II, a supercomputer of the PARAM series. Optimization, performance and scalability experiment results, along with the migration outcome, show the effectiveness of the parallel algorithm.
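A small C sketch of how a flexi-depth iteration count might be chosen at runtime from the image size and available node memory, as the abstract describes (all names and numbers are hypothetical, not taken from the paper's implementation):

#include <stdio.h>

int main(void) {
    long long nx = 2000, ny = 2000, nz = 1500;        /* image grid */
    long long bytes_per_sample = 4;                   /* float image samples */
    long long node_mem = 16LL * 1024 * 1024 * 1024;   /* usable memory, 16 GB */

    long long slice_bytes = nx * ny * bytes_per_sample;
    long long slices_per_iter = node_mem / slice_bytes;
    if (slices_per_iter > nz) slices_per_iter = nz;
    long long n_iters = (nz + slices_per_iter - 1) / slices_per_iter;

    printf("%lld depth slices per iteration, %lld iterations\n",
           slices_per_iter, n_iters);
    return 0;
}

Migrating the volume in n_iters passes keeps the working set within node memory, which is exactly the flexibility the flexi-depth concept provides.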
Performance Studies on Distributed Virtual Screening
Krüger, Jens; de la Garza, Luis; Kohlbacher, Oliver; Nagel, Wolfgang E.
2014-01-01
Virtual high-throughput screening (vHTS) is an invaluable method in modern drug discovery. It permits screening large datasets or databases of chemical structures for those that may bind to a drug target. Virtual screening is typically performed by docking code, which often runs sequentially. Processing of huge vHTS datasets can be parallelized by chunking the data, because individual docking runs are independent of each other. The goal of this work is to find an optimal splitting that maximizes the speedup while considering overhead and available cores on Distributed Computing Infrastructures (DCIs). We have conducted thorough performance studies accounting not only for the runtime of the docking itself, but also for structure preparation. Performance studies were conducted via the workflow-enabled science gateway MoSGrid (Molecular Simulation Grid). As input we used benchmark datasets for protein kinases. Our performance studies show that docking workflows can be made to scale almost linearly up to 500 concurrent processes distributed even over large DCIs, thus accelerating vHTS campaigns significantly. PMID:25032219
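A toy cost model in C of that splitting question (hypothetical constants, not MoSGrid measurements): with a fixed per-chunk overhead for staging and scheduling, more chunks improve load balance across the available cores but pay more total overhead, so the wall time must be searched for an optimum:

#include <stdio.h>

int main(void) {
    double t_ligand = 30.0;     /* seconds of docking per structure (assumed) */
    double overhead = 120.0;    /* seconds of per-chunk setup (assumed) */
    long n = 100000;            /* structures in the dataset */
    int cores = 500;            /* concurrent processes available */

    for (long chunks = 100; chunks <= 3200; chunks *= 2) {
        double per_chunk = (double)n / chunks * t_ligand + overhead;
        long waves = (chunks + cores - 1) / cores;  /* chunks run in waves */
        double wall = waves * per_chunk;
        printf("%5ld chunks: wall %9.0f s, speedup %6.1f\n",
               chunks, wall, n * t_ligand / wall);
    }
    return 0;
}

With these assumed numbers the speedup varies non-monotonically with chunk count, which illustrates why the study measures the splitting empirically rather than simply maximizing the number of chunks.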
Formation and evolution of bubbly screens in confined oscillating bubbly liquids.
Shklyaev, Sergey; Straube, Arthur V
2010-01-01
We consider the dynamics of dilute monodisperse bubbly liquid confined by two plane solid walls and subject to small-amplitude high-frequency oscillations normal to the walls. The initial state corresponds to the uniform distribution of bubbles and motionless liquid. The period of external driving is assumed much smaller than typical relaxation times for a single bubble but larger than the period of volume eigenoscillations. The time-averaged description accounting for the two-way coupling between the liquid and the bubbles is applied. We show that the model predicts accumulation of bubbles in thin sheets parallel to the walls. These singular structures, which are formally characterized by infinitely thin width and infinitely high concentration, are referred to as bubbly screens. The formation of a bubbly screen is described analytically in terms of a self-similar solution, which is in agreement with numerical simulations. We study the evolution of bubbly screens and detect a one-dimensional stationary state, which is shown to be unconditionally unstable.
CHAM: a fast algorithm of modelling non-linear matter power spectrum in the sCreened HAlo Model
NASA Astrophysics Data System (ADS)
Hu, Bin; Liu, Xue-Wen; Cai, Rong-Gen
2018-05-01
We present a fast numerical screened halo model algorithm (CHAM, which stands for the sCreened HAlo Model) for modelling the non-linear power spectrum of alternatives to Λ cold dark matter. This method has three obvious advantages. First, it is not restricted to a specific dark energy/modified gravity model; in principle, all screened scalar-tensor theories can be treated. Secondly, minimal assumptions are made in the calculation, so the physical picture is easily understood. Thirdly, it is predictive and does not rely on calibration against N-body simulations. As an example, we show the case of Hu-Sawicki f(R) gravity. In this case, the typical CPU time with the current parallel PYTHON script (eight threads) is roughly 10 min. The resulting spectra are in good agreement with N-body data to within a few per cent accuracy up to k ~ 1 h Mpc⁻¹.
Utilization of Stop-flow Micro-tubing Reactors for the Development of Organic Transformations.
Toh, Ren Wei; Li, Jie Sheng; Wu, Jie
2018-01-04
A new reaction screening technology for organic synthesis was recently demonstrated by combining elements from both continuous micro-flow and conventional batch reactors, coined stop-flow micro-tubing (SFMT) reactors. In SFMT, chemical reactions that require high pressure can be screened in parallel in a safer and more convenient way. Cross-contamination, a common problem in reaction screening for continuous flow reactors, is avoided in SFMT. Moreover, commercially available light-permeable micro-tubing can be incorporated into SFMT, serving as an excellent choice for light-mediated reactions owing to more uniform light exposure than in batch reactors. Overall, the SFMT reactor system is similar to continuous-flow reactors and superior to batch reactors for reactions that incorporate gas reagents and/or require light illumination, enabling a simple but highly efficient reaction screening system. Furthermore, any successfully developed reaction in the SFMT reactor system can be conveniently translated to continuous-flow synthesis for large scale production.
Melagraki, G; Afantitis, A
2011-01-01
Virtual Screening (VS) has attracted increasing attention in recent years due to the large datasets made available, the development of advanced VS techniques, and the encouraging fact that VS has contributed to the discovery of several compounds that have either reached the market or entered clinical trials. Hepatitis C Virus (HCV) nonstructural protein 5B (NS5B) has become an attractive target for the development of antiviral drugs, and many small molecules have been explored as possible HCV NS5B inhibitors. In parallel with experimental practice, VS can serve as a valuable tool in the identification of novel effective inhibitors. Different techniques and workflows have been reported in the literature with the goal of prioritizing possible potent hits. In this context, different virtual screening strategies have been deployed for the identification of novel HCV inhibitors. This work reviews recent applications of virtual screening in an effort to identify novel potent HCV inhibitors.
Tandem screening of toxic compounds on GFP-labeled bacteria and cancer cells in microtiter plates.
Montoya, Jessica; Varela-Ramirez, Armando; Shanmugasundram, Muthian; Martinez, Luis E; Primm, Todd P; Aguilera, Renato J
2005-09-23
A 96-well fluorescence-based assay has been developed for the rapid screening of potential cytotoxic and bactericidal compounds. The assay is based on detection of green fluorescent protein (GFP) in HeLa human carcinoma cells as well as Gram-negative (Escherichia coli) and Gram-positive bacteria (Mycobacterium avium). Addition of a toxic compound to the GFP-marked cells resulted in the loss of GFP fluorescence, which was readily detected by fluorometry. Thirty-nine distinct naphthoquinone derivatives were screened, and several of these compounds were found to be toxic to all cell types. Apart from differences in overall toxicity, two general types of toxic compounds were detected: those that exhibited toxicity to two or all three of the cell types and those that were primarily toxic to the HeLa cells. Our results demonstrate that the parallel screening of both eukaryotic and prokaryotic cells is not only feasible and reproducible but also cost effective.
A catalog of putative adverse outcome pathways (AOPs) that ...
A number of putative AOPs for several distinct MIEs of thyroid disruption have been formulated for amphibian metamorphosis and fish swim bladder inflation. These have been entered into the AOP knowledgebase on the OECD WIKI. The EDSP has been actively advancing high-throughput screening for chemical activity toward estrogen, androgen and thyroid targets. However, it has been recently identified that coverage for thyroid-related targets is lagging behind estrogen and androgen assay coverage. As thyroid-related medium-high throughput assays are actively being developed for inclusion in the ToxCast chemical screening program, a parallel effort is underway to characterize putative adverse outcome pathways (AOPs) specific to these thyroid-related targets. This effort is intended to provide biological and ecological context that will enhance the utility of ToxCast high throughput screening data for hazard identification.
NASA Astrophysics Data System (ADS)
Lim, Jiseok; Vrignon, Jérémy; Gruner, Philipp; Karamitros, Christos S.; Konrad, Manfred; Baret, Jean-Christophe
2013-11-01
We demonstrate the use of a hybrid microfluidic-micro-optical system for the screening of enzymatic activity at the single-cell level. Escherichia coli β-galactosidase activity is revealed by a fluorogenic assay in 100 pl droplets. Individual droplets containing cells are screened by measuring their fluorescence signal using a high-speed camera. The measurement is parallelized over 100 channels equipped with microlenses and analyzed by image processing. A reinjection rate of 1 ml of emulsion per minute was reached, corresponding to more than 10⁵ droplets per second, an analytical throughput larger than that obtained using flow cytometry.
Apparatus for combinatorial screening of electrochemical materials
Kepler, Keith Douglas [Belmont, CA]; Wang, Yu [Foster City, CA]
2009-12-15
A high throughput combinatorial screening method and apparatus for the evaluation of electrochemical materials using a single voltage source (2) is disclosed wherein temperature changes arising from the application of an electrical load to a cell array (1) are used to evaluate the relative electrochemical efficiency of the materials comprising the array. The apparatus may include an array of electrochemical cells (1) that are connected to each other in parallel or in series, an electronic load (2) for applying a voltage or current to the electrochemical cells (1), and a device (3), external to the cells, for monitoring the relative temperature of each cell when the load is applied.
The Nano-Patch-Clamp Array: Microfabricated Glass Chips for High-Throughput Electrophysiology
NASA Astrophysics Data System (ADS)
Fertig, Niels
2003-03-01
Electrophysiology (i.e. patch clamping) remains the gold standard for pharmacological testing of putative ion channel active drugs (ICADs), but suffers from low throughput. A new ion channel screening technology based on microfabricated glass chip devices will be presented. The glass chips contain very fine apertures, which are used for whole-cell voltage clamp recordings as well as single channel recordings from mammalian cell lines. Chips containing multiple patch clamp wells will be used in a first bench-top device, which will allow perfusion and electrical readout of each well. This scalable technology will allow for automated, rapid and parallel screening on ion channel drug targets.
Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome
2014-04-25
In this paper we review the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the pre-docking file preparation and post-docking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
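A minimal master/worker sketch in C of a task-parallel MPI screen in the spirit of the setup described above (this is not the published Autodock4 MPI code; dock() is a stand-in for a real per-compound docking call): rank 0 deals compound indices to workers one at a time, which balances load when docking times vary:

#include <mpi.h>
#include <stdio.h>

#define NCOMPOUNDS 100
#define TAG_WORK 1
#define TAG_STOP 2

/* stand-in for a real docking run; returns a fake binding score */
static double dock(int id) { return -6.0 - (id % 10) * 0.1; }

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {                      /* master: deal out compound ids */
        int next = 0, active = 0;
        double score;
        MPI_Status st;
        for (int w = 1; w < size; ++w) {  /* initial assignment */
            if (next < NCOMPOUNDS) {
                MPI_Send(&next, 1, MPI_INT, w, TAG_WORK, MPI_COMM_WORLD);
                ++next; ++active;
            } else {
                MPI_Send(&next, 1, MPI_INT, w, TAG_STOP, MPI_COMM_WORLD);
            }
        }
        while (active > 0) {              /* collect results, refill workers */
            MPI_Recv(&score, 1, MPI_DOUBLE, MPI_ANY_SOURCE, MPI_ANY_TAG,
                     MPI_COMM_WORLD, &st);
            if (next < NCOMPOUNDS) {
                MPI_Send(&next, 1, MPI_INT, st.MPI_SOURCE, TAG_WORK,
                         MPI_COMM_WORLD);
                ++next;
            } else {
                MPI_Send(&next, 1, MPI_INT, st.MPI_SOURCE, TAG_STOP,
                         MPI_COMM_WORLD);
                --active;
            }
        }
    } else {                              /* worker: dock until told to stop */
        int id;
        MPI_Status st;
        for (;;) {
            MPI_Recv(&id, 1, MPI_INT, 0, MPI_ANY_TAG, MPI_COMM_WORLD, &st);
            if (st.MPI_TAG == TAG_STOP) break;
            double score = dock(id);
            MPI_Send(&score, 1, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD);
        }
    }
    MPI_Finalize();
    return 0;
}

Dynamic dealing like this keeps all ranks busy even when some compounds dock much more slowly than others, which matters at the scale of a million-compound screen.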
A divide and conquer approach to the nonsymmetric eigenvalue problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessup, E.R.
1991-01-01
Serial computation combined with high communication costs on distributed-memory multiprocessors makes parallel implementations of the QR method for the nonsymmetric eigenvalue problem inefficient. This paper introduces an alternative algorithm for the nonsymmetric tridiagonal eigenvalue problem based on rank-two tearing and updating of the matrix. The parallelism of this divide and conquer approach stems from the independent solution of the updating problems. 11 refs.
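The tearing step can be written down concisely. For a tridiagonal matrix T torn between rows k and k+1 (a standard identity; the report's exact splitting may differ in detail):

T = \begin{pmatrix} T_1 & 0 \\ 0 & T_2 \end{pmatrix} + \beta\, e_k e_{k+1}^{\mathsf T} + \gamma\, e_{k+1} e_k^{\mathsf T},

where β = t_{k,k+1} and γ = t_{k+1,k} are the torn off-diagonal entries and e_j is the j-th unit vector. Because T is nonsymmetric, the correction is generically rank two (rank one suffices in the symmetric case). The eigenproblems for T_1 and T_2 are independent, which is where the parallelism comes from, and the rank-two correction is handled in the updating phase.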
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Liying; Sedykh, Alexander; Tripathi, Ashutosh
2013-10-01
Identification of endocrine disrupting chemicals is one of the important goals of environmental chemical hazard screening. We report on the development of validated in silico predictors of chemicals likely to cause estrogen receptor (ER)-mediated endocrine disruption to facilitate their prioritization for future screening. A database of relative binding affinity of a large number of ERα and/or ERβ ligands was assembled (546 for ERα and 137 for ERβ). Both single-task learning (STL) and multi-task learning (MTL) continuous quantitative structure–activity relationship (QSAR) models were developed for predicting ligand binding affinity to ERα or ERβ. High predictive accuracy was achieved for ERα binding affinity (MTL R² = 0.71, STL R² = 0.73). For ERβ binding affinity, MTL models were significantly more predictive (R² = 0.53, p < 0.05) than STL models. In addition, docking studies were performed on a set of ER agonists/antagonists (67 agonists and 39 antagonists for ERα, 48 agonists and 32 antagonists for ERβ, supplemented by putative decoys/non-binders) using the following ER structures (in complexes with respective ligands) retrieved from the Protein Data Bank: ERα agonist (PDB ID: 1L2I), ERα antagonist (PDB ID: 3DT3), ERβ agonist (PDB ID: 2NV7), and ERβ antagonist (PDB ID: 1L2J). We found that all four ER conformations discriminated their corresponding ligands from presumed non-binders. Finally, both QSAR models and ER structures were employed in parallel to virtually screen several large libraries of environmental chemicals to derive a ligand- and structure-based prioritized list of putative estrogenic compounds to be used for in vitro and in vivo experimental validation. - Highlights: • This is the largest curated dataset inclusive of ERα and β (the latter is unique). • New methodology that for the first time affords acceptable ERβ models. • A combination of QSAR and docking enables prediction of affinity and function. • The results have potential applications to green chemistry. • Models are publicly available for virtual screening via a web portal.
Nadkarni, P M; Miller, P L
1991-01-01
A parallel program for inter-database sequence comparison was developed on the Intel Hypercube using two models of parallel programming. One version was built using machine-specific Hypercube parallel programming commands. The other version was built using Linda, a machine-independent parallel programming language. The two versions of the program provide a case study comparing these two approaches to parallelization in an important biological application area. Benchmark tests with both programs gave comparable results with a small number of processors. As the number of processors was increased, the Linda version was somewhat less efficient. The Linda version was also run without change on Network Linda, a virtual parallel machine running on a network of desktop workstations.
Interlayer tunneling in double-layer quantum hall pseudoferromagnets.
Balents, L; Radzihovsky, L
2001-02-26
We show that the interlayer tunneling I-V in double-layer quantum Hall states displays a rich behavior which depends on the relative magnitude of sample size, voltage length scale, current screening, disorder, and thermal lengths. For weak tunneling, we predict a negative differential conductance of a power-law shape crossing over to a sharp zero-bias peak. An in-plane magnetic field splits this zero-bias peak, leading instead to a "derivative" feature at V_B(B_∥) = 2πℏvB_∥d/(eφ_0), which gives a direct measurement of the dispersion of the Goldstone mode corresponding to the spontaneous symmetry breaking of the double-layer Hall state.
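One way to unpack that formula (a standard momentum-boost argument; here v is the Goldstone-mode velocity, d the layer separation, and φ_0 = h/e the flux quantum): the in-plane field shifts the momentum transferred in tunneling by

q_\parallel = \frac{2\pi B_\parallel d}{\phi_0},

and the feature occurs when the bias supplies the energy of a Goldstone mode at that wavevector,

e V_B = \hbar\, v\, q_\parallel \quad\Longrightarrow\quad V_B(B_\parallel) = \frac{2\pi \hbar\, v\, B_\parallel\, d}{e\, \phi_0},

so sweeping B_∥ and tracking V_B traces out the linear part of the mode's dispersion ω = v q.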
Webinar Presentation: Assessing Neurodevelopment in Parallel Animal and Human Studies
This presentation, Assessing Neurodevelopment in Parallel Animal and Human Studies, was given at the NIEHS/EPA Children's Centers 2015 Webinar Series: Interdisciplinary Approaches to Neurodevelopment held on Sept. 9, 2015.
Cervical cancer screening with naked-eye visual inspection in Colombia.
Murillo, Raul; Luna, Joaquin; Gamboa, Oscar; Osorio, Elkin; Bonilla, Jairo; Cendales, Ricardo
2010-06-01
To assess the accuracy of naked-eye visual inspection provided by nurses, combining visual inspection with acetic acid (VIA) and with Lugol's iodine (VILI), in a low-resource region of Colombia. A cross-sectional study with 4957 women was conducted to evaluate visual inspection techniques as the basis for see-and-treat approaches in cervical cancer control. All women underwent conventional cytology, VIA performed by nurses, and a combination of VIA and VILI. All women underwent colposcopy, and biopsies were obtained for any positive test. A total of 762 women underwent biopsy; 4945 women were included in the analysis of conventional cytology, and 4957 were included in the analysis of VIA and VIA-VILI. Positivity rates were 1.3% and 4.3% for HSIL and LSIL cytology, respectively, 7.4% for VIA, and 10.1% for VIA-VILI. Sensitivity was 52.9% and 36.8% for cytology at the LSIL and HSIL thresholds, respectively, 53.6% for VIA, and 68.1% for VIA-VILI. The corresponding specificities were 95.0%, 99.2%, 93.2%, and 90.8%, respectively. The parallel combination of VIA-VILI and cytology at the LSIL threshold showed the best performance as a screening strategy. The use of VIA-VILI simulating colposcopic procedures and provided by nurses represents a good alternative for implementing see-and-treat programs in Latin America. Program constraints should be taken into account. Copyright 2010 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.
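For context on the "parallel combination" wording: when two tests are combined in parallel (the combination is positive if either test is positive) and are assumed conditionally independent given disease status, sensitivity and specificity combine as

S_{\text{par}} = 1 - (1 - S_1)(1 - S_2), \qquad Sp_{\text{par}} = Sp_1 \times Sp_2.

Under that independence assumption, VIA-VILI (S = 0.681) combined with LSIL-threshold cytology (S = 0.529) would give S_par ≈ 1 − 0.319 × 0.471 ≈ 0.85, at the cost of specificity (≈ 0.908 × 0.950 ≈ 0.86). The study's reported performance comes from the observed data rather than this assumption, but the formula shows why the parallel combination raises sensitivity.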
Development of AN Innovative Three-Dimensional Complete Body Screening Device - 3D-CBS
NASA Astrophysics Data System (ADS)
Crosetto, D. B.
2004-07-01
This article describes an innovative technological approach that increases the efficiency with which a large number of particles (photons) can be detected and analyzed. The three-dimensional complete body screening (3D-CBS) combines the functional imaging capability of Positron Emission Tomography (PET) with the anatomical imaging capability of Computed Tomography (CT). The novel techniques provide better images in a shorter time with less radiation to the patient. A primary means of accomplishing this is the use of a larger solid angle, but this requires a new electronic technique capable of handling the increased data rate. This technique, combined with an improved and simplified detector assembly, enables the execution of complex real-time algorithms and allows more efficient use of economical crystals. These are the principal features of this invention. A good synergy of advanced techniques in particle detection, together with technological progress in industry (the latest FPGA technology) and simple but cost-effective ideas, yields a revolutionary invention. This technology enables an over 400-fold improvement in PET efficiency at once, compared with the two- to three-fold improvements achieved every five years over the past decades. Details of the electronics are provided, including an IBM PC board with a parallel-processing architecture implemented in FPGA, enabling the execution of a programmable complex real-time algorithm for best detection of photons.
A phenotypic screening approach to identify anticancer compounds derived from marine fungi.
Ellinger, Bernhard; Silber, Johanna; Prashar, Anjali; Landskron, Johannes; Weber, Jonas; Rehermann, Sarah; Müller, Franz-Josef; Smith, Stephen; Wrigley, Stephen; Taskén, Kjetil; Gribbon, Philip; Labes, Antje; Imhoff, Johannes F
2014-04-01
This study covers the isolation, testing, and identification of natural products with anticancer properties. Secondary metabolites were isolated from fungal strains originating from a variety of marine habitats. Strain culture protocols were optimized with respect to growth media composition and fermentation conditions. From these producers, isolated compounds were screened for their effect on the viability and proliferation of a subset of the NCI60 panel of cancer cell lines. Active compounds of interest were identified and selected for detailed assessments and structural elucidation using nuclear magnetic resonance. This revealed that the majority of the fungal-derived compounds represented known anticancer chemotypes, confirming the integrity of the process and the ability to identify suitable compounds. Effects of selected compounds on cancer-associated cell signaling pathways were examined using phospho-flow cytometry in combination with 3D fluorescent cell barcoding. In parallel, the study addressed the logistical aspects of maintaining multiple cancer cell lines in culture simultaneously. A potential solution involving microbead-based cell culture was investigated (BioLevitator, Hamilton). Selected cell lines were cultured in microbead and 2D methods, and cell viability tests showed comparable compound inhibition in both methods (R² = 0.95). In a further technology assessment, an image-based assay system was investigated for its utility as a possible complement to ATP-based detection for quantifying cell growth and viability in a label-free manner.
Nonlinear Plasma Response to Resonant Magnetic Perturbation in Rutherford Regime
NASA Astrophysics Data System (ADS)
Zhu, Ping; Yan, Xingting; Huang, Wenlong
2017-10-01
Recently a common analytic relation for both the locked mode and the nonlinear plasma response in the Rutherford regime has been developed based on the steady-state solution to the coupled dynamic system of magnetic island evolution and torque balance equations. The analytic relation predicts the threshold and the island size for the full penetration of resonant magnetic perturbation (RMP). It also rigorously proves a screening effect of the equilibrium toroidal flow. In this work, we test the theory by solving for the nonlinear plasma response to a single-helicity RMP of a circular-shaped limiter tokamak equilibrium with a constant toroidal flow, using the initial-value, full MHD simulation code NIMROD. Time evolution of the parallel flow or ``slip frequency'' profile and its asymptotic approach to steady state obtained from the NIMROD simulations qualitatively agree with the theory predictions. Further comparisons are carried out for the saturated island size, the threshold for full mode penetration, as well as the screening effects of equilibrium toroidal flow in order to understand the physics of nonlinear plasma response in the Rutherford regime. Supported by National Magnetic Confinement Fusion Science Program of China Grants 2014GB124002 and 2015GB101004, the 100 Talent Program of the Chinese Academy of Sciences, and U.S. Department of Energy Grants DE-FG02-86ER53218 and DE-FC02-08ER54975.
Functional genomics platform for pooled screening and mammalian genetic interaction maps
Kampmann, Martin; Bassik, Michael C.; Weissman, Jonathan S.
2014-01-01
Systematic genetic interaction maps in microorganisms are powerful tools for identifying functional relationships between genes and defining the function of uncharacterized genes. We have recently implemented this strategy in mammalian cells as a two-stage approach. First, genes of interest are robustly identified in a pooled genome-wide screen using complex shRNA libraries. Second, phenotypes for all pairwise combinations of hit genes are measured in a double-shRNA screen and used to construct a genetic interaction map. Our protocol allows for rapid pooled screening under various conditions without a requirement for robotics, in contrast to arrayed approaches. Each stage of the protocol can be implemented in ~2 weeks, with additional time for analysis and generation of reagents. We discuss considerations for screen design, and present complete experimental procedures as well as a full computational analysis suite for identification of hits in pooled screens and generation of genetic interaction maps. While the protocols outlined here were developed for our original shRNA-based approach, they can be applied more generally, including to CRISPR-based approaches. PMID:24992097
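A common way to score a genetic interaction from such double-knockdown measurements (one standard definition; the protocol's exact metric may differ) is the deviation of the observed double phenotype from the multiplicative expectation:

\varepsilon_{AB} = W_{AB} - W_A\, W_B,

where W denotes a relative growth phenotype. ε ≈ 0 indicates no interaction, ε < 0 an aggravating (synthetic-sick) interaction, and ε > 0 a buffering one; the matrix of ε values over all hit-gene pairs constitutes the genetic interaction map.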
Integration experiences and performance studies of a COTS parallel archive system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Hsing-bung; Scott, Cody; Grider, Gary
2010-01-01
Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, with more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address requirements of future archival storage systems.
NASA Astrophysics Data System (ADS)
Qin, Cheng-Zhi; Zhan, Lijun
2012-06-01
As one of the important tasks in digital terrain analysis, the calculation of flow accumulations from gridded digital elevation models (DEMs) usually involves two steps in a real application: (1) using an iterative DEM preprocessing algorithm to remove the depressions and flat areas commonly contained in real DEMs, and (2) using a recursive flow-direction algorithm to calculate the flow accumulation for every cell in the DEM. Because both algorithms are computationally intensive, quick calculation of the flow accumulations from a DEM (especially for a large area) presents a practical challenge to personal computer (PC) users. In recent years, rapid increases in hardware capacity of the graphics processing units (GPUs) provided in modern PCs have made it possible to meet this challenge in a PC environment. Parallel computing on GPUs using a compute-unified-device-architecture (CUDA) programming model has been explored to speed up the execution of the single-flow-direction algorithm (SFD). However, the parallel implementation on a GPU of the multiple-flow-direction (MFD) algorithm, which generally performs better than the SFD algorithm, has not been reported. Moreover, GPU-based parallelization of the DEM preprocessing step in the flow-accumulation calculations has not been addressed. This paper proposes a parallel approach to calculate flow accumulations (including both iterative DEM preprocessing and a recursive MFD algorithm) on a CUDA-compatible GPU. For the parallelization of an MFD algorithm (MFD-md), two different parallelization strategies using a GPU are explored. The first parallelization strategy, which has been used in the existing parallel SFD algorithm on GPU, has the problem of computing redundancy. Therefore, we designed a parallelization strategy based on graph theory. The application results show that the proposed parallel approach to calculate flow accumulations on a GPU performs much faster than either sequential algorithms or other parallel GPU-based algorithms based on existing parallelization strategies.
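The graph-theory-based strategy can be pictured as a topological sweep over the flow network: a cell's accumulation is final once all of its upstream contributors have been processed, so at each step all cells with no pending contributors could be updated in parallel on the GPU. The serial Python sketch below illustrates that scheduling on a toy depression-free DEM with a simple slope-proportional MFD rule; it is an illustration of the idea, not the paper's implementation.

```python
# Illustrative serial sketch of dependency-graph (Kahn-style) scheduling for
# MFD flow accumulation: outflow is split among lower neighbours in
# proportion to slope; cells become "ready" once all upstream cells are done.
from collections import deque
import numpy as np

dem = np.array([[9., 8., 7.],
                [8., 6., 4.],
                [7., 4., 1.]])          # tiny depression-free DEM
rows, cols = dem.shape
nbrs = [(-1,-1),(-1,0),(-1,1),(0,-1),(0,1),(1,-1),(1,0),(1,1)]

fractions = {}                          # fractions[c][d]: share routed from c to d
indeg = np.zeros_like(dem, dtype=int)   # number of upstream contributors per cell
for r in range(rows):
    for c in range(cols):
        drops = {}
        for dr, dc in nbrs:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and dem[rr, cc] < dem[r, c]:
                dist = (dr * dr + dc * dc) ** 0.5
                drops[(rr, cc)] = (dem[r, c] - dem[rr, cc]) / dist
        total = sum(drops.values())
        fractions[(r, c)] = {d: s / total for d, s in drops.items()} if total else {}
        for d in fractions[(r, c)]:
            indeg[d] += 1

acc = np.ones_like(dem)                 # every cell contributes itself
ready = deque([(r, c) for r in range(rows) for c in range(cols) if indeg[r, c] == 0])
while ready:                            # each "wave" is parallelizable on a GPU
    cell = ready.popleft()
    for down, frac in fractions[cell].items():
        acc[down] += frac * acc[cell]
        indeg[down] -= 1
        if indeg[down] == 0:
            ready.append(down)
print(acc)
```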
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamness, Mickie
2006-06-01
Pacific Northwest National Laboratory (PNNL) evaluated two fish screen facilities in the Walla Walla River basin in 2005 and early 2006. The Garden City/Lowden screen site was evaluated in April and June 2005 to determine whether the fish screens met National Marine Fisheries Service criteria to provide safe passage for juvenile salmonids. Louvers behind the screens at the Nursery Bridge Fishway were modified in fall 2005 in an attempt to minimize high approach velocities. PNNL evaluated the effects of those modifications in March 2006. Results of the Garden City/Lowden evaluations indicate the site performs well at varying river levels and canal flows. Approach velocities did not exceed 0.4 feet per second (fps) at any time. Sweep velocities increased toward the fish ladder in March but not in June. The air-burst mechanism appears to keep large debris off the screens, although it does not prevent algae and periphyton from growing on the screen face, especially near the bottom of the screens. At Nursery Bridge, results indicate all the approach velocities were below 0.4 fps under the moderate river levels and operational conditions encountered on March 7, 2006. Sweep did not consistently increase toward the fish ladder, but the site generally met the criteria for safe passage of juvenile salmonids. Modifications to the louvers seem to allow more control over the amount of water moving through the screens. We will measure approach velocities when river levels are higher to determine whether the louver modifications can help correct excessive approach velocities under a range of river levels and auxiliary water supply flows.
Parallelized direct execution simulation of message-passing parallel programs
NASA Technical Reports Server (NTRS)
Dickens, Phillip M.; Heidelberger, Philip; Nicol, David M.
1994-01-01
As massively parallel computers proliferate, there is growing interest in finding ways by which the performance of massively parallel codes can be efficiently predicted. This problem arises in diverse contexts such as parallelizing compilers, parallel performance monitoring, and parallel algorithm development. In this paper we describe one solution where one directly executes the application code, but uses a discrete-event simulator to model details of the presumed parallel machine, such as operating system and communication network behavior. Because this approach is computationally expensive, we are interested in its own parallelization, specifically the parallelization of the discrete-event simulator. We describe methods suitable for parallelized direct execution simulation of message-passing parallel programs, and report on the performance of such a system, the Large Application Parallel Simulation Environment (LAPSE), which we have built on the Intel Paragon. On all codes measured to date, LAPSE predicts performance well, typically within 10 percent relative error. Depending on the nature of the application code, we have observed low slowdowns (relative to natively executing code) and high relative speedups using up to 64 processors.
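The core mechanism, directly executing computation while simulating communication, can be illustrated with a toy discrete-event kernel. The sketch below is a minimal illustration of that split, not LAPSE itself; the latency and compute costs are invented.

```python
# Toy discrete-event simulator in the spirit of direct-execution simulation:
# computation is "directly executed" (charged to a per-process virtual clock)
# while message delivery is modeled by the simulator. Costs are made up.
import heapq

LATENCY = 5.0                    # modeled network latency (arbitrary units)

events = []                      # (time, seq, kind, payload) min-heap
clock = {0: 0.0, 1: 0.0}         # per-process virtual clocks
seq = 0

def send(src, dst, compute_cost):
    """Process src computes, then sends a message whose arrival is simulated."""
    global seq
    clock[src] += compute_cost   # direct-execution part: measured compute time
    heapq.heappush(events, (clock[src] + LATENCY, seq, "recv", dst))
    seq += 1

send(0, 1, compute_cost=3.0)
send(1, 0, compute_cost=7.0)

while events:
    t, _, kind, proc = heapq.heappop(events)
    clock[proc] = max(clock[proc], t)    # a receive blocks until arrival
    print(f"t={t:.1f}: process {proc} receives; its clock -> {clock[proc]:.1f}")
```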
Zhao, Xujun; Li, Jiyuan; Jiang, Xikai; ...
2017-06-29
An efficient parallel Stokes' solver is developed towards the complete inclusion of hydrodynamic interactions of Brownian particles in any geometry. A Langevin description of the particle dynamics is adopted, where the long-range interactions are included using a Green's function formalism. We present a scalable parallel computational approach, where the general geometry Stokeslet is calculated following a matrix-free algorithm using the General geometry Ewald-like method. Our approach employs a highly efficient iterative finite element Stokes' solver for the accurate treatment of long-range hydrodynamic interactions within arbitrary confined geometries. A combination of mid-point time integration of the Brownian stochastic differential equation, the parallel Stokes' solver, and a Chebyshev polynomial approximation for the fluctuation-dissipation theorem results in an O(N) parallel algorithm. We also illustrate the new algorithm in the context of the dynamics of confined polymer solutions in equilibrium and non-equilibrium conditions. Our method is extended to treat suspended finite size particles of arbitrary shape in any geometry using an Immersed Boundary approach.
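The fluctuation-dissipation constraint mentioned above requires the random displacements to have covariance 2·kT·M·dt, where M is the mobility. A minimal dense sketch using a Cholesky factor makes the constraint concrete; the paper instead uses a matrix-free Chebyshev approximation precisely because a dense factorization does not scale. The mobility matrix below is an arbitrary SPD placeholder, not a real Stokeslet.

```python
# Minimal Brownian dynamics step under fluctuation-dissipation:
# dx = M F dt + B xi, with B B^T = 2 kT M dt. Dense Cholesky is used here
# for clarity only; M is an arbitrary illustrative SPD "mobility".
import numpy as np

rng = np.random.default_rng(0)
kT, dt = 1.0, 1e-3
N = 4                                        # degrees of freedom (toy size)

A = rng.standard_normal((N, N))
M = A @ A.T + N * np.eye(N)                  # symmetric positive-definite mobility
F = rng.standard_normal(N)                   # deterministic forces

B = np.linalg.cholesky(2.0 * kT * dt * M)    # correlated-noise factor
x = np.zeros(N)
for step in range(1000):
    x = x + M @ F * dt + B @ rng.standard_normal(N)
print("displacement after 1000 steps:", x)
```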
Advanced Boundary Electrode Modeling for tES and Parallel tES/EEG.
Pursiainen, Sampsa; Agsten, Britte; Wagner, Sven; Wolters, Carsten H
2018-01-01
This paper explores advanced electrode modeling in the context of separate and parallel transcranial electrical stimulation (tES) and electroencephalography (EEG) measurements. We focus on boundary condition-based approaches that do not necessitate adding auxiliary elements, e.g., sponges, to the computational domain. In particular, we investigate the complete electrode model (CEM), which incorporates a detailed description of the skin-electrode interface, including its contact surface, impedance, and normal current distribution. The CEM can be applied to both tES and EEG electrodes, which is advantageous when a parallel system is used. In comparison to the CEM, we test two important reduced approaches: the gap model (GAP) and the point electrode model (PEM). We aim to find out the differences between these approaches for a realistic numerical setting based on the stimulation of the auditory cortex. The results obtained suggest, among other things, that GAP and GAP/PEM are sufficiently accurate for the practical application of tES and parallel tES/EEG, respectively. Differences between CEM and GAP were observed mainly in the skin compartment, where only CEM explains the heating effects characteristic of tES.
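For reference, the CEM is commonly stated as the following boundary-value problem; this is the standard form from the EIT/EEG literature, with conventional notation rather than text copied from this paper:

```latex
% Standard complete electrode model (CEM); sigma is conductivity, u the
% potential, e_l the l-th electrode, z_l its contact impedance, U_l its
% voltage, and I_l the injected current.
\begin{align*}
\nabla\cdot(\sigma\nabla u) &= 0 && \text{in the head domain } \Omega,\\
u + z_l\,\sigma\frac{\partial u}{\partial n} &= U_l && \text{on each electrode } e_l,\\
\int_{e_l}\sigma\frac{\partial u}{\partial n}\,\mathrm{d}S &= I_l && \text{(total current through } e_l\text{)},\\
\sigma\frac{\partial u}{\partial n} &= 0 && \text{on the skin between electrodes.}
\end{align*}
```

In these terms, the gap model (GAP) drops the contact-impedance term and the point electrode model (PEM) collapses each electrode surface to a point.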
Bannon, Catherine C; Campbell, Douglas A
2017-01-01
Diatoms are marine primary producers that sink in part due to the density of their silica frustules. Sinking of these phytoplankters is crucial both for the biological pump that sequesters carbon to the deep ocean and for the life strategy of the organism. Sinking rates have previously been measured through settling columns, or with fluorimeters or video microscopy arranged perpendicularly to the direction of sinking. These side-view techniques require large volumes of culture and specialized equipment, and are difficult to scale up to multiple simultaneous measures for screening. We established a method for parallel, large-scale analysis of multiple phytoplankton sinking rates through top-view monitoring of chlorophyll a fluorescence in microtitre well plates. We verified the method through experimental analysis of known factors that influence sinking rates, including exponential versus stationary growth phase, in species of different cell sizes: Thalassiosira pseudonana CCMP1335, chain-forming Skeletonema marinoi RO5A and Coscinodiscus radiatus CCMP312. We fit decay curves to an algebraic transform of the decrease in fluorescence signal as cells sank away from the fluorometer detector, and then used minimal mechanistic assumptions to extract a sinking rate (m d⁻¹) using an RStudio script, SinkWORX. We thereby detected significant differences in sinking rates, as larger diatom cells sank faster than smaller cells, and cultures in stationary phase sank faster than those in exponential phase. Our sinking rate estimates accord well with literature values from previously established methods. This well plate-based method can operate as a high-throughput integrative phenotypic screen for factors that influence sinking rates, including macromolecular allocations, nutrient availability or uptake rates, chain length or cell size, degree of silicification and progression through growth stages. Alternatively, the approach can be used to phenomically screen libraries of mutants.
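The top-view principle can be sketched as fitting a decay curve to the fluorescence time course as cells sink out of the sensed layer, then converting the decay constant to a rate. The functional form, detection depth, and data below are illustrative assumptions; the authors' SinkWORX script applies its own algebraic transform.

```python
# Hedged sketch of a top-view sinking-rate estimate: fit an exponential
# decay to fluorescence and convert the time constant to m per day.
# All constants and data are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

t_hours = np.linspace(0, 48, 13)
# synthetic fluorescence trace decaying toward a background plateau
f_obs = (800 * np.exp(-t_hours / 16) + 100
         + np.random.default_rng(1).normal(0, 5, t_hours.size))

def decay(t, f0, tau, bg):
    return f0 * np.exp(-t / tau) + bg

(f0, tau, bg), _ = curve_fit(decay, t_hours, f_obs, p0=(700, 10, 50))

DETECTION_DEPTH_M = 0.01                       # assumed sensed-layer depth (10 mm)
sinking_rate = DETECTION_DEPTH_M / tau * 24    # m per day, tau in hours
print(f"tau = {tau:.1f} h -> sinking rate ~ {sinking_rate:.3f} m d-1")
```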
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, Chung-Yan; Piccini, Matthew Ernest; Schaff, Ulrich Y.
Multiple cases of attempted bioterrorism events using biotoxins have highlighted the urgent need for tools capable of rapid screening of suspect samples in the field (e.g., mailroom and public events). We present a portable microfluidic device capable of analyzing environmental (e.g., white powder), food (e.g., milk) and clinical (e.g., blood) samples for multiplexed detection of biotoxins. The device is rapid (<15-30 min sample-to-answer), sensitive (<0.08 pg/mL detection limit for botulinum toxin), multiplexed (up to 64 parallel assays) and capable of analyzing small volume samples (<20 μL total sample input). The immunoassay approach (SpinDx) is based on binding of toxins in a sample to antibody-laden capture particles, followed by sedimentation of the particles through a density media in a microfluidic disk and quantification using a laser-induced fluorescence detector. A direct, blinded comparison with a gold standard ELISA revealed a 5-fold more sensitive detection limit for botulinum toxin while requiring 250-fold less sample volume and a 30 minute assay time, with a near unity correlation. A key advantage of the technique is its compatibility with a variety of sample matrices with no additional sample preparation required. Ultrasensitive quantification has been demonstrated from direct analysis of multiple clinical, environmental and food samples, including white powder, whole blood, saliva, salad dressing, whole milk, peanut butter, half and half, honey, and canned meat. We believe that this device can meet an urgent need in screening both potentially exposed people and suspicious samples in mailrooms, airports, public sporting venues and emergency rooms. The general-purpose immunodiagnostics device can also find applications in screening of infectious and systemic diseases or serve as a lab device for conducting rapid immunoassays.
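Detection limits like the quoted <0.08 pg/mL are conventionally estimated from blank replicates and a calibration curve: mean blank signal plus three standard deviations, inverted through the calibration. The sketch below shows that arithmetic with illustrative numbers; it is not the SpinDx analysis code.

```python
# Conventional limit-of-detection estimate: LOD signal = mean(blank) + 3*SD,
# mapped back through a linear calibration. All values are illustrative.
import numpy as np

blank = np.array([10.2, 9.8, 10.5, 10.1, 9.9])        # blank fluorescence replicates
conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0])           # pg/mL calibration standards
signal = np.array([12.0, 14.1, 30.4, 50.8, 211.0])    # detector response

slope, intercept = np.polyfit(conc, signal, 1)        # linear calibration fit
lod_signal = blank.mean() + 3 * blank.std(ddof=1)
lod = (lod_signal - intercept) / slope
print(f"estimated LOD ~ {lod:.3f} pg/mL")
```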
Falck, David; de Vlieger, Jon S. B.; Niessen, Wilfried M. A.; Kool, Jeroen; Honing, Maarten; Irth, Hubertus
2010-01-01
A high-resolution screening method was developed for the p38α mitogen-activated protein kinase to detect and identify small-molecule binders. Its central role in inflammatory diseases makes this enzyme a very important drug target. The setup integrates separation by high-performance liquid chromatography with two parallel detection techniques. High-resolution mass spectrometry gives structural information to identify small molecules while an online enzyme binding detection method provides data on p38α binding. The separation step allows the individual assessment of compounds in a mixture and links affinity and structure information via the retention time. Enzyme binding detection was achieved with a competitive binding assay based on fluorescence enhancement which has a simple principle, is inexpensive, and is easy to interpret. The concentrations of p38α and the fluorescence tracer SK&F86002 were optimized as well as incubation temperature, formic acid content of the LC eluents, and the material of the incubation tubing. The latter notably improved the screening of highly lipophilic compounds. For optimization and validation purposes, the known kinase inhibitors BIRB796, TAK715, and MAPKI1 were used among others. The result is a high-quality assay with Z′ factors around 0.8, which is suitable for semi-quantitative affinity measurements and applicable to various binding modes. Furthermore, the integrated approach gives affinity data on individual compounds instead of averaged ones for mixtures. PMID:20730527
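The Z′ factor quoted above is the standard assay-quality statistic of Zhang et al. (1999); values near 0.8 indicate a wide separation between positive and negative controls. A minimal computation, with illustrative control readings:

```python
# Z' factor (Zhang et al., 1999): Z' = 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|.
# Control readings below are illustrative, not data from this assay.
import numpy as np

pos = np.array([0.95, 0.92, 0.97, 0.94, 0.96])   # positive-control signal
neg = np.array([0.11, 0.14, 0.09, 0.12, 0.10])   # negative-control signal

z_prime = 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())
print(f"Z' = {z_prime:.2f}")
```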
Ni, Yizhao; Kennebeck, Stephanie; Dexheimer, Judith W; McAneney, Constance M; Tang, Huaxiu; Lingren, Todd; Li, Qi; Zhai, Haijun; Solti, Imre
2015-01-01
Objectives (1) To develop an automated eligibility screening (ES) approach for clinical trials in an urban tertiary care pediatric emergency department (ED); (2) to assess the effectiveness of natural language processing (NLP), information extraction (IE), and machine learning (ML) techniques on real-world clinical data and trials. Data and methods We collected eligibility criteria for 13 randomly selected, disease-specific clinical trials actively enrolling patients between January 1, 2010 and August 31, 2012. In parallel, we retrospectively selected data fields including demographics, laboratory data, and clinical notes from the electronic health record (EHR) to represent profiles of all 202,795 patients visiting the ED during the same period. Leveraging NLP, IE, and ML technologies, the automated ES algorithms identified patients whose profiles matched the trial criteria to reduce the pool of candidates for staff screening. The performance was validated on both a physician-generated gold standard of trial–patient matches and a reference standard of historical trial–patient enrollment decisions, where workload, mean average precision (MAP), and recall were assessed. Results Compared with the case without automation, the workload with automated ES was reduced by 92% on the gold standard set, with a MAP of 62.9%. The automated ES achieved a 450% increase in trial screening efficiency. The findings on the gold standard set were confirmed by large-scale evaluation on the reference set of trial–patient matches. Discussion and conclusion By exploiting the text of trial criteria and the content of EHRs, we demonstrated that NLP-, IE-, and ML-based automated ES could successfully identify patients for clinical trials. PMID:25030032
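Mean average precision, one of the reported metrics, averages the precision at each rank where a truly eligible patient appears and then averages across trials. A small sketch with hypothetical rankings:

```python
# Mean average precision (MAP) over ranked candidate lists; the 1/0 flags
# below are hypothetical screening outputs, not study data.
import numpy as np

def average_precision(ranked_relevance):
    """ranked_relevance: 1/0 eligibility flags for one trial's ranked list."""
    hits, precisions = 0, []
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precisions.append(hits / rank)
    return float(np.mean(precisions)) if precisions else 0.0

trials = [[1, 0, 1, 0, 0], [0, 1, 1, 0, 1]]
print(f"MAP = {np.mean([average_precision(t) for t in trials]):.3f}")
```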
Zhao, Siwei; Zhu, Kan; Zhang, Yan; Zhu, Zijie; Xu, Zhengping; Zhao, Min; Pan, Tingrui
2014-11-21
Both endogenous and externally applied electrical stimulation can affect a wide range of cellular functions, including growth, migration, differentiation and division. Among these effects, electrical field (EF)-directed cell migration, also known as electrotaxis, has received broad attention because it holds great potential for facilitating clinical wound healing. Electrotaxis experiments are conventionally conducted in centimetre-sized flow chambers built in Petri dishes. Despite recent efforts to adapt microfluidics for electrotaxis studies, the current electrotaxis experimental setup is still cumbersome due to the need for an external power supply and EF controlling/monitoring systems. There is also a lack of parallel experimental systems for high-throughput electrotaxis studies. In this paper, we present the first independently operable microfluidic platform for high-throughput electrotaxis studies, integrating all functional components for cell migration under EF stimulation (except microscopy) on a compact footprint (the same as a credit card), referred to as ElectroTaxis-on-a-Chip (ETC). Inspired by the R-2R resistor ladder topology in digital signal processing, we develop a systematic approach to design an infinitely expandable microfluidic generator of EF gradients for high-throughput and quantitative studies of EF-directed cell migration. Furthermore, a vacuum-assisted assembly method is utilized to allow direct and reversible attachment of our device to existing cell culture media on biological surfaces, which separates the cell culture and device preparation/fabrication steps. We have demonstrated that our ETC platform is capable of screening human cornea epithelial cell migration under the stimulation of an EF gradient spanning three orders of magnitude. The screening results lead to the identification of the EF-sensitive range of that cell type, which can provide valuable guidance for the clinical application of EF-facilitated wound healing.
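The appeal of the R-2R ladder topology is that, in the idealized ladder, the Thevenin-equivalent voltage halves at every section, so n sections span a 2ⁿ range; roughly ten sections therefore cover the three orders of magnitude screened above. A sketch under these idealized assumptions (the supply voltage and channel length are invented):

```python
# Idealized R-2R ladder: each section halves the voltage seen downstream,
# giving binary-weighted field strengths across parallel channels.
# Supply voltage and channel length are illustrative assumptions.
V0 = 4.0                       # supply voltage applied to the ladder (V)
CHANNEL_LEN_CM = 1.0           # assumed channel length for field conversion

for n in range(1, 11):
    v = V0 / 2 ** n            # voltage at the n-th ladder node
    print(f"section {n:2d}: {v:.4f} V -> {v / CHANNEL_LEN_CM:.4f} V/cm")
```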
Wells, Dagan; Kaur, Kulvinder; Grifo, Jamie; Glassner, Michael; Taylor, Jenny C; Fragouli, Elpida; Munne, Santiago
2014-08-01
The majority of human embryos created using in vitro fertilisation (IVF) techniques are aneuploid. Comprehensive chromosome screening methods, applicable to single cells biopsied from preimplantation embryos, allow reliable identification and transfer of euploid embryos. Recently, randomised trials using such methods have indicated that aneuploidy screening improves IVF success rates. However, the high cost of testing has restricted the availability of this potentially beneficial strategy. This study aimed to harness next-generation sequencing (NGS) technology, with the intention of lowering the costs of preimplantation aneuploidy screening. Embryo biopsy, whole genome amplification and semiconductor sequencing. A rapid (<15 h) NGS protocol was developed, with consumable cost only two-thirds that of the most widely used method for embryo aneuploidy detection. Validation involved blinded analysis of 54 cells from cell lines or biopsies from human embryos. Sensitivity and specificity were 100%. The method was applied clinically, assisting in the selection of euploid embryos in two IVF cycles, producing healthy children in both cases. The NGS approach was also able to reveal specified mutations in the nuclear or mitochondrial genomes in parallel with chromosome assessment. Interestingly, elevated mitochondrial DNA content was associated with aneuploidy (p<0.05), a finding suggestive of a link between mitochondria and chromosomal malsegregation. This study demonstrates that NGS provides highly accurate, low-cost diagnosis of aneuploidy in cells from human preimplantation embryos and is rapid enough to allow testing without embryo cryopreservation. The method described also has the potential to shed light on other aspects of embryo genetics of relevance to health and viability.
Long-term detection of methyltestosterone (ab)use by a yeast transactivation system.
Wolf, Sylvi; Diel, Patrick; Parr, Maria Kristina; Rataj, Felicitas; Schänzer, Wilhelm; Vollmer, Günter; Zierau, Oliver
2011-04-01
The routinely used analytical method for detecting the abuse of anabolic steroids only allows the detection of molecules with known analytical properties. In our supplementary approach to structure-independent detection, substances are identified by their biological activity. In the present study, urines excreted after oral methyltestosterone (MT) administration were analyzed by a yeast androgen screen (YAS). The aim was to trace the excretion of MT or its metabolites in human urine samples and to compare the results with those from the established analytical method. MT and its two major metabolites were tested as pure compounds in the YAS. In a second step, the ability of the YAS to detect MT and its metabolites in urine samples was analyzed. For this purpose, a human volunteer ingested a single dose of 5 mg methyltestosterone. Urine samples were collected at different time intervals (0-307 h) and were analyzed in the YAS and in parallel by GC/MS. Whereas the YAS was able to trace MT in urine samples for at least 14 days, the detection limits of the GC/MS method allowed follow-up only until day six. In conclusion, our results demonstrate that the yeast reporter gene system can detect the activity of anabolic steroids like methyltestosterone with high sensitivity even in urine. Furthermore, the YAS was able to detect MT abuse for a longer period of time than classical GC/MS. Evidently, the system responds to long-lasting metabolites that are as yet unidentified. Therefore, the YAS can be a powerful (pre-)screening tool with the potential to be used to identify persistent or late-appearing metabolites of anabolic steroids, which could be used to enhance the sensitivity of GC/MS detection techniques.
Parallel computations and control of adaptive structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Alvin, Kenneth F.; Belvin, W. Keith; Chong, K. P. (Editor); Liu, S. C. (Editor); Li, J. C. (Editor)
1991-01-01
The equations of motion for structures with adaptive elements for vibration control are presented for parallel computations, to be used as a software package for real-time control of flexible space structures. A brief introduction to the state-of-the-art parallel computational capability is also presented. Time marching strategies are developed for effective use of massive parallel mapping, partitioning, and the necessary arithmetic operations. An example is offered for the simulation of control-structure interaction on a parallel computer, and the impact of the presented approach on applications in disciplines other than the aerospace industry is assessed.
One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...
Adaptive multi-GPU Exchange Monte Carlo for the 3D Random Field Ising Model
NASA Astrophysics Data System (ADS)
Navarro, Cristóbal A.; Huang, Wei; Deng, Youjin
2016-08-01
This work presents an adaptive multi-GPU Exchange Monte Carlo approach for the simulation of the 3D Random Field Ising Model (RFIM). The design is based on a two-level parallelization. The first level, spin-level parallelism, maps the parallel computation as optimal 3D thread-blocks that simulate blocks of spins in shared memory with minimal halo surface, assuming a constant block volume. The second level, replica-level parallelism, uses multi-GPU computation to handle the simulation of an ensemble of replicas. CUDA's concurrent kernel execution feature is used in order to fill the occupancy of each GPU with many replicas, providing a performance boost that is most noticeable at the smallest values of L. In addition to the two-level parallel design, the work proposes an adaptive multi-GPU approach that dynamically builds a proper temperature set free of exchange bottlenecks. The strategy is based on mid-point insertions at the temperature gaps where the exchange rate is most compromised. The extra work generated by the insertions is balanced across the GPUs independently of where the mid-point insertions were performed. Performance results show that the spin-level parallelism is approximately two orders of magnitude faster than a single-core CPU version and one order of magnitude faster than a parallel multi-core CPU version running on 16 cores. Multi-GPU performance scales well in a weak scaling setting, reaching up to 99% efficiency as long as the number of GPUs and L increase together. The combination of the adaptive approach with the parallel multi-GPU design has extended the range of tractable simulations to sizes of L = 32, 64 on a workstation with two GPUs. Sizes beyond L = 64 can eventually be studied using larger multi-GPU systems.
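The adaptive temperature-set construction can be sketched as a simple rule: wherever the measured exchange acceptance between neighbouring replicas falls below a threshold, insert a mid-point temperature. The rates and threshold below are illustrative; in the actual method they come from the running simulation.

```python
# Mid-point temperature insertion at exchange bottlenecks. Acceptance rates
# per adjacent temperature pair are illustrative placeholders.
temps = [1.0, 1.5, 2.0, 2.5, 3.0]
rates = [0.45, 0.08, 0.30, 0.05]       # exchange acceptance per adjacent pair
THRESHOLD = 0.20

new_temps = [temps[0]]
for t_lo, t_hi, rate in zip(temps, temps[1:], rates):
    if rate < THRESHOLD:               # bottleneck: bisect the gap
        new_temps.append((t_lo + t_hi) / 2)
    new_temps.append(t_hi)
print(new_temps)    # -> [1.0, 1.5, 1.75, 2.0, 2.5, 2.75, 3.0]
```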
NASA Technical Reports Server (NTRS)
Bailey, David (Editor); Barton, John (Editor); Lasinski, Thomas (Editor); Simon, Horst (Editor)
1993-01-01
A new set of benchmarks was developed for the performance evaluation of highly parallel supercomputers. These benchmarks consist of a set of kernels, the 'Parallel Kernels,' and a simulated application benchmark. Together they mimic the computation and data movement characteristics of large scale computational fluid dynamics (CFD) applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification - all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.
Ryan, Patricia Y; Graves, Kristi D; Pavlik, Edward J; Andrykowski, Michael A
2007-01-01
Considerable effort has been devoted to the identification of cost-effective approaches to screening for ovarian cancer (OC). Transvaginal ultrasound (TVS) is one such screening approach. Approximately 5-7% of routine TVS screening tests yield abnormal results. Some women experience significant distress after receipt of an abnormal TVS screening test. Four focus groups provided in-depth, qualitative data regarding the informational, psychological, and practical needs of women after the receipt of an abnormal TVS result. Through question and content analytic procedures, we identified four themes: anticipation, emotional response, role of the screening technician, and impact of prior cancer experiences. Results provide initial guidance toward development of interventions to promote adaptive responses after receipt of an abnormal cancer screening test result.
Box schemes and their implementation on the iPSC/860
NASA Technical Reports Server (NTRS)
Chattot, J. J.; Merriam, M. L.
1991-01-01
Research on algorithms for efficiently solving fluid flow problems on massively parallel computers is continued in the present paper. Attention is given to the implementation of a box scheme on the iPSC/860, a massively parallel computer with a peak speed of 10 Gflops and a memory of 128 Mwords. A domain decomposition approach to parallelism is used.
Feature Clustering for Accelerating Parallel Coordinate Descent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scherrer, Chad; Tewari, Ambuj; Halappanavar, Mahantesh
2012-12-06
We demonstrate an approach for accelerating calculation of the regularization path for L1 sparse logistic regression problems. We show the benefit of feature clustering as a preconditioning step for parallel block-greedy coordinate descent algorithms.
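A hedged sketch of the named idea, with a greedy correlation-based grouping standing in for the authors' clustering: features in different clusters are weakly correlated, so a parallel round of coordinate-descent updates that touches at most one feature per cluster interferes less.

```python
# Illustrative feature clustering as a preconditioning step for parallel
# (block-greedy) coordinate descent. The greedy grouping is a stand-in,
# not the authors' algorithm; data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.standard_normal(n)    # correlate features 0 and 1

corr = np.abs(np.corrcoef(X, rowvar=False))
clusters, assigned = [], set()
for j in range(p):
    if j in assigned:
        continue
    members = [k for k in range(p) if k not in assigned and corr[j, k] > 0.9]
    assigned.update(members)
    clusters.append(members)

# One "parallel" round would update a representative from each cluster.
print("clusters:", clusters)
print("features updated concurrently:", [c[0] for c in clusters])
```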
NASA Astrophysics Data System (ADS)
Andrade, Xavier; Alberdi-Rodriguez, Joseba; Strubbe, David A.; Oliveira, Micael J. T.; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Louie, Steven G.; Aspuru-Guzik, Alán; Rubio, Angel; Marques, Miguel A. L.
2012-06-01
Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures.
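The real-time propagation that octopus parallelizes can be illustrated with a toy Crank-Nicolson step on a 1D grid; one such solve per time step is the workload that domain partitioning and blocks of Kohn-Sham states distribute. This is a generic illustration, not octopus's actual propagator.

```python
# Toy real-time propagation: one Crank-Nicolson step solves
# (I + i*dt/2*H) psi_new = (I - i*dt/2*H) psi. A 1D free particle in a box
# stands in for the real Kohn-Sham Hamiltonian.
import numpy as np

n, dx, dt = 200, 0.1, 0.002
x = np.arange(n) * dx

# Finite-difference kinetic Hamiltonian, -1/2 d^2/dx^2 (atomic units)
H = (np.diag(np.full(n, 1.0 / dx**2))
     - np.diag(np.full(n - 1, 0.5 / dx**2), 1)
     - np.diag(np.full(n - 1, 0.5 / dx**2), -1))

psi = np.exp(-((x - 10) ** 2) / 0.5 + 2j * x)    # Gaussian wave packet
psi /= np.linalg.norm(psi)

A = np.eye(n) + 0.5j * dt * H
B = np.eye(n) - 0.5j * dt * H
for step in range(100):
    psi = np.linalg.solve(A, B @ psi)            # unitary to O(dt^3)
print("norm preserved:", np.linalg.norm(psi))
```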
Parallel Tensor Compression for Large-Scale Scientific Data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolda, Tamara G.; Ballard, Grey; Austin, Woody Nathan
As parallel computing trends towards the exascale, scientific data produced by high-fidelity simulations are growing increasingly massive. For instance, a simulation on a three-dimensional spatial grid with 512 points per dimension that tracks 64 variables per grid point for 128 time steps yields 8 TB of data. By viewing the data as a dense five-way tensor, we can compute a Tucker decomposition to find inherent low-dimensional multilinear structure, achieving compression ratios of up to 10000 on real-world data sets with negligible loss in accuracy. So that we can operate on such massive data, we present the first-ever distributed-memory parallel implementation for the Tucker decomposition, whose key computations correspond to parallel linear algebra operations, albeit with nonstandard data layouts. Our approach specifies a data distribution for tensors that avoids any tensor data redistribution, either locally or in parallel. We provide accompanying analysis of the computation and communication costs of the algorithms. To demonstrate the compression and accuracy of the method, we apply our approach to real-world data sets from combustion science simulations. We also provide detailed performance results, including parallel performance in both weak and strong scaling experiments.
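The source of such compression ratios can be seen in a toy sequentially truncated HOSVD: after projecting each mode onto a few leading singular vectors, only a small core tensor and thin factor matrices need to be stored. The sketch below is a minimal serial illustration with toy sizes, not the paper's distributed-memory algorithm.

```python
# Sequentially truncated HOSVD-style Tucker compression of a small tensor.
# Sizes and ranks are toy values chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)
shape, ranks = (30, 40, 50), (5, 5, 5)
T = rng.standard_normal(shape)

factors = []
core = T
for mode, r in enumerate(ranks):
    # unfold the current core along `mode`, keep leading left singular vectors
    unfolded = np.moveaxis(core, mode, 0).reshape(core.shape[mode], -1)
    U = np.linalg.svd(unfolded, full_matrices=False)[0][:, :r]
    factors.append(U)
    # contract the mode with U^T and put the reduced axis back in place
    core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)

stored = core.size + sum(U.size for U in factors)   # core + factor matrices
print(f"compression ratio ~ {T.size / stored:.1f}x")
```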
Common pitfalls in preclinical cancer target validation.
Kaelin, William G
2017-07-01
An alarming number of papers from laboratories nominating new cancer drug targets contain findings that cannot be reproduced by others or are simply not robust enough to justify drug discovery efforts. This problem probably has many causes, including an underappreciation of the danger of being misled by off-target effects when using pharmacological or genetic perturbants in complex biological assays. This danger is particularly acute when, as is often the case in cancer pharmacology, the biological phenotype being measured is a 'down' readout (such as decreased proliferation, decreased viability or decreased tumour growth) that could simply reflect a nonspecific loss of cellular fitness. These problems are compounded by multiple hypothesis testing, such as when candidate targets emerge from high-throughput screens that interrogate multiple targets in parallel, and by a publication and promotion system that preferentially rewards positive findings. In this Perspective, I outline some of the common pitfalls in preclinical cancer target identification and some potential approaches to mitigate them.
Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian
2005-01-01
This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.
Schorpp, Kenji; Rothenaigner, Ina; Maier, Julia; Traenkle, Bjoern; Rothbauer, Ulrich; Hadian, Kamyar
2016-10-01
Many screening hits show relatively poor quality regarding later efficacy and safety. Therefore, small-molecule screening efforts shift toward high-content analysis providing more detailed information. Here, we describe a novel screening approach to identify cell cycle modulators with low toxicity by combining the Cell Cycle Chromobody (CCC) technology with the CytoTox-Glo (CTG) cytotoxicity assay. The CCC technology employs intracellularly functional single-domain antibodies coupled to a fluorescent protein (chromobodies) to visualize the cell cycle-dependent redistribution of the proliferating cell nuclear antigen (PCNA) in living cells. This image-based cell cycle analysis was combined with determination of dead-cell protease activity in cell culture supernatants by the CTG assay. We adopted this multiplex approach to high-throughput format and screened 960 Food and Drug Administration (FDA)-approved drugs. By this, we identified nontoxic compounds, which modulate different cell cycle stages, and validated selected hits in diverse cell lines stably expressing CCC. Additionally, we independently validated these hits by flow cytometry as the current state-of-the-art format for cell cycle analysis. This study demonstrates that CCC imaging is a versatile high-content screening approach to identify cell cycle modulators, which can be multiplexed with cytotoxicity assays for early elimination of toxic compounds during screening.
Afrashtehfar, Kelvin I
2016-06-01
Data sources Medline, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Cochrane Central Register of Controlled Trials, Virtual Health Library and Web of Science were systematically searched up to July 2015 without limitations. Scopus, Google Scholar, ClinicalTrials.gov, the ISRCTN registry as well as reference lists of the included trials and relevant reviews were manually searched. Study selection Randomised (RCTs) and prospective non-randomised clinical trials (non-RCTs) on human patients that compared therapeutic and adverse effects of lingual and labial appliances were considered. One reviewer initially screened titles, and subsequently two reviewers independently screened the selected abstracts and full texts. Data extraction and synthesis The data were extracted independently by the reviewers. Missing or unclear information, ongoing trials and raw data from split-mouth trials were requested from the authors of the trials. The quality of the included trials and potential bias across studies were assessed using Cochrane's risk of bias tool and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. For parallel trials, the mean difference (MD) and the relative risk (RR) were used for continuous outcomes (objective speech performance, subjective speech performance, intercanine width, intermolar width and sagittal anchorage loss) and binary outcomes (eating difficulty), respectively. The standardised mean difference (SMD) was chosen to pool, after conversion, the outcome (oral discomfort) that was assessed both as binary and as continuous. Random-effects meta-analyses were conducted, followed by subgroup and sensitivity analyses. Results Thirteen papers pertaining to 11 clinical trials (three parallel RCTs, one split-mouth RCT and seven parallel prospective non-RCTs) were included, with a total of 407 (34% male/66% female) patients. All trials had at least one bias domain at high risk of bias. Compared with labial appliances, lingual appliances were associated with increased overall oral discomfort, increased speech impediment (measured using auditory analysis), worse speech performance as assessed by laypersons, increased eating difficulty and decreased intermolar width. On the other hand, lingual appliances were associated with increased intercanine width and significantly decreased anchorage loss of the maxillary first molar during space closure. However, the quality of all included analyses was judged as very low because of the high risk of bias of the included trials, inconsistency and imprecision. Conclusions Based on existing trials there is insufficient evidence to make robust recommendations for lingual fixed orthodontic appliances regarding their therapeutic or adverse effects, as the quality of evidence was low.
Hollands, Emma C; Dale, Tim J; Baxter, Andrew W; Meadows, Helen J; Powell, Andrew J; Clare, Jeff J; Trezise, Derek J
2009-08-01
Gamma-amino butyric acid (GABA)-activated Cl- channels are critical mediators of inhibitory postsynaptic potentials in the CNS. To date, rational design efforts to identify potent and selective GABA(A) subtype ligands have been hampered by the absence of suitable high-throughput screening approaches. The authors describe 384-well population patch-clamp (PPC) planar array electrophysiology methods for the study of GABA(A) receptor pharmacology. In HEK293 cells stably expressing human alpha1beta3gamma2 GABA(A) channels, GABA evoked outward currents at 0 mV of 1.05 +/- 0.08 nA, measured 8 s post GABA addition. The I(GABA) was linear and reversed close to the theoretical E(Cl) (-56 mV). Concentration-response curve analysis yielded a mean pEC(50) value of 5.4 and Hill slope of 1.5, and for a series of agonists, the rank order of potency was muscimol > GABA > isoguvacine. A range of known positive modulators, including diazepam and pentobarbital, produced concentration-dependent augmentation of the GABA EC(20) response (1 microM). The competitive antagonists bicuculline and gabazine produced concentration-dependent, parallel, rightward displacement of GABA curves with pA(2) and slope values of 5.7 and 1.0 and 6.7 and 1.0, respectively. In contrast, picrotoxin (0.2-150 microM) depressed the maximal GABA response, implying a non-competitive antagonism. Overall, the pharmacology of human alpha1beta3gamma2 GABA(A) determined by PPC was highly similar to that obtained by conventional patch-clamp methods. In small-scale single-shot screens, Z' values of >0.5 were obtained in agonist, modulator, and antagonist formats with hit rates of 0% to 3%. The authors conclude that despite the inability of the method to resolve the peak agonist responses, PPC can rapidly and usefully quantify pharmacology for the alpha1beta3gamma2 GABA(A) isoform. These data suggest that PPC may be a valuable approach for a focused set and secondary screening of GABA(A) receptors and other slow ligand-gated ion channels.
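The quoted pEC(50) and Hill slope come from fitting a logistic concentration-response model to the evoked currents. A minimal sketch, with illustrative data chosen to be roughly consistent with pEC50 ≈ 5.4 and a slope of 1.5:

```python
# Concentration-response (Hill) fit on normalised currents:
# E = 1 / (1 + 10**(n * (-pEC50 - log10(c)))). Data points are illustrative.
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([0.3e-6, 1e-6, 3e-6, 10e-6, 30e-6, 100e-6])   # [GABA], molar
resp = np.array([0.03, 0.12, 0.42, 0.80, 0.97, 1.00])          # normalised current

def hill(c, pec50, n):
    return 1.0 / (1.0 + 10 ** (n * (-pec50 - np.log10(c))))

(pec50, slope), _ = curve_fit(hill, conc, resp, p0=(5.0, 1.0))
print(f"pEC50 = {pec50:.2f}, Hill slope = {slope:.2f}")
```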
Perturbation Experiments: Approaches for Metabolic Pathway Analysis in Bioreactors.
Weiner, Michael; Tröndle, Julia; Albermann, Christoph; Sprenger, Georg A; Weuster-Botz, Dirk
2016-01-01
In the last decades, targeted metabolic engineering of microbial cells has become one of the major tools in bioprocess design and optimization. For successful application, detailed knowledge is necessary about the relevant metabolic pathways and their regulation inside the cells. Since in vitro experiments cannot reproduce process conditions and behavior properly, data about the cells' metabolic state have to be collected in vivo. For this purpose, special techniques and methods are necessary. Most techniques enabling in vivo characterization of metabolic pathways rely on perturbation experiments, which can be divided into dynamic and steady-state approaches. To avoid any process disturbance, approaches which enable perturbation of cell metabolism in parallel to the continuing production process are reasonable. Furthermore, the fast dynamics of microbial production processes amplifies the need for parallelized data generation. These points motivate the development of a parallelized approach for multiple metabolic perturbation experiments outside the operating production reactor. An appropriate approach for in vivo characterization of metabolic pathways is presented and applied exemplarily to a microbial L-phenylalanine production process at 15 L scale.
A GPU-Accelerated Approach for Feature Tracking in Time-Varying Imagery Datasets.
Peng, Chao; Sahani, Sandip; Rushing, John
2017-10-01
We propose a novel parallel connected component labeling (CCL) algorithm along with efficient out-of-core data management to detect and track feature regions of large time-varying imagery datasets. Our approach contributes to the big data field with parallel algorithms tailored for GPU architectures. We remove the data dependency between frames and achieve pixel-level parallelism. Due to the large size, the entire dataset cannot fit into cached memory. Frames have to be streamed through the memory hierarchy (disk to CPU main memory and then to GPU memory), partitioned, and processed as batches, where each batch is small enough to fit into the GPU. To reconnect the feature regions that are separated due to data partitioning, we present a novel batch merging algorithm to extract the region connection information across multiple batches in a parallel fashion. The information is organized in a memory-efficient structure and supports fast indexing on the GPU. Our experiment uses a commodity workstation equipped with a single GPU. The results show that our approach can efficiently process a weather dataset composed of terabytes of time-varying radar images. The advantages of our approach are demonstrated by comparing to the performance of an efficient CPU cluster implementation which is being used by the weather scientists.
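For orientation, a serial union-find connected component labeling pass over a binary image is shown below; this is the reference computation whose frame- and pixel-level dependencies the paper removes for GPU execution, not the authors' parallel algorithm.

```python
# Serial union-find CCL on a binary image, 4-connectivity; toy input.
import numpy as np

img = np.array([[1, 1, 0, 0],
                [0, 1, 0, 1],
                [0, 0, 0, 1],
                [1, 0, 1, 1]])

parent = {}
def find(a):
    while parent[a] != a:
        parent[a] = parent[parent[a]]   # path halving keeps trees shallow
        a = parent[a]
    return a
def union(a, b):
    parent[find(a)] = find(b)

for r, c in zip(*np.nonzero(img)):      # row-major scan
    parent.setdefault((r, c), (r, c))
    for rr, cc in ((r - 1, c), (r, c - 1)):     # already-visited neighbours
        if rr >= 0 and cc >= 0 and img[rr, cc]:
            union((r, c), (rr, cc))

labels = {p: i for i, p in enumerate(sorted({find(p) for p in parent}))}
out = np.full(img.shape, -1)            # -1 marks background
for p in parent:
    out[p] = labels[find(p)]
print(out)
```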
American Indian Men's Perceptions of Breast Cancer Screening for American Indian Women.
Filippi, Melissa K; Pacheco, Joseph; James, Aimee S; Brown, Travis; Ndikum-Moffor, Florence; Choi, Won S; Greiner, K Allen; Daley, Christine M
2014-01-01
Screening, especially screening mammography, is vital for decreasing breast cancer incidence and mortality. Screening rates in American Indian women are low compared to other racial/ethnic groups. In addition, American Indian women are diagnosed at more advanced stages and have lower 5-year survival rate than others. To better address the screening rates of American Indian women, focus groups (N=8) were conducted with American Indian men (N=42) to explore their perceptions of breast cancer screening for American Indian women. Our intent was to understand men's support level toward screening. Using a community-based participatory approach, focus groups were audio-taped, transcribed verbatim, and analyzed using a text analysis approach developed by our team. Topics discussed included breast cancer and screening knowledge, barriers to screening, and suggestions to improve screening rates. These findings can guide strategies to improve knowledge and awareness, communication among families and health care providers, and screening rates in American Indian communities.
Hübner, Joachim; Waldmann, Annika; Eisemann, Nora; Noftz, Maria; Geller, Alan C; Weinstock, Martin A; Volkmer, Beate; Greinert, Rüdiger; Breitbart, Eckhard W; Katalinic, Alexander
2017-07-07
Early detection is considered to improve the prognosis of cutaneous melanoma. The value of population-based screening for melanoma, however, is still controversial. The aim of this study was to evaluate the predictive power of established risk factors in the setting of a population-based screening and to provide empirical evidence for potential risk stratifications. We reanalyzed data (including age, sex, risk factors, and screening results) of 354,635 participants in the Skin Cancer Research to provide Evidence for Effectiveness of Screening in Northern Germany (SCREEN) project conducted in the German state of Schleswig-Holstein (2003-2004). In multivariable analysis, atypical nevi [odds ratio (OR): 17.4; 95% confidence interval (CI): 14.4-20.1], personal history of melanoma (OR: 5.3; 95% CI: 3.6-7.6), and multiple (≥40) common nevi (OR: 1.3; 95% CI: 1.1-1.6) were associated with an increased risk of melanoma detection. Family history and congenital nevi were not significantly associated with melanoma detection in the SCREEN population. The effects of several risk-adapted screening strategies were evaluated. Hypothesizing a screening of individuals aged ≥35 years, irrespective of risk factors (age approach), the number needed to screen is 559 (95% CI: 514-612), whereas a screening of adults (aged ≥20) with at least one risk factor (risk approach) leads to a number needed to screen of 178 (95% CI: 163-196). Expressed per screen-detected melanoma, the number of missed melanomas is 0.15 (95% CI: 0.12-0.18) with the age approach and 0.22 (95% CI: 0.19-0.26) with the risk approach. The results indicate that focusing on individuals at high risk for melanoma may improve the cost-effectiveness and the benefit-to-harm balance of melanoma screening programs.
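The number needed to screen is the ratio of participants screened to screen-detected melanomas under a given strategy. A tiny sketch reproducing the arithmetic, with hypothetical counts chosen to mirror the reported values:

```python
# Number needed to screen (NNS) = participants screened / cancers detected.
# The counts below are hypothetical, chosen only to reproduce the ratios.
def nns(participants_screened, melanomas_detected):
    return participants_screened / melanomas_detected

print(f"age approach:  NNS ~ {nns(223_600, 400):.0f}")   # -> ~559
print(f"risk approach: NNS ~ {nns(71_200, 400):.0f}")    # -> ~178
```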
ERIC Educational Resources Information Center
Bullock, Karen; McGraw, Sarah A.
2006-01-01
In the Screening Older Minority Women project, the authors applied a community capacity-enhancement approach to promoting breast and cervical cancer screening among older women of color. Members of informal support networks were recruited for this health promotion intervention to empower Latina and African American women to engage in positive…
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Labarta, Jesus; Gimenez, Judit
2004-01-01
With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors, parallel programming techniques have evolved to support parallelism beyond a single level. When comparing the performance of applications based on different programming paradigms, it is important to differentiate between the influence of the programming model itself and other factors, such as implementation-specific behavior of the operating system (OS) or architectural issues. Rewriting a large scientific application in order to employ a new programming paradigm is usually a time-consuming and error-prone task. Before embarking on such an endeavor it is important to determine that there is really a gain that would not be possible with the current implementation. A detailed performance analysis is crucial to clarify these issues. The multilevel programming paradigms considered in this study are hybrid MPI/OpenMP, MLP, and nested OpenMP. The hybrid MPI/OpenMP approach is based on using MPI [7] for the coarse-grained parallelization and OpenMP [9] for fine-grained loop-level parallelism. The MPI programming paradigm assumes a private address space for each process. Data is transferred by explicitly exchanging messages via calls to the MPI library. This model was originally designed for distributed memory architectures but is also suitable for shared memory systems. The second paradigm under consideration is MLP, which was developed by Taft. The approach is similar to MPI/OpenMP, using a mix of coarse-grained process-level parallelization and loop-level OpenMP parallelization. As is the case with MPI, a private address space is assumed for each process. The MLP approach was developed for ccNUMA architectures and explicitly takes advantage of the availability of shared memory. A shared memory arena which is accessible by all processes is required. Communication is done by reading from and writing to the shared memory.
Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators
Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew
2014-01-01
Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as the Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved a substantial speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs over both the sequential implementation and a parallelized OpenMP implementation. An OpenMP implementation on the Intel MIC coprocessor provided speedups with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sreepathi, Sarat; Sripathi, Vamsi; Mills, Richard T
2013-01-01
Inefficient parallel I/O is known to be a major bottleneck among scientific applications employed on supercomputers as the number of processor cores grows into the thousands. Our prior experience indicated that parallel I/O libraries such as HDF5 that rely on MPI-IO do not scale well beyond 10K processor cores, especially on parallel file systems (like Lustre) with a single point of resource contention. Our previous optimization efforts for a massively parallel multi-phase and multi-component subsurface simulator (PFLOTRAN) led to a two-phase I/O approach at the application level where a set of designated processes participate in the I/O process by splitting the I/O operation into a communication phase and a disk I/O phase. The designated I/O processes are created by splitting the MPI global communicator into multiple sub-communicators. The root process in each sub-communicator is responsible for performing the I/O operations for the entire group and then distributing the data to the rest of the group. This approach resulted in over 25X speedup in HDF I/O read performance and 3X speedup in write performance for PFLOTRAN at over 100K processor cores on the ORNL Jaguar supercomputer. This research describes the design and development of a general purpose parallel I/O library, SCORPIO (SCalable block-ORiented Parallel I/O) that incorporates our optimized two-phase I/O approach. The library provides a simplified higher level abstraction to the user, sitting atop existing parallel I/O libraries (such as HDF5) and implements optimized I/O access patterns that can scale on larger numbers of processors. Performance results with standard benchmark problems and PFLOTRAN indicate that our library is able to maintain the same speedups as before with the added flexibility of being applicable to a wider range of I/O intensive applications.
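The communicator-splitting pattern described above can be expressed compactly with MPI. The sketch below is a miniature, hypothetical analogue in mpi4py (it does not reproduce SCORPIO's API): one root per sub-communicator performs the disk phase, then distributes data in the communication phase.

```python
# two_phase_sketch.py -- run with: mpiexec -n 8 python two_phase_sketch.py
import numpy as np
from mpi4py import MPI

world = MPI.COMM_WORLD
rank = world.Get_rank()

GROUP = 4                       # ranks per designated I/O process (hypothetical)
color = rank // GROUP           # which I/O group this rank belongs to
sub = world.Split(color, key=rank)

if sub.Get_rank() == 0:
    # Phase 1 (disk I/O): only the group roots touch the file system.
    # Stand-in for an HDF5 read of this group's slab of the dataset.
    slab = np.arange(GROUP * 10, dtype=np.float64) + 1000 * color
else:
    slab = None

# Phase 2 (communication): the root scatters the slab within its group.
chunks = np.array_split(slab, GROUP) if slab is not None else None
mine = sub.scatter(chunks, root=0)
print(f"rank {rank} (group {color}) got {mine[:3]}...")
```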
2006-10-15
EM 61 and a Schonstedt magnetometer. The magnetometer was used to screen locations for stakes and other intrusive activities. Digital geophysics was... patterns from direct-fire weapons, such as anti-tank rockets, are expected to form ellipses that are highly elongated parallel to the line of fire... Trajectory of Indirect Fire Weapons: Historically, precision development progressed more slowly for indirect-fire weapons because hitting an unseen target
Magnetothermoelectric properties of layered structures for ion impurity scattering
NASA Astrophysics Data System (ADS)
Figarova, S. R.; Huseynov, H. I.; Figarov, V. R.
2018-05-01
In the paper, longitudinal and transverse thermoelectric powers are considered in a magnetic field parallel to the layer plane for scattering of charge carriers by weakly screened impurity ions. Based on the semiclassical approximation, it is obtained that, depending on the position of the Fermi level relative to the miniband top and superlattice period, the thermoelectric power can change sign and amplify.
2012-11-01
vitamin B12. Additionally, a reductant reacts directly with hexavalent chromium [Cr(VI)] to reduce it to the trivalent state. SRS®-M provides a readily... [Figure 8 caption: Hexavalent chromium detected in ISMA effluent post in situ...]
Nadkarni, P. M.; Miller, P. L.
1991-01-01
A parallel program for inter-database sequence comparison was developed on the Intel Hypercube using two models of parallel programming. One version was built using machine-specific Hypercube parallel programming commands. The other version was built using Linda, a machine-independent parallel programming language. The two versions of the program provide a case study comparing these two approaches to parallelization in an important biological application area. Benchmark tests with both programs gave comparable results with a small number of processors. As the number of processors was increased, the Linda version was somewhat less efficient. The Linda version was also run without change on Network Linda, a virtual parallel machine running on a network of desktop workstations. PMID:1807632
Krill, Michael K; Rosas, Samuel; Kwon, KiHyun; Dakkak, Andrew; Nwachukwu, Benedict U; McCormick, Frank
2018-02-01
The clinical examination of the shoulder joint is an undervalued diagnostic tool for evaluating acromioclavicular (AC) joint pathology. Applying evidence-based clinical tests enables providers to make an accurate diagnosis and minimize costly imaging procedures and potential delays in care. The purpose of this study was to create a decision tree analysis enabling simple and accurate diagnosis of AC joint pathology. A systematic review of the Medline, Ovid and Cochrane Review databases was performed to identify level one and two diagnostic studies evaluating clinical tests for AC joint pathology. Individual test characteristics were combined in series and in parallel to improve sensitivities and specificities. A secondary analysis utilized subjective pre-test probabilities to create a clinical decision tree algorithm with post-test probabilities. The optimal special test combination to screen and confirm AC joint pathology combined Paxinos sign and O'Brien's Test, with a specificity of 95.8% when performed in series, whereas Paxinos sign and Hawkins-Kennedy Test demonstrated a sensitivity of 93.7% when performed in parallel. Paxinos sign and O'Brien's Test demonstrated the greatest positive likelihood ratio (2.71), whereas Paxinos sign and Hawkins-Kennedy Test reported the lowest negative likelihood ratio (0.35). No combination of special tests performed in series or in parallel creates more than a small impact on post-test probabilities to screen or confirm AC joint pathology. Paxinos sign and O'Brien's Test is the only special test combination that has a small and sometimes important impact when used both in series and in parallel. Physical examination testing is not beneficial for diagnosis of AC joint pathology when pretest probability is unequivocal. In these instances, it is of benefit to proceed with procedural tests to evaluate AC joint pathology. Ultrasound-guided corticosteroid injections are diagnostic and therapeutic. An ultrasound-guided AC joint corticosteroid injection may be an appropriate new standard for treatment and surgical decision-making. Level of evidence: II, systematic review.
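The series/parallel combination rules underlying these figures are standard, assuming conditional independence of the tests: combining in series (both tests must be positive) multiplies sensitivities and boosts specificity, combining in parallel (either positive) does the reverse, and post-test probability follows from likelihood ratios via Bayes' theorem. A minimal sketch with illustrative inputs rather than the paper's pooled values:

```python
def combine_series(se1, sp1, se2, sp2):
    """Both tests must be positive ("rules in"): Se falls, Sp rises.

    Assumes the two tests are conditionally independent given disease status.
    """
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

def combine_parallel(se1, sp1, se2, sp2):
    """Either test positive ("rules out"): Se rises, Sp falls."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def post_test_probability(pretest, se, sp, positive=True):
    """Update a pre-test probability with a test result via likelihood ratios."""
    lr = se / (1 - sp) if positive else (1 - se) / sp
    odds = pretest / (1 - pretest) * lr
    return odds / (1 + odds)

# Illustrative sensitivity/specificity inputs only (not the paper's values).
se, sp = combine_series(0.79, 0.50, 0.41, 0.95)
print(f"series: Se={se:.2f}, Sp={sp:.2f}")
print(f"post-test P from 30% pre-test, positive combo: "
      f"{post_test_probability(0.30, se, sp):.2f}")
```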
Cardarelli, Roberto; Reese, David; Roper, Karen L.; Cardarelli, Kathryn; Feltner, Frances J.; Studts, Jamie L.; Knight, Jennifer R.; Armstrong, Debra; Weaver, Anthony; Shaffer, Dana
2017-01-01
For low dose CT lung cancer screening to be effective in curbing disease mortality, efforts are needed to overcome barriers to awareness and facilitate uptake of the current evidence-based screening guidelines. A sequential mixed-methods approach was employed to design a screening campaign utilizing messages developed from community focus groups, followed by implementation of the outreach campaign intervention in two high-risk Kentucky regions. This study reports on rates of awareness and screening in intervention regions, as compared to a control region. PMID:27866066
NASA Technical Reports Server (NTRS)
Hall, Lawrence O.; Bennett, Bonnie H.; Tello, Ivan
1994-01-01
A parallel version of CLIPS 5.1 has been developed to run on Intel Hypercubes. The user interface is the same as that for CLIPS, with some added commands to allow for parallel calls. A complete version of CLIPS runs on each node of the hypercube. The system has been instrumented to display the time spent in the match, recognize, and act cycles on each node. Only rule-level parallelism is supported. Parallel commands enable the assertion and retraction of facts to/from remote nodes' working memory. Parallel CLIPS was used to implement a knowledge-based command, control, communications, and intelligence (C(sup 3)I) system to demonstrate the fusion of high-level, disparate sources. We discuss the nature of the information fusion problem, our approach, and implementation. Parallel CLIPS has also been used to run several benchmark parallel knowledge bases such as one to set up a cafeteria. Results from running Parallel CLIPS with parallel knowledge-base partitions indicate that significant speed increases, including superlinear in some cases, are possible.
Wake Encounter Analysis for a Closely Spaced Parallel Runway Paired Approach Simulation
NASA Technical Reports Server (NTRS)
McKissick, Burnell T.; Rico-Cusi, Fernando J.; Murdoch, Jennifer; Oseguera-Lohr, Rosa M.; Stough, Harry P., III; O'Connor, Cornelius J.; Syed, Hazari I.
2009-01-01
A Monte Carlo simulation of simultaneous approaches performed by two transport category aircraft from the final approach fix to a pair of closely spaced parallel runways was conducted to explore the aft boundary of the safe zone in which separation assurance and wake avoidance are provided. The simulation included variations in runway centerline separation, initial longitudinal spacing of the aircraft, crosswind speed, and aircraft speed during the approach. The data from the simulation showed that the majority of the wake encounters occurred near or over the runway and the aft boundaries of the safe zones were identified for all simulation conditions.
All That Glitters: A Glimpse into the Future of Cancer Screening
Developing new screening approaches and rigorously establishing their validity is challenging. Researchers are actively searching for new screening tests that improve the benefits of screening while limiting the harms.
Kinetic treatment of nonlinear magnetized plasma motions - General geometry and parallel waves
NASA Technical Reports Server (NTRS)
Khabibrakhmanov, I. KH.; Galinskii, V. L.; Verheest, F.
1992-01-01
The expansion of kinetic equations in the limit of a strong magnetic field is presented. This gives a natural description of the motions of magnetized plasmas, which are slow compared to the particle gyroperiods and gyroradii. Although the approach is 3D, this very general result is used only to focus on the parallel propagation of nonlinear Alfven waves. The derivative nonlinear Schroedinger-like equation is obtained. Two new terms occur compared to earlier treatments, a nonlinear term proportional to the heat flux along the magnetic field line and a higher-order dispersive term. It is shown that kinetic description avoids the singularities occurring in magnetohydrodynamic or multifluid approaches, which correspond to the degenerate case of sound speeds equal to the Alfven speed, and that parallel heat fluxes cannot be neglected, not even in the case of low parallel plasma beta. A truly stationary soliton solution is derived.
Demi, Libertario; Viti, Jacopo; Kusters, Lieneke; Guidi, Francesco; Tortoli, Piero; Mischi, Massimo
2013-11-01
The speed of sound in the human body limits the achievable data acquisition rate of pulsed ultrasound scanners. To overcome this limitation, parallel beamforming techniques are used in ultrasound 2-D and 3-D imaging systems. Different parallel beamforming approaches have been proposed. They may be grouped into two major categories: parallel beamforming in reception and parallel beamforming in transmission. The first category is not optimal for harmonic imaging; the second category may be more easily applied to harmonic imaging. However, inter-beam interference represents an issue. To overcome these shortcomings and exploit the benefit of combining harmonic imaging and high data acquisition rate, a new approach has been recently presented which relies on orthogonal frequency division multiplexing (OFDM) to perform parallel beamforming in transmission. In this paper, parallel transmit beamforming using OFDM is implemented for the first time on an ultrasound scanner. An advanced open platform for ultrasound research is used to investigate the axial resolution and interbeam interference achievable with parallel transmit beamforming using OFDM. Both fundamental and second-harmonic imaging modalities have been considered. Results show that, for fundamental imaging, axial resolution in the order of 2 mm can be achieved in combination with interbeam interference in the order of -30 dB. For second-harmonic imaging, axial resolution in the order of 1 mm can be achieved in combination with interbeam interference in the order of -35 dB.
Correction factors for self-selection when evaluating screening programmes.
Spix, Claudia; Berthold, Frank; Hero, Barbara; Michaelis, Jörg; Schilling, Freimut H
2016-03-01
In screening programmes there is recognized bias introduced through participant self-selection (the healthy screenee bias). Methods used to evaluate screening programmes include Intention-to-screen, per-protocol, and the "post hoc" approach in which, after introducing screening for everyone, the only evaluation option is participants versus non-participants. All methods are prone to bias through self-selection. We present an overview of approaches to correct for this bias. We considered four methods to quantify and correct for self-selection bias. Simple calculations revealed that these corrections are actually all identical, and can be converted into each other. Based on this, correction factors for further situations and measures were derived. The application of these correction factors requires a number of assumptions. Using as an example the German Neuroblastoma Screening Study, no relevant reduction in mortality or stage 4 incidence due to screening was observed. The largest bias (in favour of screening) was observed when comparing participants with non-participants. Correcting for bias is particularly necessary when using the post hoc evaluation approach, however, in this situation not all required data are available. External data or further assumptions may be required for estimation. © The Author(s) 2015.
Computational design of d-peptide inhibitors of hepatitis delta antigen dimerization
NASA Astrophysics Data System (ADS)
Elkin, Carl D.; Zuccola, Harmon J.; Hogle, James M.; Joseph-McCarthy, Diane
2000-11-01
Hepatitis delta virus (HDV) encodes a single polypeptide called hepatitis delta antigen (DAg). Dimerization of DAg is required for viral replication. The structure of the dimerization region, residues 12 to 60, consists of an anti-parallel coiled coil [Zuccola et al., Structure, 6 (1998) 821]. Multiple Copy Simultaneous Searches (MCSS) of the hydrophobic core region formed by the bend in the helix of one monomer of this structure were carried out for many diverse functional groups. Six critical interaction sites were identified. The Protein Data Bank was searched for backbone templates to use in the subsequent design process by matching to these sites. A 14 residue helix expected to bind to the d-isomer of the target structure was selected as the template. Over 200 000 mutant sequences of this peptide were generated based on the MCSS results. A secondary structure prediction algorithm was used to screen all sequences, and in general only those that were predicted to be highly helical were retained. Approximately 100 of these 14-mers were model built as d-peptides and docked with the l-isomer of the target monomer. Based on calculated interaction energies, predicted helicity, and intrahelical salt bridge patterns, a small number of peptides were selected as the most promising candidates. The ligand design approach presented here is the computational analogue of mirror image phage display. The results have been used to characterize the interactions responsible for formation of this model anti-parallel coiled coil and to suggest potential ligands to disrupt it.
Lammers, Joris; Stoker, Janka I; Stapel, Diederik A
2009-12-01
How does power affect behavior? We posit that this depends on the type of power. We distinguish between social power (power over other people) and personal power (freedom from other people) and argue that these two types of power have opposite associations with independence and interdependence. We propose that when the distinction between independence and interdependence is relevant, social power and personal power will have opposite effects; however, they will have parallel effects when the distinction is irrelevant. In two studies (an experimental study and a large field study), we demonstrate this by showing that social power and personal power have opposite effects on stereotyping, but parallel effects on behavioral approach.
Multiple Strategies for Spatial Integration of 2D Layouts within Working Memory
Meilinger, Tobias; Watanabe, Katsumi
2016-01-01
Prior results on the spatial integration of layouts within a room differed regarding the reference frame that participants used for integration. We asked whether these differences also occur when integrating 2D screen views and, if so, what the reasons for this might be. In four experiments we showed that integrating reference frames varied as a function of task familiarity combined with processing time, cues for spatial transformation, and information about action requirements paralleling results in the 3D case. Participants saw part of an object layout in screen 1, another part in screen 2, and reacted on the integrated layout in screen 3. Layout presentations between two screens coincided or differed in orientation. Aligning misaligned screens for integration is known to increase errors/latencies. The error/latency pattern was thus indicative of the reference frame used for integration. We showed that task familiarity combined with self-paced learning, visual updating, and knowing from where to act prioritized the integration within the reference frame of the initial presentation, which was updated later, and from where participants acted respectively. Participants also heavily relied on layout intrinsic frames. The results show how humans flexibly adjust their integration strategy to a wide variety of conditions. PMID:27101011
1001 Ways to run AutoDock Vina for virtual screening
NASA Astrophysics Data System (ADS)
Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.
2016-03-01
Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.
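Point (1) above, adding a parallelization level on a multi-core machine, is commonly realized by pinning each Vina run to a single core and keeping many ligands in flight. A minimal sketch along those lines, assuming a prepared receptor.pdbqt, a conf.txt holding the search box, and a ligands/ directory of prepared ligands (all hypothetical paths); --cpu, --seed, --exhaustiveness, and --out are standard Vina command-line options.

```python
import glob
import subprocess
from multiprocessing import Pool

SEED = 42  # fixed seed, captured for reproducibility (necessary, not sufficient)

def dock(ligand):
    out = ligand.replace(".pdbqt", "_out.pdbqt")
    subprocess.run(
        ["vina", "--receptor", "receptor.pdbqt", "--ligand", ligand,
         "--config", "conf.txt", "--out", out,
         "--cpu", "1",                    # one core per Vina run ...
         "--seed", str(SEED),
         "--exhaustiveness", "8"],
        check=True,
    )
    return out

if __name__ == "__main__":
    ligands = sorted(glob.glob("ligands/*.pdbqt"))
    with Pool(processes=8) as pool:       # ... eight runs in flight at a time
        for out in pool.imap_unordered(dock, ligands):
            print("docked:", out)
```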
Automated Analysis of siRNA Screens of Virus Infected Cells Based on Immunofluorescence Microscopy
NASA Astrophysics Data System (ADS)
Matula, Petr; Kumar, Anil; Wörz, Ilka; Harder, Nathalie; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl
We present an image analysis approach as part of a high-throughput microscopy screening system based on cell arrays for the identification of genes involved in Hepatitis C and Dengue virus replication. Our approach comprises: cell nucleus segmentation, quantification of virus replication level in cells, localization of regions with transfected cells, cell classification by infection status, and quality assessment of an experiment. The approach is fully automatic and has been successfully applied to a large number of cell array images from screening experiments. The experimental results show a good agreement with the expected behavior of positive as well as negative controls and encourage the application to screens from further high-throughput experiments.
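As a toy illustration of the first pipeline step, nucleus segmentation (this is not the authors' algorithm), global thresholding plus connected-component labeling with scikit-image already yields per-nucleus regions whose reporter intensities could feed a replication-level readout; the synthetic images below stand in for real DAPI and virus channels.

```python
import numpy as np
from skimage import filters, measure, morphology

def segment_nuclei(dapi, min_area=50):
    """Very simple nucleus segmentation: Otsu threshold + labeling."""
    mask = dapi > filters.threshold_otsu(dapi)
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    return measure.label(mask)

def per_cell_signal(labels, virus_channel):
    """Mean reporter intensity per segmented nucleus (replication proxy)."""
    return [r.mean_intensity
            for r in measure.regionprops(labels, intensity_image=virus_channel)]

rng = np.random.default_rng(0)            # synthetic stand-in images
dapi = rng.random((256, 256))
virus = rng.random((256, 256))
labels = segment_nuclei(dapi)
print("nuclei found:", labels.max(), "signals:", per_cell_signal(labels, virus)[:3])
```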
NASA Astrophysics Data System (ADS)
Srinivasa, K. G.; Shree Devi, B. N.
2017-10-01
String searching in documents has become a tedious task with the evolution of Big Data. The generation of large data sets demands a high-performance search algorithm in areas such as text mining, information retrieval and many others. The popularity of GPUs for general-purpose computing has been increasing for various applications. Therefore it is of great interest to exploit the thread feature of a GPU to provide a high-performance search algorithm. This paper proposes an optimized new approach to the N-gram model for string search in a number of lengthy documents and its GPU implementation. The algorithm exploits GPGPUs for searching strings in many documents, employing character-level N-gram matching with a parallel Score Table approach and search using the CUDA API. The new approach of the Score Table, used for frequency storage of N-grams in a document, makes the search independent of the document's length and allows faster access to the frequency values, thus decreasing the search complexity. The extensive thread feature in a GPU has been exploited to enable parallel pre-processing of trigrams in a document for Score Table creation and parallel search in a huge number of documents, thus speeding up the whole search process even for a large pattern size. Experiments were carried out for many documents of varied length and search strings from the standard Lorem Ipsum text on NVIDIA's GeForce GT 540M GPU with 96 cores. Results prove that the parallel approach for Score Table creation and searching gives a good speedup over the same approach executed serially.
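A serial Python analogue of the Score Table idea may clarify it (the paper's implementation is CUDA; this sketch only mirrors the data structure): each document is reduced once to a table of trigram frequencies, after which scoring a query costs one lookup per query trigram, independent of document length. On the GPU, table construction and per-document scoring are the embarrassingly parallel parts mapped to threads.

```python
from collections import Counter

def score_table(text, n=3):
    """Frequency table of character n-grams -- built once per document."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def score(query, table, n=3):
    """Sum of table frequencies over the query's n-grams (length-independent)."""
    return sum(table[query[i:i + n]] for i in range(len(query) - n + 1))

docs = {"d1": "lorem ipsum dolor sit amet",
        "d2": "ipsum lorem ipsum"}               # toy corpus
tables = {name: score_table(text) for name, text in docs.items()}
print(sorted(((score("ipsum", t), name) for name, t in tables.items()),
             reverse=True))
```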
A microfluidic approach to parallelized transcriptional profiling of single cells.
Sun, Hao; Olsen, Timothy; Zhu, Jing; Tao, Jianguo; Ponnaiya, Brian; Amundson, Sally A; Brenner, David J; Lin, Qiao
2015-12-01
The ability to correlate single-cell genetic information with cellular phenotypes is of great importance to biology and medicine, as it holds the potential to gain insight into disease pathways that is unavailable from ensemble measurements. We present a microfluidic approach to parallelized, rapid, quantitative analysis of messenger RNA from single cells via RT-qPCR. The approach leverages an array of single-cell RT-qPCR analysis units formed by a set of parallel microchannels concurrently controlled by elastomeric pneumatic valves, thereby enabling parallelized handling and processing of single cells in a drastically simplified operation procedure using a relatively small number of microvalves. All steps for single-cell RT-qPCR, including cell isolation and immobilization, cell lysis, mRNA purification, reverse transcription and qPCR, are integrated on a single chip, eliminating the need for off-chip manual cell and reagent transfer and qPCR amplification as commonly used in existing approaches. Additionally, the approach incorporates optically transparent microfluidic components to allow monitoring of single-cell trapping without the need for molecular labeling that can potentially alter the targeted gene expression and utilizes a polycarbonate film as a barrier against evaporation to minimize the loss of reagents at elevated temperatures during the analysis. We demonstrate the utility of the approach by transcriptional profiling of the induction of the cyclin-dependent kinase inhibitor 1a and glyceraldehyde 3-phosphate dehydrogenase genes in single cells from the MCF-7 breast cancer cell line. Furthermore, methyl methanesulfonate is employed to allow measurement of the expression of the genes in individual cells responding to a genotoxic stress.
High-throughput detection of ethanol-producing cyanobacteria in a microdroplet platform.
Abalde-Cela, Sara; Gould, Anna; Liu, Xin; Kazamia, Elena; Smith, Alison G; Abell, Chris
2015-05-06
Ethanol production by microorganisms is an important renewable energy source. Most processes involve fermentation of sugars from plant feedstock, but there is increasing interest in direct ethanol production by photosynthetic organisms. To facilitate this, a high-throughput screening technique for the detection of ethanol is required. Here, a method for the quantitative detection of ethanol in a microdroplet-based platform is described that can be used for screening cyanobacterial strains to identify those with the highest ethanol productivity levels. The detection of ethanol by enzymatic assay was optimized both in bulk and in microdroplets. In parallel, the encapsulation of engineered ethanol-producing cyanobacteria in microdroplets and their growth dynamics in microdroplet reservoirs were demonstrated. The combination of modular microdroplet operations including droplet generation for cyanobacteria encapsulation, droplet re-injection and pico-injection, and laser-induced fluorescence, were used to create this new platform to screen genetically engineered strains of cyanobacteria with different levels of ethanol production.
Posture and performance: sitting vs. standing for security screening.
Drury, C G; Hsiao, Y L; Joseph, C; Joshi, S; Lapp, J; Pennathur, P R
2008-03-01
A classification of the literature on the effects of workplace posture on performance of different mental tasks showed few consistent patterns. A parallel classification of the complementary effect of performance on postural variables gave similar results. Because of a lack of data for signal detection tasks, an experiment was performed using 12 experienced security operators performing an X-ray baggage-screening task with three different workplace arrangements. The current workplace, sitting on a high chair viewing a screen placed on top of the X-ray machine, was compared to a standing workplace and a conventional desk-sitting workplace. No performance effects of workplace posture were found, although the experiment was able to measure performance effects of learning and body part discomfort effects of workplace posture. There are implications for the classification of posture and performance and for the justification of ergonomics improvements based on performance increases.
Thin-phase screen estimates of TID effects on midlatitude transionospheric radio paths
NASA Astrophysics Data System (ADS)
Reilly, Michael H.
1993-11-01
The thin-phase screen model for ionospheric irregularity perturbations to transionospheric radio propagation is redefined. It is argued that the phase screen normal should be along the line of sight (LOS) between a receiver on the ground and a space transmitter, rather than in the zenith direction at the point of intersection with the LOS, which is traditional. The model is applied to a calculation of TID strength thresholds for the occurrence of multipath and scintillation. The results are in sharp disagreement with the traditional model, which predicts thresholds lower by an order of magnitude in typical cases. Midlatitude observations of TID strengths are reviewed, and it is found that multipath thresholds can be exceeded under one or more favorable circumstances, which include frequencies below about 100 MHz, low elevation angles, winter, night, atmospheric gravity wave velocity near the magnetic field direction and away from parallel with the LOS, and low solar activity.
Ulaczyk-Lesanko, Agnieszka; Pelletier, Eric; Lee, Maria; Prinz, Heino; Waldmann, Herbert; Hall, Dennis G
2007-01-01
Several solid- and solution-phase strategies were evaluated for the preparation of libraries of polysubstituted piperidines of type 7 using the tandem aza[4+2]cycloaddition/allylboration multicomponent reaction between 1-aza-4-boronobutadienes, maleimides, and aldehydes. A novel four-component variant of this chemistry was developed in solution phase, and it circumvents the need for pre-forming the azabutadiene component. A parallel synthesis coupled with compound purification by HPLC with mass-based fraction collection allowed the preparation of a library of 944 polysubstituted piperidines in a high degree of purity suitable for biological screening. A representative subset of 244 compounds was screened against a panel of phosphatase enzymes, and despite the modest levels of activity obtained, this study demonstrated that piperidines of type 7 display the right physical properties (e.g., solubility) to be assayed effectively in high-throughput enzymatic tests.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chrisochoides, N.; Sukup, F.
In this paper we present a parallel implementation of the Bowyer-Watson (BW) algorithm using the task-parallel programming model. The BW algorithm constitutes an ideal mesh refinement strategy for implementing a large class of unstructured mesh generation techniques on both sequential and parallel computers, by preventing the need for global mesh refinement. Its implementation on distributed memory multicomputers using the traditional data-parallel model has been proven very inefficient due to excessive synchronization needed among processors. In this paper we demonstrate that with the task-parallel model we can tolerate synchronization costs inherent to data-parallel methods by exploiting concurrency at the processor level. Our preliminary performance data indicate that the task-parallel approach: (i) is almost four times faster than the existing data-parallel methods, (ii) scales linearly, and (iii) introduces minimum overheads compared to the "best" sequential implementation of the BW algorithm.
In vitro, high-throughput approaches have been widely recommended as an approach to screen chemicals for the potential to cause developmental neurotoxicity and prioritize them for additional testing. The choice of cellular models for such an approach will have important ramificat...
A parallel approach of COFFEE objective function to multiple sequence alignment
NASA Astrophysics Data System (ADS)
Zafalon, G. F. D.; Visotaky, J. M. V.; Amorim, A. R.; Valêncio, C. R.; Neves, L. A.; de Souza, R. C. G.; Machado, J. M.
2015-09-01
Computational tools to assist genomic analyses have become ever more necessary due to the fast increase in the amount of available data. Given the high computational cost of deterministic algorithms for sequence alignment, many works concentrate on the development of heuristic approaches to multiple sequence alignment. However, the selection of an approach that offers solutions with good biological significance and feasible execution time is a great challenge. Thus, this work presents the parallelization of the processing steps of the MSA-GA tool using the multithreading paradigm in the execution of the COFFEE objective function. The standard objective function implemented in the tool is the Weighted Sum of Pairs (WSP), which produces some distortions in the final alignments when sequence sets with low similarity are aligned. In previous studies we implemented the COFFEE objective function in the tool to smooth these distortions. Although the nature of the COFFEE objective function implies an increase in execution time, the approach contains steps that can be executed in parallel. With the improvements implemented in this work, the execution time of the new approach is 24% faster than the sequential approach with COFFEE. Moreover, the COFFEE multithreaded approach is more efficient than WSP because, besides being slightly faster, its biological results are better.
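The parallelizable point in a COFFEE-style objective function is the scoring of aligned sequence pairs against the consistency library, since the pairs are independent of one another. The sketch below is a simplified, hypothetical rendering (not the MSA-GA code, and without COFFEE's pair weighting) that farms the pairs out to a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def pair_score(seq_a, seq_b, library):
    """Fraction of aligned (non-gap) columns that the pairwise library confirms."""
    hits = total = 0
    col_a = col_b = 0                     # residue indices within each sequence
    for x, y in zip(seq_a, seq_b):
        if x != "-" and y != "-":
            total += 1
            hits += (col_a, col_b) in library
        col_a += x != "-"
        col_b += y != "-"
    return hits / total if total else 0.0

def coffee_score(alignment, libraries, workers=4):
    """Average pairwise consistency, with pairs scored concurrently."""
    pairs = list(combinations(range(len(alignment)), 2))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(
            lambda ij: pair_score(alignment[ij[0]], alignment[ij[1]],
                                  libraries[ij]), pairs))
    return sum(scores) / len(scores)

# Toy alignment; the library lists residue pairings each sequence pair supports.
msa = ["AC-GT", "ACG-T", "A-CGT"]
libs = {(i, j): {(0, 0), (1, 1)} for i, j in combinations(range(3), 2)}
print(round(coffee_score(msa, libs), 3))
```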
ERIC Educational Resources Information Center
Babor, Thomas F.; McRee, Bonnie G.; Kassebaum, Patricia A.; Grimaldi, Paul L.; Ahmed, Kazi; Bray, Jeremy
2007-01-01
Screening, Brief Intervention, and Referral to Treatment (SBIRT) is a comprehensive and integrated approach to the delivery of early intervention and treatment services through universal screening for persons with substance use disorders and those at risk. This paper describes research on the components of SBIRT conducted during the past 25 years,…
A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor
NASA Technical Reports Server (NTRS)
Rao, Hariprasad Nannapaneni
1989-01-01
The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.
Some fast elliptic solvers on parallel architectures and their complexities
NASA Technical Reports Server (NTRS)
Gallopoulos, E.; Saad, Y.
1989-01-01
The discretization of separable elliptic partial differential equations leads to linear systems with special block tridiagonal matrices. Several methods are known to solve these systems, the most general of which is the Block Cyclic Reduction (BCR) algorithm which handles equations with nonconstant coefficients. A method was recently proposed to parallelize and vectorize BCR. In this paper, the mapping of BCR on distributed memory architectures is discussed, and its complexity is compared with that of other approaches including the Alternating-Direction method. A fast parallel solver is also described, based on an explicit formula for the solution, which has parallel computational complexity lower than that of parallel BCR.
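For the scalar tridiagonal case, one level of cyclic reduction eliminates the odd-indexed unknowns and leaves a half-size tridiagonal system; every new coefficient within a level can be formed independently, which is the parallelism BCR exposes. A minimal recursive sketch (scalar coefficients, no pivoting; not the block variant discussed above):

```python
import numpy as np

def cyclic_reduction(a, b, c, d):
    """Solve a[i]x[i-1] + b[i]x[i] + c[i]x[i+1] = d[i] with a[0] = c[-1] = 0."""
    n = len(b)
    if n == 1:
        return d / b
    # Zero-pad so the end rows need no special cases (pad B with 1, not 0).
    A = np.concatenate(([0.0], a, [0.0]))
    B = np.concatenate(([1.0], b, [1.0]))
    C = np.concatenate(([0.0], c, [0.0]))
    D = np.concatenate(([0.0], d, [0.0]))
    i = np.arange(0, n, 2) + 1            # padded positions of the even rows
    alpha, gamma = A[i] / B[i - 1], C[i] / B[i + 1]
    # One reduction level: all new coefficients are independent (parallel step).
    x = np.zeros(n)
    x[0::2] = cyclic_reduction(-alpha * A[i - 1],
                               B[i] - alpha * C[i - 1] - gamma * A[i + 1],
                               -gamma * C[i + 1],
                               D[i] - alpha * D[i - 1] - gamma * D[i + 1])
    # Back-substitute the odd rows from their even neighbours.
    X = np.concatenate(([0.0], x, [0.0]))
    j = np.arange(1, n, 2) + 1
    x[1::2] = (D[j] - A[j] * X[j - 1] - C[j] * X[j + 1]) / B[j]
    return x

rng = np.random.default_rng(1)
n = 17
b = 4 + rng.random(n); a = rng.random(n); c = rng.random(n); d = rng.random(n)
a[0] = c[-1] = 0.0
T = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(cyclic_reduction(a, b, c, d), np.linalg.solve(T, d)))  # True
```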
A parallel algorithm for generation and assembly of finite element stiffness and mass matrices
NASA Technical Reports Server (NTRS)
Storaasli, O. O.; Carmona, E. A.; Nguyen, D. T.; Baddourah, M. A.
1991-01-01
A new algorithm is proposed for parallel generation and assembly of the finite element stiffness and mass matrices. The proposed assembly algorithm is based on a node-by-node approach rather than the more conventional element-by-element approach. The new algorithm's generality and computation speed-up when using multiple processors are demonstrated for several practical applications on multi-processor Cray Y-MP and Cray 2 supercomputers.
Krylov subspace methods on supercomputers
NASA Technical Reports Server (NTRS)
Saad, Youcef
1988-01-01
A short survey of recent research on Krylov subspace methods with emphasis on implementation on vector and parallel computers is presented. Conjugate gradient methods have proven very useful on traditional scalar computers, and their popularity is likely to increase as three-dimensional models gain importance. A conservative approach to derive effective iterative techniques for supercomputers has been to find efficient parallel/vector implementations of the standard algorithms. The main source of difficulty in the incomplete factorization preconditionings is in the solution of the triangular systems at each step. A few approaches consisting of implementing efficient forward and backward triangular solutions are described in detail. Polynomial preconditioning as an alternative to standard incomplete factorization techniques is also discussed. Another efficient approach is to reorder the equations so as to improve the structure of the matrix to achieve better parallelism or vectorization. An overview of these and other ideas and their effectiveness or potential for different types of architectures is given.
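For reference, the kernel structure that makes Krylov methods attractive on vector and parallel machines is visible in a few lines of textbook (unpreconditioned) conjugate gradients: each iteration is one matrix-vector product plus a handful of dot products and AXPY updates, all of which vectorize or distribute naturally.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Unpreconditioned CG for a symmetric positive definite matrix A."""
    x = np.zeros_like(b)
    r = b - A @ x                      # residual
    p = r.copy()                       # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p                     # the mat-vec: dominant, parallel kernel
        alpha = rs / (p @ Ap)
        x += alpha * p                 # AXPY updates: vectorize trivially
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# 1-D Poisson matrix (tridiagonal, SPD) as a test problem.
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
print(np.allclose(A @ conjugate_gradient(A, b), b))  # True
```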
Highly efficient spatial data filtering in parallel using the opensource library CPPPO
NASA Astrophysics Data System (ADS)
Municchi, Federico; Goniva, Christoph; Radl, Stefan
2016-10-01
CPPPO is a compilation of parallel data processing routines developed with the aim to create a library for "scale bridging" (i.e. connecting different scales by means of closure models) in a multi-scale approach. CPPPO features a number of parallel filtering algorithms designed for use with structured and unstructured Eulerian meshes, as well as Lagrangian data sets. In addition, data can be processed on the fly, allowing the collection of relevant statistics without saving individual snapshots of the simulation state. Our library is provided with an interface to the widely-used CFD solver OpenFOAM®, and can be easily connected to any other software package via interface modules. Also, we introduce a novel, extremely efficient approach to parallel data filtering, and show that our algorithms scale super-linearly on multi-core clusters. Furthermore, we provide a guideline for choosing the optimal Eulerian cell selection algorithm depending on the number of CPU cores used. Finally, we demonstrate the accuracy and the parallel scalability of CPPPO in a showcase focusing on heat and mass transfer from a dense bed of particles.
Shaker, M; Stukus, D; Chan, E S; Fleischer, D M; Spergel, J M; Greenhawt, M
2018-03-30
Early peanut introduction (EPI) in the first year of life is associated with reduced risk of developing peanut allergy in children with either severe eczema and/or egg allergy. However, EPI recommendations differ among countries with formal guidelines. Using simulation and Markov modeling over a 20-year horizon to explore optimal EPI strategies applied to the US population, we compared high-risk infant-specific IgE peanut screening (US/Canadian) with the Australasian Society for Clinical Immunology and Allergy (Australia/New Zealand) (ASCIA) and the United Kingdom Department of Health (UKDOH)-published EPI approaches. Screening peanut skin testing of all children with early-onset eczema and/or egg allergy before in-office peanut introduction was dominated by a no-screening approach, in terms of number of cases of peanut allergy prevented, quality-adjusted life years (QALY), and healthcare costs, although screening resulted in a slightly lower rate of allergic reactions to peanut per patient in high-risk children. Considering costs of peanut allergy in high-risk children, the per-patient cost of early introduction without screening over the model horizon was $6556.69 (95%CI, $6512.76-$6600.62), compared with a cost of $7576.32 (95%CI, $7531.38-$7621.26) for skin test screening prior to introduction. From a US societal perspective, screening prior to introduction cost $654 115 322 and resulted in 3208 additional peanut allergy diagnoses. Both screening and nonscreening approaches dominated deliberately delayed peanut introduction. A no-screening approach for EPI has superior health and economic benefits in terms of number of peanut allergy cases prevented, QALY, and total healthcare costs compared to screening and in-office peanut introduction. © 2018 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.
Li, Chuan; Petukh, Marharyta; Li, Lin; Alexov, Emil
2013-08-15
Due to the enormous importance of electrostatics in molecular biology, calculating the electrostatic potential and corresponding energies has become a standard computational approach for the study of biomolecules and nano-objects immersed in water and salt phase or other media. However, the electrostatics of large macromolecules and macromolecular complexes, including nano-objects, may not be obtainable via explicit methods, and even the standard continuum electrostatics methods may not be applicable due to high computational time and memory requirements. Here, we report further development of the parallelization scheme reported in our previous work (Li, et al., J. Comput. Chem. 2012, 33, 1960) to include parallelization of the molecular surface and energy calculations components of the algorithm. The parallelization scheme utilizes different approaches such as space domain parallelization, algorithmic parallelization, multithreading, and task scheduling, depending on the quantity being calculated. This allows for efficient use of the computing resources of the corresponding computer cluster. The parallelization scheme is implemented in the popular software DelPhi and results in severalfold speedups. As a demonstration of the efficiency and capability of this methodology, the electrostatic potential and electric field distributions are calculated for the bovine mitochondrial supercomplex, illustrating their complex topology, which cannot be obtained by modeling the supercomplex components alone. Copyright © 2013 Wiley Periodicals, Inc.
Cao, Jianfang; Chen, Lichao; Wang, Min; Tian, Yun
2018-01-01
The Canny operator is widely used to detect edges in images. However, as the size of the image dataset increases, the edge detection performance of the Canny operator decreases and its runtime becomes excessive. To improve the runtime and edge detection performance of the Canny operator, in this paper, we propose a parallel design and implementation for an Otsu-optimized Canny operator using a MapReduce parallel programming model that runs on the Hadoop platform. The Otsu algorithm is used to optimize the Canny operator's dual threshold and improve the edge detection performance, while the MapReduce parallel programming model facilitates parallel processing for the Canny operator to solve the processing speed and communication cost problems that occur when the Canny edge detection algorithm is applied to big data. For the experiments, we constructed datasets of different scales from the Pascal VOC2012 image database. The proposed parallel Otsu-Canny edge detection algorithm performs better than other traditional edge detection algorithms. The parallel approach reduced the running time by approximately 67.2% on a Hadoop cluster architecture consisting of 5 nodes with a dataset of 60,000 images. Overall, our approach speeds up processing by approximately 3.4 times when handling large-scale datasets, which demonstrates the obvious superiority of our method. The proposed algorithm in this study demonstrates both better edge detection performance and improved time performance.
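The serial kernel being parallelized is essentially Canny with an Otsu-derived dual threshold. A single-image sketch with OpenCV is shown below (the MapReduce layer would apply such a map function to every image in the dataset); the file name is hypothetical, and setting the low threshold to half the Otsu value is a common heuristic rather than necessarily the paper's exact rule.

```python
import cv2

def otsu_canny(image_gray):
    """Canny edge detection with the dual threshold set from Otsu's method."""
    # Otsu picks the threshold maximizing between-class variance; reuse it
    # as Canny's high threshold, with the low threshold at half of it.
    high, _ = cv2.threshold(image_gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return cv2.Canny(image_gray, 0.5 * high, high)

img = cv2.imread("example.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
edges = otsu_canny(img)
cv2.imwrite("example_edges.png", edges)
```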
Biocellion: accelerating computer simulation of multicellular biological system models
Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya
2014-01-01
Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
The role of human papillomavirus in screening for cervical cancer.
McFadden, S E; Schumann, L
2001-03-01
To review the options for effectively screening for cervical cancer, including human papilloma virus (HPV) identification, cytologic screening, colposcopy, or a combination approach. Current pathophysiology, diagnostic criteria, treatment approaches, and patient preparation and education related to cervical cancer screening and prevention are also included. Comprehensive review of current literature, including research and review articles. Because the Papanicolaou (Pap) smear is a screening tool, not a diagnostic tool, further studies must be done to identify the actual nature of discovered abnormalities. Of particular concern is the classification of atypical squamous cells of undetermined significance (ASCUS), which may simply indicate inflammation, or may be the first indicator of serious pathology. Following ASCUS Pap smears with HPV screening will allow for a clarification of the best approach to treatment. A screening algorithm supported by a review of the literature is proposed. Cervical cancer is a preventable disease caused by certain forms of HPV. Current screening protocols are based on the use of the Pap smear, and in areas where this test is routine and available, morbidity and mortality rates have dropped dramatically. Many women throughout the world and in underserved regions of the U.S. do not have adequate access to routine screening with Pap smear technology. As long as women continue to die needlessly of cervical cancer, more comprehensive and accessible screening methods must be explored.
A Parallel Cartesian Approach for External Aerodynamics of Vehicles with Complex Geometry
NASA Technical Reports Server (NTRS)
Aftosmis, M. J.; Berger, M. J.; Adomavicius, G.
2001-01-01
This workshop paper presents the current status in the development of a new approach for the solution of the Euler equations on Cartesian meshes with embedded boundaries in three dimensions on distributed and shared memory architectures. The approach uses adaptively refined Cartesian hexahedra to fill the computational domain. Where these cells intersect the geometry, they are cut by the boundary into arbitrarily shaped polyhedra which receive special treatment by the solver. The presentation documents a newly developed multilevel upwind solver based on a flexible domain-decomposition strategy. One novel aspect of the work is its use of space-filling curves (SFC) for memory-efficient on-the-fly parallelization, dynamic re-partitioning and automatic coarse mesh generation. Within each subdomain the approach employs a variety of reordering techniques so that relevant data are on the same page in memory, permitting high performance on cache-based processors. Details of the on-the-fly SFC-based partitioning are presented, as are construction rules for the automatic coarse mesh generation. After describing the approach, the paper uses model problems and 3-D configurations to both verify and validate the solver. The model problems demonstrate that second-order accuracy is maintained despite the presence of the irregular cut-cells in the mesh. In addition, it examines both parallel efficiency and convergence behavior. These investigations demonstrate a parallel speed-up in excess of 28 on 32 processors of an SGI Origin 2000 system and confirm that mesh partitioning has no effect on convergence behavior.
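Space-filling-curve partitioning of the kind described works because sorting cells by their curve index reduces domain decomposition to cutting a 1-D array into equal chunks with good spatial locality. A sketch using Morton (Z-order) codes for integer 3-D cell coordinates (the solver's actual curve and data structures are not reproduced here):

```python
def morton3d(x, y, z, bits=10):
    """Interleave the bits of (x, y, z) into a Morton (Z-order) code."""
    code = 0
    for i in range(bits):
        code |= (((x >> i) & 1) << (3 * i)
                 | ((y >> i) & 1) << (3 * i + 1)
                 | ((z >> i) & 1) << (3 * i + 2))
    return code

def partition(cells, n_parts):
    """Sort cells along the curve, then cut into contiguous chunks."""
    order = sorted(cells, key=lambda c: morton3d(*c))
    size = -(-len(order) // n_parts)          # ceiling division
    return [order[k:k + size] for k in range(0, len(order), size)]

cells = [(x, y, z) for x in range(4) for y in range(4) for z in range(4)]
for p, chunk in enumerate(partition(cells, 4)):
    print(f"partition {p}: {len(chunk)} cells, first {chunk[0]}")
```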
Wake turbulence limits on paired approaches to parallel runways
DOT National Transportation Integrated Search
2002-07-01
Wake turbulence considerations currently restrict the use of parallel runways less than 2500 ft (762 m) apart. However, wake turbulence is not a factor if there are appropriate limits on allowed longitudinal pair spacings and/or allowed crosswind...
Barata, David; van Blitterswijk, Clemens; Habibovic, Pamela
2016-04-01
From the first microfluidic devices used for analysis of single metabolic by-products to highly complex multicompartmental co-culture organ-on-chip platforms, efforts of many multidisciplinary teams around the world have been invested in overcoming the limitations of conventional research methods in the biomedical field. Close spatial and temporal control over fluids and physical parameters, integration of sensors for direct read-out, as well as the possibility to increase throughput of screening through parallelization, multiplexing and automation, are some of the advantages of microfluidic over conventional 2D tissue culture in vitro systems. Moreover, the small volumes and relatively small cell numbers used in experimental set-ups involving microfluidics can potentially decrease research cost. On the other hand, these small volumes and numbers of cells also mean that many of the conventional molecular biology or biochemistry assays cannot be directly applied to experiments that are performed in microfluidic platforms. Development of different types of assays, and evidence that such assays are indeed a suitable alternative to conventional ones, is a step that needs to be taken in order to have microfluidics-based platforms fully adopted in biomedical research. In this review, rather than providing a comprehensive overview of the literature on microfluidics, we aim to discuss developments in the field of microfluidics that can aid advancement of biomedical research, with emphasis on the field of biomaterials. Three important topics will be discussed: screening, in particular high-throughput and combinatorial screening; mimicking of the natural microenvironment, ranging from 3D hydrogel-based cellular niches to organ-on-chip devices; and production of biomaterials with closely controlled properties. While important technical aspects of various platforms will be discussed, the focus is mainly on their applications, including the state-of-the-art, future perspectives and challenges. Microfluidics, being a technology characterized by the engineered manipulation of fluids at the submillimeter scale, offers some interesting tools that can advance biomedical research and development. Screening platforms based on microfluidic technologies that allow high-throughput and combinatorial screening may lead to breakthrough discoveries not only in basic research but also relevant to clinical application. This is further strengthened by the fact that reliability of such screens may improve, since microfluidic systems allow close mimicking of physiological conditions. Finally, microfluidic systems are also very promising as micro factories of a new generation of natural or synthetic biomaterials and constructs, with finely controlled properties. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Linker, Kevin L.; Brusseau, Charles A.
2002-01-01
A portal apparatus for screening persons or objects for the presence of trace amounts of target substances such as explosives, narcotics, radioactive materials, and certain chemical materials. The portal apparatus can have a one-sided exhaust for an exhaust stream; an interior wall configuration with a concave shape across a horizontal cross-section for each of two facing sides, resulting in improved airflow and reduced washout relative to a configuration with substantially flat parallel sides; air curtains to reduce washout; ionizing sprays to collect particles bound by static forces; as well as gas jet nozzles to dislodge particles bound by adhesion to the screened person or object. The portal apparatus can be included in a detection system with a preconcentrator and a detector.
Backes, Bradley J; Longenecker, Kenton; Hamilton, Gregory L; Stewart, Kent; Lai, Chunqiu; Kopecka, Hana; von Geldern, Thomas W; Madar, David J; Pei, Zhonghua; Lubben, Thomas H; Zinker, Bradley A; Tian, Zhenping; Ballaron, Stephen J; Stashko, Michael A; Mika, Amanda K; Beno, David W A; Kempf-Grote, Anita J; Black-Schaefer, Candace; Sham, Hing L; Trevillyan, James M
2007-04-01
A novel series of pyrrolidine-constrained phenethylamines were developed as dipeptidyl peptidase IV (DPP4) inhibitors for the treatment of type 2 diabetes. The cyclohexene ring of lead-like screening hit 5 was replaced with a pyrrolidine to enable parallel chemistry, and protein co-crystal structural data guided the optimization of N-substituents. Employing this strategy, a >400x improvement in potency over the initial hit was realized in rapid fashion. Optimized compounds are potent and selective inhibitors with excellent pharmacokinetic profiles. Compound 30 was efficacious in vivo, lowering blood glucose in ZDF rats that were allowed to feed freely on a mixed meal.
Winther, Hans A.; Koyama, Kazuya; Wright, Bill S.
We present a general parallelized and easy-to-use code to perform numerical simulations of structure formation using the COLA (COmoving Lagrangian Acceleration) method for cosmological models that exhibit scale-dependent growth at the level of first- and second-order Lagrangian perturbation theory. For modified gravity theories we also include screening, using a fast approximate method that covers all the main examples of screening mechanisms in the literature. We test the code by comparing it to full simulations of two popular modified gravity models, namely f(R) gravity and nDGP, and find good agreement in the modified gravity boost factors relative to ΛCDM even when using a fairly small number of COLA time steps.
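As a minimal illustration of the boost factor mentioned above, the sketch below computes B(k) = P_MG(k)/P_ΛCDM(k) from two power spectra. The spectra here are placeholders; the real quantities would be measured from the matter power spectra of the COLA and ΛCDM simulation outputs.

```python
import numpy as np

# Hypothetical sketch: boost factor B(k) = P_MG(k) / P_LCDM(k) used to
# compare a modified-gravity run against its LCDM counterpart.
k = np.logspace(-2, 1, 50)                # wavenumbers [h/Mpc]
p_lcdm = k**-1.5                          # placeholder LCDM spectrum
p_mg = p_lcdm * (1 + 0.1 * np.tanh(k))    # placeholder enhanced spectrum

boost = p_mg / p_lcdm                     # boost factor B(k)
print(boost[:5])                          # -> values slightly above 1
```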
Walla Walla River Basin Fish Screens Evaluations, 2006 Annual Report.
Chamness, Mickie; Abernethy, Scott; Tunnicliffe, Cherylyn
2007-01-01
Pacific Northwest National Laboratory evaluated the Gardena Farms, Little Walla Walla, and Garden City/Lowden II Phase II fish screen facilities and provided underwater videography beneath a leaking rubber dam in the Walla Walla River basin in 2006. Evaluations of the fish screen facilities took place in early May 2006, when juvenile salmonids are generally outmigrating. At the Gardena Farms site, extended high river levels caused accumulations of debris and sediment in the forebay. This debris covered parts of the bottom drum seals, which could lead to early deterioration of the seals and drum screens. Approach velocities were excessive at the upstream corners of most of the drums, with 14% of the total approach velocities exceeding 0.4 feet per second (ft/s). Consequently, the approach velocities did not meet National Marine Fisheries Service (NMFS) design criteria guidelines for juvenile fish screens. The Little Walla Walla site was found to be in good condition, with all approach, sweep, and bypass velocities within NMFS criteria. Sediment buildup was minor and did not affect the effectiveness of the screens. At Garden City/Lowden II, 94% of approach velocities met the NMFS criterion of 0.4 ft/s at any time. Sweep velocities increased toward the fish ladder. The air-burst mechanism appears to keep large debris off the screens, although it does not prevent algae and periphyton from growing on the screen face, especially near the bottom of the screens. In August 2006, Gardena Farms Irrigation District personnel requested that we look for a leak beneath the inflatable rubber dam at the Garden City/Lowden II site that was preventing water movement through the fish ladder. Using our underwater video equipment, we found a gap in the sheet piling beneath the dam. Erosion of the riverbed was occurring around this gap, allowing water and cobbles to move beneath the dam. The construction engineers and irrigation district staff were able to use the video footage to resolve the problem within a couple of weeks. We had hoped to also evaluate the effectiveness of modifications to louvers behind the Nursery Bridge screens when flows were higher than 350 cubic feet per second (cfs), but were unable to do so. Based on the one measurement made in early 2006 after the modified louvers were set, it appears the modified louvers may help reduce approach velocities. The auxiliary supply water system gates also control water through the screens. Evaluating the effect of different combinations of gate and louver positions on approach velocities through the screens may help identify optimum settings for both at different river discharges.
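For readers unfamiliar with the NMFS criterion, the toy sketch below shows the kind of check applied to the velocity measurements; the readings are invented, not the report's data.

```python
# Hypothetical sketch: flag approach-velocity readings against the NMFS
# juvenile-screen criterion of 0.4 ft/s. Readings are illustrative only.
readings_ft_s = [0.31, 0.45, 0.38, 0.42, 0.29, 0.36, 0.48, 0.33]

exceed = [v for v in readings_ft_s if v > 0.4]
pct = 100 * len(exceed) / len(readings_ft_s)
print(f"{pct:.0f}% of approach velocities exceed 0.4 ft/s")
if exceed:
    print("criterion NOT met at:", exceed)
```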
Bailey, D. H.; Barszcz, E.; Barton, J. T.; Carter, R. L.; Lasinski, T. A.; Browning, D. S.; Dagum, L.; Fatoohi, R. A.; Frederickson, P. O.; Schreiber, R. S.
1991-01-01
A new set of benchmarks has been developed for the performance evaluation of highly parallel supercomputers in the framework of the NASA Ames Numerical Aerodynamic Simulation (NAS) Program. These consist of five 'parallel kernel' benchmarks and three 'simulated application' benchmarks. Together they mimic the computation and data movement characteristics of large-scale computational fluid dynamics applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification: all details of the benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.
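As an illustration of the embarrassingly parallel style of workload that such kernel benchmarks stress (this is not one of the NAS kernels), the sketch below distributes an independent Monte Carlo computation across worker processes with essentially no inter-process communication.

```python
# Illustrative sketch only: an embarrassingly parallel Monte Carlo
# estimate of pi, showing compute-heavy, communication-light work.
import random
from multiprocessing import Pool

def count_hits(args):
    seed, n = args
    rng = random.Random(seed)            # independent stream per worker
    return sum(rng.random()**2 + rng.random()**2 <= 1.0 for _ in range(n))

if __name__ == "__main__":
    n_workers, n_per_worker = 4, 250_000
    with Pool(n_workers) as pool:
        hits = sum(pool.map(count_hits,
                            [(s, n_per_worker) for s in range(n_workers)]))
    print("pi ~", 4 * hits / (n_workers * n_per_worker))
```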
Backtracking and Re-execution in the Automatic Debugging of Parallelized Programs
Matthews, Gregory; Hood, Robert; Johnson, Stephen; Leggett, Peter; Biegel, Bryan (Technical Monitor)
2002-01-01
In this work we describe a new approach using relative debugging to find differences in computation between a serial program and a parallel version of that program. We use a combination of re-execution and backtracking to find the first difference in computation that may ultimately lead to an incorrect value the user has indicated. In our prototype implementation we use static analysis information from a parallelization tool to perform the backtracking, as well as the mapping required between serial and parallel computations.
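A minimal sketch of the trace-comparison step at the heart of this approach is given below; the record format and tolerance are assumptions for illustration, and the re-execution and backtracking machinery of the actual prototype is not modeled.

```python
# Hypothetical sketch of relative debugging's core comparison: walk
# matched traces of (step, variable, value) records from a serial run
# and a parallel run and report the first computation that differs.
def first_difference(serial_trace, parallel_trace, tol=1e-9):
    for s, p in zip(serial_trace, parallel_trace):
        (step_s, var_s, val_s), (_, _, val_p) = s, p
        if abs(val_s - val_p) > tol:
            return step_s, var_s, val_s, val_p
    return None

serial = [(0, "x", 1.0), (1, "y", 2.0), (2, "z", 3.0)]
parallel = [(0, "x", 1.0), (1, "y", 2.5), (2, "z", 3.0)]
print(first_difference(serial, parallel))   # -> (1, 'y', 2.0, 2.5)
```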
PRESTO-Tango as an open-source resource for interrogation of the druggable human GPCRome.
Kroeze, Wesley K; Sassano, Maria F; Huang, Xi-Ping; Lansu, Katherine; McCorvy, John D; Giguère, Patrick M; Sciaky, Noah; Roth, Bryan L
2015-05-01
G protein-coupled receptors (GPCRs) are essential mediators of cellular signaling and are important targets of drug action. Of the approximately 350 nonolfactory human GPCRs, more than 100 are still considered to be 'orphans' because their endogenous ligands remain unknown. Here, we describe a unique open-source resource that allows interrogation of the druggable human GPCRome via a G protein-independent β-arrestin-recruitment assay. We validate this unique platform at more than 120 nonorphan human GPCR targets, demonstrate its utility for discovering new ligands for orphan human GPCRs and describe a method (parallel receptorome expression and screening via transcriptional output, with transcriptional activation following arrestin translocation (PRESTO-Tango)) for the simultaneous and parallel interrogation of the entire human nonolfactory GPCRome.
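To illustrate how hits might be called in such a parallel reporter screen, the sketch below flags receptors whose signal rises above basal by an assumed fold-change cutoff; receptor names and readings are illustrative, not PRESTO-Tango data.

```python
# Hypothetical sketch: hit-calling for a parallel GPCR reporter screen,
# expressing each receptor's signal as fold-change over its own basal
# (vehicle) reading. All numbers are made up for illustration.
screen = {
    # receptor: (vehicle_luminescence, compound_luminescence)
    "GPR3":  (1200.0, 1350.0),
    "GPR27": ( 900.0, 5400.0),
    "GPR85": (1100.0, 1180.0),
}

FOLD_CUTOFF = 3.0                      # assumed hit threshold
for receptor, (basal, treated) in screen.items():
    fold = treated / basal
    flag = "HIT" if fold >= FOLD_CUTOFF else "   "
    print(f"{flag} {receptor}: {fold:.1f}-fold over basal")
```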
Flexible CRISPR library construction using parallel oligonucleotide retrieval
Read, Abigail; Gao, Shaojian; Batchelor, Eric
2017-01-01
CRISPR/Cas9-based gene knockout libraries have emerged as a powerful tool for functional screens. We present here a set of pre-designed human and mouse sgRNA sequences that are optimized for both high on-target potency and low off-target effects. To maximize the chance of target gene inactivation, sgRNAs were curated to target both 5′ constitutive exons and exons that encode conserved protein domains. We describe a robust and cost-effective method to construct multiple small-sized CRISPR libraries from a single oligo pool generated by array synthesis, using parallel oligonucleotide retrieval. Together, these resources provide a convenient means for individual labs to generate customized CRISPR libraries of variable size and coverage depth for functional genomics applications.
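The retrieval idea can be sketched as follows: each sub-library in the synthesized pool carries its own flanking primer sites, so a subpool-specific PCR pulls out just that library. The toy code below (sequences and primer sites are invented, not from the paper) mimics this selection in silico.

```python
# Hypothetical sketch of parallel oligonucleotide retrieval: oligos in an
# array-synthesized pool are flanked by sub-library-specific primer sites,
# so one pool yields many small libraries via subpool-specific PCR.
pool = [
    "ACGTACGTAC" + "GTTTTAGAGCTAGAAATAGC" + "TGCATGCATG",  # sub-library A
    "ACGTACGTAC" + "GCTGAAGTCATGCGTACGTA" + "TGCATGCATG",  # sub-library A
    "TTGGCCAATT" + "GACCTGAATCGGATCCAAGT" + "AACCGGTTAA",  # sub-library B
]

def retrieve(pool, fwd, rev):
    """Keep oligos carrying this sub-library's flanking primer sites
    and trim the primers off, mimicking PCR retrieval."""
    return [o[len(fwd):-len(rev)] for o in pool
            if o.startswith(fwd) and o.endswith(rev)]

sublib_a = retrieve(pool, "ACGTACGTAC", "TGCATGCATG")
print(sublib_a)   # -> the two inserts from sub-library A
```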