Developing Healthcare Data Analytics APPs with Open Data Science Tools.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong
2017-01-01
Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many of these tools, however, require researchers to write programs in languages such as Python or R, a skill set that many researchers in healthcare data analytics have not acquired. To make data science more approachable, we explored existing tools and developed a practice that helps data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can use the shared notebooks to perform analysis tasks or reproduce research results much more easily.
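A minimal sketch of the notebook-to-APP pattern described above, using pandas and ipywidgets inside Jupyter Notebook; the dataset path and column handling are invented placeholders, not the authors' code:

```python
# Sketch: wrap an analysis step in an interactive widget so collaborators
# can re-run it without touching code (file and columns are hypothetical).
import pandas as pd
import ipywidgets as widgets
from IPython.display import display

df = pd.read_csv("cohort.csv")  # hypothetical healthcare dataset

def plot_distribution(column):
    # Re-executes on every dropdown change, giving real-time feedback.
    df[column].hist(bins=30)

picker = widgets.Dropdown(options=list(df.columns), description="Variable:")
display(widgets.interactive(plot_distribution, column=picker))
```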
Hal: an automated pipeline for phylogenetic analyses of genomic data.
Robbertse, Barbara; Yoder, Ryan J; Boyd, Alex; Reeves, John; Spatafora, Joseph W
2011-02-07
The rapid increase in genomic and genome-scale data is resulting in unprecedented levels of discrete sequence data available for phylogenetic analyses. Major analytical impasses exist, however, prior to analyzing these data with existing phylogenetic software. Obstacles include the management of large data sets without standardized naming conventions, identification and filtering of orthologous clusters of proteins or genes, and the assembly of alignments of orthologous sequence data into individual and concatenated super alignments. Here we report the production of an automated pipeline, Hal, that produces multiple alignments and trees from genomic data. These alignments can be produced by a choice of four alignment programs and analyzed by a variety of phylogenetic programs. In short, the Hal pipeline connects the programs BLASTP, MCL, user-specified alignment programs, GBlocks, ProtTest and user-specified phylogenetic programs to produce species trees. The script is available at SourceForge (http://sourceforge.net/projects/bio-hal/). The results from an example analysis of Kingdom Fungi are briefly discussed.
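The kind of tool chaining Hal automates can be sketched as a short driver script; the file names are placeholders and the BLAST-to-MCL format conversion is omitted, so this illustrates the pipeline's shape rather than Hal's actual interface:

```python
# Chain the pipeline stages named in the abstract via subprocess.
import subprocess

stages = [
    ["blastp", "-query", "prot.fa", "-subject", "prot.fa",
     "-outfmt", "6", "-out", "hits.tsv"],        # all-vs-all similarity search
    ["mcl", "hits.abc", "--abc", "-o", "clusters.txt"],  # ortholog clustering
    ["mafft", "--auto", "cluster_0.fa"],         # one of the four aligner choices
    ["Gblocks", "cluster_0.aln", "-t=p"],        # trim poorly aligned columns
]
for cmd in stages:
    subprocess.run(cmd, check=True)              # stop on the first failure
```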
NASA Astrophysics Data System (ADS)
Henclik, Sławomir
2018-03-01
The influence of dynamic fluid-structure interaction (FSI) on the course of water hammer (WH) can be significant in non-rigid pipeline systems. The essence of this effect is the dynamic transfer of liquid energy to the pipeline structure and back, which is important for elastic structures and can be negligible for rigid ones. In the paper a special model of such behavior is analyzed. A straight pipeline with a steady flow, fixed to the floor with several rigid supports, is assumed. The transient is generated by a quickly closed valve installed at the end of the pipeline. FSI effects are assumed to be present mainly at the valve, which is fixed with a spring dash-pot attachment. Analysis of WH runs, especially transient pressure changes, for various stiffness and damping parameters of the spring dash-pot valve attachment is presented in the paper. The solutions are found analytically and numerically. Numerical results were computed with an in-house computer program developed on the basis of the four-equation model of WH-FSI and the specific boundary conditions formulated at the valve. Analytical solutions were found with the separation-of-variables method under slightly simplified assumptions. Damping at the dash-pot is taken into account within the numerical study. The influence of the valve attachment parameters on the WH courses was determined, and it was found that the transient amplitudes can be reduced. Such a system, an elastically attached shut-off valve in a pipeline, or an equivalent design, can be a real solution applicable in practice.
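For orientation, the classical two-equation water hammer model, which the four-equation WH-FSI model extends with the axial motion of the pipe, can be written in the standard textbook form (not the paper's exact notation):

```latex
\begin{align}
\frac{\partial H}{\partial t} + \frac{a^{2}}{g}\,\frac{\partial V}{\partial x} &= 0,\\[2pt]
\frac{\partial V}{\partial t} + g\,\frac{\partial H}{\partial x} + \frac{f\,V\lvert V\rvert}{2D} &= 0,
\end{align}
```

where H is the piezometric head, V the mean flow velocity, a the pressure-wave speed, f the friction factor and D the pipe diameter.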
Platform for Automated Real-Time High Performance Analytics on Medical Image Data.
Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A
2018-03-01
Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
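The scanner-to-HPC pattern the platform implements can be sketched as three REST calls; the endpoints, app identifier and token below are illustrative placeholders, not Agave's actual API:

```python
# Schematic only: upload a series, launch the pipeline, poll for status.
import requests

API = "https://example.org/api"                  # hypothetical gateway
auth = {"Authorization": "Bearer <token>"}       # placeholder credential

with open("series_001.nii.gz", "rb") as f:       # freshly acquired MRI series
    requests.post(f"{API}/files/upload", headers=auth, files={"file": f})

job = requests.post(f"{API}/jobs", headers=auth, json={
    "app": "grape-qmri",                         # hypothetical GRAPE app id
    "inputs": {"series": "series_001.nii.gz"},
}).json()

status = requests.get(f"{API}/jobs/{job['id']}/status", headers=auth).json()
print(status)                                    # ready for same-session review?
```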
Text-based Analytics for Biosurveillance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah
The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques, to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps as well as identifying article relevance to biosurveillance (e.g., relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
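A toy version of the article-relevance step, assuming a small labeled article set; TF-IDF features with logistic regression stand in for the chapter's unspecified machine-learning relevance algorithm:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

articles = ["Novel influenza strain reported at poultry farms ...",
            "Local team wins the championship ..."]   # hypothetical texts
labels = [1, 0]                                       # 1 = relevant to biosurveillance

relevance = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression())
relevance.fit(articles, labels)
print(relevance.predict(["Hospital reports a cluster of febrile illness"]))
```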
Yan, Yifei; Zhang, Lisong; Yan, Xiangzhen
2016-01-01
In this paper, a single-slope tunnel pipeline was analysed considering the effects of vertical earth pressure, horizontal soil pressure, inner pressure, thermal expansion force and pipeline-soil friction. The concept of a stagnation point for the pipeline was proposed. Considering the deformation compatibility condition of the pipeline elbow, the push force on the anchor blocks of a single-slope tunnel pipeline was derived based on an energy method, yielding a theoretical formula for this force. Using the analytical equation, the push force on the anchor block of an X80 large-diameter pipeline from the West-East Gas Transmission Project was determined. Meanwhile, to verify the analytical method, four finite element codes were used to calculate the push force: CAESAR II, ANSYS, AutoPIPE and ALGOR. The results show that the analytical results agree well with the numerical results, with a maximum relative error of only 4.1%. Therefore, the results obtained with the analytical method can satisfy engineering requirements. PMID:26963097
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Povarov, V. P.; Shipkov, A. A.; Gromov, A. F.; Kiselev, A. N.; Shepelev, S. V.; Galanin, A. V.
2015-02-01
Specific features relating to the development of an information-analytical system on the problem of flow-accelerated corrosion of pipeline elements in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh nuclear power plant are considered. The results from a statistical analysis of data on the quantity, location, and operating conditions of the elements and preinserted segments of pipelines used in the condensate-feedwater and wet steam paths are presented. The principles of preparing and using the information-analytical system for determining the time until inadmissible wall thinning is reached in elements of pipelines used in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh NPP are considered.
Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu
2013-08-01
High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing on NIG supercomputers, currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly, and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads from the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/.
Haller, Toomas; Leitsalu, Liis; Fischer, Krista; Nuotio, Marja-Liisa; Esko, Tõnu; Boomsma, Dorothea Irene; Kyvik, Kirsten Ohm; Spector, Tim D; Perola, Markus; Metspalu, Andres
2017-01-01
Ancestry information at the individual level can be a valuable resource for personalized medicine, medical, demographical and history research, as well as for tracing back personal history. We report a new method for quantitatively determining personal genetic ancestry based on genome-wide data. Numerical ancestry component scores are assigned to individuals based on comparisons with reference populations. These comparisons are conducted with an existing analytical pipeline making use of genotype phasing, similarity matrix computation and our addition, multidimensional best fitting by MixFit. The method is demonstrated by studying the Estonian and Finnish populations in geographical context. We show the main differences in the genetic composition of these otherwise close European populations and how they have influenced each other. The components of our analytical pipeline are freely available computer programs and scripts, one of which was developed in house (available at: www.geenivaramu.ee/en/tools/mixfit).
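The best-fitting step can be pictured as constrained least squares: nonnegative mixture weights over reference populations, normalized to sum to one, that best reproduce an individual's similarity profile. A sketch with invented numbers, not the MixFit implementation:

```python
import numpy as np
from scipy.optimize import nnls

# Columns of A: similarity of each reference population to three anchor
# populations; b: the individual's own similarity profile (all invented).
A = np.array([[0.92, 0.38, 0.30],
              [0.40, 0.95, 0.44],
              [0.31, 0.45, 0.90]])
b = np.array([0.70, 0.75, 0.40])

w, _ = nnls(A, b)                 # nonnegative least-squares weights
w = w / w.sum()                   # normalize to ancestry proportions
print(dict(zip(["EST", "FIN", "OTHER"], w.round(3))))
```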
Closha: bioinformatics workflow system for the analysis of massive sequencing data.
Ko, GunHwan; Kim, Pan-Gyu; Yoon, Jongcheol; Han, Gukhee; Park, Seong-Jin; Song, Wangho; Lee, Byungwook
2018-02-19
While next-generation sequencing (NGS) costs have fallen in recent years, the cost and complexity of computation remain substantial obstacles to the use of NGS in biomedical care and genomic research. The rapidly increasing amounts of data available from the new high-throughput methods have made data processing infeasible without automated pipelines. The integration of data and analytic resources into workflow systems provides a solution to the problem by simplifying the task of data analysis. To address this challenge, we developed a cloud-based workflow management system, Closha, to provide fast and cost-effective analysis of massive genomic data. We implemented complex workflows making optimal use of high-performance computing clusters. Closha allows users to create multi-step analyses using drag and drop functionality and to modify the parameters of pipeline tools. Users can also import Galaxy pipelines into Closha. Closha is a hybrid system that enables users to use both analysis programs providing traditional tools and MapReduce-based big data analysis programs simultaneously in a single pipeline. Thus, the execution of analytics algorithms can be parallelized, speeding up the whole process. We also developed a high-speed data transmission solution, KoDS, to transmit a large amount of data at a fast rate. KoDS has a file transfer speed of up to 10 times that of normal FTP and HTTP. The computer hardware for Closha is 660 CPU cores and 800 TB of disk storage, enabling 500 jobs to run at the same time. Closha is a scalable, cost-effective, and publicly available web service for large-scale genomic data analysis. Closha supports the reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Closha provides a user-friendly interface for all genomic scientists to derive accurate results from NGS platform data. The Closha cloud server is freely available at http://closha.kobic.re.kr/.
Creation and Implementation of a Workforce Development Pipeline Program at MSFC
NASA Technical Reports Server (NTRS)
Hix, Billy
2003-01-01
Within the context of NASA's Education Programs, this Workforce Development Pipeline guide describes the goals and objectives of MSFC's Workforce Development Pipeline Program as well as the principles and strategies for guiding implementation. It is designed to support the initiatives described in the NASA Implementation Plan for Education, 1999-2003 (EP-1998-12-383-HQ) and represents the vision of the members of the Education Programs office at MSFC. This document: 1) Outlines NASA's contribution to national priorities; 2) Sets the context for the Workforce Development Pipeline Program; 3) Describes Workforce Development Pipeline Program strategies; 4) Articulates the Workforce Development Pipeline Program goals and aims; 5) Lists the actions to build a unified approach; 6) Outlines the Workforce Development Pipeline Program's guiding principles; and 7) Presents the results of implementation.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...
Identification of missing variants by combining multiple analytic pipelines.
Ren, Yingxue; Reddy, Joseph S; Pottier, Cyril; Sarangi, Vivekananda; Tian, Shulan; Sinnwell, Jason P; McDonnell, Shannon K; Biernacka, Joanna M; Carrasquillo, Minerva M; Ross, Owen A; Ertekin-Taner, Nilüfer; Rademakers, Rosa; Hudson, Matthew; Mainzer, Liudmila Sergeevna; Asmann, Yan W
2018-04-16
After decades of identifying risk factors using array-based genome-wide association studies (GWAS), genetic research of complex diseases has shifted to sequencing-based rare variant discovery. This requires large sample sizes for statistical power and has raised questions about whether current variant calling practices are adequate for large cohorts. It is well known that there are discrepancies between variants called by different pipelines, and that using a single pipeline always misses true variants exclusively identifiable by other pipelines. Nonetheless, it is common practice today to call variants with one pipeline, due to computational cost, and to assume that false negative calls are a small percentage of the total. We analyzed 10,000 exomes from the Alzheimer's Disease Sequencing Project (ADSP) using multiple analytic pipelines consisting of different read aligners and variant calling strategies. We compared variants identified by using two aligners in 50, 100, 200, 500, 1000, and 1952 samples, and compared variants identified by adding single-sample genotyping to the default multi-sample joint genotyping in 50, 100, 500, 2000, 5000, and 10,000 samples. We found that using a single pipeline missed increasing numbers of high-quality variants as sample size grew. By combining two read aligners and two variant calling strategies, we rescued 30% of pass-QC variants at a sample size of 2000, and 56% at 10,000 samples. The rescued variants had higher proportions of low frequency (minor allele frequency [MAF] 1-5%) and rare (MAF < 1%) variants, which are the very type of variants of interest. In 660 Alzheimer's disease cases with earlier onset ages of ≤65, 4 out of 13 (31%) previously published rare pathogenic and protective mutations in the APP, PSEN1, and PSEN2 genes were undetected by the default one-pipeline approach but recovered by the multi-pipeline approach. Identification of the complete variant set from sequencing data is the prerequisite of genetic association analyses. The current analytic practice of calling genetic variants from sequencing data using a single bioinformatics pipeline is no longer adequate for increasingly large projects. The number and percentage of quality variants that passed quality filters but are missed by the one-pipeline approach rapidly increases with sample size.
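The union step at the heart of the multi-pipeline approach reduces to merging call sets keyed by (chromosome, position, ref, alt); a simplified sketch with hypothetical file names and minimal VCF parsing:

```python
def variant_keys(vcf_path):
    """Collect (chrom, pos, ref, alt) keys from a VCF file (simplified)."""
    keys = set()
    with open(vcf_path) as vcf:
        for line in vcf:
            if line.startswith("#"):
                continue                      # skip header lines
            chrom, pos, _id, ref, alt = line.split("\t")[:5]
            keys.add((chrom, int(pos), ref, alt))
    return keys

bwa_calls = variant_keys("bwa_gatk.vcf")         # pipeline 1 (hypothetical)
novo_calls = variant_keys("novoalign_gatk.vcf")  # pipeline 2 (hypothetical)
rescued = (bwa_calls | novo_calls) - (bwa_calls & novo_calls)
print(f"{len(rescued)} variants were called by only one pipeline")
```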
77 FR 15453 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... information collection titled, ``Gas Pipeline Safety Program Certification and Hazardous Liquid Pipeline... collection request that PHMSA will be submitting to OMB for renewal titled, ``Gas Pipeline Safety Program...
Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong
2016-01-01
Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated using analytic hierarchy process (AHP) theory and fuzzy mathematics. A fault tree of third-party damage containing 56 basic events was built through hazard identification of third-party damage. Fuzzy evaluation of the basic event probabilities was conducted by the expert judgment method using membership functions of fuzzy sets. The weight of each expert was determined and the evaluation opinions were modified using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a large provincial capital city as an example, the risk assessment structure of the method was shown to conform to the actual situation, providing a basis for safety risk prevention.
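Two ingredients of the method, weighted aggregation of expert judgments given as triangular fuzzy numbers and centroid defuzzification, can be sketched as follows; the probabilities and the fixed expert weights (standing in for the AHP-derived ones) are invented:

```python
import numpy as np

# Each row: one expert's (low, mode, high) triangular fuzzy estimate of a
# basic event's failure probability (all numbers invented).
judgments = np.array([[1e-4, 5e-4, 1e-3],
                      [2e-4, 6e-4, 2e-3],
                      [1e-4, 4e-4, 9e-4]])
expert_weights = np.array([0.5, 0.3, 0.2])   # stand-in for AHP weights

fuzzy = expert_weights @ judgments           # aggregated (low, mode, high)
crisp = fuzzy.sum() / 3                      # centroid defuzzification
print(f"basic-event probability ~ {crisp:.2e}")
```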
NASA Technical Reports Server (NTRS)
Brownston, Lee; Jenkins, Jon M.
2015-01-01
The Kepler Mission was launched in 2009 as NASA's first mission capable of finding Earth-size planets in the habitable zone of Sun-like stars. Its telescope consists of a 1.5-m primary mirror and a 0.95-m aperture. The 42 charge-coupled devices in its focal plane are read out every half hour, compressed, and then downlinked monthly. After four years, the second of four reaction wheels failed, ending the original mission. Back on Earth, the Science Operations Center developed the Science Pipeline to analyze about 200,000 target stars in Kepler's field of view, looking for evidence of periodic dimming suggesting that one or more planets had crossed the face of its host star. The Pipeline comprises several steps, from pixel-level calibration, through noise and artifact removal, to detection of transit-like signals and the construction of a suite of diagnostic tests to guard against false positives. The Kepler Science Pipeline consists of a pipeline infrastructure written in the Java programming language, which marshals data input to and output from MATLAB applications that are executed as external processes. The pipeline modules, which underwent continuous development and refinement even after data started arriving, employ several analytic techniques, many developed for the Kepler Project. Because of the large number of targets, the large amount of data per target, and the complexity of the pipeline algorithms, the processing demands are daunting. Some pipeline modules require days to weeks to process all of their targets, even when run on NASA's 128-node Pleiades supercomputer. The software developers are still seeking ways to increase the throughput. To date, the Kepler project has discovered more than 4000 planetary candidates, of which more than 1000 have been independently confirmed or validated to be exoplanets. Funding for this mission is provided by NASA's Science Mission Directorate.
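A toy version of the transit-search idea, phase-folding a light curve at a trial period and measuring the depth of any periodic dimming; this illustrates the concept only and is not the pipeline's detection algorithm:

```python
import numpy as np

def fold_depth(time, flux, period, n_bins=100):
    """Phase-fold a light curve and return the depth of the deepest bin."""
    phase = (time % period) / period
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    sums = np.bincount(bins, weights=flux, minlength=n_bins)
    counts = np.maximum(np.bincount(bins, minlength=n_bins), 1)
    binned = sums / counts                    # mean flux per phase bin
    return binned.mean() - binned.min()       # transit-like dip depth

# Scanning fold_depth over a grid of trial periods gives a crude periodogram.
```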
49 CFR 192.911 - What are the elements of an integrity management program?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.911 What are the elements of an integrity management program...
49 CFR 192.913 - When may an operator deviate its program from certain requirements of this subpart?
Code of Federal Regulations, 2010 CFR
2010-10-01
... Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.913 When may an operator deviate its program...
49 CFR 192.945 - What methods must an operator use to measure program effectiveness?
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.945 What methods must an operator use to measure program...
Algorithms for parallel flow solvers on message passing architectures
NASA Technical Reports Server (NTRS)
Vanderwijngaart, Rob F.
1995-01-01
The purpose of this project has been to identify and test suitable technologies for implementation of fluid flow solvers -- possibly coupled with structures and heat equation solvers -- on MIMD parallel computers. In the course of this investigation much attention has been paid to efficient domain decomposition strategies for ADI-type algorithms. Multi-partitioning derives its efficiency from the assignment of several blocks of grid points to each processor in the parallel computer. A coarse-grain parallelism is obtained, and a near-perfect load balance results. In uni-partitioning every processor receives responsibility for exactly one block of grid points instead of several. This necessitates fine-grain pipelined program execution in order to obtain a reasonable load balance. Although fine-grain parallelism is less desirable on many systems, especially high-latency networks of workstations, uni-partition methods are still in wide use in production codes for flow problems. Consequently, it remains important to achieve good efficiency with this technique, even though it has essentially been superseded by multi-partitioning for parallel ADI-type algorithms. Another reason for the concentration on improving the performance of pipeline methods is their applicability in other types of flow solver kernels with stronger implied data dependence. Analytical expressions can be derived for the size of the dynamic load imbalance incurred in traditional pipelines. From these, the optimal first-processor retardation that leads to the shortest total completion time for the pipeline process can be determined. Theoretical predictions of pipeline performance with and without optimization match experimental observations on the iPSC/860 very well. Analysis of pipeline performance also highlights the effect of careless grid partitioning in flow solvers that employ pipeline algorithms. If grid blocks at boundaries are not at least as large in the wall-normal direction as those immediately adjacent to them, then the first processor in the pipeline will receive a computational load that is less than that of subsequent processors, magnifying the pipeline slowdown effect. Extra compensation is needed for grid boundary effects, even if all grid blocks are equally sized.
1997 annual report : environmental monitoring program Louisiana offshore oil port pipeline.
DOT National Transportation Integrated Search
1998-06-01
The Louisiana Offshore Oil Port (LOOP) Environmental Monitoring Program includes an onshore pipeline vegetation and wildlife survey as a continuing study designed to measure the immediate and long-term impacts of LOOP-related pipeline construction an...
77 FR 61825 - Pipeline Safety: Notice of Public Meeting on Pipeline Data
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... program performance measures for gas distribution, gas transmission, and hazardous liquids pipelines. The... distribution pipelines (49 CFR 192.1007(e)), gas transmission pipelines (49 CFR 192.945) and hazardous liquids...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-26
... From OMB of One Current Public Collection of Information: Pipeline Corporate Security Review Program... current security practices in the pipeline industry by way of TSA's Pipeline Corporate Security Review... Collection Requirement The TSA Pipeline Security Branch is responsible for conducting Pipeline Corporate...
77 FR 51848 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Program for Gas Distribution Pipelines. DATES: Interested persons are invited to submit comments on or.... These regulations require operators of hazardous liquid pipelines and gas pipelines to develop and...
BigDataScript: a scripting language for data pipelines.
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
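A schematic snippet in the spirit of BDS's published examples; the file names and the shell command are placeholders:

```
#!/usr/bin/env bds

string reads = "reads.fq"        # placeholder input
string bam   = "aligned.bam"     # placeholder output

# The task runs only if 'bam' is missing or older than 'reads' (lazy
# processing); where it runs (laptop, cluster, cloud) is the runtime's job.
task( bam <- reads ) {
    sys bwa mem ref.fa $reads > $bam
}
wait    # barrier: block until all scheduled tasks have finished
```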
77 FR 19799 - Pipeline Safety: Pipeline Damage Prevention Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
...,602 to $3,445,975. Evaluating just the lower range of benefits over ten years results in a total... consequences resulting from excavation damage to pipelines. A comprehensive damage prevention program requires..., including that resulting from excavation, digging, and other impacts, is also precipitated by operators...
PRADA: pipeline for RNA sequencing data analysis.
Torres-García, Wandaliz; Zheng, Siyuan; Sivachenko, Andrey; Vegesna, Rahulsimham; Wang, Qianghu; Yao, Rong; Berger, Michael F; Weinstein, John N; Getz, Gad; Verhaak, Roel G W
2014-08-01
Technological advances in high-throughput sequencing necessitate improved computational tools for processing and analyzing large-scale datasets in a systematic automated manner. For that purpose, we have developed PRADA (Pipeline for RNA-Sequencing Data Analysis), a flexible, modular and highly scalable software platform that provides many different types of information available by multifaceted analysis starting from raw paired-end RNA-seq data: gene expression levels, quality metrics, detection of unsupervised and supervised fusion transcripts, detection of intragenic fusion variants, homology scores and fusion frame classification. PRADA uses a dual-mapping strategy that increases sensitivity and refines the analytical endpoints. PRADA has been used extensively and successfully in the glioblastoma and renal clear cell projects of The Cancer Genome Atlas program. Availability: http://sourceforge.net/projects/prada/. Contact: gadgetz@broadinstitute.org or rverhaak@mdanderson.org. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved.
75 FR 32836 - Pipeline Safety: Workshop on Public Awareness Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-09
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... American Public Gas Association Association of Oil Pipelines American Petroleum Institute Interstate... the pipeline industry). Hazardous Liquid Gas Transmission/Gathering Natural Gas Distribution (10...
Automated Monitoring of Pipeline Rights-of-Way
NASA Technical Reports Server (NTRS)
Frost, Chard Ritchie
2010-01-01
NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.
Long-Term Monitoring of Cased Pipelines Using Longrange Guided-Wave Technique
DOT National Transportation Integrated Search
2009-05-19
Integrity management programs for gas transmission pipelines are required by The Office of Pipeline Safety (OPS)/DOT. Direct Assessment (DA) and 'Other Technologies' have become the focus of assessment options for pipeline integrity on cased crossing...
The Vulnerability Formation Mechanism and Control Strategy of the Oil and Gas Pipeline City
NASA Astrophysics Data System (ADS)
Chen, Y. L.; Han, L.
2017-12-01
Most oil and gas pipelines in China have been in service for more than 25 years. These pipes are buried underground and are difficult to inspect routinely. In addition, they are vulnerable to environmental impacts, corrosion and natural disasters, so accidents have a hidden character. Rapid urbanization, population concentration, dense construction and insufficient safety clearances are all reasons for the frequent accidents on oil and gas pipelines. It is therefore vitally important to appraise the safety condition of oil and gas pipelines across a city's various districts. To help ensure the safety of the oil and gas pipeline city, this paper defines the connotation of oil and gas pipeline city vulnerability in light of previous vulnerability research. Then, from the three perspectives of environment, structure and behavior, and based on the "structure-conduct-performance" analytical paradigm for oil and gas pipeline vulnerability, the indicators influencing the vulnerability of oil and gas pipelines were analysed, and a framework for the vulnerability formation mechanism of the oil and gas pipeline city was constructed. Finally, the paper proposes control strategies for reducing the vulnerability index of the oil and gas pipeline city, enabling city-level vulnerability evaluation and providing new ideas for the sustainable development of the city.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, E.A.; Smed, P.F.; Bryndum, M.B.
The paper describes the numerical program, PIPESIN, that simulates the behavior of a pipeline placed on an erodible seabed. PIPEline Seabed INteraction from installation until a stable pipeline-seabed configuration has occurred is simulated in the time domain, including all important physical processes. The program is the result of the joint research project, "Free Span Development and Self-lowering of Offshore Pipelines", sponsored by EU and a group of companies and carried out by the Danish Hydraulic Institute and Delft Hydraulics. The basic modules of PIPESIN are described. The description of the scouring processes has been based on and verified through physical model tests carried out as part of the research project. The program simulates a section of the pipeline (typically 500 m) in the time domain, the main input being time series of the waves and current. The main results include predictions of the onset of free spans, their length distribution, their variation in time, and the lowering of the pipeline as a function of time.
Black Radicals Make for Bad Citizens: Undoing the Myth of the School to Prison Pipeline
ERIC Educational Resources Information Center
Sojoyner, Damien M.
2013-01-01
Over the past ten years, the analytic formation of the school to prison pipeline has come to dominate the lexicon and general common sense with respect to the relationship between schools and prisons in the United States. The concept and theorization that undergirds its meaning and function do not address the root causes that are central to…
Meta-analysis of human genome-microbiome association studies: the MiBioGen consortium initiative.
Wang, Jun; Kurilshikov, Alexander; Radjabzadeh, Djawad; Turpin, Williams; Croitoru, Kenneth; Bonder, Marc Jan; Jackson, Matthew A; Medina-Gomez, Carolina; Frost, Fabian; Homuth, Georg; Rühlemann, Malte; Hughes, David; Kim, Han-Na; Spector, Tim D; Bell, Jordana T; Steves, Claire J; Timpson, Nicolas; Franke, Andre; Wijmenga, Cisca; Meyer, Katie; Kacprowski, Tim; Franke, Lude; Paterson, Andrew D; Raes, Jeroen; Kraaij, Robert; Zhernakova, Alexandra
2018-06-08
In recent years, human microbiota, especially gut microbiota, have emerged as an important yet complex trait influencing human metabolism, immunology, and diseases. Many studies are investigating the forces underlying the observed variation, including the human genetic variants that shape human microbiota. Several preliminary genome-wide association studies (GWAS) have been completed, but more are necessary to achieve a fuller picture. Here, we announce the MiBioGen consortium initiative, which has assembled 18 population-level cohorts and some 19,000 participants. Its aim is to generate new knowledge for the rapidly developing field of microbiota research. Each cohort has surveyed the gut microbiome via 16S rRNA sequencing and genotyped their participants with full-genome SNP arrays. We have standardized the analytical pipelines for both the microbiota phenotypes and genotypes, and all the data have been processed using identical approaches. Our analysis of microbiome composition shows that we can reduce the potential artifacts introduced by technical differences in generating microbiota data. We are now in the process of benchmarking the association tests and performing meta-analyses of genome-wide associations. All pipeline and summary statistics results will be shared using public data repositories. We present the largest consortium to date devoted to microbiota-GWAS. We have adapted our analytical pipelines to suit multi-cohort analyses and expect to gain insight into host-microbiota cross-talk at the genome-wide level. And, as an open consortium, we invite more cohorts to join us (by contacting one of the corresponding authors) and to follow the analytical pipeline we have developed.
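Per-cohort GWAS results of this kind are typically combined by inverse-variance fixed-effect meta-analysis; a minimal sketch with invented numbers (the consortium's actual pipeline is not specified here):

```python
import numpy as np

# Effect sizes and standard errors for one SNP-taxon pair across three
# hypothetical cohorts.
betas = np.array([0.12, 0.08, 0.15])
ses   = np.array([0.05, 0.04, 0.07])

w = 1.0 / ses**2                          # inverse-variance weights
beta_meta = np.sum(w * betas) / np.sum(w)
se_meta = np.sqrt(1.0 / np.sum(w))
print(f"meta beta={beta_meta:.3f}, se={se_meta:.3f}, z={beta_meta/se_meta:.2f}")
```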
Carthon, J. Margo Brooks; Nguyen, Thai-Huy; Chittams, Jesse; Park, Elizabeth; Guevara, James
2015-01-01
Objectives: The purpose of this study was to identify common components of diversity pipeline programs across a national sample of nursing institutions and determine what effect these programs have on increasing underrepresented minority enrollment and graduation. Design: Linked data from an electronic survey conducted November 2012 to March 2013 and American Association of Colleges of Nursing baccalaureate graduation and enrollment data (2008 and 2012). Participants: Academic and administrative staff of 164 nursing schools in 26 states, including Puerto Rico, in the United States. Methods: Chi-square statistics were used to (1) describe organizational features of nursing diversity pipeline programs and (2) determine significant trends in underrepresented minorities' graduation and enrollment between nursing schools with and without diversity pipeline programs. Results: Twenty percent (n = 33) of surveyed nursing schools reported a structured diversity pipeline program. The most frequent program measures associated with pipeline programs included mentorship, academic, and psychosocial support. Asian, Hispanic, and Native Hawaiian/Pacific Islander nursing student enrollment increased between 2008 and 2012. Hispanic/Latino graduation rates increased (7.9%-10.4%, p = .001), but they decreased among Black (6.8%-5.0%, p = .004) and Native American/Pacific Islander students (2.1%-0.3%, p ≤ .001). Conclusions: Nursing diversity pipeline programs are associated with increases in nursing school enrollment and graduation for some, although not all, minority students. Future initiatives should build on current trends while creating targeted strategies to reverse downward graduation trends among Black, Native American, and Pacific Island nursing students. PMID:24880900
Germaine Reyes-French; Timothy J. Cohen
1991-01-01
This paper outlines a mitigation program for pipeline construction impacts to oak tree habitat by describing the requirements for the Offsite Oak Mitigation Program for the All American Pipeline (AAPL) in Santa Barbara County, California. After describing the initial environmental analysis, the County regulatory structure is described under which the plan was required...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Management Programs AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... Nation's gas distribution pipeline systems through development of inspection methods and guidance for the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Evaluations AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice... improve performance. For gas transmission pipelines, Sec. Sec. 192.911(i) and 192.945 define the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-03
... Information Collection Activity Under OMB Review: Pipeline Corporate Security Review AGENCY: Transportation.... Information Collection Requirement Title: Pipeline Corporate Security Review (PCSR). Type of Request... current industry security practices through its Pipeline Corporate Security Review (PCSR) program. The...
49 CFR 192.909 - How can an operator change its integrity management program?
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.909 How can an operator change its integrity management...
ERIC Educational Resources Information Center
Pinckney, Charlyene Carol
2014-01-01
The current study was undertaken to examine the effectiveness of the Rowan University-School of Osteopathic Medicine - Summer Pre-Medical Research and Education Program (Summer PREP), a postsecondary medical sciences enrichment pipeline program for under-represented and disadvantaged students. Thirty-four former program participants were surveyed…
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 3 2013-10-01 2013-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 3 2012-10-01 2012-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
49 CFR 195.452 - Pipeline integrity management in high consequence areas.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 3 2014-10-01 2014-10-01 false Pipeline integrity management in high consequence... Management § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... by this section must: (1) Develop a written integrity management program that addresses the risks on...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.915 What knowledge...
Demons registration for in vivo and deformable laser scanning confocal endomicroscopy.
Chiew, Wei-Ming; Lin, Feng; Seah, Hock Soon
2017-09-01
A critical effect found in noninvasive in vivo endomicroscopic imaging modalities is image distortions due to sporadic movement exhibited by living organisms. In three-dimensional confocal imaging, this effect results in a dataset that is tilted across deeper slices. Apart from that, the sequential flow of the imaging-processing pipeline restricts real-time adjustments due to the unavailability of information obtainable only from subsequent stages. To solve these problems, we propose an approach to render Demons-registered datasets as they are being captured, focusing on the coupling between registration and visualization. To improve the acquisition process, we also propose a real-time visual analytics tool, which complements the imaging pipeline and the Demons registration pipeline with useful visual indicators to provide real-time feedback for immediate adjustments. We highlight the problem of deformation within the visualization pipeline for object-ordered and image-ordered rendering. Visualizations of critical information including registration forces and partial renderings of the captured data are also presented in the analytics system. We demonstrate the advantages of the algorithmic design through experimental results with both synthetically deformed datasets and actual in vivo, time-lapse tissue datasets expressing natural deformations. Remarkably, this algorithm design is for embedded implementation in intelligent biomedical imaging instrumentation with customizable circuitry. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
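At the core of the registration step is Thirion's demons update, a displacement field driven by the intensity mismatch and the fixed image's gradient; a minimal 2-D NumPy sketch, not the authors' implementation:

```python
import numpy as np

def demons_step(fixed, moving):
    """One demons update: per-pixel displacement (ux, uy) toward 'fixed'."""
    gy, gx = np.gradient(fixed)            # fixed-image gradient
    diff = moving - fixed                  # intensity mismatch
    denom = gx**2 + gy**2 + diff**2
    denom[denom == 0] = 1.0                # guard against division by zero
    ux = diff * gx / denom                 # classic demons force
    uy = diff * gy / denom
    return ux, uy                          # typically Gaussian-smoothed next
```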
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages each addressing separate challenge - workflow integration, parallel execution in either cloud or HPC environments and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD), where we are developing a new QA pipeline for the 25PB system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
SADE is a software package for rapidly assembling analytic pipelines to manipulate data. The package consists of the engine that manages the data and coordinates the movement of data between the tasks performing a function; a set of core libraries consisting of plugins that perform common tasks; and a framework to extend the system, supporting the development of new plugins. Currently, through configuration files, a pipeline can be defined that maps the routing of data through a series of plugins. Pipelines can be run in a batch mode or can process streaming data; they can be executed from the command line or run through a Windows background service. There currently exist over a hundred plugins and over fifty pipeline configurations, and the software is now being used by about a half-dozen projects.
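The engine-plus-plugins architecture described can be pictured in a few lines of Python; this is a schematic of the pattern, not SADE's actual API:

```python
from typing import Callable, Iterable

Plugin = Callable[[dict], dict]            # a plugin transforms one record

def lowercase_text(record: dict) -> dict:
    record["text"] = record["text"].lower()
    return record

def add_length(record: dict) -> dict:
    record["length"] = len(record["text"])
    return record

def run_pipeline(records: Iterable[dict], plugins: list[Plugin]):
    for record in records:                 # works for batch or streaming input
        for plugin in plugins:
            record = plugin(record)
        yield record

config = [lowercase_text, add_length]      # stand-in for a configuration file
print(list(run_pipeline([{"text": "Hello SADE"}], config)))
```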
Modelling of non-equilibrium flow in the branched pipeline systems
NASA Astrophysics Data System (ADS)
Sumskoi, S. I.; Sverchkov, A. M.; Lisanov, M. V.; Egorov, A. F.
2016-09-01
This article presents a mathematical model and a numerical method for solving the problem of water hammer in a branched pipeline system. The problem is considered in a one-dimensional, non-stationary formulation that takes into account realities such as changes in the diameter of the pipeline and its branches. Comparison with an existing analytic solution shows that the proposed method possesses good accuracy. With the help of the developed model and numerical method, the problem of transmitting a complex of compression waves through a branching pipeline system when several shut-off valves operate has been solved. It should be noted that the proposed model and method may be readily adapted to a number of other problems, for example, describing the flow of blood in vessels.
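The authors' scheme for branched systems is not reproduced in the abstract; as a point of reference, a minimal method-of-characteristics solver for a single frictionless pipe with a quickly closed valve is sketched below, with all parameter values being illustrative assumptions:

```python
import numpy as np

# Minimal method-of-characteristics (MOC) water-hammer sketch for a single
# frictionless pipe; all values are illustrative.
a, L, N = 1200.0, 1000.0, 50        # wave speed (m/s), pipe length (m), segments
dx = L / N
dt = dx / a                          # Courant condition ties the time step to dx
g, H0, V0 = 9.81, 100.0, 1.0         # gravity, reservoir head (m), initial velocity (m/s)
B = a / g                            # characteristic impedance in head units

H = np.full(N + 1, H0)               # piezometric head along the pipe
V = np.full(N + 1, V0)               # flow velocity along the pipe
peak = H[-1]

for step in range(200):
    Hn, Vn = H.copy(), V.copy()
    # interior nodes: intersect the C+ and C- characteristics
    Cp = H[:-2] + B * V[:-2]         # C+ carried from the upstream neighbor
    Cm = H[2:] - B * V[2:]           # C- carried from the downstream neighbor
    Hn[1:-1] = 0.5 * (Cp + Cm)
    Vn[1:-1] = (Cp - Cm) / (2.0 * B)
    # upstream boundary: constant-head reservoir
    Hn[0] = H0
    Vn[0] = (H0 - (H[1] - B * V[1])) / B
    # downstream boundary: instantaneously closed valve (V = 0)
    Vn[-1] = 0.0
    Hn[-1] = H[-2] + B * V[-2]
    H, V = Hn, Vn
    peak = max(peak, H[-1])

print(f"peak head at the valve: {peak:.1f} m")  # about H0 + a*V0/g (Joukowsky surge)
```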
Optimal Energy Consumption Analysis of Natural Gas Pipeline
Liu, Enbin; Li, Changjun; Yang, Yi
2014-01-01
There are many compressor stations along long-distance natural gas pipelines. Natural gas can be transported using different boot programs and import pressures, combined with temperature control parameters, and different transport methods have correspondingly different energy consumptions. At present, the operating parameters of many pipelines are determined empirically by dispatchers, resulting in high energy consumption that runs counter to energy-reduction policies. Therefore, based on a full understanding of the actual needs of pipeline companies, we introduce production unit consumption indicators to establish an objective function aimed at lowering energy consumption. Solving the model with a dynamic programming method and packaging the calculation in software ensures that the solution process is quick and efficient. Using the established optimization methods, we analyzed the energy savings for the XQ gas pipeline. By optimizing the boot program, the import station pressure, and the temperature parameters, we achieved the optimal energy consumption; comparison with the measured energy consumption shows the pipeline has the potential to reduce energy consumption by 11 to 16 percent. PMID:24955410
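The dynamic programming formulation named above can be sketched compactly: discretize each station's outlet pressure and keep, for every reachable pressure level, the cheapest plan so far. The energy and feasibility functions below are made-up stand-ins for the paper's production unit consumption model:

```python
def optimize_stations(p0, n_stations, pressures, energy, feasible):
    """Dynamic programming over discrete outlet-pressure levels per station.
    energy(i, p_in, p_out) -> energy used by station i for that compression;
    feasible(p_in, p_out)  -> whether the station can achieve it.
    Returns (minimum total energy, outlet-pressure plan)."""
    # best[p] = (cheapest total energy reaching this boundary at pressure p, plan)
    best = {p0: (0.0, [])}
    for i in range(n_stations):
        nxt = {}
        for p_out in pressures:
            cands = [(cost + energy(i, p_in, p_out), plan + [p_out])
                     for p_in, (cost, plan) in best.items()
                     if feasible(p_in, p_out)]
            if cands:
                nxt[p_out] = min(cands)
        best = nxt
    return min(best.values())

# Illustrative stand-in for the unit-consumption model: energy grows with the
# compression ratio; friction losses between stations are folded in as zero here.
pressures = [6.0, 7.0, 8.0]   # candidate outlet pressures, MPa
cost, plan = optimize_stations(
    p0=5.0, n_stations=4, pressures=pressures,
    energy=lambda i, p_in, p_out: 10.0 * max(p_out / p_in - 1.0, 0.0),
    feasible=lambda p_in, p_out: p_out > 0.9 * p_in)
print(f"minimum energy {cost:.2f}, outlet pressures {plan}")
```

Because only the outlet-pressure level is carried from station to station, the search grows linearly with the number of stations rather than exponentially with the number of parameter combinations.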
77 FR 31827 - Pipeline Safety: Pipeline Damage Prevention Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-30
...://www.regulations.gov . FOR FURTHER INFORMATION CONTACT: For further information contact Sam Hall, Program Manager, PHMSA by email at sam[email protected] or by telephone at (804) 556-4678 or Larry White...
ECDA of Cased Pipeline Segments
DOT National Transportation Integrated Search
2010-06-01
On June 28, 2007, PHMSA released a Broad Agency Announcement (BAA), DTPH56-07-BAA-000002, seeking white papers on individual projects and consolidated Research and Development (R&D) programs addressing topics in the pipeline safety program. Although not...
A Study Skills Curriculum for Pipeline Programs.
ERIC Educational Resources Information Center
Saks, Norma Susswein, Ed.; Killeya, Ley A., Ed.; Rushton, Joan, Ed.
This study skills curriculum is part of a "pipeline" program designed to recruit, matriculate, and graduate educationally disadvantaged students at the University of Medicine and Dentistry of New Jersey-Robert Wood Johnson Medical School (UMDNJ-RWJMS). It is an integral part of the Biomedical Careers Program (BCP) and the Science…
VPipe: Virtual Pipelining for Scheduling of DAG Stream Query Plans
NASA Astrophysics Data System (ADS)
Wang, Song; Gupta, Chetan; Mehta, Abhay
There are data streams all around us that can be harnessed for tremendous business and personal advantage. For an enterprise-level stream processing system such as CHAOS [1] (Continuous, Heterogeneous Analytic Over Streams), handling complex query plans under resource constraints is challenging. While several scheduling strategies exist for stream processing, efficient scheduling of complex DAG query plans remains largely unsolved. In this paper, we propose a novel execution scheme for scheduling complex directed acyclic graph (DAG) query plans with metadata-enriched stream tuples. Our solution, called Virtual Pipelined Chain (or VPipe Chain for short), effectively extends the "Chain" pipelining scheduling approach to complex DAG query plans.
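VPipe's own scheduling algorithm is not reproduced in the abstract; as a minimal illustration of the underlying idea, covering an operator DAG with linear chains so that each chain can be scheduled as one pipeline, consider the following sketch (the operator names and the greedy policy are illustrative assumptions):

```python
from collections import defaultdict

def chain_cover(edges):
    """Greedily cover a DAG of operators with linear chains; each chain can
    then be scheduled as one virtual pipeline."""
    succs, indeg, nodes = defaultdict(list), defaultdict(int), set()
    for u, v in edges:
        succs[u].append(v)
        indeg[v] += 1
        nodes.update((u, v))
    # Kahn's algorithm gives a topological order to seed chains from.
    order, queue = [], sorted(n for n in nodes if indeg[n] == 0)
    deg = {n: indeg[n] for n in nodes}
    while queue:
        node = queue.pop(0)
        order.append(node)
        for v in succs[node]:
            deg[v] -= 1
            if deg[v] == 0:
                queue.append(v)
    chains, used = [], set()
    for start in order:
        if start in used:
            continue
        chain, node = [start], start
        used.add(start)
        while True:
            ext = [v for v in succs[node] if v not in used and indeg[v] == 1]
            if not ext:
                break
            node = ext[0]
            chain.append(node)
            used.add(node)
        chains.append(chain)
    return chains

print(chain_cover([("scan", "filter"), ("filter", "join"),
                   ("scan2", "join"), ("join", "agg")]))
# [['scan', 'filter'], ['scan2'], ['join', 'agg']]
```

Nodes fed by more than one predecessor (joins) deliberately start new chains, since tuples from several upstream pipelines must be merged there.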
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Povarov, V. P.; Shipkov, A. A.; Gromov, A. F.; Budanov, V. A.; Golubeva, T. N.
2015-03-01
Matters concerned with making efficient use of an information-analytical system for the flow-accelerated corrosion problem in setting up in-service examination of the metal of pipeline elements operating in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh NPP are considered. The principles used to select samples of pipeline elements when planning ultrasonic thickness measurements, so that metal thinning due to flow-accelerated corrosion is revealed in good time while the total number of measurements in the condensate-feedwater path is reduced, are discussed.
Edlow, Brian L.; Hamilton, Karen; Hamilton, Roy H.
2007-01-01
This article provides an overview of the University of Pennsylvania School of Medicine’s Pipeline Neuroscience Program, a multi-tiered mentorship and education program for Philadelphia high school students in which University of Pennsylvania undergraduates are integrally involved. The Pipeline Neuroscience Program provides mentorship and education for students at all levels. High school students are taught by undergraduates, who learn from medical students who, in turn, are guided by neurology residents and fellows. Throughout a semester-long course, undergraduates receive instruction in neuroanatomy, neuroscience, and clinical neurology as part of the Pipeline’s case-based curriculum. During weekly classes, undergraduates make the transition from students to community educators by integrating their new knowledge into lesson plans that they teach to small groups of medically and academically underrepresented Philadelphia high school students. The Pipeline program thus achieves the dual goals of educating undergraduates about neuroscience and providing them with an opportunity to perform community service. PMID:23493190
78 FR 57455 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-18
... ``. . . system-specific information, including pipe diameter, operating pressure, product transported, and...) must provide contact information and geospatial data on their pipeline system. This information should... Mapping System (NPMS) to support various regulatory programs, pipeline inspections, and authorized...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Wellington K.; Morris, Tyler; Chu, Andrew
The ThunderBird Cup v3.0 (TBC3) program falls under the Minority Serving Institution Pipeline Program (MSIPP) that aims to establish a world-class workforce development, education and research program that combines the strengths of Historically Black Colleges and Universities (HBCUs) and national laboratories to create a K-20 pipeline of students to participate in cybersecurity and related fields.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Wellington K.; Morris, Tyler Jake; Chu, Andrew Chun-An
The ThunderBird Cup v2.0 (TBC2) program falls under the Minority Serving Institution Pipeline Program (MSIPP) that aims to establish a world-class workforce development, education and research program that combines the strengths of Historically Black Colleges and Universities (HBCUs) and national laboratories to create a K-20 pipeline of students to participate in cybersecurity and related fields.
49 CFR 190.239 - Safety orders.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Safety orders. 190.239 Section 190.239 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY PIPELINE SAFETY PROGRAMS AND RULEMAKING...
ML-o-Scope: A Diagnostic Visualization System for Deep Machine Learning Pipelines
2014-05-16
ML-o-scope: a diagnostic visualization system for deep machine learning pipelines. Daniel Bruckner, Electrical Engineering and Computer Sciences... The report presents the system as a support for tuning large-scale object-classification pipelines.
NASA Technical Reports Server (NTRS)
Charity, Pamela C.; Klein, Paul B.; Wadhwa, Bhushan
1995-01-01
The Cleveland State University Minority Engineering Program Pipeline consists of programs which foster engineering career awareness, academic enrichment, and professional development for historically underrepresented minority students. The programs involved are the Access to Careers in Engineering (ACE) Program for high school pre-engineering students; the LINK Program for undergraduate students pursuing degrees which include engineering; and the PEP (Pre-calculus Enrichment Program) and EPIC (Enrichment Program in Calculus) mathematics programs for undergraduate academic enrichment. The pipeline is such that high school graduates from the ACE Program who enroll at Cleveland State University in pursuit of engineering degrees are admitted to the LINK Program for undergraduate-level support. LINK Program students are among the minority participants who receive mathematics enrichment through the PEP and EPIC Programs for successful completion of their required engineering math courses. These programs are interdependent and share the goal of preparing minority students for engineering careers by enabling them to achieve academically and obtain college degrees and career-related experience.
Design Against Propagating Shear Failure in Pipelines
NASA Astrophysics Data System (ADS)
Leis, B. N.; Gray, J. Malcolm
Propagating shear failure can occur in gas and certain hazardous liquid transmission pipelines, potentially leading to a large, long-burning fire and/or widespread pollution, depending on the transported product. Such consequences require that the design of the pipeline and the specification of the steel effectively preclude the chance of propagating shear failure. Because the phenomenology of such failures is complex, design against such occurrences has historically relied on full-scale demonstration experiments coupled with empirically calibrated analytical models. However, as economic drivers have pushed toward larger-diameter, higher-pressure pipelines made of tough, higher-strength grades, the design basis to ensure arrest has been severely compromised. Accordingly, for applications where the design basis becomes less certain, as has happened increasingly as steel grade and toughness have increased, it has become necessary to place greater reliance on the use and role of full-scale testing.
Rep. Young, Don [R-AK-At Large]
2011-02-08
House - 02/09/2011: Referred to the Subcommittee on Railroads, Pipelines, and Hazardous Materials. Tracker: this bill has the status Introduced.
78 FR 59906 - Pipeline Safety: Class Location Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
... 192 [Docket No. PHMSA-2013-0161] Pipeline Safety: Class Location Requirements AGENCY: Pipeline and... Location Requirements,'' seeking comments on whether integrity management program (IMP) requirements, or... for class location requirements. PHMSA has received two requests to extend the comment period to allow...
Pipeline safety and security : improved workforce planning and communication needed
DOT National Transportation Integrated Search
2002-08-01
Pipelines transport about 65 percent of the crude oil and refined oil products and nearly all of the natural gas in the United States. The Office of Pipeline Safety (OPS), within the Department of Transportation's (DOT) Research and Special Programs ...
49 CFR 192.1003 - What do the regulations in this subpart cover?
Code of Federal Regulations, 2010 CFR
2010-10-01
... AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas...? General. This subpart prescribes minimum requirements for an IM program for any gas distribution pipeline...
The American Science Pipeline: Sustaining Innovation in a Time of Economic Crisis
ERIC Educational Resources Information Center
Hue, Gillian; Sales, Jessica; Comeau, Dawn; Lynn, David G.; Eisen, Arri
2010-01-01
Significant limitations have emerged in America's science training pipeline, including inaccessibility, inflexibility, financial limitations, and lack of diversity. We present three effective programs that collectively address these challenges. The programs are grounded in rigorous science and integrate through diverse disciplines across…
Comeau, Donald C.; Liu, Haibin; Islamaj Doğan, Rezarta; Wilbur, W. John
2014-01-01
BioC is a new format and associated code libraries for sharing text and annotations. We have implemented BioC natural language preprocessing pipelines in two popular programming languages: C++ and Java. The current implementations interface with the well-known MedPost and Stanford natural language processing tool sets. The pipeline functionality includes sentence segmentation, tokenization, part-of-speech tagging, lemmatization and sentence parsing. These pipelines can be easily integrated along with other BioC programs into any BioC compliant text mining systems. As an application, we converted the NCBI disease corpus to BioC format, and the pipelines have successfully run on this corpus to demonstrate their functionality. Code and data can be downloaded from http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net PMID:24935050
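BioC's implementations are in C++ and Java; purely as an illustration of the pipeline composition described above (stages such as sentence segmentation and tokenization chained over a document), here is a small Python sketch whose stage functions are naive stand-ins, not BioC or MedPost/Stanford calls:

```python
import re

def sentences(text):
    # naive sentence segmentation on terminal punctuation (stand-in stage)
    return [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]

def tokens(sentence):
    # naive tokenization: words and punctuation marks (stand-in stage)
    return re.findall(r"\w+|[^\w\s]", sentence)

def run_pipeline(doc, stages):
    """Apply annotation stages in order, mimicking a BioC-style pipeline
    where each stage adds a new annotation layer to the document."""
    ann = {"text": doc}
    for name, stage in stages:
        ann[name] = stage(ann)
    return ann

result = run_pipeline(
    "BioC shares text and annotations. Pipelines compose easily.",
    [("sentences", lambda a: sentences(a["text"])),
     ("tokens", lambda a: [tokens(s) for s in a["sentences"]])])
print(result["tokens"])
```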
Rand, Hugh; Shumway, Martin; Trees, Eija K.; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E.; Defibaugh-Chavez, Stephanie; Carleton, Heather A.; Klimke, William A.; Katz, Lee S.
2017-01-01
Background As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. Methods We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and “known” phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Results Our “outbreak” benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the “known tree” can be accurately called the “true tree”. The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. Discussion These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools—we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines. PMID:29372115
Timme, Ruth E; Rand, Hugh; Shumway, Martin; Trees, Eija K; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E; Defibaugh-Chavez, Stephanie; Carleton, Heather A; Klimke, William A; Katz, Lee S
2017-01-01
As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens ( Listeria monocytogenes , Salmonella enterica , Escherichia coli , and Campylobacter jejuni ) and one simulated dataset where the "known tree" can be accurately called the "true tree". The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools-we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines.
DOT National Transportation Integrated Search
2009-01-01
The Army maintains the capability to employ temporary petroleum pipelines. With the fiscal year (FY) 08-13 program objective memorandum (POM) force, the Army proposes to retain two Active and twelve Reserve Petroleum Pipeline and Terminal Operating...
ERIC Educational Resources Information Center
Knox, Ronny D.
2013-01-01
This research project used the Narrative Non-fiction method to examine the school-to-prison pipeline phenomenon through the experiences of four previously incarcerated adult males who had been placed in Discipline Alternative Educational Programs (DAEPs) during their public school education. In 1981, DAEPs were instituted as a pilot program to…
Natural Gas Pipeline Replacement Programs Reduce Methane Leaks and Improve Consumer Safety
NASA Astrophysics Data System (ADS)
Jackson, R. B.
2015-12-01
From production through distribution, oil and natural gas infrastructure provide the largest source of anthropogenic methane in the U.S. and the second largest globally. To examine the prevalence of natural gas leaks downstream in distribution systems, we mapped methane leaks across 595, 750, and 247 road miles of three U.S. cities—Durham, NC, Cincinnati, OH, and Manhattan, NY, respectively—at different stages of pipeline replacement of cast iron and other older materials. We compare results with those for two cities we mapped previously, Boston and Washington, D.C. Overall, cities with pipeline replacement programs have considerably fewer leaks per mile than cities without such programs. Similar programs around the world should provide additional environmental, economic, and consumer safety benefits.
Bioinformatic pipelines in Python with Leaf
2013-01-01
Background An incremental, loosely planned development approach is often used in bioinformatic studies when dealing with custom data analysis in a rapidly changing environment. Unfortunately, the lack of rigorous software structuring can undermine the maintainability, communicability and replicability of the process. To ameliorate this problem we propose the Leaf system, the aim of which is to seamlessly introduce the pipeline formality on top of a dynamic development process with minimum overhead for the programmer, thus providing a simple layer of software structuring. Results Leaf includes a formal language for the definition of pipelines with code that can be transparently inserted into the user's Python code. Its syntax is designed to visually highlight dependencies in the pipeline structure it defines. While encouraging the developer to think in terms of bioinformatic pipelines, Leaf supports a number of automated features including data and session persistence, consistency checks between steps of the analysis, processing optimization and publication of the analytic protocol in the form of a hypertext. Conclusions Leaf offers a powerful balance between plan-driven and change-driven development environments in the design, management and communication of bioinformatic pipelines. Its unique features make it a valuable alternative to other related tools. PMID:23786315
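Leaf's formal pipeline language is not reproduced here; the sketch below shows, in plain Python with illustrative names, the kind of structuring such a layer adds: named steps with explicit dependencies, executed in dependency order, with cached results standing in for Leaf's data persistence:

```python
steps, results = {}, {}

def step(*deps):
    """Register a function as a named pipeline step with explicit dependencies."""
    def register(fn):
        steps[fn.__name__] = (deps, fn)
        return fn
    return register

def run(name):
    """Run a step after its dependencies, caching results (session persistence)."""
    if name not in results:
        deps, fn = steps[name]
        results[name] = fn(*[run(d) for d in deps])
    return results[name]

@step()
def load():
    return [3, 1, 2]

@step("load")
def normalize(data):
    return sorted(data)

@step("normalize")
def report(data):
    return f"n={len(data)}, min={data[0]}, max={data[-1]}"

print(run("report"))   # n=3, min=1, max=3
```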
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. To overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrates YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability, and applies an algorithm to measure GLM prediction accuracy. The results demonstrate that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction (classification) accuracy and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
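The two scoring axes named above, prediction accuracy and SPI reproducibility, reduce to simple computations once a pipeline's outputs are in hand; the following sketch uses random stand-in data rather than the package's API:

```python
import numpy as np

rng = np.random.default_rng(0)

# stand-in data: two independent halves of an fMRI experiment, each yielding
# a statistical parametric image (SPI) and class predictions for held-out scans
spi_half1 = rng.normal(size=5000)
spi_half2 = 0.6 * spi_half1 + 0.8 * rng.normal(size=5000)  # partially reproducible
true_labels = rng.integers(0, 2, size=100)
predicted = np.where(rng.random(100) < 0.8, true_labels, 1 - true_labels)

# metric 1: prediction accuracy (classification accuracy on held-out scans)
accuracy = np.mean(predicted == true_labels)

# metric 2: reproducibility (correlation of the SPIs from the two halves)
reproducibility = np.corrcoef(spi_half1, spi_half2)[0, 1]

print(f"prediction accuracy: {accuracy:.2f}, SPI reproducibility: {reproducibility:.2f}")
```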
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
BIRCH: A user-oriented, locally-customizable, bioinformatics system
Fristensky, Brian
2007-01-01
Background Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere. PMID:17291351
Comeau, Donald C; Liu, Haibin; Islamaj Doğan, Rezarta; Wilbur, W John
2014-01-01
BioC is a new format and associated code libraries for sharing text and annotations. We have implemented BioC natural language preprocessing pipelines in two popular programming languages: C++ and Java. The current implementations interface with the well-known MedPost and Stanford natural language processing tool sets. The pipeline functionality includes sentence segmentation, tokenization, part-of-speech tagging, lemmatization and sentence parsing. These pipelines can be easily integrated along with other BioC programs into any BioC compliant text mining systems. As an application, we converted the NCBI disease corpus to BioC format, and the pipelines have successfully run on this corpus to demonstrate their functionality. Code and data can be downloaded from http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net. © The Author(s) 2014. Published by Oxford University Press.
49 CFR Appendix C to Part 195 - Guidance for Implementation of an Integrity Management Program
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... understanding and analysis of the failure mechanisms or threats to integrity of each pipeline segment. (2) An... pipeline, information and data used for the information analysis; (13) results of the information analyses...
Experimental and analytical study of water pipe's rupture for damage identification purposes
NASA Astrophysics Data System (ADS)
Papakonstantinou, Konstantinos G.; Shinozuka, Masanobu; Beikae, Mohsen
2011-04-01
A malfunction, local damage or sudden pipe break of a pipeline system can trigger significant flow variations. As shown in the paper, pressure variations and pipe vibrations are two strongly correlated parameters. A sudden change in the flow velocity and pressure of a pipeline system can induce pipe vibrations. Thus, based on acceleration data, a rapid detection and localization of a possible damage may be carried out by inexpensive, nonintrusive monitoring techniques. To illustrate this approach, an experiment on a single pipe was conducted in the laboratory. Pressure gauges and accelerometers were installed and their correlation was checked during an artificially created transient flow. The experimental findings validated the correlation between the parameters. The interaction between pressure variations and pipe vibrations was also theoretically justified. The developed analytical model explains the connection among flow pressure, velocity, pressure wave propagation and pipe vibration. The proposed method provides a rapid, efficient and practical way to identify and locate sudden failures of a pipeline system and sets firm foundations for the development and implementation of an advanced, new generation Supervisory Control and Data Acquisition (SCADA) system for continuous health monitoring of pipe networks.
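The pressure-velocity coupling the authors exploit can be anchored, for orientation, by the classical Joukowsky relation for a sudden velocity change in a pipe (a textbook result, not the paper's full fluid-structure model):

```latex
\Delta p = \rho \, c \, \Delta v
```

where \(\rho\) is the fluid density, \(c\) the pressure-wave speed in the pipe, and \(\Delta v\) the abrupt change in flow velocity; the resulting pressure front is what excites the pipe wall and makes accelerometer data a usable damage signature.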
Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho
2017-11-01
High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics may be carried out using several public tools, many analytical pipelines allow too few options for the optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from the Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.
The Hyper Suprime-Cam software pipeline
NASA Astrophysics Data System (ADS)
Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi
2018-01-01
In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
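As a concrete instance of the retrieval pipeline sketched in the review (feature representation, then indexing and searching), here is a minimal brute-force nearest-neighbor search over image descriptors; the 128-dimensional features and database size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# stand-in feature representation: each image reduced to a 128-d descriptor
database = rng.normal(size=(10_000, 128)).astype(np.float32)
database /= np.linalg.norm(database, axis=1, keepdims=True)  # unit-normalize

def search(query, k=5):
    """Brute-force cosine-similarity search; large-scale systems replace this
    with an approximate index (hashing, trees or inverted files) to scale."""
    q = query / np.linalg.norm(query)
    scores = database @ q                        # cosine similarity on unit vectors
    top = np.argpartition(-scores, k)[:k]        # unordered top-k candidates
    return top[np.argsort(-scores[top])]         # ranked image ids

print("top matches:", search(rng.normal(size=128).astype(np.float32)))
```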
Opportunity Knocks: Pipeline Programs Offer Minority Students a Path to Dentistry
ERIC Educational Resources Information Center
Fauteux, Nicole
2012-01-01
Minority students have traditionally been underrepresented in dental schools, a reality reflected in their woeful underrepresentation among practicing dentists; this is why enrichment and pipeline programs aimed at helping minority students are necessary. Hispanics made up only 5.8 percent of practicing dentists in 2011, according to the…
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2013 CFR
2013-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and turret-and-hull... Platform Verification Program: (i) Drilling, production, and pipeline risers, and riser tensioning systems...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2012 CFR
2012-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2014 CFR
2014-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
49 CFR 198.39 - Qualifications for operation of one-call notification system.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Qualifications for operation of one-call...) PIPELINE SAFETY REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS Adoption of One-Call Damage Prevention Program § 198.39 Qualifications for operation of one-call notification system. A one-call...
DOT National Transportation Integrated Search
1978-12-01
This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system was comprised of a muck pre...
Code of Federal Regulations, 2012 CFR
2012-10-01
... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192... internal corrosion, external corrosion, and stress corrosion cracking; (2) Static or resident threats, such... its integrity management program addressing actions it will take to respond to findings from this data...
Code of Federal Regulations, 2011 CFR
2011-10-01
... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192... internal corrosion, external corrosion, and stress corrosion cracking; (2) Static or resident threats, such... its integrity management program addressing actions it will take to respond to findings from this data...
Code of Federal Regulations, 2013 CFR
2013-10-01
... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192... internal corrosion, external corrosion, and stress corrosion cracking; (2) Static or resident threats, such... its integrity management program addressing actions it will take to respond to findings from this data...
Code of Federal Regulations, 2014 CFR
2014-10-01
... BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192... internal corrosion, external corrosion, and stress corrosion cracking; (2) Static or resident threats, such... its integrity management program addressing actions it will take to respond to findings from this data...
Building Effective Pipelines to Increase Diversity in the Geosciences
NASA Astrophysics Data System (ADS)
Snow, E.; Robinson, C. R.; Neal-Mujahid, R.
2017-12-01
The U.S. Geological Survey (USGS) recognizes and understands the importance of a diverse workforce in advancing our science. Valuing Differences is one of the guiding principles of the USGS and is the critical basis of the collaboration among the Youth and Education in Science (YES) program in the USGS Office of Science, Quality, and Integrity (OSQI), the Office of Diversity and Equal Opportunity (ODEO), and USGS science centers to build pipeline programs targeting diverse young scientists. Pipeline programs are robust, sustained relationships between two entities that provide a pathway from one to the other, in this case from minority-serving institutions to the USGS. The USGS has benefited from pipeline programs for many years. Our longest-running program, with the University of Puerto Rico Mayaguez (UPR), is a targeted outreach and internship program that has been managed by USGS scientists in Florida since the mid-1980s. Originally begun as the Minority Participation in the Earth Sciences (MPES) Program, it has evolved over the years, and in its several forms has brought dozens of interns to the USGS. Based in part on that success, in 2006 USGS scientists in Woods Hole, MA, worked with their Florida counterparts to build a pipeline program with City College of New York (CCNY). In this program, USGS scientists visit CCNY monthly, giving a symposium and meeting with students and faculty. The talks are so successful that the college created a course around them. In 2017, the CCNY and UPR programs brought 12 students to the USGS for summer internships. The CCNY model has been so successful that USGS is exploring similar pipeline programs elsewhere. The YES office is coordinating with ODEO and USGS science centers to identify partner universities and build relationships that will lead to robust partnerships in which USGS scientists visit regularly to engage with faculty and students and recruit students for USGS internships. The ideal partner universities will have a high population of underserved students, strong support for minority and first-generation students, proximity to a USGS office, and faculty and/or majors in several of the fields most important to USGS science: geology, geochemistry, energy, biology, ecology, environmental health, hydrology, climate science, GIS, high-capacity computing, and remote sensing.
Gong, Ting; Szustakowski, Joseph D
2013-04-15
For heterogeneous tissues, measurements of gene expression through mRNA-Seq data are confounded by the relative proportions of the cell types involved. In this note, we introduce an efficient pipeline: DeconRNASeq, an R package for deconvolution of heterogeneous tissues based on mRNA-Seq data. It adopts a globally optimized non-negative decomposition algorithm through quadratic programming for estimating the mixing proportions of distinctive tissue types in next-generation sequencing data. We demonstrated the feasibility and validity of DeconRNASeq across a range of mixing levels and sources using mRNA-Seq data mixed in silico at known concentrations. We validated our computational approach on various benchmark data, with high correlation between our predicted cell proportions and the real fractions of tissues. Our study provides a rigorous, quantitative and high-resolution tool that is a prerequisite for making full use of mRNA-Seq data from heterogeneous samples. The modularity of the package design allows easy deployment of custom analytical pipelines for data from other high-throughput platforms. DeconRNASeq is written in R and is freely available at http://bioconductor.org/packages. Supplementary data are available at Bioinformatics online.
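DeconRNASeq itself is an R/Bioconductor package; the core computation it describes, a non-negative decomposition of mixture expression into a signature matrix times a proportion vector, can be sketched in Python as follows, with scipy's nnls standing in for the package's quadratic-programming solver:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)

# signature matrix: expression of 500 genes in 3 pure cell types (synthetic)
signatures = rng.gamma(2.0, 1.0, size=(500, 3))
true_props = np.array([0.6, 0.3, 0.1])
mixture = signatures @ true_props + rng.normal(0, 0.05, size=500)  # noisy mixed tissue

props, _ = nnls(signatures, mixture)   # non-negative least squares fit
props /= props.sum()                   # renormalize so proportions sum to 1

print("estimated proportions:", np.round(props, 3))  # approximately [0.6, 0.3, 0.1]
```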
The Hyper Suprime-Cam software pipeline
Bosch, James; Armstrong, Robert; Bickerton, Steven; ...
2017-10-12
Here in this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
Amateur Image Pipeline Processing using Python plus PyRAF
NASA Astrophysics Data System (ADS)
Green, Wayne
2012-05-01
A template pipeline spanning observation planning through publishing is offered as a basis for establishing a long-term observing program. The data reduction pipeline encapsulates all policy and procedures, providing an accountable framework for data analysis and a teaching framework for IRAF. This paper introduces the technical details of a complete pipeline processing environment built with Python, PyRAF and a few other languages. The pipeline encapsulates all processing decisions within an auditable framework and quickly handles the heavy lifting of image processing. It also serves as an excellent teaching environment for astronomical data management and IRAF reduction decisions.
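The paper's pipeline is IRAF/PyRAF-specific; the auditability idea, recording every processing decision as frames flow through reduction stages, can be sketched generically (the stage functions below are placeholders, not IRAF tasks):

```python
import json, time

def run_stage(frame, stage, journal, **params):
    """Apply one reduction stage and journal the decision for auditability."""
    out = stage(frame, **params)
    journal.append({"stage": stage.__name__, "params": params,
                    "time": time.strftime("%Y-%m-%dT%H:%M:%S")})
    return out

# placeholder stages standing in for calibration tasks such as bias and flat correction
def subtract_bias(frame, level): return [x - level for x in frame]
def flat_field(frame, flat):     return [x / f for x, f in zip(frame, flat)]

journal = []
frame = [1200, 1300, 1250]                     # toy stand-in for pixel data
frame = run_stage(frame, subtract_bias, journal, level=200)
frame = run_stage(frame, flat_field, journal, flat=[1.0, 1.1, 0.9])
print(json.dumps(journal, indent=2))           # the audit trail of processing decisions
```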
African-American Mentoring Program (AAMP): Addressing the Cracks in the Graduate Education Pipeline
ERIC Educational Resources Information Center
Green, Tonika Duren; Ammah, Beverly Booker; Butler-Byrd, Nola; Brandon, Regina; McIntosh, Angela
2017-01-01
In this conceptual article, we focus on mentoring as a strategy to mend the cracks in the education pipeline for African American graduate students. Our article highlights the African American Mentoring Program (AAMP) model and examines the unique methods it uses to support the retention and graduation of African American graduate students from a…
Hydrostatic collapse research in support of the Oman India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stark, P.R.; McKeehan, D.S.
1995-12-01
This paper provides a summary of the collapse test program conducted as part of the technical development for the Ultra Deep Oman to India Pipeline. The paper describes the motivation for conducting the collapse test program, outlines the test objectives and procedures, presents the results obtained, and draws conclusions on the factors affecting collapse resistance.
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2014 CFR
2014-07-01
... pipelines are interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for...
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2013 CFR
2013-07-01
... pipelines are interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishkov, A.; Akopova, Gretta; Evans, Meredydd
This article will compare the natural gas transmission systems in the U.S. and Russia and review experience with methane mitigation technologies in the two countries. Russia and the United States (U.S.) are the world's largest consumers and producers of natural gas and, consequently, have some of the largest natural gas infrastructure. This paper compares the natural gas transmission systems in Russia and the U.S., their methane emissions, and experiences in implementing methane mitigation technologies. Given the scale of the two systems, many international oil and natural gas companies have expressed interest in better understanding the methane emission volumes and trends as well as the methane mitigation options. The systems are inherently different. For instance, while the U.S. natural gas transmission system is represented by many companies, which operate pipelines with various characteristics, in Russia predominantly one company, Gazprom, operates the gas transmission system. However, companies in both countries found that reducing methane emissions can be feasible and profitable. Examples of technologies in use include replacing wet seals with dry seals, implementing Directed Inspection and Maintenance (DI&M) programs, performing pipeline pump-down, applying composite wrap for non-leaking pipeline defects and installing low-bleed pneumatics. The research methodology for this paper involved a review of information on methane emission trends and mitigation measures; analytical and statistical data collection; accumulation and analysis of operational data on compressor seals and other emission sources; and analysis of technologies used in both countries to mitigate methane emissions in the transmission sector. Operators of natural gas transmission systems have many options to reduce natural gas losses. Depending on the value of gas, simple, low-cost measures, such as adjusting leaking equipment components, or larger-scale measures, such as installing dry seals on compressors, can be applied.
NASA Astrophysics Data System (ADS)
Wallace, Eric W.; Perry, Justin C.; Ferguson, Robert L.; Jackson, Debbie K.
2015-08-01
The present study investigated the impact of a Science, Technology, Engineering, Mathematics and Health (STEM+H) university-based pipeline program, the Careers in Health and Medical Professions Program, over the course of two summers among predominantly African-American high school students recruited from urban school districts (N = 155). Based on a mixed methods approach, results indicated that youth made significant gains in both academic and career knowledge. Furthermore, youth generally rated the program's sessions favorably, but rated individual sessions with varying levels of satisfaction. The limitations and implications for program delivery and evaluation methods among pipeline programs are discussed.
Code of Federal Regulations, 2010 CFR
2010-10-01
... addressing time dependent and independent threats for a transmission pipeline operating below 30% SMYS not in... pipeline system are covered for purposes of the integrity management program requirements, an operator must... system, or an operator may apply one method to individual portions of the pipeline system. (Refer to...
NASA Astrophysics Data System (ADS)
Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng
2017-05-01
As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third-party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third-party damage to reduce the number of gas pipeline operation accidents. Because third-party damage accidents have characteristics such as diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third-party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, the risk sources of third-party damage are identified; then the weight of each factor is determined via an improved AHP; finally, the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third-party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
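The AHP-plus-FCE computation outlined above reduces to two linear-algebra steps; the sketch below uses made-up pairwise judgments and membership grades, so the factor names and numbers are illustrative only:

```python
import numpy as np

# Pairwise comparison matrix for three illustrative third-party-damage factors
# (excavation activity, pipeline marking, patrol frequency); judgments are made up.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP weights: principal eigenvector of the comparison matrix, normalized.
vals, vecs = np.linalg.eig(A)
idx = np.argmax(vals.real)
w = vecs[:, idx].real
w = w / w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); small CI means judgments cohere.
n = A.shape[0]
CI = (vals.real[idx] - n) / (n - 1)
print("weights:", np.round(w, 3), " CI:", round(float(CI), 4))

# Fuzzy comprehensive evaluation: membership of each factor in the risk
# grades (low, medium, high); the grades and scores are illustrative.
R = np.array([[0.1, 0.3, 0.6],
              [0.4, 0.4, 0.2],
              [0.5, 0.3, 0.2]])
B = w @ R   # weighted fuzzy evaluation vector over the risk grades
print("risk grade memberships (low/med/high):", np.round(B, 3))
```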
Characteristics of vibrational wave propagation and attenuation in submarine fluid-filled pipelines
NASA Astrophysics Data System (ADS)
Yan, Jin; Zhang, Juan
2015-04-01
As an important part of lifeline engineering in the development and utilization of marine resources, the submarine fluid-filled pipeline is a complex coupled system subjected to both internal and external flow fields. By utilizing Kennard's shell equations combined with the Helmholtz equations of the flow fields, the coupled equations of the submarine fluid-filled pipeline for n = 0 axisymmetric wave motion are set up. Analytical expressions of wave speed are obtained for both the s = 1 and s = 2 waves, which correspond to a fluid-dominated wave and an axial shell wave, respectively. Numerical results for wave speed and wave attenuation are obtained and discussed subsequently. They show that the phase velocity depends on frequency and that the attenuation of this mode depends strongly on the material parameters of the pipe and on the internal and external fluid fields. The characteristics of a PVC pipe are studied for comparison. The effects of the shell thickness-to-radius ratio and the density of the contained fluid on the model are also discussed. The study provides a theoretical basis, helps to accurately predict the behavior of submarine pipelines, and has practical application prospects in the field of pipeline leakage detection.
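For orientation, the low-frequency limit of the fluid-dominated (s = 1) wave in a thin-walled elastic pipe is commonly approximated by the classical Korteweg speed; this standard formula, not the paper's full coupled shell-fluid dispersion relation, reads:

```latex
c_{1} \approx \sqrt{\frac{K/\rho_f}{1 + \dfrac{K D}{E e}}}
```

Here \(K\) is the fluid bulk modulus, \(\rho_f\) the fluid density, \(D\) the pipe diameter, \(e\) the wall thickness and \(E\) the Young's modulus of the wall; compliant walls (small \(E e\)) pull the wave speed below the free-fluid value \(\sqrt{K/\rho_f}\), consistent with the strong material dependence reported above.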
Development of Protective Coatings for Co-Sequestration Processes and Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierwagen, Gordon; Huang, Yaping
2011-11-30
The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed a new coating system to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted on pipeline corrosion sensors for monitoring pipes used in handling co-sequestration fluids. The research addresses the safety and reliability of a pipeline transporting SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both be supplied as potential candidates for internal pipeline coatings to transport SCCO2.
The Importance of Outreach Programs to Unblock the Pipeline and Broaden Diversity in ICT Education
ERIC Educational Resources Information Center
Lang, Catherine; Craig, Annemieke; Egan, Mary Anne
2016-01-01
There is a need for outreach programs to attract a diverse range of students to the computing discipline. The lack of qualified computing graduates to fill the growing number of computing vacancies is of concern to government and industry and there are few female students entering the computing pipeline at high school level. This paper presents…
Thakur, Shalabh; Guttman, David S
2016-06-30
Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de novo genome analysis pipeline (DeNoGAP) for the automated, iterative and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes, since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly, since the homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline is developed using Perl, BioPerl and SQLite on Ubuntu Linux version 12.04 LTS. Currently, the software package is accompanied by a script for automated installation of the necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems after the necessary external programs are installed. DeNoGAP is freely available at https://sourceforge.net/projects/denogap/.
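DeNoGAP's HMM machinery is not reproduced here; the scaling argument, scoring each new sequence against one refined profile per family instead of performing all-vs-all pairwise comparison, can be illustrated with a toy profile (k-mer sets in place of HMMs):

```python
def kmers(s, k=3):
    return {s[i:i + k] for i in range(len(s) - k + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b)

def cluster_iteratively(seqs, threshold=0.2):
    """Linear-scaling family assignment: each sequence is scored against one
    profile per family (k-mer sets stand in for DeNoGAP's HMM profiles)
    instead of all-vs-all pairwise comparison."""
    profiles, families = [], []
    for s in seqs:
        scores = [jaccard(p, kmers(s)) for p in profiles]
        if scores and max(scores) >= threshold:
            i = scores.index(max(scores))
            families[i].append(s)
            profiles[i] |= kmers(s)      # iteratively refine the profile
        else:
            profiles.append(kmers(s))    # no profile matched: found a new family
            families.append([s])
    return families

print(cluster_iteratively(["MKTAYIAKQR", "MKTAYIAKQK", "GSHMLEDPVA"]))
# [['MKTAYIAKQR', 'MKTAYIAKQK'], ['GSHMLEDPVA']]
```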
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
Satagopam, Venkata; Gu, Wei; Eifes, Serge; Gawron, Piotr; Ostaszewski, Marek; Gebel, Stephan; Barbosa-Silva, Adriano; Balling, Rudi; Schneider, Reinhard
2016-01-01
Translational medicine is a domain turning results of basic life science research into new tools and methods in a clinical environment, for example, as new diagnostics or therapies. Nowadays, the process of translation is supported by large amounts of heterogeneous data, ranging from medical data to a whole range of -omics data. This is not only a great opportunity but also a great challenge, as translational medicine big data is difficult to integrate and analyze and requires the involvement of biomedical experts for the data processing. We show here that visualization and interoperable workflows, combining multiple complex steps, can address at least parts of the challenge. In this article, we present an integrated workflow for the exploration, analysis, and interpretation of translational medicine data in the context of human health. Three Web services—tranSMART, a Galaxy Server, and a MINERVA platform—are combined into one big data pipeline. Native visualization capabilities enable the biomedical experts to get a comprehensive overview and control over separate steps of the workflow. The capabilities of tranSMART enable flexible filtering of multidimensional integrated data sets to create subsets suitable for downstream processing. A Galaxy Server offers visually aided construction of analytical pipelines, with the use of existing or custom components. A MINERVA platform supports the exploration of health and disease-related mechanisms in a contextualized analytical visualization system. We demonstrate the utility of our workflow by illustrating its successive steps using an existing data set, for which we propose a filtering scheme, an analytical pipeline, and a corresponding visualization of analytical results. The workflow is available as a sandbox environment, where readers can work with the described setup themselves. Overall, our work shows how visualization and interfacing of big data processing services facilitate the exploration, analysis, and interpretation of translational medicine data. PMID:27441714
Visual programming for next-generation sequencing data analytics.
Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia
2016-01-01
High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.
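As a cartoon of what such an environment's backend might compile a diagram to, the sketch below wires three toy nodes into a dataflow DAG and executes them in dependency order. The node names and the "pipeline" itself are hypothetical and are not taken from any library surveyed in the review.

```python
# Minimal dataflow-graph sketch: typed nodes wired into a DAG and run in
# topological order, the kind of structure a visual NGS programming
# environment might generate from a user's diagram (assumed, for illustration).
from graphlib import TopologicalSorter

def load_reads():        return ["ACGT", "ACGA", "TTTT"]
def filter_reads(reads): return [r for r in reads if "N" not in r]
def count_reads(reads):  return len(reads)

nodes = {"load":   (load_reads, []),
         "filter": (filter_reads, ["load"]),
         "count":  (count_reads, ["filter"])}

results = {}
# TopologicalSorter takes {node: predecessors} and yields a valid run order.
for name in TopologicalSorter({k: set(d) for k, (_, d) in nodes.items()}).static_order():
    func, deps = nodes[name]
    results[name] = func(*[results[d] for d in deps])

print(results["count"])  # 3
```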
NASA Astrophysics Data System (ADS)
Toropov, V. S.; Toropov, S. Yu
2018-05-01
A method has been developed to reduce the resistance to movement of a pipeline in a horizontally curved well during the construction of underground passages using trenchless technologies. The method can be applied at the design stage. The idea of the proposed method is to approximate the trajectory of the designed trenchless passage to the equilibrium profile. It is proved that, in order to reduce the resistance to movement of the pipeline arising from contact with the borehole wall, the profile of its initial and final sections must correspond, depending on the initial conditions, to a parabola or to the hyperbolic cosine equation. Analytical dependences are obtained which allow supplementing the methods of calculating traction effort in trenchless construction for the case when the profile of the well is given by an arbitrary function.
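For reference, the two profile shapes named in the abstract have standard forms; the sketch below is a hedged reconstruction, where the parameter a and the coordinate choice are assumptions rather than the authors' notation.

```latex
% Equilibrium profile of a heavy flexible line (catenary) and its
% small-sag parabolic approximation; a is a shape parameter (assumed).
y(x) = a \cosh\!\left(\frac{x}{a}\right),
\qquad
y(x) \approx a + \frac{x^{2}}{2a} \quad (|x| \ll a).
```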
Force of resistance to pipeline pulling in plane and volumetrically curved wells
NASA Astrophysics Data System (ADS)
Toropov, V. S.; Toropov, S. Yu; Toropov, E. S.
2018-05-01
A method has been developed for calculating the component of the pulling force of a pipeline arising from well curvature in one or several planes, under the assumption that the pipeline is ballasted, by filling with water or otherwise, until zero buoyancy in the drilling mud is reached. This paper shows that when calculating this force, one can neglect the effect of sections with zero curvature. Conversely, if the buoyancy of the pipeline is other than zero, the resistance force in the curvilinear sections should be calculated taking into account the difference between the normal components of the buoyancy force and the weight. It is proved that, neglecting resistance forces from the viscosity of the drilling mud, if the buoyancy of the pipeline is zero, the total resistance force is independent of the length of the pipe and is determined by the angle equal to the sum of the entry angle and the exit angle of the pipeline at the ground surface. For the case of well curvature in several planes, it is proposed to calculate such a volumetrically curved well by the central angle of the well profile. Analytical dependences are obtained that allow calculating the pulling force for well profiles with a variable curvature radius, i.e., at different deviation angles between the drill pipes along the well profile.
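The length-independence claim (total resistance set by the sum of entry and exit angles alone) is the signature of the classical Euler-Eytelwein (capstan) relation. As a hedged sketch, assuming Coulomb friction with coefficient μ on a curved contact, and with notation that is assumed rather than the authors':

```latex
% Capstan relation (assumed form, not the authors' notation): the tension
% ratio across a curved contact depends only on the total turned angle.
T_{\text{out}} = T_{\text{in}}\, e^{\mu \alpha},
\qquad \alpha = \alpha_{\text{entry}} + \alpha_{\text{exit}} .
```

On this reading, the arc length over which the angle accumulates drops out, consistent with the paper's conclusion for zero-buoyancy pipe.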
Shieh, Fwu-Shan; Jongeneel, Patrick; Steffen, Jamin D; Lin, Selena; Jain, Surbhi; Song, Wei; Su, Ying-Hsiu
2017-01-01
Identification of viral integration sites has been important in understanding the pathogenesis and progression of diseases associated with particular viral infections. The advent of next-generation sequencing (NGS) has enabled researchers to understand the impact that viral integration has on the host, such as tumorigenesis. Current computational methods for analyzing NGS data of virus-host junction sites have been limited in their accessibility to a broad user base. In this study, we developed a software application, ChimericSeq, that is the first program of its kind to offer a graphical user interface, compatibility with both Windows and Mac operating systems, and optimization for effectively identifying and annotating virus-host chimeric reads within NGS data. In addition, ChimericSeq's pipeline implements custom filtering to remove artifacts and detects reads with quantitative analytical reporting to provide functional significance to discovered integration sites. The improved accessibility of ChimericSeq through a GUI on both Windows and Mac has the potential to expand NGS analytical support to a broader spectrum of the scientific community.
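The core idea of chimeric-read detection can be caricatured in a few lines: a read is flagged when a prefix maps to the viral reference and the remaining suffix maps to the host. In the hedged sketch below, exact substring matching stands in for real alignment, MIN_SEG is a hypothetical threshold, and all sequences are invented; ChimericSeq's actual pipeline adds quality filtering, artifact removal, and annotation on top of this idea.

```python
# Toy virus-host chimeric-read detector: flag a read whose prefix matches the
# viral reference and whose suffix matches the host reference. Substring
# matching stands in for alignment; sequences and MIN_SEG are invented.
VIRUS = "AAACCCGGGTTT"
HOST = "TTGACCAGTACGATCG"
MIN_SEG = 5  # hypothetical minimum matched length on each side of the junction

def junction(read):
    """Return the breakpoint index within the read, or None if not chimeric."""
    for i in range(MIN_SEG, len(read) - MIN_SEG + 1):
        if read[:i] in VIRUS and read[i:] in HOST:
            return i
    return None

print(junction("CCGGGTACGA"))  # 5: "CCGGG" is viral, "TACGA" is host
```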
Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Samatova, Nagiza; Wu, Kesheng
This project developed a generic and optimized set of core data analytics functions. These functions organically consolidate a broad constellation of high-performance analytical pipelines. As the architectures of emerging HPC systems become inherently heterogeneous, there is a need to design algorithms for data analysis kernels accelerated on hybrid multi-node, multi-core HPC architectures comprised of a mix of CPUs, GPUs, and SSDs. Furthermore, the power-aware trend drives the advances in our performance-energy tradeoff analysis framework, which enables our data analysis kernel algorithms and software to be parameterized so that users can choose the right power-performance optimizations.
Mathematical simulation for compensation capacities area of pipeline routes in ship systems
NASA Astrophysics Data System (ADS)
Ngo, G. V.; Sakhno, K. N.
2018-05-01
In this paper, the authors consider the problem of enhancing the manufacturability of ship-system pipelines at the design stage. An analysis of arrangements and of the possibilities for compensating deviations of pipeline routes has been carried out. The task was set to produce the "fit pipe" together with the rest of the pipes in the route. It was proposed to compensate for deviations by movement of the pipeline route during pipe installation and to calculate the maximum values of these displacements in the analyzed path. Theoretical bases for compensating deviations of pipeline routes using rotations of parallel section pairs of pipes are developed. Mathematical and graphical simulations of the compensation capacity areas of pipeline routes with various configurations are completed. Prerequisites have been created for an automated program that will allow one to determine the values of the compensation capacity area for pipeline routes and to assign the necessary allowances.
An Aperture Photometry Pipeline for K2 Data
NASA Astrophysics Data System (ADS)
Buzasi, Derek L.; Carboneau, Lindsey; Lezcano, Andy; Vydra, Ekaterina
2016-01-01
As part of an ongoing research program with undergraduate students at Florida Gulf Coast University, we have constructed an aperture photometry pipeline for K2 data. The pipeline performs dynamic automated aperture mask definition for all targets in the K2 fields, followed by aperture photometry and detrending. Our pipeline is currently used to support a number of projects, including studies of stellar rotation and activity, red giant asteroseismology, gyrochronology, and exoplanet searches. In addition, its output is used to support an undergraduate class on exoplanets aimed at a student audience of both majors and non-majors. The pipeline is designed for both batch and single-target use and is easily extensible to data from other missions; pipeline output is available to the community. This paper describes our pipeline and its capabilities and illustrates the quality of the results, drawing on all of the applications for which it is currently used.
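A minimal sketch of the per-target steps described, assuming a toy image time series: build an aperture mask from a median image by thresholding, sum the masked pixels per frame, and divide out a low-order polynomial trend. The threshold and the polynomial detrending are assumptions for illustration; the actual pipeline's dynamic mask definition is more sophisticated.

```python
# Sketch of aperture photometry plus detrending on invented data; the
# percentile threshold and quadratic trend model are stand-ins (assumptions).
import numpy as np

rng = np.random.default_rng(0)
frames = rng.normal(100.0, 1.0, size=(50, 9, 9))       # toy image time series
frames[:, 4, 4] += 500 + np.linspace(0, 20, 50)        # star plus slow drift

median_image = np.median(frames, axis=0)
mask = median_image > np.percentile(median_image, 98)  # dynamic aperture mask

raw = frames[:, mask].sum(axis=1)                      # aperture photometry
t = np.arange(len(raw))
trend = np.polyval(np.polyfit(t, raw, 2), t)           # low-order detrending
detrended = raw / trend

print(mask.sum(), detrended.std())
```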
Pipelining in a changing competitive environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, E.G.; Wishart, D.M.
1996-12-31
The changing competitive environment for the pipeline industry presents a broad spectrum of new challenges and opportunities: international cooperation; globalization of opportunities, organizations and competition; an integrated systems approach to system configuration, financing, contracting strategy, materials sourcing, and operations; cutting-edge and emerging technologies; adherence to high standards of environmental protection; an emphasis on safety; innovative approaches to project financing; and advances in technology and programs to maintain the long-term, cost-effective integrity of operating pipeline systems. These challenges and opportunities are partially a result of the increasingly competitive nature of pipeline development and the public's intolerance of incidents of pipeline failure. A creative systems approach to these challenges is often the key to a project moving ahead. This usually encompasses collaboration among users of the pipeline, pipeline owners and operators, international engineering and construction companies, equipment and materials suppliers, in-country engineers and constructors, and international lending agencies and financial institutions.
Chan, Kuang-Lim; Rosli, Rozana; Tatarinova, Tatiana V; Hogan, Michael; Firdaus-Raih, Mohd; Low, Eng-Ti Leslie
2017-01-27
Gene prediction is one of the most important steps in the genome annotation process. A large number of software tools and pipelines developed by various computing techniques are available for gene prediction. However, these systems have yet to accurately predict all or even most of the protein-coding regions. Furthermore, none of the currently available gene-finders has a universal Hidden Markov Model (HMM) that can perform gene prediction for all organisms equally well in an automatic fashion. We present an automated gene prediction pipeline, Seqping, which uses self-training HMM models and transcriptomic data. The pipeline processes the genome and transcriptome sequences of the target species using GlimmerHMM, SNAP, and AUGUSTUS, followed by the MAKER2 program to combine predictions from the three tools in association with the transcriptomic evidence. Seqping generates species-specific HMMs that are able to offer unbiased gene predictions. The pipeline was evaluated using the Oryza sativa and Arabidopsis thaliana genomes. Benchmarking Universal Single-Copy Orthologs (BUSCO) analysis showed that the pipeline was able to identify at least 95% of BUSCO's Plantae dataset. Our evaluation shows that Seqping was able to generate better gene predictions than three HMM-based programs (MAKER2, GlimmerHMM and AUGUSTUS) using their respective available HMMs. Seqping had the highest accuracy in rice (0.5648 for CDS, 0.4468 for exon, and 0.6695 for nucleotide structure) and A. thaliana (0.5808 for CDS, 0.5955 for exon, and 0.8839 for nucleotide structure). Seqping provides researchers a seamless pipeline to train species-specific HMMs and predict genes in newly sequenced or less-studied genomes. We conclude that Seqping's predictions are more accurate than gene predictions using the other three approaches with their default or available HMMs.
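As a cartoon of evidence combination (not MAKER2's actual algorithm), the sketch below keeps a predicted interval when at least two predictors agree on it, or when transcript evidence overlaps it; all coordinates and the voting rule are invented for illustration.

```python
# Toy consensus of gene-interval predictions from three predictors plus
# transcript evidence. This is only a cartoon of evidence combination;
# MAKER2's real algorithm is far more elaborate. Coordinates are invented.
def overlaps(a, b):
    return a[0] < b[1] and b[0] < a[1]

predictions = {"GlimmerHMM": [(100, 500), (900, 1200)],
               "SNAP":       [(110, 480)],
               "AUGUSTUS":   [(105, 510), (2000, 2300)]}
transcripts = [(2050, 2250)]

consensus = []
for tool, intervals in predictions.items():
    for iv in intervals:
        votes = sum(any(overlaps(iv, jv) for jv in ivs)
                    for ivs in predictions.values())
        if (votes >= 2 or any(overlaps(iv, tv) for tv in transcripts)) \
                and not any(overlaps(iv, c) for c in consensus):
            consensus.append(iv)

print(consensus)  # [(100, 500), (2000, 2300)]
```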
Development of Optimized Welding Solutions for X100 Linepipe Steel
DOT National Transportation Integrated Search
2011-09-01
This investigation is part of a major consolidated program of research sponsored by the US Department of Transportation (DOT) Pipeline and Hazardous Materials Safety Administration (PHMSA) and the Pipeline Research Council International (PRCI) to advance...
Enrichment programs to create a pipeline to biomedical science careers.
Cregler, L L
1993-01-01
The Student Educational Enrichment Programs at the Medical College of Georgia School of Medicine were created to increase the number of underrepresented minorities in the pipeline to biomedical science careers. Eight-week summer programs are conducted for high school, research apprentice, and intermediate and advanced college students. There is also a prematriculation program for accepted medical, dental, and graduate students. Between 1979 and 1990, 245 high school students attended 12 summer programs. Of these, 240 (98%) entered college 1 year later. In 1986, after eight programs, 162 (68%) high school participants had graduated from college with a baccalaureate degree, and 127 responded to a follow-up survey. Sixty-two (49%) of the college graduates attended health science schools, and 23 (18%) of these matriculated to medical school. A total of 504 college students participated in 13 summer programs. Four hundred (79%) of these students responded to a questionnaire, which indicated that 348 (87%) of the 400 entered health science occupations and/or professional schools; 179 (45%) of these students matriculated to medical school. Minority students participating in enrichment programs have greater success in gaining acceptance to college and professional school. These data suggest that early enrichment initiatives increase the number of underrepresented minorities in the biomedical science pipeline.
Mathematical model of polyethylene pipe bending stress state
NASA Astrophysics Data System (ADS)
Serebrennikov, Anatoly; Serebrennikov, Daniil
2018-03-01
The introduction of new machines and technologies for polyethylene pipeline installation usually relies on the flexibility of polyethylene pipe. It is necessary that the bending stresses that arise do not lead to irreversible deformation of the polyethylene pipe or to violation of its strength characteristics. The derivation of a mathematical model that allows analytical calculation of the bending stress level of polyethylene pipes, taking nonlinear material characteristics into account, is presented below. All analytical calculations made with the mathematical model are confirmed experimentally.
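As a point of reference, the linear-elastic first approximation for the outer-fiber bending stress of a pipe of outer diameter D bent to a radius R is standard; the paper's contribution is to go beyond this by accounting for polyethylene's nonlinear behavior, which the hedged sketch below omits.

```latex
% Linear-elastic outer-fiber bending stress (first approximation only;
% the paper's model additionally treats nonlinear material behavior).
\sigma_{\max} = E\,\varepsilon_{\max} = \frac{E\,D}{2R}.
```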
Phyx: phylogenetic tools for unix.
Brown, Joseph W; Walker, Joseph F; Smith, Stephen A
2017-06-15
The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. Contact: eebsmith@umich.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark
2016-07-05
There has been an enormous expansion of the use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and the production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules, and they do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible, to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peak finding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others. These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.
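One of the listed outputs, the TSS-centered summary plot, reduces to a small computation: average the coverage in a fixed window around each TSS, flipping minus-strand genes. The sketch below does this on an invented coverage array; the published modules wrap real BAM coverage and MACS peak calls, which this deliberately omits.

```python
# Sketch of a TSS-centered summary profile: mean coverage per offset from
# the TSS across genes, strand-aware. Coverage and TSS list are invented.
import numpy as np

coverage = np.random.default_rng(1).poisson(5, size=10_000).astype(float)
tss = [(1_000, "+"), (4_000, "-"), (7_500, "+")]   # (position, strand)
half = 200                                         # +/- 200 bp window

profiles = []
for pos, strand in tss:
    window = coverage[pos - half: pos + half + 1]
    profiles.append(window if strand == "+" else window[::-1])

mean_profile = np.mean(profiles, axis=0)           # one value per TSS offset
print(mean_profile.shape)                          # (401,)
```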
NASA Astrophysics Data System (ADS)
Borden, Paula D.
This dissertation study concerned the lack of underrepresented minority students matriculating through the health professions pipeline. The term pipeline is "the educational avenue by which one must travel to successfully enter a profession" (Sullivan Alliance, 2004). There are a significant number of health professional pipeline programs based across the United States and, for the purposes of this study, a focus was placed on the Science Enrichment Preparation (S.E.P.) Program, which is based at The University of North Carolina at Chapel Hill. The S.E.P. Program is an eight-week residential summer experience designed to help underrepresented minority pre-health students develop the competitive edge for successful admission into health professional school programs. The core of this dissertation study was the relationship between cognitive and non-cognitive variables and the academic performance of students in the S.E.P. Program from 2005 to 2013. The study was undertaken to give the NC Health Careers Access Program's (NC-HCAP) leadership a clearer understanding of the variables associated with students' academic performance in the S.E.P. Program. The outcomes informed NC-HCAP in identifying cognitive and non-cognitive variables associated with student academic performance. Additionally, these findings provided direction as to what infrastructure might be put into place to more effectively support S.E.P. participants. It is the researcher's hope that this study may serve as an educational model and resource to pipeline programs and others with similar educational missions. The consequences and implications of a non-diverse healthcare workforce are high and far-reaching. Without parity of representation in the healthcare workforce, health disparities between racial and economic groups will likely continue to grow.
Digital Imaging of Pipeline Mechanical Damage and Residual Stress
DOT National Transportation Integrated Search
2010-02-19
The purpose of this program was to enhance characterization of mechanical damage in pipelines through application of digital eddy current imaging. Lift-off maps can be used to develop quantitative representations of mechanical damage and magnetic per...
Formicola, Allan J; D'Abreu, Kim C; Tedesco, Lisa A
2010-10-01
By now, all dental schools should understand the need to increase the enrollment of underrepresented minority (URM) students. While there has been a major increase in the number of Hispanic/Latino, African American/Black, and Native American applicants to dental schools over the past decade, there has not been a major percent increase in the enrollment of URM students except in the schools participating in the Pipeline, Profession, and Practice: Community-Based Dental Education program, which have far exceeded the percent increase in enrollment of URM students in other U.S. dental schools during Phase I of the program (2002-07). Assuming that all dental schools wish to improve the diversity of their student bodies, chapters 9-12 of this report--for which this chapter serves as an introduction--provide strategies learned from the Pipeline schools to increase the applications and enrollment of URM students. Some of the changes that the Pipeline schools put into place were the result of two focus group studies of college and dental students of color. These studies provided guidance on some of the barriers and challenges students of color face when considering dentistry as a career. New accreditation standards make it clear that the field of dentistry expects dental schools to re-energize their commitment to diversity.
Next Generation Sequence Analysis and Computational Genomics Using Graphical Pipeline Workflows
Torri, Federica; Dinov, Ivo D.; Zamanyan, Alen; Hobel, Sam; Genco, Alex; Petrosyan, Petros; Clark, Andrew P.; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Knowles, James A.; Ames, Joseph; Kesselman, Carl; Toga, Arthur W.; Potkin, Steven G.; Vawter, Marquis P.; Macciardi, Fabio
2012-01-01
Whole-genome and exome sequencing have already proven to be essential and powerful methods to identify genes responsible for simple Mendelian inherited disorders. These methods can be applied to complex disorders as well, and have been adopted as one of the current mainstream approaches in population genetics. These achievements have been made possible by next generation sequencing (NGS) technologies, which require substantial bioinformatics resources to analyze the dense and complex sequence data. The huge analytical burden of data from genome sequencing might be seen as a bottleneck slowing the publication of NGS papers at this time, especially in psychiatric genetics. We review the existing methods for processing NGS data, to place into context the rationale for the design of a computational resource. We describe our method, the Graphical Pipeline for Computational Genomics (GPCG), to perform the computational steps required to analyze NGS data. The GPCG implements flexible workflows for basic sequence alignment, sequence data quality control, single nucleotide polymorphism analysis, copy number variant identification, annotation, and visualization of results. These workflows cover all the analytical steps required for NGS data, from processing the raw reads to variant calling and annotation. The current version of the pipeline is freely available at http://pipeline.loni.ucla.edu. These applications of NGS analysis may gain clinical utility in the near future (e.g., identifying miRNA signatures in diseases) when the bioinformatics approach is made feasible. Taken together, the annotation tools and strategies that have been developed to retrieve information and test hypotheses about the functional role of variants present in the human genome will help to pinpoint the genetic risk factors for psychiatric disorders. PMID:23139896
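To give one of those analytical steps some shape, here is a cartoon of the variant-calling stage only: call a single-nucleotide variant where depth and alternate-allele fraction clear thresholds. The thresholds and the pileup structure are assumptions for illustration; the GPCG workflows delegate this step to established callers.

```python
# Toy SNP caller over a pileup of base counts; MIN_DEPTH and MIN_ALT_FRAC
# are invented thresholds, not parameters of the GPCG pipeline.
MIN_DEPTH, MIN_ALT_FRAC = 10, 0.25

def call_snps(pileup, reference):
    calls = []
    for pos, bases in pileup.items():
        depth = sum(bases.values())
        ref = reference[pos]
        alt, n_alt = max(((b, n) for b, n in bases.items() if b != ref),
                         key=lambda x: x[1], default=(None, 0))
        if depth >= MIN_DEPTH and n_alt / depth >= MIN_ALT_FRAC:
            calls.append((pos, ref, alt, n_alt / depth))
    return calls

reference = "ACGTACGTAC"
pileup = {3: {"T": 8, "C": 6}, 5: {"C": 11, "A": 1}}
print(call_snps(pileup, reference))  # [(3, 'T', 'C', 0.428...)]
```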
Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S
2014-12-01
We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.
Chabuk, Ali; Al-Ansari, Nadhir; Hussain, Hussain Musa; Knutsson, Sven; Pusch, Roland
2016-05-01
Al-Hillah Qadhaa is located in the central part of Iraq. It covers an area of 908 km² and has a total population of 856,804 inhabitants. This Qadhaa is the capital of Babylon Governorate. Presently, no landfill site in the area has been selected on the basis of scientific siting criteria. For this reason, an attempt has been made to find the best locations for landfills. A total of 15 variables were considered in this process (groundwater depth, rivers, soil types, agricultural land use, land use, elevation, slope, gas pipelines, oil pipelines, power lines, roads, railways, urban centres, villages and archaeological sites) using a geographic information system. In addition, an analytical hierarchy process was used to identify the weight for each variable. Two suitable candidate landfill sites that fulfil the requirements were identified, with areas of 9.153 km² and 8.204 km², respectively. These sites can accommodate solid waste until 2030. © The Author(s) 2016.
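The AHP weighting step itself is compact: criterion weights are taken as the principal eigenvector of a pairwise-comparison matrix. The sketch below uses an invented 3x3 matrix (the study weighted 15 variables) and assumes Saaty's 1-9 comparison scale.

```python
# AHP weight derivation: principal eigenvector of a pairwise-comparison
# matrix via power iteration, plus a consistency index. Matrix is invented.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],        # e.g. groundwater depth vs rivers vs roads
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

w = np.ones(len(A)) / len(A)
for _ in range(100):                  # power iteration to the principal eigenvector
    w = A @ w
    w /= w.sum()

lam = (A @ w / w).mean()              # principal eigenvalue estimate
ci = (lam - len(A)) / (len(A) - 1)    # consistency index; near 0 = consistent
print(np.round(w, 3), round(ci, 4))
```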
Hu, Yijie; Deng, Liqing; Chen, Jinwu; Zhou, Siyu; Liu, Shuang; Fu, Yufan; Yang, Chunxian; Liao, Zhihua; Chen, Min
2016-03-01
Purple sweet potato (Ipomoea batatas L.) is rich in anthocyanin pigments, which are valuable constituents of the human diet. Techniques to identify and quantify anthocyanins and their antioxidant potential are desirable for cultivar selection and breeding. In this study, we performed a quantitative and qualitative chemical analysis of 30 purple sweet potato (PSP) cultivars, using various assays to measure reducing power, radical-scavenging activities, and linoleic acid autoxidation inhibition activity. Grey relational analysis (GRA) was applied to establish relationships between the antioxidant activities and the chemical fingerprints, in order to identify key bioactive compounds. The results indicated that four peonidin-based anthocyanins and three cyanidin-based anthocyanins make significant contributions to antioxidant activity. We conclude that the analytical pipeline described here represents an effective method to evaluate the antioxidant potential of, and the contributing compounds present in, PSP cultivars. This approach may be used to guide future breeding strategies. Copyright © 2015. Published by Elsevier Ltd.
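Grey relational analysis reduces to a short computation: normalize each series, measure its pointwise deviation from the reference (activity) series, convert deviations to grey relational coefficients, and average them into a grade per fingerprint peak. The sketch below uses invented data and the customary distinguishing coefficient rho = 0.5; pooling the min/max deviation across all series is a simplification.

```python
# Grey relational analysis sketch: grade each peak series against a
# reference activity series. Data are invented; rho = 0.5 is customary.
import numpy as np

def gra_grades(reference, series_matrix, rho=0.5):
    norm = lambda v: (v - v.min()) / (v.max() - v.min())
    x0 = norm(reference)
    deltas = np.array([np.abs(x0 - norm(s)) for s in series_matrix])
    dmin, dmax = deltas.min(), deltas.max()
    return ((dmin + rho * dmax) / (deltas + rho * dmax)).mean(axis=1)

activity = np.array([0.9, 0.7, 0.8, 0.4, 0.6])       # assay result per cultivar
peaks = np.array([[10.0, 8.2, 9.1, 5.0, 6.8],        # peak that tracks activity
                  [2.0, 9.0, 1.0, 8.0, 3.0]])        # peak that does not

print(np.round(gra_grades(activity, peaks), 3))      # higher grade = stronger link
```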
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework to many different data repositories.
Designing a reliable leak bio-detection system for natural gas pipelines.
Batzias, F A; Siontorou, C G; Spanidis, P-M P
2011-02-15
Monitoring of natural gas (NG) pipelines is an important task for economical and safe operation, loss prevention and environmental protection. Timely and reliable leak detection therefore plays a key role in the overall integrity management of the pipeline system. Owing to the various limitations of the currently available techniques and the surveillance area that needs to be covered, research on new detector systems is still thriving. Biosensors are widely considered a niche technology in the environmental market, since they afford the desired detector capabilities at low cost, provided they have been properly designed and developed and rationally placed, networked and maintained with the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge-based approach and relies on Fuzzy Multicriteria Analysis (FMCA) for selecting the biosensor design that best suits both the target analyte and the operational micro-environment. The approach is illustrated in the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.
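As a hedged illustration of fuzzy multicriteria scoring in the FMCA spirit (not the authors' actual criteria, weights, or ratings), the sketch below rates each candidate biosensor design per criterion with a triangular fuzzy number, aggregates by weights, and ranks by the defuzzified centroid.

```python
# Toy fuzzy multicriteria ranking: triangular fuzzy ratings (low, mode, high),
# weighted aggregation, centroid defuzzification. All values are invented.
weights = {"sensitivity": 0.5, "robustness": 0.3, "cost": 0.2}

designs = {
    "enzyme-based": {"sensitivity": (6, 8, 9), "robustness": (3, 5, 6), "cost": (5, 7, 8)},
    "whole-cell":   {"sensitivity": (4, 6, 7), "robustness": (6, 8, 9), "cost": (6, 8, 9)},
}

def centroid(tfn):                      # defuzzify a triangular fuzzy number
    return sum(tfn) / 3.0

def score(ratings):
    return sum(w * centroid(ratings[c]) for c, w in weights.items())

for name, ratings in sorted(designs.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(ratings):.2f}")
```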
Pipeline safety fund : minimum balance was not reasonably estimated
DOT National Transportation Integrated Search
2001-04-01
The Office of Pipeline Safety (OPS), a component of the Research and Special Programs Administration (RSPA) of the Department of Transportation (DOT), performs a variety of activities related to the safety of natural gas (NG) and hazardous liquid (HL...
Kibinge, Nelson; Ono, Naoaki; Horie, Masafumi; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Saito, Akira; Kanaya, Shigehiko
2016-06-01
Conventionally, workflows examining transcription regulation networks from gene expression data involve distinct analytical steps. There is a need for pipelines that unify data mining and inference deduction into a singular framework to enhance interpretation and hypothesis generation. We propose a workflow that merges network construction with gene expression data mining, focusing on regulation processes in the context of transcription-factor-driven gene regulation. The pipeline implements pathway-based modularization of expression profiles into functional units to improve biological interpretation. The integrated workflow was implemented as a web application (TransReguloNet) with functions that enable pathway visualization and comparison of transcription factor activity between sample conditions defined in the experimental design. The pipeline merges differential expression, network construction, pathway-based abstraction, clustering and visualization. The framework was applied in analysis of actual expression datasets related to lung, breast and prostate cancer. Copyright © 2016 Elsevier Inc. All rights reserved.
Chen, I-Hsuan; Aguilar, Hillary Andaluz; Paez Paez, J Sebastian; Wu, Xiaofeng; Pan, Li; Wendt, Michael K; Iliuk, Anton B; Zhang, Ying; Tao, W Andy
2018-05-15
Glycoproteins comprise more than half of current FDA-approved protein cancer markers, but the development of new glycoproteins as disease biomarkers has been stagnant. Here we present a pipeline to develop glycoprotein biomarkers from extracellular vesicles (EVs) by integrating quantitative glycoproteomics with a novel reverse phase glycoprotein array, and we then apply it to identify novel biomarkers for breast cancer. EV glycoproteomics shows promise in circumventing the problems plaguing current serum/plasma glycoproteomics and allowed us to identify hundreds of glycoproteins that have not been identified in blood. We identified 1,453 unique glycopeptides representing 556 glycoproteins in EVs, among which 20 were verified as significantly elevated in individual breast cancer patients. We further applied a novel glyco-specific reverse phase protein array to quantify a subset of the candidates. Together, this study demonstrates the great potential of this integrated pipeline for biomarker discovery.
Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric
2014-01-29
Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.
Channel erosion surveys along TAPS route, Alaska, 1974
Childers, Joseph; Jones, Stanley H.
1975-01-01
Repeated site surveys and aerial photographs at 26 stream crossings along the trans-Alaska pipeline system (TAPS) route during the period 1969-74 provide chronologic records of channel changes that predate pipeline-related construction at the sites. The 1974 surveys and photographs show some of the channel changes wrought by construction of the haul road from the Yukon River to Prudhoe Bay and by construction of camps and working pads all along the pipeline route. No pipeline crossings were constructed before 1975. These records of channel changes, together with flood and icing measurements, are part of the United States Department of the Interior's continuing surveillance program to document the hydrologic aspects of the trans-Alaska pipeline and its environmental impacts.
NASA Astrophysics Data System (ADS)
Tomarov, G. V.; Shipkov, A. A.; Lovchev, V. N.; Gutsev, D. F.
2016-10-01
Problems of metal flow-accelerated corrosion (FAC) in the pipelines and equipment of the condensate-feeding and wet-steam paths of NPP power-generating units (PGUs) are examined. The goals, objectives, and main principles of the methodology for implementing an integrated program of AO Concern Rosenergoatom for preventing unacceptable FAC thinning and for increasing the operational flow-accelerated corrosion resistance of NPP equipment and pipelines (hereinafter, the Program) are formulated. The role and potential of Russian software packages for evaluating and predicting the FAC rate are shown in solving practical problems of timely detection of unacceptable FAC thinning in elements of pipelines and equipment (EaP) of the secondary circuit of NPP PGUs. Information is given concerning the structure, properties, and functions of the software systems for supporting plant personnel in monitoring and planning the in-service inspection of FAC-thinning elements of pipelines and equipment of the secondary circuit of NPP PGUs, which have been created and implemented at several Russian NPPs equipped with VVER-1000, VVER-440, and BN-600 reactors. One of the most important practical results of these personnel-support software packages is the identification of elements at risk of intense local FAC thinning. Examples are given of successful practice at some Russian NPPs in using such systems for early detection of secondary-circuit pipeline elements with FAC thinning close to an unacceptable level. Intermediate results of work on the Program are presented, and new tasks set in 2012 as part of the updated Program are outlined. The prospects of the developed methods and tools within the Program at the design and construction stages of NPP PGUs are discussed. The main directions of work on solving the problems of flow-accelerated corrosion of pipelines and equipment in Russian NPP PGUs are defined.
Rep. Young, Don [R-AK-At Large]
2010-09-29
House - 09/30/2010 Referred to the Subcommittee on Railroads, Pipelines, and Hazardous Materials.
tcpl: The ToxCast Pipeline for High-Throughput Screening Data
Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2011 CFR
2011-10-01
Transportation of Natural and Other Gas by Pipeline: Minimum Federal Safety Standards, Operations. § 192.616 Public awareness. (a) Except for..., each pipeline operator must develop and implement a written continuing public education program that...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2010 CFR
2010-10-01
Transportation of Natural and Other Gas by Pipeline: Minimum Federal Safety Standards, Operations. § 192.616 Public awareness. (a) Except for..., each pipeline operator must develop and implement a written continuing public education program that...
Xie, Jing; Lu, Xiongxiong; Wu, Xue; Lin, Xiaoyi; Zhang, Chao; Huang, Xiaofang; Chang, Zhili; Wang, Xinjing; Wen, Chenlei; Tang, Xiaomei; Shi, Minmin; Zhan, Qian; Chen, Hao; Deng, Xiaxing; Peng, Chenghong; Li, Hongwei; Fang, Yuan; Shao, Yang; Shen, Baiyong
2016-05-01
Targeted therapies, including monoclonal antibodies and small molecule inhibitors, have dramatically changed the treatment of cancer over the past 10 years. Their therapeutic advantages are greater tumor specificity and fewer side effects. For precisely tailoring available targeted therapies to each individual or a subset of cancer patients, next-generation sequencing (NGS) has been utilized as a promising diagnostic tool, with its advantages of accuracy, sensitivity, and high throughput. We developed and validated an NGS-based cancer genomic diagnosis targeting 115 prognosis- and therapeutics-relevant genes on multiple specimens, including blood, tumor tissue, and body fluid, from 10 patients with different cancer types. The sequencing data were then analyzed by clinically applicable analytical pipelines developed in-house. We assessed the analytical sensitivity, specificity, and accuracy of the NGS-based molecular diagnosis. Our analytical pipelines were capable of detecting base substitutions, indels, and gene copy number variations (CNVs). For instance, several actionable mutations of EGFR, PIK3CA, TP53, and KRAS were detected, indicating drug susceptibility and resistance in the lung cancer cases. Our study shows that NGS-based molecular diagnosis is more sensitive and comprehensive in detecting genomic alterations in cancer, and it supports direct clinical use for guiding targeted therapy.
iSeq: Web-Based RNA-seq Data Analysis and Visualization.
Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng
2018-01-01
Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amounts of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn.
Increasing Diversity and Gender Parity by working with Professional Organizations and HBCUs
NASA Astrophysics Data System (ADS)
Wims, T. R.
2017-12-01
Context/Purpose: This abstract proposes tactics for recruiting diverse applicants and addressing gender parity in the geoscience workforce. Methods: The geoscience community should continue to develop and expand a pipeline of qualified potential employees and managers at all levels. Recruitment from minority-based professional organizations, such as the National Society of Black Engineers (NSBE) and the Society of Hispanic Professional Engineers (SHPE), provides senior and mid-level scientists, engineers, program managers, and corporate managers/administrators with proven track records of success. Geoscience organizations should consider increasing hiring from the 100+ Historically Black Colleges and Universities (HBCUs), which have a proven track record of producing high-quality graduates with math, science, computer science, and engineering backgrounds. HBCU alumni have been working at all levels of government and corporate organizations for more than 50 years. Results: Professional organizations like NSBE have members with one to 40 years of applicable work experience, who are prime candidates for employment in the geoscience community at all levels. NSBE also operates pipeline programs intended to graduate 10,000 minority bachelor's degree candidates per year by 2025, up from the current 3,620 per year. HBCUs have established educational programs and several pipelines for attracting undergraduate students into the engineering and science fields. Since many HBCUs enroll more women than men, they also address gender parity. Both professional organizations and HBCUs have pipeline programs that reach children in high school. Interpretation: Qualified and capable minority and women candidates are available in the United States. Pipelines for employing senior, mid-level, and junior skill sets are in place but underutilized by some geoscience companies and organizations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robin Gordon; Bill Bruce; Nancy Porter
2003-05-01
The two broad categories of deposited weld metal repair and fiber-reinforced composite repair technologies were reviewed for potential application to internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repairs and fiber-reinforced composite repairs. To date, all of the experimental work pertaining to the evaluation of potential repair methods has focused on fiber-reinforced composite repairs. Hydrostatic testing was also conducted on four pipeline sections with simulated corrosion damage: two with composite liners and two without.
Cyber Power Potential of the Army’s Reserve Component
2017-01-01
...and could extend logically to include electric power, water, food, railways, gas pipelines, and so forth. One consideration to note is that in cases... [Table of contents: Chapter Four, Army Reserve Component Cyber Inventory Analysis; Background and Analytical Framework; Army Reserve Component Cyber Inventory Analysis, 2015]
A Bridge to the Stars: A Model High School-to-College Pipeline to Improve Diversity in STEM
NASA Astrophysics Data System (ADS)
McIntosh, Daniel H.; Jennings, Derrick H.
2017-01-01
Increasing participation by historically underrepresented Americans in the STEM workforce remains a national priority. Existing strategies have failed to increase diversity, especially in the physical sciences, despite federal mandates. To meet this urgent challenge, it is imperative to immediately identify and support the expansion of effective high school-to-college STEM pipelines. A Bridge to the Stars (ABttS) is a creative and tested pipeline designed to steadily increase the number of disadvantaged 15- to 21-year-olds pursuing and completing 4-year STEM degrees. This unique program offers extended engagement in astronomy, arguably the most accessible window to science, through a 3-tier STEM immersion program of innovative learning (in a freshman science course), authentic research training (in a freshman science lab), and supportive near-peer mentoring at U. Missouri-Kansas City, an urban research university. Each tier of the ABttS pipeline by itself has the potential to broaden student aspirations for careers as technological innovators or STEM educators. Students who elect to transition through multiple tiers will substantially reinforce their successes with STEM activities and significantly bolster the self-esteem necessary to personally manifest STEM aspirations. We will summarize the impact of this program after 5 years and share our latest improvements. The long-term mission of ABttS is to see urban educational institutions across the U.S. adopt similar pipelines in all STEM disciplines built on the ABttS model.
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2011 CFR
2011-10-01
Transportation of Hazardous Liquids by Pipeline, Operation and Maintenance. § 195.440 Public awareness. (a) Each pipeline operator must develop and implement a written continuing public education program that follows the guidance provided in the American...
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2010 CFR
2010-10-01
Transportation of Hazardous Liquids by Pipeline, Operation and Maintenance. § 195.440 Public awareness. (a) Each pipeline operator must develop and implement a written continuing public education program that follows the guidance provided in the American...
A novel pipeline based FPGA implementation of a genetic algorithm
NASA Astrophysics Data System (ADS)
Thirer, Nonel
2014-05-01
To solve problems when an analytical solution is not available, more and more bio-inspired computation techniques have been applied in recent years. One efficient algorithm is the Genetic Algorithm (GA), which imitates the biological evolution process, finding the solution by the mechanism of "natural selection", where the strong have higher chances to survive. A genetic algorithm is an iterative procedure which operates on a population of individuals called "chromosomes" or "possible solutions" (usually represented by a binary code). The GA performs several processes on the population individuals to produce a new population, as in biological evolution. To provide a high-speed solution, pipeline-based FPGA hardware implementations are used, with an n-stage pipeline for an n-phase genetic algorithm. FPGA pipeline implementations are constrained by the different execution times of each stage and by the FPGA chip resources. To minimize these difficulties, we propose a bio-inspired technique that modifies the crossover step by using non-identical twins: the two chosen chromosomes (parents) build up two new chromosomes (children), not only one as in the classical GA. We analyze the contribution of this method to reducing the execution time in asynchronous and synchronous pipelines, and also the possibility of a cheaper FPGA implementation by using smaller populations. The full hardware architecture of an FPGA implementation for our target ALTERA development card is presented and analyzed.
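The twin-crossover idea is easy to state in code: one two-point crossover emits both complementary children, so each crossover operation yields two population members instead of one. The sketch below applies it in a plain software GA with a toy ones-counting fitness; all parameters are assumptions, and the paper's actual target is an FPGA pipeline, not Python.

```python
# Software sketch of a GA whose crossover produces "non-identical twins":
# both complementary children of a two-point crossover. Population size,
# generations, selection rule, and fitness are all invented for illustration.
import random

random.seed(3)
N, BITS, GENS = 20, 16, 40

def fitness(c):            # toy objective: maximize the number of 1 bits
    return sum(c)

def twins(p1, p2):         # two-point crossover producing both children
    i, j = sorted(random.sample(range(BITS), 2))
    return (p1[:i] + p2[i:j] + p1[j:],
            p2[:i] + p1[i:j] + p2[j:])

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(N)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: N // 2]            # truncation selection ("natural selection")
    children = []
    while len(children) < N - len(parents):
        children.extend(twins(*random.sample(parents, 2)))
    pop = parents + children[: N - len(parents)]
    m, b = random.randrange(N), random.randrange(BITS)
    pop[m][b] ^= 1                     # point mutation on one individual

print(fitness(max(pop, key=fitness)), "/", BITS)
```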
DOT National Transportation Integrated Search
2012-08-30
Preventing unauthorized intrusions on pipeline rights-of-way (ROWs) and mechanical damage due to third-party strikes by machinery is a constant challenge for the pipeline industry. Equally important for safety and environmental protection is the dete...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Dyke, G.D.; Shem, L.M.; Zimmerman, R.E.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way management practices. This report presents the results of a survey conducted on August 22, 1991, in an emergent intertidal estuarine wetland in Terrebonne Parish, Louisiana. The site includes three pipelines installed between 1958 and 1969. Vegetation within the site comprises three native tidal marsh grasses: Spartina alterniflora, Spartina patens, and Distichlis spicata. All three species occurred over the pipelines, within the right-of-way and in both natural areas. Vegetative differences attributable to the installation or presence of the pipelines were not obvious over the pipelines or in the habitat east of the pipelines. However, because of the presence of a canal west of the 1969 pipeline, vegetation was less abundant in that area, and D. spicata was absent from all but the most distant plots of the transects. Data obtained in the study indicate that when rights-of-way through brackish marsh are restored to their original elevations, they are revegetated with native vegetation similar to that in surrounding areas.
Programming the Navier-Stokes computer: An abstract machine model and a visual editor
NASA Technical Reports Server (NTRS)
Middleton, David; Crockett, Tom; Tomboulian, Sherry
1988-01-01
The Navier-Stokes computer is a parallel computer designed to solve Computational Fluid Dynamics problems. Each processor contains several floating point units which can be configured under program control to implement a vector pipeline with several inputs and outputs. Since the development of an effective compiler for this computer appears to be very difficult, machine-level programming seems necessary, and support tools for this process have been studied. These support tools are organized into a graphical program editor. A programming process is described by which appropriate computations may be efficiently implemented on the Navier-Stokes computer. The graphical editor would support this programming process, verifying various programmer choices for correctness and deducing values such as pipeline delays and network configurations. Step-by-step details are provided and demonstrated with two example programs.
Mortise terrorism on the main pipelines
NASA Astrophysics Data System (ADS)
Komarov, V. A.; Nigrey, N. N.; Bronnikov, D. A.; Nigrey, A. A.
2018-01-01
This work analyzes the effectiveness of methods for the physical protection of main pipelines against "mortise terrorism" (unauthorized tapping of pipelines). A mathematical model has been developed that makes it possible to predict the dynamics of "mortise terrorism" in the short term. The effectiveness of the physical protection methods proposed in the article for preventing unauthorized impacts on the objects under investigation is assessed. A variant of a video-analytics system has been developed that detects intruders, and recognizes the types of work they perform, at a distance of 150 meters under conditions of complex natural backgrounds and precipitation. The probability of detection is 0.959.
Master-slave mixed arrays for data-flow computations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, T.L.; Fisher, P.D.
1983-01-01
Control cells (masters) and computation cells (slaves) are mixed in regular geometric patterns to form reconfigurable arrays known as master-slave mixed arrays (MSMAs). Interconnections of the corners and edges of the hexagonal control cells and the edges of the hexagonal computation cells are used to construct synchronous and asynchronous communication networks, which support local computation and local communication. Data-driven computations result in self-directed ring pipelines within the MSMA, and composite data-flow computations are executed in a pipelined fashion. By viewing an MSMA as a computing network of tightly-linked ring pipelines, data-flow programs can be uniformly distributed over these pipelines for efficient resource utilisation. 9 references.
NASA Astrophysics Data System (ADS)
Dunn, Warwick B.
2008-03-01
The functional levels of biological cells or organisms can be separated into the genome, transcriptome, proteome and metabolome. Of these, the metabolome offers specific advantages for investigating the phenotype of biological systems. The investigation of the metabolome (metabolomics) has only recently appeared as a mainstream scientific discipline and is currently developing rapidly for the study of microbial, plant and mammalian metabolomes. The metabolome pipeline or workflow encompasses the processes of sample collection and preparation, collection of analytical data, raw data pre-processing, data analysis and data storage. Of these processes, the collection of analytical data is discussed in this review, with specific interest in the application of mass spectrometry in the metabolomics pipeline. Current developments in mass spectrometry platforms (GC-MS, LC-MS, DIMS and imaging MS) and applications of specific interest are highlighted. The current limitations of these platforms and applications are discussed, with areas requiring further development also highlighted. These include the detectable coverage of the metabolome, the identification of metabolites and the process of converting raw data to biological knowledge.
Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud.
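ExScalibur's own integration logic is not reproduced in the abstract; as a generic illustration of combining calls from multiple variant callers, the sketch below keeps variants reported by at least a minimum number of callers. The two-caller threshold and the (chrom, pos, ref, alt) tuple encoding are assumptions for illustration, not the suite's actual rule.

```python
from collections import Counter

def consensus_variants(callsets, min_callers=2):
    """Keep variants reported by at least `min_callers` callers.

    callsets: iterable of variant collections, one per caller; each
    variant is a hashable tuple such as (chrom, pos, ref, alt).
    """
    counts = Counter(v for callset in callsets for v in set(callset))
    return {v for v, n in counts.items() if n >= min_callers}

# Example with three hypothetical callers:
caller_a = {("chr1", 12345, "A", "T"), ("chr2", 999, "G", "C")}
caller_b = {("chr1", 12345, "A", "T")}
caller_c = {("chr2", 999, "G", "C"), ("chr3", 42, "T", "A")}
print(consensus_variants([caller_a, caller_b, caller_c]))  # the two shared calls
```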
SEALS: an Innovative Pipeline Program Targeting Obstacles to Diversity in the Physician Workforce.
Fritz, Cassandra D L; Press, Valerie G; Nabers, Darrell; Levinson, Dana; Humphrey, Holly; Vela, Monica B
2016-06-01
Medical schools may find implementing pipeline programs for minority pre-medical students prohibitive due to a number of factors, including the lack of well-described programs in the literature, the limited evidence for program development, and institutional financial barriers. Our goals were to (1) design a pipeline program based on educational theory; (2) deliver the program in a low-cost, sustainable manner; and (3) evaluate intermediate outcomes of the program. SEALS is a 6-week program based on an asset bundles model designed to promote, among minority pre-medical students: (1) socialization and professionalism, (2) education in science learning tools, (3) acquisition of finance literacy, (4) the leveraging of mentorship and networks, and (5) social expectations and resilience. This is a prospective mixed-methods study. Students completed survey instruments pre-program, post-program, and 6 months post-program, establishing intermediate outcome measures. Thirteen students matriculated to SEALS. The SEALS cohort rated themselves as improved or significantly improved when asked to rate their familiarity with MCAT components (p < 0.01), ability to ask for a letter of recommendation (p = 0.04), and importance of interview skills (p = 0.04) compared with before the program. Over 90% of students referenced the health disparities lecture series as an inspiration to advocate for minority health. Six-month surveys suggested that SEALS students acquired and applied four of the five assets at their college campuses. This low-cost, high-quality program can be undertaken by medical schools interested in promoting a diverse workforce that may ultimately begin to address and reduce health care disparities.
2014-01-01
Background: Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results: To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions: By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911
Shaikh, Faiq; Franc, Benjamin; Allen, Erastus; Sala, Evis; Awan, Omer; Hendrata, Kenneth; Halabi, Safwan; Mohiuddin, Sohaib; Malik, Sana; Hadley, Dexter; Shrestha, Rasu
2018-03-01
Enterprise imaging has channeled various technological innovations to the field of clinical radiology, ranging from advanced imaging equipment and postacquisition iterative reconstruction tools to image analysis and computer-aided detection tools. More recently, the advancement in the field of quantitative image analysis coupled with machine learning-based data analytics, classification, and integration has ushered in the era of radiomics, a paradigm shift that holds tremendous potential in clinical decision support as well as drug discovery. However, there are important issues to consider to incorporate radiomics into a clinically applicable system and a commercially viable solution. In this two-part series, we offer insights into the development of the translational pipeline for radiomics from methodology to clinical implementation (Part 1) and from that point to enterprise development (Part 2). In Part 2 of this two-part series, we study the components of the strategy pipeline, from clinical implementation to building enterprise solutions. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R.; Stewart, Walter F.; Malin, Bradley; Sun, Jimeng
2014-01-01
Objective: Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: 1) cohort construction, 2) feature construction, 3) cross-validation, 4) feature selection, and 5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. Methods: To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which 1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, 2) schedules the tasks in a topological ordering of the graph, and 3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. Results: We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. Conclusion: This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed-up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate goal of building analytic pipelines that are specialized for health data researchers. PMID:24370496
Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krause, E.; et al.
We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood $\Delta \chi^2 \le 0.045$ with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering ($8\,\mathrm{Mpc}\,h^{-1}$) and galaxy-galaxy lensing ($12\,\mathrm{Mpc}\,h^{-1}$) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.
Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.
Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos
2017-08-01
Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep, bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, running a pipeline is as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.
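The Bio-Docklets "meta-script" itself is not shown in the abstract; as a minimal sketch of driving a Galaxy workflow through the BioBlend library that the paper names, the following assumes a running Galaxy endpoint, and the URL, API key, file name, and workflow selection are all placeholders rather than values from the paper.

```python
from bioblend.galaxy import GalaxyInstance

# All identifiers below are placeholders, not values from the paper.
gi = GalaxyInstance(url="http://localhost:8080", key="YOUR_API_KEY")

history = gi.histories.create_history(name="ngs-run")        # output container
upload = gi.tools.upload_file("reads.fastq", history["id"])  # stage the input

workflow_id = gi.workflows.get_workflows()[0]["id"]          # pick a workflow
dataset_id = upload["outputs"][0]["id"]
inputs = {"0": {"src": "hda", "id": dataset_id}}             # map to input step 0

invocation = gi.workflows.invoke_workflow(
    workflow_id, inputs=inputs, history_id=history["id"]
)
print("invocation id:", invocation["id"])
```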
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Zimmerman, R.E.; Alsum, S.K.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents results of a survey conducted over the period of August 5--7, 1991, at the Little Timber Creek crossing in Gloucester County, New Jersey, where three pipelines, constructed in 1950, 1960, and 1990, cross the creek and associated wetlands. The old side of the ROW, created by the installation of the 1960 pipeline, was designed to contain a raised peat bed over the 1950 pipeline and an open-water ditch over the 1960 pipeline. The new portion of the ROW, created by installation of the 1990 pipeline, has an open-water ditch over the pipeline (resulting from settling of the backfill) and a raised peat bed (resulting from rebound of compacted peat). Both the old and new ROWs contain dense stands of herbs; the vegetation on the old ROW was more similar to that in the adjacent natural area than was vegetation in the new ROW. The ROW increased species and habitat diversity in the wetlands. It may contribute to the spread of purple loosestrife and affect species sensitive to habitat fragmentation.
Park, Byeonghyeok; Baek, Min-Jeong; Min, Byoungnam; Choi, In-Geol
2017-09-01
Genome annotation is a primary step in genomic research. To establish a lightweight, portable prokaryotic genome annotation pipeline for use in individual laboratories, we developed a Shiny app package designated "P-CAPS" (Prokaryotic Contig Annotation Pipeline Server). The package is composed of R and Python scripts that integrate publicly available annotation programs into a server application. P-CAPS is not only a browser-based interactive application but also a distributable Shiny app package that can be installed on any personal computer. The final annotation is provided in various standard formats and is summarized in an R markdown document. Annotation can be visualized and examined with a public genome browser. A benchmark test showed that the annotation quality and completeness of P-CAPS are reliable and comparable with those of currently available public pipelines.
Visual Analytics for Power Grid Contingency Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Pak C.; Huang, Zhenyu; Chen, Yousu
2014-01-20
Contingency analysis is the process of employing different measures to model scenarios, analyze them, and then derive the best response to remove the threats. This application paper focuses on a class of contingency analysis problems found in the power grid management system. A power grid is a geographically distributed interconnected transmission network that transmits and delivers electricity from generators to end users. The power grid contingency analysis problem is increasingly important because of both the growing size of the underlying raw data that need to be analyzed and the urgency to deliver working solutions in an aggressive timeframe. Failure to do so may bring significant financial, economic, and security impacts to all parties involved and the society at large. The paper presents a scalable visual analytics pipeline that transforms about 100 million contingency scenarios to a manageable size and form for grid operators to examine different scenarios and come up with preventive or mitigation strategies to address the problems in a predictive and timely manner. Great attention is given to the computational scalability, information scalability, visual scalability, and display scalability issues surrounding the data analytics pipeline. Most of the large-scale computation requirements of our work are conducted on a Cray XMT multi-threaded parallel computer. The paper demonstrates a number of examples using western North American power grid models and data.
Partitioning problems in parallel, pipelined and distributed computing
NASA Technical Reports Server (NTRS)
Bokhari, S.
1985-01-01
The problem of optimally assigning the modules of a parallel program over the processors of a multiple computer system is addressed. A Sum-Bottleneck path algorithm is developed that permits the efficient solution of many variants of this problem under some constraints on the structure of the partitions. In particular, the following problems are solved optimally for a single-host, multiple satellite system: partitioning multiple chain structured parallel programs, multiple arbitrarily structured serial programs and single tree structured parallel programs. In addition, the problems of partitioning chain structured parallel programs across chain connected systems and across shared memory (or shared bus) systems are also solved under certain constraints. All solutions for parallel programs are equally applicable to pipelined programs. These results extend prior research in this area by explicitly taking concurrency into account and permit the efficient utilization of multiple computer architectures for a wide range of problems of practical interest.
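Bokhari's sum-bottleneck path algorithm itself is not reproduced in the abstract; as a simpler cousin under the same chain-structure constraint, the sketch below uses dynamic programming to split a chain of module weights into contiguous blocks, one per processor, minimizing the bottleneck (largest block load). Ignoring inter-module communication costs is an assumption of this sketch.

```python
def chain_partition_bottleneck(weights, processors):
    """Min-bottleneck contiguous partition of a module chain.

    dp[i][j] = best achievable bottleneck when the first i modules
    are assigned to j processors as contiguous blocks.
    """
    n = len(weights)
    prefix = [0] * (n + 1)
    for i, w in enumerate(weights, 1):
        prefix[i] = prefix[i - 1] + w

    INF = float("inf")
    dp = [[INF] * (processors + 1) for _ in range(n + 1)]
    dp[0][0] = 0
    for i in range(1, n + 1):
        for j in range(1, min(i, processors) + 1):
            for s in range(j - 1, i):  # last block holds modules s+1..i
                load = prefix[i] - prefix[s]
                dp[i][j] = min(dp[i][j], max(dp[s][j - 1], load))
    return dp[n][processors]

# Example: 6 modules on a 3-processor chain -> bottleneck load 8.
print(chain_partition_bottleneck([2, 3, 3, 5, 4, 4], 3))
```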
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-02-01
Upper East Fork Poplar Creek Operable Unit 2 consists of the Abandoned Nitric Acid Pipeline (ANAP). This pipeline was installed in 1951 to transport liquid wastes approximately 4800 ft from Buildings 9212, 9215, and 9206 to the S-3 Ponds. Materials known to have been discharged through the pipeline include nitric acid, depleted and enriched uranium, various metal nitrates, salts, and lead skimmings. During the mid-1980s, sections of the pipeline were removed during various construction projects. A total of 19 locations along the pipeline were chosen for investigation in the first phase of this Remedial Investigation. Sampling consisted of drilling down to obtain a soil sample at a depth immediately below the pipeline. Additional samples were obtained deeper in the subsurface depending upon the depth of the pipeline, the depth of the water table, and the point of auger refusal. The 19 samples collected below the pipeline were analyzed by the Oak Ridge Y-12 Plant's laboratory for metals, nitrate/nitrite, and isotopic uranium. Samples collected from three boreholes were also analyzed for volatile organic compounds because these samples produced a response with organic vapor monitoring equipment. Uranium activities in the soil samples ranged from 0.53 to 13.0 pCi/g for ²³⁸U, from 0.075 to 0.75 pCi/g for ²³⁵U, and from 0.71 to 5.0 pCi/g for ²³⁴U. Maximum total values for lead, chromium, and nickel were 75.1 mg/kg, 56.3 mg/kg, and 53.0 mg/kg, respectively. The maximum nitrate/nitrite value detected was 32.0 mg-N/kg. One sample obtained adjacent to a sewer line contained various organic compounds, at least some of which were tentatively identified as fragrance chemicals commonly associated with soaps and cleaning solutions. The results of the baseline human health risk assessment for the ANAP contaminants of potential concern show no unacceptable risks to human health.
The Health Equity Scholars Program: Innovation in the Leaky Pipeline.
Upshur, Carole C; Wrighting, Diedra M; Bacigalupe, Gonzalo; Becker, Joan; Hayman, Laura; Lewis, Barbara; Mignon, Sylvia; Rokop, Megan E; Sweet, Elizabeth; Torres, Marie Idali; Watanabe, Paul; Woods, Cedric
2018-04-01
Despite attempts to increase enrollment of under-represented minorities (URMs: primarily Black/African American, Hispanic/Latino, and Native American students) in health professional programs, limited progress has been made. Compelling reasons to rectify this situation include equity for URMs, better-prepared health professionals when programs are diverse, better quality of and access to health care for URM populations, and the need for diverse talent to tackle difficult questions in health science and health care delivery. However, many students who initiate traditional "pipeline" programs designed to link URMs to professional schools in the health professions and sciences do not complete them. In addition, program requirements often restrict entry to highly qualified students while not expanding opportunities for promising but potentially less well-prepared candidates. The current study describes innovations in an undergraduate pipeline program, the Health Equity Scholars Program (HESP), designed to address barriers URMs experience in more traditional programs, and provides evaluative outcomes and qualitative feedback from participants. A primary outcome was timely college graduation. Eighty percent of participants, both transfer students and first-time students, have so far achieved this outcome, with 91% on track, compared with the campus average of 42% for all first-time students and 58-67% for transfers. Grade point averages also improved (p = 0.056) after program participation. Graduates (94%) were working in health care/human services positions, and three were in health-related graduate programs. Creating a more flexible program that admits a broader range of URMs has the potential to expand the number of URM students interested in and prepared to make a contribution to health equity research and clinical care.
King, Andrew J; Fisher, Arielle M; Becich, Michael J; Boone, David N
2017-01-01
The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist.
"Pushed" to Teach: Pedagogies and Policies for a Black Women Educator Pipeline
ERIC Educational Resources Information Center
Gist, Conra D.; White, Terrenda; Bianco, Margarita
2018-01-01
This research study examines the learning experiences of 11th- and 12th-grade Black girls participating in a precollegiate program committed to increasing the number of Teachers of Color entering the profession by viewing a teaching career as an act of social justice committed to educational equity. The pipeline functions as an education reform…
The Challenge of Creating a More Diverse Economics: Lessons from the UCR Minority Pipeline Project
ERIC Educational Resources Information Center
Dymski, Gary A.
2017-01-01
This paper reflects on the experience of the 1999-2002 minority pipeline program (MPP) at the University of California, Riverside. With support from the American Economic Association, the MPP identified students of color interested in economics, let them explore economic issues affecting minority communities, and encouraged them to consider…
MetaDB a Data Processing Workflow in Untargeted MS-Based Metabolomics Experiments.
Franceschi, Pietro; Mylonas, Roman; Shahaf, Nir; Scholz, Matthias; Arapitsas, Panagiotis; Masuero, Domenico; Weingart, Georg; Carlin, Silvia; Vrhovsek, Urska; Mattivi, Fulvio; Wehrens, Ron
2014-01-01
Due to their sensitivity and speed, mass-spectrometry-based analytical technologies are widely used in metabolomics to characterize biological phenomena. To address issues like metadata organization, quality assessment, data processing, data storage, and, finally, submission to public repositories, bioinformatic pipelines of a non-interactive nature are often employed, complementing the interactive software used for initial inspection and visualization of the data. These pipelines are often created as open-source software, allowing the complete and exhaustive documentation of each step and ensuring the reproducibility of the analysis of extensive and often expensive experiments. In this paper, we review the major steps that constitute such a data processing pipeline, discussing them in the context of an open-source software for untargeted MS-based metabolomics experiments recently developed at our institute. The software has been developed by integrating our metaMS R package with a user-friendly web-based application written in Grails. metaMS takes care of data pre-processing and annotation, while the interface deals with the creation of the sample lists, the organization of the data storage, and the generation of survey plots for quality assessment. Experimental and biological metadata are stored in the ISA-Tab format, making the proposed pipeline fully integrated with the Metabolights framework.
Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers.
Al Handawi, Khalil; Vahdati, Nader; Shiryayev, Oleg; Lawand, Lydia
2017-09-28
Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber's modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.
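As a numeric aside on the OTDR localization step described above, the round-trip time delay of the back-scattered pulse maps directly to distance along the fiber. The sketch below assumes a typical group refractive index of about 1.468 for silica fiber; that value, and the example delay, are assumptions rather than figures from the paper.

```python
C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s

def otdr_event_distance(delay_s, group_index=1.468):
    """Distance to a loss event from the round-trip OTDR delay.

    The pulse travels to the event and back, hence the factor of 2;
    light travels at c / group_index inside the fiber core.
    """
    return C_VACUUM * delay_s / (2.0 * group_index)

# Example: a 10-microsecond round-trip delay locates an event ~1021 m away.
print(f"{otdr_event_distance(10e-6):.1f} m")
```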
Bailit, Howard L
2010-10-01
Disparities in access to dental care are a major problem in the United States. Effectively run community-based dental education programs can make a significant contribution to reducing access disparities and at the same time enrich the educational experiences of dental students and residents. For complex historical reasons, dental schools did not base their clinical training programs in community hospitals and clinics like the other health professions. Now, because of trends in school finances, changes in societal values, and limitations in current educational experiences, schools are increasing the time students spend in community clinics. This is likely to continue. The chapters in the first section of the report on the Pipeline, Profession, and Practice: Community-Based Dental Education program (for which this chapter serves as an introduction) provide detailed information on the operation of community-based education programs.
Adaptation of a program for nonlinear finite element analysis to the CDC STAR 100 computer
NASA Technical Reports Server (NTRS)
Pifko, A. B.; Ogilvie, P. L.
1978-01-01
The conversion of a nonlinear finite element program to the CDC STAR 100 pipeline computer is discussed. The program, called DYCAST, was developed for the crash simulation of structures. Initial results with the STAR 100 computer indicated that significant gains in computation time are possible for operations on global arrays. However, for element-level computations that do not lend themselves easily to long vector processing, the STAR 100 was slower than comparable scalar computers. On this basis it is concluded that, in order for pipeline computers to impact the economic feasibility of large nonlinear analyses, it is absolutely essential that algorithms be devised to improve the efficiency of element-level computations.
Jiang, Yue; Xiong, Xuejian; Danska, Jayne; Parkinson, John
2016-01-12
Metatranscriptomics is emerging as a powerful technology for the functional characterization of complex microbial communities (microbiomes). Use of unbiased RNA-sequencing can reveal both the taxonomic composition and active biochemical functions of a complex microbial community. However, the lack of established reference genomes, computational tools and pipelines make analysis and interpretation of these datasets challenging. Systematic studies that compare data across microbiomes are needed to demonstrate the ability of such pipelines to deliver biologically meaningful insights on microbiome function. Here, we apply a standardized analytical pipeline to perform a comparative analysis of metatranscriptomic data from diverse microbial communities derived from mouse large intestine, cow rumen, kimchi culture, deep-sea thermal vent and permafrost. Sequence similarity searches allowed annotation of 19 to 76% of putative messenger RNA (mRNA) reads, with the highest frequency in the kimchi dataset due to its relatively low complexity and availability of closely related reference genomes. Metatranscriptomic datasets exhibited distinct taxonomic and functional signatures. From a metabolic perspective, we identified a common core of enzymes involved in amino acid, energy and nucleotide metabolism and also identified microbiome-specific pathways such as phosphonate metabolism (deep sea) and glycan degradation pathways (cow rumen). Integrating taxonomic and functional annotations within a novel visualization framework revealed the contribution of different taxa to metabolic pathways, allowing the identification of taxa that contribute unique functions. The application of a single, standard pipeline confirms that the rich taxonomic and functional diversity observed across microbiomes is not simply an artefact of different analysis pipelines but instead reflects distinct environmental influences. At the same time, our findings show how microbiome complexity and availability of reference genomes can impact comprehensive annotation of metatranscriptomes. Consequently, beyond the application of standardized pipelines, additional caution must be taken when interpreting their output and performing downstream, microbiome-specific, analyses. The pipeline used in these analyses along with a tutorial has been made freely available for download from our project website: http://www.compsysbio.org/microbiome .
Lammers, Youri; Peelen, Tamara; Vos, Rutger A; Gravendeel, Barbara
2014-02-06
Mixtures of internationally traded organic substances can contain parts of species protected by the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). These mixtures often raise the suspicion of border control and customs offices, which can lead to confiscation, for example in the case of Traditional Chinese medicines (TCMs). High-throughput sequencing of DNA barcoding markers obtained from such samples provides insight into species constituents of mixtures, but manual cross-referencing of results against the CITES appendices is labor intensive. Matching DNA barcodes against NCBI GenBank using BLAST may yield misleading results both as false positives, due to incorrectly annotated sequences, and false negatives, due to spurious taxonomic re-assignment. Incongruence between the taxonomies of CITES and NCBI GenBank can result in erroneous estimates of illegal trade. The HTS barcode checker pipeline is an application for automated processing of sets of 'next generation' barcode sequences to determine whether these contain DNA barcodes obtained from species listed on the CITES appendices. This analytical pipeline builds upon and extends existing open-source applications for BLAST matching against the NCBI GenBank reference database and for taxonomic name reconciliation. In a single operation, reads are converted into taxonomic identifications matched with names on the CITES appendices. By inclusion of a blacklist and additional names databases, the HTS barcode checker pipeline prevents false positives and resolves taxonomic heterogeneity. The HTS barcode checker pipeline can detect and correctly identify DNA barcodes of CITES-protected species from reads obtained from TCM samples in just a few minutes. The pipeline facilitates and improves molecular monitoring of trade in endangered species, and can aid in safeguarding these species from extinction in the wild. The HTS barcode checker pipeline is available at https://github.com/naturalis/HTS-barcode-checker.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grafe, J.L.
During the past decade many changes have taken place in the natural gas industry, not the least of which is the way information (data) is acquired, moved, compiled, integrated and disseminated within organizations. At El Paso Natural Gas Company (EPNG) the Operations Control Department has been at the center of these changes. The Systems Section within Operations Control has been instrumental in developing the computer programs that acquire and store real-time operational data, and then make it available not only to the Gas Control function, but also to anyone else within the company who might require it and, to a limited degree, to any supplier or purchaser of gas utilizing the El Paso pipeline. These computer programs, which make up the VISA system, are, in effect, the tools that help move the data that flows in the pipeline of information within the company. Their integration into this pipeline process is the topic of this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wynne, Adam S.
2011-05-05
In many application domains in science and engineering, data produced by sensors, instruments and networks is naturally processed by software applications structured as a pipeline. Pipelines comprise a sequence of software components that progressively process discrete units of data to produce a desired outcome. For example, in a Web crawler that is extracting semantics from text on Web sites, the first stage in the pipeline might be to remove all HTML tags to leave only the raw text of the document. The second step may parse the raw text to break it down into its constituent grammatical parts, such as nouns, verbs and so on. Subsequent steps may look for names of people or places, interesting events or times so documents can be sequenced on a time line. Each of these steps can be written as a specialized program that works in isolation with other steps in the pipeline. In many applications, simple linear software pipelines are sufficient. However, more complex applications require topologies that contain forks and joins, creating pipelines comprising branches where parallel execution is desirable. It is also increasingly common for pipelines to process very large files or high volume data streams which impose end-to-end performance constraints. Additionally, processes in a pipeline may have specific execution requirements and hence need to be distributed as services across a heterogeneous computing and data management infrastructure. From a software engineering perspective, these more complex pipelines become problematic to implement. While simple linear pipelines can be built using minimal infrastructure such as scripting languages, complex topologies and large, high volume data processing requires suitable abstractions, run-time infrastructures and development tools to construct pipelines with the desired qualities-of-service and flexibility to evolve to handle new requirements. The above summarizes the reasons we created the MeDICi Integration Framework (MIF) that is designed for creating high-performance, scalable and modifiable software pipelines. MIF exploits a low friction, robust, open source middleware platform and extends it with component and service-based programmatic interfaces that make implementing complex pipelines simple. The MIF run-time automatically handles queues between pipeline elements in order to handle request bursts, and automatically executes multiple instances of pipeline elements to increase pipeline throughput. Distributed pipeline elements are supported using a range of configurable communications protocols, and the MIF interfaces provide efficient mechanisms for moving data directly between two distributed pipeline elements.
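MIF itself is middleware-based and its API is not shown in the abstract; as a language-agnostic illustration of the queued, multi-instance pipeline pattern the passage describes, the following Python sketch connects two stages with bounded queues and runs several worker instances of the slower stage. The stage functions, queue sizes, and worker counts are invented for the example.

```python
import queue
import threading

q_in, q_mid, q_out = (queue.Queue(maxsize=8) for _ in range(3))
STOP = object()  # sentinel telling a worker to shut down

def strip_tags(doc):   # stage 1: invented fast step (cf. the crawler example)
    return doc.replace("<p>", "").replace("</p>", "")

def parse(text):       # stage 2: invented slow step
    return text.split()

def worker(src, fn, dst):
    while True:
        item = src.get()
        if item is STOP:
            break
        dst.put(fn(item))  # the queue absorbs bursts between stages

# One instance of the fast stage, three of the slow one (throughput knob).
threads = [threading.Thread(target=worker, args=(q_in, strip_tags, q_mid))]
threads += [threading.Thread(target=worker, args=(q_mid, parse, q_out))
            for _ in range(3)]
for t in threads:
    t.start()

for doc in ["<p>alpha beta</p>", "<p>gamma</p>"]:
    q_in.put(doc)
for _ in range(2):
    print(q_out.get())

q_in.put(STOP)           # stop the single stage-1 worker
for _ in range(3):
    q_mid.put(STOP)      # stop each stage-2 worker
for t in threads:
    t.join()
```

Running extra instances of the slow stage, as MIF's run-time does automatically, is what raises end-to-end throughput without restructuring the pipeline.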
Welding and NDT development in support of Oman-India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Even, T.M.; Laing, B.; Hirsch, D.
1995-12-01
The Oman to India gas pipeline is designed for a maximum water depth of 3,500 m. For such a pipeline, resistance to hydrostatic collapse is a critical factor and dictates that very heavy wall pipe be used, preliminarily 24-inch ID x 1.625-inch wall. Because of the water depth, much of the installation will be by J-lay, which requires that the joint be welded and inspected in a single station. This paper describes the results of welding and NDT test programs conducted to determine the minimum time to perform these operations on heavy wall pipe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Van Dyke, G.D.; Zimmerman, R.E.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of a survey conducted August 17--19, 1992, at the Norris Brook crossing in the town of Peabody, Essex County, Massachusetts. The pipeline at this site was installed during September and October 1990. A backhoe was used to install the pipeline. The pipe was assembled on the adjacent upland and slid into the trench, after which the backhoe was used again to fill the trench and cover the pipeline. Within two years after pipeline construction, a dense vegetative community, composed predominantly of native perennial species, had become established on the ROW. Compared with adjacent natural areas undisturbed by pipeline installation, there was an increase in purple loosestrife and cattail within the ROW, while large woody species were excluded from the ROW. As a result of the ROW's presence, habitat diversity, edge-type habitat, and species diversity increased within the site. Crooked-stem aster, Aster prenanthoides (a species on the Massachusetts list of plants of special concern), occurred in low numbers in the adjacent natural areas and had reinvaded the ROW in low numbers.
The Stanford Medical Youth Science Program: Educational and Science-Related Outcomes
ERIC Educational Resources Information Center
Crump, Casey; Ned, Judith; Winkleby, Marilyn A.
2015-01-01
Biomedical preparatory programs (pipeline programs) have been developed at colleges and universities to better prepare youth for entering science- and health-related careers, but outcomes of such programs have seldom been rigorously evaluated. We conducted a matched cohort study to evaluate the Stanford Medical Youth Science Program's Summer…
System for corrosion monitoring in pipeline applying fuzzy logic mathematics
NASA Astrophysics Data System (ADS)
Kuzyakov, O. N.; Kolosova, A. L.; Andreeva, M. A.
2018-05-01
A list of factors influencing the corrosion rate on the external side of an underground pipeline is determined. Principles of constructing a corrosion monitoring system are described; the system's performance algorithm and program are elaborated. A comparative analysis of methods for calculating the corrosion rate is undertaken. Fuzzy logic mathematics is applied to reduce the amount of calculation while considering a wider range of corrosion factors.
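The paper's actual rule base is not given in the abstract; as a generic sketch of the fuzzy-logic approach it describes, the following scores a relative corrosion rate from two invented input factors (soil moisture and cathodic-protection potential offset) using triangular memberships, min-activation rules, and weighted-centroid defuzzification. All memberships, rules, and output centroids here are assumptions for illustration.

```python
def tri(x, a, b, c):
    """Triangular membership function: rises on [a, b], falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def corrosion_rate(moisture, potential_offset):
    # Invented memberships for the two factors (both on 0..1 scales).
    wet = tri(moisture, 0.3, 0.7, 1.1)
    dry = tri(moisture, -0.1, 0.2, 0.5)
    poor_cp = tri(potential_offset, 0.4, 0.8, 1.2)
    good_cp = tri(potential_offset, -0.2, 0.1, 0.6)

    # Rules (min models AND); each rule fires toward an output centroid.
    rules = [
        (min(wet, poor_cp), 0.9),   # high corrosion rate
        (min(wet, good_cp), 0.5),   # medium
        (min(dry, poor_cp), 0.4),   # medium-low
        (min(dry, good_cp), 0.1),   # low
    ]
    total = sum(w for w, _ in rules)
    # Weighted-centroid defuzzification; 0.0 if no rule fires.
    return sum(w * c for w, c in rules) / total if total else 0.0

print(f"relative corrosion rate: {corrosion_rate(0.8, 0.9):.2f}")
```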
ERIC Educational Resources Information Center
Russell, Melody L.; Atwater, Mary M.
2005-01-01
This study focuses on 11 African American undergraduate seniors in a biology degree program at a predominantly white research institution in the southeastern United States. These 11 respondents shared their journeys throughout the high school and college science pipeline. Participants described similar precollege factors and experiences that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stencel, J.M.; Ochsenbein, M.P.
2003-04-14
The KY DOE EPSCoR Program included efforts to positively impact the pipeline of science and engineering students and to establish research, education, and business infrastructure sustainable beyond DOE EPSCoR funding.
The American Science Pipeline: Sustaining Innovation in a Time of Economic Crisis
Hue, Gillian; Sales, Jessica; Comeau, Dawn; Lynn, David G.
2010-01-01
Significant limitations have emerged in America's science training pipeline, including inaccessibility, inflexibility, financial limitations, and lack of diversity. We present three effective programs that collectively address these challenges. The programs are grounded in rigorous science and integrate through diverse disciplines across undergraduate, graduate, and postdoctoral students, and resonate with the broader community. We discuss these models in the context of current economic constraints on higher education and the urgent need for our institutions to recruit and retain diverse student populations and sustain the successful American record in scientific education and innovation. PMID:21123689
"A Hidden Part of Me": Latino/a Students, Silencing, and the Epidermalization of Inferiority
ERIC Educational Resources Information Center
Irizarry, Jason G.; Raible, John
2014-01-01
Using Critical Race Theory (CRT) and Latino/a Critical Race Theory (LatCrit) as analytical tools, this article examines the experiences of seven Latino/a high school students at various points of engagement with the school-to-prison pipeline. Building on and extending Frantz Fanon's (1952) concept of the epidermalization of inferiority, the…
NASA Astrophysics Data System (ADS)
Wang, Xun; Ghidaoui, Mohamed S.
2018-07-01
This paper considers the problem of identifying multiple leaks in a water-filled pipeline based on inverse transient wave theory. The analytical solution to this problem involves nonlinear interaction terms between the various leaks. This paper shows analytically and numerically that these nonlinear terms are of the order of the leak sizes to the power two and are thus negligible. As a result of this simplification, a maximum likelihood (ML) scheme that identifies leak locations and leak sizes separately is formulated and tested. It is found that the ML estimation scheme is highly efficient and robust with respect to noise. In addition, the ML method is a super-resolution leak localization scheme because its resolvable leak distance (approximately $0.15\lambda_{\min}$, where $\lambda_{\min}$ is the minimum wavelength) is below the Nyquist-Shannon sampling theorem limit ($0.5\lambda_{\min}$). Moreover, the Cramér-Rao lower bound (CRLB) is derived and used to show the efficiency of the ML scheme's estimates. The variance of the ML estimator approaches the CRLB, proving that the ML scheme belongs to the class of best unbiased estimators among leak localization methods.
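For reference, the abstract does not reproduce the bound itself; the standard form of the Cramér-Rao lower bound for an unbiased estimator, which the paper derives for its specific leak parameters, is the following (the notation here is generic, not the paper's):

```latex
% Generic Cramér-Rao lower bound for an unbiased estimator \hat{\theta} of \theta:
\operatorname{Cov}(\hat{\theta}) \succeq I(\theta)^{-1},
\qquad
[I(\theta)]_{ij} = \mathbb{E}\!\left[
  \frac{\partial \ln p(\mathbf{x};\theta)}{\partial \theta_i}\,
  \frac{\partial \ln p(\mathbf{x};\theta)}{\partial \theta_j}
\right]
% An unbiased estimator whose covariance attains this bound is efficient;
% the paper reports that the ML leak estimates approach it.
```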
Tissue-aware RNA-Seq processing and normalization for heterogeneous and sparse data.
Paulson, Joseph N; Chen, Cho-Yi; Lopes-Ramos, Camila M; Kuijjer, Marieke L; Platig, John; Sonawane, Abhijeet R; Fagny, Maud; Glass, Kimberly; Quackenbush, John
2017-10-03
Although ultrahigh-throughput RNA-Sequencing has become the dominant technology for genome-wide transcriptional profiling, the vast majority of RNA-Seq studies typically profile only tens of samples, and most analytical pipelines are optimized for these smaller studies. However, projects are generating ever-larger data sets comprising RNA-Seq data from hundreds or thousands of samples, often collected at multiple centers and from diverse tissues. These complex data sets present significant analytical challenges due to batch and tissue effects, but provide the opportunity to revisit the assumptions and methods that we use to preprocess, normalize, and filter RNA-Seq data - critical first steps for any subsequent analysis. We find that analysis of large RNA-Seq data sets requires both careful quality control and methods that account for the sparsity arising from the heterogeneity intrinsic to multi-group studies. We developed the Yet Another RNA Normalization software pipeline (YARN), which includes quality control and preprocessing, gene filtering, and normalization steps designed to facilitate downstream analysis of large, heterogeneous RNA-Seq data sets, and we demonstrate its use with data from the Genotype-Tissue Expression (GTEx) project. An R package instantiating YARN is available at http://bioconductor.org/packages/yarn.
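YARN itself is an R/Bioconductor package; as a language-neutral sketch of the filter-then-normalize flow the abstract describes, the following Python fragment drops sparsely expressed genes and quantile-normalizes the remainder (the thresholds and toy count matrix are assumptions, not YARN's defaults).

    import numpy as np

    def filter_and_normalize(counts, min_count=10, min_samples=3):
        # counts: genes x samples matrix of raw read counts.
        keep = (counts >= min_count).sum(axis=1) >= min_samples   # drop sparse genes
        kept = counts[keep]
        # Quantile normalization: give every sample the same empirical distribution.
        ranks = np.argsort(np.argsort(kept, axis=0), axis=0)
        mean_of_sorted = np.sort(kept, axis=0).mean(axis=1)
        return mean_of_sorted[ranks], keep

    rng = np.random.default_rng(1)
    toy = rng.poisson(20, size=(100, 6)) * rng.integers(0, 2, size=(100, 1))
    normalized, kept = filter_and_normalize(toy)
    print(normalized.shape, int(kept.sum()))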
Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi
2013-04-10
Sequencing of microbial genomes is important because microbes carry genes conferring antibiotic and pathogenic activities. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenic or antibiotic genes are carried in genomic islands. Therefore, a quick genomic island (GI) prediction method is useful for genomes that are still being sequenced. In this work, we built a Web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembling tool, a functional annotation pipeline, and a high-performance GI-predicting module based on a support vector machine (SVM) method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our Web server, which returns functional annotation and highly probable GI predictions. GI-POP is a comprehensive annotation Web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information, including possible GIs, coding/non-coding sequences, and functional analysis, from their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
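A minimal sketch of the profile-scanning idea behind GI-GPS, not its actual implementation: windows are featurized by dinucleotide composition and an SVM trained on labeled examples flags island-like windows. The synthetic training windows and the composition feature are assumptions made for illustration.

    import numpy as np
    from collections import Counter
    from itertools import product
    from sklearn.svm import SVC

    KMERS = ["".join(p) for p in product("ACGT", repeat=2)]
    rng = np.random.default_rng(2)

    def features(seq):
        # Dinucleotide composition; islands often differ in composition.
        pairs = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
        total = max(sum(pairs.values()), 1)
        return np.array([pairs[k] / total for k in KMERS])

    def synth_windows(gc, n=150, length=1000):
        # Synthetic stand-ins for labeled backbone/island training windows.
        p = [(1 - gc) / 2, (1 - gc) / 2, gc / 2, gc / 2]   # A, T, G, C
        return ["".join(rng.choice(list("ATGC"), size=length, p=p)) for _ in range(n)]

    backbone, island = synth_windows(0.55), synth_windows(0.40)
    X = np.array([features(s) for s in backbone + island])
    y = np.array([0] * len(backbone) + [1] * len(island))
    clf = SVC(kernel="rbf").fit(X, y)

    test = synth_windows(0.40, n=3)
    print(clf.predict(np.array([features(s) for s in test])))   # expect [1 1 1]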
NASA Astrophysics Data System (ADS)
Stefan Devlin, Benjamin; Nakura, Toru; Ikeda, Makoto; Asada, Kunihiro
We detail a self-synchronous field programmable gate array (SSFPGA) with a dual-pipeline (DP) architecture to conceal pre-charge time for dynamic logic, and its throughput optimization by pipeline alignment implemented on benchmark circuits. A self-synchronous LUT (SSLUT) consists of a three-input tree-type structure with 8 bits of SRAM for programming. A self-synchronous switch box (SSSB) consists of both pass transistors and buffers to route signals, with 12 bits of SRAM. One common block with one SSLUT and one SSSB occupies 2.2 Mλ² of area with 35 bits of SRAM, and a prototype SSFPGA with 34 × 30 (1020) blocks was designed and fabricated in 65 nm CMOS. Measured results at 1.2 V show 430 MHz and 647 MHz operation for a 3-bit ripple carry adder, without and with throughput optimization, respectively. Using the proposed pipeline alignment techniques we achieve a maximum throughput of 647 MHz across various benchmarks on the SSFPGA, with up to 56.1 times throughput improvement. The pipeline alignment is carried out within the number of logic elements in the array and pipeline buffers in the switching matrix.
Parameters of Solidifying Mixtures Transporting at Underground Ore Mining
NASA Astrophysics Data System (ADS)
Golik, Vladimir; Dmitrak, Yury
2017-11-01
The article is devoted to the problem of providing mining enterprises with solidifying filling mixtures for underground mining. The results of analytical studies drawing on foreign and domestic practice of delivering solidifying mixtures to stopes are given. On the basis of experimental practice, the parameters of transporting solidifying filling mixtures are given, with an increase in mixture quality achieved through the effect of vibration in the pipeline. The mechanism of the delivery process is detailed, along with the procedure for determining the parameters of the forced oscillations of the pipeline, the characteristics of the transporting processes, the rigidity of the elastic elements of the pipeline section supports, and the magnitude of the vibrator's driving force. It is determined that the quality of solidifying filling mixtures can be increased by the rational use of technical resources during transportation, with the result that the mixtures exhibit a more even distribution of the aggregate. The algorithm for calculating the parameters of pipe vibro-transport of solidifying filling mixtures may be in demand in the design of underground mining technology for mineral deposits.
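For the forced-oscillation parameters mentioned above, a single-degree-of-freedom stand-in makes the calculation concrete: the steady-state amplitude of a pipe section of mass m on supports of stiffness k and damping c, driven by a harmonic vibrator force F0 at angular frequency w, is F0 / sqrt((k - m*w^2)^2 + (c*w)^2). This is the textbook relation, not the paper's procedure, and the numbers below are hypothetical.

    import math

    def amplitude(F0, m, k, c, omega):
        # Steady-state amplitude of a 1-DOF oscillator under harmonic forcing.
        return F0 / math.sqrt((k - m * omega ** 2) ** 2 + (c * omega) ** 2)

    # Hypothetical numbers: 200 kg pipe section, 5e5 N/m support stiffness,
    # 800 N*s/m damping, 1 kN vibrator force, swept over drive frequencies.
    m, k, c = 200.0, 5e5, 800.0
    for f_hz in (5.0, 8.0, 10.0):          # resonance near sqrt(k/m)/(2*pi) ~ 8 Hz
        w = 2.0 * math.pi * f_hz
        print(f_hz, amplitude(1e3, m, k, c, w))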
Smart, Kathleen F; Aggio, Raphael B M; Van Houtte, Jeremy R; Villas-Bôas, Silas G
2010-09-01
This protocol describes an analytical platform for the analysis of intra- and extracellular metabolites of microbial cells (yeast, filamentous fungi and bacteria) using gas chromatography-mass spectrometry (GC-MS). The protocol is subdivided into sampling, sample preparation, chemical derivatization of metabolites, GC-MS analysis and data processing and analysis. This protocol uses two robust quenching methods for microbial cultures: the first, cold glycerol-saline quenching, causes reduced leakage of intracellular metabolites, thus allowing a more reliable separation of intra- and extracellular metabolites while simultaneously stopping cell metabolism; the second, fast filtration, is specifically designed for quenching filamentous micro-organisms. These sampling techniques are combined with an easy sample-preparation procedure and a fast chemical derivatization reaction using methyl chloroformate. This reaction takes place at room temperature, in aqueous medium, and is less prone to matrix effects than other derivatizations. This protocol takes an average of 10 d to complete and enables the simultaneous analysis of hundreds of metabolites from central carbon metabolism (amino and nonamino organic acids, phosphorylated organic acids and fatty acid intermediates) using an in-house MS library and a data analysis pipeline consisting of two free software programs (the Automated Mass Deconvolution and Identification System (AMDIS) and R).
Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly-available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud. PMID:26271043
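The multi-caller integration step can be pictured as a simple vote over normalized variant keys; ExScalibur's real merging logic is richer, and the caller names and calls below are purely illustrative.

    from collections import Counter

    def consensus(callsets, min_callers=2):
        # callsets: caller name -> set of (chrom, pos, ref, alt) variant keys.
        votes = Counter(v for calls in callsets.values() for v in set(calls))
        return sorted(v for v, n in votes.items() if n >= min_callers)

    calls = {
        "callerA": {("chr7", 55259515, "T", "G"), ("chr1", 100, "A", "C")},
        "callerB": {("chr7", 55259515, "T", "G")},
        "callerC": {("chr7", 55259515, "T", "G"), ("chr2", 200, "G", "T")},
    }
    print(consensus(calls))   # only the variant seen by >= 2 callers survives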
Liu, Shanlin; Yang, Chentao; Zhou, Chengran; Zhou, Xin
2017-12-01
Over the past decade, biodiversity researchers have dedicated tremendous efforts to constructing DNA reference barcodes for rapid species registration and identification. Although analytical cost for standard DNA barcoding has been significantly reduced since early 2000, further dramatic reduction in barcoding costs is unlikely because Sanger sequencing is approaching its limits in throughput and chemistry cost. Constraints in barcoding cost not only led to unbalanced barcoding efforts around the globe, but also prevented high-throughput sequencing (HTS)-based taxonomic identification from applying binomial species names, which provide crucial linkages to biological knowledge. We developed an Illumina-based pipeline, HIFI-Barcode, to produce full-length Cytochrome c oxidase subunit I (COI) barcodes from pooled polymerase chain reaction amplicons generated by individual specimens. The new pipeline generated accurate barcode sequences that were comparable to Sanger standards, even for different haplotypes of the same species that were only a few nucleotides different from each other. Additionally, the new pipeline was much more sensitive in recovering amplicons at low quantity. The HIFI-Barcode pipeline successfully recovered barcodes from more than 78% of the polymerase chain reactions that didn't show clear bands on the electrophoresis gel. Moreover, sequencing results based on the single molecular sequencing platform Pacbio confirmed the accuracy of the HIFI-Barcode results. Altogether, the new pipeline can provide an improved solution to produce full-length reference barcodes at about one-tenth of the current cost, enabling construction of comprehensive barcode libraries for local fauna, leading to a feasible direction for DNA barcoding global biomes. © The Authors 2017. Published by Oxford University Press.
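One core step, deriving a consensus barcode from many reads of the same amplicon, reduces to a position-wise majority vote once reads are aligned; HIFI-Barcode's actual algorithm additionally handles pooled demultiplexing and quality weighting, so this fragment is only a sketch of the consensus idea.

    from collections import Counter

    def consensus(reads):
        # Position-wise majority vote over equal-length, aligned reads.
        return "".join(Counter(col).most_common(1)[0][0] for col in zip(*reads))

    reads = ["ACGTACGT", "ACGTACGA", "ACGAACGT", "ACGTACGT"]
    print(consensus(reads))   # ACGTACGT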
49 CFR 198.37 - State one-call damage prevention program.
Code of Federal Regulations, 2010 CFR
2010-10-01
REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS, Adoption of One-Call Damage Prevention Program, § 198.37 State one-call damage prevention program. A State must adopt a one-call damage prevention...
Users Manual for the Dynamic Student Flow Model.
1981-07-31
populations within each pipeline are reasonably homogeneous and the pipeline curriculum provides a structured path along which the student must progress...curriculum is structured, student populations are non-homogeneous. They are drawn from diverse sources such as the Naval Academy, NROTC and the Aviation...Officer Candidate program in numbers subjectively determined to provide the best population for subsequent flight training. Historically, different
Pipeline Optimization Program (PLOP)
2006-08-01
the framework of the Dredging Operations Decision Support System (DODSS, https://dodss.wes.army.mil/wiki/0). PLOP compiles industry standards and...efficiency point (BEP). In the interest of acceptable wear rate on the pump, industrial standards dictate that the flow (Figure 2: Pump class as a function of...) percentage of the flow rate corresponding to the BEP. Pump Acceptability Rules. The facts for pump performance, industrial standards and pipeline and
49 CFR 192.913 - When may an operator deviate its program from certain requirements of this subpart?
Code of Federal Regulations, 2011 CFR
2011-10-01
... management program. An operator that uses a performance-based approach that satisfies the requirements for... to demonstrate the exceptional performance of its integrity management program through the following... to the operator's pipeline system and to the operator's integrity management program; (vi) A...
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
Data processing pipeline for serial femtosecond crystallography at SACLA.
Nakane, Takanori; Joti, Yasumasa; Tono, Kensuke; Yabashi, Makina; Nango, Eriko; Iwata, So; Ishitani, Ryuichiro; Nureki, Osamu
2016-06-01
A data processing pipeline for serial femtosecond crystallography at SACLA was developed, based on Cheetah [Barty et al. (2014). J. Appl. Cryst. 47, 1118-1131] and CrystFEL [White et al. (2016). J. Appl. Cryst. 49, 680-689]. The original programs were adapted for data acquisition through the SACLA API, thread and inter-node parallelization, and efficient image handling. The pipeline consists of two stages: the first, online stage can analyse all images in real time, with a latency of less than a few seconds, to provide feedback on hit rate and detector saturation. The second, offline stage converts hit images into HDF5 files and runs CrystFEL for indexing and integration. The size of the filtered compressed output is comparable to that of a synchrotron data set. The pipeline enables real-time feedback and rapid structure solution during beamtime.
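The online stage's hit-rate feedback can be pictured as a rolling statistic over recent images. A minimal sketch, assuming a Boolean hit flag arrives per image from an upstream hit finder; this is not the SACLA code.

    from collections import deque

    class HitRateMonitor:
        # Rolling hit rate over the last n images, as an online stage might report.
        def __init__(self, n=1000):
            self.window = deque(maxlen=n)

        def update(self, is_hit):
            self.window.append(bool(is_hit))
            return sum(self.window) / len(self.window)

    monitor = HitRateMonitor(n=500)
    for flag in (True, False, False, True):   # per-image flags from a hit finder
        rate = monitor.update(flag)
    print(rate)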
Practical Approach for Hyperspectral Image Processing in Python
NASA Astrophysics Data System (ADS)
Annala, L.; Eskelinen, M. A.; Hämäläinen, J.; Riihinen, A.; Pölönen, I.
2018-04-01
Python is a very popular programming language among data scientists around the world, and it can also be used in hyperspectral data analysis. Some toolboxes are designed for spectral imaging, such as Spectral Python and HyperSpy, but there is a need for an analysis pipeline that is easy to use and agile enough for different solutions. We propose a Python pipeline built on the packages xarray, Holoviews and scikit-learn. We have also developed tools of our own, MaskAccessor, VisualisorAccessor and a spectral index library, which fulfill our goal of easy and agile data processing. In this paper we present our processing pipeline and demonstrate it in practice.
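A minimal sketch in the spirit of the proposed stack, assuming a toy cube in place of real imagery: xarray holds the labeled cube, pixels are flattened into the sample-by-feature table scikit-learn expects, and PCA plus k-means produce a cluster map (the Holoviews visualization layer is omitted, and the authors' accessors are not reproduced).

    import numpy as np
    import xarray as xr
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    cube = xr.DataArray(rng.random((50, 60, 40)), dims=("y", "x", "band"))

    # Flatten pixels into the (n_samples, n_features) table scikit-learn expects.
    pixels = cube.stack(pixel=("y", "x")).transpose("pixel", "band").values

    scores = PCA(n_components=5).fit_transform(pixels)        # reduce bands
    labels = KMeans(n_clusters=4, n_init=10).fit_predict(scores)
    label_map = labels.reshape(cube.sizes["y"], cube.sizes["x"])
    print(label_map.shape)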
Guo, Li; Allen, Kelly S; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M; Wick, Robert L; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host-pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems.
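The host/pathogen separation at the heart of such a metatranscriptomic pipeline can be sketched as assigning each read to whichever reference it aligns to better; the alignment scores are assumed to come from an upstream aligner, and the toy values are illustrative.

    def partition_reads(scores):
        # scores: read_id -> (host_score, pathogen_score) from an upstream aligner.
        host, pathogen, ambiguous = [], [], []
        for read, (h, p) in scores.items():
            (host if h > p else pathogen if p > h else ambiguous).append(read)
        return host, pathogen, ambiguous

    toy = {"r1": (60, 12), "r2": (0, 55), "r3": (40, 40)}
    print(partition_reads(toy))   # (['r1'], ['r2'], ['r3'])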
The Stanford Medical Youth Science Program: educational and science-related outcomes.
Crump, Casey; Ned, Judith; Winkleby, Marilyn A
2015-05-01
Biomedical preparatory programs (pipeline programs) have been developed at colleges and universities to better prepare youth for entering science- and health-related careers, but outcomes of such programs have seldom been rigorously evaluated. We conducted a matched cohort study to evaluate the Stanford Medical Youth Science Program's Summer Residential Program (SRP), a 25-year-old university-based biomedical pipeline program that reaches out to low-income and underrepresented ethnic minority high school students. Five annual surveys were used to assess educational outcomes and science-related experience among 96 SRP participants and a comparison group of 192 youth who applied but were not selected to participate in the SRP, using ~2:1 matching on sociodemographic and academic background to control for potential confounders. SRP participants were more likely than the comparison group to enter college (100.0 vs. 84.4 %, p = 0.002), and both of these matriculation rates were more than double the statewide average (40.8 %). In most areas of science-related experience, SRP participants reported significantly more experience (>twofold odds) than the comparison group at 1 year of follow-up, but these differences did not persist after 2-4 years. The comparison group reported substantially more participation in science or college preparatory programs, more academic role models, and less personal adversity than SRP participants, which likely influenced these findings toward the null hypothesis. SRP applicants, irrespective of whether selected for participation, had significantly better educational outcomes than population averages. Short-term science-related experience was better among SRP participants, although longer-term outcomes were similar, most likely due to college and science-related opportunities among the comparison group. We discuss implications for future evaluations of other biomedical pipeline programs.
Exposure of Seventh and Eighth Grade Urban Youth to Dentistry and Oral Health Careers.
Mayberry, Melanie E; Young, Deirdre D; Sawilowsky, Shlomo; Hoelscher, Diane
2018-01-01
While pipeline programs for students from underrepresented minority groups have been established at the high school and college levels, fewer programs have been developed for middle school students. In an effort to reach this cohort, the University of Detroit Mercy School of Dentistry embarked on a grassroots collaborative pipeline program with two distinct segments: Urban Impressions and Dental Imprint. Their purpose is to expose Detroit-area seventh and eighth grade students to careers in dentistry, provide oral health education, and introduce role models. The aim of this pilot study was to determine outcomes for the middle school participants in Urban Impressions (n=86) and Dental Imprint (n=68). Both segments featured hands-on dental activities at the dental school. Outcomes were assessed by pretest-posttest surveys. Across the three cohorts, a total of 86 students participated in one or more sessions, with 57 completing the pre- and post-program surveys, for a 66% response rate. The results showed that the Dental Imprint respondents' knowledge of oral health, dental admissions, and specialties increased by an average of 26% over three years. The gain in knowledge for each cohort was statistically significant (p<0.001). Overall, 91% of Urban Impressions and 95% of Dental Imprint respondents were positive about the value of the program. Thirty-one of 57 Urban Impressions respondents indicated interest in dentistry as a career following the program. These results suggest that the two segments of this program are meeting their goals of increasing middle grade students' awareness of oral health professions including dentistry and providing access to role models. Institutions may benefit from the description of strategies used by this program to address challenges related to establishing early pipeline programs.
A Controlled Evaluation of a High School Biomedical Pipeline Program: Design and Methods
NASA Astrophysics Data System (ADS)
Winkleby, Marilyn A.; Ned, Judith; Ahn, David; Koehler, Alana; Fagliano, Kathleen; Crump, Casey
2014-02-01
Given limited funding for school-based science education, non-school-based programs have been developed at colleges and universities to increase the number of students entering science- and health-related careers and address critical workforce needs. However, few evaluations of such programs have been conducted. We report the design and methods of a controlled trial to evaluate the Stanford Medical Youth Science Program's Summer Residential Program (SRP), a 25-year-old university-based biomedical pipeline program. This 5-year matched cohort study uses an annual survey to assess educational and career outcomes among four cohorts of students who participate in the SRP and a matched comparison group of applicants who were not chosen to participate in the SRP. Matching on sociodemographic and academic background allows control for potential confounding. This design enables the testing of whether the SRP has an independent effect on educational- and career-related outcomes above and beyond the effects of other factors such as gender, ethnicity, socioeconomic background, and pre-intervention academic preparation. The results will help determine which curriculum components contribute most to successful outcomes and which students benefit most. After 4 years of follow-up, the results demonstrate high response rates from SRP participants and the comparison group with completion rates near 90 %, similar response rates by gender and ethnicity, and little attrition with each additional year of follow-up. This design and methods can potentially be replicated to evaluate and improve other biomedical pipeline programs, which are increasingly important for equipping more students for science- and health-related careers.
US Army Research Laboratory Joint Interagency Field Experimentation 15-2 Final Report
2015-12-01
February 2015, at Alameda Island, California. Advanced text analytics capabilities were demonstrated in a logically coherent workflow pipeline that... text processing capabilities allowed the targeted use of a persistent imagery sensor for rapid detection of mission-critical events. The creation of...a very large text database from open source data provides a relevant and unclassified foundation for continued development of text-processing
Stepping Stones to Research: Providing Pipelines from Middle School through PhD
NASA Astrophysics Data System (ADS)
Noel-Storr, Jacob; Baum, S. A.; RIT Insight Lab SSR Team; Carlson CenterImaging Science Faculty, Chester F.
2014-01-01
We present a decade's worth of strategies designed to promote and provide "Stepping Stones to Research": a realistic pipeline of educational opportunities, with multiple gateways and exit points, for students moving toward STEM careers along the "STEM pipeline". We also illustrate how the Stepping Stones are designed to coincide with related external opportunities through which we can guide and support our mentees on their paths. We present programs such as middle school family science programs, high school research opportunities, high school internships, undergraduate research pathways, research experiences for undergraduates, and other opportunities. We highlight the presentations being made at this very meeting -- from the first presentation of a high school student to a dissertation presentation of a PhD graduate -- that have benefited from this stepping stone principle. We also reflect on the essential nature of building a "researcher-trust", even as a young student, of advocates and mentors who can support the continuation of a scientific career.
Adaptations to a New Physical Training Program in the Combat Controller Training Pipeline
2010-09-01
education regarding optimizing recovery through hydration and nutrition. We designed and implemented a short class that explained the benefits of pre...to poor nutrition and hydration practices. Finally, many of the training methods employed throughout the pipeline were outdated, non-periodized, and...contributing to overtraining. Creation of a nutrition and hydration class. Apart from being told to drink copious amounts of water, trainees had little
MEGAnnotator: a user-friendly pipeline for microbial genomes assembly and annotation.
Lugli, Gabriele Andrea; Milani, Christian; Mancabelli, Leonardo; van Sinderen, Douwe; Ventura, Marco
2016-04-01
Genome annotation is one of the key actions that must be undertaken in order to decipher the genetic blueprint of organisms. Thus, a correct and reliable annotation is essential in rendering genomic data valuable. Here, we describe a bioinformatics pipeline based on freely available software programs coordinated by a multithreaded script named MEGAnnotator (Multithreaded Enhanced prokaryotic Genome Annotator). This pipeline allows the generation of multiple annotated formats fulfilling the NCBI guidelines for assembled microbial genome submission, based on DNA shotgun sequencing reads; it minimizes manual intervention, reduces waiting times between software program executions, and improves the final quality of both assembly and annotation outputs. MEGAnnotator provides an efficient way to pre-arrange the assembly and annotation work required to process NGS genome sequence data. The script improves the final quality of microbial genome annotation by reducing ambiguous annotations. Moreover, the MEGAnnotator platform allows the user to perform a partial annotation of pre-assembled genomes and includes an option to accomplish metagenomic data set assemblies. The MEGAnnotator platform will be useful for microbiologists interested in genome analyses of bacteria, as well as for those investigating the complexity of microbial communities, who do not possess the necessary skills to prepare their own bioinformatics pipeline. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Van Dyke, G.D.; Zimmerman, R.E.
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of surveys conducted July 14-18, 1992, at the Deep Creek and the Brandy Branch crossings of a pipeline installed during May 1991 in Nassau County, Florida. Both floodplains supported bottomland hardwood forests. The pipeline at the Deep Creek crossing was installed by means of horizontal directional drilling after the ROW had been clear-cut, while the pipeline at the Brandy Branch crossing was installed by means of conventional open trenching. Neither site was seeded or fertilized. At the time of sampling, a dense vegetative community, made up primarily of native perennial herbaceous species, occupied the ROW within the Deep Creek floodplain. The Brandy Branch ROW was vegetated by a less dense stand of primarily native perennial herbaceous plants. Plant diversity was also lower at the Brandy Branch crossing than at the Deep Creek crossing. The results suggest that some of the differences in plant communities are related to the more hydric conditions at the Brandy Branch floodplain.
Telluric currents: A meeting of theory and observation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boteler, D.H.; Seager, W.H.
Pipe-to-soil (P/S) potential variations resulting from telluric currents have been observed on pipelines in many locations. However, it has never been clear which parts of a pipeline will experience the worst effects. Two studies were conducted to answer this question. Distributed-source transmission line (DSTL) theory was applied to the problem of modeling geomagnetic induction in pipelines. This theory predicted that the largest P/S potential variations would occur at the ends of the pipeline. The theory also predicted that large P/S potential variations, of opposite sign, should occur on either side of an insulating flange. Independently, an observation program was conducted to determine the change in telluric current P/S potential variations and to design counteractive measures along a pipeline in northern Canada. Observations showed that the amplitude of P/S potential fluctuations had maxima at the northern and southern ends of the pipeline. A further set of recordings around an insulating flange showed large P/S potential variations, of opposite sign, on either side of the flange. Agreement between the observations and theoretical predictions was remarkable. While the observations confirmed the theory, the theory explains how P/S potential variations are produced by telluric currents and provides the basis for design of cathodic protection systems for pipelines that can counteract any adverse telluric effects.
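The end-of-line prediction follows from the standard DSTL solution for a uniform electric field E over a pipeline of length L with propagation constant gamma: V(x) = (E/gamma) * sinh(gamma*(x - L/2)) / cosh(gamma*L/2), which is antisymmetric about the midpoint and largest (with opposite signs) at the two ends. A numeric sketch with illustrative values, not the survey's parameters:

    import numpy as np

    E = 1.0        # V/km, induced geoelectric field (illustrative)
    gamma = 0.05   # 1/km, propagation constant of the coated pipeline
    L = 200.0      # km, pipeline length
    x = np.linspace(0.0, L, 401)

    V = (E / gamma) * np.sinh(gamma * (x - L / 2)) / np.cosh(gamma * L / 2)
    print(V[0], V[-1])   # extremes of opposite sign at the two ends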
Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline
Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur
2010-01-01
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408
NGSANE: a lightweight production informatics framework for high-throughput data analysis.
Buske, Fabian A; French, Hugh J; Smith, Martin A; Clark, Susan J; Bauer, Denis C
2014-05-15
The initial steps in the analysis of next-generation sequencing data can be automated by way of software 'pipelines'. However, individual components rapidly become outdated because of evolving technology and analysis methods, often rendering entire versions of production informatics pipelines obsolete. Constructing pipelines from Linux bash commands enables the use of hot-swappable modular components, as opposed to the more rigid program-call wrapping by higher-level languages implemented in comparable published pipelining systems. Here we present Next Generation Sequencing ANalysis for Enterprises (NGSANE), a Linux-based, high-performance-computing-enabled framework that minimizes overhead for setup and processing of new projects, yet maintains full flexibility of custom scripting when processing raw sequence data. NGSANE is implemented in bash and publicly available under the BSD (3-Clause) licence via GitHub at https://github.com/BauerLab/ngsane. Contact: Denis.Bauer@csiro.au. Supplementary data are available at Bioinformatics online.
Cohen Freue, Gabriela V.; Meredith, Anna; Smith, Derek; Bergman, Axel; Sasaki, Mayu; Lam, Karen K. Y.; Hollander, Zsuzsanna; Opushneva, Nina; Takhar, Mandeep; Lin, David; Wilson-McManus, Janet; Balshaw, Robert; Keown, Paul A.; Borchers, Christoph H.; McManus, Bruce; Ng, Raymond T.; McMaster, W. Robert
2013-01-01
Recent technical advances in the field of quantitative proteomics have stimulated a large number of biomarker discovery studies of various diseases, providing avenues for new treatments and diagnostics. However, inherent challenges have limited the successful translation of candidate biomarkers into clinical use, thus highlighting the need for a robust analytical methodology to transition from biomarker discovery to clinical implementation. We have developed an end-to-end computational proteomic pipeline for biomarker studies. At the discovery stage, the pipeline emphasizes different aspects of experimental design, appropriate statistical methodologies, and quality assessment of results. At the validation stage, the pipeline focuses on the migration of the results to a platform appropriate for external validation, and the development of a classifier score based on corroborated protein biomarkers. At the last stage towards clinical implementation, the main aims are to develop and validate an assay suitable for clinical deployment, and to calibrate the biomarker classifier using the developed assay. The proposed pipeline was applied to a biomarker study in cardiac transplantation aimed at developing a minimally invasive clinical test to monitor acute rejection. Starting with an untargeted screening of the human plasma proteome, five candidate biomarker proteins were identified. Rejection-regulated proteins reflect cellular and humoral immune responses, acute phase inflammatory pathways, and lipid metabolism biological processes. A multiplex multiple reaction monitoring mass-spectrometry (MRM-MS) assay was developed for the five candidate biomarkers and validated by enzyme-linked immunosorbent assay (ELISA) and immunonephelometric assay (INA). A classifier score based on corroborated proteins demonstrated that the developed MRM-MS assay provides an appropriate methodology for an external validation, which is still in progress. Plasma proteomic biomarkers of acute cardiac rejection may offer a relevant post-transplant monitoring tool to effectively guide clinical care. The proposed computational pipeline is highly applicable to a wide range of biomarker proteomic studies. PMID:23592955
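The "classifier score based on corroborated proteins" can be sketched as a regularized logistic model over the five-protein panel; the simulated table below stands in for real MRM-MS measurements and is not the study's data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Simulated stand-in for the panel: rows = plasma samples, columns = the
    # five candidate proteins; label 1 = rejection, 0 = non-rejection.
    rng = np.random.default_rng(4)
    y = np.repeat([0, 1], 30)
    X = rng.normal(size=(60, 5)) + np.outer(y, [1.0, 0.5, 0.0, -0.5, 1.0])

    clf = LogisticRegression().fit(X, y)
    new_sample = rng.normal(size=(1, 5))
    print(float(clf.predict_proba(new_sample)[0, 1]))   # the classifier score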
A graph-based approach for designing extensible pipelines
2012-01-01
Background: In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development such as workflow management systems focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism in pipeline composition is necessary when each execution requires a different combination of steps. Results: We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline. The system has been tested on Linux and Windows platforms. Conclusions: Our graph-based approach enables the automatic creation of pipelines by compiling a specialised set of tools on demand, depending on the functionality required. It also allows the implementation of extensible and low-maintenance pipelines and contributes towards consolidating openness and collaboration in bioinformatics systems. It is targeted at pipeline developers and is suited for implementing applications with sequential execution steps and combined functionalities. In the format conversion application, the automatic combination of conversion tools increased both the number of possible conversions available to the user and the extensibility of the system to allow for future updates with new file formats. PMID:22788675
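The edges-are-tools representation makes pipeline composition a path search. A minimal Python sketch with hypothetical converter names (the paper's system, implemented in Java, adds data structures beyond this):

    from collections import deque

    # Nodes are file formats; edges are conversion tools (hypothetical names).
    TOOLS = {
        ("ped", "vcf"): "ped2vcf",
        ("vcf", "bed"): "vcf2bed",
        ("bed", "csv"): "bed2csv",
    }

    def compose_pipeline(src, dst):
        # Breadth-first search over the format graph returns a tool sequence.
        graph = {}
        for (a, b), tool in TOOLS.items():
            graph.setdefault(a, []).append((b, tool))
        queue, seen = deque([(src, [])]), {src}
        while queue:
            fmt, path = queue.popleft()
            if fmt == dst:
                return path
            for nxt, tool in graph.get(fmt, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [tool]))
        return None

    print(compose_pipeline("ped", "csv"))   # ['ped2vcf', 'vcf2bed', 'bed2csv']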
Miller, Mark P.; Knaus, Brian J.; Mullins, Thomas D.; Haig, Susan M.
2013-01-01
SSR_pipeline is a flexible set of programs designed to efficiently identify simple sequence repeats (e.g., microsatellites) from paired-end high-throughput Illumina DNA sequencing data. The program suite contains 3 analysis modules along with a fourth control module that can automate analyses of large volumes of data. The modules are used to 1) identify the subset of paired-end sequences that pass Illumina quality standards, 2) align paired-end reads into a single composite DNA sequence, and 3) identify sequences that possess microsatellites (both simple and compound) conforming to user-specified parameters. The microsatellite search algorithm is extremely efficient, and we have used it to identify repeats with motifs from 2 to 25bp in length. Each of the 3 analysis modules can also be used independently to provide greater flexibility or to work with FASTQ or FASTA files generated from other sequencing platforms (Roche 454, Ion Torrent, etc.). We demonstrate use of the program with data from the brine fly Ephydra packardi (Diptera: Ephydridae) and provide empirical timing benchmarks to illustrate program performance on a common desktop computer environment. We further show that the Illumina platform is capable of identifying large numbers of microsatellites, even when using unenriched sample libraries and a very small percentage of the sequencing capacity from a single DNA sequencing run. All modules from SSR_pipeline are implemented in the Python programming language and can therefore be used from nearly any computer operating system (Linux, Macintosh, and Windows).
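Since SSR_pipeline is written in Python, its repeat-search step can be sketched directly with the standard re module; the backreference pattern below finds perfect tandem repeats, though the published module's algorithm and parameters may differ.

    import re

    def find_ssrs(seq, min_unit=2, max_unit=6, min_repeats=4):
        # Report perfect tandem repeats as (start, motif, copies).
        hits = []
        for unit in range(min_unit, max_unit + 1):
            pattern = r"([ACGT]{%d})\1{%d,}" % (unit, min_repeats - 1)
            for m in re.finditer(pattern, seq):
                hits.append((m.start(), m.group(1), len(m.group(0)) // unit))
        return hits

    print(find_ssrs("TTACACACACACGGATGATGATGATGCC"))
    # [(2, 'AC', 5), (14, 'ATG', 4)]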
An acceleration system for Laplacian image fusion based on SoC
NASA Astrophysics Data System (ADS)
Gao, Liwen; Zhao, Hongtu; Qu, Xiujie; Wei, Tianbo; Du, Peng
2018-04-01
Based on an analysis of the Laplacian image fusion algorithm, this paper proposes a partial pipelining and modular processing architecture, and a SoC-based acceleration system is implemented accordingly. Full pipelining is used in the design of each module, and modules in series form the partial pipeline with a unified data format, which is easy to manage and reuse. Integrated with an ARM processor, DMA and an embedded bare-metal program, the system implements a 4-level Laplacian pyramid on the Zynq-7000 board. Experiments show that, with small resource consumption, a pair of 256×256 images can be fused within 1 ms while maintaining a fine fusion effect.
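The algorithm being accelerated decomposes each image into a Laplacian pyramid, fuses detail levels by keeping the stronger coefficient, and reconstructs. A software reference sketch in Python/NumPy follows; a 5-point average stands in for the Gaussian filter, and the hardware pipeline itself is not modeled.

    import numpy as np

    def blur(img):
        # 5-point neighborhood average standing in for the Gaussian low-pass.
        p = np.pad(img, 1, mode="edge")
        return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2]
                + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0

    def laplacian_pyramid(img, levels=4):
        pyr = []
        for _ in range(levels - 1):
            low = blur(img)[::2, ::2]                       # downsample
            up = np.repeat(np.repeat(low, 2, 0), 2, 1)      # naive upsample
            pyr.append(img - up[:img.shape[0], :img.shape[1]])
            img = low
        pyr.append(img)                                     # coarsest level
        return pyr

    def fuse(a, b, levels=4):
        pa, pb = laplacian_pyramid(a, levels), laplacian_pyramid(b, levels)
        # Keep the stronger detail coefficient; average the coarsest level.
        out = [np.where(np.abs(x) >= np.abs(y), x, y) for x, y in zip(pa[:-1], pb[:-1])]
        out.append((pa[-1] + pb[-1]) / 2.0)
        img = out[-1]
        for lap in reversed(out[:-1]):                      # reconstruct
            up = np.repeat(np.repeat(img, 2, 0), 2, 1)[:lap.shape[0], :lap.shape[1]]
            img = up + lap
        return img

    rng = np.random.default_rng(5)
    a, b = rng.random((256, 256)), rng.random((256, 256))
    print(fuse(a, b).shape)   # (256, 256)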
MIEC-SVM: automated pipeline for protein peptide/ligand interaction prediction.
Li, Nan; Ainsworth, Richard I; Wu, Meixin; Ding, Bo; Wang, Wei
2016-03-15
MIEC-SVM is a structure-based method for predicting protein recognition specificity. Here, we present an automated MIEC-SVM pipeline providing an integrated and user-friendly workflow for construction and application of the MIEC-SVM models. This pipeline can handle standard amino acids and those with post-translational modifications (PTMs) or small molecules. Moreover, multi-threading and support for Sun Grid Engine (SGE) are implemented to significantly boost the computational efficiency. The program is available at http://wanglab.ucsd.edu/MIEC-SVM. Contact: wei-wang@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Employing Machine-Learning Methods to Study Young Stellar Objects
NASA Astrophysics Data System (ADS)
Moore, Nicholas
2018-01-01
Vast amounts of data exist in the astronomical data archives, and yet a large number of sources remain unclassified. We developed a multi-wavelength pipeline to classify infrared sources. The pipeline uses supervised machine learning methods to classify objects into the appropriate categories. The program is fed data that is already classified to train it, and is then applied to unknown catalogues. The primary use for such a pipeline is the rapid classification and cataloging of data that would take a much longer time to classify otherwise. While our primary goal is to study young stellar objects (YSOs), the applications extend beyond the scope of this project. We present preliminary results from our analysis and discuss future applications.
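A minimal sketch of the train-on-labeled, apply-to-unknown workflow described above, assuming hypothetical infrared colors as features and toy labels derived from them; a random forest is used here, and the project's actual classifier and feature set may differ.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(6)
    # Hypothetical features: infrared colors, e.g. J-H, H-K, [3.6]-[4.5], [5.8]-[8.0].
    X_train = rng.normal(size=(300, 4))
    # Toy labels (0=star, 1=galaxy, 2=YSO) derived from the colors so the
    # classifier has something learnable; real labels come from catalogues.
    combo = X_train @ np.array([1.0, 0.8, 0.5, 0.3])
    y_train = np.digitize(combo, np.quantile(combo, [0.33, 0.66]))

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

    X_unknown = rng.normal(size=(5, 4))                 # the unclassified catalogue
    print(clf.predict(X_unknown), clf.predict_proba(X_unknown).max(axis=1))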
Hintzsche, Jennifer; Kim, Jihye; Yadav, Vinod; Amato, Carol; Robinson, Steven E; Seelenfreund, Eric; Shellman, Yiqun; Wisell, Joshua; Applegate, Allison; McCarter, Martin; Box, Neil; Tentler, John; De, Subhajyoti; Robinson, William A; Tan, Aik Choon
2016-07-01
Currently, there is a disconnect between finding a patient's relevant molecular profile and predicting actionable therapeutics. Here we develop and implement the Integrating Molecular Profiles with Actionable Therapeutics (IMPACT) analysis pipeline, linking variants detected from whole-exome sequencing (WES) to actionable therapeutics. The IMPACT pipeline contains 4 analytical modules: detecting somatic variants, calling copy number alterations, predicting drugs against deleterious variants, and analyzing tumor heterogeneity. We tested the IMPACT pipeline on whole-exome sequencing data in The Cancer Genome Atlas (TCGA) lung adenocarcinoma samples with known EGFR mutations. We also used IMPACT to analyze melanoma patient tumor samples before treatment, after BRAF-inhibitor treatment, and after BRAF- and MEK-inhibitor treatment. IMPACT correctly identified the known EGFR mutations in the TCGA lung adenocarcinoma samples and linked them to the appropriate Food and Drug Administration (FDA)-approved EGFR inhibitors. For the melanoma patient samples, we identified NRAS p.Q61K as an acquired resistance mutation to BRAF-inhibitor treatment. We also identified CDKN2A deletion as a novel acquired resistance mutation to BRAFi/MEKi inhibition. The IMPACT analysis pipeline links these somatic variants to actionable therapeutics. We observed the clonal dynamics in the tumor samples after various treatments. We showed that IMPACT not only helped in successful prioritization of clinically relevant variants but also linked these variations to possible targeted therapies. IMPACT provides a new bioinformatics strategy to delineate candidate somatic variants and actionable therapies. This approach can be applied to other patient tumor samples to discover effective drug targets for personalized medicine. IMPACT is publicly available at http://tanlab.ucdenver.edu/IMPACT. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
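The variant-to-therapeutic linking step can be pictured as a lookup against a curated knowledge base; the two-entry table below uses well-known gene-drug pairs purely for illustration and is not IMPACT's database.

    # Hypothetical two-entry knowledge base; IMPACT's curated mapping is larger.
    ACTIONABLE = {
        ("EGFR", "L858R"): ["erlotinib", "gefitinib"],
        ("BRAF", "V600E"): ["vemurafenib", "dabrafenib"],
    }

    def link_therapies(variants):
        # variants: iterable of (gene, protein_change) pairs from a WES caller.
        return {v: ACTIONABLE[v] for v in variants if v in ACTIONABLE}

    print(link_therapies([("EGFR", "L858R"), ("NRAS", "Q61K")]))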
Teachers-in-Residence: New Pathways into the Profession. Ask the Team
ERIC Educational Resources Information Center
Han, Grace; Doyle, Daniela
2013-01-01
Teacher residency programs are a relatively new method for building stronger teacher pipelines. Research assessing the impact of these programs is still limited, but some early reports suggest that residency programs hold promise for improving teacher effectiveness and retention rates (Barrett, Hovde, Hahn, & Rosqueta, 2011; Papay, West,…
Emory U. Trains Its Own Leaders
ERIC Educational Resources Information Center
Selingo, Jeffrey J.
2009-01-01
This article describes Emory University's Excellence Through Leadership program. Started in 2006, the yearlong program is designed to help up to 20 administrators and faculty members annually improve their leadership skills, as well as create a pipeline to eventually replace senior leaders at the institution. Emory's leadership program is just one…
Tappis, Hannah; Doocy, Shannon; Amoako, Stephen
2013-01-01
Despite decades of support for international food assistance programs by the U.S. Agency for International Development (USAID) Office of Food for Peace, relatively little is known about the commodity pipeline and management issues these programs face in post-conflict and politically volatile settings. Based on an audit of the program's commodity tracking system and interviews with 13 key program staff, this case study documents the experiences of organizations implementing the first USAID-funded non-emergency (development) food assistance program approved for Sudan and South Sudan. Key challenges and lessons learned in this experience about food commodity procurement, transport, and management may help improve the design and implementation of future development food assistance programs in a variety of complex, food-insecure settings around the world. Specifically, expanding shipping routes in complex political situations may facilitate reliable and timely commodity delivery. In addition, greater flexibility to procure commodities locally, rather than shipping U.S.-procured commodities, may avoid unnecessary shipping delays and reduce costs. PMID:25276532
30 CFR 817.180 - Utility installations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PERMANENT PROGRAM PERFORMANCE STANDARDS PERMANENT PROGRAM PERFORMANCE STANDARDS-UNDERGROUND MINING ACTIVITIES § 817.180 Utility installations. All underground mining activities shall be conducted in a manner...; oil, gas, and coal-slurry pipelines, railroads; electric and telephone lines; and water and sewage...
Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai
2017-11-23
The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing is able to provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short, deep sequencing reads corresponding to selected highly variable viral sequences that are drug target(s). The suggested bioinformatic pipeline combines available programs with ad hoc scripts based on an original algorithm for searching for conserved targets in deep sequencing data. We also present statistical criteria for the threshold of reliable mutation detection and for the assessment of variations between corresponding data sets. These criteria are robust against possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available for download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
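The conservation test at the heart of such a pipeline can be sketched as counting, at each position of a candidate target, the fraction of aligned reads that disagree with the reference; a target is kept if no site exceeds a chosen mutation-frequency threshold. A minimal Python sketch, assuming reads are already aligned to the target (the 5% threshold is illustrative, not the paper's derived criterion):

    # Per-site mismatch frequencies over reads aligned to a candidate target site.
    def site_mutation_frequencies(reference, aligned_reads):
        """aligned_reads: equal-length strings; '-' marks positions without coverage."""
        freqs = []
        for i, ref_base in enumerate(reference):
            covering = [r[i] for r in aligned_reads if r[i] != '-']
            mismatches = sum(1 for base in covering if base != ref_base)
            freqs.append(mismatches / len(covering) if covering else 0.0)
        return freqs

    def is_conserved(freqs, threshold=0.05):
        # Treat the target as conserved if every site stays below the threshold.
        return all(f <= threshold for f in freqs)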
Guo, Li; Allen, Kelly S.; Deiulio, Greg; Zhang, Yong; Madeiras, Angela M.; Wick, Robert L.; Ma, Li-Jun
2016-01-01
Current and emerging plant diseases caused by obligate parasitic microbes such as rusts, downy mildews, and powdery mildews threaten worldwide crop production and food safety. These obligate parasites are typically unculturable in the laboratory, posing technical challenges to characterize them at the genetic and genomic level. Here we have developed a data analysis pipeline integrating several bioinformatic software programs. This pipeline facilitates rapid gene discovery and expression analysis of a plant host and its obligate parasite simultaneously by next generation sequencing of mixed host and pathogen RNA (i.e., metatranscriptomics). We applied this pipeline to metatranscriptomic sequencing data of sweet basil (Ocimum basilicum) and its obligate downy mildew parasite Peronospora belbahrii, both lacking a sequenced genome. Even with a single data point, we were able to identify both candidate host defense genes and pathogen virulence genes that are highly expressed during infection. This demonstrates the power of this pipeline for identifying genes important in host–pathogen interactions without prior genomic information for either the plant host or the obligate biotrophic pathogen. The simplicity of this pipeline makes it accessible to researchers with limited computational skills and applicable to metatranscriptomic data analysis in a wide range of plant-obligate-parasite systems. PMID:27462318
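The pivotal step, attributing each assembled transcript to host or parasite, can be pictured as comparing homology scores against reference sets from related plant and oomycete species. A schematic Python sketch under that assumption (the function name and score cutoff are illustrative):

    # Assign an assembled transcript to host or parasite by its best homology hit.
    def assign_transcript(host_hit_score, parasite_hit_score, min_score=50.0):
        """Scores would come from, e.g., BLAST bit scores against the two reference sets."""
        if max(host_hit_score, parasite_hit_score) < min_score:
            return "unassigned"
        return "host" if host_hit_score >= parasite_hit_score else "parasite"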
NASA Astrophysics Data System (ADS)
Muggleton, J. M.; Rustighi, E.; Gao, Y.
2016-09-01
Waves that propagate at low frequencies in buried pipes are of considerable interest in a variety of practical scenarios, for example leak detection, remote pipe detection, and pipeline condition assessment and monitoring. Particularly useful are the n = 0, or axisymmetric, modes in which there is no displacement (or pressure) variation over the pipe cross section. Previous work has focused on two of the three axisymmetric wavetypes that can propagate: the s = 1, fluid- dominated wave; and the s = 2, shell-dominated wave. In this paper, the third axisymmetric wavetype, the s = 0 torsional wave, is studied. Whilst there is a large body of research devoted to the study of torsional waves and their use for defect detection in pipes at ultrasonic frequencies, little is known about their behaviour and possible exploitation at lower frequencies. Here, a low- frequency analytical dispersion relationship is derived for the torsional wavenumber for a buried pipe from which both the wavespeed and wave attenuation can be obtained. How the torsional waves subsequently radiate to the ground surface is then investigated, with analytical expressions being presented for the ground surface displacement above the pipe resulting from torsional wave motion within the pipe wall. Example results are presented and, finally, how such waves might be exploited in practice is discussed.
Acosta, David; Olsen, Polly
2006-10-01
Minority populations in the United States are growing rapidly, but physician workforce diversity has not kept pace with the needs of underserved communities. Minorities comprised 26.4% of the population in 1995; by 2050, these groups will comprise nearly half. Medical schools must enlist greater numbers of minority physicians and train all physicians to provide culturally responsive care. The University of Washington School of Medicine (UWSOM) is the nation's only medical school that serves a five-state region (Washington, Wyoming, Alaska, Montana, and Idaho). Its mission addresses the need to serve the region, rectify primary care shortages, and meet increasing regional demands for underserved populations. The UWSOM Native American Center of Excellence (NACOE) was established as one important way to respond to this charge. The authors describe pipeline and minority recruitment programs at UWSOM, focusing on the NACOE and other activities to recruit American Indian/Alaskan Native (AI/AN) applicants to medical schools. These programs have increased the numbers of AI/AN medical students; developed the Indian Health Pathway; worked to prepare students to provide culturally responsive care for AI/AN communities; researched health disparities specific to AI/AN populations; provided retention programs and services to ensure successful completion of medical training; developed mentorship networks; and provided faculty-development programs to increase entry of AI/AN physicians into academia. Challenges lie ahead. Barriers to the pipeline will continue to plague students, and inadequate federal funding will have a significant and negative impact on achieving needed physician-workforce diversity. Medical schools must play a larger role in resolving these, and continue to provide pipeline programs, retention programs, and minority faculty development that can make a difference.
Lin, Shuo; Wang, Wei; Ju, Xiao-Jie; Xie, Rui; Liu, Zhuang; Yu, Hai-Rong; Zhang, Chuan; Chu, Liang-Yin
2016-02-23
Real-time online detection of trace threat analytes is critical for global sustainability, whereas the key challenge is how to efficiently convert and amplify analyte signals into simple readouts. Here we report an ultrasensitive microfluidic platform incorporating a smart microgel for real-time online detection of trace threat analytes. The microgel can swell in response to a specific stimulus in a flowing solution, resulting in efficient conversion of the stimulus signal into a significantly amplified flow-rate change; thus highly sensitive, fast, and selective detection can be achieved. We demonstrate this by incorporating an ion-recognizable microgel for detecting trace Pb(2+), and connecting our platform with pipelines of tap water and wastewater for real-time online Pb(2+) detection to achieve timely pollution warning and termination. This work provides a generalizable platform for incorporating myriad stimuli-responsive microgels to achieve ever-better performance for real-time online detection of various trace threat molecules, and may expand the scope of applications of detection techniques.
Development of a Pipeline for Exploratory Metabolic Profiling of Infant Urine
Jackson, Frances; Georgakopoulou, Nancy; Kaluarachchi, Manuja; Kyriakides, Michael; Andreas, Nicholas; Przysiezna, Natalia; Hyde, Matthew J.; Modi, Neena; Nicholson, Jeremy K.; Wijeyesekera, Anisha; Holmes, Elaine
2017-01-01
Numerous metabolic profiling pipelines have been developed to characterize the composition of human biofluids and tissues, the vast majority of these being for studies in adults. To accommodate limited sample volume and to take into account the compositional differences between adult and infant biofluids, we developed and optimized sample handling and analytical procedures for studying urine from newborns. A robust pipeline for metabolic profiling using NMR spectroscopy was established, encompassing sample collection, preparation, spectroscopic measurement, and computational analysis. Longitudinal samples were collected from five infants from birth until 14 months of age. Methods of extraction and effects of freezing and sample dilution were assessed, and urinary contaminants from breakdown of polymers in a range of diapers and cotton wool balls were identified and compared, including propylene glycol, acrylic acid, and tert-butanol. Finally, assessment of urinary profiles obtained over the first few weeks of life revealed a dramatic change in composition, with concentrations of phenols, amino acids, and betaine altering systematically over the first few months of life. Therefore, neonatal samples require more stringent standardization of experimental design, sample handling, and analysis compared to that of adult samples to accommodate the variability and limited sample volume. PMID:27476583
Double-pulse laser-induced breakdown spectroscopy analysis of scales from petroleum pipelines
NASA Astrophysics Data System (ADS)
Cavalcanti, G. H.; Rocha, A. A.; Damasceno, R. N.; Legnaioli, S.; Lorenzetti, G.; Pardini, L.; Palleschi, V.
2013-09-01
Pipeline scales from the Campos Bay Petroleum Field near Rio de Janeiro, Brazil have been analyzed by both Raman spectroscopy and by laser-induced breakdown spectroscopy (LIBS) using a double-pulse, calibration-free approach. Elements that are characteristic of petroleum (e.g. C, H, N, O, Mg, Na, Fe and V) were detected, in addition to the Ca, Al, and Si which form the matrix of the scale. The LIBS results were compared with the results of micro-Raman spectroscopy, which confirmed the nature of the incrustations inferred by the LIBS analysis. Results of this preliminary study suggest that diffusion of pipe material into the pipeline intake column plays an important role in the growth of scale. Thanks to the simplicity and relative low cost of equipment and to the fact that no special chemical pre-treatment of the samples is needed, LIBS can offer very fast acquisition of data and the possibility of in situ measurements. LIBS could thus represent an alternative or complementary method for the chemical characterization of the scales by comparison to conventional analytical techniques, such as X-ray diffraction or X-ray fluorescence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Van Dyke, G.D.; Zimmerman, R.E.
1994-12-01
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of a survey conducted over the period of August 3-4, 1992, at the Cassadaga wetlands crossing in Gerry Township, Chautauqua County, New York. The pipeline at this site was installed during February and March 1981. After completion of pipeline installation, the ROW was fertilized, mulched, and seeded with annual ryegrass. Two adjacent sites were surveyed in this study: a forested wetland and an emergent wetland. Eleven years after pipeline installation, the ROW at both sites supported diverse vegetative communities. Although devoid of large woody species, the ROW within the forested wetland had a dense vegetative cover. The ROW within the emergent wetland had a slightly less dense and more diverse vegetative community compared with that in the adjacent natural areas (NAs). The ROW within the emergent wetland also had a large number of introduced species that were not present in the adjacent NAs. The ROW, with its emergent marsh plant community, provided habitat diversity within the forested wetland. Because the ROW contained species not found within the adjacent NAs, overall species diversity was increased.
Watson, Christopher M; Camm, Nick; Crinnion, Laura A; Clokie, Samuel; Robinson, Rachel L; Adlard, Julian; Charlton, Ruth; Markham, Alexander F; Carr, Ian M; Bonthron, David T
2017-12-01
Diagnostic genetic testing programmes based on next-generation DNA sequencing have resulted in the accrual of large datasets of targeted raw sequence data. Most diagnostic laboratories process these data through an automated variant-calling pipeline. Validation of the chosen analytical methods typically depends on confirming the detection of known sequence variants. Despite improvements in short-read alignment methods, current pipelines are known to be comparatively poor at detecting large insertion/deletion mutations. We performed clinical validation of a local reassembly tool, ABRA (assembly-based realigner), through retrospective reanalysis of a cohort of more than 2000 hereditary cancer cases. ABRA enabled detection of a 96-bp deletion, 4-bp insertion mutation in PMS2 that had been initially identified using a comparative read-depth approach. We applied an updated pipeline incorporating ABRA to the entire cohort of 2000 cases and identified one previously undetected pathogenic variant, a 23-bp duplication in PTEN. We demonstrate the effect of read length on the ability to detect insertion/deletion variants by comparing HiSeq2500 (2 × 101-bp) and NextSeq500 (2 × 151-bp) sequence data for a range of variants and thereby show that the limitations of shorter read lengths can be mitigated using appropriate informatics tools. This work highlights the need for ongoing development of diagnostic pipelines to maximize test sensitivity. We also draw attention to the large differences in computational infrastructure required to perform day-to-day versus large-scale reprocessing tasks.
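The comparative read-depth approach mentioned for the original PMS2 finding can be sketched as normalizing a sample's per-exon coverage against a batch median and flagging exons whose ratio falls toward 0.5, as expected for a heterozygous deletion. A minimal Python sketch; the ratio window is an illustrative assumption:

    # Flag candidate heterozygous deletions from normalized per-exon read depth.
    import statistics

    def depth_ratios(sample_depths, batch_depths):
        """sample_depths: {exon: depth}; batch_depths: {exon: [depths across samples]}."""
        return {exon: sample_depths[exon] / statistics.median(batch_depths[exon])
                for exon in sample_depths}

    def flag_deletions(ratios, low=0.35, high=0.65):
        # A heterozygous deletion roughly halves coverage, so ratios cluster near 0.5.
        return sorted(exon for exon, r in ratios.items() if low <= r <= high)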
ERIC Educational Resources Information Center
Dixon, John; Girifalco, Tony; Yakabosky, Walt
2008-01-01
This article describes the Applied Engineering Technology (AET) Career and Educational Pathways Program, which helps local manufacturers find quality workers. The program features 32 high schools, three community colleges, and 10 four-year institutions offering an integrated regional system of applied engineering education. The goal is to enroll…
Program for At-Risk Students Helps College, Too
ERIC Educational Resources Information Center
Carlson, Scott
2012-01-01
The author introduces a new program that brings city kids who really need college to a private rural campus that really needs kids. Under the program, called Pipelines Into Partnership, a handful of urban high schools and community organizations--the groups that know their kids beyond the black and white of their transcripts--determine which…
18 CFR 357.5 - Cash management programs.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Cash management...: CARRIERS SUBJECT TO PART I OF THE INTERSTATE COMMERCE ACT § 357.5 Cash management programs. Oil pipeline... and § 357.2 of this title that participate in cash management programs must file these agreements with...
18 CFR 357.5 - Cash management programs.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Cash management...: CARRIERS SUBJECT TO PART I OF THE INTERSTATE COMMERCE ACT § 357.5 Cash management programs. Oil pipeline... and § 357.2 of this title that participate in cash management programs must file these agreements with...
Thurmond, V B; Cregler, L L
1999-04-01
To track gifted underrepresented minority (URM) students who entered the pipeline to health professional school when they were in high school and to determine whether and why students left the pipeline to enter other professions. A questionnaire was mailed to 162 students who had participated in the Student Educational Enrichment Program (SEEP) in health sciences at the Medical College of Georgia between 1984 and 1991; 123 (75%) responded. Students in the study population had higher graduation rates than the average state or national student. Fifty-nine (48%) of the students had entered health care careers; 98% had stated that intention when they were in high school. Although some of the students cited trouble with course work and GPA as reasons for their decisions to change career tracks, many students said that their interests in non-medical careers had been fostered by mentors or by opportunities to serve internships. Early intervention is important to retaining students in a pipeline that leads to a health care career. Summer programs are successful, but may not be enough to help students with difficult science courses in college, especially chemistry. Another important conclusion, however, is that much more needs to be done to help students find mentors with whom they can develop relationships and to give them opportunities to work in health care settings.
Measuring the Success of a Pipeline Program to Increase Nursing Workforce Diversity.
Katz, Janet R; Barbosa-Leiker, Celestina; Benavides-Vaello, Sandra
2016-01-01
The purpose of this study was to understand changes in knowledge and opinions of underserved American Indian and Hispanic high school students after attending a 2-week summer pipeline program, using and testing a pre/postsurvey. The research aims were to (a) psychometrically analyze the survey to determine if scale items could be summed to create a total scale score or subscale scores; (b) assess change in scores pre/postprogram; and (c) examine the survey to make suggestions for modifications and further testing to develop a valid tool to measure changes in student perceptions about going to college and nursing as a result of pipeline programs. Psychometric analysis indicated poor model fit for a 1-factor model for the total scale and the majority of subscales. Nonparametric tests indicated statistically significant increases in 13 items and decreases in 2 items. Therefore, while total scores or subscale scores cannot be used to assess changes in perceptions from pre- to postprogram, the survey can be used to examine changes over time in each item. Students did not have an accurate view of nursing and college and underestimated the support needed to attend college. However, students realized that nursing was a profession with autonomy, respect, and honor.
Flux-Level Transit Injection Experiments with NASA Pleiades Supercomputer
NASA Astrophysics Data System (ADS)
Li, Jie; Burke, Christopher J.; Catanzarite, Joseph; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division
2016-06-01
Flux-Level Transit Injection (FLTI) experiments are executed with NASA's Pleiades supercomputer for the Kepler Mission. The latest release (9.3, January 2016) of the Kepler Science Operations Center Pipeline is used in the FLTI experiments. Their purpose is to validate the Analytic Completeness Model (ACM), which can be computed for all Kepler target stars, thereby enabling exoplanet occurrence rate studies. Pleiades, a facility of NASA's Advanced Supercomputing Division, is one of the world's most powerful supercomputers and represents NASA's state-of-the-art technology. We discuss the details of implementing the FLTI experiments on the Pleiades supercomputer. For example, taking into account that ~16 injections are generated by one core of the Pleiades processors in an hour, the “shallow” FLTI experiment, in which ~2000 injections are required per target star, can be done for 16% of all Kepler target stars in about 200 hours. Stripping down the transit search to bare bones, i.e. only searching adjacent high/low periods at high/low pulse durations, makes the computationally intensive FLTI experiments affordable. The design of the FLTI experiments and the analysis of the resulting data are presented in “Validating an Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments” by Catanzarite et al. (#2494058).Kepler was selected as the 10th mission of the Discovery Program. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.
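The quoted 200-hour figure follows from straightforward scaling. A back-of-the-envelope check in Python, assuming roughly 200,000 Kepler target stars and an allocation of about 20,000 Pleiades cores (both figures are assumptions, not stated in the abstract):

    # Rough scaling check for the "shallow" FLTI experiment.
    injections_per_core_hour = 16
    injections_per_star = 2000
    core_hours_per_star = injections_per_star / injections_per_core_hour  # 125
    n_stars = 0.16 * 200_000                         # 16% of ~200,000 targets ~ 32,000
    total_core_hours = core_hours_per_star * n_stars # ~4.0 million core-hours
    wall_clock_hours = total_core_hours / 20_000     # ~200 hours on ~20,000 cores
    print(core_hours_per_star, total_core_hours, wall_clock_hours)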
NASA Astrophysics Data System (ADS)
Dudukalov, A.
Leakage from pipelines, nonhermetic wells, and other industrial equipment of highly mineralized chloride-sodium brines, incidentally produced during oil field exploitation, is one of the main sources of fresh groundwater contamination on the Arlan oil field. A thermodynamic calculation, aimed at defining more exactly the chemical composition and density of the brines, was carried out with the FREZCHEM2 program (Mironenko M.V. et al. 1997). Five brine types with mineralizations of 137.9, 181.2, 217.4, 243.7, and 267.8 g/l and densities of 1.176, 1.09, 1.135, 1.153, and 1.167 g/cm3, correspondingly, were used. It is necessary to note that the chemical compositions of the two last brines were preliminarily corrected according to their mineralization. The calculations determined the following density values of the brines: 1.082, 1.114, 1.131, 1.146, and 1.158 g/cm3, consequently. The obtained results demonstrate a significant discrepancy between the experimental and model estimates. A significant excess of anions over cations in the experimental data indicates a major problem with the analytical measurements. The calculations also analyzed the possibility of changes in brine density depending on adding to cations, or deducting from anions, the requisite amount of agent to keep the charge balance equal to zero. These results demonstrate that in this case brine density can change by 0.004-0.011 g/cm3.
Update on the SDSS-III MARVELS data pipeline development
NASA Astrophysics Data System (ADS)
Li, Rui; Ge, J.; Thomas, N. B.; Petersen, E.; Wang, J.; Ma, B.; Sithajan, S.; Shi, J.; Ouyang, Y.; Chen, Y.
2014-01-01
MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey), as one of the four surveys in the SDSS-III program, has monitored over 3,300 stars during 2008-2012, with each being visited an average of 26 times over a 2-year window. Although the early data pipeline was able to detect over 20 brown dwarf candidates and several hundred binaries, no giant planet candidates were reliably identified due to its large systematic errors. Learning from past data pipeline lessons, we re-designed the entire pipeline to handle various types of systematic effects caused by the instrument (such as trace, slant, distortion, drifts, and dispersion) and by observation condition changes (such as illumination profile and continuum). We also introduced several advanced methods to precisely extract the RV signals. To date, we have achieved a long-term RMS RV measurement error of 14 m/s for HIP-14810 (one of our reference stars) after removal of the known planet signal based on previous HIRES RV measurements. This new 1-D data pipeline has been used to robustly identify four giant planet candidates within the small fraction of the survey data that has been processed (Thomas et al. this meeting). The team is currently working hard to optimize the pipeline, especially the 2-D interference-fringe RV extraction, where early results show a 1.5 times improvement over the 1-D data pipeline. We are quickly approaching the survey baseline performance requirement of 10-35 m/s RMS for 8-12 magnitude solar-type stars. With this fine-tuned pipeline and the soon-to-be-processed plates of data, we expect to discover many more giant planet candidates and make a large statistical impact on exoplanet studies.
Nearing, Kathryn A; Hunt, Cerise; Presley, Jessica H; Nuechterlein, Bridget M; Moss, Marc; Manson, Spero M
2015-10-01
This paper is the first in a five-part series on the clinical and translational science educational pipeline and presents strategies to support recruitment and retention to create diverse pathways into clinical and translational research (CTR). The strategies address multiple levels or contexts of persistence decisions and include: (1) creating a seamless pipeline by forming strategic partnerships to achieve continuity of support for scholars and collective impact; (2) providing meaningful research opportunities to support identity formation as a scientist and sustain motivation to pursue and persist in CTR careers; (3) fostering an environment for effective mentorship and peer support to promote academic and social integration; (4) advocating for institutional policies to alleviate environmental pull factors; and (5) supporting program evaluation, particularly the examination of longitudinal outcomes. By combining institutional policies that promote a culture and climate for diversity with quality, evidence-based programs and integrated networks of support, we can create the environment necessary for diverse scholars to progress successfully and efficiently through the pipeline to achieve the National Institutes of Health's vision of a robust CTR workforce.
Kamala C T; Balaram V; Dharmendra V; Satyanarayanan M; Subramanyam K S V; Krishnaiah A
2014-11-01
Recently introduced microwave plasma-atomic emission spectroscopy (MP-AES) represents yet another and very important addition to the existing array of modern instrumental analytical techniques. In this study, an attempt is made to summarize the performance characteristics of MP-AES and its potential as an analytical tool for environmental studies, with some practical examples from the Patancheru and Uppal industrial sectors of Hyderabad city. A range of soil, sediment, and water reference materials, particulate matter, and real-life samples were chosen to evaluate the performance of this new analytical technique. Analytical wavelengths were selected considering the interference effects of other concomitant elements present in the different sample solutions. The detection limits for several elements were found to be in the range from 0.05 to 5 ng/g. The trace metals analyzed in both sectors followed the topography, with more pollution at the low-lying sites. The metal contents were found to be higher in ground waters than in surface waters. For about a decade, pollutants have been transferred from the Patancheru industrial area to the Musi River. After polluting Nakkavagu and turning huge tracts of agricultural land barren, besides making people residing along the rivulet impotent and sick, the industrialists of Patancheru are shifting the effluents downstream of the Musi River through an 18-km pipeline from Patancheru. Since the effluent undergoes primary treatment at the Common Effluent Treatment Plant (CETP) at Patancheru and travels through the pipeline and mixes with sewage, the organic effluents will be diluted. But inorganic pollutants such as heavy and toxic metals tend to accumulate in the environmental segments near and downstream of the Musi River. The MP-AES data on toxic metals like Zn, Cu, and Cr in the ground and surface waters can only be attributed to pollution from Patancheru, since no other sources are available to the Musi River.
A Mathematical Model of Gas-Turbine Pump Complex
NASA Astrophysics Data System (ADS)
Shpilevoy, V. A.; Chekardovsky, S. M.; Zakirazkov, A. G.
2016-10-01
The article analyzes, on the basis of statistical data, the state of the extensive network of main oil pipelines in the Tyumen region and suggests ways of improving the efficiency of energy-saving policy in main oil transport. Various types of pump drives for main oil pipelines were examined. It was determined that there is currently no strict analytical dependence between the main operating properties of the power turbine of a gas turbine engine. At the same time, it is necessary to determine the operating parameters of a turbine used in a gas-turbine pumping unit (GTPU), the interconnection between power and rotational speed, as well as the feasibility of using a particular mode. Analysis of foreign experience, of the state of the domestic enterprises supplying the country with gas turbines, and of features of the further development of hydrocarbon transport supports the conclusion that it is feasible to equip the country's oil transportation industry with pumping units based on gas turbine drives.
World Bank oil-pipeline project designed to prevent HIV transmission.
Kigotho, A W
1997-11-29
A World Bank-funded oil pipeline project, in Chad and Cameroon, is the first large-scale construction project in sub-Saharan Africa to incorporate an HIV/AIDS prevention component. The project entails the development of oil fields in southern Chad and construction of 1100 km of pipeline to port facilities on Cameroon's Atlantic coast. 3000 construction workers from the two countries will be employed between 1998 and 2001, including about 600 truck drivers. In some areas along the pipeline route, 50% of the prostitutes (who are frequented by truck drivers) are HIV-infected. The HIV/AIDS intervention aims to prevent HIV and sexually transmitted diseases (STDs) among project workers through social marketing of condoms, treatment of STDs in prostitutes along the route, and health education to modify high-risk behaviors. The program is considered a test case for African governments and donors interested in whether the integration of a health component in major construction projects can avoid AIDS epidemics in affected countries.
Aerial image databases for pipeline rights-of-way management
NASA Astrophysics Data System (ADS)
Jadkowski, Mark A.
1996-03-01
Pipeline companies that own and manage extensive rights-of-way corridors are faced with ever-increasing regulatory pressures, operating issues, and the need to remain competitive in today's marketplace. Automation has long been an answer to the problem of having to do more work with less people, and Automated Mapping/Facilities Management/Geographic Information Systems (AM/FM/GIS) solutions have been implemented at several pipeline companies. Until recently, the ability to cost-effectively acquire and incorporate up-to-date aerial imagery into these computerized systems has been out of the reach of most users. NASA's Earth Observations Commercial Applications Program (EOCAP) is providing a means by which pipeline companies can bridge this gap. The EOCAP project described in this paper includes a unique partnership with NASA and James W. Sewall Company to develop an aircraft-mounted digital camera system and a ground-based computer system to geometrically correct and efficiently store and handle the digital aerial images in an AM/FM/GIS environment. This paper provides a synopsis of the project, including details on (1) the need for aerial imagery, (2) NASA's interest and role in the project, (3) the design of a Digital Aerial Rights-of-Way Monitoring System, (4) image georeferencing strategies for pipeline applications, and (5) commercialization of the EOCAP technology through a prototype project at Algonquin Gas Transmission Company which operates major gas pipelines in New England, New York, and New Jersey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shem, L.M.; Zimmerman, R.E.; Hayes, D.
The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of a survey conducted over the period of August 12-13, 1991, at the Bayou Grand Cane crossing in De Soto Parish, Louisiana, where a pipeline constructed three years prior to the survey crosses the bayou through mature bottomland hardwoods. The site was not seeded or fertilized after construction activities. At the time of sampling, a dense herb stratum (composed of mostly native species) covered the 20-m-wide ROW, except within drainage channels. As a result of the creation of the ROW, new habitat was created, plant diversity increased, and forest habitat became fragmented. The ROW must be maintained at an early stage of succession to allow access to the pipeline; however, impacts to the wetland were minimized by decreasing the width of the ROW to 20 m and recreating the drainage channels across the ROW. The canopy trees on the ROW's edge shaded part of the ROW, which helped to minimize the effects of the ROW.
78 FR 43263 - Paperless Hazard Communications Pilot Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-19
... Research Division (PHH-23), Pipeline and Hazardous Materials Safety Administration, 1200 New Jersey Avenue... materials by air, highway, rail, and water) to test the feasibility and then evaluate both the feasibility... times and locations.'' On September 12, 2001, the Research and Special Programs Administration (the...
NASA Astrophysics Data System (ADS)
Russell, Melody L.; Atwater, Mary M.
2005-08-01
This study focuses on 11 African American undergraduate seniors in a biology degree program at a predominantly white research institution in the southeastern United States. These 11 respondents shared their journeys throughout the high school and college science pipeline. Participants described similar precollege factors and experiences that contributed to their academic success and persistence at a predominantly white institution. One of the most critical factors in their academic persistence was participation in advanced science and mathematics courses as part of their high school college preparatory program. Additional factors that had a significant impact on their persistence and academic success were family support, teacher encouragement, intrinsic motivation, and perseverance.
The Rural Girls in Science Project: from Pipelines to Affirming Science Education
NASA Astrophysics Data System (ADS)
Ginorio, Angela B.; Huston, Michelle; Frevert, Katie; Seibel, Jane Bierman
The Rural Girls in Science (RGS) program was developed to foster interest in science, engineering, and mathematics among rural high school girls in the state of Washington. Girls served include American Indians, Latinas, and Whites. This article provides an overview of the program and its outcomes not only for the participants (girls, teachers, counselors, and schools) but also for the researchers. Lessons learned from and about the participants are presented, and lessons learned from the process are discussed to illustrate how RGS moved from a focus on individuals to a focus on the school. The initial guiding concepts (self-esteem and scientific pipeline) were replaced by “possible selves” and our proposed complementary concepts: science-affirming and affirming science education.
Assessing the Impact of a Research-Based STEM Program on STEM Majors' Attitudes and Beliefs
ERIC Educational Resources Information Center
Huziak-Clark, Tracy; Sondergeld, Toni; Staaden, Moira; Knaggs, Christine; Bullerjahn, Anne
2015-01-01
The Science, Engineering, and Technology Gateway of Ohio (SETGO) program has a three-pronged approach to meeting the needs at different levels of students in the science, technology, engineering, and mathematics (STEM) pipeline. The SETGO program was an extensive collaboration between a two-year community college and a nearby four-year…
A Controlled Evaluation of a High School Biomedical Pipeline Program: Design and Methods
ERIC Educational Resources Information Center
Winkleby, Marilyn A.; Ned, Judith; Ahn, David; Koehler, Alana; Fagliano, Kathleen; Crump, Casey
2014-01-01
Given limited funding for school-based science education, non-school-based programs have been developed at colleges and universities to increase the number of students entering science- and health-related careers and address critical workforce needs. However, few evaluations of such programs have been conducted. We report the design and methods of…
A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System
NASA Astrophysics Data System (ADS)
Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.
2010-05-01
The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.
ERIC Educational Resources Information Center
Abdul-Alim, Jamaal
2012-01-01
This article features the Ronald E. McNair Postbaccalaureate Achievement Program at the University of Memphis. The McNair program is named after Ronald E. McNair, the second African-American in space, who died in the Space Shuttle Challenger explosion in 1986. Approximately 200 campuses across the nation host the program. Whereas the program…
Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.
2016-01-01
Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research. PMID:28066459
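Homology-based screening for AQP candidates typically checks for the family's hallmark dual NPA boxes in addition to overall sequence similarity. A minimal Python sketch; the allowed spacing between the two motifs is an illustrative assumption:

    # Screen protein sequences for the two NPA boxes characteristic of aquaporins.
    import re

    def has_dual_npa(protein_seq, min_gap=80, max_gap=160):
        """True if two NPA motifs occur with a plausible inter-motif spacing."""
        starts = [m.start() for m in re.finditer("NPA", protein_seq)]
        return any(min_gap <= b - a <= max_gap for a in starts for b in starts if b > a)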
NASA Astrophysics Data System (ADS)
Muggleton, J. M.; Kalkowski, M.; Gao, Y.; Rustighi, E.
2016-07-01
Waves that propagate at low frequencies in buried pipes are of considerable interest in a variety of practical scenarios, for example leak detection, remote pipe detection, and pipeline condition assessment and monitoring. Whilst there has been considerable research and commercial attention on the accurate location of pipe leakage for many years, the various causes of pipe failures and their identification have not been well documented; moreover, there are still a number of gaps in the existing knowledge. Previous work has focused on two of the three axisymmetric wavetypes that can propagate: the s=1, fluid-dominated wave; and the s=2, shell-dominated wave. In this paper, the third axisymmetric wavetype, the s=0 torsional wave, is investigated. The effects of the surrounding soil on the characteristics of wave propagation and attenuation are analysed for a compact pipe/soil interface for which there is no relative motion between the pipe wall and the surrounding soil. An analytical dispersion relationship is derived for the torsional wavenumber, from which both the wavespeed and wave attenuation can be obtained. How torsional waves can subsequently radiate to the ground surface is then investigated. Analytical expressions are derived for the ground surface displacement above the pipe resulting from torsional wave motion within the pipe wall. A numerical model is also included, primarily in order to validate some of the assumptions made whilst developing the analytical solutions, but also so that some comparison in the results may be made. Example results are presented for both a cast iron pipe and an MDPE pipe buried in two typical soil types.
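For orientation, the in vacuo torsional wave in a pipe travels non-dispersively at the shear wavespeed of the pipe material, and burial enters as a complex correction to the wavenumber. A schematic form, for illustration only rather than the dispersion relation derived in the paper:

    c_T = \sqrt{G_p / \rho_p}, \qquad
    k_T^2 = \frac{\omega^2}{c_T^2}\left(1 + \Gamma_{\mathrm{soil}}(\omega)\right)

where G_p and \rho_p are the shear modulus and density of the pipe wall, and \Gamma_{\mathrm{soil}} is a complex soil-loading term whose real part shifts the wavespeed and whose imaginary part produces the attenuation.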
Selected reaction monitoring mass spectrometry: a methodology overview.
Ebhardt, H Alexander
2014-01-01
Moving past the discovery phase of proteomics, the term targeted proteomics combines multiple approaches investigating a certain set of proteins in more detail. One such targeted proteomics approach is the combination of liquid chromatography and selected or multiple reaction monitoring mass spectrometry (SRM, MRM). SRM-MS requires prior knowledge of the fragmentation pattern of peptides, as the presence of the analyte in a sample is determined by measuring the m/z values of predefined precursor and fragment ions. Using scheduled SRM-MS, many analytes can be robustly monitored, allowing for high-throughput sample analysis of the same set of proteins over many conditions. In this chapter, the fundamentals of SRM-MS are explained, as well as an optimized SRM pipeline from assay generation to data analysis.
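In practice a scheduled SRM assay is driven by a transition list: pairs of predefined precursor and fragment m/z values, each monitored only within a retention-time window. A minimal illustrative Python sketch; the peptide, m/z values, and windows are invented for demonstration:

    # Illustrative scheduled-SRM transition list and scheduling lookup.
    transitions = [
        # (peptide, precursor_mz, fragment_mz, rt_start_min, rt_end_min)
        ("ELVISLIVESK", 606.9, 817.5, 22.0, 26.0),
        ("ELVISLIVESK", 606.9, 704.4, 22.0, 26.0),
    ]

    def active_transitions(rt_min):
        """Transitions the instrument should monitor at the current retention time."""
        return [t for t in transitions if t[3] <= rt_min <= t[4]]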
75 FR 15613 - Hazardous Materials Transportation; Registration and Fee Assessment Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 107 [Docket No. PHMSA-2009-0201 (HM-208H)] RIN 2137-AE47 Hazardous Materials Transportation... registration program are to gather information about the transportation of hazardous materials, and to fund the...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2013 CFR
2013-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2012 CFR
2012-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2011 CFR
2011-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
49 CFR 199.113 - Employee assistance program.
Code of Federal Regulations, 2014 CFR
2014-10-01
... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing § 199.113 Employee assistance program. (a) Each operator shall provide an employee... must be drug tested based on reasonable cause. The operator may establish the EAP as a part of its...
Diversifying the STEM Pipeline: The Model Replication Institutions Program
ERIC Educational Resources Information Center
Cullinane, Jenna
2009-01-01
In 2006, the National Science Foundation (NSF) began funding the Model Replication Institutions (MRI) program, which sought to improve the quality, availability, and diversity of science, technology, engineering, and mathematics (STEM) education. Faced with pressing national priorities in the STEM fields and chronic gaps in postsecondary…
Development of Time-Distance Helioseismology Data Analysis Pipeline for SDO/HMI
NASA Technical Reports Server (NTRS)
DuVall, T. L., Jr.; Zhao, J.; Couvidat, S.; Parchevsky, K. V.; Beck, J.; Kosovichev, A. G.; Scherrer, P. H.
2008-01-01
The Helioseismic and Magnetic Imager of SDO will provide uninterrupted 4k x 4k-pixel Doppler-shift images of the Sun with approximately 40 sec cadence. These data will have a unique potential for advancing local helioseismic diagnostics of the Sun's interior structure and dynamics. They will help to understand the basic mechanisms of solar activity and develop predictive capabilities for NASA's Living with a Star program. Because of the tremendous amount of data, the HMI team is developing a data analysis pipeline, which will provide maps of subsurface flows and sound-speed distributions inferred from the Doppler data by the time-distance technique. We discuss the development plan, methods, and algorithms, and present the status of the pipeline, testing results, and examples of the data products.
Analysis of Decisions Made Using the Analytic Hierarchy Process
2013-09-01
country petroleum pipelines (Dey, 2003), deciding how best to manage U.S. watersheds (De Steiguer, Duberstein, and Lopes, 2003), and the U.S. Army...many benefits to its use. Primarily these fall under the heading of managing chaos. Specifically, the AHP is a tool that can be used to simplify and...originally. The commonly used scenario is this: the waiter asks if you want chicken or fish, and you reply fish. The waiter then remembers that steak is
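The computational core of the AHP is extracting a priority vector from a pairwise-comparison matrix, conventionally via its principal eigenvector, together with a consistency check. A minimal Python sketch with an illustrative 3x3 judgment matrix:

    # AHP priorities as the principal eigenvector of a pairwise-comparison matrix.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],   # pairwise judgments on Saaty's 1-9 scale
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)               # principal eigenvalue (lambda_max)
    w = np.abs(eigvecs[:, k].real)
    priorities = w / w.sum()                  # normalized priority vector
    ci = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
    print(priorities, ci)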
A Python Analytical Pipeline to Identify Prohormone Precursors and Predict Prohormone Cleavage Sites
Southey, Bruce R.; Sweedler, Jonathan V.; Rodriguez-Zas, Sandra L.
2008-01-01
Neuropeptides and hormones are signaling molecules that support cell–cell communication in the central nervous system. Experimentally characterizing neuropeptides requires significant effort because of the complex and variable processing of prohormone precursor proteins into neuropeptides and hormones. We demonstrate the power and flexibility of the Python language to develop components of a bioinformatic analytical pipeline to identify precursors from genomic data and to predict cleavage as these precursors are en route to the final bioactive peptides. We identified 75 precursors in the rhesus genome, predicted cleavage sites using support vector machines, and compared the rhesus predictions to putative assignments based on homology to human sequences. The correct classification rate of cleavage using the support vector machines was over 97% for both human and rhesus data sets. The functionality of Python has been important to develop and maintain NeuroPred (http://neuroproteomics.scs.uiuc.edu/neuropred.html), a user-centered web application for the neuroscience community that provides cleavage site prediction from a wide range of models, precision and accuracy statistics, post-translational modifications, and the molecular mass of potential peptides. The combined results illustrate the suitability of the Python language to implement an all-inclusive bioinformatics approach to predict neuropeptides that encompasses a large number of interdependent steps, from scanning genomes for precursor genes to identification of potential bioactive neuropeptides. PMID:19169350
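A cleavage-site predictor of this kind can be approximated by one-hot encoding a window of residues around each candidate basic site and training an SVM on known cleaved/uncleaved examples. A minimal scikit-learn sketch; the window size, toy training data, and kernel choice are illustrative, not NeuroPred's actual models:

    # Sketch: SVM classification of prohormone cleavage at candidate sites.
    import numpy as np
    from sklearn.svm import SVC

    AA = "ACDEFGHIKLMNPQRSTVWY"

    def encode_window(window):
        """One-hot encode a fixed-length residue window around a candidate site."""
        vec = np.zeros(len(window) * len(AA))
        for i, aa in enumerate(window):
            if aa in AA:
                vec[i * len(AA) + AA.index(aa)] = 1.0
        return vec

    windows = ["GKRSA", "ALKRG", "PPKAA", "GGRAA"]   # toy windows around K/R sites
    labels = [1, 1, 0, 0]                            # 1 = cleaved, 0 = not cleaved
    model = SVC(kernel="rbf").fit([encode_window(w) for w in windows], labels)
    print(model.predict([encode_window("SKRDA")]))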
Peters, Kelsey C; Swaminathan, Harish; Sheehan, Jennifer; Duffy, Ken R; Lun, Desmond S; Grgicak, Catherine M
2017-11-01
Samples containing low copy numbers of DNA are routinely encountered in casework. The signal acquired from these sample types can be difficult to interpret, as they do not always contain all of the genotypic information from each contributor, where the loss of genetic information is associated with sampling and detection effects. The present work focuses on developing a validation scheme to aid in mitigating the effects of the latter. We establish a scheme designed to simultaneously improve signal resolution and detection rates without costly large-scale experimental validation studies by applying a combined simulation- and experiment-based approach. Specifically, we parameterize an in silico DNA pipeline with experimental data acquired from the laboratory and use this to evaluate multifarious scenarios in a cost-effective manner. Metrics such as one-copy signal-to-noise resolution and false positive and false negative signal detection rates are used to select tenable laboratory parameters that result in high-fidelity signal in the single-copy regime. We demonstrate that the metrics acquired from simulation are consistent with experimental data obtained from two capillary electrophoresis platforms and various injection parameters. Once good resolution is obtained, analytical thresholds can be determined using detection error tradeoff analysis, if necessary. Decreasing the limit of detection of the forensic process to one copy of DNA is a powerful mechanism by which to increase the information content on minor components of a mixture, which is particularly important for probabilistic system inference. If the forensic pipeline is engineered such that high-fidelity electropherogram signal is obtained, then the likelihood ratio (LR) of a true contributor increases and the probability that the LR of a randomly chosen person is greater than one decreases. This is, potentially, the first step towards standardization of the analytical pipeline across operational laboratories.
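Threshold selection of this kind can be sketched by sweeping candidate analytical thresholds over simulated noise-only and one-copy signal distributions and reading off the two error rates. The distributions below are placeholders, not the laboratory-calibrated models used in the study:

    # Sweep analytical thresholds; report false positive and false negative rates.
    import numpy as np

    rng = np.random.default_rng(0)
    noise = rng.normal(5.0, 2.0, 10_000)       # placeholder noise peak heights (RFU)
    one_copy = rng.normal(25.0, 8.0, 10_000)   # placeholder one-copy allele heights

    for threshold in (10, 15, 20):
        fpr = np.mean(noise >= threshold)      # noise wrongly called as signal
        fnr = np.mean(one_copy < threshold)    # true one-copy signal dropped
        print(threshold, round(float(fpr), 3), round(float(fnr), 3))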
2013-01-01
Background Accurate and complete identification of mobile elements is a challenging task in the current era of sequencing, given their large numbers and frequent truncations. Group II intron retroelements, which consist of a ribozyme and an intron-encoded protein (IEP), are usually identified in bacterial genomes through their IEP; however, the RNA component that defines the intron boundaries is often difficult to identify because of a lack of strong sequence conservation corresponding to the RNA structure. Compounding the problem of boundary definition is the fact that a majority of group II intron copies in bacteria are truncated. Results Here we present a pipeline of 11 programs that collect and analyze group II intron sequences from GenBank. The pipeline begins with a BLAST search of GenBank using a set of representative group II IEPs as queries. Subsequent steps download the corresponding genomic sequences and flanks, filter out non-group II introns, assign introns to phylogenetic subclasses, filter out incomplete and/or non-functional introns, and assign IEP sequences and RNA boundaries to the full-length introns. In the final step, the redundancy in the data set is reduced by grouping introns into sets of ≥95% identity, with one example sequence chosen to be the representative. Conclusions These programs should be useful for comprehensive identification of group II introns in sequence databases as data continue to rapidly accumulate. PMID:24359548
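The final redundancy-reduction step can be sketched as greedy clustering: each intron joins the first representative it matches at >=95% identity, otherwise it founds a new cluster. In the Python sketch below, a crude match-fraction on pre-aligned, equal-length sequences stands in for whatever pairwise comparison the pipeline actually uses:

    # Greedy clustering of intron sequences at >=95% identity.
    def identity(a, b):
        """Crude proxy: fraction of matching positions on pre-aligned sequences."""
        n = min(len(a), len(b))
        return sum(x == y for x, y in zip(a, b)) / n if n else 0.0

    def cluster(sequences, cutoff=0.95):
        representatives = []
        for seq in sequences:
            for rep in representatives:
                if identity(seq, rep) >= cutoff:
                    break                      # joins an existing cluster
            else:
                representatives.append(seq)    # founds a new cluster
        return representatives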
Diversifying the STEM Pipeline: Recommendations from the Model Replication Institutions Program
ERIC Educational Resources Information Center
Institute for Higher Education Policy, 2010
2010-01-01
Launched in 2006 to address issues of national competitiveness and equity in science, technology, engineering, and mathematics (STEM) fields, the National Science Foundation-funded Model Replication Institutions (MRI) program sought to improve the quality, availability, and diversity of STEM education. The project offered technical assistance to…
The Challenges in Providing Needed Transition Programming to Juvenile Offenders
ERIC Educational Resources Information Center
Platt, John S.; Bohac, Paul D.; Wade, Wanda
2015-01-01
The transition to and from juvenile justice settings is a complex and challenging process. Effectively preparing juvenile justice personnel to address the transition needs of incarcerated students is an essential aspect of reducing the negative effects of the school-to-prison pipeline. This article examines program and professional development…
DOT National Transportation Integrated Search
2010-06-01
On June 28, 2007, PHMSA released a Broad Agency Announcement (BAA), DTPH56- 07-BAA-000002, seeking white papers on individual projects and consolidated Research and Development (R&D) programs addressing topics on their pipeline safety program. Althou...
Building a Pipeline: One Company's Holistic Approach to College Relations
ERIC Educational Resources Information Center
Pratt, Joseph
2003-01-01
This article describes how Fidelity, the largest mutual fund company in the United States, has transformed a traditional college recruiting program into a holistic college partnership that emphasizes the interdependence of its parts. Fidelity's enhanced internship program embraces the "try before you buy" philosophy, which benefits both the firm…
49 CFR 192.909 - How can an operator change its integrity management program?
Code of Federal Regulations, 2011 CFR
2011-10-01
Gas Transmission Pipeline Integrity Management, § 192.909: How can an operator change its integrity management program? (49 CFR, Title 49: Transportation, Vol. 3; Other Regulations Relating to Transportation.)
Gazda, Nicholas P; Griffin, Emily; Hamrick, Kasey; Baskett, Jordan; Mellon, Meghan M; Eckel, Stephen F; Granko, Robert P
2018-04-01
Purpose: The purpose of this article is to share experiences from the development of a health-system pharmacy administration residency with an MS degree and to express the need for additional programs in nonacademic medical center health-system settings. Summary: Experiences with the development and implementation of a health-system pharmacy administration residency at a large community teaching hospital are described. Resident candidates benefit from collaborations with other health-systems through master's degree programs and from visibility to leaders at their health-system. Programs benefit from building a pipeline of future pharmacy administrators and by leveraging the skills of residents to contribute to projects and department-wide initiatives. Tools to assist in the implementation of a new pharmacy administration program are also described and include rotation and preceptor development, marketing and recruiting, financial evaluation, and steps to prepare for accreditation. Conclusion: Health-system pharmacy administration residents provide the opportunity to build a pipeline of high-quality leaders, provide high-level project involvement, and produce a positive return on investment (ROI) for health-systems. These programs should be explored in academic and nonacademic-based health-systems.
Emerging surface characterization techniques for carbon steel corrosion: a critical brief review.
Dwivedi, D; Lepkova, K; Becker, T
2017-03-01
Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time-of-flight secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Mössbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.
Designing health promotion programs by watching the market.
Gelb, B D; Bryant, J M
1992-03-01
More health care providers and payors are beginning to see health promotion programs as a significant tool for attracting patients, reducing costs, or both. To help design programs that take into account the values and lifestyles of the target group, naturalistic observation can be useful. The authors illustrate the approach in a study of pipeline workers that provided input for the design of nutrition and smoking cessation programs.
Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.
O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of <€25 per strain. Since developing this pipeline, >200,000 items have been processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
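For illustration only, the kind of hierarchical, bar-coded item tracking described above can be sketched as follows; the class and field names are hypothetical and do not reflect the schema of the commercial LIMS used in the study.

    # Hypothetical sketch of hierarchical, bar-coded item tracking:
    # every tube or rack has a barcode, location and status, and items nest
    # (isolate -> DNA extract -> PCR product).
    from dataclasses import dataclass, field

    @dataclass
    class Item:
        barcode: str
        kind: str                 # e.g. "isolate", "dna", "pcr_product"
        location: str = "unknown"
        status: str = "registered"
        children: list = field(default_factory=list)

        def derive(self, barcode: str, kind: str) -> "Item":
            child = Item(barcode, kind)
            self.children.append(child)
            return child

    isolate = Item("ISO-0001", "isolate", location="rack A1")
    dna = isolate.derive("DNA-0001", "dna")
    dna.status = "normalized"
    print(isolate)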
NASA Astrophysics Data System (ADS)
Mohamed, Adel M. E.; Mohamed, Abuo El-Ela A.
2013-06-01
Ground vibrations induced by blasting in cement quarries are one of the fundamental problems in the quarrying industry and may cause severe damage to nearby utilities and pipelines. Therefore, a vibration control study plays an important role in minimizing the environmental effects of blasting in quarries. The current paper presents the influence of the quarry blasts at the National Cement Company (NCC) on the two oil pipelines of the SUMED Company southeast of Helwan City, by measuring the ground vibrations in terms of Peak Particle Velocity (PPV). The compressional-wave velocity deduced from the shallow seismic refraction survey and the shear-wave velocity obtained from the Multichannel Analysis of Surface Waves (MASW) technique are used to evaluate the site of the two pipelines closest to the quarry blasts. The results demonstrate that the closest site of the two pipelines is of class B, according to the National Earthquake Hazard Reduction Program (NEHRP) classification, and the safe distance to avoid any environmental effects is 650 m, following the deduced Peak Particle Velocity and scaled distance (SD) relationship, PPV = 700.08 × SD^(-1.225) in mm/s, and the air overpressure (air blast) formula, air blast = 170.23 × SD^(-0.071) in dB. In the light of the prediction analysis, the maximum allowable charge weight per delay was found to be 591 kg, with a damage criterion of 12.5 mm/s at the closest site of the SUMED pipelines.
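The reported maximum charge can be back-calculated from the fitted attenuation law under the assumption of square-root charge scaling, SD = D/sqrt(W), an assumption that is consistent with the figures quoted above (it reproduces the ~591 kg result at 650 m for a 12.5 mm/s criterion):

    # Back-calculation of maximum charge per delay from PPV = 700.08 * SD**-1.225 (mm/s),
    # assuming square-root scaling SD = D / sqrt(W) in m/kg^0.5.
    k, beta = 700.08, 1.225
    ppv_limit = 12.5        # damage criterion, mm/s
    distance = 650.0        # closest pipeline site, m

    sd = (k / ppv_limit) ** (1.0 / beta)   # allowable scaled distance
    w_max = (distance / sd) ** 2           # maximum charge per delay, kg
    print(f"SD = {sd:.1f} m/kg^0.5, W_max = {w_max:.0f} kg")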
Scaling-up NLP Pipelines to Process Large Corpora of Clinical Notes.
Divita, G; Carter, M; Redd, A; Zeng, Q; Gupta, K; Trautner, B; Samore, M; Gundlapalli, A
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". This paper describes the scale-up efforts at the VA Salt Lake City Health Care System to address processing large corpora of clinical notes through a natural language processing (NLP) pipeline. The use case described is a current project focused on detecting the presence of an indwelling urinary catheter in hospitalized patients and subsequent catheter-associated urinary tract infections. An NLP algorithm using v3NLP was developed to detect the presence of an indwelling urinary catheter in hospitalized patients. The algorithm was tested on a small corpus of notes on patients for whom the presence or absence of a catheter was already known (reference standard). In planning for a scale-up, we estimated that the original algorithm would have taken 2.4 days to run on a larger corpus of notes for this project (550,000 notes), and 27 days for a corpus of 6 million records representative of a national sample of notes. We approached scaling up NLP pipelines through three techniques: pipeline replication via multi-threading, intra-annotator threading for tasks that can be further decomposed, and remote annotator services which enable annotator scale-out. The scale-up reduced the average time to process a record from 206 milliseconds to 17 milliseconds, a 12-fold increase in performance, when applied to a corpus of 550,000 notes. Purposely simplistic in nature, these scale-up efforts are the straightforward evolution from small-scale NLP processing to larger-scale extraction without incurring the complexities inherent in deeper use of the underlying UIMA framework. These efforts represent generalizable and widely applicable techniques that will aid other computationally complex NLP pipelines that need to be scaled out for processing and analyzing big data.
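A minimal sketch of the first technique, pipeline replication via multi-threading, is shown below; process_note() is a placeholder for the v3NLP annotators described in the paper, not the actual VA code.

    # Pipeline replication via multi-threading: each worker runs a copy of the
    # annotation pipeline over a shared pool of notes.
    from concurrent.futures import ThreadPoolExecutor

    def process_note(note: str) -> dict:
        # placeholder annotator: flag notes mentioning a urinary catheter
        return {"note": note, "catheter": "catheter" in note.lower()}

    notes = ["Foley catheter in place", "no acute distress", "indwelling catheter removed"]
    with ThreadPoolExecutor(max_workers=8) as pool:  # one pipeline copy per thread
        results = list(pool.map(process_note, notes))
    print(results)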
RGAugury: a pipeline for genome-wide prediction of resistance gene analogs (RGAs) in plants.
Li, Pingchuan; Quan, Xiande; Jia, Gaofeng; Xiao, Jin; Cloutier, Sylvie; You, Frank M
2016-11-02
Resistance gene analogs (RGAs), such as NBS-encoding proteins, receptor-like protein kinases (RLKs) and receptor-like proteins (RLPs), are potential R-genes that contain specific conserved domains and motifs. Thus, RGAs can be predicted based on their conserved structural features using bioinformatics tools. Computer programs have been developed for the identification of individual domains and motifs from the protein sequences of RGAs, but none offer a systematic assessment of the different types of RGAs. A user-friendly and efficient pipeline is needed for large-scale genome-wide RGA prediction in the growing number of sequenced plant genomes. An integrative pipeline, named RGAugury, was developed to automate RGA prediction. The pipeline first identifies RGA-related protein domains and motifs, namely nucleotide binding site (NB-ARC), leucine-rich repeat (LRR), transmembrane (TM), serine/threonine and tyrosine kinase (STTK), lysin motif (LysM), coiled-coil (CC) and Toll/Interleukin-1 receptor (TIR). RGA candidates are identified and classified into four major families based on the presence of combinations of these RGA domains and motifs: NBS-encoding, TM-CC, and membrane-associated RLP and RLK. All time-consuming analyses of the pipeline are parallelized to improve performance. The pipeline was evaluated using the well-annotated Arabidopsis genome. A total of 98.5%, 85.2%, and 100% of the reported NBS-encoding genes, membrane-associated RLPs and RLKs were validated, respectively. The pipeline was also successfully applied to predict RGAs for 50 sequenced plant genomes. A user-friendly web interface was implemented to ease command-line operations, facilitate visualization and simplify result management for multiple datasets. RGAugury is an efficient, integrative bioinformatics tool for large-scale genome-wide identification of RGAs. It is freely available at Bitbucket: https://bitbucket.org/yaanlpc/rgaugury.
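The family-assignment logic can be illustrated with a hypothetical domain-combination classifier like the following; the rule set is a simplified reading of the four families named above, with illustrative domain labels, and is not RGAugury's actual implementation (which derives the domain set from dedicated domain-search tools).

    # Simplified domain-combination classifier for the four RGA families.
    def classify_rga(domains: set) -> str:
        if "NB-ARC" in domains:
            return "NBS-encoding"
        if {"TM", "LRR"} <= domains and "KINASE" in domains:
            return "RLK"          # receptor-like kinase
        if {"TM", "LRR"} <= domains:
            return "RLP"          # receptor-like protein
        if {"TM", "CC"} <= domains:
            return "TM-CC"
        return "unclassified"

    print(classify_rga({"TM", "LRR", "KINASE"}))  # -> RLK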
NASA Astrophysics Data System (ADS)
Lee, Rebecca Elizabeth
Despite the proliferation of women in higher education and the workforce, they have yet to achieve parity with men in many of the science, technology, engineering, and math (STEM) majors and careers. The gap is even greater in the representation of women from lower socioeconomic backgrounds. This study examined pre-college intervention strategies provided by the University of Southern California's Math, Engineering, Science Achievement (MESA) program, as well as the relationships and experiences that contributed to the success of underrepresented female high school students in the STEM pipeline. A social capital framework provided the backdrop to the study. This qualitative study takes an ethnographic approach, incorporating 11 interviews, 42 hours of observation, and document analysis to address the research questions: How does involvement in the MESA program impact female students' decisions to pursue a mathematics or science major in college? What is the role of significant others in supporting and encouraging student success? The findings revealed a continuous cycle of support for these students. The cycle started in the home environment, where parents were integral in the early influence on the students' decisions to pursue higher education. Relationships with teachers, counselors, and peers provided critical networks of support in helping these students to achieve their academic goals. Participation in the MESA program empowered the students and provided additional connections to knowledge-based resources. This study highlights the interplay among family, school, and the MESA program in the overall support of underrepresented female students in the STEM pipeline.
The Analysis of the T+X Program and a Proposal for a New Pilot
2013-07-01
analyze how suitable the T+X ratings are for expansion from a 4-year obligation (4YO) to a 5YO. We start by looking at these ratings in 2008 through... a training pipeline of 7.8 months, on average. In combination with a 60-month PST, there was a 20-month deficit between the sailors' 4YO and the... had a training pipeline of 6.5 months before June 2011 and about 6.0 months since then. Previously, PSTs varied from 54 to 60 months; thus, 4YO...
Improved Photometry for the DASCH Pipeline
NASA Astrophysics Data System (ADS)
Tang, Sumin; Grindlay, Jonathan; Los, Edward; Servillat, Mathieu
2013-07-01
The Digital Access to a Sky Century@Harvard (DASCH) project is digitizing the ~500,000 glass plate images obtained (full sky) by the Harvard College Observatory from 1885 to 1992. Astrometry and photometry for each resolved object are derived with photometric rms values of ~0.15 mag for the initial photometry analysis pipeline. Here we describe new developments for DASCH photometry, applied to the Kepler field, that have yielded further improvements, including better identification of image blends and plate defects by measuring image profiles and astrometric deviations. A local calibration procedure using nearby stars in a similar magnitude range as the program star (similar to what has been done for visual photometry from the plates) yields additional improvement for a net photometric rms of ~0.1 mag. We also describe statistical measures of light curves that are now used in the DASCH pipeline processing to identify new variables autonomously. The DASCH photometry methods described here are used in the pipeline processing for the data releases of DASCH data, as well as for a forthcoming paper on the long-term variables discovered by DASCH in the Kepler field.
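The local calibration idea can be sketched as follows: take the zero-point for a program star as the median residual of nearby calibration stars in a similar magnitude range. The array names, selection cuts, and synthetic data below are illustrative and are not the DASCH pipeline interface.

    # Sketch of a local photometric zero-point from nearby, similar-magnitude stars.
    import numpy as np

    def local_zero_point(dist_deg, m_cat, m_inst, m_prog, r_max=1.0, dm=1.0):
        near = dist_deg < r_max                # spatial neighbourhood
        similar = np.abs(m_cat - m_prog) < dm  # similar magnitude range
        resid = m_cat[near & similar] - m_inst[near & similar]
        return np.median(resid)

    rng = np.random.default_rng(1)
    m_cat = rng.uniform(9, 12, 500)
    m_inst = m_cat - 0.3 + rng.normal(0, 0.15, 500)  # plate offset + scatter
    dist = rng.uniform(0, 3, 500)
    print(f"zero point ~ {local_zero_point(dist, m_cat, m_inst, m_prog=10.5):.3f} mag")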
Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang
2011-01-01
I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In the CASP9 experiments, two new algorithms, QUARK and FG-MD, were added to the I-TASSER pipeline to improve structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from PDB structures to guide molecular dynamics simulation and improve the local structure of the predicted model, including hydrogen-bonding networks, torsion angles and steric clashes. Despite considerable progress in both template-based and template-free structure modeling, significant improvements in protein target classification, domain parsing, model selection, and ab initio folding of beta-proteins are still needed to further improve the I-TASSER pipeline. PMID:22069036
Image processing pipeline for synchrotron-radiation-based tomographic microscopy.
Hintermüller, C; Marone, F; Isenegger, A; Stampanoni, M
2010-07-01
With synchrotron-radiation-based tomographic microscopy, three-dimensional structures down to the micrometer level can be visualized. Tomographic data sets typically consist of 1000 to 1500 projections of 1024 × 1024 to 2048 × 2048 pixels and are acquired in 5-15 min. A processing pipeline has been developed to handle this large amount of data efficiently and to reconstruct the tomographic volume within a few minutes after the end of a scan. Just a few seconds after the raw data have been acquired, a selection of reconstructed slices is accessible through a web interface for preview and to fine-tune the reconstruction parameters. The same interface allows initiation and control of the reconstruction process on the computer cluster. By integrating all programs and tools required for tomographic reconstruction into the pipeline, the necessary user interaction is reduced to a minimum. The modularity of the pipeline allows functionality for new scan protocols to be added, such as an extended field of view, or new physical signals such as phase-contrast or dark-field imaging.
The Theoretical Astrophysical Observatory: Cloud-based Mock Galaxy Catalogs
NASA Astrophysics Data System (ADS)
Bernyk, Maksym; Croton, Darren J.; Tonini, Chiara; Hodkinson, Luke; Hassan, Amr H.; Garel, Thibault; Duffy, Alan R.; Mutch, Simon J.; Poole, Gregory B.; Hegarty, Sarah
2016-03-01
We introduce the Theoretical Astrophysical Observatory (TAO), an online virtual laboratory that houses mock observations of galaxy survey data. Such mocks have become an integral part of the modern analysis pipeline. However, building them requires expert knowledge of galaxy modeling and simulation techniques, significant investment in software development, and access to high performance computing. These requirements make it difficult for a small research team or individual to quickly build a mock catalog suited to their needs. To address this, TAO offers access to multiple cosmological simulations and semi-analytic galaxy formation models from an intuitive and clean web interface. Results can be funnelled through science modules and sent to a dedicated supercomputer for further processing and manipulation. These modules include the ability to (1) construct custom observer light cones from the simulation data cubes; (2) generate the stellar emission from star formation histories, apply dust extinction, and compute absolute and/or apparent magnitudes; and (3) produce mock images of the sky. All of TAO’s features can be accessed without any programming requirements. The modular nature of TAO opens it up for further expansion in the future.
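As an illustration of module (2), converting a model galaxy's absolute magnitude to an apparent magnitude reduces to the distance modulus, m = M + 5 log10(d_L / 10 pc). The sketch below uses astropy with a generic flat ΛCDM cosmology and ignores K-corrections and dust, so it is a simplification rather than TAO's internals.

    # Apparent magnitude from absolute magnitude via the distance modulus.
    from astropy.cosmology import FlatLambdaCDM

    cosmo = FlatLambdaCDM(H0=70, Om0=0.3)  # assumed cosmology for illustration

    def apparent_mag(M_abs: float, z: float) -> float:
        return M_abs + cosmo.distmod(z).value  # m = M + 5*log10(d_L / 10 pc)

    print(f"M=-21 at z=0.5 -> m = {apparent_mag(-21.0, 0.5):.2f}")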
A Pipeline Tool for CCD Image Processing
NASA Astrophysics Data System (ADS)
Bell, Jon F.; Young, Peter J.; Roberts, William H.; Sebo, Kim M.
MSSSO is part of a collaboration developing a wide field imaging CCD mosaic (WFI). As part of this project, we have developed a GUI-based pipeline tool that is an integrated part of MSSSO's CICADA data acquisition environment and processes CCD FITS images as they are acquired. The tool is also designed to run as a stand-alone program to process previously acquired data. IRAF tasks are used as the central engine, including the new NOAO mscred package for processing multi-extension FITS files. The STScI OPUS pipeline environment may be used to manage data and process scheduling. The Motif GUI was developed using Sun Visual Workshop. C++ classes were written to facilitate launching of IRAF and OPUS tasks. While this first version implements calibration processing up to and including flat field corrections, there is scope to extend it to other processing.
Pharmaceutical new product development: the increasing role of in-licensing.
Edwards, Nancy V
2008-12-01
Many pharmaceutical companies are facing a pipeline gap because of the increasing economic burden and uncertainty associated with internal research and development programs designed to develop new pharmaceutical products. To fill this pipeline gap, pharmaceutical companies are increasingly relying on in-licensing opportunities. New business development identifies new pharmaceuticals that satisfy unmet needs and are a good strategic fit for the company, completes valuation models and forecasts, evaluates the ability of the company to develop and launch products, and pursues in-licensing agreements for pharmaceuticals that cannot be developed internally on a timely basis. These agreements involve the transfer of access rights for patents, trademarks, or similar intellectual property from an outside company in exchange for payments. Despite the risks, in-licensing is increasingly becoming the preferred method for pharmaceutical companies with pipeline gaps to bring new pharmaceuticals to the clinician.
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
49 CFR 110.30 - Grant application.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., Pipeline and Hazardous Materials Safety Administration, U.S. Department of Transportation, East Building... emergency response team; and (D) The impact that the grant will have on the program. (ii) A discussion of...
ERIC Educational Resources Information Center
Bernstein, Hamutal; Martin, Carlos; Eyster, Lauren; Anderson, Theresa; Owen, Stephanie; Martin-Caughey, Amanda
2015-01-01
The Urban Institute conducted an implementation and participant-outcomes evaluation of the Alaska Native Science & Engineering Program (ANSEP). ANSEP is a multi-stage initiative designed to prepare and support Alaska Native students from middle school through graduate school to succeed in science, technology, engineering, and math (STEM)…
2016-06-10
...Democratic Society; White House Leadership Development Program (WHLD); Harvard Kennedy School (HKS) Senior Executive Fellows Program; George...; Nurse Leaders: An Exploration of Current Nurse Leadership Development in the Veterans Health Administration.
ERIC Educational Resources Information Center
Palmer, Mark H.; Elmore, R. Douglas; Watson, Mary Jo; Kloesel, Kevin; Palmer, Kristen
2009-01-01
Very few Native American students pursue careers in the geosciences. To address this national problem, several units at the University of Oklahoma are implementing a geoscience "pipeline" program that is designed to increase the number of Native American students entering geoscience disciplines. One of the program's strategies includes…
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2011 CFR
2011-10-01
Gas Transmission Pipeline Integrity Management, § 192.907: What must an operator do to implement this subpart? (a) "...follow a written integrity management program that contains all the elements described in § 192.911 and..."; "...management program must consist, at a minimum, of a framework that describes the process for implementing..." (49 CFR, Title 49: Transportation, Vol. 3.)
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2012 CFR
2012-10-01
Gas Transmission Pipeline Integrity Management, § 192.907: What must an operator do to implement this subpart? (a) "...follow a written integrity management program that contains all the elements described in § 192.911 and..."; "...management program must consist, at a minimum, of a framework that describes the process for implementing..." (49 CFR, Title 49: Transportation, Vol. 3.)
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2013 CFR
2013-10-01
Gas Transmission Pipeline Integrity Management, § 192.907: What must an operator do to implement this subpart? (a) "...follow a written integrity management program that contains all the elements described in § 192.911 and..."; "...management program must consist, at a minimum, of a framework that describes the process for implementing..." (49 CFR, Title 49: Transportation, Vol. 3.)
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2014 CFR
2014-10-01
Gas Transmission Pipeline Integrity Management, § 192.907: What must an operator do to implement this subpart? (a) "...follow a written integrity management program that contains all the elements described in § 192.911 and..."; "...management program must consist, at a minimum, of a framework that describes the process for implementing..." (49 CFR, Title 49: Transportation, Vol. 3.)
NASA Astrophysics Data System (ADS)
Hosseini, Mahmood; Salek, Shamila; Moradi, Masoud
2008-07-01
The effect of the corrosion phenomenon has been investigated by performing several sets of 3-Dimensional Nonlinear Time History Analysis (3-D NLTHA) in which soil-structure interaction as well as wave propagation effects have been taken into consideration. The 3-D NLTHA has been performed using a finite element computer program, and both overall and local corrosion states have been considered in the study. Corrosion has been modeled in the computer program by introducing decreased values of either the pipe wall thickness or the modulus of elasticity and Poisson ratio. Three sets of 3-component accelerograms have been used in the analyses, and appropriate numbers of zeros have been added at the beginning of the records to take into account wave propagation in the soil and its multi-support excitation effect. The soil has been modeled by nonlinear springs in the longitudinal, lateral, and vertical directions. A relatively long segment of the pipeline has been considered for the study, and the effect of end conditions has been investigated by assuming different kinds of end supports for the segment. After studying the corroded pipeline, a remedy has been considered for the seismic retrofit of the corroded pipe by using a kind of Fiber Reinforced Polymer (FRP) cover. The analyses have been repeated for the retrofitted pipeline to assess the adequacy of the FRP cover. Numerical results show that if the length of the pipeline segment is large enough compared to the wavelength of the shear wave in the soil, the end conditions do not have any major effect on the maximum stress and strain values in the pipe. Results also show that corrosion can increase plastic strain values in the pipe by up to 4 times in the case of overall corrosion and up to 20 times in the case of local corrosion. The satisfactory effect of using the FRP cover is also shown by the analysis results, which confirm a decrease of strain values to one third.
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2012 CFR
2012-07-01
... interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for the benefit of the...
30 CFR 1206.157 - Determination of transportation allowances.
Code of Federal Regulations, 2011 CFR
2011-07-01
... interconnected to a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research, development, and commercialization programs on natural gas related topics for the benefit of the...
NASA Astrophysics Data System (ADS)
Nitheesh Kumar, P.; Khan, Vishwas Chandra; Balaganesan, G.; Pradhan, A. K.; Sivakumar, M. S.
2018-04-01
The present study is concerned with the repair of through-thickness corrosion or leaking defects in metallic pipelines using a commercially available metallic seal and glass/epoxy composite. Pipe specimens are made with three different types of the most commonly occurring through-thickness corrosion/leaking defects. The metallic seal is applied over the through-thickness corrosion/leaking defect and is reinforced with a glass/epoxy composite overwrap. The main objective of the metallic seal is to arrest the leak at live pressure. After reinforcing the metallic seal with the glass/epoxy composite overwrap, the repaired composite wrap is able to sustain high pressures. Burst tests are performed for different configurations of the metallic seal, and the optimum configuration is determined. The optimum configurations of the metallic seal for the three different types of through-thickness corrosion/leaking defects are further reinforced with a glass/epoxy composite wrap, and the experimental failure pressure is determined by performing the burst test. An analytical model as per ISO 24817 has been developed to validate the experimental results.
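For orientation, a simplified thin-wall sizing of the composite overwrap, assuming the laminate carries the full hoop load at the through-thickness defect, can be sketched as below. The laminate modulus and allowable strain are assumed values, and this is only a hoop-strain sketch, not the full ISO 24817 design procedure.

    # Simplified laminate sizing from hoop strain: eps = p*D/(2*E*t).
    def repair_thickness(p_mpa, d_mm, e_c_mpa=30_000.0, eps_allow=0.0025):
        """Minimum laminate thickness (mm), repair carrying the full hoop load."""
        return p_mpa * d_mm / (2.0 * e_c_mpa * eps_allow)

    print(f"t_min = {repair_thickness(p_mpa=10.0, d_mm=168.3):.1f} mm")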
Orecchioni, Marco; Bedognetti, Davide; Newman, Leon; Fuoco, Claudia; Spada, Filomena; Hendrickx, Wouter; Marincola, Francesco M; Sgarrella, Francesco; Rodrigues, Artur Filipe; Ménard-Moyon, Cécilia; Cesareni, Gianni; Kostarelos, Kostas; Bianco, Alberto; Delogu, Lucia G
2017-10-24
Understanding the biomolecular interactions between graphene and human immune cells is a prerequisite for its utilization as a diagnostic or therapeutic tool. To characterize the complex interactions between graphene and immune cells, we propose an integrative analytical pipeline encompassing the evaluation of molecular and cellular parameters. Herein, we use single-cell mass cytometry to dissect the effects of graphene oxide (GO) and GO functionalized with amino groups (GONH2) on 15 immune cell populations, interrogating 30 markers at the single-cell level. Next, the integration of single-cell mass cytometry with genome-wide transcriptome analysis shows that the amine groups reduce the perturbations caused by GO on cell metabolism and increase biocompatibility. Moreover, GONH2 polarizes T-cell and monocyte activation toward a T helper-1/M1 immune response. This study describes an innovative approach for the analysis of the effects of nanomaterials on distinct immune cells, laying the foundation for the incorporation of single-cell mass cytometry into the experimental pipeline.
Integrative pipeline for profiling DNA copy number and inferring tumor phylogeny.
Urrutia, Eugene; Chen, Hao; Zhou, Zilu; Zhang, Nancy R; Jiang, Yuchao
2018-06-15
Copy number variation is an important and abundant source of variation in the human genome and has been associated with a number of diseases, especially cancer. Massively parallel next-generation sequencing allows copy number profiling with fine resolution. Such efforts, however, have met with mixed success, with setbacks arising partly from the lack of reliable analytical methods to meet the diverse and unique challenges arising from the myriad experimental designs and study goals in genetic studies. In cancer genomics, detection of somatic copy number changes and profiling of allele-specific copy number (ASCN) are complicated by experimental biases and artifacts as well as normal cell contamination and cancer subclone admixture. Furthermore, careful statistical modeling is warranted to reconstruct tumor phylogeny from both somatic ASCN changes and single nucleotide variants. Here we describe a flexible computational pipeline, MARATHON, which integrates multiple related statistical software packages for copy number profiling and downstream analyses in disease genetic studies. MARATHON is publicly available at https://github.com/yuchaojiang/MARATHON. Supplementary data are available at Bioinformatics online.
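One elementary step in such profiling, a depth-normalized tumor/normal log2 coverage ratio per genomic bin, can be sketched as follows. This illustrates the generic idea only; MARATHON's modules add GC and mappability correction, segmentation, and ASCN modeling on top of it.

    # Generic per-bin log2 coverage ratio, normalized for library size.
    import numpy as np

    def log2_ratio(tumor_counts, normal_counts, pseudo=1.0):
        t = np.asarray(tumor_counts, float) + pseudo
        n = np.asarray(normal_counts, float) + pseudo
        t /= t.sum()  # normalize for sequencing depth
        n /= n.sum()
        return np.log2(t / n)

    print(log2_ratio([100, 220, 95], [110, 105, 100]))  # middle bin suggests a gain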
A real-time coherent dedispersion pipeline for the giant metrewave radio telescope
NASA Astrophysics Data System (ADS)
De, Kishalay; Gupta, Yashwant
2016-02-01
A fully real-time coherent dedispersion system has been developed for the pulsar back-end at the Giant Metrewave Radio Telescope (GMRT). The dedispersion pipeline uses the single phased array voltage beam produced by the existing GMRT software back-end (GSB) to produce coherently dedispersed intensity output in real time, for the currently operational bandwidths of 16 MHz and 32 MHz. Provision has also been made to coherently dedisperse voltage beam data from observations recorded on disk. We discuss the design and implementation of the real-time coherent dedispersion system, describing the steps carried out to optimise the performance of the pipeline. Presently functioning on an Intel Xeon X5550 CPU equipped with an NVIDIA Tesla C2075 GPU, the pipeline allows dispersion free, high time resolution data to be obtained in real-time. We illustrate the significant improvements over the existing incoherent dedispersion system at the GMRT, and present some preliminary results obtained from studies of pulsars using this system, demonstrating its potential as a useful tool for low frequency pulsar observations. We describe the salient features of our implementation, comparing it with other recently developed real-time coherent dedispersion systems. This implementation of a real-time coherent dedispersion pipeline for a large, low frequency array instrument like the GMRT, will enable long-term observing programs using coherent dedispersion to be carried out routinely at the observatory. We also outline the possible improvements for such a pipeline, including prospects for the upgraded GMRT which will have bandwidths about ten times larger than at present.
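The core of any such system is the textbook coherent-dedispersion operation: deconvolve the interstellar dispersion by multiplying the baseband voltage spectrum with the inverse of the dispersion transfer function. The sketch below is the generic algorithm with an assumed sign convention (which depends on sideband and baseband conventions), not the GSB/GPU implementation.

    # Generic coherent dedispersion of a complex baseband voltage series.
    import numpy as np

    D = 4.148808e9  # MHz^2 pc^-1 cm^3 s (dispersion constant, frequencies in MHz)

    def dedisperse(voltage, f0_mhz, bw_mhz, dm):
        n = voltage.size
        spec = np.fft.fft(voltage)
        df = np.fft.fftfreq(n, d=1.0 / bw_mhz)         # offset from f0, MHz
        phase = 2.0 * np.pi * D * dm * df**2 / (f0_mhz**2 * (f0_mhz + df))
        return np.fft.ifft(spec * np.exp(1j * phase))  # apply inverse chirp

    rng = np.random.default_rng(2)
    v = rng.normal(size=4096) + 1j * rng.normal(size=4096)
    out = dedisperse(v, f0_mhz=325.0, bw_mhz=32.0, dm=56.7)
    print(out.shape, out.dtype)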
Deep ocean corrosion research in support of Oman India gas pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, F.W.; McKeehan, D.S.
1995-12-01
The increasing interest in deepwater exploration and production has motivated the development of technologies required to accomplish tasks heretofore possible only onshore and in shallow water. The tremendous expense of technology development and the cost of specialized equipment have created concerns that the design life of these facilities may be compromised by corrosion. The requirement to develop and prove design parameters to meet these demands will require an ongoing environmental testing and materials evaluation and development program. This paper describes a two-fold corrosion testing program involving: (1) the installation of two corrosion test devices in situ, and (2) a laboratory test conducted in simulated site-specific seawater. These tests are expected to qualify key parameters necessary to design a cathodic protection system to protect the Oman-to-India pipeline.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, A.G.
The Pacific Northwest Laboratory (PNL)/Analytical Chemistry Laboratory (ACL) and the Westinghouse Hanford Company (WHC)/Process Analytical Laboratory (PAL) provide analytical support services to various environmental restoration and waste management projects/programs at Hanford. In response to a US Department of Energy -- Richland Field Office (DOE-RL) audit, which questioned the comparability of analytical methods employed at each laboratory, the Sample Exchange/Exchange (SEE) program was initiated. The SEE Program is a self-assessment program designed to compare analytical methods of the PAL and ACL laboratories using site-specific waste material. The SEE program is managed by a collaborative, the Quality Assurance Triad (Triad). Triad membership is made up of representatives from the WHC/PAL, PNL/ACL, and WHC Hanford Analytical Services Management (HASM) organizations. The Triad works together to design/evaluate/implement each phase of the SEE Program.
Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A
2005-04-01
The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.
Fuchs, Jonathan; Kouyate, Aminta; Kroboth, Liz; McFarland, Willi
2016-01-01
Structured, mentored research programs for high school and undergraduate students from underrepresented minority (URM) backgrounds are needed to increase the diversity of our nation’s biomedical research workforce. In particular, a robust pipeline of investigators from the communities disproportionately affected by the HIV epidemic is needed not only for fairness and equity but for insights and innovations to address persistent racial and ethnic disparities in new infections. We created the Summer HIV/AIDS Research Program (SHARP) at the San Francisco Department of Public Health for URM undergraduates as a 12-week program of hands-on research experience, one-on-one mentoring by a senior HIV investigator, didactic seminars for content and research methods, and networking opportunities. The first four cohorts (2012–2015) of SHARP gained research skills, built confidence in their abilities and self-identified as scientists. In addition, the majority of program alumni is employed in research positions and has been admitted to or is pursuing graduate degree programs in fields related to HIV prevention. While we await empirical studies of specific mentoring strategies at early educational stages, programs that engage faculty who are sensitive to the unique challenges facing diverse students and who draw lessons from established mentoring frameworks can help build an inclusive generation of HIV researchers. PMID:27066986
NASA Astrophysics Data System (ADS)
Karimi, Kurosh; Shirzaditabar, Farzad
2017-08-01
The analytic signal of the magnitude of the magnetic field components and its first derivatives has been employed for locating magnetic structures that can be considered as point dipoles or lines of dipoles. Although similar methods have been used for locating such magnetic anomalies, they cannot estimate the positions of anomalies in noisy conditions with acceptable accuracy. The methods are also inexact in determining the depth of deep anomalies. In noisy cases, and in places other than the poles, the maximum points of the magnitude of the magnetic vector components and Az are not located exactly above 3D bodies. Consequently, the horizontal location estimates of bodies are accompanied by errors. Here, the previous methods are altered and generalized to locate deeper models in the presence of noise, even at lower magnetic latitudes. In addition, a statistical technique is presented for working in noisy areas, and a new method, made resistant to noise by using a 'depths mean' approach, is introduced. Reduction-to-the-pole transformation is also used to find the most probable actual horizontal body location. Deep models are also well estimated. The method is tested on real magnetic data over an urban gas pipeline in the vicinity of Kermanshah province, Iran. The estimated location of the pipeline is accurate, in accordance with the result of the half-width method.
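The quantity at the heart of such methods, the analytic-signal amplitude |A| = sqrt((dT/dx)^2 + (dT/dy)^2 + (dT/dz)^2), can be computed from gridded data with a standard potential-field recipe: horizontal derivatives by finite differences and the vertical derivative via the |k| multiplier in the wavenumber domain. The sketch below, with a toy anomaly, is a generic recipe and not the authors' code.

    # Analytic-signal amplitude of a gridded magnetic anomaly.
    import numpy as np

    def analytic_signal_amplitude(grid, dx, dy):
        gy, gx = np.gradient(grid, dy, dx)  # horizontal derivatives
        ky = np.fft.fftfreq(grid.shape[0], dy) * 2 * np.pi
        kx = np.fft.fftfreq(grid.shape[1], dx) * 2 * np.pi
        k = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))
        gz = np.real(np.fft.ifft2(np.fft.fft2(grid) * k))  # vertical derivative
        return np.sqrt(gx**2 + gy**2 + gz**2)

    x = np.linspace(-50, 50, 128)
    field = 1.0 / (1.0 + (np.hypot(*np.meshgrid(x, x)) / 10.0) ** 3)  # toy anomaly
    print(analytic_signal_amplitude(field, dx=x[1] - x[0], dy=x[1] - x[0]).max())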
Customisation of the exome data analysis pipeline using a combinatorial approach.
Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay
2012-01-01
The advent of next generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, this approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives, and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stages, suggesting the most suitable combination as determined by a simple framework of pre-existing metrics to create significant datasets.
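The consensus idea itself is simple to sketch: retain only variants reported by at least k callers. The tuples below stand in for parsed, normalized VCF records; a real pipeline would also left-align indels before comparing call sets.

    # Consensus filter: keep variants reported by at least 2 of 3 callers.
    from collections import Counter

    caller_a = {("chr1", 1_000_123, "A", "G"), ("chr2", 555_001, "C", "T")}
    caller_b = {("chr1", 1_000_123, "A", "G"), ("chr3", 42_917, "G", "A")}
    caller_c = {("chr1", 1_000_123, "A", "G"), ("chr2", 555_001, "C", "T")}

    votes = Counter(v for calls in (caller_a, caller_b, caller_c) for v in calls)
    consensus = {v for v, n in votes.items() if n >= 2}
    print(sorted(consensus))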
The MSCA Program: Developing Analytic Unicorns
ERIC Educational Resources Information Center
Houghton, David M.; Schertzer, Clint; Beck, Scott
2018-01-01
Marketing analytics students who can communicate effectively with decision makers are in high demand. These "analytic unicorns" are hard to find. The Master of Science in Customer Analytics (MSCA) degree program at Xavier University seeks to fill that need. In this paper, we discuss the process of creating the MSCA program. We outline…
Code of Federal Regulations, 2013 CFR
2013-10-01
Minimum Federal Safety Standards, Gas Transmission Pipeline Integrity Management, § 192.915: What knowledge and training must personnel have to...? "...to the integrity management program possesses and maintains a thorough knowledge of the integrity..." (49 CFR, Title 49: Transportation, Vol. 3.)
Code of Federal Regulations, 2014 CFR
2014-10-01
Minimum Federal Safety Standards, Gas Transmission Pipeline Integrity Management, § 192.915: What knowledge and training must personnel have to...? "...to the integrity management program possesses and maintains a thorough knowledge of the integrity..." (49 CFR, Title 49: Transportation, Vol. 3.)
Code of Federal Regulations, 2011 CFR
2011-10-01
Minimum Federal Safety Standards, Gas Transmission Pipeline Integrity Management, § 192.915: What knowledge and training must personnel have to...? "...to the integrity management program possesses and maintains a thorough knowledge of the integrity..." (49 CFR, Title 49: Transportation, Vol. 3.)
Code of Federal Regulations, 2012 CFR
2012-10-01
Minimum Federal Safety Standards, Gas Transmission Pipeline Integrity Management, § 192.915: What knowledge and training must personnel have to...? "...to the integrity management program possesses and maintains a thorough knowledge of the integrity..." (49 CFR, Title 49: Transportation, Vol. 3.)
ERIC Educational Resources Information Center
Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.
2014-01-01
African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…
Developing and Managing Talent in the SEA. Benchmark. No. 4
ERIC Educational Resources Information Center
Gross, B.; Jochim, A.
2013-01-01
State education agencies (SEAs) are reframing their work to be more coordinated and strategic, but talent in most SEAs continues to be defined in large part by federal programs and oriented toward the routines of compliance. Existing talent pipelines in SEAs are rooted in the historic functions of administering federal programs and doing little…
ERIC Educational Resources Information Center
Wesley Schultz, P.; Hernandez, Paul R.; Woodcock, Anna; Estrada, Mica; Chance, Randie C.; Aguilar, Maria; Serpe, Richard T.
2011-01-01
For more than 40 years, there has been a concerted national effort to promote diversity among the scientific research community. Yet given the persistent national-level disparity in educational achievements of students from various ethnic and racial groups, the efficacy of these programs has come into question. The current study reports results…
Learning Systematically from Experience through a Research-to-Practice Pipeline in Chicago
ERIC Educational Resources Information Center
Fine, Wendy; Lansing, Jiffy; Bacon, Marshaun
2018-01-01
The Becoming A Man (BAM) program is a school-based group counseling and mentoring program run by Youth Guidance (YG), a community organization that serves children in Chicago schools who are at risk. BAM guides young men to learn, internalize, and practice social cognitive skills, make responsible decisions for their future, and become positive…
Advanced Technological Education (ATE) Program: Building a Pipeline of Skilled Workers. Policy Brief
ERIC Educational Resources Information Center
American Youth Policy Forum, 2010
2010-01-01
In the Fall of 2008, the American Youth Policy Forum hosted a series of three Capitol Hill forums showcasing the Advanced Technological Education (ATE) program supported by the National Science Foundation (NSF). The goal of these forums was to educate national policymakers about the importance of: (1) improving the science and math competencies of…
NASA Astrophysics Data System (ADS)
Kyrychok, Vladyslav; Torop, Vasyl
2018-03-01
The present paper addresses the assessment of probable crack growth in the nozzle zone of pressure vessels under cyclic seismic loads. Approaches to modeling distributed pipeline systems connected to equipment are proposed. The possibility of jointly using different finite element program packages for accurate strength estimation of coupled pipeline and pressure vessel systems is shown and justified. Based on the developed approach, the authors propose checking the danger of defects in the nozzle region and evaluating the residual life of the system.
Simulation of a manual electric-arc welding in a working gas pipeline. 1. Formulation of the problem
NASA Astrophysics Data System (ADS)
Baikov, V. I.; Gishkelyuk, I. A.; Rus', A. M.; Sidorovich, T. V.; Tonkonogov, B. A.
2010-11-01
Problems of mathematical simulation of the temperature stresses arising in the wall of a pipe of a cross-country gas pipeline during electric-arc welding of defects in it have been considered. Mathematical models of the formation of temperatures, deformations, and stresses in a gas pipe subjected to phase transformations have been developed. These models were numerically realized as algorithms forming part of an application-program package. Results of the verification of the computational package and of calculations performed with it are presented.
49 CFR 109.3 - Inspections and Investigations.
Code of Federal Regulations, 2014 CFR
2014-10-01
....3 Transportation Other Regulations Relating to Transportation PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION HAZARDOUS MATERIALS AND OIL TRANSPORTATION HAZARDOUS MATERIALS PROGRAM PROCEDURES Inspections and Investigations § 109.3 Inspections and Investigations. (a...
Transport of thermal water from well to thermal baths
NASA Astrophysics Data System (ADS)
Montegrossi, Giordano; Vaselli, Orlando; Tassi, Franco; Nocentini, Matteo; Liccioli, Caterina; Nisi, Barbara
2013-04-01
The main problem in building a thermal bath is having a hot spring or a thermal well located in a position appropriate for customer access; since Roman times, thermal baths were distributed across the whole empire, and roads and cities were often built around them afterwards. Nowadays, the perspective has changed, and occasionally the thermal resource must be transported through a pipeline system from the source to the spa. The geothermal fluid, however, may present problems of corrosion and scaling during transport. In the Ambra valley, central Italy, a geothermal well has recently been drilled; it discharges a Ca(Mg)-SO4, CO2-rich water at a temperature of 41 °C that could be used to supply a new spa in the area surrounding the well itself. The main problem is that the producing well is located in a forest ca. 4 km away from the nearest structure suitable to host the thermal bath. In this study, we illustrate the pipeline design from the producing well to the spa, constraining the physical and geochemical parameters to reduce scaling and corrosion phenomena. The starting point is the thermal well, which has a flow rate ranging from 22 up to 25 L/sec. The thermal fluid precipitates calcite heavily (50-100 ton/month) due to the calcite-CO2 equilibrium in the reservoir, where a CO2 partial pressure of 11 bar is present. One of the most vexing problems in investigating scaling processes during fluid transport in the pipeline is that no proper software package exists for multiphase fluid flow in pipes with such a complex chemistry. As a consequence, we used a modified TOUGHREACT with the Pitzer database, arranged to use the Darcy-Weisbach equation and applying "fictitious" material properties in order to reproduce the proper y-z velocity profile in comparison with the analytical solution for laminar fluid flow in pipes. This investigation yielded the lowest CO2 partial pressure to be maintained in the pipeline (nearly 2.5 bar) to avoid uncontrolled calcite precipitation, and the pipeline path was designed accordingly. Non-linear phenomena that may cause calcite precipitation, such as phase separation and pressure waves, are discussed. The pipeline and the thermal bath are planned to be built next year.
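The head-loss side of such a design lends itself to a quick numerical check. The sketch below applies the Darcy-Weisbach equation mentioned above to a 4 km line at the reported 22-25 L/sec; the pipe diameter, roughness, and fluid properties are illustrative assumptions, not values from the study.

    # Hedged sketch: Darcy-Weisbach head loss for the ~4 km thermal-water line.
    # Diameter, roughness, and viscosity are illustrative assumptions.
    import math

    Q = 0.025      # flow rate, m^3/s (upper end of the reported 22-25 L/sec)
    L = 4000.0     # pipeline length, m (~4 km well-to-spa distance)
    D = 0.20       # assumed pipe diameter, m
    eps = 4.5e-5   # assumed absolute roughness, m (new steel)
    rho = 992.0    # water density at ~41 degC, kg/m^3
    mu = 6.5e-4    # dynamic viscosity at ~41 degC, Pa*s

    A = math.pi * D**2 / 4
    v = Q / A
    Re = rho * v * D / mu

    if Re < 2300:                    # laminar: exact analytical friction factor
        f = 64.0 / Re
    else:                            # turbulent: Swamee-Jain explicit fit
        f = 0.25 / math.log10(eps / (3.7 * D) + 5.74 / Re**0.9) ** 2

    h_f = f * (L / D) * v**2 / (2 * 9.81)   # Darcy-Weisbach head loss, m
    print(f"v = {v:.2f} m/s, Re = {Re:.0f}, f = {f:.4f}, head loss = {h_f:.1f} m")

With these assumed values the flow is turbulent (Re on the order of 2e5) and the friction loss comes out near 11 m of head, the kind of number the pipeline routing and pumping design would have to absorb.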
Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay
2018-01-01
The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to, LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface, which is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462
Transformation of an uncertain video search pipeline to a sketch-based visual analytics loop.
Legg, Philip A; Chung, David H S; Parry, Matthew L; Bown, Rhodri; Jones, Mark W; Griffiths, Iwan W; Chen, Min
2013-12-01
Traditional sketch-based image or video search systems rely on machine learning concepts as their core technology. However, in many applications, machine learning alone is impractical since videos may not be semantically annotated sufficiently, there may be a lack of suitable training data, and the search requirements of the user may frequently change for different tasks. In this work, we develop a visual analytics system that overcomes the shortcomings of the traditional approach. We make use of a sketch-based interface to enable users to specify search requirements in a flexible manner without depending on semantic annotation. We employ active machine learning to train different analytical models for different types of search requirements. We use visualization to facilitate knowledge discovery at the different stages of visual analytics. This includes visualizing the parameter space of the trained model, visualizing the search space to support interactive browsing, visualizing candidate search results to support rapid interaction for active learning while minimizing the watching of videos, and visualizing aggregated information of the search results. We demonstrate the system for searching spatiotemporal attributes from sports video to identify key instances of team and player performance.
EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.
Theoretical and Conceptual Framework for a High School Pathways to Pharmacy Program
Awé, Clara; Bauman, Jerry
2010-01-01
Objectives: To determine whether participation in the University of Illinois at Chicago College of Pharmacy (UIC-COP) Pathways to Pharmacy, an early urban pipeline program, motivated underrepresented minority students to pursue a prepharmacy curriculum in college and choose pharmacy as a career. Methods: Over a 4-year period, underrepresented minority high school students participated in a comprehensive 6-week program that included 3 weeks of prepharmacy curriculum and intensive socialization and 3 weeks working as a pharmacy technician in a chain pharmacy. The High School Survey of Student Engagement (HSSSE) was administered 3 times to 120 program participants from 2005-2008, with 4 open-ended questions added to the pretest, 3 open-ended questions added to the test administered at the midpoint of the program, and 7 open-ended questions added to the posttest. Results: After completing the program, 88 (75%) of the 120 students enrolled in the college's prepharmacy curriculum and planned to pursue a career in pharmacy, 10 (8%) were not interested in pursuing a career in pharmacy, and 20 (17%) were undecided, compared to the pretest data which showed that 40 (33%) were interested in a career in pharmacy, and 80 (67%) were undecided (p < 0.0001). Conclusions: Participation in a Pathways to Pharmacy program grounded in both a theoretical and conceptual socialization model framework increased the number of underrepresented minority students in the pipeline to pharmacy schools. PMID:21179260
Rashied-Henry, Kweli; Fraser-White, Marilyn; Roberts, Calpurnyia B; Wilson, Tracey E; Morgan, Rochelle; Brown, Humberto; Shaw, Raphael; Jean-Louis, Girardin; Graham, Yvonne J; Brown, Clinton; Browne, Ruth
2012-01-01
The purpose of this paper was to describe the development and implementation of a health disparities summer internship program for minority high school students that was created to increase their knowledge of health disparities, provide hands-on training in community-engaged research, support their efforts to advocate for policy change, and further encourage youth to pursue careers in the health professions. Fifty-one high school students who were enrolled in a well-established, science-enrichment after-school program in Brooklyn, New York, participated in a 4-week summer internship program. Students conducted a literature review, focus groups/interviews, geographic mapping, or survey development that focused on reducing health disparities at 1 of 15 partnering community-based organizations (CBOs). Overall, student interns showed increased knowledge of racial/ethnic health disparities. There was a 36.2% increase in students expressing an interest in pursuing careers in minority health post-program. The majority of the participating CBOs were able to utilize the results of the student-led research projects for their programs. In addition, research conclusions and policy recommendations based on the students' projects were given to local elected officials. As demonstrated by our program, community-academic partnerships can provide educational opportunities to strengthen the academic pipeline for students of color interested in health careers and health disparities research.
STARL -- a Program to Correct CCD Image Defects
NASA Astrophysics Data System (ADS)
Narbutis, D.; Vanagas, R.; Vansevičius, V.
We present a program tool, STARL, designed for automatic detection and correction of various defects in CCD images. It uses a genetic algorithm for deblending and restoring overlapping saturated stars in crowded stellar fields. Using Subaru Telescope Suprime-Cam images, we demonstrate that the program can be incorporated into wide-field survey data processing pipelines for the production of high-quality color mosaics. The source code and examples are available at the STARL website.
Implementing a Workforce Development Pipeline
NASA Technical Reports Server (NTRS)
Hix, Billy
2002-01-01
Research shows that the number of highly trained scientists and engineers continued a steady decline during the 1990s. Furthermore, at the high school level, almost 40% of all high school graduates are seeking technical skills in preparation for entering the workforce directly. The decrease in students in technology and science programs, along with the lack of viable vocational programs, haunts educators and businesses alike. However, MSFC (Marshall Space Flight Center) has the opportunity to become a leading-edge model of workforce development by offering a unified program of apprenticeships, workshops, and educational initiatives. These programs will be designed to encourage young people of all backgrounds to pursue the fields of technology and science, to assist research opportunities, and to support teachers in the systemic changes that they are facing. The emphasis of our program, by grade level, will be: elementary level, exposure to the workforce; middle school, examining the workforce; high school and beyond, instructing the workforce. It is proposed that MSFC create a well-integrated Workforce Development Pipeline Program. The program will act to integrate the many and varied programs offered across MSFC directorates and offices. It will offer a clear path of programs for students throughout middle school, high school, technical training, and colleges and universities. The end result would consist of technicians and holders of bachelor's, master's, and PhD degrees in science and engineering fields entering the nation's workforce, with a focus on NASA's future personnel needs.
Cooperative agreement # RITARS-14-H-HOU : final report.
DOT National Transportation Integrated Search
2016-07-15
The University of Houston, in partnership with the Gas Technology Institute, and with support from the : Commercial Remote Sensing & Spatial Technologies Program at the U.S. Department of Transportation : undertook a pilot project to mitigate pipelin...
Rep. Brown, Corrine [D-FL-3
2011-10-14
House - 10/17/2011 Referred to the Subcommittee on Railroads, Pipelines, and Hazardous Materials.
18 CFR 284.8 - Release of firm capacity on interstate pipelines.
Code of Federal Regulations, 2010 CFR
2010-04-01
... paragraph (h)(3) of this section; (ii) A release of capacity to a marketer participating in a state...) A release to a marketer participating in a state-regulated retail access program exempt from bidding...
Crystallographic data processing for free-electron laser sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Thomas A., E-mail: taw@physics.org; Barty, Anton; Stellato, Francesco
2013-07-01
A processing pipeline for diffraction data acquired using the 'serial crystallography' methodology with a free-electron laser source is described with reference to the crystallographic analysis suite CrystFEL and the pre-processing program Cheetah. A detailed analysis of the nature and impact of indexing ambiguities is presented. Simulations of the Monte Carlo integration scheme, which accounts for the partially recorded nature of the diffraction intensities, are presented and show that the integration of partial reflections could be made to converge more quickly if the bandwidth of the X-rays were to be increased by a small amount or if a slight convergence angle were introduced into the incident beam.
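The convergence behavior of Monte Carlo merging can be illustrated with a toy model: each snapshot records only a random fraction (partiality) of a reflection's true intensity, and the merged value is a plain average over snapshots. The sketch below is a minimal caricature under that assumption, not CrystFEL's actual partiality treatment; the bandwidth_boost knob mimics how pushing partialities toward full recording speeds convergence.

    # Hedged toy model of serial-crystallography Monte Carlo merging. The
    # merged average converges to (mean partiality) * I_true; its scatter
    # shrinks as more snapshots are added, and shrinks faster when the
    # partialities are pushed closer to 1 (larger effective bandwidth).
    import random

    random.seed(1)
    I_true = 1000.0  # "true" full intensity of one reflection (arbitrary units)

    def merged_estimate(n_snapshots, bandwidth_boost=1.0):
        obs = []
        for _ in range(n_snapshots):
            p = min(1.0, random.random() * bandwidth_boost)  # random partiality
            noise = random.gauss(1.0, 0.1)                   # counting noise
            obs.append(p * I_true * noise)
        return sum(obs) / len(obs)

    for n in (10, 100, 1000, 10000):
        print(n, round(merged_estimate(n), 1), round(merged_estimate(n, 2.0), 1))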
TagDigger: user-friendly extraction of read counts from GBS and RAD-seq data.
Clark, Lindsay V; Sacks, Erik J
2016-01-01
In genotyping-by-sequencing (GBS) and restriction site-associated DNA sequencing (RAD-seq), read depth is important for assessing the quality of genotype calls and estimating allele dosage in polyploids. However, existing pipelines for GBS and RAD-seq do not provide read counts in formats that are both accurate and easy to access. Additionally, although existing pipelines allow previously-mined SNPs to be genotyped on new samples, they do not allow the user to manually specify a subset of loci to examine. Pipelines that do not use a reference genome assign arbitrary names to SNPs, making meta-analysis across projects difficult. We created the software TagDigger, which includes three programs for analyzing GBS and RAD-seq data. The first script, tagdigger_interactive.py, rapidly extracts read counts and genotypes from FASTQ files using user-supplied sets of barcodes and tags. Input and output is in CSV format so that it can be opened by spreadsheet software. Tag sequences can also be imported from the Stacks, TASSEL-GBSv2, TASSEL-UNEAK, or pyRAD pipelines, and a separate file can be imported listing the names of markers to retain. A second script, tag_manager.py, consolidates marker names and sequences across multiple projects. A third script, barcode_splitter.py, assists with preparing FASTQ data for deposit in a public archive by splitting FASTQ files by barcode and generating MD5 checksums for the resulting files. TagDigger is open-source and freely available software written in Python 3. It uses a scalable, rapid search algorithm that can process over 100 million FASTQ reads per hour. TagDigger will run on a laptop with any operating system, does not consume hard drive space with intermediate files, and does not require programming skill to use.
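The core tag-counting step can be pictured with a short sketch: split each FASTQ read by barcode prefix, match the remainder against user-supplied tag sequences, and tally counts into a CSV. This is a minimal illustration of the idea, assuming exact prefix matching and hypothetical file, sample, and marker names; it is not TagDigger's own code.

    # Hedged sketch of the TagDigger idea: barcode-split FASTQ reads, count
    # exact tag matches, write CSV that spreadsheet software can open.
    import csv
    from collections import defaultdict

    barcodes = {"ACGT": "sample1", "TGCA": "sample2"}        # barcode -> sample
    tags = {"GGATCCTT": "marker_A", "TTAGGCCA": "marker_B"}  # tag -> marker

    counts = defaultdict(int)  # (sample, marker) -> read count

    with open("reads.fastq") as fq:
        for i, line in enumerate(fq):
            if i % 4 != 1:      # sequence is the 2nd line of each 4-line record
                continue
            seq = line.strip()
            for bc, sample in barcodes.items():
                if seq.startswith(bc):
                    rest = seq[len(bc):]
                    for tag, marker in tags.items():
                        if rest.startswith(tag):
                            counts[(sample, marker)] += 1
                    break

    with open("counts.csv", "w", newline="") as out:
        w = csv.writer(out)
        w.writerow(["sample", "marker", "reads"])
        for (sample, marker), n in sorted(counts.items()):
            w.writerow([sample, marker, n])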
Byrne, Lauren M; Holt, Kathleen D; Richter, Thomas; Miller, Rebecca S; Nasca, Thomas J
2010-12-01
Increased focus on the number and type of physicians delivering health care in the United States necessitates a better understanding of changes in graduate medical education (GME). Data collected by the Accreditation Council for Graduate Medical Education (ACGME) allow longitudinal tracking of residents, revealing the number and type of residents who continue GME following completion of an initial residency. We examined trends in the percent of graduates pursuing additional clinical education following graduation from ACGME-accredited pipeline specialty programs (specialties leading to initial board certification). Using data collected annually by the ACGME, we tracked residents graduating from ACGME-accredited pipeline specialty programs between academic year (AY) 2002-2003 and AY 2006-2007 and those pursuing additional ACGME-accredited training within 2 years. We examined changes in the number of graduates and the percent of graduates continuing GME by specialty, by type of medical school, and overall. The number of pipeline specialty graduates increased by 1171 (5.3%) between AY 2002-2003 and AY 2006-2007. During the same period, the number of graduates pursuing additional GME increased by 1059 (16.7%). The overall rate of continuing GME increased each year, from 28.5% (6331/22229) in AY 2002-2003 to 31.6% (7390/23400) in AY 2006-2007. Rates differed by specialty and for US medical school graduates (26.4% [3896/14752] in AY 2002-2003 to 31.6% [4718/14941] in AY 2006-2007) versus international medical graduates (35.2% [2118/6023] to 33.8% [2246/6647]). The number of graduates and the rate of continuing GME increased from AY 2002-2003 to AY 2006-2007. Our findings show a recent increase in the rate of continued training for US medical school graduates compared to international medical graduates. Our results differ from previously reported rates of subspecialization in the literature. Tracking individual residents through residency and fellowship programs provides a better understanding of residents' pathways to practice.
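The quoted continuation rates follow directly from the reported counts; a minimal check, using only the numbers in the abstract (USMG = US medical school graduates, IMG = international medical graduates):

    # Reproduce the continuation-of-GME rates from the raw counts above.
    cohorts = {
        "all 2002-03": (6331, 22229),
        "all 2006-07": (7390, 23400),
        "USMG 2002-03": (3896, 14752),
        "USMG 2006-07": (4718, 14941),
        "IMG 2002-03": (2118, 6023),
        "IMG 2006-07": (2246, 6647),
    }
    for label, (continued, graduates) in cohorts.items():
        print(f"{label}: {100 * continued / graduates:.1f}% continued GME")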
Characterization and Validation of Transiting Planets in the TESS SPOC Pipeline
NASA Astrophysics Data System (ADS)
Twicken, Joseph D.; Caldwell, Douglas A.; Davies, Misty; Jenkins, Jon Michael; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Wohler, Bill
2018-06-01
Light curves for Transiting Exoplanet Survey Satellite (TESS) target stars will be extracted and searched for transiting planet signatures in the Science Processing Operations Center (SPOC) Science Pipeline at NASA Ames Research Center. Targets for which the transiting planet detection threshold is exceeded will be processed in the Data Validation (DV) component of the Pipeline. The primary functions of DV are to (1) characterize planets identified in the transiting planet search, (2) search for additional transiting planet signatures in light curves after modeled transit signatures have been removed, and (3) perform a comprehensive suite of diagnostic tests to aid in discrimination between true transiting planets and false positive detections. DV data products include extensive reports by target, one-page summaries by planet candidate, and tabulated transit model fit and diagnostic test results. DV products may be employed by humans and automated systems to vet planet candidates identified in the Pipeline. TESS will launch in 2018 and survey the full sky for transiting exoplanets over a period of two years. The SPOC pipeline was ported from the Kepler Science Operations Center (SOC) codebase and extended for TESS after the mission was selected for flight in the NASA Astrophysics Explorer program. We describe the Data Validation component of the SPOC Pipeline. The diagnostic tests exploit the flux (i.e., light curve) and pixel time series associated with each target to support the determination of the origin of each purported transiting planet signature. We also highlight the differences between the DV components for Kepler and TESS. Candidate planet detections and data products will be delivered to the Mikulski Archive for Space Telescopes (MAST); the MAST URL is archive.stsci.edu/tess. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
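A toy analogue of the transiting planet search may help fix ideas: fold a light curve on trial periods and score a box-shaped dip. The sketch below injects a synthetic 3.5-day, 1000 ppm transit into white noise and recovers it; all numbers are illustrative, and the real SPOC search uses wavelet-based adaptive matched filtering rather than this simple box scan.

    # Hedged toy transit search: phase-fold on trial periods, score a box dip.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(0, 27.0, 2 / 60 / 24)            # 27 d at 2-min cadence (days)
    flux = 1 + 2e-4 * rng.standard_normal(t.size)  # white noise around unity
    in_transit = (t % 3.5) < 0.1                   # injected 3.5 d transits
    flux[in_transit] -= 1e-3                       # 1000 ppm depth

    def box_score(period, dur=0.1):
        phase = t % period
        in_box = phase < dur
        # depth of the box relative to out-of-transit flux, in ppm
        return (flux[~in_box].mean() - flux[in_box].mean()) * 1e6

    periods = np.arange(0.5, 5.0, 0.01)
    scores = np.array([box_score(p) for p in periods])
    best = periods[scores.argmax()]
    print(f"best trial period: {best:.2f} d, depth ~ {scores.max():.0f} ppm")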
Operational experience in underwater photogrammetry
NASA Astrophysics Data System (ADS)
Leatherdale, John D.; John Turner, D.
Underwater photogrammetry has become established as a cost-effective technique for inspection and maintenance of platforms and pipelines for the offshore oil industry. A commercial service based in Scotland operates in the North Sea, USA, Brazil, West Africa and Australia. 70 mm cameras and flash units are built for the purpose and analytical plotters and computer graphics systems are used for photogrammetric measurement and analysis of damage, corrosion, weld failures and redesign of underwater structures. Users are seeking simple, low-cost systems for photogrammetric analysis which their engineers can use themselves.
An unusual and persistent contamination of drinking water by cutting oil.
Rella, R; Sturaro, A; Parvoli, G; Ferrara, D; Doretti, L
2003-02-01
Drinking water contamination by materials, such as cutting oil, used to set up pipelines is an uncommon but possible event. This paper describes the analytical procedures used to identify the components of that contaminant in drinking water. Volatile and semi-volatile chemical species, responsible for an unpleasant taste and odour, were recognised by solid-phase microextraction and GC/MS techniques. Among the volatile compounds, the presence of xylenes, bornyl acetate and diphenyl ether was confirmed with certified standards and quantified in the most contaminated samples.
The year's new drugs & biologics 2016: Part II - Trends and highlights of an unforgettable year.
Graul, A I; Dulsat, C; Tracy, M; Cruces, E
2017-02-01
This eagle's-eye overview of the drug industry in 2016 provides insight into some of last year's top stories, including disease outbreaks that drove R&D, orphan drug development, pipeline attrition, drug pricing, and the ongoing movement in M&A. We also consider recent political events in the U.S. and U.K. and their potential impact on the industry in the years to come, and take a glimpse into the crystal ball to anticipate the new drugs that may be approved in 2017. Copyright 2017 Clarivate Analytics.
ERIC Educational Resources Information Center
Whitebook, Marcy; Sakai, Laura; Kipnis, Fran; Bellm, Dan; Almaraz, Mirella
2010-01-01
Interest in expanding access to higher education has been driven by concerns about ethnic and linguistic stratification within the early childhood workforce, and building a pipeline for diversifying the early care and education (ECE) field's leadership. "Cohort" B.A. completion programs, which target small groups of adults working in ECE…
Practice-based evidence: profiling the safety of cilostazol by text-mining of clinical notes.
Leeper, Nicholas J; Bauer-Mehren, Anna; Iyer, Srinivasan V; Lependu, Paea; Olson, Cliff; Shah, Nigam H
2013-01-01
Peripheral arterial disease (PAD) is a growing problem with few available therapies. Cilostazol is the only FDA-approved medication with a class I indication for intermittent claudication, but carries a black box warning due to concerns for increased cardiovascular mortality. To assess the validity of this black box warning, we employed a novel text-analytics pipeline to quantify the adverse events associated with Cilostazol use in a clinical setting, including patients with congestive heart failure (CHF). We analyzed the electronic medical records of 1.8 million subjects from the Stanford clinical data warehouse spanning 18 years using a novel text-mining/statistical analytics pipeline. We identified 232 PAD patients taking Cilostazol and created a control group of 1,160 PAD patients not taking this drug using 1:5 propensity-score matching. Over a mean follow up of 4.2 years, we observed no association between Cilostazol use and any major adverse cardiovascular event including stroke (OR = 1.13, CI [0.82, 1.55]), myocardial infarction (OR = 1.00, CI [0.71, 1.39]), or death (OR = 0.86, CI [0.63, 1.18]). Cilostazol was not associated with an increase in any arrhythmic complication. We also identified a subset of CHF patients who were prescribed Cilostazol despite its black box warning, and found that it did not increase mortality in this high-risk group of patients. This proof of principle study shows the potential of text-analytics to mine clinical data warehouses to uncover 'natural experiments' such as the use of Cilostazol in CHF patients. We envision this method will have broad applications for examining difficult to test clinical hypotheses and to aid in post-marketing drug safety surveillance. Moreover, our observations argue for a prospective study to examine the validity of a drug safety warning that may be unnecessarily limiting the use of an efficacious therapy.
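The reported effect sizes are odds ratios with Wald-style confidence intervals; a minimal sketch of that computation is below. The 2x2 counts are invented for illustration; only the cohort sizes (232 exposed, 1,160 matched controls) come from the abstract.

    # Hedged sketch: odds ratio and Wald 95% CI from a 2x2 exposure/outcome
    # table. The event splits are illustrative, not the study's data.
    import math

    a, b = 30, 202     # exposed (Cilostazol): outcome yes / no, summing to 232
    c, d = 140, 1020   # matched controls: outcome yes / no, summing to 1160

    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")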
Geolocation Support for Water Supply and Sewerage Projects in Azerbaijan
NASA Astrophysics Data System (ADS)
Qocamanov, M. H.; Gurbanov, Ch. Z.
2016-10-01
Drinking water supply and sewerage system design and reconstruction projects are being conducted extensively in the Republic of Azerbaijan. During the implementation of such projects, collecting large amounts of information about the area and carrying out detailed investigations are crucial. The joint use of aerospace monitoring and GIS plays an essential role in studying the impact of environmental factors and in developing analytical information systems, while ensuring the reliable performance of existing and planned major water supply pipelines as well as the construction and operation of technical installations. With our participation, a GIS has been created at "Azersu" OJSC that includes a systematic database of the drinking water supply, sewerage, and rainwater networks for carrying out the necessary geoinformation analysis. The GIS was created on the "Microstation" platform using aerospace data. It should be mentioned that, in the country, and specifically in its large cities (e.g., Baku, Ganja, Sumqait), drinking water supply pipelines cross regions with different physico-geographical conditions, geomorphological compositions, and seismotectonics. Many accidents on mains water supply lines occur during operation, which also creates problems for drinking water consumers; in some cases the damage is caused by large-scale accidents. Long-term experience shows that eliminating the consequences of accidents carries a major cost. To avoid such events, regular monitoring is therefore very important: constant control of the plan-height position and repeated geodetic measurements at certain time intervals for a detailed examination of the network dynamics. The use of GIS during geodetic monitoring is of special significance. Collecting the geodetic monitoring measurements of the main pipelines in the same coordinate system and processing these data in a single GIS allows an overall assessment of the plan-height state of the major water supply pipeline facilities, as well as studies of the impact of the water supply network on the environment and, conversely, of natural processes on the major pipeline.
Natural gas pipeline leaks across Washington, DC.
Jackson, Robert B; Down, Adrian; Phillips, Nathan G; Ackley, Robert C; Cook, Charles W; Plata, Desiree L; Zhao, Kaiguang
2014-01-01
Pipeline safety in the United States has increased in recent decades, but incidents involving natural gas pipelines still cause an average of 17 fatalities and $133 M in property damage annually. Natural gas leaks are also the largest anthropogenic source of the greenhouse gas methane (CH4) in the U.S. To reduce pipeline leakage and increase consumer safety, we deployed a Picarro G2301 Cavity Ring-Down Spectrometer in a car, mapping 5893 natural gas leaks (2.5 to 88.6 ppm CH4) across 1500 road miles of Washington, DC. The δ(13)C-isotopic signatures of the methane (-38.2‰ ± 3.9‰ s.d.) and ethane (-36.5 ± 1.1 s.d.) and the CH4:C2H6 ratios (25.5 ± 8.9 s.d.) closely matched the pipeline gas (-39.0‰ and -36.2‰ for methane and ethane; 19.0 for CH4/C2H6). Emissions from four street leaks ranged from 9200 to 38,200 L CH4 day(-1) each, comparable to natural gas used by 1.7 to 7.0 homes, respectively. At 19 tested locations, 12 potentially explosive (Grade 1) methane concentrations of 50,000 to 500,000 ppm were detected in manholes. Financial incentives and targeted programs among companies, public utility commissions, and scientists to reduce leaks and replace old cast-iron pipes will improve consumer safety and air quality, save money, and lower greenhouse gas emissions.
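The attribution step (does a street leak chemically match pipeline gas?) can be sketched as a simple tolerance check against the reported signatures. The two-sigma windows below reuse the standard deviations quoted above; the tolerance choice itself is an illustrative assumption, not the authors' method.

    # Hedged sketch: flag a measured leak as consistent with pipeline gas when
    # its isotopic signatures and CH4:C2H6 ratio fall within two s.d. of the
    # pipeline-gas values reported in the abstract.
    PIPELINE = {"d13C_CH4": -39.0, "d13C_C2H6": -36.2, "CH4_C2H6": 19.0}

    def looks_like_pipeline_gas(d13c_ch4, d13c_c2h6, ch4_c2h6):
        # windows loosely based on the reported spreads (3.9, 1.1, 8.9 s.d.)
        return (abs(d13c_ch4 - PIPELINE["d13C_CH4"]) <= 2 * 3.9 and
                abs(d13c_c2h6 - PIPELINE["d13C_C2H6"]) <= 2 * 1.1 and
                abs(ch4_c2h6 - PIPELINE["CH4_C2H6"]) <= 2 * 8.9)

    # mean street-leak values from the survey -> consistent with pipeline gas
    print(looks_like_pipeline_gas(-38.2, -36.5, 25.5))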
Code of Federal Regulations, 2012 CFR
2012-10-01
..., DEPARTMENT OF TRANSPORTATION HAZARDOUS MATERIALS AND OIL TRANSPORTATION HAZARDOUS MATERIALS PROGRAM..., Pipeline and Hazardous Materials Safety Administration. Competent Authority means a national agency that is responsible, under its national law, for the control or regulation of some aspect of hazardous materials...
Sahin, Sükran; Kurum, Ekrem
2009-09-01
Ecological monitoring is a complementary component of the overall environmental management and monitoring program of any Environmental Impact Assessment (EIA) report. The monitoring method should be developed for each project phase and allow for periodic reporting and assessment of compliance with the environmental conditions and requirements of the EIA. Also, this method should incorporate a variance request program since site-specific conditions can affect construction on a daily basis and require time-critical application of alternative construction scenarios or environmental management methods integrated with alternative mitigation measures. Finally, taking full advantage of the latest information and communication technologies can enhance the quality of, and public involvement in, the environmental management program. In this paper, a landscape-scale ecological monitoring method for major construction projects is described using, as a basis, 20 months of experience on the Baku-Tbilisi-Ceyhan (BTC) Crude Oil Pipeline Project, covering Turkish Sections Lot B and Lot C. This analysis presents suggestions for improving ecological monitoring for major construction activities.
Erwin, Katherine; Blumenthal, Daniel S; Chapel, Thomas; Allwood, L Vernon
2004-11-01
We evaluated collaboration among academic and community partners in a program to recruit African American youth into the health professions. Six institutions of higher education, an urban school system, two community organizations, and two private enterprises became partners to create a health career pipeline for this population. The pipeline consisted of 14 subprograms designed to enrich academic science curricula, stimulate the interest of students in health careers, and facilitate entry into professional schools and other graduate-level educational programs. Subprogram directors completed questionnaires regarding a sense of common mission/vision and coordination/collaboration three times during the 3-year project. The partners strongly shared a common mission and vision throughout the duration of the program, although there was some weakening in the last phase. Subprogram directors initially viewed coordination/collaboration as weak, but by midway through the project period viewed it as stronger. Feared loss of autonomy was foremost among several factors that threatened collaboration among the partners. Collaboration was improved largely through a process of building trust among the partners.
Coupling Network Computing Applications in Air-cooled Turbine Blades Optimization
NASA Astrophysics Data System (ADS)
Shi, Liang; Yan, Peigang; Xie, Ming; Han, Wanjin
2018-05-01
By establishing control parameters from the outside of the blade to the inside, a parametric design of air-cooled turbine blades based on the airfoil has been implemented. On the basis of rapidly updated structural features and solid model generation, a complex cooling system has been created. The different flow units are modeled as a complex network topology with parallel and serial connections. Applying one-dimensional flow theory, programs have been written to obtain the physical quantities of the pipeline network along the flow path, including flow rate, pressure, temperature, and other parameters. These inner-unit parameters are set, by interpolation, as inner boundary conditions for the external flow field calculation program HIT-3D, thus achieving a fully coupled thermal simulation of the whole field. Studies from the literature are used to verify the effectiveness of the pipeline network program and the coupling algorithm. On the basis of a modified design, an optimization platform was then established with the help of iSIGHT-FD. Through a multi-island genetic algorithm (MIGA), the target of enhanced cooling efficiency was reached and the thermal stress was effectively reduced. The research in this paper is significant for the rapid deployment of cooling structure designs.
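The series/parallel network idea can be sketched with the linear (laminar) hydraulic-resistance analogy, in which pressure drop is proportional to flow, series resistances add, and parallel branches combine like conductances. The resistances and driving pressure below are arbitrary illustrative numbers, and the real cooling-passage relations are nonlinear; this only shows how such a network composes.

    # Hedged sketch: compose a cooling circuit from series/parallel resistances
    # under a linear dP = R * Q model, then split flow by branch conductance.
    def series(*rs):
        return sum(rs)

    def parallel(*rs):
        return 1.0 / sum(1.0 / r for r in rs)

    # toy circuit: inlet duct, three parallel rib passages, outlet duct
    R_total = series(2.0, parallel(10.0, 12.0, 15.0), 3.0)

    dP = 5.0e4                # assumed driving pressure difference, Pa
    Q = dP / R_total          # total coolant flow in the linear model
    Q_rib1 = Q * (1 / 10.0) / (1 / 10.0 + 1 / 12.0 + 1 / 15.0)  # conductance split
    print(f"R_total = {R_total:.2f}, Q = {Q:.1f}, rib-1 share = {Q_rib1 / Q:.2%}")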
NASA Technical Reports Server (NTRS)
Lee, Hyun H.
2012-01-01
MERTELEMPROC processes telemetered data in data product format and generates Experiment Data Records (EDRs) for many instruments (HAZCAM, NAVCAM, PANCAM, microscopic imager, Moessbauer spectrometer, APXS, RAT, and EDLCAM) on the Mars Exploration Rover (MER). If the data is compressed, then MERTELEMPROC decompresses the data with the appropriate decompression algorithm; two compression algorithms (ICER and LOCO) are used on MER. This program fulfills a MER-specific need to generate Level 1 products within a 60-second time requirement. EDRs generated by this program are used by merinverter, marscahv, marsrad, and marsjplstereo to generate higher-level products for mission operations. MERTELEMPROC was the first GDS program to process the data product. Metadata of the data product is in XML format. The software allows user-configurable input parameters and per-product (not stream-based) processing, and fail-over is allowed if the leading image header is corrupted. It is used within the MER automated pipeline. MERTELEMPROC is part of the OPGS (Operational Product Generation Subsystem) automated pipeline, which analyzes images returned by in situ spacecraft and creates Level 1 products to assist in operations, science, and outreach.
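A minimal sketch of the per-product dispatch described above: select a decompressor from the product metadata and assemble an EDR. The field names are hypothetical, and the ICER/LOCO codecs are stubbed out as pass-throughs, since the actual implementations are mission flight and ground software.

    # Hedged sketch of per-product dispatch; field names are hypothetical.
    def decompress_icer(payload):
        # placeholder pass-through; the real ICER wavelet codec is not public here
        return payload

    def decompress_loco(payload):
        # placeholder pass-through; the real LOCO lossless codec is not public here
        return payload

    DECOMPRESSORS = {"ICER": decompress_icer, "LOCO": decompress_loco}

    def make_edr(product):
        """product: dict with hypothetical 'compression', 'payload', 'metadata' keys."""
        codec = product.get("compression")   # None means the data is uncompressed
        image = DECOMPRESSORS[codec](product["payload"]) if codec else product["payload"]
        return {"metadata": product["metadata"], "image": image}

    print(make_edr({"compression": "LOCO", "payload": b"...", "metadata": "<xml/>"}))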
Filling the Graduate Student Pipeline
NASA Astrophysics Data System (ADS)
Winey, Karen I.
2003-03-01
As a professor who relies on graduate students to participate in my research program, I work to ensure that the pipeline of graduate students is full. This presentation will discuss a variety of strategies that I have used to advertise the opportunities of graduate school, many of which use existing infrastructure at the University of Pennsylvania. These strategies involve a combination of public speaking, discussion groups, and faculty advising. During these exchanges it's important to both contrast the career opportunities for B.S., M.S. and Ph.D. degree holders and outline the financial facts about graduate school. These modest efforts have increased the number of Penn undergraduates pursuing doctorate degrees.
Detecting distant homologies on protozoans metabolic pathways using scientific workflows.
da Cruz, Sérgio Manuel Serra; Batista, Vanessa; Silva, Edno; Tosta, Frederico; Vilela, Clarissa; Cuadrat, Rafael; Tschoeke, Diogo; Dávila, Alberto M R; Campos, Maria Luiza Machado; Mattoso, Marta
2010-01-01
Bioinformatics experiments are typically composed of programs in pipelines manipulating an enormous quantity of data. An interesting approach for managing those experiments is through workflow management systems (WfMS). In this work we discuss WfMS features to support genome homology workflows and present some relevant issues for typical genomic experiments. Our evaluation used Kepler WfMS to manage a real genomic pipeline, named OrthoSearch, originally defined as a Perl script. We show a case study detecting distant homologies on trypanomatids metabolic pathways. Our results reinforce the benefits of WfMS over script languages and point out challenges to WfMS in distributed environments.
NASA Astrophysics Data System (ADS)
Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri
2017-10-01
We combine numerical tools to analyze the reflectance spectra of granular materials. Our motivation comes from the lack of tools for intimate mixing of materials and for modeling space-weathering effects with nano- or micron-sized inclusions. The current practice is to apply semi-physical models such as the Hapke models (e.g., Icarus 195, 2008). These are expressed in closed form, so they are fast to apply. The problem is that the validity of the model is not guaranteed, and the derived properties related to particle scattering can be unrealistic (JQSRT 113, 2012). Our pipeline consists of individual scattering simulation codes and a main program that chains them together. The chain for analyzing a macroscopic target with a space-weathered mineral runs as follows: (1) the scattering properties of small inclusions inside a host matrix are derived using exact Maxwell equation solvers; from these, the so-called incoherent fields and Mueller matrices are used as input for the next step; (2) scattering by a regolith grain is solved using a geometric optics method with surface reflections, internal absorption, and internal diffuse scattering; (3) a radiative transfer simulation is executed with the regolith grains from the previous step as the scatterers in a macroscopic planar volume element. For the most realistic asteroid reflectance model, the chain would produce the properties of a planar surface element. Then, a shadowing simulation over the surface elements would be considered, and finally the asteroid phase function would be solved by integrating the bidirectional reflectance distribution function of the planar element over the object's realistic shape model. The tools in the proposed chain already exist, and the practical task for us is to tie them together into an easy-to-use public pipeline. We plan to open the pipeline as a web-based open service on a dedicated server, using the Django application server and a Python environment for the main functionality. The individual programs run within the chain can still be programmed in Fortran, C, or other languages. We acknowledge the ERC AdG No. 320773 'SAEMPL' and the computational resources provided by CSC - IT Center for Science Ltd., Finland.
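The chaining itself is straightforward to express in the planned Python environment: each stage is a function whose output feeds the next. The sketch below mirrors the three steps named above with placeholder implementations; none of the numerical solvers is reproduced here, and all parameter names are illustrative.

    # Hedged sketch of the three-stage chain; every body is a stand-in stub.
    def solve_inclusions(inclusion_params):
        # stage 1 stand-in: exact Maxwell solver -> incoherent Mueller matrices
        return {"mueller": "placeholder", "inclusion_albedo": 0.5}

    def solve_grain(inclusion_scattering, grain_params):
        # stage 2 stand-in: geometric optics with internal diffuse scattering
        return {"phase_function": "placeholder", "albedo": 0.4}

    def radiative_transfer(grain_scattering, geometry):
        # stage 3 stand-in: radiative transfer in a planar volume element
        return {"reflectance": 0.12}

    grain = solve_grain(solve_inclusions({"radius_nm": 50}), {"size_um": 30})
    print(radiative_transfer(grain, {"incidence_deg": 30, "emergence_deg": 0}))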
Issues in electric power in India: Challenges and opportunities
NASA Astrophysics Data System (ADS)
Tongia, Rahul
This dissertation provides an examination of three facets of the Indian power program. The first issue we analyze is the current regulatory environment and guidelines in place for independent power producers and other generators, focusing on possible tradeoffs between prices and investor returns. The analysis shows that investor rates of return are significantly higher than the nominal 16% as stipulated by the Central Electricity Authority guidelines, and an uncertainty analysis reveals the relative importance of various input and project parameters. We discuss problems with the existing guidelines, and provide options for changes in policy. Adoption of modified guidelines that are more transparent and do not focus on project capital structures are likely to result in more affordable tariffs, less delays in project completion and yet provide adequate rates of return for investors. India's nuclear power program is based on indigenous materials and technology, with the potential for providing energy security for many decades. We examine the technical validity of this plan, especially the role of fast breeder reactors for extending the domestic uranium supplies. The analysis shows that breeding is unlikely to occur at anywhere near the rates envisioned, leading to a slow growth of fast breeder reactors. In addition, domestic uranium reserves restrict growth of Pressurized Heavy Water Reactors, which are likely to be the main contributors to nuclear capacity in the short term. To increase the share of nuclear power in the coming decades, India should consider the construction of a number of large thermal reactors based on indigenous and imported uranium. We also present policy options for such changes to India's nuclear power program. This dissertation examines in detail the policy, technology, and economics of an overland pipeline supplying natural gas to India and Pakistan. Such a pipeline would be shared by both countries, and would be a strong confidence building measure, offering a unique opportunity for cooperation. As natural gas pipelines exhibit significant economies of scale, a shared pipeline would also offer the lowest price natural gas for both countries. This study addresses some of the potential concerns, suggesting options for overcoming security of supply worries. (Abstract shortened by UMI.)
49 CFR 107.323 - ALJ's decision.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 2 2010-10-01 2010-10-01 false ALJ's decision. 107.323 Section 107.323 Transportation Other Regulations Relating to Transportation PIPELINE AND HAZARDOUS MATERIALS SAFETY... PROGRAM PROCEDURES Enforcement Compliance Orders and Civil Penalties § 107.323 ALJ's decision. (a) After...
LOOP marine and estuarine monitoring program, 1978-95 : volume 5 : demersal nekton.
DOT National Transportation Integrated Search
1998-01-01
The Louisiana Offshore Oil Port (LOOP) facilities in coastal Louisiana provide the United States with the country's only Superport for off-loading deep draft tankers. The facilities transport oil ashore through pipelines, and temporarily store oil be...
Directory of Transportation Education.
ERIC Educational Resources Information Center
Department of Transportation, Washington, DC.
This directory lists institutions of higher education that offer degree and non-degree programs in various transportation fields and modes, including aviation, highway, urban mass transportation, railroad, water transport, pipeline, intermodal, and environmental and consumer education. The book catalogs courses and degrees offered, names of…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
...EPA is amending the requirements under EPA's diesel sulfur program related to the sulfur content of locomotive and marine (LM) diesel fuel produced by transmix processors and pipeline facilities. These amendments will reinstate the ability of locomotive and marine diesel fuel produced from transmix by transmix processors and pipeline operators to meet a maximum 500 parts per million (ppm) sulfur standard outside of the Northeast Mid-Atlantic Area and Alaska and expand this ability to within the Northeast Mid-Atlantic Area provided that: the fuel is used in older technology locomotive and marine engines that do not require 15 ppm sulfur diesel fuel, and the fuel is kept segregated from other fuel. These amendments will provide significant regulatory relief for transmix processors and pipeline operators to allow the petroleum distribution system to function efficiently while continuing to transition the market to virtually all ultra-low sulfur diesel fuel (ULSD, i.e. 15 ppm sulfur diesel fuel) and the environmental benefits it provides.
Mapping of Brain Activity by Automated Volume Analysis of Immediate Early Genes.
Renier, Nicolas; Adams, Eliza L; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E; Kadiri, Lolahon; Umadevi Venkataraju, Kannan; Zhou, Yu; Wang, Victoria X; Tang, Cheuk Y; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc
2016-06-16
Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization, and quantification of the activity of all neurons across the entire brain, which has not, to date, been achieved in the mammalian brain. We introduce a pipeline for high-speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Last, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. Copyright © 2016 Elsevier Inc. All rights reserved.
CamBAfx: Workflow Design, Implementation and Application for Neuroimaging
Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John
2009-01-01
CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs. PMID:19826470
bcgTree: automatized phylogenetic tree building from bacterial core genomes.
Ankenbrand, Markus J; Keller, Alexander
2016-10-01
The need for multi-gene analyses in scientific fields such as phylogenetics and DNA barcoding has increased in recent years. In particular, these approaches are increasingly important for differentiating bacterial species, where reliance on the standard 16S rDNA marker can result in poor resolution. Additionally, the assembly of bacterial genomes has become a standard task due to advances in next-generation sequencing technologies. We created a bioinformatic pipeline, bcgTree, which uses assembled bacterial genomes, either from databases or from the user's own sequencing results, to reconstruct their phylogenetic history. The pipeline automatically extracts 107 essential single-copy core genes, found in a majority of bacteria, using hidden Markov models and performs a partitioned maximum-likelihood analysis. Here, we describe the workflow of bcgTree and, as a proof of concept, its usefulness in resolving the phylogeny of 293 publicly available bacterial strains of the genus Lactobacillus. We also evaluate its performance in both low- and high-level taxonomy test sets. The tool is freely available at github ( https://github.com/iimog/bcgTree ) and our institutional homepage ( http://www.dna-analytics.biozentrum.uni-wuerzburg.de ).
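The marker-extraction step can be sketched with HMMER, which this kind of pipeline builds on: run hmmsearch with one profile HMM against a proteome and take the top-ranked hit from the tabular output (hmmsearch sorts hits best-first by E-value). The file names, and the idea of looping this over all 107 marker HMMs, are assumptions for illustration rather than bcgTree's actual wiring; hmmsearch and its --tblout option are real HMMER features.

    # Hedged sketch: best hit for one single-copy marker HMM via HMMER.
    # Requires hmmsearch (HMMER) on PATH; file names are hypothetical.
    import subprocess

    def best_hit(hmm_file, proteome_fasta, tbl_out="hits.tbl"):
        subprocess.run(
            ["hmmsearch", "--tblout", tbl_out, hmm_file, proteome_fasta],
            check=True, stdout=subprocess.DEVNULL,
        )
        with open(tbl_out) as fh:
            for line in fh:
                if not line.startswith("#"):   # first data row = best E-value
                    return line.split()[0]     # target sequence name
        return None

    print(best_hit("rpsB.hmm", "genome_proteins.faa"))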
Point-of-care testing: applications of 3D printing.
Chan, Ho Nam; Tan, Ming Jun Andrew; Wu, Hongkai
2017-08-08
Point-of-care testing (POCT) devices fulfil a critical need in the modern healthcare ecosystem, enabling the decentralized delivery of imperative clinical strategies in both developed and developing worlds. To achieve diagnostic utility and clinical impact, POCT technologies are immensely dependent on effective translation from academic laboratories out to real-world deployment. However, the current research and development pipeline is highly bottlenecked owing to multiple restraints in material, cost, and complexity of conventionally available fabrication techniques. Recently, 3D printing technology has emerged as a revolutionary, industry-compatible method enabling cost-effective, facile, and rapid manufacturing of objects. This has allowed iterative design-build-test cycles of various things, from microfluidic chips to smartphone interfaces, that are geared towards point-of-care applications. In this review, we focus on highlighting recent works that exploit 3D printing in developing POCT devices, underscoring its utility in all analytical steps. Moreover, we also discuss key advantages of adopting 3D printing in the device development pipeline and identify promising opportunities in 3D printing technology that can benefit global health applications.
Alon, Sigal
2015-07-01
This study demonstrates the analytical leverage gained from considering the entire college pipeline, including the application, admission and graduation stages, in examining the economic position of various groups upon labor market entry. The findings, based on data from three elite universities in Israel, reveal that the process that shapes economic inequality between different ethnic and immigrant groups is not necessarily cumulative. Field of study stratification does not expand systematically from stage to stage and the position of groups on the field of study hierarchy at each stage is not entirely explained by academic preparation. Differential selection and attrition processes, as well as ambition and aspirations, also shape the position of ethnic groups in the earnings hierarchy and generate a non-cumulative pattern. These findings suggest that a cross-sectional assessment of field of study inequality at the graduation stage can generate misleading conclusions about group-based economic inequality among workers with a bachelor's degree. Copyright © 2015 Elsevier Inc. All rights reserved.
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2011 CFR
2011-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
University of California Program for Analytical Cytology five-year report, 1982-1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stull, S.; Calkins, M.
1987-01-01
The Program for Analytical Cytology (PAC) was created by the Regents of the University of California on June 17, 1982. The purposes of the Program are to encourage research into theoretical, scientific, and engineering aspects of analytical cytology and into its biological and clinical applications.
LOOP marine and estuarine monitoring program, 1978-95 : volume 4 : zooplankton and ichthyoplankton.
DOT National Transportation Integrated Search
1998-01-01
The Louisiana Offshore Oil Port (LOOP) facilities in coastal Louisiana provide the United States with the country's only Superport for off-loading deep draft tankers. The three single-point mooring (SPM) structures connected by pipelines to a platfor...
49 CFR 199.215 - Alcohol concentration.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Alcohol concentration. 199.215 Section 199.215... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Alcohol Misuse Prevention Program § 199.215 Alcohol concentration. Each operator shall prohibit a covered employee from...
49 CFR 199.215 - Alcohol concentration.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Alcohol concentration. 199.215 Section 199.215... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Alcohol Misuse Prevention Program § 199.215 Alcohol concentration. Each operator shall prohibit a covered employee from...
Guidance on Biogas used to Produce CNG or LNG under the Renewable Fuel Standard Program
Provides EPA's interpretation of the biogas quality and RIN generation requirements that apply to renewable fuel production pathways involving the injection of biogas into a commercial pipeline for use in producing renewable CNG or renewable LNG.
Completion of development of robotics systems for inspecting unpiggable transmission pipelines.
DOT National Transportation Integrated Search
2013-02-01
This document presents the final report for a program focusing on the completion of the research, development and demonstration effort, initiated in 2001, for the development of two robotic systems for the in-line, live inspection of un...
USDA-ARS?s Scientific Manuscript database
Routine DNA testing. It’s done once you’ve Marker-Assisted Breeding Pipelined promising Quantitative Trait Loci within your own breeding program and thereby established the performance-predictive power of each DNA test for your germplasm under your conditions. By then you are ready to screen your par...
Boatright, Dowin; Tunson, Java; Caruso, Emily; Angerhofer, Christy; Baker, Brooke; King, Renee; Bakes, Katherine; Oberfoell, Stephanie; Lowenstein, Steven; Druck, Jeffrey
2016-11-01
In 2008, the Council of Emergency Medicine Residency Directors (CORD) developed a set of recruitment strategies designed to increase the number of under-represented minorities (URMs) in Emergency Medicine (EM) residency. We conducted a survey of United States (US) EM residency program directors to: describe the racial and ethnic composition of residents; ascertain whether each program had instituted CORD recruitment strategies; and identify program characteristics associated with recruitment of a high proportion of URM residents. The survey was distributed to accredited, nonmilitary US EM residency programs during 2013. Programs were dichotomized into high URM and low URM by the percentage of URM residents. High- and low-URM programs were compared with respect to size, geography, percentage of URM faculty, importance assigned to common applicant selection criteria, and CORD recruitment strategies utilized. Odds ratios and 95% confidence limits were calculated. Of 154 residency programs, 72% responded. The median percentage of URM residents per program was 9%. Only 46% of EM programs engaged in at least two recruitment strategies. Factors associated with higher resident diversity (high-URM) included: diversity of EM faculty (high-URM) (odds ratio [OR] 5.3; 95% confidence interval [CI] 2.1-13.0); applicant's URM status considered important (OR 4.9; 95% CI 2.1-11.9); engaging in pipeline activities (OR 4.8; 95% CI 1.4-15.7); and extracurricular activities considered important (OR 2.6; 95% CI 1.2-6.0). Less than half of EM programs have instituted two or more recruitment strategies from the 2008 CORD diversity panel. EM faculty diversity, active pipeline programs, and attention paid to applicants' URM status and extracurricular activities were associated with higher resident diversity. Copyright © 2016 Elsevier Inc. All rights reserved.
THE DIFFERENCE IMAGING PIPELINE FOR THE TRANSIENT SEARCH IN THE DARK ENERGY SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessler, R.; Scolnic, D.; Marriner, J.
2015-12-15
We describe the operation and performance of the difference imaging pipeline (DiffImg) used to detect transients in deep images from the Dark Energy Survey Supernova program (DES-SN) in its first observing season from 2013 August through 2014 February. DES-SN is a search for transients in which ten 3 deg² fields are repeatedly observed in the g, r, i, z passbands with a cadence of about 1 week. The observing strategy has been optimized to measure high-quality light curves and redshifts for thousands of Type Ia supernovae (SNe Ia) with the goal of measuring dark energy parameters. The essential DiffImg functions are to align each search image to a deep reference image, do a pixel-by-pixel subtraction, and then examine the subtracted image for significant positive detections of point-source objects. The vast majority of detections are subtraction artifacts, but after selection requirements and image filtering with an automated scanning program, there are ∼130 detections per deg² per observation in each band, of which only ∼25% are artifacts. Of the ∼7500 transients discovered by DES-SN in its first observing season, each requiring a detection on at least two separate nights, Monte Carlo (MC) simulations predict that 27% are expected to be SNe Ia or core-collapse SNe. Another ∼30% of the transients are artifacts in which a small number of observations satisfy the selection criteria for a single-epoch detection. Spectroscopic analysis shows that most of the remaining transients are AGNs and variable stars. Fake SNe Ia are overlaid onto the images to rigorously evaluate detection efficiencies and to understand the DiffImg performance. The DiffImg efficiency measured with fake SNe agrees well with expectations from a MC simulation that uses analytical calculations of the fluxes and their uncertainties. In our 8 “shallow” fields with single-epoch 50% completeness depth ∼23.5, the SN Ia efficiency falls to 1/2 at redshift z ≈ 0.7; in our 2 “deep” fields with mag-depth ∼24.5, the efficiency falls to 1/2 at z ≈ 1.1. A remaining performance issue is that the measured fluxes have additional scatter (beyond Poisson fluctuations) that increases with the host galaxy surface brightness at the transient location. This bright-galaxy issue has minimal impact on the SNe Ia program, but it may lower the efficiency for finding fainter transients on bright galaxies.
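The subtract-and-detect core of a difference imaging pipeline can be illustrated with a toy sketch. This is a generic illustration, not DES DiffImg itself: it assumes the search image has already been aligned and PSF-matched to the reference, and it uses a simple N-sigma threshold with connected-component labeling in place of DiffImg's selection requirements and automated scanning.

```python
import numpy as np
from scipy import ndimage

def detect_transients(search_img, reference_img, nsigma=5.0):
    """Pixel-by-pixel subtraction of an aligned, PSF-matched reference,
    followed by a threshold detection on the difference image."""
    diff = search_img - reference_img
    # Robust noise estimate from the median absolute deviation of the difference.
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    mask = diff > nsigma * sigma          # significant positive detections only
    labels, n = ndimage.label(mask)       # group pixels into candidate objects
    centroids = ndimage.center_of_mass(diff, labels, range(1, n + 1))
    return diff, centroids

# Example with synthetic data: a flat reference plus one injected point source.
rng = np.random.default_rng(0)
ref = rng.normal(100.0, 3.0, (64, 64))
sea = ref + rng.normal(0.0, 3.0, (64, 64))
sea[30:33, 40:43] += 50.0                 # fake transient
_, found = detect_transients(sea, ref)
print(found)                              # one centroid near row 31, column 41
```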
The Difference Imaging Pipeline for the Transient Search in the Dark Energy Survey
NASA Astrophysics Data System (ADS)
Kessler, R.; Marriner, J.; Childress, M.; Covarrubias, R.; D'Andrea, C. B.; Finley, D. A.; Fischer, J.; Foley, R. J.; Goldstein, D.; Gupta, R. R.; Kuehn, K.; Marcha, M.; Nichol, R. C.; Papadopoulos, A.; Sako, M.; Scolnic, D.; Smith, M.; Sullivan, M.; Wester, W.; Yuan, F.; Abbott, T.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Carnero Rosell, A.; Carrasco Kind, M.; Castander, F. J.; Crocce, M.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Fausti Neto, A.; Flaugher, B.; Frieman, J.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Kuropatkin, N.; Li, T. S.; Maia, M. A. G.; Marshall, J. L.; Martini, P.; Miller, C. J.; Miquel, R.; Nord, B.; Ogando, R.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sanchez, E.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thaler, J.; Thomas, R. C.; Tucker, D.; Walker, A. R.; DES Collaboration
2015-12-01
We describe the operation and performance of the difference imaging pipeline (DiffImg) used to detect transients in deep images from the Dark Energy Survey Supernova program (DES-SN) in its first observing season from 2013 August through 2014 February. DES-SN is a search for transients in which ten 3 deg² fields are repeatedly observed in the g, r, i, z passbands with a cadence of about 1 week. The observing strategy has been optimized to measure high-quality light curves and redshifts for thousands of Type Ia supernovae (SNe Ia) with the goal of measuring dark energy parameters. The essential DiffImg functions are to align each search image to a deep reference image, do a pixel-by-pixel subtraction, and then examine the subtracted image for significant positive detections of point-source objects. The vast majority of detections are subtraction artifacts, but after selection requirements and image filtering with an automated scanning program, there are ~130 detections per deg² per observation in each band, of which only ~25% are artifacts. Of the ~7500 transients discovered by DES-SN in its first observing season, each requiring a detection on at least two separate nights, Monte Carlo (MC) simulations predict that 27% are expected to be SNe Ia or core-collapse SNe. Another ~30% of the transients are artifacts in which a small number of observations satisfy the selection criteria for a single-epoch detection. Spectroscopic analysis shows that most of the remaining transients are AGNs and variable stars. Fake SNe Ia are overlaid onto the images to rigorously evaluate detection efficiencies and to understand the DiffImg performance. The DiffImg efficiency measured with fake SNe agrees well with expectations from a MC simulation that uses analytical calculations of the fluxes and their uncertainties. In our 8 “shallow” fields with single-epoch 50% completeness depth ~23.5, the SN Ia efficiency falls to 1/2 at redshift z ≈ 0.7; in our 2 “deep” fields with mag-depth ~24.5, the efficiency falls to 1/2 at z ≈ 1.1. A remaining performance issue is that the measured fluxes have additional scatter (beyond Poisson fluctuations) that increases with the host galaxy surface brightness at the transient location. This bright-galaxy issue has minimal impact on the SNe Ia program, but it may lower the efficiency for finding fainter transients on bright galaxies.
THE DIFFERENCE IMAGING PIPELINE FOR THE TRANSIENT SEARCH IN THE DARK ENERGY SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessler, R.; Marriner, J.; Childress, M.
2015-11-06
We describe the operation and performance of the difference imaging pipeline (DiffImg) used to detect transients in deep images from the Dark Energy Survey Supernova program (DES-SN) in its first observing season from 2013 August through 2014 February. DES-SN is a search for transients in which ten 3 deg² fields are repeatedly observed in the g, r, i, z passbands with a cadence of about 1 week. The observing strategy has been optimized to measure high-quality light curves and redshifts for thousands of Type Ia supernovae (SNe Ia) with the goal of measuring dark energy parameters. The essential DiffImg functions are to align each search image to a deep reference image, do a pixel-by-pixel subtraction, and then examine the subtracted image for significant positive detections of point-source objects. The vast majority of detections are subtraction artifacts, but after selection requirements and image filtering with an automated scanning program, there are ~130 detections per deg² per observation in each band, of which only ~25% are artifacts. Of the ~7500 transients discovered by DES-SN in its first observing season, each requiring a detection on at least two separate nights, Monte Carlo (MC) simulations predict that 27% are expected to be SNe Ia or core-collapse SNe. Another ~30% of the transients are artifacts in which a small number of observations satisfy the selection criteria for a single-epoch detection. Spectroscopic analysis shows that most of the remaining transients are AGNs and variable stars. Fake SNe Ia are overlaid onto the images to rigorously evaluate detection efficiencies and to understand the DiffImg performance. The DiffImg efficiency measured with fake SNe agrees well with expectations from a MC simulation that uses analytical calculations of the fluxes and their uncertainties. In our 8 "shallow" fields with single-epoch 50% completeness depth ~23.5, the SN Ia efficiency falls to 1/2 at redshift z ≈ 0.7; in our 2 "deep" fields with mag-depth ~24.5, the efficiency falls to 1/2 at z ≈ 1.1. A remaining performance issue is that the measured fluxes have additional scatter (beyond Poisson fluctuations) that increases with the host galaxy surface brightness at the transient location. This bright-galaxy issue has minimal impact on the SNe Ia program, but it may lower the efficiency for finding fainter transients on bright galaxies.
The Difference Imaging Pipeline for the Transient Search in the Dark Energy Survey
Kessler, R.
2015-09-09
We describe the operation and performance of the difference imaging pipeline (DiffImg) used to detect transients in deep images from the Dark Energy Survey Supernova program (DES-SN) in its first observing season from 2013 August through 2014 February. DES-SN is a search for transients in which ten 3 deg² fields are repeatedly observed in the g, r, i, z passbands with a cadence of about 1 week. Our observing strategy has been optimized to measure high-quality light curves and redshifts for thousands of Type Ia supernovae (SNe Ia) with the goal of measuring dark energy parameters. The essential DiffImg functions are to align each search image to a deep reference image, do a pixel-by-pixel subtraction, and then examine the subtracted image for significant positive detections of point-source objects. The vast majority of detections are subtraction artifacts, but after selection requirements and image filtering with an automated scanning program, there are ~130 detections per deg² per observation in each band, of which only ~25% are artifacts. Of the ~7500 transients discovered by DES-SN in its first observing season, each requiring a detection on at least two separate nights, Monte Carlo (MC) simulations predict that 27% are expected to be SNe Ia or core-collapse SNe. Another ~30% of the transients are artifacts in which a small number of observations satisfy the selection criteria for a single-epoch detection. Spectroscopic analysis shows that most of the remaining transients are AGNs and variable stars. Fake SNe Ia are overlaid onto the images to rigorously evaluate detection efficiencies and to understand the DiffImg performance. Furthermore, the DiffImg efficiency measured with fake SNe agrees well with expectations from a MC simulation that uses analytical calculations of the fluxes and their uncertainties. In our 8 "shallow" fields with single-epoch 50% completeness depth ~23.5, the SN Ia efficiency falls to 1/2 at redshift z ≈ 0.7; in our 2 "deep" fields with mag-depth ~24.5, the efficiency falls to 1/2 at z ≈ 1.1. A remaining performance issue is that the measured fluxes have additional scatter (beyond Poisson fluctuations) that increases with the host galaxy surface brightness at the transient location. This bright-galaxy issue has minimal impact on the SNe Ia program, but it may lower the efficiency for finding fainter transients on bright galaxies.
Mason, Bonnie S; Ross, William; Ortega, Gezzer; Chambers, Monique C; Parks, Michael L
2016-09-01
Women and minorities remain underrepresented in orthopaedic surgery. In an attempt to increase the diversity of those entering the physician workforce, Nth Dimensions implemented a targeted pipeline curriculum that includes the Orthopaedic Summer Internship Program. The program exposes medical students to the specialty of orthopaedic surgery and equips students to be competitive applicants to orthopaedic surgery residency programs. The effect of this program on women and underrepresented minority applicants to orthopaedic residencies is highlighted in this article. (1) For women we asked: is completing the Orthopaedic Summer Internship Program associated with higher odds of applying to orthopaedic surgery residency? (2) For underrepresented minorities, is completing the Orthopaedic Summer Internship Program associated with higher odds of applying to orthopaedic residency? Between 2005 and 2012, 118 students completed the Nth Dimensions/American Academy of Orthopaedic Surgeons Orthopaedic Summer Internship Program. The summer internship consisted of an 8-week clinical and research program between the first and second years of medical school and included a series of musculoskeletal lectures, hands-on, practical workshops, presentation of a completed research project, ongoing mentoring, professional development, and counselling through each participant's subsequent years of medical school. In correlation with available national application data, residency application data were obtained for those Orthopaedic Summer Internship Program participants who applied to the match between 2011 through 2014. For these 4 cohort years, we evaluated whether this program was associated with increased odds of applying to orthopaedic surgery residency compared with national controls. For the same four cohorts, we evaluated whether underrepresented minority students who completed the program had increased odds of applying to an orthopaedic surgery residency compared with national controls. Fifty Orthopaedic Summer Internship scholars applied for an orthopaedic residency position. For women, completion of the Orthopaedic Summer Internship was associated with increased odds of applying to orthopaedic surgery residency (after summer internship: nine of 17 [35%]; national controls: 800 of 78,316 [1%]; odds ratio [OR], 51.3; 95% confidence interval [CI], 21.1-122.0; p < 0.001). Similarly, for underrepresented minorities, Orthopaedic Summer Internship completion was also associated with increased odds of orthopaedic applications from 2011 to 2014 (after Orthopaedic Summer Internship: 15 of 48 [31%]; non-Orthopaedic Summer Internship applicants nationally: 782 of 25,676 [3%]; OR, 14.5 [7.3-27.5]; p < 0.001). Completion of the Nth Dimensions Orthopaedic Summer Internship Program has a positive impact on increasing the odds of each student participant applying to an orthopaedic surgery residency program. This program may be a key factor in contributing to the pipeline of women and underrepresented minorities into orthopaedic surgery. Level III, therapeutic study.
System for measuring radioactivity of labelled biopolymers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, V.
1980-07-08
A system is described for measuring radioactivity of labelled biopolymers, comprising: a set of containers adapted for receiving aqueous solutions of biological samples containing biopolymers which are subsequently precipitated in said containers on particles of diatomite in the presence of a coprecipitator, then filtered, dissolved, and mixed with a scintillator; radioactivity measuring means including a detection chamber to which is fed the mixture produced in said set of containers; an electric drive for moving said set of containers in a stepwise manner; means for proportional feeding of said coprecipitator and a suspension of diatomite in an acid solution to said containers which contain the biological sample for forming an acid precipitation of biopolymers; means for the removal of precipitated samples from said containers; precipitated biopolymer filtering means for successively filtering the precipitate, suspending the precipitate, dissolving the biopolymers mixed with said scintillator for feeding of the mixture to said detection chamber; a system of pipelines interconnecting said above-recited means; and said means for measuring radioactivity of labelled biopolymers including, a measuring cell arranged in a detection chamber and communicating with said means for filtering precipitated biopolymers through one pipeline of said system of pipelines; a program unit electrically connected to said electric drive, said means for acid precipitation of biopolymers, said means for the removal of precipitated samples from said containers, said filtering means, and said radioactivity measuring device; said program unit adapted to periodically switch on and off the above-recited means and check the sequence of the radioactivity measuring operations; and a control unit for controlling the initiation of the system and for selecting programs.
NASA Astrophysics Data System (ADS)
Duan, Wenbo; Kirby, Ray; Mudge, Peter; Gan, Tat-Hean
2016-12-01
Ultrasonic guided waves are often used in the detection of defects in oil and gas pipelines. It is common for these pipelines to be buried underground and this may restrict the length of the pipe that can be successfully tested. This is because acoustic energy travelling along the pipe walls may radiate out into the surrounding medium. Accordingly, it is important to develop a better understanding of the way in which elastic waves propagate along the walls of buried pipes, and so in this article a numerical model is developed that is suitable for computing the eigenmodes for uncoated and coated buried pipes. This is achieved by combining a one dimensional eigensolution based on the semi-analytic finite element (SAFE) method, with a perfectly matched layer (PML) for the infinite medium surrounding the pipe. This article also explores an alternative exponential complex coordinate stretching function for the PML in order to improve solution convergence. It is shown for buried pipelines that accurate solutions may be obtained over the entire frequency range typically used in long range ultrasonic testing (LRUT) using a PML layer with a thickness equal to the pipe wall thickness. This delivers a fast and computationally efficient method and it is shown for pipes buried in sand or soil that relevant eigenmodes can be computed and sorted in less than one second using relatively modest computer hardware. The method is also used to find eigenmodes for a buried pipe coated with the viscoelastic material bitumen. It was recently observed in the literature that a viscoelastic coating may effectively isolate particular eigenmodes so that energy does not radiate from these modes into the surrounding [elastic] medium. A similar effect is also observed in this article and it is shown that this occurs even for a relatively thin layer of bitumen, and when the shear impedance of the coating material is larger than that of the surrounding medium.
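For orientation, the PML works by continuing the radial coordinate into the complex plane so that waves radiating into the surrounding medium decay inside the layer instead of reflecting from the truncation boundary. A generic complex coordinate stretch, written in our own notation (the article's exact exponential stretching function may differ), is:

```latex
% Radial PML of thickness h starting at r_0; sigma_0 and alpha are tunable.
\tilde{r}(r) = \int_{0}^{r} \gamma(s)\,\mathrm{d}s, \qquad
\gamma(r) =
\begin{cases}
1, & r \le r_0,\\[4pt]
1 + \dfrac{\sigma_0}{i\omega}\, e^{\alpha (r - r_0)/h}, & r_0 < r \le r_0 + h .
\end{cases}
```

An exponential profile concentrates absorption toward the outer edge of the layer while keeping the stretch smooth at the interface, which fits the article's observation that a PML only one pipe-wall thickness deep can converge across the LRUT frequency range.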
2014-06-01
SCADA / ICS Cyber Test Lab initiated in 2013. Psychosocial – academic research exists; opportunity for sharing and developing impact assessment... ecosystems and species at risk), accidents / system failure (rail; pipelines; ferries). CSSP strategy for the North: focus on regional (and local) problem... Guidance; business planning; environmental scan; proposal evaluation; and performance measurement. Program Risk Management – Guidelines for project
1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.
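As an illustration of the kind of line-flow calculation such programs automate, the pressure drop along a liquid pipeline segment can be estimated with the Darcy-Weisbach equation and the explicit Swamee-Jain friction-factor correlation; the inputs below are made-up example values.

```python
import math

def pressure_drop(L, D, rho, mu, v, eps=4.5e-5):
    """Darcy-Weisbach pressure drop (Pa) over a pipe of length L (m),
    inner diameter D (m), fluid density rho (kg/m^3), viscosity mu (Pa.s),
    mean velocity v (m/s), and absolute roughness eps (m)."""
    Re = rho * v * D / mu                      # Reynolds number
    if Re < 2300:                              # laminar flow
        f = 64.0 / Re
    else:                                      # Swamee-Jain explicit correlation
        f = 0.25 / math.log10(eps / (3.7 * D) + 5.74 / Re**0.9) ** 2
    return f * (L / D) * rho * v**2 / 2.0

# Example: 10 km of 0.3 m crude line at 1.5 m/s (about 7 bar of friction loss).
print(f"{pressure_drop(10_000, 0.3, 870.0, 0.01, 1.5):,.0f} Pa")
```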
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
.... PHMSA-2009-0203] Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline..., and safety policies for natural gas pipelines and for hazardous liquid pipelines. Both committees were...: Notice of advisory committee meeting. SUMMARY: This notice announces a public meeting of the Gas Pipeline...
NASA Technical Reports Server (NTRS)
Tamkin, Glenn S. (Inventor); Duffy, Daniel Q. (Inventor); Schnase, John L. (Inventor)
2016-01-01
A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.
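A minimal sketch of what client code built on such an application programming interface library might look like. The endpoint, operation name, and parameters are hypothetical, invented for illustration; they are not taken from the actual distribution package.

```python
import requests

# Hypothetical service URL and operation; the real package would wrap this
# in its application programming interface library.
SERVICE = "https://example.org/climate/api"   # placeholder, not a real endpoint

def average_field(variable, start, end, bbox):
    """Ask the (hypothetical) analytic service to average a climate variable
    over a time span and bounding box, server-side, and return the result."""
    resp = requests.get(SERVICE, params={
        "operation": "average",
        "variable": variable,          # e.g., surface air temperature
        "start": start, "end": end,    # ISO 8601 dates
        "bbox": ",".join(map(str, bbox)),
    }, timeout=60)
    resp.raise_for_status()
    return resp.json()

# result = average_field("tas", "2000-01-01", "2009-12-31", (-10, 35, 40, 70))
```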
COSMOS: Carnegie Observatories System for MultiObject Spectroscopy
NASA Astrophysics Data System (ADS)
Oemler, A.; Clardy, K.; Kelson, D.; Walth, G.; Villanueva, E.
2017-05-01
COSMOS (Carnegie Observatories System for MultiObject Spectroscopy) reduces multislit spectra obtained with the IMACS and LDSS3 spectrographs on the Magellan Telescopes. It can be used for the quick-look analysis of data at the telescope as well as for pipeline reduction of large data sets. COSMOS is based on a precise optical model of the spectrographs, which allows (after alignment and calibration) an accurate prediction of the location of spectral features. This eliminates the line-search procedure that is fundamental to many spectral reduction programs, and allows a robust data pipeline to be run in an almost fully automatic mode, allowing large amounts of data to be reduced with minimal intervention.
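The benefit of model-predicted feature locations over a blind line search can be sketched as follows; the polynomial dispersion model and line list are invented placeholders, not the actual COSMOS optical model.

```python
import numpy as np

# Hypothetical dispersion model: detector position as a polynomial in wavelength,
# standing in for the spectrograph's full optical model.
model_coeffs = np.array([2.1e-5, 0.48, 112.0])   # made-up calibration
arc_lines = np.array([5460.7, 5875.6, 6402.2])   # known lamp wavelengths (Angstroms)

def refine_positions(profile, predicted, half_width=4):
    """Refine each model-predicted line position with a local flux-weighted
    centroid, instead of searching the whole profile for lines."""
    refined = []
    for p in predicted:
        lo, hi = int(p) - half_width, int(p) + half_width + 1
        window = profile[lo:hi]
        x = np.arange(lo, hi)
        refined.append((window * x).sum() / window.sum())
    return np.array(refined)

predicted = np.polyval(model_coeffs, arc_lines)   # where the model says lines fall
print(predicted.round(1))
# refined = refine_positions(observed_profile, predicted)  # then a local fit only
```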
PIFEX: An advanced programmable pipelined-image processor
NASA Technical Reports Server (NTRS)
Gennery, D. B.; Wilcox, B.
1985-01-01
PIFEX is a pipelined-image processor being built in the JPL Robotics Lab. It will operate on digitized raster-scanned images (at 60 frames per second for images up to about 300 by 400 pixels, and at lower rates for larger images), performing a variety of operations simultaneously under program control. It is thus a powerful, flexible tool for image processing and low-level computer vision. It also has applications in other two-dimensional problems such as route planning for obstacle avoidance and the numerical solution of two-dimensional partial differential equations (although its low numerical precision limits its use in the latter field). The concept and design of PIFEX are described herein, and some examples of its use are given.
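Functionally (setting aside the real-time hardware), a pipelined image processor streams each frame through a fixed sequence of local operations. A software analogy, with stages chosen arbitrarily for illustration:

```python
import numpy as np
from scipy import ndimage

# Each stage is a simple local operation, as a hardware pipeline stage would be.
smooth = lambda img: ndimage.uniform_filter(img, size=3)     # 3x3 mean filter
grad   = lambda img: ndimage.sobel(img, axis=0)              # vertical gradient
thresh = lambda img: (img > img.mean()).astype(np.uint8)     # binarize

pipeline = [smooth, grad, thresh]

def process_frame(frame):
    """Push one raster frame through all pipeline stages in order."""
    for stage in pipeline:
        frame = stage(frame)
    return frame

frame = np.random.default_rng(1).random((300, 400))          # one 300x400 frame
print(process_frame(frame).sum())                            # count of set pixels
```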
David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion
2014-01-01
The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation lets biologists rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework lets developers design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.
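The modular idea (pipelines assembled from named, configurable steps) can be sketched generically. The step names and parameters below are invented for illustration and do not mirror HTSstation's actual module interface:

```python
# Registry of named analysis steps; a real framework would wrap external tools.
REGISTRY = {}

def step(name):
    def register(fn):
        REGISTRY[name] = fn
        return fn
    return register

@step("demultiplex")
def demultiplex(data, barcodes):
    return {bc: [r for r in data if r.startswith(bc)] for bc in barcodes}

@step("count")
def count(data):
    return {k: len(v) for k, v in data.items()}

def run_pipeline(config, data):
    """Execute the steps named in a user-supplied configuration, in order."""
    for name, params in config:
        data = REGISTRY[name](data, **params)
    return data

reads = ["ACGTTT", "ACGAAA", "TTGCCC"]
config = [("demultiplex", {"barcodes": ["ACG", "TTG"]}), ("count", {})]
print(run_pipeline(config, reads))   # {'ACG': 2, 'TTG': 1}
```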
HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis
David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion
2014-01-01
The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation lets biologists rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework lets developers design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057
Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques
NASA Astrophysics Data System (ADS)
Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman
2011-06-01
This research deals with the synthesis of control laws for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that a tool for symbolic regression, analytic programming, is used for this kind of difficult problem. The paper describes analytic programming as well as the chaotic systems and the cost function used. For the experiments, the Self-Organizing Migrating Algorithm (SOMA) was used together with analytic programming.
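For a concrete flavor of the problem, consider stabilizing the unstable fixed point of a discrete chaotic map with a feedback law. The simple proportional law below, switched on only near the fixed point, illustrates the kind of expression a symbolic-regression tool searches for; it is not a control law from the paper.

```python
# Stabilizing the chaotic logistic map x(n+1) = a*x(n)*(1 - x(n)) at its
# unstable fixed point x* = 1 - 1/a, using proportional feedback that is
# only switched on near x* (an OGY-style activation window).
a = 4.0                        # fully chaotic regime
x_star = 1.0 - 1.0 / a         # x* = 0.75

def step(x, k=2.0, window=0.1):
    # Local stability: the controlled derivative at x* is a*(1 - 2*x*) + k = 0.
    u = k * (x - x_star) if abs(x - x_star) < window else 0.0
    return a * x * (1.0 - x) + u

x = 0.2
for _ in range(60):
    x = step(x)
print(round(x, 6))             # settles at 0.75 once the orbit enters the window
```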
75 FR 5258 - Hazardous Materials Transportation; Registration and Fee Assessment Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 107 [Docket No. PHMSA-2009-0201 (HM-208H)] RIN 2137-AE47 Hazardous Materials Transportation... transportation, certain categories and quantities of hazardous materials. PHMSA's proposal would provide that...
Effect of Ethanol Blends and Batching Operations on SCC of Carbon Steel
DOT National Transportation Integrated Search
2011-02-08
This is the draft final report of the project on blending and batching (WP#325) of the Consolidated Program on Development of Guidelines for Safe and Reliable Pipeline Transportation of Ethanol Blends. The other two aspects of the consolidated progra...
36 CFR 222.1 - Authority and definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., such as survey and design, equipment, labor and material (or contract) costs, and on-the-ground... 16 contiguous western States. (21) Range Improvement means any activity or program designed to... such as: dams, ponds, pipelines, wells, fences, trails, seeding, etc. (B) Temporary which are short...
Building the Minority Faculty Development Pipeline.
ERIC Educational Resources Information Center
Gates, Paul E.; Ganey, James H.; Brown, Marc D.
2003-01-01
Describes efforts toward minority faculty development in dentistry, including those of Harlem Hospital-Columbia University School of Dentistry and Oral Surgery, the National Dental Association Foundation, and Bronx Lebanon Hospital Center. Explains that critical elements in the success of these programs are environment, selection criteria,…
Examination of Outside Forces Damage to Natural Gas Pipelines and Damage Prevention
DOT National Transportation Integrated Search
1987-07-01
The report looks at the problem of damage to underground facilities caused by excavation and related activities and the efforts that have been made in recent years to limit and control it through laws, regulations, and damage prevention programs, suc...
The Role of Mentoring in Leadership Development.
Crisp, Gloria; Alvarado-Young, Kelly
2018-06-01
This chapter discusses the role of mentoring in facilitating leadership development for students throughout the educational pipeline. Related literature is summarized and practical guidance is provided for designing, implementing, and evaluating programs with a focus toward developing students as leaders. © 2018 Wiley Periodicals, Inc.
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2012 CFR
2012-10-01
... follows the guidance provided in the American Petroleum Institute's (API) Recommended Practice (RP) 1162... recommendations of API RP 1162 and assess the unique attributes and characteristics of the operator's pipeline and... supplemental requirements of API RP 1162, unless the operator provides justification in its program or...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2014 CFR
2014-10-01
... follows the guidance provided in the American Petroleum Institute's (API) Recommended Practice (RP) 1162... recommendations of API RP 1162 and assess the unique attributes and characteristics of the operator's pipeline and... supplemental requirements of API RP 1162, unless the operator provides justification in its program or...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2013 CFR
2013-10-01
... follows the guidance provided in the American Petroleum Institute's (API) Recommended Practice (RP) 1162... recommendations of API RP 1162 and assess the unique attributes and characteristics of the operator's pipeline and... supplemental requirements of API RP 1162, unless the operator provides justification in its program or...
MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit
Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer
2012-01-01
MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188
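Generically, such a pipeline wraps one external program per step and records statistics as it goes. The commands below are placeholders (POSIX echo stand-ins), not MOCAT's actual tool invocations or configuration keys:

```python
import shlex
import subprocess

# Placeholder commands standing in for the real QC / mapping / assembly tools.
STEPS = [
    ("quality_control", "echo fastq-filtering {sample}"),
    ("map_reads",       "echo mapping {sample} against reference"),
    ("assemble",        "echo assembling {sample}"),
]

def run_sample(sample):
    """Run each configured step for one sample and collect simple statistics."""
    stats = {}
    for name, template in STEPS:
        cmd = template.format(sample=sample)
        result = subprocess.run(shlex.split(cmd), capture_output=True, text=True)
        stats[name] = {"returncode": result.returncode,
                       "output_bytes": len(result.stdout)}
    return stats

print(run_sample("sample_01"))
```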
OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing
NASA Astrophysics Data System (ADS)
Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping
2017-02-01
The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
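The scaling idea (identical processing units fanned out across workers) can be imitated locally with a process pool; this is an analogy for illustration, not OpenCluster's API:

```python
from multiprocessing import Pool

def reduce_frame(frame_id):
    """Stand-in for one unit of telescope data processing (e.g., calibrating
    one raw frame); in a distributed framework this would run on a remote worker."""
    return frame_id, sum(i * i for i in range(10_000))  # placeholder work

if __name__ == "__main__":
    frames = list(range(32))
    with Pool(processes=4) as pool:            # scale by adding workers
        results = dict(pool.map(reduce_frame, frames))
    print(len(results), "frames processed")
```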
TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information
Struck, Torsten H
2014-01-01
Phylogenies of species or genes are commonplace nowadays in many areas of comparative biological studies. However, phylogenetic reconstructions can be misled by artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals might mislead the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, no program has allowed the detection of such effects in combination with an implementation into automatic process pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests), which utilize tree-based information like nodal support or patristic distances (PDs) to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions, and being command-line driven, it can be integrated into automatic process pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118
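A patristic distance is simply the sum of branch lengths along the path connecting two leaves through the tree. A minimal, self-contained computation on a toy four-taxon tree (our own example, not TreSpEx code):

```python
# Toy rooted tree stored as child -> (parent, branch length).
TREE = {
    "A": ("n1", 0.10), "B": ("n1", 0.12),
    "C": ("n2", 0.30), "D": ("n2", 0.05),
    "n1": ("root", 0.08), "n2": ("root", 0.40),
}

def path_to_root(node):
    """Accumulated branch length from a node up to each of its ancestors."""
    out, dist = {node: 0.0}, 0.0
    while node in TREE:
        parent, bl = TREE[node]
        dist += bl
        out[parent] = dist
        node = parent
    return out

def patristic(a, b):
    """Sum of branch lengths on the path between leaves a and b."""
    da, db = path_to_root(a), path_to_root(b)
    mrca = min((n for n in da if n in db), key=lambda n: da[n])
    return da[mrca] + db[mrca]

print(patristic("A", "B"))  # 0.10 + 0.12 = 0.22
print(patristic("A", "C"))  # 0.10 + 0.08 + 0.40 + 0.30 = 0.88
```

Taxa with unusually large patristic distances sit on long branches and are the natural candidates for long-branch attraction artifacts.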
MOCAT: a metagenomics assembly and gene prediction toolkit.
Kultima, Jens Roat; Sunagawa, Shinichi; Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer
2012-01-01
MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/.
$ANBA; a rapid, combined data acquisition and correction program for the SEMQ electron microprobe
McGee, James J.
1983-01-01
$ANBA is a program developed for rapid data acquisition and correction on an automated SEMQ electron microprobe. The program provides increased analytical speed and reduced disk read/write operations compared with the manufacturer's software, resulting in a doubling of analytical throughput. In addition, the program provides enhanced analytical features such as averaging, rapid and compact data storage, and on-line plotting. The program is described with design philosophy, flow charts, variable names, a complete program listing, and system requirements. A complete operating example and notes to assist in running the program are included.
Closing the Gaps and Filling the STEM Pipeline: A Multidisciplinary Approach
NASA Astrophysics Data System (ADS)
Doerschuk, Peggy; Bahrim, Cristian; Daniel, Jennifer; Kruger, Joseph; Mann, Judith; Martin, Cristopher
2016-08-01
There is a growing demand for degreed science, technology, engineering and mathematics (STEM) professionals, but the production of degreed STEM students is not keeping pace. Problems exist at every juncture along the pipeline. Too few students choose to major in STEM disciplines. Many of those who do major in STEM drop out or change majors. Females and minorities remain underrepresented in STEM. The success rate of college students who are from low-income background or first-generation students is much lower than that of students who do not face such challenges. Some of those who successfully complete their degree need help in making the transition to the workforce after graduation. A program at Lamar University takes a multidisciplinary approach to addressing these problems. It is designed to recruit, retain and transition undergraduates to careers in STEM, focusing its efforts on five science disciplines and on these "at-risk" students. The program was supported by a 5-year grant from the National Science Foundation and is supported through August 31, 2016 by Lamar University and a grant from ExxonMobil. A formal assessment plan documents the program's success. The program received an award from the Texas Higher Education Board for its contributions towards Closing the Gaps in Higher Education in Texas. This paper describes the program's theoretical framework, research questions, methods, evaluation plan, and instruments. It presents an analysis of the results achieved using these methods and implications for improvements to the program resulting from lessons learned.
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2009-0203] Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory Committee AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. [[Page...
Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs
ERIC Educational Resources Information Center
Veregin, Howard
2015-01-01
Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…
Status of the TESS Science Processing Operations Center
NASA Astrophysics Data System (ADS)
Jenkins, Jon Michael; Caldwell, Douglas A.; Davies, Misty; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Twicken, Joseph D.; Wohler, Bill
2018-06-01
The Transiting Exoplanet Survey Satellite (TESS) was selected by NASA’s Explorer Program to conduct a search for Earth’s closest cousins starting in 2018. TESS will conduct an all-sky transit survey of F, G and K dwarf stars between 4 and 12 magnitudes and M dwarf stars within 200 light years. TESS is expected to discover 1,000 small planets less than twice the size of Earth, and to measure the masses of at least 50 of these small worlds. The TESS science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler science pipeline. Like the Kepler pipeline, the TESS pipeline provides calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline searches through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline generates a suite of diagnostic metrics for each transit-like signature, and then extracts planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search are modeled on the Kepler transit search products (tabulated numerical results, time series products, and pdf reports) all of which will be archived to MAST. Synthetic sample data products are available at https://archive.stsci.edu/tess/ete-6.html. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
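To illustrate the transit-search step, the sketch below injects a box-shaped transit into a synthetic light curve and recovers it with astropy's box least squares (BLS) periodogram. BLS is only a stand-in for the concept here; the SPOC transit search actually uses a wavelet-based adaptive matched filter inherited from Kepler.

```python
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(42)
t = np.arange(0, 27.0, 2.0 / 60 / 24)          # 27 days at 2-minute cadence
y = 1.0 + rng.normal(0.0, 5e-4, t.size)        # flat star plus photometric noise
in_transit = (t % 3.5) < 0.1                   # period 3.5 d, duration 0.1 d
y[in_transit] -= 1e-3                          # 1000 ppm transit depth

bls = BoxLeastSquares(t, y)
power = bls.autopower(0.1)                     # search with 0.1-day box durations
best = np.argmax(power.power)                  # expect a peak near 3.5 d
print(f"period = {power.period[best]:.3f} d, depth = {power.depth[best]:.2e}")
```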
Canary: an atomic pipeline for clinical amplicon assays.
Doig, Kenneth D; Ellul, Jason; Fellowes, Andrew; Thompson, Ella R; Ryland, Georgina; Blombery, Piers; Papenfuss, Anthony T; Fox, Stephen B
2017-12-15
High throughput sequencing requires bioinformatics pipelines to process large volumes of data into meaningful variants that can be translated into a clinical report. These pipelines often suffer from a number of shortcomings: they lack robustness and have many components written in multiple languages, each with a variety of resource requirements. Pipeline components must be linked together with a workflow system to achieve the processing of FASTQ files through to a VCF file of variants. Crafting these pipelines requires considerable bioinformatics and IT skills beyond the reach of many clinical laboratories. Here we present Canary, a single program that can be run on a laptop, which takes FASTQ files from amplicon assays through to an annotated VCF file ready for clinical analysis. Canary can be installed and run with a single command using Docker containerization or run as a single JAR file on a wide range of platforms. Although it is a single utility, Canary performs all the functions present in more complex and unwieldy pipelines. All variants identified by Canary are 3' shifted and represented in their most parsimonious form to provide a consistent nomenclature, irrespective of sequencing variation. Further, proximate in-phase variants are represented as a single HGVS 'delins' variant. This allows for correct nomenclature and consequences to be ascribed to complex multi-nucleotide polymorphisms (MNPs), which are otherwise difficult to represent and interpret. Variants can also be annotated with hundreds of attributes sourced from MyVariant.info to give up to date details on pathogenicity, population statistics and in-silico predictors. Canary has been used at the Peter MacCallum Cancer Centre in Melbourne for the last 2 years for the processing of clinical sequencing data. By encapsulating clinical features in a single, easily installed executable, Canary makes sequencing more accessible to all pathology laboratories. Canary is available for download as source or a Docker image at https://github.com/PapenfussLab/Canary under a GPL-3.0 License.
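The 3' shifting rule can be illustrated with a toy right-alignment of a deletion inside a homopolymer run; this shows the general normalization idea rather than Canary's actual implementation.

```python
def right_shift_deletion(ref, pos, length):
    """Shift a deletion of `length` bases at 0-based `pos` as far 3' (right)
    as possible: while the base just past the deleted block equals the first
    deleted base, both positions describe the same resulting haplotype."""
    while pos + length < len(ref) and ref[pos] == ref[pos + length]:
        pos += 1
    return pos

# In 'GATTTTC', deleting one T at position 2, 3, 4 or 5 yields the same
# sequence 'GATTTC'; the HGVS 3' rule picks the rightmost representation.
ref = "GATTTTC"
print(right_shift_deletion(ref, 2, 1))   # -> 5 (the last T of the run)
```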
Byrne, Lauren M.; Holt, Kathleen D.; Richter, Thomas; Miller, Rebecca S.; Nasca, Thomas J.
2010-01-01
Background Increased focus on the number and type of physicians delivering health care in the United States necessitates a better understanding of changes in graduate medical education (GME). Data collected by the Accreditation Council for Graduate Medical Education (ACGME) allow longitudinal tracking of residents, revealing the number and type of residents who continue GME following completion of an initial residency. We examined trends in the percent of graduates pursuing additional clinical education following graduation from ACGME-accredited pipeline specialty programs (specialties leading to initial board certification). Methods Using data collected annually by the ACGME, we tracked residents graduating from ACGME-accredited pipeline specialty programs between academic year (AY) 2002–2003 and AY 2006–2007 and those pursuing additional ACGME-accredited training within 2 years. We examined changes in the number of graduates and the percent of graduates continuing GME by specialty, by type of medical school, and overall. Results The number of pipeline specialty graduates increased by 1171 (5.3%) between AY 2002–2003 and AY 2006–2007. During the same period, the number of graduates pursuing additional GME increased by 1059 (16.7%). The overall rate of continuing GME increased each year, from 28.5% (6331/22229) in AY 2002–2003 to 31.6% (7390/23400) in AY 2006–2007. Rates differed by specialty and for US medical school graduates (26.4% [3896/14752] in AY 2002–2003 to 31.6% [4718/14941] in AY 2006–2007) versus international medical graduates (35.2% [2118/6023] to 33.8% [2246/6647]). Conclusion The number of graduates and the rate of continuing GME increased from AY 2002–2003 to AY 2006–2007. Our findings show a recent increase in the rate of continued training for US medical school graduates compared to international medical graduates. Our results differ from previously reported rates of subspecialization in the literature. Tracking individual residents through residency and fellowship programs provides a better understanding of residents' pathways to practice. PMID:22132288
Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy
2017-03-01
Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
... Texas. Pipelines include El Paso Natural Gas, Transwestern Pipeline, Natural Gas Pipeline Co. of America, Northern Natural Gas, Delhi Pipeline, Oasis Pipeline, EPGT Texas and Lone Star Pipeline. The Platt's [[Page... pipelines. These pipelines bring in natural gas from fields in the Gulf Coast region and ship it to major...
NASA Astrophysics Data System (ADS)
Gunes, Ersin Fatih
Turkey is located between Europe, which has an increasing demand for natural gas, and the Middle East, Asia, and Russia, which have rich natural gas supplies. Because of this geographical location, Turkey is strategically important for energy transit. Supplying this demand requires a pipeline network configuration with efficient lengths, pressures, diameters, and numbers of compressor stations. Since Turkey already has a constructed and operating network topology, this study focuses on obtaining an optimal configuration of the pipelines, including an optimal number of compressor stations in optimal locations. Identifying a network design with the lowest cost is important because of the high maintenance and set-up costs. The number of compressor stations, the pipeline segments' lengths, the diameter sizes, and the pressures at compressor stations are the decision variables in this study. Two existing optimization models were selected and applied to the case study of Turkey. Because of the fixed cost of investment, both models are formulated as mixed integer nonlinear programs, which require branch and bound combined with nonlinear programming solution methods. The two models differ in factors that can affect the network system, such as wall thickness, material balance, compressor isentropic head, and the amount of gas to be delivered. The results obtained by these two techniques are compared with each other and with the current system. The major differences between the results are in costs, pressures, and flow rates. Both solution techniques find a minimum-cost solution for their model that is less than the current cost of the system while satisfying all the constraints on diameter, length, flow rate, and pressure. These results give the big picture of an ideal configuration of the future network for Turkey.
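A toy version of the underlying trade-off (our own simplification, not either of the study's models): enumerate the integer number of compressor stations and a catalog of diameters, reject designs whose per-segment pressure drop exceeds the pipe rating, and take the cheapest feasible combination.

```python
import math

# Toy single-line design: choose the number of compressor stations n (integer)
# and the diameter D (from a catalog) to minimize total cost. All coefficients
# are made-up illustrative numbers; the study's models are far more detailed.
L, Q = 500_000.0, 25.0        # line length (m), fixed flow (m^3/s)
K = 2e-8                      # lumped friction constant (made-up units)
DP_MAX = 5.0                  # per-segment pressure-drop limit (MPa)
DIAMETERS = [0.6, 0.8, 1.0, 1.2]
C_PIPE, C_STATION, C_ENERGY = 900.0, 4e6, 2e6

def cost(n, D):
    dp = K * (L / n) * Q**2 / D**5          # pressure drop per segment (MPa)
    if dp > DP_MAX:                          # exceeds pipe pressure rating
        return math.inf
    # Pipe capital, station capital, and a compression cost growing with dp.
    return C_PIPE * D * L + n * (C_STATION + C_ENERGY * dp**2)

best = min((cost(n, D), n, D) for n in range(1, 16) for D in DIAMETERS)
print("cost=%.3e  stations=%d  D=%.1f m" % best)
```

Even this toy shows the MINLP character of the problem: the station count is integer, the diameter choice is discrete, and the cost and pressure constraints are nonlinear in both.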
49 CFR 199.202 - Alcohol misuse plan.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Alcohol misuse plan. 199.202 Section 199.202... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Alcohol Misuse Prevention Program § 199.202 Alcohol misuse plan. Each operator must maintain and follow a written alcohol...
49 CFR 199.202 - Alcohol misuse plan.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Alcohol misuse plan. 199.202 Section 199.202... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Alcohol Misuse Prevention Program § 199.202 Alcohol misuse plan. Each operator must maintain and follow a written alcohol...
49 CFR 199.237 - Other alcohol-related conduct.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false Other alcohol-related conduct. 199.237 Section 199... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Alcohol Misuse Prevention Program § 199.237 Other alcohol-related conduct. (a) No operator shall...
49 CFR 199.237 - Other alcohol-related conduct.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 3 2011-10-01 2011-10-01 false Other alcohol-related conduct. 199.237 Section 199... MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Alcohol Misuse Prevention Program § 199.237 Other alcohol-related conduct. (a) No operator shall...
Education for Rural Practice: A Saga of Pipelines and Plumbers. Commentary.
ERIC Educational Resources Information Center
Norris, Thomas E.
2000-01-01
Current efforts to address the severe and worsening shortage of rural physicians include attracting and preparing rural students for medical school, enhancing medical school curricula, and placing and retaining rural physicians. Recommendations include creating incentives for successful rural training programs, encouraging innovation in rural…
Business Analytics in Practice and in Education: A Competency-Based Perspective
ERIC Educational Resources Information Center
Mamonov, Stanislav; Misra, Ram; Jain, Rashmi
2015-01-01
Business analytics is a fast-growing area in practice. The rapid growth of business analytics in practice in the recent years is mirrored by a corresponding fast evolution of new educational programs. While more than 130 graduate and undergraduate degree programs in business analytics have been launched in the past 5 years, no commonly accepted…
Bartels, Stephen J; Lebowitz, Barry D; Reynolds, Charles F; Bruce, Martha L; Halpain, Maureen; Faison, Warachal E; Kirwin, Paul D
2010-01-01
This report summarizes the findings and recommendations of an expert consensus workgroup that addressed the endangered pipeline of geriatric mental health (GMH) researchers. The workgroup was convened at the Summit on Challenges in Recruitment, Retention, and Career Development in Geriatric Mental Health Research in late 2007. Major identified challenges included attracting and developing early-career investigators into the field of GMH research; a shortfall of geriatric clinical providers and researchers; a disproportionate lack of minority researchers; inadequate mentoring and career development resources; and the loss of promising researchers during the vulnerable period of transition from research training to independent research funding. The field of GMH research has been at the forefront of developing successful programs that address these issues while spanning the spectrum of research career development. These programs serve as a model for other fields and disciplines. Core elements of these multicomponent programs include summer internships to foster early interest in GMH research (Summer Training on Aging Research Topics-Mental Health Program), research sponsorships aimed at recruitment into the field of geriatric psychiatry (Stepping Stones), research training institutes for early career development (Summer Research Institute in Geriatric Psychiatry), mentored intensive programs on developing and obtaining a first research grant (Advanced Research Institute in Geriatric Psychiatry), targeted development of minority researchers (Institute for Research Minority Training on Mental Health and Aging), and a Web-based clearinghouse of mentoring seminars and resources (MedEdMentoring.org). This report discusses implications of and principles for disseminating these programs, including examples of replications in fields besides GMH research.
Good, now keep going: challenging the status quo in STEM pipeline and access programs
NASA Astrophysics Data System (ADS)
Wiseman, Dawn; Herrmann, Randy
2018-03-01
This contribution engages in conversation with McMahon, Griese, and Kenyon (this issue) to consider how the SURE program they describe represents a pragmatic approach to addressing the issue of underrepresentation of Indigenous people in STEM post-secondary programs. We explore how such programs are generally positioned and how they might be positioned differently to challenge the status quo within Western post-secondary institutions. The challenge arises from moving beyond the immediate pragmatics of addressing an identifiable issue framed as a problem to considering how post-secondary institutions and people developing access and recruitment programs might begin unlearning colonialism.
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.
Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita
2016-10-11
We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.
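Since the abstract notes that IBMWA's predictive output omits sensitivity, specificity, odds ratios, and a confusion matrix, a reader can recover these from any classifier's binary predictions. A minimal sketch with scikit-learn; the labels and predictions below are made up for illustration:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # hypothetical observed outcomes
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0])   # hypothetical model predictions

# confusion_matrix returns [[tn, fp], [fn, tp]] for binary 0/1 labels
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)        # true positive rate
specificity = tn / (tn + fp)        # true negative rate
odds_ratio = (tp * tn) / (fp * fn)  # diagnostic odds ratio

print(f"sens={sensitivity:.2f} spec={specificity:.2f} OR={odds_ratio:.1f}")
```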
DOE Office of Scientific and Technical Information (OSTI.GOV)
El-Jundi, I.M.
Qatar NGL/2 plant, commissioned in December 1979, was designed to process the associated gas from the offshore crude oil fields of Qatar. The dehydrated sour lean gas and wet sour liquids are transported via two separate lines to the Umm Said NGL Complex, about 120 km from the central offshore station. The liquids line, 300 mm (12 inch) diameter, has suffered general and severe pitting corrosion. The lean gas line, 600 mm (24 inch) diameter, has suffered corrosion and extensive hydrogen-induced cracking (HIC), also known as HIPC. Both lines never performed to their design parameters, and many problems have been experienced in the downstream facilities. All efforts to clean the liquids line of solids (debris) have failed. This in turn interfered with the planned corrosion control program, allowing corrosion to continue. Investigation work has been done by various specialists in an attempt to find the origin of the solids and to recommend necessary remedial actions. Should the lines fail from pitting corrosion, a liquids leak at a pressure of about 11,000 kPa would be very dangerous, especially if it occurred onshore. In order to protect the NGL-2 operations against possible risks, both in terms of safety and losses in revenue, critical sections of the pipelines have been replaced, while the whole gas liquids pipelines will be replaced shortly. Supplementary documents to the API standards were prepared by QPC for the replaced pipelines.
A pipeline for comprehensive and automated processing of electron diffraction data in IPLT.
Schenk, Andreas D; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas
2013-05-01
Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intense and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library and Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs.
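The central caching facility mentioned above is a pattern worth illustrating. The sketch below is a toy version of the idea, with hypothetical names (`DataManager`, `refine_lattice`) rather than IPLT's actual API: results are keyed by step name and parameters and persisted to disk, so re-running a pipeline stage is cheap.

```python
import hashlib
import pickle
from pathlib import Path

class DataManager:
    """Toy caching facility: intermediate results keyed by step + parameters."""

    def __init__(self, cache_dir="cache"):
        self.cache_dir = Path(cache_dir)
        self.cache_dir.mkdir(exist_ok=True)

    def get(self, step_name, params, compute):
        """Return a cached result for (step_name, params), or compute and cache it."""
        key = hashlib.sha1(
            repr((step_name, sorted(params.items()))).encode()).hexdigest()
        path = self.cache_dir / f"{step_name}-{key}.pkl"
        if path.exists():                       # cache hit: skip recomputation
            return pickle.loads(path.read_bytes())
        result = compute(**params)              # cache miss: run the module
        path.write_bytes(pickle.dumps(result))
        return result

# A hypothetical diffraction-processing step would then be wrapped as:
# mgr = DataManager()
# lattice = mgr.get("refine_lattice", {"pattern": "ap0_001.img"}, refine_lattice)
```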
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-20
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Technical Hazardous Liquid Pipeline Safety Standards Committee AGENCY: Pipeline and Hazardous Materials... for natural gas pipelines and for hazardous liquid pipelines. Both committees were established under...
77 FR 34123 - Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0100] Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines AGENCY: Office of Pipeline Safety, Pipeline and Hazardous Materials Safety Administration, DOT. ACTION...
Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.
Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A
2005-04-07
Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (graphical user interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
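To make the idea of properly defined, generic re-usable operators concrete, here is a minimal sketch, in Python rather than GPIPE's XML, of sequence, iteration, and conditional operators that wrap task functions and compose into a workflow. All names and the toy tasks are illustrative:

```python
def sequence(*tasks):
    """Compose tasks left-to-right into a single task."""
    def run(data):
        for task in tasks:
            data = task(data)
        return data
    return run

def iterate(task, times):
    """Generic iteration operator: apply the same task a fixed number of times."""
    return sequence(*([task] * times))

def conditional(predicate, if_true, if_false):
    """Generic branch operator: route data by a predicate."""
    return lambda data: if_true(data) if predicate(data) else if_false(data)

# Illustrative workflow: align, refine twice, keep only sufficiently large results.
align = lambda seqs: sorted(seqs)          # stand-in for a real alignment method
refine = lambda aln: aln                   # stand-in for a refinement method
big_enough = lambda aln: len(aln) > 2

workflow = sequence(align, iterate(refine, 2),
                    conditional(big_enough, lambda a: a, lambda a: []))
print(workflow(["tgca", "acgt", "gact"]))
```

Because each operator returns an ordinary task, workflows built this way nest arbitrarily, which is the re-usability the meta-analysis argues for.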
Total Magnetic Field Signatures over Submarine HVDC Power Cables
NASA Astrophysics Data System (ADS)
Johnson, R. M.; Tchernychev, M.; Johnston, J. M.; Tryggestad, J.
2013-12-01
High Voltage Direct Current (HVDC) technology is widely used to transmit electrical power over considerable distances using submarine cables. The most commonly known examples are the HVDC cables between Italy and Greece (160 km), Victoria and Tasmania (300 km), New Jersey and Long Island (82 km), and the Transbay cable (Pittsburg, California to San Francisco). These cables are inspected periodically and their location and burial depth verified. This inspection applies to live and idle cables; in particular, a survey company could be required to locate pieces of a dead cable for subsequent removal from the sea floor. Most HVDC cables produce a constant magnetic field; therefore one of the possible survey tools is a marine total-field magnetometer. We present mathematical expressions for the expected magnetic fields and compare them with fields observed during actual surveys. We also compare these anomaly fields with the magnetic fields produced by other long objects, such as submarine pipelines. The data-processing techniques discussed include the use of the analytic signal and direct modeling of the total magnetic field. The analytic-signal analysis can be adapted using ground truth where available, but the total field allows better discrimination of the cable parameters, in particular to distinguish between a live and an idle cable. Use of a Transverse Gradiometer (TVG) allows for easy discrimination between cable and pipeline objects: considerable magnetic gradient is present in the case of a pipeline, whereas there is less gradient for the DC power cable. Thus the TVG is used to validate assumptions made during the data interpretation process. Data obtained during the TVG surveys suggest that the magnetic field of a live HVDC cable is described by an expression for two infinitely long wires carrying current in opposite directions.
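The closing statement can be made concrete with a short numerical sketch: the anomaly of two infinitely long antiparallel line currents evaluated along a transverse profile. The current, conductor separation, and sensor range below are assumed values, not survey parameters from the study.

```python
import numpy as np

MU0 = 4e-7 * np.pi
I, d, h = 1000.0, 0.5, 30.0   # current [A], conductor separation [m], sensor range [m]

def wire_field(I, x0, x, z):
    """(Bx, Bz) of an infinite straight wire along y at (x0, 0) carrying current I."""
    dx, dz = x - x0, z
    r2 = dx ** 2 + dz ** 2
    coef = MU0 * I / (2 * np.pi * r2)
    return -coef * dz, coef * dx          # azimuthal field, B = mu0*I/(2*pi*r)

x = np.linspace(-200.0, 200.0, 9)         # transverse offsets along the profile [m]
Bx1, Bz1 = wire_field(+I, +d / 2, x, h)   # forward conductor
Bx2, Bz2 = wire_field(-I, -d / 2, x, h)   # return conductor

# Opposite currents nearly cancel far away, so the pair's anomaly falls off
# faster than a single wire's 1/r field -- the signature discussed above.
anomaly = np.hypot(Bx1 + Bx2, Bz1 + Bz2)
print(np.c_[x, anomaly * 1e9])            # anomaly in nanotesla
```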
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-21
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2011-0127] Pipeline Safety: Meetings of the Technical Pipeline Safety Standards Committee and the Technical Hazardous Liquid Pipeline Safety Standards Committee AGENCY: Pipeline and Hazardous Materials...
FutureGen 2.0 Pipeline and Regional Carbon Capture Storage Project - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burger, Chris; Wortman, David; Brown, Chris
The U.S. Department of Energy's (DOE) FutureGen 2.0 Program involves two projects: (1) the Oxy-Combustion Power Plant Project and (2) the CO2 Pipeline and Storage Project. This Final Technical Report is focused on the CO2 Pipeline and Storage Project. The FutureGen 2.0 CO2 Pipeline and Storage Project evolved from an initial siting and project definition effort in Phase I into the Phase II activity, consisting of permitting, design development, the acquisition of land rights, facility design, and licensing and regulatory approvals. Phase II also progressed into construction packaging, construction procurement, and targeted early preparatory activities in the field. The CO2 Pipeline and Storage Project accomplishments were significant, and in some cases unprecedented. The engineering, permitting, legal, stakeholder, and commercial learnings substantially advance the nation's understanding of commercial-scale CO2 storage in deep saline aquifers. Voluminous and significant information was obtained from the drilling and testing program of the subsurface, and sophisticated modeling was performed that held up to a wide range of scrutiny. All designs progressed to the point of securing construction contracts, or comfort letters attesting to successful negotiation of all contract terms and willing execution at the appropriate time, for all major project elements (pipeline, surface facilities, and subsurface) as well as operations. While the physical installation of the planned facilities did not proceed, in part due to insufficient time to complete the project prior to the expiration of federal funding, the project met significant objectives prior to DOE's closeout decision. Had additional time been available, there were no known, insurmountable obstacles that would have precluded successful construction and operation of the project. Due to the suspension of the project, site restoration activities were developed and the work was accomplished. The site restoration efforts are also documented in this report. All permit applications had been submitted to all agencies for those permits or approvals required prior to the start of project construction. Most of the requisite permits were received during Phase II. This report includes information on each permitting effort. Successes and lessons learned are included in this report that will add value to the next generation of carbon storage efforts.
Pydpiper: a flexible toolkit for constructing novel registration pipelines.
Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P
2014-01-01
Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
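Of the five listed innovations, duplicate-stage elimination is the easiest to sketch: identify each stage by its command and inputs, and drop repeats at insertion time. The class below is an illustrative toy, not Pydpiper's actual API (the `mincblur` command name is only an example of a registration-pipeline tool):

```python
class Pipeline:
    """Toy pipeline that silently eliminates duplicate stages."""

    def __init__(self):
        self.stages = {}                     # key -> stage, insertion-ordered

    def add_stage(self, cmd, inputs, output):
        key = (cmd, tuple(inputs))           # a stage's identity: command + inputs
        if key not in self.stages:           # duplicates are dropped right here
            self.stages[key] = {"cmd": cmd, "inputs": inputs, "output": output}
        return self.stages[key]["output"]    # callers always get the one output

p = Pipeline()
p.add_stage("mincblur", ["subj1.mnc"], "subj1_blur.mnc")
p.add_stage("mincblur", ["subj1.mnc"], "subj1_blur.mnc")  # ignored: already queued
print(len(p.stages))  # -> 1
```

Keying stages this way is what lets independently written modules be composed freely: if two applications request the same blur of the same file, the work is scheduled once.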
Central Stars of Planetary Nebulae in the LMC
NASA Technical Reports Server (NTRS)
Bianchi, Luciana
2004-01-01
In FUSE Cycle 2 program B001 we studied Central Stars of Planetary Nebulae (CSPN) in the Large Magellanic Cloud. All FUSE observations have been successfully completed and have been reduced, analyzed, and published. The analysis and the results are summarized below. The FUSE data were reduced using the latest available version of the FUSE calibration pipeline (CALFUSE v2.2.2). The flux of these LMC post-AGB objects is at the threshold of FUSE's sensitivity, and thus special care in the background subtraction was needed during the reduction. Because of their faintness, the targets required many orbit-long exposures, each of which typically had low (target) count-rates. Each calibrated extracted sequence was checked for unacceptable count-rate variations (a sign of detector drift), misplaced extraction windows, and other anomalies. All the good calibrated exposures were combined using FUSE pipeline routines. The default FUSE pipeline attempts to model the background measured off-target and subtracts it from the target spectrum. We found that, for these faint objects, the background appeared to be over-estimated by this method, particularly at shorter wavelengths (i.e., < 1000 Å). We therefore tried two other reductions. In the first method, subtraction of the measured background is turned off and the background is taken to be the model scattered light scaled by the exposure time. In the second, the first few steps of the pipeline were run on the individual exposures (correcting for effects unique to each exposure, such as Doppler shift, grating motions, etc.). Then the photon lists from the individual exposures were combined, and the remaining steps of the pipeline were run on the combined file. Thus, more total counts for both the target and background allowed for a better extraction.
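The first alternative reduction is simple enough to sketch numerically: replace the measured off-target background with a scattered-light model scaled by exposure time. The arrays below are toy stand-ins, not FUSE data:

```python
import numpy as np

counts = np.array([5.0, 4.0, 9.0, 30.0, 8.0, 5.0, 4.0])  # extracted target counts per bin
scatter_rate = np.full(7, 0.004)   # modeled scattered-light rate [counts/s/bin] (invented)
exptime = 600.0                    # exposure time [s] (invented)

# Model-based subtraction: no noisy measured background enters the result,
# which matters when the source itself is near the sensitivity threshold.
spectrum = counts - scatter_rate * exptime
print(spectrum)
```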
A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines.
Cieślik, Marcin; Mura, Cameron
2011-02-25
Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain-specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy can also be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples.
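A minimal sketch of the dataflow idea follows, with user-written functions as data-coupled components evaluated over a worker pool in adjustable batches. The component names and batch mechanics are illustrative, not PaPy's actual API:

```python
from multiprocessing import Pool

def parse(record):
    """User-written component: normalize a raw record."""
    return record.strip().upper()

def score(seq):
    """User-written component: toy GC count, consuming parse's output."""
    return (seq, seq.count("G") + seq.count("C"))

def batched(items, size):
    """Yield fixed-size batches; size tunes parallelism vs. memory."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

records = ["acgt", "ggcc", "atta", "gggg", "acac", "ttgg"]

if __name__ == "__main__":
    with Pool(2) as pool:                      # flexibly pooled compute resources
        results = []
        for batch in batched(records, size=3):
            seqs = pool.map(parse, batch)      # first transformation of the chain
            results.extend(pool.map(score, seqs))  # second, data-coupled stage
    print(results)
```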
Granatum: a graphical single-cell RNA-Seq analysis pipeline for genomics scientists.
Zhu, Xun; Wolfgruber, Thomas K; Tasato, Austin; Arisdakessian, Cédric; Garmire, David G; Garmire, Lana X
2017-12-05
Single-cell RNA sequencing (scRNA-Seq) is an increasingly popular platform to study heterogeneity at the single-cell level. Computational methods to process scRNA-Seq data are not very accessible to bench scientists as they require a significant amount of bioinformatic skills. We have developed Granatum, a web-based scRNA-Seq analysis pipeline to make analysis more broadly accessible to researchers. Without a single line of programming code, users can click through the pipeline, setting parameters and visualizing results via the interactive graphical interface. Granatum conveniently walks users through various steps of scRNA-Seq analysis. It has a comprehensive list of modules, including plate merging and batch-effect removal, outlier-sample removal, gene-expression normalization, imputation, gene filtering, cell clustering, differential gene expression analysis, pathway/ontology enrichment analysis, protein network interaction visualization, and pseudo-time cell series construction. Granatum enables broad adoption of scRNA-Seq technology by empowering bench scientists with an easy-to-use graphical interface for scRNA-Seq data analysis. The package is freely available for research use at http://garmiregroup.org/granatum/app.
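Granatum itself requires no code; for readers who want the equivalent steps in script form, here is a rough sketch of a comparable workflow using the scanpy library (a different, script-based tool; the Leiden step additionally requires the leidenalg package):

```python
import scanpy as sc

adata = sc.datasets.pbmc3k()                   # public example dataset
sc.pp.filter_genes(adata, min_cells=3)         # gene filtering
sc.pp.normalize_total(adata, target_sum=1e4)   # gene-expression normalization
sc.pp.log1p(adata)
sc.pp.pca(adata)                               # dimensionality reduction
sc.pp.neighbors(adata)                         # neighborhood graph for clustering
sc.tl.leiden(adata)                            # cell clustering
sc.tl.rank_genes_groups(adata, "leiden")       # differential gene expression
```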
Machine-learning-based real-bogus system for the HSC-SSP moving object detection pipeline
NASA Astrophysics Data System (ADS)
Lin, Hsing-Wen; Chen, Ying-Tung; Wang, Jen-Hung; Wang, Shiang-Yu; Yoshida, Fumi; Ip, Wing-Huen; Miyazaki, Satoshi; Terai, Tsuyoshi
2018-01-01
Machine-learning techniques are widely applied in many modern optical sky surveys, e.g., Pan-STARRS1, PTF/iPTF, and the Subaru/Hyper Suprime-Cam survey, to reduce human intervention in data verification. In this study, we have established a machine-learning-based real-bogus system to reject false detections in the Subaru/Hyper-Suprime-Cam Strategic Survey Program (HSC-SSP) source catalog. Therefore, the HSC-SSP moving object detection pipeline can operate more effectively due to the reduction of false positives. To train the real-bogus system, we use stationary sources as the real training set and "flagged" data as the bogus set. The training set contains 47 features, most of which are photometric measurements and shape moments generated from the HSC image reduction pipeline (hscPipe). Our system can reach a true positive rate (tpr) ~96% with a false positive rate (fpr) ~1%, or tpr ~99% at fpr ~5%. Therefore, we conclude that stationary sources are decent real training samples, and using photometry measurements and shape moments can reject false positives effectively.
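The quoted operating points correspond to picking a score threshold from a trained classifier's ROC curve. A minimal sketch with scikit-learn, using synthetic features in place of the 47 hscPipe measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X_real = rng.normal(0.0, 1.0, (2000, 5))    # stand-in for stationary sources (label 1)
X_bogus = rng.normal(0.8, 1.5, (2000, 5))   # stand-in for flagged detections (label 0)
X = np.vstack([X_real, X_bogus])
y = np.r_[np.ones(2000), np.zeros(2000)]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# roc_curve returns monotonically increasing fpr; find the threshold at fpr ~1%
fpr, tpr, thr = roc_curve(y_te, clf.predict_proba(X_te)[:, 1])
i = np.searchsorted(fpr, 0.01)
print(f"threshold={thr[i]:.2f} tpr={tpr[i]:.2f} fpr={fpr[i]:.3f}")
```

Moving the operating point along the same curve is how one trades tpr ~96% at fpr ~1% for tpr ~99% at fpr ~5%.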
VIV analysis of pipelines under complex span conditions
NASA Astrophysics Data System (ADS)
Wang, James; Steven Wang, F.; Duan, Gang; Jukes, Paul
2009-06-01
Spans occur when a pipeline is laid on a rough undulating seabed or when upheaval buckling occurs due to constrained thermal expansion. This not only results in static and dynamic loads on the flowline at span sections, but also generates vortex induced vibration (VIV), which can lead to fatigue issues. The phenomenon, if not predicted and controlled properly, will negatively affect pipeline integrity, leading to expensive remediation and intervention work. Span analysis can be complicated by: long span lengths, a large number of spans caused by a rough seabed, and multi-span interactions. In addition, the complexity can be more onerous and challenging when soil uncertainty, concrete degradation and unknown residual lay tension are considered in the analysis. This paper describes the latest developments and a ‘state-of-the-art’ finite element analysis program that has been developed to simulate the span response of a flowline under complex boundary and loading conditions. Both VIV and direct wave loading are captured in the analysis and the results are sequentially used for the ultimate limit state (ULS) check and fatigue life calculation.
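The screening check underlying span VIV assessment can be sketched in a few lines: compare the span's first natural frequency (here a simply supported Euler-Bernoulli beam) with the vortex-shedding frequency, and flag lock-in risk when the ratio approaches one. All pipe and flow values below are assumed, and real assessments add effective mass, axial tension, soil stiffness, and the multi-span interactions the paper's finite element program handles:

```python
import math

E, D, t = 207e9, 0.324, 0.0127   # steel modulus [Pa], OD [m], wall thickness [m]
L, U, St = 40.0, 0.6, 0.2        # span length [m], current speed [m/s], Strouhal number

I = math.pi / 64 * (D**4 - (D - 2 * t)**4)            # second moment of area [m^4]
m = 7850 * math.pi / 4 * (D**2 - (D - 2 * t)**2)      # steel mass per metre (contents
                                                      # and added mass ignored)

f_natural = math.pi / (2 * L**2) * math.sqrt(E * I / m)   # first bending mode [Hz]
f_shedding = St * U / D                                   # vortex-shedding frequency [Hz]

print(f"f_n={f_natural:.2f} Hz, f_s={f_shedding:.2f} Hz, "
      f"ratio={f_shedding / f_natural:.2f}")   # ratios near 1 flag lock-in risk
```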
The TESS Transiting Planet Search Predicted Recovery and Reliability Rates
NASA Astrophysics Data System (ADS)
Smith, Jeffrey C.; Caldwell, Douglas A.; Davies, Misty; Jenkins, Jon Michael; Li, Jie; Morris, Robert L.; Rose, Mark; Tenenbaum, Peter; Ting, Eric; Twicken, Joseph D.; Wohler, Bill
2018-06-01
The Transiting Exoplanet Survey Satellite (TESS) will search for transiting planet signatures via the Science Processing Operations Center (SPOC) Science Pipeline at NASA Ames Research Center. We report on predicted transit recovery and reliability rates for planetary signatures. These estimates are based on simulated runs of the pipeline using realistic stellar models and transiting planet populations along with best estimates for instrumental noise, thermal induced focus changes, instrumental drift and stochastic artifacts in the light curve data. Key sources of false positives are identified and summarized. TESS will launch in 2018 and survey the full sky for transiting exoplanets over a period of two years. The SPOC pipeline was ported from the Kepler Science Operations Center (SOC) codebase and extended for TESS after the mission was selected for flight in the NASA Astrophysics Explorer program. Candidate planet detections and data products will be delivered to the Mikulski Archive for Space Telescopes (MAST); the MAST URL is archive.stsci.edu/tess. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
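The predicted rates come from injection-and-recovery exercises: synthetic transit signatures are injected into simulated light curves, the pipeline is run, and the recovered fraction estimates completeness. A toy sketch of that bookkeeping, where the noise model is invented and only the 7.1-sigma detection threshold reflects the Kepler-heritage criterion:

```python
import numpy as np

rng = np.random.default_rng(42)
n_inject = 1000
depth = rng.uniform(200e-6, 2000e-6, n_inject)  # injected fractional transit depths
noise = 400e-6                                  # per-transit noise level (assumed)
n_transits = 10                                 # transits folded per candidate (assumed)

# Detection statistic grows as sqrt(number of transits) for phase-folded data
snr = depth / noise * np.sqrt(n_transits)
recovered = snr > 7.1                           # Kepler/TESS-style threshold

print(f"recovery rate: {recovered.mean():.1%}")
```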
DEVELOPMENT OF AN ENVIRONMENTALLY BENIGN MICROBIAL INHIBITOR TO CONTROL INTERNAL PIPELINE CORROSION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristine L. Lowe; Bill W. Bogan; Wendy R. Sullivan
2004-07-30
The overall program objective is to develop and evaluate environmentally benign agents or products that are effective in the prevention, inhibition, and mitigation of microbially influenced corrosion (MIC) on the internal surfaces of metallic natural gas pipelines. The goal is to develop one or more environmentally benign (a.k.a. "green") products that can be applied to maintain the structure and dependability of the natural gas infrastructure. Previous testing indicated that the growth of, and the metal corrosion caused by, pure cultures of sulfate-reducing bacteria were inhibited by hexane extracts of some pepper plants. This quarter, tests were performed with mixed bacterial cultures obtained from natural gas pipelines. Treatment with the pepper extracts affected the growth and metabolic activity of the microbial consortia. Specifically, the growth and metabolism of sulfate-reducing bacteria were inhibited. The demonstration that pepper extracts can inhibit the growth and metabolism of sulfate-reducing bacteria in mixed cultures is a significant observation validating a key hypothesis of the project. Tests to determine the effects of pepper extracts on mature/established biofilms will be performed next.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... Registry of Pipeline and Liquefied Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Parts... Register (75 FR 72878) titled: ``Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting...
ERIC Educational Resources Information Center
Roach, Ronald
2005-01-01
Race-conscious affirmative action in higher education survived a close challenge in 2003 when the U.S. Supreme Court ruled that race was a valid academic admission criterion in the "Grutter v. Bollinger" case. Two years later, a number of "pipeline" programs to help under-represented minorities gain admission to and complete graduate school have…
Disrupting the Pipeline: Critical Analyses of Student Pathways through Postsecondary STEM Education
ERIC Educational Resources Information Center
Metcalf, Heather E.
2014-01-01
Critical mixed methods approaches allow us to reflect upon the ways in which we collect, measure, interpret, and analyze data, providing novel alternatives for quantitative analysis. For institutional researchers, whose work influences institutional policies, programs, and practices, the approach has the transformative ability to expose and create…
A USEPA-sponsored field demonstration program was conducted to gather technically reliable cost and performance information on the electro-scan (FELL-41) pipeline condition assessment technology. Electro-scan technology can be used to estimate the magnitude and location of pote...
METHANE EMISSIONS FROM THE NATURAL GAS INDUSTRY VOLUME 9: UNDERGROUND PIPELINES
The 15-volume report summarizes the results of a comprehensive program to quantify methane (CH4) emissions from the U.S. natural gas industry for the base year. The objective was to determine CH4 emissions from the wellhead and ending downstream at the customer's meter. The accur...
49 CFR 107.807 - Approval of non-domestic chemical analyses and tests.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 2 2014-10-01 2014-10-01 false Approval of non-domestic chemical analyses and tests. 107.807 Section 107.807 Transportation Other Regulations Relating to Transportation PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION HAZARDOUS MATERIALS AND OIL TRANSPORTATION HAZARDOUS MATERIALS PROGRAM...
77 FR 62596 - Interim Guidance on State Freight Plans and State Freight Advisory Committees
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-15
...), Maritime Administration (MARAD), Pipeline and Hazardous Materials Safety Administration (PHSMA), Research... Research Program (23 U.S.C. 505). They may also use carryover balances from National Highway System funds... system's users and to the general public (for example, reductions in crashes, fatalities, and injuries...
ERIC Educational Resources Information Center
Masterson, Kathryn
2008-01-01
The University of Michigan at Ann Arbor is offering a development internship program that is designed to give students real-world experience working in development jobs and the chance to meet major donors and network with alumni. Its goals are lofty: to create a pipeline of young people for the development profession; diversify the fund-raising…
GAPS OF DECISION SUPPORT MODELS FOR PIPELINE RENEWAL AND RECOMMENDATIONS FOR IMPROVEMENT - Paper
As part of the U.S. Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, one key area of research pursued, in collaboration with wastewater and water utilities, was a study of the current approaches available for making rehabilitation versus replac...
See, Randolph B.; Schroder, LeRoy J.; Willoughby, Timothy C.
1988-01-01
During 1986, the U.S. Geological Survey operated three programs to provide external quality-assurance monitoring of the National Atmospheric Deposition Program and National Trends Network. An intersite-comparison program was used to assess the accuracy of onsite pH and specific-conductance determinations at quarterly intervals. The blind-audit program was used to assess the effect of routine sample handling on the precision and bias of program and network wet-deposition data. Analytical results from four laboratories, which routinely analyze wet-deposition samples, were examined to determine if differences existed between laboratory analytical results and to provide estimates of the analytical precision of each laboratory. An average of 78 and 89 percent of the site operators participating in the intersite comparison met the network goals for pH and specific conductance, respectively. A comparison of analytical values versus actual values for samples submitted as part of the blind-audit program indicated that analytical values were slightly but significantly (α = 0.01) larger than actual values for pH, magnesium, sodium, and sulfate; analytical values for specific conductance were slightly less than actual values. The decreased precision in the analyses of blind-audit samples when compared to interlaboratory studies indicates that a large amount of uncertainty in network deposition data may be a result of routine field operations. The results of the interlaboratory comparison study indicated that the magnitude of the difference between laboratory analyses was small for all analytes. Analyses of deionized, distilled water blanks by participating laboratories indicated that the laboratories had difficulty measuring analyte concentrations near their reported detection limits. (USGS)
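The blind-audit comparison amounts to a paired test of analytical-minus-actual differences at α = 0.01. A minimal sketch with SciPy, using synthetic stand-in values rather than network data:

```python
import numpy as np
from scipy import stats

actual = np.array([4.52, 4.61, 4.48, 4.70, 4.55, 4.66, 4.59, 4.50])  # known pH values
# Simulate a slight positive bias of the kind reported for pH above
analytical = actual + np.random.default_rng(1).normal(0.02, 0.03, 8)

t, p = stats.ttest_rel(analytical, actual)   # paired t-test on the differences
print(f"mean bias={np.mean(analytical - actual):+.3f}, p={p:.4f}, "
      f"significant at alpha=0.01: {p < 0.01}")
```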
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... interconnect pipelines to four existing offshore pipelines (Dauphin Natural Gas Pipeline, Williams Natural Gas Pipeline, Destin Natural Gas Pipeline, and Viosca Knoll Gathering System [VKGS] Gas Pipeline) that connect to the onshore natural gas transmission pipeline system. Natural gas would be delivered to customers...
Freight pipelines: Current status and anticipated future use
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-07-01
This report is issued by the Task Committee on Freight Pipelines, Pipeline Division, ASCE. Freight pipelines of various types (including slurry pipelines, pneumatic pipelines, and capsule pipelines) have been used throughout the world for over a century for transporting solids and sometimes even packaged products. Recent advancements in pipeline technology, aided by advanced computer control systems and trenchless technologies, have greatly facilitated the transportation of solids by pipelines. Today, in many situations, freight pipelines are not only the most economical and practical means for transporting solids, they are also the most reliable, safest, and most environmentally friendly transportation mode. Increased use of underground pipelines to transport freight is anticipated in the future, especially as the technology continues to improve and surface transportation modes such as highways become more congested. This paper describes the state of the art and expected future uses of various types of freight pipelines. Obstacles hindering the development and use of the most advanced freight pipeline systems, such as the pneumatic capsule pipeline for interstate transport of freight, are discussed.
Development of airframe design technology for crashworthiness.
NASA Technical Reports Server (NTRS)
Kruszewski, E. T.; Thomson, R. G.
1973-01-01
This paper describes the NASA portion of a joint FAA-NASA General Aviation Crashworthiness Program leading to the development of improved crashworthiness design technology. The objectives of the program are to develop analytical technology for predicting crashworthiness of structures, provide design improvements, and perform full-scale crash tests. The analytical techniques which are being developed both in-house and under contract are described, and typical results from these analytical programs are shown. In addition, the full-scale testing facility and test program are discussed.
Derck, Jordan; Zahn, Kate; Finks, Jonathan F; Mand, Simanjit; Sandhu, Gurjit
2016-01-01
Racial minorities continue to be underrepresented in medicine (URiM). Increasing provider diversity is an essential component of addressing disparity in health delivery and outcomes. The pool of URiM students who are competitive applicants to medical school is often limited early on by educational inequalities in primary and secondary schooling. A growing body of evidence recognizing the importance of diversifying the health professions advances the need for medical schools to develop outreach collaborations with primary and secondary schools to attract URiM students. The goal of this paper is to describe and evaluate a program that seeks to create a pipeline for URiM students early in secondary schooling by connecting these students with support and resources in the medical community that may be transformative in empowering them to be stronger university and medical school applicants. The authors describe a medical student-led, action-oriented pipeline program, Doctors of Tomorrow, which connects faculty and medical students at the University of Michigan Medical School with 9th grade students at Cass Technical High School (Cass Tech) in Detroit, Michigan. The program includes a core curriculum of hands-on experiential learning, the development and presentation of a capstone project, and mentoring of the 9th grade students by medical students. Cass Tech student feedback was collected using focus groups, critical incident written narratives, and individual interviews. Medical student feedback was collected by reviewing monthly meeting minutes from the Doctors of Tomorrow medical student leadership. Data were analyzed using thematic analysis. Two strong themes emerged from the Cass Tech student feedback: (i) personal identity and its perceived effect on goal achievement and (ii) the positive effect of direct mentorship and engagement with current healthcare providers through Doctors of Tomorrow. A challenge noted by the medical students was the lack of structured curriculum beyond the 1st year of the program; however, this was complemented by their commitment to the program's continued longitudinal development. The authors propose that outreach pipeline programs that are context specific, culturally relevant, and established in collaboration with community partners have the potential to provide underrepresented students with opportunities and skills early in their formative education to become competitive applicants to college and, ultimately, to medical school.
NASA Sounding Rocket Program Educational Outreach
NASA Technical Reports Server (NTRS)
Rosanova, G.
2013-01-01
Educational and public outreach is a major focus area for the National Aeronautics and Space Administration (NASA). The NASA Sounding Rocket Program (NSRP) shares in the belief that NASA plays a unique and vital role in inspiring future generations to pursue careers in science, mathematics, and technology. To fulfill this vision, the NSRP engages in a variety of educator training workshops and student flight projects that provide unique and exciting hands-on rocketry and space flight experiences. Specifically, the Wallops Rocket Academy for Teachers and Students (WRATS) is a one-week tutorial laboratory experience for high school teachers to learn the basics of rocketry, as well as build an instrumented model rocket for launch and data processing. The teachers are thus armed with the knowledge and experience to subsequently inspire the students at their home institution. Additionally, the NSRP has partnered with the Colorado Space Grant Consortium (COSGC) to provide a "pipeline" of space flight opportunities to university students and professors. Participants begin by enrolling in the RockOn! Workshop, which guides fledgling rocketeers through the construction and functional testing of an instrumentation kit. This is then integrated into a sealed canister and flown on a sounding rocket payload, which is recovered for the students to retrieve and process their data post flight. The next step in the "pipeline" involves unique, user-defined RockSat-C experiments in a sealed canister that allow participants more independence in developing, constructing, and testing spaceflight hardware. These experiments are flown and recovered on the same payload as the RockOn! Workshop kits. Ultimately, the "pipeline" culminates in the development of an advanced, user-defined RockSat-X experiment that is flown on a payload which provides full exposure to the space environment (not in a sealed canister), and includes telemetry and attitude control capability. The RockOn! and RockSat-C elements of the "pipeline" have been successfully demonstrated by five annual flights thus far from Wallops Flight Facility. RockSat-X has successfully flown twice, also from Wallops. The NSRP utilizes launch vehicles comprised of military surplus rocket motors (Terrier-Improved Orion and Terrier-Improved Malemute) to execute these missions. The NASA Sounding Rocket Program is proud of its role in inspiring the "next generation of explorers" and is working to expand its reach to all regions of the United States and the international community as well.
NASA Astrophysics Data System (ADS)
Gonzalez, Eliseo A.
Fostering resiliency and educational success in students that are faced with adversity is not a simple task. The gap in educational success and achievement among low-income, first-generation, traditionally marginalized students continues to be significant. California's educational system needs to stop the hemorrhaging from its educational pipeline, also known as the P-20 pipeline, of all students, especially those groups of students with larger gaps in educational attainment. One potential path towards fixing California's educational pipeline for all students is to form and keep partnerships with programs such as Upward Bound, AVID, and Math Engineering Science Achievement (MESA). In 2010-11, the California Department of Education (CDE) reported that over 51% of students enrolled in California's school system and 51% of all California high school seniors were Latino. Of the 231,231 Latino high school seniors, 79% graduated. However, of those that graduated, only 26% met University of California/California State University (UC/CSU) college entrance requirements. Even though 79% of Latinos graduated, 74% did not qualify to apply to a UC or CSU. If the majority of Latino students continue to fall through holes in the educational pipeline, companies will continue to look abroad to fill STEM jobs that remain unfilled by American workers (California Department of Education [CDE], 2012). Alongside the U.S.'s current economic woes, the lack of college preparedness and knowledge among parents and students has led to a decrease in first-generation, low-income Latino students' higher education enrollment (Camacho & Lord, 2011). With strong and positive leadership from family, supplemented by the MESA program, these youths can exert their resiliency, face adversity, and overcome extraordinary barriers. Leaders in education such as teachers, coordinators, advisers, administrators, and parents are in the best position to teach students about resilience (Ginsburg, 2007). The American Psychological Association (APA Practice Central, n.d.) defined resilience as "the ability to handle stress and respond more positively to difficult events" (para. 1). MESA is structured to help foster resiliency and academic success in first-generation, low-income, traditionally marginalized students. This study examined the role that leadership and programs such as MESA have in fostering or increasing resiliency, contributing to increased academic success among first-generation, low-income Latino students in the STEM fields.